Overview
Alright, so you're sold. But who are we?
Diversity and humility are not just big words hung on our walls. At CredAble, we put people at the heart of everything we do, and our core values are the driving force behind our success.
CredAble is an NBFC and technology-powered supply chain funding solutions company. Leveraging our trade finance expertise, technology platform, and access to third-party capital, we arrange funding programs for enterprise supply chains and lend directly to SMEs. Led by a team of industry experts, CredAble is at the forefront of tech-enabled working capital financing. Our programs are anchored around enterprise clients, where we provide funding linked to their transactions with suppliers (payables) and distributors (receivables). We are a Series B-funded startup with Axis Bank Limited as a strategic investor.
You Will Be Responsible For
- Develop, maintain, and optimize scalable data pipelines to ensure efficient data extraction, transformation, and loading (ETL).
- Build data solutions on GCP using services like BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
- Use Python for data processing and automation, integrating with GCP services for robust, scalable data pipelines.
- Develop and maintain data models and ETL workflows in collaboration with data scientists and analysts.
- Leverage Looker to create intuitive, data-driven dashboards and reports that provide business insights.
- Use JavaScript for building interactive data visualizations and custom web applications that enhance data usability.
- Troubleshoot and resolve data processing and infrastructure issues, ensuring data reliability and quality.
- Work closely with cross-functional teams to ensure seamless data flow across platforms and systems.
- Ensure optimal performance and scalability of data pipelines and queries.
- Provide support for data governance, including security, compliance, and documentation.
What will you bring to the table?
- Proficiency in Google Cloud Platform (GCP), particularly with services like BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
- Strong Python skills for data processing, automation, and integration with cloud services.
- Experience with Looker for building and maintaining interactive dashboards and reports.
- Familiarity with JavaScript for creating custom web interfaces and data visualizations.
- Solid understanding of ETL processes, data warehousing, and data pipeline architectures.
- Familiarity with Git for version control and CI/CD practices for smooth development workflows.
- Strong problem-solving skills, with an ability to troubleshoot and resolve data-related issues.
- Ability to work in a fast-paced, collaborative environment, interacting with cross-functional teams.
- Experience with Apache Airflow for orchestrating and scheduling data pipelines.
- Familiarity with data governance practices, data quality tools, and security protocols.
- Experience with Docker and Kubernetes for containerized deployment of data workflows.
- Knowledge of machine learning models and experience integrating them into data pipelines.
- Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience).
Besides making the best move of your career, what’s in it for you?
- Working in a highly entrepreneurial setup with a visionary team passionate about scaling the business to new heights.
- The freedom to explore limitless possibilities and ideas, no matter how impossible they may seem today.
- CredAble thrives on transparency and a culture that nurtures growth.
- Being part of CredAble enables you to push beyond the ordinary.