Overview
About Exponentia.ai
Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations.
We are proud partners of global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory services.
Awards & Recognitions
- Innovation Partner of the Year – Databricks, 2024
- Digital Impact Award, UK – 2024 (TMT Sector)
- Rising Star – APJ Databricks Partner Awards 2023
- Qlik’s Most Enabled Partner – APAC
With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done by combining human intelligence with AI agents to deliver exponential outcomes.
Learn more: www.exponentia.ai
Role Overview
We are seeking a highly skilled Azure Data Engineer to lead Business Intelligence Operations. This role involves collaborating with senior stakeholders and CXOs to design, implement, and manage advanced data solutions leveraging Azure Data Services and Databricks. The ideal candidate will have strong technical expertise, leadership capabilities, and experience in managing large-scale data engineering projects.
Key Responsibilities
- Stakeholder Collaboration: Work closely with CXOs and business teams to understand source data and translate business requirements into technical solutions using Databricks, Azure Synapse, and Azure Data Factory.
- ETL/ELT Development: Design, implement, and manage end-to-end ETL/ELT processes for MIS automation, system integrations, and streaming data ingestion using Delta Live Tables (an illustrative sketch follows this list).
- Data Lake Governance: Ensure proper access control and CI/CD governance for data lake environments.
- Change Management: Review existing data pipelines and SQL/Python code for potential issues, and manage change requests through the Change Approval Board.
- Pipeline Delivery: Estimate and deliver ad-hoc and scheduled data pipelines processing high-volume structured and unstructured data.
- Monitoring & Troubleshooting: Set up proactive ETL monitoring and resolve ETL issues promptly.
- Compliance: Ensure adherence to data governance, security, and masking standards for all data deliveries.
- Coding Standards: Ensure that all development follows standard coding patterns and is fully documented for audit purposes.
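For candidates less familiar with Delta Live Tables, below is a minimal, illustrative sketch of the kind of streaming ingestion pipeline referenced above. The landing path, table names, and data-quality expectation are hypothetical placeholders, not part of any actual client pipeline; a real pipeline would be configured and run as a Databricks DLT pipeline rather than as a standalone script.

```python
# Minimal Delta Live Tables sketch (Python). Assumes execution inside a
# Databricks DLT pipeline, where `dlt` and `spark` are provided by the
# runtime. The source path, table names, and columns below are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw events ingested incrementally with Auto Loader.")
def raw_events():
    # Auto Loader ('cloudFiles') picks up new files as they land;
    # /mnt/landing/events/ is a placeholder for a real landing zone.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")
    )

@dlt.table(comment="Cleaned events with a basic data-quality expectation.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def clean_events():
    # Read the upstream streaming table and keep a typed subset of columns.
    return dlt.read_stream("raw_events").select(
        col("event_id"),
        col("event_type"),
        col("event_ts").cast("timestamp"),
    )
```

Expectations such as expect_or_drop are one way DLT surfaces data-quality metrics, which connects directly to the monitoring and compliance responsibilities above.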