Overview
Immediate requirement for developers who have sound knowledge of the following skill sets.
· Data Integration (ETL) expertise
· Expertise in Data Warehouse development and concepts, including:
o Dimension/Fact
o Type 1/Type 2 SCD
o Data Mart Updates
o Conformed Dimensions via MDM Integration
· Experience with ETL/ELT tools such as dbt, Python, SSIS, or PowerCenter
· Strong SQL Skills
· Snowflake experience preferred
· dbt experience preferred
· Experience with Life Science/Healthcare Data
Roles and Responsibilities
· Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools
· Implement and maintain scalable data orchestration and transformation, ensuring data accuracy, consistency, and timeliness
· Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions
· Work closely with data analysts and data scientists to provide them with clean and structured datasets for analysis and modeling
· Build scalable, complex dbt models
· Design and build scalable data orchestration and transformation workflows
Preferred Qualifications
· Bachelor's degree in Computer Science, Engineering, or a related field
· 5+ years of experience in data engineering, with deep SQL knowledge
· 5+ years of working with Git, with demonstrated experience collaborating with other engineers across repositories
· 2+ years of working with Snowflake
· 2+ years working with Reltio
· Familiarity with Azure/AWS
· Certification in Snowflake or dbt is a plus
Interested candidates, please drop your resume here. We will get back to you shortly.
Job Type: Full-time
Pay: From ₹400,000.00 per year
Benefits:
- Provident Fund
Work Location: In person