Overview
Responsibilities:
· Design, develop, and optimize ETL/ELT pipelines to migrate and transform data from SAP to Snowflake.
· Develop and maintain scalable data models (fact and dimension tables) within Snowflake to support analytical use cases.
· Collaborate with SAP, AtScale, and Power BI teams to ensure seamless data flow across the platform.
· Write and optimize SQL queries in Snowflake for performance, scalability, and data integrity.
· Ensure data quality and consistency in the Snowflake data warehouse.
· Work with stakeholders to understand data requirements and ensure alignment between data architecture and business goals.
· Continuously monitor and tune pipeline and query performance, troubleshooting issues as needed.
Requirements:
· 5+ years of experience as a Data Engineer, with 3+ years of hands-on experience in Snowflake.
· Expertise in SQL, Snowflake architecture, and Snowflake best practices (e.g., clustering keys, micro-partitioning).
· Experience working with ETL/ELT tools and integrating data from multiple sources (preferably SAP).
· Strong knowledge of data pipelines, data integration, and cloud data platforms.
· Familiarity with data modeling principles and best practices for building data warehouses.
· Strong problem-solving skills and experience in optimizing large datasets for query performance.
Job Type: Full-time
Pay: Up to ₹100,000.00 per month
Application Deadline: 24/04/2025