Overview
Job Description: A minimum of 8-10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms. Working hours are 8 hours per day, with a few hours of overlap with the EST time zone. This overlap is mandatory, as meetings take place during these hours. Working hours will be 12 PM-9 PM.
Mandatory Skills: Snowflake experience, data architecture experience, ETL process experience, and large-scale data migration solutioning experience.
Responsibilities:
- Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
- Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
- Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments.
- Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
- Conduct performance tuning and optimization of Snowflake databases and queries.
- Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.
Requirements:
- Extensive experience in designing and implementing data solutions using Snowflake and DBT.
- Proficiency in data modeling, schema design, and optimization within Snowflake environments.
- Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake; expertise in dimensional modeling is a must.
- Expertise in Python/Java/Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.
- Familiarity with other cloud platforms and data technologies (e.g., AWS, Azure, GCP).
- Demonstrated experience in implementing data governance frameworks and DataOps practices.
- Working experience in SAP environments
- Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms.
- Knowledge of data governance principles and DataOps methodologies
- Proven track record of architecting and delivering complex data solutions on cloud platforms and Snowflake.
Secondary Skills (If Any):
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Knowledge of data security and compliance standards
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to junior team members and non-technical stakeholders.
- Strong problem-solving and analytical skills.
- Ability to work effectively in a collaborative team environment and lead cross-functional initiatives.
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.
- Snowflake-related certifications (e.g., SnowPro Core, SnowPro Advanced: Architect, SnowPro Advanced: Data Engineer) are desirable but not mandatory.