Pune, Maharashtra, India
Information Technology
Full-Time
algoleap
Overview
Senior Data Engineer
We are looking for an experienced and highly skilled Senior Data Engineer to join our data governance team. This role is pivotal in driving our data strategy forward, with a primary focus on leading the migration of our data infrastructure from PostgreSQL to Snowflake.
The ideal candidate is a hands-on expert in building and managing large-scale data pipelines and possesses deep, practical experience in complex database migrations. As a technical expert, you will be responsible for executing critical projects, mentoring team members, and ensuring the highest standards of quality and accountability.
Key Responsibilities
– Plan and execute the end-to-end migration of large-scale datasets and data pipelines from PostgreSQL to Snowflake, ensuring minimal downtime and data integrity.
– Design, build, and optimize robust, scalable, and automated ETL/ELT data pipelines using Python and modern data engineering technologies.
– Guide and mentor other data engineers, fostering a culture of technical excellence, collaboration, and knowledge sharing. Provide code reviews and architectural oversight.
– Take full ownership of data engineering projects from conception through to deployment and ongoing maintenance. Be accountable for the quality, reliability, and timeliness of deliverables.
– Work closely with the team and business stakeholders to understand their data needs and deliver high-quality data solutions that drive business value.
– Tackle complex data challenges, troubleshoot production issues, and implement performance optimizations within our data warehouse and pipeline infrastructure.
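As a hypothetical illustration of the batch-migration and data-integrity work described above, the core loop can be sketched in Python. The function names and callables here are illustrative, not part of any specific codebase; in practice `read_rows` would wrap a PostgreSQL cursor and `write_batch` a Snowflake bulk insert:

```python
from typing import Callable, Iterable, Iterator, List, Sequence


def batched(rows: Iterable[Sequence], batch_size: int) -> Iterator[List[Sequence]]:
    """Yield rows in fixed-size batches so large tables never load fully into memory."""
    batch: List[Sequence] = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch


def migrate_table(
    read_rows: Callable[[], Iterable[Sequence]],    # e.g. a PostgreSQL cursor iterator
    write_batch: Callable[[List[Sequence]], None],  # e.g. a Snowflake bulk INSERT
    batch_size: int = 10_000,
) -> int:
    """Copy rows in batches; return the total migrated for a row-count integrity check."""
    migrated = 0
    for batch in batched(read_rows(), batch_size):
        write_batch(batch)
        migrated += len(batch)
    return migrated
```

A source/target row-count comparison after `migrate_table` returns is one simple integrity check of the kind the role calls for; real migrations would add checksums and reconciliation queries on top.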
Skills & Qualifications
– Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
– 5+ years of relevant professional experience in a data engineering role.
– Proven, hands-on experience leading at least one significant, large-scale data migration from PostgreSQL to Snowflake.
– Expert-level proficiency in Python for data processing and pipeline orchestration (e.g., using libraries like Pandas, SQLAlchemy, and frameworks like Airflow or Dagster).
– Deep expertise in advanced SQL, data modeling, and data warehousing concepts.
– Strong understanding of the Snowflake architecture, features, and best practices.
– Familiarity with cloud services (AWS, GCP, or Azure) and their data-related offerings.
– Excellent problem-solving skills, meticulous attention to detail, and a proven ability to manage multiple projects with tight deadlines.
– Strong communication and teamwork skills, with a collaborative mindset and a genuine willingness to help others succeed in a fast-paced, innovative environment.
Other Key Expectations
– The ideal candidate is expected to work on-site (Hyderabad or Gurugram) 12 days per month, i.e. 3 days per week.
– The ideal candidate should have strong communication and teamwork skills, with a collaborative spirit and a willingness to help others when needed.