Information Technology
Full-Time
Reyna Solutions
Overview
About The Role
We are looking for a skilled and motivated Data Engineer with strong expertise in Python and Snowflake to join our data team. The ideal candidate will be responsible for designing, developing, and maintaining robust ETL/ELT data pipelines that enable efficient data integration, transformation, and management. This role offers the opportunity to work closely with data analysts, data scientists, and business stakeholders to drive data-driven decision-making through high-quality, reliable data.
Responsibilities:
- Develop, optimize, and maintain scalable ETL/ELT pipelines using Python and Snowflake to support data ingestion and transformation from multiple sources.
- Design and implement efficient data models, schemas, and stored procedures in Snowflake to enable robust data storage and querying capabilities.
- Create and enhance Python-based frameworks to automate onboarding and integration of new external datasets, ensuring smooth data ingestion processes.
- Transform and store semi-structured data (JSON, XML, Parquet, etc.) in Snowflake, ensuring optimal performance and usability.
- Write clean, reusable, and efficient Python code for data manipulation, automation of workflows, and scripting tasks to improve operational efficiency.
- Collaborate closely with data analysts, data scientists, and business teams to gather and understand detailed data requirements, translating them into technical solutions.
- Ensure data quality, data integrity, and governance standards are adhered to across all systems and processes.
- Manage large volumes of data and automate complex data workflows using Python scripts and RESTful APIs.
- Monitor, troubleshoot, and optimize the performance of data pipelines to ensure cost efficiency, scalability, and reliability.
- Work in coordination with DevOps and cloud infrastructure teams to deploy, maintain, and support data engineering solutions in cloud environments (AWS, Azure, GCP).
Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related technical discipline.
- Proven experience (typically 3-6 years) in data engineering or related roles, with hands-on expertise in Python and Snowflake.
- Strong proficiency in Python for data processing, automation, and scripting tasks.
- Experience with the Snowflake data platform, including designing schemas, writing SQL queries and stored procedures, and managing data ingestion workflows.
- Solid understanding of ETL/ELT concepts, data pipeline architecture, and best practices in data engineering.
- Familiarity with handling and transforming semi-structured data formats such as JSON, XML, Avro, or Parquet.
- Experience working with APIs and integrating data from multiple external sources.
- Strong problem-solving skills, analytical mindset, and attention to detail with a focus on data accuracy and quality.
- Ability to work collaboratively across teams, including data science, analytics, and business units, to deliver data solutions aligned with organizational needs.
- Knowledge of cloud environments (AWS, Azure, GCP) and DevOps practices related to data deployment and monitoring is preferred.
- Excellent communication skills to articulate technical concepts clearly to both technical and non-technical stakeholders.