Indore, Madhya Pradesh, India
Information Technology
Full-Time
Etelligens Technologies
Overview
GCP Data Engineer
We are looking for a GCP Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based data platforms.
You will work on building and optimizing data workflows, implementing robust data solutions using Google Cloud Platform (GCP) technologies, and collaborating closely with cross-functional teams to deliver high-impact, data-driven insights.
This role requires a deep understanding of data architecture, the GCP ecosystem, and ETL/ELT processes, along with the ability to lead, mentor, and execute with precision.
Key Responsibilities
- Design, build, and maintain robust data extraction, transformation, and loading (ETL/ELT) pipelines across both on-premises and cloud platforms.
- Develop and support data products, pipelines, and analytical platforms leveraging GCP services.
- Perform application impact assessments, requirement reviews, and provide accurate work estimates.
- Create test strategies and implement site reliability engineering (SRE) measures for data systems.
- Participate in agile development sprints and contribute to solution design reviews.
- Mentor and guide junior Data Engineers on best practices and design patterns.
- Lead root cause analysis and resolution of critical data operations and post-implementation issues.
- Conduct technical data stewardship activities, including metadata management, data security, and privacy-by-design principles.
- Use Python and GCP technologies to automate data workflows and transformations.
- Work with SQL for data modeling, transformations, and analytical queries.
- Automate job scheduling and orchestration using Control-M, Apache Airflow, or Prefect (see the DAG sketch after this list).
- Write Unix shell scripts to support automation and monitoring of data operations.
- Support BI/analytics teams with structured and well-modeled data.
- Use Infrastructure as Code (IaC) tools like Terraform, Ansible, or Puppet for automated deployments and configuration management.
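For illustration only, the sketch below shows the kind of pipeline these responsibilities describe: an Apache Airflow DAG that loads files from Cloud Storage into a BigQuery staging table and then runs a SQL transformation. This is a minimal example, not part of any actual codebase for the role; every project, bucket, dataset, and table name is a hypothetical placeholder, and the exact operators and arguments would depend on the team's Airflow and Google provider versions.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# NOTE: all project, bucket, dataset, and table names below are hypothetical
# placeholders, not names specified in this posting.
with DAG(
    dag_id="daily_sales_etl",
    schedule="@daily",               # Airflow 2.4+ argument; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Extract/load: pull the day's CSV drops from Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],  # {{ ds }} is the logical run date
        destination_project_dataset_table="example-project.staging.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: aggregate the staged rows into an analytics-ready table with SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_mart",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.mart.daily_sales` AS
                    SELECT order_date, region, SUM(amount) AS total_amount
                    FROM `example-project.staging.sales_raw`
                    GROUP BY order_date, region
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

In practice a DAG like this would also carry retries, alerting, and data-quality checks, which is where the SRE and data-stewardship responsibilities above come in.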
Required Skills & Qualifications
- Strong experience with Python, SQL, and Unix/Linux scripting (a short sketch of typical Python and SQL work follows this list).
- Proficient in GCP Data Services.
- Experience in designing and managing ETL/ELT pipelines across hybrid environments.
- Working knowledge of orchestration tools: Apache Airflow, Control-M, or Prefect.
- Understanding of modern data warehousing and cloud-based analytics architecture.
- Familiarity with Infrastructure-as-Code using Terraform, Puppet, or Ansible.
- Strong debugging and problem-solving abilities in complex data environments.
- Ability to work in Agile teams and deliver in short sprint cycles.
- Bachelor's degree in Computer Science, Software Engineering, Data Science, Mathematics, or a related field.
- 4+ years of hands-on experience in data engineering.
- 2+ years of experience in data architecture and solution design.
- Google Cloud Professional Data Engineer certification is preferred.
- Excellent communication skills and the ability to collaborate with cross-functional teams.
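As a rough illustration of the Python and SQL skills listed above, the following sketch runs a parameterized analytical query with the google-cloud-bigquery client. The project, table, and column names are again hypothetical placeholders.

```python
from google.cloud import bigquery

# NOTE: project, table, and column names are hypothetical placeholders.
client = bigquery.Client(project="example-project")

query = """
    SELECT region, COUNT(*) AS orders
    FROM `example-project.mart.daily_sales`
    WHERE order_date = @run_date
    GROUP BY region
    ORDER BY orders DESC
"""
# Query parameters avoid string interpolation and the SQL injection risks it brings.
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")]
)

for row in client.query(query, job_config=job_config).result():
    print(f"{row.region}: {row.orders} orders")
```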