Overview
Role: Cloud Data Engineer (GCP)
Experience: 8+ Years
Location: India (Hybrid/Remote)
Notice Period: Immediate to 15 days max.
Job Description:
Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow.
Good to Have: dbt, Data Mesh
Job Title: Senior GCP Engineer – Data Mesh & Data Product Specialist
We are hiring a Senior GCP Engineer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization.
We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.
Key Responsibilities
* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture (an illustrative sketch follows this list).
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
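To give candidates a feel for the kind of pipeline this role owns, here is a minimal, illustrative sketch (not part of our codebase) of an Airflow DAG that lands files from Cloud Storage into BigQuery and then runs dbt transformations. It assumes Airflow 2.x with the Google provider installed; the project, bucket, dataset, and dbt paths are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="orders_data_product",        # hypothetical data product name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest: append the day's landed files from Cloud Storage into a raw table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",                       # assumed bucket
        source_objects=["orders/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.raw.orders",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    # Transform: run the dbt models that publish the curated data product.
    run_dbt = BashOperator(
        task_id="dbt_run_orders",
        bash_command="cd /opt/dbt/orders && dbt run --select orders",  # assumed project path
    )

    load_raw >> run_dbt
```

In practice you will extend patterns like this with data-quality checks, monitoring, and cost controls appropriate to each data product.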
Required Skills & Experience
* 10+ years (Lead) or 7+ years (Developer) of hands-on experience designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets with performance optimization techniques (see the sketch after this list).
* Deep experience working with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery.
* Familiarity with Data Mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.
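As a small illustration of the SQL cost awareness we expect, the sketch below uses the BigQuery Python client to dry-run a parameterized query that filters on a (presumed) date-partitioned column and selects only needed fields, estimating bytes scanned before anything runs. Project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

# Filter on the partition column and prune columns to limit scanned bytes.
query = """
    SELECT order_id, customer_id, order_total
    FROM `example-project.analytics.orders`
    WHERE order_date BETWEEN @start AND @end
"""

job_config = bigquery.QueryJobConfig(
    dry_run=True,               # estimate cost without executing the query
    use_query_cache=False,
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ],
)

job = client.query(query, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed}")
```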
Job Type: Permanent
Pay: ₹3,000,000 – ₹4,500,000 per year
Schedule:
- Day shift
- Monday to Friday
Experience:
- Python: 4 years (Required)
- ETL: 4 years (Required)
- SQL: 5 years (Required)
- BigQuery: 3 years (Required)
- Airflow: 3 years (Required)
- Data Mesh: 4 years (Required)
- GCP: 7 years (Required)
- Data Products: 5 years (Required)
Work Location: Remote