Overview
Employment Type: Full-time / Contract
Work Location: Remote / On-site (location to be confirmed)
Experience Level: 4–9 years
Educational Requirements: Bachelor's degree in Technology or a related field; Master's degree preferred
Job Description
We are looking for a skilled GCP Data Engineer to join our team and work on building and optimizing data solutions on Google Cloud Platform. The ideal candidate will have strong experience in GCP services, data warehousing, and real-time analytics solutions. You will be responsible for implementing end-to-end data pipelines and supporting advanced analytics initiatives.
Key Responsibilities
· Design and implement scalable data pipelines using GCP services such as Dataflow, Dataproc, and Cloud Composer
· Develop and manage BigQuery data warehouses and optimize query performance
· Ingest, transform, and enrich data from various sources using Python or SQL
· Collaborate with data analysts, data scientists, and stakeholders to define data requirements
· Implement data security, governance, and quality controls across pipelines
· Leverage Pub/Sub for real-time data streaming and messaging
· Use Terraform or Cloud Deployment Manager for infrastructure automation
· Set up and manage CI/CD pipelines using Cloud Build or Jenkins
· Ensure system reliability and maintainability through monitoring and logging (Cloud Monitoring and Cloud Logging, formerly Stackdriver)
Required Skills & Technologies
· Strong hands-on experience with GCP data services like BigQuery, Dataflow, Pub/Sub, and Dataproc
· Proficiency in Python, SQL, and data modeling techniques
· Experience with ETL/ELT tools and orchestration using Cloud Composer or Apache Airflow
· Knowledge of data lake and data warehouse best practices
· Understanding of IAM roles, VPCs, and GCP networking fundamentals
· Familiarity with CI/CD processes and tools (Cloud Build, Git, Jenkins)
· Experience in handling structured and unstructured data at scale
Preferred Skills
· Experience with Apache Beam or Spark on GCP
· Knowledge of Looker or Looker Studio (formerly Data Studio) for BI and reporting
· GCP Professional Data Engineer certification is a plus
· Familiarity with Kubernetes and containerized deployments on GKE
Job Types: Full-time, Permanent, Contractual / Temporary, Freelance
Contract length: 12 months
Pay: Up to ₹120,172.87 per month
Benefits:
- Flexible schedule
- Paid sick time
- Paid time off
- Work from home
Schedule:
- Day shift
- Monday to Friday
- Morning shift
- Rotational shift
- UK shift
- US shift
Application Question(s):
- What is your current and expected CTC?
Work Location: Remote