Bangalore, Karnataka, India
Information Technology
Full-Time
Platform 3 Solutions
Overview
We're looking for a hands-on Google Cloud Platform consultant to design, build, and operate scalable data platforms on GCP. The right candidate will be fluent in GCP data services (Dataproc, Cloud Storage, BigQuery, Data Catalog, IAP, Cloud Asset Inventory), able to containerise and deploy workloads (Kubernetes/GKE preferred), and comfortable translating existing AWS deployments into equivalent, secure, cost-effective GCP architectures. DevOps experience (CI/CD, Terraform, Docker) and production support skills are a strong plus.
Responsibilities
- Architect, deploy, and operate data processing and analytics solutions on GCP (Dataproc, BigQuery, Cloud Storage, Data Catalog, and, where appropriate, Dataflow and Pub/Sub).
- Translate and map existing AWS data/analytics deployments to GCP: plan migrations, identify gaps, estimate costs, and provide runbooks.
- Build and maintain containerised workloads and orchestrate them on GKE; help teams adopt Kubernetes best practices.
- Implement IAM, IAP, and Cloud Asset Inventory to secure resources and manage access control.
- Implement Infrastructure-as-Code (Terraform/Deployment Manager) for repeatable provisioning.
- Develop CI/CD pipelines for data applications (Cloud Build, Jenkins, GitHub Actions, or similar).
- Optimise performance and costs for BigQuery, Dataproc clusters, and storage architecture.
- Tune Spark jobs and manage Dataproc cluster configurations.
- Implement monitoring, logging, alerting, and operational dashboards on Cloud Monitoring and Logging.
- Troubleshoot production incidents, perform root cause analysis, and drive corrective actions.
- Document architectures, operational runbooks, and migration playbooks.
- Coach and partner with application and AWS teams to ensure parity and knowledge transfer.
Requirements
- Strong hands-on experience with GCP services: Dataproc, Cloud Storage, BigQuery, Data Catalog, IAM, IAP, and Cloud Asset Inventory.
- Proficiency in Docker and Kubernetes with practical knowledge of deployments, services, scaling, and resource management.
- Understanding of Spark, ETL design, and Parquet-based processing.
- Strong networking and security understanding (VPC, IAM, firewall rules, service accounts).
- Practical exposure to DevOps processes, CI/CD, and automation workflows.
- Experience with Infrastructure-as-Code using Terraform.
- Strong ability to interpret AWS deployments and convert them to GCP equivalents.
- Experience with Cloud Monitoring, Logging, and building alerting systems.
- Effective communication and documentation skills.
- Experience with Dataflow, Pub/Sub, Composer, or Cloud Functions.
- Understanding of hybrid and multi-cloud architectures.
- Familiarity with encryption, DLP, and compliance frameworks.
- Knowledge of cost management strategies across both AWS and GCP.
This job was posted by Tvarithaa S from Platform 3 Solutions.