Kolkata, West Bengal, India
Information Technology
Full-Time
AuxoAI
Overview
Description
AuxoAI is hiring a Data Architect (GCP) to lead enterprise data platform design, architecture modernization, and solution delivery across global client engagements. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore and offshore data engineering teams, and define best practices across the ingestion, transformation, governance, and consumption layers.
Role
This role is ideal for someone who combines deep GCP platform expertise with leadership experience and is confident working with both engineering teams and executive stakeholders. Responsibilities include:
- Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
- Lead end-to-end implementation of medallion architecture (Raw → Processed → Curated) patterns
- Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage
- Implement scalable ELT workflows using Dataform and modular SQLX transformations
- Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views
- Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
- Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
- Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
- Enable downstream consumption through Power BI, Looker, and optimized data marts
- Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
- Manage a distributed team of data engineers; set standards, review code, and ensure platform stability
Requirements
- 10+ years of experience in data architecture and engineering
- 4+ years of hands-on GCP experience, including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
- Deep understanding of streaming + batch data patterns, event-driven ingestion, and modern warehouse design
- Proven leadership of cross-functional, distributed teams in client-facing roles
- Strong programming skills in Python and SQL
- Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
- Excellent communication and stakeholder management skills
- GCP Professional Data Engineer or Cloud Architect certification
- Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
- Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
- Background in legacy platform migration (Oracle, Azure, SQL Server)
(ref:hirist.tech)