Bangalore, Karnataka, India
Information Technology
Full-Time
Acuity Analytics
Overview
Job Purpose
As a Sr. Data Architect, you will design and build the data architecture for data science solutions and agentic AI systems that integrate with enterprise data platforms and applications. You will work on multi-agent architectures, tool-based LLM workflows (MCP), and production-grade GenAI integration, creating agents that can reason, collaborate, call tools/APIs, and deliver reliable, explainable outcomes to business users.
Desired Skills and Experience
Required Skills & Qualifications:
- 8+ years in data/AI architecture; hands‑on Azure & Snowflake expertise (compute, storage, networking, security, observability).
- Azure expertise across Azure Functions, ADF, Cosmos DB/DynamoDB, Azure Key Vault, VNet, API Management, AKS, ACR, etc.
- Advanced SQL and Python skills, and familiarity with PySpark for ETL scripting.
- Experience integrating data from APIs and relational databases, setting up medallion architecture, and performance-tuning tables, queries, and stored procedures.
- Strong knowledge of Snowflake (Snowpipe, databases, warehouses), including Lambda-Snowflake queries and data synchronization.
- Expertise with GenAI services (preferably the Snowflake Cortex AI module) and a strong understanding of the MCP protocol for agentic orchestration.
- Recent, hands-on experience with AI/ML data pipelines.
- CI/CD knowledge for DevOps (build, release, version control).
- Strong understanding of data architecture and cloud infrastructure for AI workloads.
- Strong understanding of data governance best practices for data quality, cataloguing, and lineage.
- Excellent communication skills, both written and verbal.
- Experience delivering projects within an agile environment.
- Experience in project management and team management.
The candidate will be responsible for the delivery of work including:
- Create architecture designs; define and own the agentic AI and data science architecture across cloud platforms (Azure, Snowflake), enabling robust production pipelines and a service layer that bridges web/app experiences with the agentic platform
- Understand and refine the data governance roadmap, including data cataloguing, lineage, and quality
- Leverage the Snowflake Model Context Protocol (MCP) Server to build and integrate AI-powered data applications and agents
- Design end-to-end AI pipelines (ingestion > embeddings > vector search > model orchestration > memory > multi-agent reasoning) with reliability, alignment, and governance
- Own architectural decisions for agentic systems: reliability, security, scalability, cost, and governance
- Design the service layer (API gateway, contracts, auth, rate limits) mediating between web/app experiences and the agentic platform
- Architect AI pipelines (data ingest, embeddings, vector stores, orchestration, memory/state, reasoning)
- Design and manage implementation of high-performance ELT/ETL pipelines
- Provide architectural guidance on data modeling, data mesh, and data lake structures
- Ensure cost optimization, platform security, and scalability across deployments
- Collaborate with engineering to implement context-aware agents capable of autonomous planning and tool execution
- Define observability & audit (AnswerLog, telemetry, lineage); publish architecture docs/runbooks for reuse across programs