Mumbai, Maharashtra, India
Information Technology
Full-Time
Ascendion
Overview
Description
We are seeking an experienced Data Engineer (Agentic AI) with 5 to 9 years of experience to design, develop, and optimize data pipelines and agentic AI solutions in modern cloud environments. This role demands strong engineering fundamentals, cloud-native expertise, and hands-on involvement in AI/ML and emerging multi-agent AI frameworks. The ideal candidate excels in cross-functional collaboration, thrives in agile environments, and communicates complex ideas effectively.
Key Responsibilities
Data Engineering & Pipeline Development:
- Design, build, and optimize scalable data engineering solutions in Azure/GCP cloud environments.
- Develop secure and reliable ETL/ELT pipelines using Python, PySpark, SQL, and UNIX/Linux scripting.
- Apply expertise in query optimization, data structures, transformations, metadata management, dependency tracking, and workload orchestration.
- Automate workflows using orchestration and scheduling tools.
Work with cloud-native services, including:
- Kubernetes, containerized services, cluster management, cloud storage, and workspace management.
- Build and maintain CI/CD pipelines following DevOps best practices.
- Collaborate effectively using Git-based version control in multi-developer environments.
Support and enhance AI/ML initiatives, including:
- Feature engineering, model deployment, and MLOps platform integrations.
- Work with AI/ML frameworks such as TensorFlow and PyTorch, and libraries such as scikit-learn and MLflow.
- Build solutions using multi-agent AI tech stacks, including:
  - PydanticAI
  - LangChain
  - LangGraph
- Exposure to the Agent-to-Agent Protocol or the Model Context Protocol (MCP) is highly desirable.
Cross-Functional Collaboration & Agile Delivery:
- Partner with data scientists, ML engineers, architects, and business teams.
- Translate complex technical concepts for non-technical stakeholders.
- Participate in Agile ceremonies within Scrum or Kanban frameworks.
- Track and manage work effectively using the ServiceNow ticketing system.
- Ensure performance, scalability, and operational excellence across data and AI systems.
Requirements
- 5 to 9 years of experience in data engineering or related roles.
- Strong hands-on programming in Python, PySpark, SQL, and UNIX/Linux scripting.
- Proven experience designing and deploying data solutions in Azure or GCP (cloud certification preferred).
- Strong understanding of query tuning, distributed computing, data modeling, and data lifecycle management.
- Hands-on experience with:
  - CI/CD pipelines and DevOps tooling
  - Containerization (Docker) and Kubernetes
  - Cloud-native compute and storage services
- Excellent communication and documentation skills.
- Experience with automation and orchestration tools (Airflow, Databricks, Prefect, etc.).
- Good understanding of modern API and microservice architectures.
- Knowledge of Agile methodologies (Scrum, Kanban).