40,00,000 - 50,00,000 INR - Annual
Mumbai
9 - 15 Yrs
Information Technology
Hybrid
Full-Time
Antal International

Overview
Summary role description:
Hiring a Solution Architect – Databricks for a global technology consulting and system integration firm specializing in data engineering, AI, and ML.
Company description:
Our client is a US-headquartered global technology consulting and system integration firm specializing in data engineering, artificial intelligence (AI), machine learning (ML), and cloud infrastructure modernization. Renowned for its strategic partnership with Databricks, the firm has emerged as a leader in the data and AI space, delivering tailored, data-driven solutions across the Healthcare, BFSI, Retail, and Manufacturing sectors.
Role details:
- Title / Designation: Solution Architect (Databricks)
- Location: Mumbai
- Work Mode: Hybrid
Role & responsibilities:
- Design, develop, and implement scalable data engineering solutions using Spark/Scala, Python, or PySpark.
- Architect and maintain robust data lakes using open-source frameworks like Apache Spark, MLflow, and Delta Lake.
- Lead end-to-end data processing and migration projects across cloud platforms (AWS, Azure, or GCP).
- Develop and optimize data pipelines for batch and real-time processing (e.g., Spark Streaming); a brief illustrative sketch follows this list.
- Work on IoT, event-driven, and microservices-based architectures in private and public cloud environments.
- Drive performance tuning and scalability enhancements for big data solutions.
- Support CI/CD pipelines for data platform deployments.
- Collaborate with cross-functional teams to deliver high-impact data platform and analytics solutions.
- Apply best practices in distributed computing and contribute to the continuous improvement of the data engineering environment.
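For illustration only, the minimal PySpark sketch below shows the kind of streaming-to-Delta-Lake pipeline referenced in the responsibilities above. The schema, paths, application name, and configuration values are hypothetical placeholders, and it assumes the delta-spark package is available on the cluster.
```python
# Illustrative sketch only: a minimal PySpark Structured Streaming job that
# lands JSON events into a Delta Lake table. All paths, the schema, and the
# app name are hypothetical; delta-spark is assumed to be installed.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (
    SparkSession.builder
    .appName("events-to-delta")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read a stream of JSON files from a (hypothetical) landing zone.
events = (
    spark.readStream
    .schema(event_schema)
    .json("/mnt/raw/events/")
)

# Append the stream to a Delta table, with a checkpoint for fault tolerance.
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .start("/mnt/delta/events/")
)
query.awaitTermination()
```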
Candidate requirements:
- Experience: 10+ years in consulting, with 7+ years specifically in data engineering, data platforms, and analytics.
- Technical Expertise:
  - Expert-level hands-on coding in Spark/Scala, Python, or PySpark
  - In-depth knowledge of Spark architecture: Spark Core, Spark SQL, DataFrames, RDD caching, Spark Streaming, and Spark MLlib (see the short sketch after this list)
  - Deep experience in distributed computing and understanding of Spark internals
  - Hands-on experience with Databricks development
  - Proficiency with cloud platforms (AWS, Azure, or GCP) for data migration and processing
  - Familiarity with CI/CD tools and practices for production deployments
  - Strong understanding of cloud architectures, migration strategies, and trade-offs (private vs. public cloud)
  - Excellent problem-solving skills and the ability to optimize solutions for performance and scalability
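As a further illustration of the Spark SQL, DataFrame, and caching skills listed above, here is a minimal sketch; the input path, column names, and temporary view are hypothetical.
```python
# Illustrative sketch only: DataFrame caching and an equivalent Spark SQL
# query. The input path, table, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-analytics").getOrCreate()

orders = spark.read.parquet("/mnt/delta/orders/")  # hypothetical input

# Cache a frequently reused subset so repeated actions avoid recomputation.
recent = orders.filter(F.col("order_date") >= "2024-01-01").cache()

# Aggregation via the DataFrame API...
daily_revenue = (
    recent.groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# ...and the same result via Spark SQL on a temporary view.
recent.createOrReplaceTempView("recent_orders")
daily_revenue_sql = spark.sql(
    "SELECT order_date, SUM(amount) AS revenue "
    "FROM recent_orders GROUP BY order_date"
)

daily_revenue.show()
```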
Selection process:
- Interview with Senior Solution Architect
- Interview with CIO
- HR Discussions
Talk to us
Email: info@antaltechjobs.in