
Data Engineer - Databricks

Vishakhapatnam, Andhra Pradesh, India
Information Technology
Full-Time
Vadis Technologies, a Moody's Company

Overview

Location(s):

  • Quay Building 8th Floor, Bagmane Tech Park, Bengaluru, IN

Line of Business: Data Estate (DE)

Job Category

  • Engineering & Technology

Experience Level: Experienced Hire

At Moody's, we unite the brightest minds to turn today’s risks into tomorrow’s opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are, with the freedom to exchange ideas, think innovatively, and listen to each other and to customers in meaningful ways.

If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Job Summary

We are seeking a highly experienced and motivated Senior Data Engineer to join our dynamic team. The ideal candidate possesses a strong software engineering background and deep expertise in designing, building, optimizing, and maintaining scalable data pipelines and infrastructure. You will leverage your extensive experience with Apache Spark, Apache Kafka, and various big data technologies to process and manage large datasets effectively. Working within an Agile/Scrum environment, you will take ownership of complex tasks, delivering high-quality, well-tested solutions independently.

Responsibilities:
  • Design, develop, implement, and maintain robust, scalable, and efficient batch and real-time data pipelines using Apache Spark (Python/PySpark and Scala) and Apache Kafka; a brief pipeline sketch follows this list.
  • Work extensively with large, complex datasets residing in various storage systems (e.g., data lakes, data warehouses, distributed file systems).
  • Build and manage real-time data streaming solutions to ingest, process, and serve data with low latency using Apache Kafka.
  • Optimize data processing jobs and data storage solutions for performance, scalability, and cost-effectiveness within big data ecosystems.
  • Implement comprehensive automated testing (unit, integration, end-to-end) to ensure data quality, pipeline reliability, and code robustness.
  • Collaborate closely with data scientists, analysts, software engineers, and product managers to understand data requirements and deliver effective solutions.
  • Actively participate in Agile/Scrum ceremonies, including sprint planning, daily stand-ups, sprint reviews, and retrospectives.
  • Take ownership of assigned tasks and projects, driving them to completion independently while adhering to deadlines and quality standards.
  • Troubleshoot and resolve complex issues related to data pipelines, platforms, and performance.
  • Contribute to the evolution of our data architecture, standards, and best practices.
  • Mentor junior engineers and share knowledge within the team.
  • Document technical designs, processes, and implementation details.
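As a rough illustration of the kind of Spark/Kafka pipeline this role involves, the sketch below shows a minimal PySpark Structured Streaming job that reads JSON events from a Kafka topic and lands them as date-partitioned Parquet files in a data lake. The broker address, topic name, schema, and paths are hypothetical placeholders, and the job assumes the Spark-Kafka connector package is available on the cluster.

```python
# Minimal sketch only: a PySpark Structured Streaming job that consumes JSON
# events from Kafka and writes date-partitioned Parquet files.
# Broker, topic, schema, and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Assumed shape of the incoming JSON events.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka records (key/value arrive as binary columns).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                       # placeholder topic
    .load()
)

# Parse the JSON payload into typed columns and derive a partition column.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", schema).alias("e"))
    .select("e.*")
    .withColumn("event_date", F.to_date("event_time"))
)

# Append to a Parquet data lake path, partitioned by date, with checkpointing.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/events")                 # placeholder path
    .option("checkpointLocation", "/data/checkpoints/events")
    .partitionBy("event_date")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

In practice a job like this would be packaged with the appropriate Kafka connector dependency and submitted to the cluster; the exact deployment details depend on the platform in use.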
Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent practical experience).
  • 10+ years of professional software engineering experience with a proven track record of building complex, scalable systems. Significant hands-on experience (typically 5+ years) specifically in data engineering roles.
  • Expert-level proficiency in designing and implementing data processing solutions using Apache Spark, with strong skills in both Python (PySpark) and Scala.
  • Demonstrable experience building, deploying, and managing data streaming pipelines using Apache Kafka and its ecosystem (e.g., Kafka Connect, Kafka Streams).
  • Solid understanding and practical experience working with big data technologies and concepts (e.g., Hadoop ecosystem - HDFS, Hive, distributed computing, partitioning, file formats like Parquet/Avro).
  • Proven experience working effectively in an Agile/Scrum development environment, participating in sprints and related ceremonies.
  • Demonstrated ability to work independently, manage priorities, and deliver end-to-end solutions with a strong focus on automated testing and quality assurance; a brief test sketch follows this list.
  • Excellent problem-solving, debugging, and analytical skills.
  • Strong communication and interpersonal skills.
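To illustrate the kind of automated testing expected around pipeline code, here is a minimal pytest-based sketch of a unit test for a small PySpark transformation. The transformation, column names, and sample data are hypothetical.

```python
# Minimal sketch only: unit-testing a small PySpark transformation with pytest.
# The transformation, column names, and sample data are hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window


def dedupe_latest(df, key_col, ts_col):
    """Keep only the most recent row per key."""
    w = Window.partitionBy(key_col).orderBy(F.col(ts_col).desc())
    return (
        df.withColumn("_rn", F.row_number().over(w))
        .filter(F.col("_rn") == 1)
        .drop("_rn")
    )


@pytest.fixture(scope="module")
def spark():
    # Small local session so the test runs without a cluster.
    return SparkSession.builder.master("local[2]").appName("unit-tests").getOrCreate()


def test_dedupe_latest_keeps_newest_row(spark):
    df = spark.createDataFrame(
        [("a", 1, "2024-01-01"), ("a", 2, "2024-01-02"), ("b", 3, "2024-01-01")],
        ["id", "value", "updated_at"],
    )
    out = dedupe_latest(df, "id", "updated_at").collect()
    assert {r["id"]: r["value"] for r in out} == {"a": 2, "b": 3}
```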
Preferred Qualifications:
  • Experience with cloud-based data platforms and services (e.g., AWS EMR, S3, Kinesis, MSK, Glue; Azure Databricks, ADLS).
  • Experience with workflow orchestration tools (e.g., Airflow, Dagster, Prefect); see the orchestration sketch after this list.
  • Experience with containerization technologies (Docker) and orchestration (Kubernetes).
  • Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
  • Knowledge of CI/CD practices and tools (e.g., Jenkins, GitLab CI, GitHub Actions) applied to data pipelines.
  • Experience with data modeling and database design (SQL and NoSQL).
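As a sketch of how orchestration might tie such jobs together (assuming Airflow 2.4+ with the Apache Spark provider installed and a configured "spark_default" connection), a daily DAG could submit the batch job via spark-submit. The DAG id, job path, and arguments are placeholders.

```python
# Minimal sketch only: an Airflow DAG that submits a (placeholder) PySpark
# batch job once a day. Assumes Airflow 2.4+ with the
# apache-airflow-providers-apache-spark package installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_events_batch",                 # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_batch = SparkSubmitOperator(
        task_id="run_events_batch",
        application="/opt/jobs/events_batch.py",  # placeholder job path
        conn_id="spark_default",
        application_args=["--run-date", "{{ ds }}"],
    )
```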

Moody’s is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law.

Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody’s Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary.

For more information on the Securities Trading Program, please refer to the STP Quick Reference guide on ComplianceNet.

Please note: STP categories are assigned by the hiring teams and are subject to change over the course of an employee’s tenure with Moody’s.