Bangalore, Karnataka, India
Information Technology
Full-Time
Bajaj Finserv
Overview
Location Name: Pune Corporate Office - Mantri
Job Purpose
To design, develop, and manage data solutions using ETL technologies such as Azure Databricks (ADB) and Azure Data Factory (ADF), work with NoSQL databases such as Cosmos DB, and apply object-oriented programming (OOP) principles in C#/.NET to build scalable data integration and backend services.
Duties And Responsibilities
- Databricks Development: Build and manage Databricks notebooks using SQL and PySpark within a reusable framework (see the sketch after this list).
- ETL Development: Design and maintain data integration pipelines in Azure Data Factory.
- Database Proficiency: Strong knowledge of SQL and experience with relational databases like SQL Server, MySQL, etc.
- API/Data Exchange Layer: Develop and manage backend services in C#/.NET, hosted on Azure App Service.
- NoSQL Expertise: Work with Cosmos DB or MongoDB (documents & collections), including concepts like indexing, partitioning, change feed, throttling, etc.
- CI/CD & Release Management: Utilize DevOps pipelines for code versioning, automated testing, and release.
- Time-Series Data Handling: Leverage Azure Data Explorer (ADX) for time-series data, with knowledge of materialized views, sharding, and caching policies.
- Real-Time Streaming (Nice to Have): Basic understanding of Azure Event Hub, including batching, offsets/checkpoints, payload handling, and throttling.
- Cloud Platform Familiarity (Preferred): Exposure to Azure cloud services and architecture best practices.
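As an illustration of the Databricks work described above, here is a minimal PySpark sketch of a reusable ingestion function of the kind such a framework might contain; the `spark` session is assumed to be the one a Databricks notebook provides, and all table, column, and path names are hypothetical rather than taken from this posting.

```python
# Minimal sketch, assuming a Databricks notebook where `spark` is the
# pre-initialised SparkSession; table, column, and path names are hypothetical.
from pyspark.sql import DataFrame, functions as F

def ingest_raw_csv(source_path: str, target_table: str) -> DataFrame:
    """Read a raw CSV drop, stamp it with a load date, and append it to a Delta table."""
    df = (
        spark.read
        .option("header", "true")
        .csv(source_path)
        .withColumn("load_date", F.current_date())
    )
    # Persist as a managed Delta table so downstream SQL cells can query it.
    df.write.format("delta").mode("append").saveAsTable(target_table)
    return df

# Example call from another notebook cell (placeholders only):
# ingest_raw_csv("/mnt/raw/loans/2024-01-01/", "bronze.daily_loans")
```

In practice, a reusable framework would typically keep functions like this in a shared module or common notebook that individual pipelines import and parameterise.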
Key Responsibilities –
- Translate business requirements into technical solutions in collaboration with the PMO team.
- Own end-to-end delivery of data projects, ensuring on-time execution and adherence to quality standards.
- Design technical architecture and guide development efforts for enhancements and new projects.
- Develop and maintain robust ETL pipelines and data integration modules across systems.
- Ensure high data quality, platform stability, and resolution of critical process issues.
- Monitor and resolve performance bottlenecks in data workflows and programs.
- Establish best practices, standard operating procedures, and drive their implementation across teams.
- Act as a liaison with business users and product managers to support daily data needs and strategic initiatives.
- Coordinate with internal and external development teams to troubleshoot and resolve issues efficiently.
- Manage workload through effective planning, prioritization, and progress tracking.
- Make critical decisions during production issues, including root cause analysis, quick fixes, and long-term resolutions.
- Prioritize and escalate support tasks effectively to minimize downtime and business impact.
- Drive decisions on technical design, architecture, and optimization to ensure performance, scalability, and maintainability of solutions.
- Balance development effort against on-time delivery in a dynamic and evolving technical environment.
- Coordinate with multiple internal and external stakeholders to align priorities, resolve dependencies, and ensure smooth execution.
- Ensure code quality, data integrity, and performance while scaling data solutions across diverse systems and platforms.
Qualifications
- Bachelor's degree in Computer Engineering, Computer Science, Information Technology, or a related discipline.
- A Master's degree in a related field is an advantage.
- Professional certifications such as Azure Certified Solutions Architect, Google Professional Cloud Architect, TOGAF, or DAMA Certified Data Management Professional (CDMP) are desirable.
Work Experience
- 6+ years of experience in data architecture, solution design, and/or enterprise architecture.
- 2-3 years of experience managing technical project deliveries or acting in a delivery lead capacity.
- Strong expertise in data modeling (conceptual, logical, physical), data integration, and ETL design.
- Proficiency with cloud-based data services (AWS, Azure, GCP) and modern data architectures (data lake, lakehouse, data mesh).
- Deep understanding of relational databases, NoSQL, big data platforms (Hadoop, Spark), and real-time data processing.
- Experience with Agile methodologies and project management frameworks (Scrum, Kanban).
- Strong leadership, stakeholder management, and problem-solving skills.
Skills Keywords
- Azure Databricks – PySpark, SQL - Must Have
- Azure Data Factory – for ETL & data integration - Must Have
- OOP concepts implementation in C#/.NET
- Azure Cosmos DB as the NoSQL database
- Event Hub & Kafka for change feed & real-time streaming - Good to Have
- Azure Data Explorer as a time-series database with Kusto Query Language (KQL) - Good to Have (see the sketch below)
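As a rough illustration of the last keyword, the sketch below runs a time-series KQL query against Azure Data Explorer from Python using the azure-kusto-data package; the cluster URL, database, table, and column names are placeholders, not details from this posting.

```python
# Minimal sketch, assuming the azure-kusto-data package is installed and the
# caller is already signed in with `az login`; all names below are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER = "https://<your-cluster>.kusto.windows.net"   # hypothetical cluster URL
DATABASE = "telemetry"                                 # hypothetical database

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER)
client = KustoClient(kcsb)

# KQL: bucket a time-series metric into 5-minute bins over the last day.
query = """
DeviceReadings
| where Timestamp > ago(1d)
| summarize AvgValue = avg(Value) by bin(Timestamp, 5m), DeviceId
| order by Timestamp asc
"""

response = client.execute(DATABASE, query)
for row in response.primary_results[0]:
    print(row["DeviceId"], row["Timestamp"], row["AvgValue"])
```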