Chennai, Tamil Nadu, India
Information Technology
Full-Time
Bajaj Finserv
Overview
Location Name: Pune Corporate Office - Mantri
Job Purpose
Own delivery governance and solution architecture for the Enterprise Data Platform. Define and oversee end-to-end data solutions—including batch and real-time pipelines—using SQL, Python, and PySpark, with a strong focus on streaming, Change Data Capture (CDC), and database mirroring. Translate business needs (including Loan Management System use cases) into scalable, secure, and cost-effective data architectures and integration patterns.
Duties And Responsibilities
- Define target-state and incremental roadmaps for data platforms and domain solutions; produce HLD/LLD, logical/physical models, and interface contracts.
- Lead PMO activities: scope planning, backlog management, sprint planning, risk/issue/decision logs, dependencies, release planning, and stakeholder reporting.
- Architect real-time and batch data ingestion using SQL, Python, and PySpark; design streaming jobs (e.g., Structured Streaming) with CDC and database mirroring for near-real-time use cases.
- Create Data Flow Diagrams (DFDs) and end-to-end data architectures involving multiple publishers/consumers, defining SLAs/SLOs and data contracts.
- Drive integration patterns across systems (APIs, event streaming, file-based, message queues), including schema evolution and idempotency strategies.
- Establish data quality, observability, and lineage standards (validation, alerts, runbooks) across environments.
- Ensure security, privacy, and compliance (access controls, encryption, PII handling) aligned with enterprise policies.
- Coordinate with internal teams (BI/Analytics, App Dev, Infra/Security, DevOps) and external vendors/partners for delivery and support.
- Review estimates, monitor burn/cost, track delivery to milestones; drive retrospectives and continuous improvement.
- Support UAT, cutover, and hypercare; perform RCA for incidents and implement preventive actions.
- Maintain high-quality documentation (architecture packs, runbooks, decision records) and conduct knowledge-sharing sessions.
- Select architecture patterns (batch vs. streaming), CDC approach, and storage/format strategies to meet SLAs and cost goals.
- Approve interface designs, data contracts, and non?functional requirements (performance, reliability, security).
- Prioritize scope and sequence of increments/releases based on value, risk, and dependencies.
- Recommend tooling for orchestration, testing, observability, and CI/CD.
- Balance architecture quality with delivery timelines across parallel initiatives.
- Manage integration complexity with Loan Management Systems and upstream/downstream platforms.
- Maintain reliability, data quality, and low latency for streaming/CDC workloads at scale.
- Handle evolving business requirements and schema drift across diverse sources.
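To make the CDC and idempotency responsibilities above concrete, here is a minimal, library-free sketch of applying change events to a target table so that replayed batches are harmless. The `ChangeEvent` shape, the `apply_cdc` function, and the use of a log sequence number (LSN) as the dedupe key are illustrative assumptions, not part of any specific platform or product named in this role:

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class ChangeEvent:
    """One CDC record: a key, an operation, an optional payload, and a
    monotonically increasing log sequence number (LSN). (Illustrative shape.)"""
    key: str
    op: str              # "insert" | "update" | "delete"
    payload: dict | None
    lsn: int

def apply_cdc(target: dict, applied_lsns: set, events: list[ChangeEvent]) -> None:
    """Apply change events in LSN order to an in-memory target table,
    skipping any event whose LSN was already applied so replays are no-ops."""
    for ev in sorted(events, key=lambda e: e.lsn):
        if ev.lsn in applied_lsns:   # idempotency: ignore replayed events
            continue
        if ev.op in ("insert", "update"):
            target[ev.key] = ev.payload
        elif ev.op == "delete":
            target.pop(ev.key, None)
        applied_lsns.add(ev.lsn)

table, seen = {}, set()
batch = [
    ChangeEvent("loan-1", "insert", {"status": "active"}, 1),
    ChangeEvent("loan-1", "update", {"status": "closed"}, 2),
    ChangeEvent("loan-2", "insert", {"status": "active"}, 3),
    ChangeEvent("loan-2", "delete", None, 4),
]
apply_cdc(table, seen, batch)
apply_cdc(table, seen, batch)  # replaying the same batch changes nothing
```

In a production pipeline the same pattern would typically be expressed as a merge/upsert in a streaming sink rather than an in-memory dict, but the dedupe-by-offset logic is the same.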
Required Qualifications and Experience
- Graduate or Post-Graduate degree in Computer Science, Information Technology, or Data Science/Technologies.
- 3–4 years of hands-on experience in data engineering, solution architecture, or PMO roles within data platforms.
- SQL, Python, PySpark
- Data streaming, Change Data Capture (CDC), Database Mirroring
- Integration patterns (APIs, events/queues, file), schema evolution, and idempotency
- Version control and DevOps pipelines (e.g., Git/GitHub, Azure DevOps)
- Preferred: Azure Databricks, Azure Data Factory, Data Lake Storage; data modeling and performance optimization
- Understanding of Loan Management Systems (LMS) in banking/NBFC context
- Strong grasp of system?level integrations across enterprise platforms
- Ability to create DFDs and end?to?end data architecture involving multiple publishers and consumers
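The schema-evolution and schema-drift skills listed above can also be sketched in miniature. This "tolerant reader" projection—default missing fields, drop unexpected ones—is one common pattern; the `conform` function and the example schema are illustrative assumptions, not a required approach:

```python
def conform(record: dict, schema: dict) -> dict:
    """Project an incoming record onto a target schema: fields missing
    from the record take the schema's default, and fields not declared
    in the schema are dropped. Tolerates drift across diverse sources."""
    return {field: record.get(field, default) for field, default in schema.items()}

# Hypothetical loan-record schema with per-field defaults.
loan_schema = {"loan_id": None, "amount": 0.0, "currency": "INR"}

# An upstream source added a field and omitted another; output still conforms.
conformed = conform({"loan_id": "L1", "amount": 5.0, "branch": "X"}, loan_schema)
```

Formal schema-registry tooling (e.g., Avro-style compatibility checks) generalizes this idea with versioned schemas and enforced compatibility rules.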