Overview
Job Title: Sr. Data Engineer
Location: L&T Finance, Mahape, Navi Mumbai, India
Experience Level: 7-9 Years
WHO WE ARE:
L&T Finance is one of India’s leading Non-Banking Financial Companies (NBFCs), known for
its innovation-driven lending solutions across retail, rural, and infrastructure finance. With a
strong commitment to digital transformation and data-led decision making, we offer a dynamic
workplace where your contributions shape the financial future of millions. Join us to be a part of
an organization that values growth, integrity, and impact.
PROFESSIONAL SUMMARY:
We are seeking an accomplished Senior Data Engineer with 7-9 years of deep experience in
designing, developing, and deploying enterprise-grade data solutions, particularly within the
BFSI/NBFC sector or other highly regulated, data-intensive industries. The ideal candidate
demonstrates expert proficiency in managing the entire data lifecycle, from ingestion and
transformation to storage and consumption, while ensuring data quality, security, and compliance.
They have a proven track record of leading complex data projects, establishing engineering best
practices, mentoring high-performing teams, and serving as a subject matter expert for data
infrastructure and pipeline challenges, and are adept at combining strategic planning with
technical execution to meet evolving business intelligence and advanced analytics needs.
RESPONSIBILITIES:
● Advanced Data Pipeline Development: Design, develop, and optimize complex
ETL/ELT processes and real-time data streaming solutions to ingest, cleanse, transform,
and deliver large-scale datasets from diverse on-premise and cloud sources into data
warehouses, data lakes, and analytical platforms.
● Performance Engineering: Proactively identify and resolve complex data-related
performance bottlenecks and scalability challenges, ensuring optimal processing, storage,
and retrieval for high-volume analytical and operational use cases.
● Technical & Team Mentoring: Provide strong technical leadership, guidance, and
mentorship to a team of data engineers. Foster a culture of excellence, best practices,
code quality, and continuous learning through expert code reviews and technical
guidance.
● Cross-functional Collaboration: Serve as a primary technical liaison, partnering closely
with business stakeholders, data scientists, product owners, and other engineering teams
to translate complex business requirements into robust and efficient data engineering
solutions.
● Cloud Data Platform Mastery: Leverage, optimize, and manage cloud-native data
services (preferably GCP) for comprehensive data storage, processing, analytics, and
machine learning workflows.
● Automation, Monitoring & Resilience: Design and implement advanced automation for
data pipeline deployment, testing, monitoring, and alerting, focusing on resilience,
disaster recovery, and proactive issue resolution.
● Technical Documentation & Standards: Establish and maintain comprehensive
technical documentation for all data architectures, pipelines, data models, and
engineering standards.
● Innovation & Evangelism: Continuously research, evaluate, and recommend cutting-
edge data technologies, tools, and methodologies to enhance our data capabilities and
drive competitive advantage. Champion data-driven decision-making across the
organization.
TECHNICAL SKILLS:
● Core Data Engineering: Expert-level hands-on experience with large-scale data
warehousing (e.g., Google BigQuery, Snowflake), data lakes (e.g., GCP-based lakes, S3,
ADLS), and advanced distributed processing frameworks (e.g., Apache Spark, Apache
Kafka).
● Programming Languages: Expert proficiency in Python for complex data engineering,
scripting, automation, and API integration. Strong experience building big data
applications in Python or Java.
● Database & Querying: Exceptional SQL proficiency, including performance tuning,
complex query optimization, and advanced data modeling (relational, dimensional,
NoSQL). Experience with database administration concepts.
● ETL/Orchestration Tools: Deep hands-on experience with enterprise-grade data
orchestration tools (e.g., Apache Airflow, GCP Cloud Dataflow, Dataproc, Talend, Ab
Initio).
● Cloud Platforms: Strong and proven experience with GCP (Google Cloud Platform)
data services (e.g., BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Composer).
Experience with AWS or Azure data ecosystems is a plus.
● APIs & Integrations: Extensive experience designing and implementing robust data
ingestion and integration solutions using REST APIs and various data connectors.
● Version Control & CI/CD: Mastery of Git, advanced branching strategies, and
comprehensive CI/CD pipelines for automated deployment and testing of data solutions.
COLLABORATION & COMMUNICATION:
● Exceptional ability to collaborate, influence, and build strong relationships with cross-
functional teams, including data mart, business intelligence, analytics, and data science teams.
● Superior documentation and requirement gathering skills, translating intricate business
needs into technical specifications and architectural designs.
● Highly skilled in stakeholder management, conflict resolution, and articulating complex
technical strategies and data value propositions to both technical and non-technical
audiences.
● Demonstrated experience in driving the adoption of advanced data engineering practices
and solutions across large enterprise functions.
PERSONALITY TRAITS & LEADERSHIP:
● Possesses extreme attention to detail and an unwavering commitment to data
integrity, accuracy, and operational reliability.
● Highly self-driven, proactive, and capable of taking complete ownership and leading
complex data engineering initiatives independently from strategic planning to successful
execution.
● Strategic and process-oriented thinker with a disciplined approach to complex problem-
solving, architectural design, and system optimization.
● A strong influencer and thought leader with the ability to inspire, mentor, and promote a
culture of data discipline, innovation, and engineering excellence across teams.
● Demonstrates unimpeachable professional integrity, especially when handling sensitive
financial and customer data.
● Highly adaptable to rapid change and thrives in a demanding, fast-paced NBFC
environment with evolving business priorities.
QUALIFICATIONS:
● BE/B.Tech and/or M.Tech in Computer Science, Information Technology, or a related
engineering discipline.
● 7-9 years of progressive industry experience in data engineering, with a significant
portion (at least 2-3 years) in a lead or senior data engineer capacity.
● Exceptional problem-solving and advanced analytical skills, with a proven ability to
debug, diagnose, and resolve complex data and system issues.
● Outstanding communication skills, both written and verbal, with the ability to present
complex technical information clearly and concisely.
● Demonstrated strong leadership capabilities, including team building, mentoring, and
leading large-scale technical projects.