
Overview
About Lowe’s
Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.
Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.
About the Team
This highly motivated team builds scalable data platforms, scorecards, and AI/ML-driven insights to empower Distribution Centers (DCs) with data-informed decision-making. The team delivers end-to-end solutions to forecast labor and transportation needs, optimize inventory flow, and improve operational efficiency across the supply chain network. By integrating real-time data, advanced analytics, and machine learning, the team enables proactive planning and performance visibility to meet evolving business demands.
Job Summary:
We are seeking a highly experienced Lead Data Engineer to drive innovation and deliver scalable, secure, and maintainable data solutions across the HR technology landscape. This role blends technical expertise, leadership, and cross-functional collaboration to architect and build cloud-native data pipelines and platforms.
Roles & Responsibilities:
Core Responsibilities:
Strategic Engineering & Leadership
Lead the development and delivery of integrated business and enterprise data solutions using technologies such as Google Cloud Platform (GCP), BigQuery, Dataproc, and Apache Airflow
Resolve complex cross-application issues and lead initiatives with significant business impact across HR portfolios
Contribute to project planning, prioritization, and execution with minimal guidance, ensuring adherence to architectural and security standards
Mentor and guide team members across all phases of the software development lifecycle
Educate peers and teams on architectural best practices, standards, and reusable design patterns
Champion improvements in engineering, testing, and operational excellence
Communicate architectural changes clearly and effectively while coaching teams on implementation
Data Engineering & Platform Development
Architect and develop reusable frameworks for cloud-native data ingestion, curation, and pipeline orchestration using GCP-native services like BigQuery, Dataproc, and Airflow
Integrate and process structured and unstructured data from diverse sources such as DBMS, file systems, APIs, and real-time streams
Build robust and scalable data lakes supporting cross-functional HR domains, while ensuring high data quality, performance, and observability
Maintain and monitor platform availability and health; collaborate with infrastructure and DevOps teams for platform optimization and issue resolution
Model Hosting & MLOps Collaboration
Partner with Data Science teams to host models, enabling scalable inference pipelines with strong non-functional design (e.g., logging, authentication, error handling, concurrency)
Translate advanced statistical models into production-grade, scalable data flows
Contribute to ML model deployment pipelines and performance tuning using Vertex AI and other MLOps tools
Governance & Compliance
Ensure data security and compliance with enterprise governance, industry standards, and internal policies
Implement secure design principles and reusable patterns that align with corporate risk and compliance mandates
Years of Experience:
7 years of experience in Data, BI, or Platform Engineering, Data Warehousing/ETL, or Software Engineering
6 years of experience on projects implementing solutions using software development life cycle (SDLC) methodologies
Education Qualification & Certifications (optional)
Required Minimum Qualifications:
Bachelor's degree in Engineering, Computer Science, CIS, or a related field
Skill Set Required
Cloud & Data Platform: GCP, BigQuery, Dataproc, Cloud Storage, Pub/Sub, Composer (Airflow), Dataflow
BI & Visualization: Looker, Looker Studio Pro, LookML, semantic data modeling
Orchestration & Pipelines: Apache Airflow, Terraform, CI/CD pipelines, Git
Programming & Tools: Python, SQL, REST APIs, JSON, YAML, Linux
AI/ML Integration: Vertex AI, agentic AI systems, model hosting via REST, MLOps practices
Data Integration: ETL/ELT, ingestion from DBMS, APIs, files, and streaming data
Security & Compliance: IAM, encryption, governance frameworks
Secondary Skills (desired)
Experience with Vertex AI, Agentic AI frameworks, or other advanced AI/LLM orchestration tools
Hands-on experience in Machine Learning, Data Science pipelines, or automated decision systems
Familiarity with DevSecOps and automated monitoring frameworks
Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law.