Nashik, Maharashtra, India
Information Technology
Full-Time
EverestDX Inc
Description
We are seeking a highly skilled Lead Data Engineer to design, build, and scale robust data platforms and pipelines that empower business decisions and analytics.
This role combines deep technical expertise with strategic leadership, guiding a team of data engineers to deliver scalable, reliable, and high-performing data solutions.
The ideal candidate will have strong experience in cloud data ecosystems, data modeling, data quality frameworks, and cross-functional collaboration.
Key Responsibilities
Data Architecture & Engineering:
- Design and implement scalable, high-performance data pipelines for ingestion, transformation, and processing of structured and unstructured data.
- Architect data lake, data warehouse, and real-time streaming solutions using modern cloud platforms (AWS, Azure, or GCP).
- Develop and maintain ETL/ELT processes ensuring data accuracy, consistency, and integrity across systems.
- Build and optimize data models (OLAP and OLTP) to support business intelligence, analytics, and reporting.
- Evaluate and integrate new data management tools and frameworks to enhance data engineering capabilities.
- Lead and mentor a team of data engineers, review code, and enforce engineering best practices.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Establish and enforce data governance, quality, and security standards in alignment with enterprise policies.
- Partner with DevOps teams to ensure CI/CD automation, testing, and deployment of data pipelines.
- Monitor data pipeline performance and proactively resolve bottlenecks or failures.
- Optimize data workflows for cost efficiency, scalability, and processing speed.
- Drive adoption of observability and monitoring frameworks (e.g., Datadog, Prometheus, or custom dashboards).
Required Skills & Qualifications:
- 8+ years of professional experience in data engineering, with at least 2–3 years in a lead or senior capacity.
- Strong proficiency in Python, SQL, and one or more data pipeline orchestration tools (Airflow, Dagster, Prefect, etc.).
- Deep understanding of data warehousing concepts (e.g., Snowflake, Redshift, BigQuery, Synapse).
- Experience with streaming technologies such as Kafka, Spark Streaming, or Flink.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP), particularly data services such as S3, Glue, Dataflow, or Databricks.
- Expertise in schema design, data modeling, and performance tuning.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Strong understanding of CI/CD principles, version control (Git), and infrastructure-as-code (Terraform, CloudFormation).
- Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC 2).
- Experience leading large-scale data platform modernization or migration projects.
- Exposure to machine learning pipeline orchestration and MLOps.
- Certification in cloud data engineering (AWS Certified Data Analytics, GCP Data Engineer, etc.).
- Familiarity with modern data stack tools (dbt, Fivetran, Looker, Power BI).
- Strong communication skills and the ability to collaborate in cross-functional agile environments.