Full-Time
Velodata Global Pvt Ltd
Overview
Position: Data Engineer
Experience: 3+ years
Location: Trivandrum (Hybrid)
Salary: Up to 8 LPA
Job Summary
We are seeking a highly motivated and skilled Data Engineer with 3+ years of experience to
join our growing data team. In this role, you will be instrumental in designing, building, and
maintaining robust, scalable, and efficient data pipelines and infrastructure. You will work closely
with data scientists, analysts, and other engineering teams to ensure data availability, quality,
and accessibility for various analytical and machine learning initiatives.
Key Responsibilities
and load data from diverse sources into data warehouses/lakes.
○ Implement data models and schemas that support analytical and reporting
requirements.
○ Build and maintain robust data APIs for data consumption by various applications
and services.
services (AWS, Azure, GCP) or on-premise solutions.
○ Ensure data security, privacy, and compliance with relevant regulations.
○ Monitor data pipelines for performance, reliability, and data quality, implementing
alerting and anomaly detection.
understand data requirements and translate them into technical solutions.
○ Optimize existing data processes for efficiency, cost-effectiveness, and
performance.
○ Participate in code reviews, contribute to documentation, and uphold best
practices in data engineering.
consumers.
○ Provide support and expertise to teams consuming data from the data platform.
Required Qualifications
Experience- 3+ years
Location : Trivandrum, Hybrid
Salary : Upto 8 LPA
Job Summary
We are seeking a highly motivated and skilled Data Engineer with 3+ years of experience to
join our growing data team. In this role, you will be instrumental in designing, building, and
maintaining robust, scalable, and efficient data pipelines and infrastructure. You will work closely
with data scientists, analysts, and other engineering teams to ensure data availability, quality,
and accessibility for various analytical and machine learning initiatives.
Key Responsibilities
- Design and Development:
○ Design, develop, and maintain scalable ETL/ELT pipelines to extract, transform, and load data from diverse sources into data warehouses/lakes.
○ Implement data models and schemas that support analytical and reporting
requirements.
○ Build and maintain robust data APIs for data consumption by various applications
and services.
- Data Infrastructure:
○ Build and manage data infrastructure using cloud services (AWS, Azure, GCP) or on-premise solutions.
○ Ensure data security, privacy, and compliance with relevant regulations.
○ Monitor data pipelines for performance, reliability, and data quality, implementing
alerting and anomaly detection.
- Collaboration & Optimization:
○ Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
○ Optimize existing data processes for efficiency, cost-effectiveness, and
performance.
○ Participate in code reviews, contribute to documentation, and uphold best
practices in data engineering.
- Troubleshooting & Support:
○ Diagnose and resolve data pipeline issues, ensuring minimal disruption to data consumers.
○ Provide support and expertise to teams consuming data from the data platform.
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related quantitative field.
- 3+ years of hands-on experience as a Data Engineer or in a similar role.
- Strong proficiency in at least one programming language commonly used for data engineering.
- Extensive experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Proven experience with ETL/ELT tools and concepts.
- Experience with data warehousing concepts and technologies (e.g., Snowflake).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
- Understanding of data modeling techniques (e.g., dimensional modeling, Kimball).
- Experience with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
Preferred Qualifications
- Master's degree in a relevant field.
- Experience with Apache Spark (PySpark, Scala Spark) or other big data processing frameworks.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data streaming technologies (e.g., Kafka, Kinesis).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Experience with workflow orchestration tools (e.g., Apache Airflow, Azure Data Factory).
- Understanding of DevOps principles as applied to data pipelines.
- Prior experience in Telecom is a plus.