Kochi, Kerala, India
Information Technology
Full-Time
Velodata Global Pvt Ltd
Overview
Role-Technology: Lead Data Engineer
Mandatory Skills: AWS, Python, PySpark
Notice Period: Immediate to 15 days
Experience Range (years): 7+ Yrs
Location: Kerala local candidates only (Kochi/TVM)
Budget: 23 LPA
Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality through rigorous validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.
Key Responsibilities
- Data Ingestion Framework:
  - Design & Development: Architect, develop, and maintain an end-to-end data ingestion framework that collects data from diverse sources.
  - Framework Optimization: Use AWS services such as AWS Glue and Lambda to build scalable, automated data pipelines.
- Data Quality & Validation:
  - Validation Processes: Develop and implement automated data quality checks to safeguard the integrity of incoming data (see the sketch after this list).
  - Monitoring & Reporting: Establish comprehensive monitoring, logging, and reporting for the data pipelines.
- API Development:
  - Design & Implementation: Architect and develop secure, high-performance APIs that integrate data across external and internal systems.
  - Documentation & Best Practices: Create thorough API documentation and establish best practices for API development.
- Collaboration & Agile Practices:
  - Cross-Functional Communication: Work closely with business stakeholders and cross-functional teams to gather requirements and translate them into technical solutions.
  - Agile Development: Participate in sprint planning, code reviews, and agile ceremonies, and contribute to CI/CD pipeline development (using tools like GitLab).
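
For flavor, a minimal sketch of the kind of automated data quality check this role involves, assuming PySpark (a mandatory skill above); the dataset path and column names (order_id, amount) are hypothetical, not part of the actual stack.

    # Minimal data quality check sketch in PySpark.
    # The path and column names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Hypothetical batch landed by an upstream ingestion job.
    orders = spark.read.parquet("s3://example-bucket/landing/orders/")

    # Rule 1: the business key must be present and unique.
    keyed = orders.filter(F.col("order_id").isNotNull())
    null_keys = orders.count() - keyed.count()
    dup_keys = keyed.count() - keyed.select("order_id").distinct().count()

    # Rule 2: amounts must be non-negative.
    bad_amounts = orders.filter(F.col("amount") < 0).count()

    failures = {"null_keys": null_keys, "dup_keys": dup_keys, "bad_amounts": bad_amounts}
    if any(v > 0 for v in failures.values()):
        # In production this would feed the monitoring/alerting described above.
        raise ValueError(f"Data quality checks failed: {failures}")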
Required Qualifications
- Experience & Technical Skills:
  - Professional Background: At least 5 years of relevant experience in data engineering.
  - Programming Skills: Proficiency in Python and/or PySpark, plus SQL for data processing.
  - AWS Expertise: Extensive experience using AWS services, including AWS Glue and Lambda.
  - Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark) and data warehouses such as Amazon Redshift.
  - API Development: Proven experience in designing and implementing RESTful APIs (a minimal sketch follows this list).
  - CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with GitLab) and agile development practices.
- Soft Skills:
  - Strong problem-solving abilities and attention to detail.
  - Excellent communication and interpersonal skills, with the ability to work in cross-functional teams.
  - Capacity to quickly learn and adapt to new technologies and evolving business requirements.
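
Similarly, a minimal Python sketch of the RESTful data-integration work described above; FastAPI is an assumed framework choice (the posting does not name one), and the route and payload schema are hypothetical.

    # Minimal RESTful ingestion endpoint sketch.
    # FastAPI is an assumption; the route and schema are hypothetical.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="ingestion-api")

    class Record(BaseModel):
        source: str     # hypothetical originating system
        payload: dict   # hypothetical record body

    @app.post("/v1/records")
    def ingest(record: Record) -> dict:
        # A real implementation would validate the payload and hand it to
        # the ingestion pipeline; here we only acknowledge receipt.
        if not record.payload:
            raise HTTPException(status_code=400, detail="empty payload")
        return {"status": "accepted", "source": record.source}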
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- Experience with additional AWS services such as Kinesis, Firehose, and SQS.
- Familiarity with data lakehouse architectures and modern data quality frameworks.
- Prior experience in a role that required proactive data quality management and API-driven integration.
- Adherence to the Information Security Management policies and procedures.