Overview
Role: AWS Data Engineer
Location: Gurgaon
Type: Permanent
Mode: Hybrid
Job Description:
Responsibilities and Duties:
- 7-10 years of strong development experience implementing ETL and/or data pipelines.
- Expert-level programming skills, preferably in Python.
- Expert in delivering end-to-end analytics solutions using AWS services (EMR, Airflow, S3, Athena, Kinesis, Redshift).
- Experience with batch technologies such as Hadoop, Hive, Athena, and Presto.
- Strong SQL skills, including query optimization, schema design, and complex analytics.
- Expert in data modeling and metadata management, e.g. with the AWS Glue Data Catalog.
- Experience with deployment tools such as GitHub Actions, Jenkins, and AWS CodePipeline.
- Experience with data quality tools such as Deequ or Great Expectations is a nice-to-have.
Job Type: Permanent
Pay: ₹3,000,000.00 - ₹4,000,000.00 per year
Schedule:
- Day shift
- Monday to Friday
Application Question(s):
- Please update your CV to reflect the skills required for the screening questions. If you have experience with these technologies, they must be clearly listed in your CV; without them, we will not be able to consider your profile for this role.
Experience:
- Python: 3 years (Required)
- Total: 7 years (Required)
- PySpark: 3 years (Required)
- SQL / Queries: 4 years (Required)
- AWS Elastic MapReduce (EMR): 4 years (Required)
- Amazon Managed Workflows for Apache Airflow (MWAA): 3 years (Required)
- AWS CDK, CloudFormation, Lambda, Step Functions: 3 years (Required)
- Athena: 3 years (Required)
- AWS Glue Catalog: 3 years (Required)
- Data Engineer: 6 years (Required)
Work Location: In person