Overview
Job Role: Data Engineer with Python and AWS
Experience: 5+ years
Location: 100% Remote - South Indian candidates only
Job Type: Contract
Duration: 6 Months
Budget: Up to 90K per month
Immediate joiners needed
Required Qualifications
* Experience & Technical Skills:
o Professional Background: At least 5 years of relevant experience in data
engineering with a strong emphasis on analytical platform development.
o Programming Skills: Proficiency in Python and/or PySpark and SQL for
developing ETL processes and handling large-scale data manipulation.
o AWS Expertise: Extensive experience using AWS services including AWS Glue,
Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
o Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache
Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or
Redshift.
o API Development: Proven experience in designing and implementing RESTful
APIs and integrating them with external and internal systems.
o CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with
GitLab) and Agile development methodologies.
* Soft Skills:
o Strong problem-solving abilities and attention to detail.
o Excellent communication and interpersonal skills with the ability to work
independently and collaboratively.
o Capacity to quickly learn and adapt to new technologies and evolving business
requirements.
Preferred Qualifications
* Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
* Experience with additional AWS services such as Kinesis, Firehose, and SQS.
* Familiarity with data lakehouse architectures and modern data quality frameworks.
* Prior experience in a role requiring proactive data quality management and
API-driven integrations in complex, multi-cluster environments.
* Willingness to adhere to Information Security Management policies and procedures.