Overview
Position: Data Engineer
Location: Bangalore, Mumbai (Hybrid)
Experience: 5-7 years
UK Shift: 2 PM to 10 PM IST
Tech Stack: Python, PySpark, Terraform/CloudFormation, SQL, Kafka/Kinesis, Airflow, AWS
About BinQle:
BinQle is a global delivery and execution partner supporting enterprises across the US, India, Singapore, and Australia. We specialize in structured hiring, managed teams, and long-term delivery support.
Role Summary:
We are looking for a hands-on senior Data Engineer to design, build, and maintain scalable data pipelines on AWS. You will own end-to-end data platform delivery, from ingestion through warehousing and data lakes. This is a high-visibility role requiring both technical depth and strong stakeholder communication.
What You'll Do:
- Build and optimize ETL pipelines using PySpark, AWS Glue, Lambda, and Airflow.
- Implement infrastructure as code using Terraform or CloudFormation.
- Work with streaming data (Kafka/Kinesis) and batch processing.
- Develop and maintain data warehouses/lakes with strong data modeling practices.
- Ensure data governance, GDPR compliance, security (CISO) requirements, and cost optimization.
- Mentor junior engineers and collaborate with data product managers and analytics teams.
- Drive CI/CD, automation, and agile delivery (Scrum/Kanban).
What We're Looking For (Must-Haves):
- 5+ years of data engineering experience with strong AWS cloud expertise.
- Proven experience in large-scale data transformation and modern data architectures (Lakehouse, Data Mesh).
- Excellent communication skills: able to explain complex solutions to both technical and non-technical stakeholders.
- Experience working in UK shifts or global delivery models.