Chennai, Tamil Nadu, India
Space Exploration & Research, Information Technology
Full-Time
D Square Consulting Services Pvt Ltd
Overview
Skills:
Python, Cloud Platforms, Data Pipelines, Version Control, ETL, API Development, Docker, Kubernetes
JD - Data Engineer
This is a full-time position with D Square Consulting Services Pvt Ltd
Experience: 5+ years
Office visits: Required once a month
Location: Multiple locations (Hybrid)
Notice Period: Immediate joiners
Job Summary
We are seeking a skilled and experienced Data Engineer with strong Python expertise, API development experience, and a deep understanding of containerised, CI/CD-driven workflows. You will play a key role in designing, building, and scaling data pipelines and backend services that support our analytics and business intelligence platforms. This is a hands-on engineering role that requires a strong technical foundation and a collaborative mindset. We are hiring across multiple locations, including Chennai, Mumbai, Bangalore, Pune, Gurgaon, and Hyderabad.
Key Responsibilities
- Design, implement, and optimise robust, scalable data pipelines and ETL workflows using modern Python tools and libraries.
- Build and maintain production-grade RESTful and/or GraphQL APIs to serve data to internal and external stakeholders (see the illustrative sketch after this list).
- Collaborate with Data Analysts, Scientists, and Engineering teams to enable end-to-end data solutions.
- Containerize data services using Docker and manage deployments within Kubernetes environments.
- Develop and maintain CI/CD pipelines using GitHub Actions to automate testing, data validations, and deployment processes.
- Ensure code quality through rigorous unit testing, type annotations, and adherence to Python best practices.
- Participate in architecture reviews, design discussions, and code reviews in an agile development process.
- Proactively identify opportunities to optimize data access, transformation, and governance.
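As a rough illustration of the API-serving work described above, the sketch below shows a minimal typed FastAPI endpoint exposing curated pipeline output. It is only a hedged example: the service name, response model, and data are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal, illustrative sketch only. The service, model, and data names
# are hypothetical and not part of this job description.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Example data service")


class SalesRecord(BaseModel):
    # Typed response model, in line with the typing and code-quality
    # expectations listed above.
    region: str
    period: str
    revenue: float


# Hypothetical in-memory stand-in for a warehouse or data-lake query layer.
_FAKE_WAREHOUSE: dict[str, list[SalesRecord]] = {
    "apac": [SalesRecord(region="apac", period="2024-Q1", revenue=1_250_000.0)],
}


@app.get("/sales/{region}", response_model=list[SalesRecord])
def get_sales(region: str) -> list[SalesRecord]:
    """Serve curated pipeline output to internal or external consumers."""
    records = _FAKE_WAREHOUSE.get(region.lower())
    if records is None:
        raise HTTPException(status_code=404, detail=f"Unknown region: {region}")
    return records
```

A service like this would typically be containerised with Docker and rolled out through the GitHub Actions-driven CI/CD flow described in the responsibilities above.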
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 5+ years of hands-on experience in data engineering or backend development roles.
- Expert-level Python skills, with a strong understanding of idiomatic patterns, async programming, and typing.
- Proven experience in building production-grade RESTful or GraphQL APIs using frameworks like FastAPI, Graphene, or Strawberry.
- Hands-on experience with Docker, container-based workflows, and CI/CD automation using GitHub Actions.
- Experience working with Kubernetes for orchestrating deployments in production environments.
- Proficient with SQL and data modelling; familiarity with ETL tools, data lakes, or warehousing concepts is a plus.
- Strong communicator with a proactive and self-driven approach to problem-solving and collaboration.
- Familiarity with data orchestration tools (e.g., Airflow, Prefect); a minimal orchestration sketch follows this list.
- Experience with streaming data platforms like Kafka or Spark.
- Knowledge of data governance, security, and observability best practices.
- Exposure to cloud platforms like AWS, GCP, or Azure.
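To illustrate the orchestration item above, here is a minimal sketch of a daily ETL DAG, assuming Airflow 2.x. The DAG id, task names, and callables are hypothetical placeholders rather than anything specific to this position.

```python
# Minimal, illustrative sketch only, assuming Airflow 2.x; the DAG id and
# callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Pull raw records from a source system (placeholder).
    print("extracting raw data")


def transform() -> None:
    # Clean and reshape the extracted data (placeholder).
    print("transforming data")


def load() -> None:
    # Write the transformed data to the warehouse (placeholder).
    print("loading data into the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

The same extract/transform/load structure maps onto the pipeline and ETL responsibilities listed earlier in this posting.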