Mumbai, Maharashtra, India
Information Technology
Full-Time
Sigmoid
Overview
Responsibilities
- Design and develop scalable data solutions using Python, PySpark, and SQL.
- Work on Big Data stack (Hadoop, Spark, HBase, ElasticSearch, Databricks, AWS).
- Implement APIs, integrations & automation for large-scale distributed systems.
- Collaborate with clients (including Fortune 500s) and internal teams on data engineering initiatives.
- Ensure best practices: clean code, unit testing, deployments & optimization.
Requirements
- Strong hands-on skills in PySpark, AWS, SQL, and Databricks.
- Solid understanding of DSA (Data Structures & Algorithms).
- Proven experience with Big Data technologies (Hadoop, Spark, etc.).
- Strong problem-solving & analytical mindset.
- Excellent communication & teamwork skills.
- Self-starter with the ability to work in a fast-paced, high-growth environment.
This job was posted by Shrishti Gupta from Sigmoid.