Vellore, Tamil Nadu, India
Information Technology
Full-Time
Unic Sol India Pvt Ltd
Overview
Job Title : Big Data Engineer
Experience : 6+ Years
Location : Bangalore
Employment Type : Full-Time
Job Description
We are looking for a skilled and experienced Big Data Engineer with a strong background in Python, the Hadoop ecosystem, and cloud technologies to join our data engineering team in Bangalore. The ideal candidate will be responsible for building scalable data pipelines, managing data lakes, and optimizing large-scale data processing frameworks.
Key Responsibilities
- Design, develop, and maintain scalable and efficient big data pipelines using Spark and PySpark (an illustrative sketch follows this list).
- Work with large datasets using Hadoop, Hive, and HDFS.
- Develop data ingestion frameworks using Python and Spark to integrate structured and unstructured data from multiple sources.
- Implement data transformation, data validation, and data quality frameworks.
- Optimize performance of big data processing applications.
- Work on cloud platforms (AWS) and manage data storage using S3.
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders.
- Ensure best practices in data security, governance, and compliance are followed.
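As a rough illustration of the pipeline and ingestion work described above, the following minimal PySpark sketch reads structured data from S3, applies a simple validation and transformation step, and writes partitioned output back to S3. The bucket, paths, and column names are hypothetical placeholders, not part of this posting.

# Illustrative sketch only; bucket, paths, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-events-pipeline").getOrCreate()

# Ingest raw CSV data from S3 (hypothetical path).
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/events/")

# Basic data-quality and transformation step: drop rows missing the key
# field, cast types, and derive a partition column.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write curated, partitioned Parquet output back to S3.
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)

spark.stop()

A job like this would typically be packaged and submitted to an EMR cluster with spark-submit; the exact layout depends on the team's standards.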
Required Skills and Qualifications
- Strong programming skills in Python and Scala.
- Hands-on experience with Hadoop, Hive, and Spark.
- Experience with PySpark and distributed computing.
- Proficient in working with AWS cloud services, especially S3 & EMR.
- Strong knowledge of data structures, algorithms, and problem-solving.
- Good understanding of data architecture and ETL pipelines.
- Exposure to AWS services like EMR, Lambda, Glue.
- Experience with workflow orchestration tools such as Airflow or Oozie (a minimal DAG sketch follows this list).
- Familiarity with DevOps practices, CI/CD pipelines.
- Knowledge of data warehousing concepts and tools like Redshift or Snowflake.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of relevant experience in Big Data technologies.
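For the orchestration point above, here is a minimal Airflow sketch that triggers such a Spark job daily via spark-submit. The DAG id, schedule, and script path are hypothetical placeholders.

# Illustrative sketch only; DAG id, schedule, and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job sketched earlier to the cluster.
    run_pipeline = BashOperator(
        task_id="run_spark_pipeline",
        bash_command="spark-submit s3://example-bucket/jobs/events_pipeline.py",
    )

An equivalent DAG could use EMR-specific operators instead of BashOperator; the choice depends on how the cluster is provisioned.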