Gurugram, Haryana, India
Information Technology
Full-Time
UST
Overview
Role Description
Location: Any UST Location
Experience: 6 to 9 years
Mandatory Skills: PySpark, GCP, Hadoop, Hive, AWS
Good to Have: CI/CD and DevOps experience
Job Description
We are seeking a highly skilled Senior Big Data Engineer to join our team at UST. The ideal candidate will have solid experience in Big Data technologies, cloud platforms, and data processing frameworks with a strong focus on PySpark and Google Cloud Platform (GCP).
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL workflows using PySpark, Hadoop, and Hive.
- Deploy and manage big data workloads on cloud platforms like GCP and AWS.
- Work closely with cross-functional teams to understand data requirements and deliver high-quality solutions.
- Optimize data processing jobs for performance and cost-efficiency on cloud infrastructure.
- Implement automation and CI/CD pipelines to streamline deployment and monitoring of data workflows.
- Ensure data security, governance, and compliance in cloud environments.
- Troubleshoot and resolve data issues, monitoring job executions and system health.
Skills & Qualifications
- PySpark: Strong experience in developing data processing jobs and ETL pipelines.
- Google Cloud Platform (GCP): Hands-on experience with BigQuery, Dataflow, Dataproc, or similar services.
- Hadoop Ecosystem: Expertise with Hadoop, Hive, and related big data tools.
- AWS: Familiarity with AWS data services like S3, EMR, Glue, or Redshift.
- Strong SQL and data modeling skills.
- Experience with CI/CD tools and DevOps practices (Jenkins, GitLab, Terraform, etc.).
- Containerization and orchestration knowledge (Docker, Kubernetes).
- Experience with Infrastructure as Code (IaC).
- Knowledge of data governance and data security best practices.