Noida, Uttar Pradesh, India
Information Technology
Full-Time
RAPSYS TECHNOLOGIES PTE LTD
Overview
We're Hiring: Data Platform Engineer!
We are seeking an experienced Data Platform Engineer to design, build, and maintain scalable data infrastructure using AWS cloud services. The ideal candidate will have expertise in Python, PySpark, EMR, and Apache Airflow to develop robust data pipelines and analytics solutions that drive business insights.
Location: Pune, India
Work Mode: Work from anywhere
Role: Data Platform Engineer (AWS, Python, PySpark, EMR, Apache Airflow)
What You'll Do
Design and implement scalable data pipelines using Apache Airflow
Build and optimize AWS EMR clusters for big data processing
Develop data processing applications using Python and PySpark
Create ETL workflows for data ingestion and transformation
Monitor and troubleshoot data platform performance
Collaborate with data scientists and analysts on data requirements
What We're Looking For
6+ years of experience in data engineering
Strong expertise in AWS services (EMR, S3, Glue, Lambda)
Proficiency in Python and PySpark for big data processing
Hands-on experience with Apache Airflow for workflow orchestration
Knowledge of data warehousing and ETL best practices
Experience with SQL and NoSQL databases
Ready to make an impact? Apply now and let's grow together!