INR 3,000,000 / year
Chennai, Tamil Nadu, India
Information Technology
Full-Time
Uplers
Overview
Experience: 5+ years
Salary: INR 3,000,000 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-Time Permanent position (Payroll and Compliance to be managed by: Zero to 1)
(Note: This is a requirement for one of Uplers' clients, Zero to 1.)
What do you need for this opportunity?
Must have skills required:
Advanced Python, SQL mastery, 5+ years in data engineering with AWS and Python, Linux proficiency, strong AWS experience
Zero to 1 is looking for:
We are seeking a highly skilled and motivated Data Engineer to join our team. In this role, you will leverage your expertise in AWS, Python, SQL, and data management in general to build and optimize scalable data pipelines, support our ETL/ELT and API processes, and enable seamless data flow across our organization and to our end users. You will work closely with our engineering, analytics, and product teams to drive data-driven solutions and ensure efficient data infrastructure.
Responsibilities:
✅ Design, develop, and maintain scalable data pipelines using a variety of AWS services
✅ Work with relational databases such as Aurora (PostgreSQL) and Redshift for large-scale data management
✅ Develop and optimize ETL/ELT processes to ensure smooth data integration from multiple sources
✅ Build and maintain serverless applications using AWS services like Lambda, Fargate, and SQS
✅ Collaborate with cross-functional teams to ensure data accessibility and integrity across different platforms
✅ Use Python, SQL, and Pandas for data manipulation, analysis, and automation of workflows
✅ Implement unit testing with PyTest to ensure code reliability
✅ Manage API integrations and development using FastAPI
✅ Assist with infrastructure as code (IaC) using the AWS CDK
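To give a flavor of the responsibilities above, here is a minimal sketch of a Lambda-style ETL step in the stack this role uses (Boto3 + Pandas). The bucket name, keys, and CSV schema are hypothetical, invented purely for illustration; the transform is kept separate from the S3 I/O so it can be unit-tested without AWS.

```python
import io

import pandas as pd


def transform(raw_csv: str) -> pd.DataFrame:
    """Clean a raw orders extract: parse dates, drop incomplete rows, add revenue."""
    df = pd.read_csv(io.StringIO(raw_csv), parse_dates=["order_date"])
    df = df.dropna(subset=["quantity", "unit_price"])
    df["revenue"] = df["quantity"] * df["unit_price"]
    return df


def handler(event, context):
    """Lambda-style entry point: read a raw CSV from S3, transform it, write it back."""
    import boto3  # imported lazily so transform() stays testable without AWS

    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=event["bucket"], Key=event["key"])
    df = transform(obj["Body"].read().decode("utf-8"))
    s3.put_object(
        Bucket=event["bucket"],
        Key=event["key"].replace("raw/", "clean/"),
        Body=df.to_csv(index=False).encode("utf-8"),
    )
    return {"rows": len(df)}
```

In practice a pipeline like this would be wired up behind SQS or EventBridge triggers and deployed via the CDK, but the shape of the code is the same.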
Must Have Skills & Qualifications:
✅ Extensive Work Experience: 5+ years in Data Engineering with AWS and Python
✅ Strong AWS Experience: Particularly with services like Aurora (PostgreSQL or MySQL), Redshift, Fargate, Lambda, SQS, and IAM. Experience using tools like Boto3 and the AWS CLI
✅ Linux Proficiency: Comfortable in a Linux environment, including shell scripting, common CLI tools, and creating and testing Linux-based Docker images
✅ SQL Mastery: Expertise in complex SQL queries and database optimization
✅ Advanced Python: Strong hands-on experience with Python, ideally with data engineering tools such as Pandas, SQLAlchemy, and Pandera
✅ Boto3 Expertise: Solid understanding of the AWS SDK for Python (Boto3), the AWS CLI, and AWS automation tools
✅ Git Proficiency: Comfortable with version control, branching, and collaboration using Git
✅ ORM Skills: Ideally with SQLAlchemy, for database interactions
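As a flavor of the ORM skills listed above, here is a minimal SQLAlchemy sketch: the model and query are hypothetical examples, not part of the client's actual codebase.

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Customer(Base):
    """A toy table used only to illustrate ORM-mapped models."""
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    region = Column(String(50))


def customers_in_region(session, region):
    """Return customer names in a region, ordered alphabetically."""
    rows = (
        session.query(Customer)
        .filter_by(region=region)
        .order_by(Customer.name)
    )
    return [c.name for c in rows]
```

The same model works against Aurora PostgreSQL in production and an in-memory SQLite engine in unit tests, which is one reason an ORM layer pairs well with PyTest.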
How to apply for this opportunity?
- Step 1: Click on Apply! and register or log in on our portal.
- Step 2: Complete the screening form and upload your updated resume.
- Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement.
(Note: There are many more opportunities on the portal beyond this one. Depending on the assessments you clear, you can apply for those as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!