Pune, Maharashtra, India
Information Technology
Full-Time
Jeavio
Overview
We are seeking an experienced Senior Data Engineer to join our team.
The ideal candidate will have a strong background in data engineering and AWS infrastructure, with hands-on experience in building and maintaining data pipelines and the necessary infrastructure components.
The role will involve using a mix of data engineering tools and AWS services to design, build, and optimize data pipelines.
Responsibilities
- Design, develop, and maintain data pipelines using Airflow and AWS services (see the sketch after this list).
- Implement and manage data warehousing solutions with Databricks and PostgreSQL.
- Automate tasks using Git and Jenkins.
- Develop and optimize ETL processes, leveraging AWS services like S3, Lambda, AppFlow, and DMS.
- Create and maintain visual dashboards and reports using Looker.
- Collaborate with cross-functional teams to ensure smooth integration of infrastructure components.
- Ensure the scalability, reliability, and performance of data platforms.
- Work with Jenkins for infrastructure automation.
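For illustration, the sketch referenced above: a minimal Airflow DAG wiring an extract step into a load step, of the kind this role describes. All names (DAG id, S3 bucket, callables) are hypothetical placeholders, not Jeavio's actual pipeline; assumes Airflow 2.4+.

```python
# Minimal sketch only; DAG id, bucket, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3():
    # Placeholder: a real pipeline might land upstream data
    # (e.g. from DMS or AppFlow) in an S3 staging bucket.
    print("extracting to s3://example-staging-bucket/ ...")


def load_to_warehouse():
    # Placeholder: e.g. load staged files into PostgreSQL or Databricks.
    print("loading staged data into the warehouse ...")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # simple linear dependency: extract, then load
```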
Functional areas of expertise
- Working as a senior individual contributor on a data-intensive project.
- Strong experience in building high-performance, resilient, and secure data processing pipelines, preferably using a Python-based stack.
- Extensive experience in building data-intensive applications, with a deep understanding of querying and modeling in relational databases, preferably on time-series data (see the sketch after this list).
- Intermediate proficiency in AWS services (S3, Airflow).
- Proficiency in Python and PySpark.
- Proficiency with ThoughtSpot or Databricks.
- Intermediate proficiency in database scripting (SQL).
- Basic experience with Jenkins for task automation.
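As a rough illustration of the time-series querying mentioned above, a minimal PySpark window-function sketch; the dataset and column names are invented for the example.

```python
# Minimal sketch only; the sensor readings and column names are invented
# for illustration and assume a local PySpark session.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("timeseries-sketch").getOrCreate()

readings = spark.createDataFrame(
    [
        ("a", "2024-01-01 00:00:00", 1.0),
        ("a", "2024-01-01 01:00:00", 3.0),
        ("b", "2024-01-01 00:00:00", 2.0),
    ],
    ["sensor_id", "ts", "value"],
).withColumn("ts", F.to_timestamp("ts"))

# Per-sensor delta between consecutive readings: a typical
# window-function pattern when modeling time-series data.
w = Window.partitionBy("sensor_id").orderBy("ts")
readings.withColumn("delta", F.col("value") - F.lag("value").over(w)).show()
```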
Good to Have
- Intermediate proficiency in data analytics tools (Power BI / Tableau / Looker / ThoughtSpot).
- Experience working with AWS Lambda, Glue, AppFlow, and other AWS transfer services.
- Exposure to PySpark and data automation tools like Jenkins or CircleCI.
- Familiarity with Terraform for infrastructure-as-code.
- Experience in data quality testing to ensure the accuracy and reliability of data pipelines.
- Proven experience working directly with U.S. clients.
- Ability to work independently and take the lead on projects.
Qualifications and experience
- Bachelor's or Master's degree in Computer Science or a related field.
- 5+ years of relevant experience.
Skills needed
- Databricks.
- PostgreSQL.
- Python & PySpark.
- AWS Stack.
- Power BI / Tableau / Looker / ThoughtSpot.
- Familiarity with GIT and/or CI/CD tools.