Bangalore, Karnataka, India
Information Technology
Full-Time
Virtusa
Overview
We are seeking a highly motivated and experienced Senior DevOps Engineer to join our growing team. You will play a pivotal role in automating and streamlining our data infrastructure, ensuring reliable and efficient data pipelines built with Airflow, Python, and DBT. This role demands both technical expertise in these tools and a passion for DevOps practices.
Responsibilities
Design and implement robust CI/CD pipelines for data workflows using Airflow.
Develop and maintain custom Python scripts for data processing and transformation tasks.
Utilize DBT for building and managing data models within the data warehouse.
Configure and manage infrastructure on cloud platforms (e.g., AWS, Azure, GCP).
Deploy and manage containerized workloads using Docker and Kubernetes.
Monitor and troubleshoot data pipelines and infrastructure for performance and stability.
Implement security best practices and ensure data governance compliance.
Collaborate with data engineers to understand their needs and translate them into technical solutions.
Stay up-to-date with the latest advancements in DevOps tools and technologies.
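To illustrate the kind of work the Python and Airflow responsibilities above describe, here is a minimal sketch of a data-cleaning step that could run as a task inside an Airflow DAG. The function name and CSV layout are hypothetical, chosen only for illustration; only the Python standard library is used.

```python
import csv
import io


def transform_rows(raw_csv: str) -> list[dict]:
    """Normalize a raw CSV extract: lowercase and strip header names,
    trim cell whitespace, and drop rows that are entirely empty.

    A function like this would typically be called from an Airflow
    task (e.g. via a PythonOperator or the TaskFlow API).
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        cleaned = {k.strip().lower(): (v or "").strip() for k, v in row.items()}
        if any(cleaned.values()):  # keep only rows with at least one value
            rows.append(cleaned)
    return rows
```

For example, `transform_rows("Name, City\nAda, London\n , \n")` normalizes the headers to `name`/`city`, trims the values, and discards the blank row.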
Technical Skills
7+ years of experience in a DevOps or related role.
Extensive hands-on experience with Airflow, Python, and DBT.
Proficiency in shell scripting (e.g., Bash).
Experience with cloud platforms (AWS, Azure, or GCP preferred).
Familiarity with containerization technologies (Docker, Kubernetes).
Strong understanding of CI/CD principles and practices.
Knowledge of data security and governance best practices.
Bonus Points
Experience with data lineage and data quality tools.
Experience with cloud-based data warehousing solutions.