
Overview
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are looking for a talented and motivated Senior Data DevOps Engineer with expertise in MLOps to join our growing team.
The selected candidate should demonstrate a comprehensive understanding of data engineering, automation of data workflows, and operationalization of machine learning models. This position requires a team-oriented professional capable of building, deploying, and maintaining scalable data and ML pipelines that align with organizational priorities.
Responsibilities
- Design, deploy, and monitor CI/CD pipelines for data workflows and machine learning model operations
- Set up and manage infrastructure for data processing and model training with cloud services
- Automate data validation, transformation, and workflow orchestration tasks
- Work collaboratively with data scientists, software engineers, and product teams to integrate ML models into production environments
- Enhance model serving and monitoring processes to improve system performance and reliability
- Oversee data versioning, lineage tracking, and ensure reproducibility of machine learning experiments
- Identify areas to improve deployment workflows, scalability, and infrastructure resilience
- Implement security protocols to safeguard data integrity and maintain compliance with regulations
- Diagnose and resolve technical challenges across data and ML pipeline systems
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a closely related field
- 4+ years of hands-on experience in Data DevOps, MLOps, or similar fields
- Proficiency in cloud platforms such as Azure, AWS, or GCP
- Background in Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or Ansible
- Expertise in containerization and orchestration tools like Docker and Kubernetes
- Skills in working with data processing frameworks such as Apache Spark or Databricks
- Proficiency in Python and experience with libraries such as Pandas, TensorFlow, or PyTorch
- Familiarity with CI/CD tools such as Jenkins, GitLab CI/CD, or GitHub Actions
- Understanding of version control systems like Git and MLOps platforms such as MLflow or Kubeflow
- Competency in monitoring, logging, and alerting systems like Prometheus and Grafana
- Strong analytical abilities with an independent and collaborative work approach
- Excellent communication skills and proficiency in technical documentation
Nice to have
- Understanding of DataOps principles and tools including Airflow and dbt
- Knowledge of data governance practices and tools like Collibra
- Background in Big Data technologies such as Hadoop or Hive
- Cloud platform or data engineering certifications
We offer
- Opportunity to work on technical challenges that may have an impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn Learning solutions
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package:
  - Health benefits
  - Retirement benefits
  - Paid time off
  - Flexible benefits
- Forums to explore beyond work passion (CSR, photography, painting, sports, etc.)