Vadodara, Gujarat, India
Real Estate & Construction
Full-Time
UnifyCX
Overview
Job Summary
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Key Responsibilities
- Infrastructure Management: Design, deploy, and manage AWS cloud infrastructure for data storage, processing, and analytics, ensuring high availability and scalability while adhering to security best practices.
- Data Pipeline Deployment: Collaborate with data engineering teams to deploy and maintain efficient data pipelines using tools like Apache Airflow, dbt, or similar technologies.
- Snowflake Administration: Implement and manage Snowflake data warehouse solutions, optimizing performance and ensuring data security and governance.
- MLOps Implementation: Collaborate with data scientists to implement MLOps practices, facilitating the deployment, monitoring, and governance of machine learning models in production environments.
- Information Security: Integrate security controls into all aspects of the data infrastructure, including encryption, access control, and compliance with data protection regulations (e.g., GDPR, HIPAA).
- CI/CD Implementation: Develop and maintain continuous integration and continuous deployment (CI/CD) pipelines for data-related applications and services, including model training and deployment workflows.
- Support and Troubleshooting: Deploy updates and fixes, provide Level 2 technical support, and perform root cause analysis of production errors to resolve technical issues effectively.
- Tool Development: Build tools to reduce the occurrence of errors and improve the customer experience, and develop software to integrate with internal back-end systems.
- Automation and Visualization: Develop scripts to automate data visualization and streamline reporting processes.
- System Maintenance: Design procedures for system troubleshooting and maintenance, ensuring smooth operation of the data infrastructure.
- Monitoring and Performance Tuning: Implement monitoring solutions to track data workflows and system performance, proactively identifying and resolving issues.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and support analytics initiatives.
- Documentation: Create and maintain documentation for data architecture, processes, workflows, and security protocols to ensure knowledge sharing and compliance.
Qualifications
- Bachelor’s degree in Computer Science or Engineering, or 3+ years of experience as a DevOps Engineer or in a similar engineering role.
- Strong expertise in AWS services (e.g., EC2, S3, Lambda, RDS) and cloud infrastructure best practices.
- Proficient in Snowflake, including data modeling, performance tuning, and query optimization.
- Experience with modern data technologies and tools (e.g., Apache Airflow, dbt, ETL processes).
- Familiarity with MLOps frameworks and methodologies, such as MLflow, Kubeflow, or SageMaker.
- Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
- Proficiency in scripting languages such as Python, Ruby, PHP, or Perl, and experience with automation frameworks.
- Proficiency with Git and GitHub workflows.
- Working knowledge of databases and SQL.
- Strong understanding of CI/CD tools and practices (e.g., Jenkins, GitLab CI).
- Knowledge of information security principles, including data encryption, access control, and compliance frameworks.
- Excellent problem-solving attitude and collaborative team spirit.
- Strong communication skills, both verbal and written.
- Experience with data governance and compliance frameworks.
- Familiarity with data visualization tools (e.g., Tableau, Looker).
- Knowledge of machine learning frameworks and concepts is a plus.
- Relevant security certifications (e.g., CISSP, CISM, AWS Certified Security) are a plus.
What We Offer
- Competitive salary and benefits package.
- Opportunities for professional development and continuous learning.
- A collaborative and innovative work environment.
- Flexible work arrangements.