
Remote Work: Hybrid
Overview:
Let’s create tomorrow together.
A Data Engineer designs, develops, programs, and implements machine learning solutions; implements artificial/augmented intelligence systems, agentic workflows, and data engineering workflows; and performs statistical modelling and measurement by applying data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse "big data" sources of machine-acquired data to generate actionable insights and foresights that solve real-life business problems and drive product feature development and enhancement. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools such as Azure Data Factory (ADF) is required to succeed in this role.
Responsibilities:
- Integrates state-of-the-art machine learning algorithms and develops new methods
- Develops tools to support analysis and visualization of large datasets
- Develops and codes software programs; implements industry-standard AutoML models (speech, computer vision, text data, LLM), statistical models, relevant ML models for device/machine-acquired data, and AI models and algorithms
- Identifies meaningful foresights based on predictive ML models from large data and metadata sources; interprets and communicates foresights, insights and findings from experiments to product managers, service managers, business partners and business managers
- Uses rapid development tools (business intelligence tools, graphics libraries, data modelling tools) to communicate research findings to relevant stakeholders through visual graphics, data models, machine learning model features, and feature engineering/transformations
- Analyzes, reviews, and tracks trends and tools in the Data Science, Machine Learning, Artificial Intelligence, and IoT space
- Interacts with cross-functional teams to identify questions and issues for data engineering and machine learning model feature engineering
- Evaluates and makes recommendations to evolve data collection and capture mechanisms to improve the efficacy of machine learning model predictions
- Meets with customers, partners, product managers, and business leaders to present findings, predictions, and foresights; gathers customer-specific requirements for business problems/processes; identifies data collection constraints and alternatives for model implementation
- Working knowledge of MLOps, LLMs and Agentic AI/Workflows
- Programming Skills: Proficiency in Python and experience with ML frameworks like TensorFlow, PyTorch
- LLM Expertise: Hands-on experience in training, fine-tuning, and deploying LLMs
- Foundational Model Knowledge: Strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning techniques, hyperparameter optimization, and model distillation.
- Data Pipeline Development: Strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow.
- Cloud & MLOps: Experience deploying ML models in cloud environments such as AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes
- Designs and implements predictive and optimisation models incorporating diverse data types
- Strong proficiency in SQL and Azure Data Factory (ADF)
Qualifications:
- Minimum Education:
- Minimum Work Experience (years):
o Programming languages: Scala, Go.
o 1+ years of experience in SQL and data transformation
o 1+ years of experience in developing distributed systems using open source technologies such as Spark and Dask.
o 1+ years of experience with relational databases or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).
- Key Skills and Competencies:
o Experience with data models in the Retail and Consumer Products industry is desired.
o Experience working on agile projects and understanding of agile concepts is desired.
o Demonstrated ability to learn new technologies quickly and independently.
o Excellent verbal and written communication skills, especially in technical communications.
o Ability to work and achieve stretch goals in a very innovative and fast-paced environment.
o Ability to work collaboratively in a diverse team environment.
o Ability to telework
o Expected travel: None.
To protect candidates from falling victim to online fraudulent activity involving fake job postings and employment offers, please be aware that our recruiters will always connect with you via @zebra.com email accounts. Applications are only accepted through our applicant tracking system, and personal identifying information is only accepted through that system. Our Talent Acquisition team will not ask you to provide personal identifying information via e-mail or outside of the system. If you are a victim of identity theft, contact your local police department.