Bangalore, Karnataka, India
Space Exploration & Research, Information Technology
Full-Time
emagine
Overview
Summary
As a Data Scientist, your main function is to advance AI adoption within the organization by designing and productionizing machine learning and generative AI capabilities that drive measurable business outcomes.
Main Responsibilities
As a Data Scientist, you will:
- Explore and prototype new SyGrow features from hypotheses to validated experiments with measurable outcomes.
- Translate sales and stakeholder needs into analytical problems, KPIs, and success criteria.
- Build, evaluate, and iterate ML models for recommendations, scoring, forecasting, and insights.
- Implement robust data ingestion, feature engineering, and training pipelines in Azure ML and Microsoft Fabric.
- Productionize models with CI/CD, versioning, telemetry, and rollback using Azure DevOps.
- Leverage LLMs to augment the sales copilot, defining guardrails and evaluation strategies.
- Ensure privacy, security, and compliance across data and model lifecycles.
- Monitor performance, drift, and cost; run A/B tests and incorporate user feedback.
- Collaborate with engineering and product teams to ship incremental value in agile sprints.
- Document decisions, share knowledge, and contribute reusable components and templates.
- Integrate with CRM/BI workflows and data sources to operationalize outcomes.
- Mentor interns/juniors and promote best practices in Data Science and MLOps.
Key Requirements
- Master’s in Data Science, Computer Science, Engineering, Mathematics, or equivalent practical experience.
- 2–5 years of hands-on experience delivering ML/DS solutions, from exploration to production.
- Experience partnering with business stakeholders, preferably in a sales/commercial context.
- Experience working in agile teams.
- Strong technical knowledge of Python, Pandas, scikit-learn, foundational statistics, and ML algorithms.
- Proven experience with MLOps implementation on Azure.
Nice to Have
- Experience with Azure Machine Learning, Azure OpenAI, Azure Data Factory, Azure DevOps, or Microsoft Fabric.
- Familiarity with monitoring and cost governance tools.
- Understanding of prompt engineering and RAG, plus data engineering fundamentals (SQL/ETL/APIs).
Other Details
Location: Koregaon Park, Pune
Hybrid Working: 3 days per week in the office, with working hours aligned to the CET time zone