Chennai, Tamil Nadu, India
Information Technology
Contract
YASH Technologies Middle East
Overview
Date: Jun 26, 2025
Job Requisition Id: 61702
Location: Pune, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.
At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.
We are looking to hire Pentaho Data Integration professionals in the following areas:
Job Description:
Experience required: 4-5 years
Duration: 3 months
Job Summary:
The Contractual Data Engineer will play a supportive role in migrating the existing data pipelines from Pentaho Data Integration (PDI/Spoon) to Azure Data Factory (ADF). This individual will primarily assist the core team with data validation, documentation, and the development of new pipelines under guidance.
Key Expectations & Tasks:
- Understanding Existing Pentaho Pipelines:
  - Familiarize yourself with the existing Pentaho Data Integration (PDI/Spoon) jobs and transformations.
  - Document the source systems, target systems, data transformations, and business rules implemented in the current Pentaho pipelines (a small step-inventory script is sketched after this list).
  - Identify and document data sources, data types, and data dependencies within the existing Pentaho environment.
- Azure Data Factory (ADF) Development Support:
  - Under guidance, assist in the creation and configuration of ADF pipelines, datasets, linked services, and data flows.
  - Translate Pentaho transformation logic into ADF activities (e.g., Copy Data, Data Flow, Stored Procedure, Custom activities).
  - Participate in the development of data ingestion, transformation, and loading processes within ADF.
- Data Validation & Testing:
  - Execute test cases to validate data integrity and accuracy after migration.
  - Compare source data (from Pentaho-processed outputs) with target data (from ADF-processed outputs) to identify discrepancies (see the comparison sketch after this list).
  - Document and report any data discrepancies or issues found during testing.
  - Assist in troubleshooting and debugging data pipeline issues.
- Documentation & Knowledge Transfer:
  - Contribute to the creation of detailed documentation for migrated ADF pipelines, including design specifications, data lineage, and operational runbooks.
  - Document best practices and lessons learned during the migration process.
  - Participate in knowledge transfer sessions for the broader team.
- Collaboration & Communication:
  - Collaborate effectively with the core development team, data engineers, and other stakeholders.
  - Provide regular updates on progress and any roadblocks encountered.
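As referenced in the documentation tasks above, PDI transformations are stored as plain XML (.ktr files), so a short script can extract a step inventory to seed the documentation. A minimal sketch, assuming a transformation file named load_customers.ktr (hypothetical):

```python
# Minimal sketch: inventory the steps and hops of a PDI transformation so they
# can be documented before migration. PDI .ktr files are plain XML, so the
# standard library suffices; the file name is a hypothetical placeholder.
import xml.etree.ElementTree as ET

root = ET.parse("load_customers.ktr").getroot()

print("Steps:")
for step in root.findall("step"):
    # Each <step> carries its name and type (e.g., TableInput, TableOutput).
    print(f"  {step.findtext('name')} ({step.findtext('type')})")

print("Hops (data flow between steps):")
for hop in root.findall("order/hop"):
    print(f"  {hop.findtext('from')} -> {hop.findtext('to')}")
```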
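For the source-vs-target comparison task, here is a minimal sketch of a file-level reconciliation, assuming both pipelines can land their outputs as CSV extracts with a shared schema; the file names and key column are hypothetical:

```python
# Minimal sketch: compare a Pentaho-produced extract with its ADF-produced
# equivalent and write a discrepancy report. File names and the key column
# are hypothetical placeholders; both extracts are assumed to share a schema
# and to have unique key values.
import pandas as pd

KEY = "customer_id"  # hypothetical business key

pdi = pd.read_csv("pentaho_output.csv")  # hypothetical Pentaho extract
adf = pd.read_csv("adf_output.csv")      # hypothetical ADF extract

# 1. Row counts should match.
print(f"rows: pentaho={len(pdi)} adf={len(adf)}")

# 2. Keys present on only one side.
merged = pdi[[KEY]].merge(adf[[KEY]], on=KEY, how="outer", indicator=True)
orphans = merged[merged["_merge"] != "both"]
print(f"keys on one side only: {len(orphans)}")

# 3. Cell-level differences for keys present in both extracts
#    (DataFrame.compare requires identical row/column labels).
common = pdi[KEY][pdi[KEY].isin(adf[KEY])]
diffs = pdi.set_index(KEY).loc[common].compare(adf.set_index(KEY).loc[common])
print(f"mismatched cells: {diffs.size}")
diffs.to_csv("discrepancy_report.csv")  # feeds the documented issue log
```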
Required Skills & Experience:
- Strong understanding of ETL/ELT concepts, data warehousing principles, and data modeling.
- Hands-on experience developing and troubleshooting jobs and transformations in Pentaho.
- Proficiency in writing complex SQL queries for data extraction, manipulation, and validation (e.g., joins, subqueries, aggregations); an illustrative reconciliation query follows this list.
- Ability to analyze complex data flows, identify issues, and propose solutions.
- Meticulous approach to data validation and documentation.
- Good written and verbal communication skills.
- Familiarity with ADF concepts, activities, and the development environment.
- Understanding of cloud platforms (Azure preferred) and data storage services (e.g., Azure Blob Storage, Azure Data Lake Storage).
- Experience with Git or Azure DevOps for code management.
- Basic scripting skills (e.g., Python, PowerShell) for automation or data manipulation.
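To illustrate the kind of aggregate- and join-style validation SQL this list refers to, below is a minimal in-database reconciliation sketch; the DSN, table names, and column names are all hypothetical placeholders:

```python
# Minimal sketch: in-database reconciliation of the legacy (Pentaho-loaded)
# and new (ADF-loaded) target tables. The DSN, tables, and columns are
# hypothetical placeholders.
import pyodbc

RECONCILE_SQL = """
WITH totals AS (
    SELECT 'pentaho' AS src, COUNT(*) AS row_cnt,
           SUM(order_amount) AS amount_sum,
           COUNT(DISTINCT customer_id) AS customer_cnt
    FROM dbo.orders_legacy
    UNION ALL
    SELECT 'adf', COUNT(*), SUM(order_amount), COUNT(DISTINCT customer_id)
    FROM dbo.orders_adf
)
SELECT * FROM totals;
"""

with pyodbc.connect("DSN=warehouse") as conn:  # hypothetical DSN
    for src, row_cnt, amount_sum, customer_cnt in conn.execute(RECONCILE_SQL):
        print(f"{src}: rows={row_cnt} sum={amount_sum} customers={customer_cnt}")
```

Running the comparison inside the warehouse avoids moving full extracts and scales better to large tables than file-level diffs.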
Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Talk to us
Feel free to call, email, or hit us up on our social media accounts.
Email: info@antaltechjobs.in