
Overview
About Us
At Innover, we endeavor to see our clients become connected, insight-driven businesses. Our integrated Digital Experiences, Data & Insights, and Digital Operations studios help clients embrace digital transformation and deliver outstanding experiences across the entire customer lifecycle. Our connected studios work in tandem, bringing together innovation, technology, people, and business agility to deliver impressive returns on investment. We help organizations capitalize on current trends and game-changing technologies, molding them into future-ready enterprises.
Each of our studios represents a deep pocket of expertise and delivers on the promise of data-driven, connected enterprises.
Job Description
- Assist in developing ETL/ELT pipelines using Azure Data Factory and Synapse Pipelines to ingest and transform data from multiple internal and external sources.
- Participate in data integration efforts, connecting to REST APIs, flat files, cloud storage, on-prem databases, and SaaS applications.
- Implement basic to moderately complex data transformations (joins, filters, aggregations, derived columns) using SQL or mapping data flows.
- Support the creation and execution of data quality checks, validations, and logging mechanisms to ensure data accuracy and completeness.
- Help monitor and troubleshoot pipeline failures or data anomalies, escalating issues when needed.
- Contribute to data cataloging, lineage documentation, and metadata tracking using Microsoft Purview.
- Maintain clean, reusable code and follow team development practices.
- Work closely with BI developers to ensure data is properly prepared for Power BI dashboards and reports.
Requirements
- 3+ years in a data engineering, data analytics, or data-focused software development role.
- Exposure to or hands-on experience with Azure Data Factory, Synapse Analytics, SQL, and Azure Data Lake.
- Familiarity with data transformation logic and data integration concepts.
- Understanding of data quality assurance techniques and data validation principles.
- Basic scripting skills (e.g., Python) and version control (Git).
- Eagerness to learn modern data tools including Microsoft Fabric, Power BI, Purview, and CI/CD for data pipelines.