
Software Engineer / Senior Software Engineer - Azure Databricks with ETL and Power BI

Mumbai, Maharashtra, India
Information Technology
Full-Time
CGI

Overview

Company:

Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Position: Software Engineer / Senior Software Engineer - Azure Databricks with ETL and Power BI

Experience: 6-10 years

Category: Software Development/ Engineering

Location: Chennai

Position ID: J1125-1381

Employment Type: Full Time

The ETL Developer will be responsible for designing, developing, and implementing robust, scalable data architectures and ETL solutions. This role requires expertise in Databricks for data processing and Lakehouse architecture, in Kimball dimensional modelling for data warehousing, and in Power BI and Microsoft Fabric for data visualization and analytics platform implementation. You will build scalable, secure, high-performance data solutions on the Databricks Lakehouse Platform, work with ETL Architects on architectural standards, and collaborate with stakeholders to align data strategies with business goals. Your role will focus on leveraging Databricks' unified analytics capabilities to build enterprise-grade data platforms that support advanced analytics, machine learning, and real-time data processing.

Your future duties and responsibilities

Build ingestion, transformation, storage, and consumption layers on Databricks:

  • Develop and maintain data models, data flow diagrams, and other solution documentation based on Kimball dimensional modelling principles.
  • Develop and implement Lakehouse solutions using Delta Lake, Unity Catalog, and structured streaming (see the ingestion sketch after this list).
  • Implement data governance using Unity Catalog, including fine-grained access control, column-level lineage, data classification, audit logging, and centralized metadata management across workspaces and cloud environments (see the governance sketch after this list).
  • Develop scalable ETL/ELT pipelines using Apache Spark, PySpark, and Databricks Workflows.
  • Integrate Databricks with enterprise systems such as data catalogs, data quality frameworks, ML platforms, and BI tools.
  • Design and develop high-performance reporting models, paginated reports, configurable inquiries, and interactive dashboards using Power BI.
  • Build CI/CD pipelines, version control, and automated testing for Databricks notebooks and jobs.
  • Tune performance and configure clusters for efficient, cost-effective workloads.
  • Participate in architectural reviews and code audits to ensure adherence to standards and scalability.
  • Collaborate closely with clients, business stakeholders, and internal teams to translate business requirements into technical solutions.
  • Stay current with Databricks innovations and advocate for adoption of new features and capabilities.
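
For illustration, a minimal sketch of the kind of bronze-layer ingestion stream this role involves, using PySpark structured streaming with Databricks Auto Loader. It assumes the notebook-provided `spark` session, and the paths and table name are hypothetical placeholders:

```python
# Bronze-layer ingestion with Databricks Auto Loader (the cloudFiles source).
raw = (
    spark.readStream.format("cloudFiles")                    # Auto Loader
    .option("cloudFiles.format", "json")                     # incoming file format
    .option("cloudFiles.schemaLocation", "/mnt/chk/schema")  # hypothetical schema-tracking path
    .load("/mnt/landing/orders")                             # hypothetical landing zone
)

(
    raw.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/chk/orders_bronze")  # hypothetical checkpoint path
    .trigger(availableNow=True)     # drain the current backlog incrementally, then stop
    .toTable("main.bronze.orders")  # hypothetical Unity Catalog three-level name
)
```

`trigger(availableNow=True)` suits scheduled batch-style runs of a streaming pipeline; for continuous processing the trigger can simply be omitted.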
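
Likewise, a minimal sketch of Unity Catalog access control from a notebook; the catalog, schema, table, and group names are hypothetical, and lineage, audit logging, and classification are captured by the platform rather than expressed in code:

```python
# Grant catalog-, schema-, and table-level privileges to account groups.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.silver TO `bi_analysts`")
spark.sql("GRANT SELECT ON TABLE main.silver.orders TO `bi_analysts`")
```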

Required Qualifications To Be Successful In This Role

Education Qualification: Bachelor’s degree or higher in computer science or a related field, with a minimum of 4 years of relevant experience.

6-10 years of experience in ETL/Power BI, with 3+ years in Databricks and Apache Spark, and strong proficiency in SQL and DAX.

  • Experience in projects migrating Snowflake and other custom EDW/ETL solutions to Databricks.
  • Experience migrating reporting solutions such as Cognos and SAP BO to Power BI and Databricks.
  • Experience with Kimball dimensional modelling and data warehousing concepts.
  • Experience in designing and deploying ETL/ELT pipelines for large-scale data integration.
  • Proficiency in Power BI for paginated report and dashboard development, including DAX.
  • Strong experience in Delta Lake, structured streaming, PySpark, and SQL.
  • Strong understanding of Lakehouse architecture, data mesh, and modern data stack principles.
  • Experience with Unity Catalog, Databricks Repos, the Jobs API, and Workflows (see the Jobs API sketch after this list).
  • Proven ability to design and implement secure, governed, and highly available data platforms.
  • Familiarity with cloud platforms (Azure, AWS, GCP) and their integration with Databricks.
  • Experience with CI/CD, DevOps, and infrastructure-as-code tools (Terraform, GitHub Actions, Azure DevOps).
  • Knowledge of machine learning lifecycle, MLflow, and model deployment strategies.
  • An understanding of E-R data models (conceptual, logical, and physical).
  • Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
  • Strong verbal and written communication skills; able to collaborate effectively across IT and business groups, regions, and roles, and to interact with all levels of the organization.
  • Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
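
As an illustration of Jobs API work, a minimal sketch that creates a notebook job through the Databricks Jobs 2.1 REST API. The environment variables, job name, notebook path, and cluster id are hypothetical:

```python
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

job_spec = {
    "name": "nightly-orders-etl",  # hypothetical job name
    "tasks": [
        {
            "task_key": "transform",
            "notebook_task": {"notebook_path": "/Repos/etl/orders_transform"},
            "existing_cluster_id": "0000-000000-abcdefgh",  # hypothetical cluster
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```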

Must-Have Skills:

  • Azure Databricks, Databricks Lakehouse Architecture
  • ETL / ELT, Data Architecture
  • Apache Spark / PySpark, Delta Lake, Delta Live Tables (DLT)
  • Unity Catalog, Medallion Architecture
  • Dimensional Modeling (Star & Snowflake), Kimball, Data Vault
  • Slowly Changing Dimensions (SCD Types 1, 2, 3); a Type 2 sketch follows this list
  • Data Governance, RBAC, Data Lineage, Metadata Management
  • CI/CD & DevOps (Azure DevOps, GitHub Actions, Terraform)
  • SQL, Power BI, Self-Service Analytics, Semantic Model, Paginated Reports
  • Data Quality (Great Expectations; a sample check follows this list), Performance Tuning, Cost Optimization
  • Cloud Platforms (Azure, AWS, GCP), Azure Data Factory, Synapse, Event Hubs
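
For illustration, a minimal SCD Type 2 sketch using the Delta Lake Python API. The table and column names are hypothetical, and for brevity step 2 appends every incoming row; production code would first filter the source down to new and changed keys:

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

dim = DeltaTable.forName(spark, "main.gold.dim_customer")  # hypothetical dimension
updates = spark.table("main.silver.customer_changes")      # hypothetical source

# Step 1: close out current rows whose tracked attributes changed.
(
    dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",  # change detection on tracked columns
        set={"is_current": "false", "valid_to": "current_timestamp()"},
    )
    .execute()
)

# Step 2: append the new versions as current rows.
(
    updates
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_timestamp())
    .withColumn("valid_to", F.lit(None).cast("timestamp"))
    .write.format("delta").mode("append").saveAsTable("main.gold.dim_customer")
)
```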
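
And a minimal data quality gate, assuming the classic SparkDFDataset API from Great Expectations 0.x (newer GX releases use a different, fluent API); the table and column names are hypothetical:

```python
from great_expectations.dataset import SparkDFDataset

gdf = SparkDFDataset(spark.table("main.silver.orders"))  # hypothetical table
gdf.expect_column_values_to_not_be_null("order_id")
gdf.expect_column_values_to_be_between("amount", min_value=0, max_value=1_000_000)

results = gdf.validate()
if not results.success:
    raise ValueError(f"Data quality checks failed: {results}")
```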

Nice-to-Have Skills:

  • Streaming frameworks (Kafka, Event Hubs), workspace automation
  • Advanced data modeling for Finance, Performance Budgeting, HRM systems
  • Subject-area models for financial reporting, workforce analytics, payroll insights
  • Delta Change Data Feed (CDF) and real-time data marts (see the CDF sketch after this list)
  • Certifications: Databricks Certified Data Engineer Associate / Professional, Databricks Certified Associate Developer for Apache Spark, Azure / Power BI certifications
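
For illustration, a minimal sketch of consuming the Delta Change Data Feed, assuming CDF has already been enabled on a hypothetical table (`delta.enableChangeDataFeed = true`):

```python
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 1)  # or startingTimestamp
    .table("main.silver.orders")   # hypothetical table
)
# _change_type distinguishes insert, update_preimage, update_postimage, delete.
changes.filter("_change_type != 'update_preimage'").show()
```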
