Data Engineer - Gen AI data pipeline - Consultant (I&T - Deloitte Engineering) | Antal Tech Jobs

Data Engineer - Gen AI data pipeline - Consultant (I&T - Deloitte Engineering)

Gurugram, Haryana, India
Information Technology
Full-Time
Deloitte

Position Summary

Job Role: Data Engineer – Gen AI Data Pipeline – Consultant

Are you looking to work at a place that builds robust, high-quality software solutions? ‘Deloitte Consulting’ is the answer. As an Analyst/Consultant/Engineer at Deloitte Consulting, you will be responsible for quality assurance on large-scale, complex, enterprise-level software solutions. These applications are often high-volume, mission-critical systems that will give you exposure to end-to-end functional and domain knowledge. You will work with business, functional, and technical teams located across shores. You will independently lead a team, mentor team members, and drive all test deliverables across the project life cycle, and you will be involved in end-to-end delivery of the project, from test strategy and estimation through planning, execution, and reporting.

Work you’ll do

A Cloud Data Engineer will be responsible for the following activities:

  • Participate in application architecture and design discussions; work with team leads to define the solution design for development.
  • Analyze business/functional requirements and develop data processing pipelines for them.
  • Perform unit testing and participate in integration testing in collaboration with other team members.
  • Perform peer code reviews, ensuring alignment with pre-defined architectural standards, guidelines, best practices, and quality standards.
  • Work on defects/bugs and help other team members.
  • Understand and comply with the established agile development methodology; participate in Agile ceremonies such as scrum meetings and sprint planning.
  • Proactively identify opportunities for code, process, and design improvements.
  • Participate in customer support activities for existing clients using ConvergeHEALTH’s platform and products.

The Team

Deloitte Consulting LLP’s Technology Consulting practice is dedicated to helping our clients build tomorrow by solving today’s complex business problems involving strategy, procurement, design, delivery, and assurance of technology solutions. Our service areas include analytics and information management, delivery, cyber risk services, and technical strategy and architecture, as well as the spectrum of digital strategy, design, and development services offered by Deloitte Digital. Learn more about our Technology Consulting practice on www.deloitte.com.

Qualifications and Experience

Required:

Education:

B.E./B.Tech/M.C.A./M.Sc.

Data Engineering Principles:

  • Proficient in data warehousing concepts.
  • Experienced with ETL (Extract, Transform, Load) processes.
  • Skilled in SQL and in handling data in JSON and other semi-structured formats.
  • Hands-on experience with Python for data processing tasks.

Big Data and Cloud Platforms:

  • Experience with Big Data technologies on cloud platforms such as AWS or Cloudera.
  • Working experience on the AWS cloud platform.
  • Knowledgeable in building data pipelines on AWS using services such as Lambda, S3, Athena, Kinesis, etc.

Performance Tuning:

  • Proficient in performance tuning on various RDBMSs (Relational Database Management Systems) such as Oracle, SQL Server, Redshift, Impala, etc.

Data Modeling Concepts:

  • Good understanding of dimensional, relational, or hybrid data modeling.

Continuous Integration Tools:

  • Experience with CI tools such as Jenkins.
  • Proficient with Git version control.
  • Familiar with issue-tracking tools such as JIRA.

Agile Development:

  • Familiar with Agile development methodologies.

Generative AI Experience:

  • Retrieval-Augmented Generation (RAG): must have implemented RAG techniques to enhance data retrieval and improve the relevance of generated content, and integrated RAG models with existing data pipelines to optimize information retrieval.
  • Vector databases: experience using vector databases for efficient storage and retrieval of high-dimensional data; knowledge of vector search algorithms to improve the performance of AI-driven applications.
  • Large Language Models (LLMs): experience deploying and fine-tuning LLMs for various NLP tasks, and integrating them into data processing workflows to automate and enhance data analysis.
  • LangChain: knowledge of LangChain for building and managing complex data workflows; experience building scalable data pipelines with LangChain to streamline data processing and integration tasks.
  • Efficiency improvements: implementation experience reducing data processing times by optimizing ETL workflows and leveraging cloud-native solutions.
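The RAG requirement above boils down to a retrieve-then-augment step before generation. The following is an illustrative toy, not a description of any particular stack: bag-of-words counts stand in for model embeddings, and a plain Python list stands in for a vector database; a production pipeline would use learned embeddings and a vector store (for example via LangChain).

```python
# Minimal sketch of the RAG retrieval step: embed documents, rank them
# by cosine similarity against the query, and prepend the top hits to
# the prompt. Toy embeddings only; names here are illustrative.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the query with retrieved context before generation."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

Swapping `embed` for a model-based embedding function and `retrieve` for a vector-database lookup yields the integration pattern the requirement describes.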

Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 303074