DataBricks - Data Engineer

Thiruvananthapuram, Kerala, India
Information Technology
Other
Wipro Limited

Overview

About the role:

The Enterprise Document Delivery team is seeking a Sr. ETL Software Engineer who will participate in the entire system development lifecycle of applications related to document generation, processing, printing, and electronic and postal delivery for the bank. The team supports high-volume statement applications residing on different platforms. The position will work with various Lines of Business across the organization to understand their requirements and architecture in order to design and develop the required solution. The candidate should have strong knowledge of the SDLC process as well as Agile experience, and ensure that all phases of our development are 100% Wells Fargo Technology SDLC compliant. This position requires someone who is flexible, wants to be part of a dynamic team, and is able to manage multiple priorities and tasks simultaneously.

Responsibilities of the role include the following:

  • ETL Design and Development: Develop, maintain, and optimize ETL processes for data ingestion, transformation, and data warehousing across multiple platforms, including both SQL and NoSQL databases.
  • Data Pipelines: Design, build, and manage scalable data pipelines using technologies like Databricks, Apache Spark, Python, SQL, and NoSQL databases (see the pipeline sketch after this list).
  • NoSQL/MongoDB Expertise: Work with MongoDB to design efficient document schemas, implement query optimization, and handle large-scale unstructured data.
  • Data Integration: Collaborate with cross-functional teams to ensure seamless data integration between different sources such as databases (both relational and NoSQL), APIs, and external files.
  • Performance Optimization: Implement and monitor performance metrics, optimize data processing performance, and manage ETL job scheduling and dependencies.
  • Data Quality: Ensure data quality and integrity across ETL pipelines, implementing processes for data validation, cleansing, and enrichment.
  • Automation: Automate repeatable ETL tasks and data processing workflows to improve efficiency and accuracy.
  • Collaboration: Work closely with data architects, analysts, and business stakeholders to gather and understand data requirements.
  • Cloud Platforms: Leverage cloud services (Azure, GCP) for data storage, processing, and infrastructure management, ensuring scalability and cost efficiency.
  • Best Practices: Maintain documentation and adhere to data governance and best practices in data management, including security and compliance.
  • Build microservice APIs to expose ETL services (a minimal API sketch also follows this list).
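
As context for the pipeline and Databricks items above, the following is a minimal PySpark sketch of an ingest-transform-load step. It is only an illustration: the landing path, table name, and column names are hypothetical placeholders, not details from this posting.

    # Minimal PySpark ETL sketch: ingest raw files, apply basic cleansing,
    # and append to a curated Delta table. Paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("statement-etl").getOrCreate()

    # Extract: read raw statement records from a (hypothetical) landing zone.
    raw = spark.read.option("header", True).csv("/mnt/landing/statements/")

    # Transform: trim identifiers, standardize dates, and drop records that
    # fail basic validation (the data-quality responsibility above).
    clean = (
        raw.withColumn("account_id", F.trim(F.col("account_id")))
           .withColumn("stmt_date", F.to_date(F.col("stmt_date"), "yyyyMMdd"))
           .filter(F.col("account_id").isNotNull() & F.col("stmt_date").isNotNull())
    )

    # Load: append into a curated Delta table for downstream delivery jobs.
    clean.write.format("delta").mode("append").saveAsTable("curated.statements")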
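
The microservice API item could take a shape like the sketch below. FastAPI is an assumed framework choice (the posting does not name one), and the endpoint, request fields, and behavior are illustrative only.

    # Minimal sketch of a microservice API that accepts ETL run requests.
    # FastAPI is an assumption; the posting only asks for microservice APIs.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="etl-service")

    class RunRequest(BaseModel):
        source: str          # hypothetical: name of a registered data source
        target_format: str   # "json" or "xml", per the normalization goal

    @app.post("/etl/runs")
    def start_run(req: RunRequest):
        if req.target_format not in ("json", "xml"):
            raise HTTPException(status_code=400, detail="unsupported target format")
        # A real service would enqueue a Databricks job run here;
        # this sketch just acknowledges the request.
        return {"status": "accepted", "source": req.source, "format": req.target_format}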


ESSENTIAL QUALIFICATIONS

  • Experience: 5+ years as a Data Engineer or ETL Developer in complex, large-scale data environments.
  • SSIS and Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including using Apache Spark for data processing and optimization.
  • ETL Tools: Proficient with ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
  • Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Hadoop, Kafka, etc.
  • NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization (see the MongoDB sketch after this list).
  • Programming Skills: Proficient in SQL, Python or Java, and PowerShell for building data pipelines.
  • SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
  • Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based models (NoSQL).
  • Data Warehousing: Data warehousing is a key part of the ETL process; it stores data from multiple sources in an organized manner and underpins a repeatable ETL workflow that supports many different data sources.
  • Redesign and refactor legacy custom ETL processes into reusable ETL workflows that ingest diverse data sources and normalize data to standard JSON/XML output (see the normalization sketch after this list).
  • Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
  • Version Control: Experience with Git or other version control systems for code management.
  • Certification: Databricks certification, MongoDB certification, or other relevant certifications in data engineering, cloud platforms, or big data technologies.
  • Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
  • Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
  • Solid understanding of legacy communication protocols and migration strategies.
  • Experience with cloud platforms like AWS, Azure, Google Cloud, or TKGI
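
As a rough illustration of the MongoDB expectations above, the sketch below stores a statement document and adds a compound index so common lookups avoid collection scans. The connection string, database, collection, and field names are assumptions made for the example.

    # MongoDB sketch: insert a statement document and index the fields
    # that delivery jobs filter on. All names here are hypothetical.
    from pymongo import MongoClient, ASCENDING

    client = MongoClient("mongodb://localhost:27017")  # assumed local instance
    stmts = client["docs"]["statements"]

    # Compound index supporting account + date lookups (query optimization).
    stmts.create_index([("account_id", ASCENDING), ("statement_date", ASCENDING)])

    stmts.insert_one({"account_id": "001234", "statement_date": "2024-06-30",
                      "channel": "electronic", "pages": 4})

    # Lookup served by the compound index.
    print(stmts.find_one({"account_id": "001234", "statement_date": "2024-06-30"}))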
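
The refactoring item about normalizing diverse sources to standard JSON/XML might reduce to a reusable field-mapping step like this sketch. The canonical field names and source registry are hypothetical.

    # Sketch of a reusable normalization step: map records from any
    # registered source into one canonical dict, then emit JSON.
    import json

    FIELD_MAPS = {
        # source name -> {source field: canonical field} (hypothetical)
        "legacy_mainframe": {"ACCT-NO": "account_id", "STMT-DT": "statement_date"},
        "crm_export": {"accountNumber": "account_id", "stmtDate": "statement_date"},
    }

    def normalize(record: dict, source: str) -> str:
        mapping = FIELD_MAPS[source]
        canonical = {canon: record.get(src) for src, canon in mapping.items()}
        return json.dumps(canonical)

    print(normalize({"ACCT-NO": "001234", "STMT-DT": "2024-06-30"}, "legacy_mainframe"))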


Project Details:

  • Team(s) using SSIS to transform mainframe-formatted files to standard JSON and XML file formats (see the parsing sketch after this list)
  • Converting applications from a hosted platform to a distributed, cloud-hosted environment
  • Evaluating and reengineering custom ETL workflows into reusable ETL microservice APIs
  • Target migration from SSIS to Databricks
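
To make the first project item concrete, here is a sketch of parsing one fixed-width mainframe record into JSON in plain Python. The offsets, lengths, and field names are a hypothetical stand-in for a real copybook layout; today SSIS performs this transformation, and Databricks is the migration target.

    # Sketch: parse a fixed-width mainframe record into JSON.
    import json

    # (field name, start offset, length) -- hypothetical record layout
    LAYOUT = [("account_id", 0, 10), ("name", 10, 20), ("balance", 30, 12)]

    def parse_record(line: str) -> dict:
        rec = {name: line[start:start + length].strip()
               for name, start, length in LAYOUT}
        rec["balance"] = float(rec["balance"] or 0)
        return rec

    line = "0000012345JANE DOE            000001234.56"
    print(json.dumps(parse_record(line)))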


Role Purpose

The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as to manage its day-to-day operations.

Do

  • Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup)
    • Lead the structural/architectural design of a platform/middleware/database/backup solution according to various system requirements to ensure a highly scalable and extensible solution
    • Conduct technology capacity planning by reviewing the current and future requirements
    • Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/ platforms, as applicable
    • Strategize & implement disaster recovery plans and create and implement backup and recovery plans
  • Manage the day-to-day operations of the tower
    • Manage day-to-day operations by troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues.
    • Plan for and manage upgrades, migrations, maintenance, backups, and installation and configuration functions for own tower
    • Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
    • Develop shift roster for the team to ensure no disruption in the tower
    • Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
    • Provide weekly status reports to the client leadership team and internal stakeholders on database activities, covering progress, updates, status, and next steps
    • Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

  • Team Management
    • Resourcing
      • Forecast talent requirements as per the current and future business needs
      • Hire adequate and right resources for the team
      • Train direct reportees to make the right recruitment and selection decisions
  • Talent Management
    • Ensure 100% compliance with Wipro’s standards of adequate onboarding and training for team members to enhance capability & effectiveness
    • Build an internal talent pool of high-potential employees (HiPos) and ensure their career progression within the organization
  • Promote diversity in leadership positions
  • Performance Management
    • Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports.
    • Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities presented by such programs, both for themselves and for the levels below them
  • Employee Satisfaction and Engagement
    • Lead and drive engagement initiatives for the team
    • Track team satisfaction scores and identify initiatives to build engagement within the team
    • Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
    • Exercise employee recognition and appreciation

Stakeholder Interaction

  • Internal
    • Technology Solutions Group, BU Teams, Different Infrastructure teams: understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
    • IRMC, QA: guidance on risk mitigation and quality standards
  • External
    • Clients: understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
    • Vendors/Manufacturers: development and deployment of platforms, applications, databases, etc.

Display


Lists the competencies required to perform this role effectively:

  • Functional Competencies/Skills
    • Technical Knowledge - Knowledge of own tower (platform, application, database, etc.) - Expert
    • Domain Knowledge - Understanding of the IT industry and its trends - Competent to Expert

Competency Levels

  • Foundation: Knowledgeable about the competency requirements. Demonstrates (in parts) frequently with minimal support and guidance.
  • Competent: Consistently demonstrates the full range of the competency without guidance. Extends the competency to difficult and unknown situations as well.
  • Expert: Applies the competency in all situations and serves as a guide to others as well.
  • Master: Coaches others and builds organizational capability in the competency area. Serves as a key resource for that competency and is recognised within the entire organization.

  • Behavioral Competencies
    • Managing Complexity
    • Client centricity
    • Execution Excellence
    • Passion for Results
    • Team Management
    • Stakeholder Management

Deliver

  • Performance Parameter 1 - Operations of the tower. Measures: SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans.
  • Performance Parameter 2 - New projects. Measures: timely delivery; avoidance of unauthorised changes; no formal escalations.