DataBricks - Data Engineer | Antal Tech Jobs
6 Weeks ago

DataBricks - Data Engineer

Thiruvananthapuram, Kerala, India
Information Technology
Other
Wipro Limited

Overview

About the role:

The Enterprise Document Delivery team is seeking a Senior ETL Software Engineer to participate in the entire system development lifecycle of applications for document generation, processing, printing, and electronic and postal delivery for the bank. The team supports high-volume statement applications residing on different platforms. The position works with various Lines of Business across the organization to understand their requirements and architecture, and to design and develop the required solutions. The candidate should have strong knowledge of the SDLC process as well as Agile experience, and must ensure that all phases of development are 100% Wells Fargo Technology SDLC compliant. This position requires someone who is flexible, wants to be part of a dynamic team, and can manage multiple priorities and tasks simultaneously.

Responsibilities of the role include the following:

  • ETL Design and Development: Develop, maintain, and optimize ETL processes for data ingestion, transformation, and data warehousing across multiple platforms, including both SQL and NoSQL databases.
  • Data Pipelines: Design, build, and manage scalable data pipelines using technologies like Databricks, Apache Spark, Python, SQL, and NoSQL databases.
  • NoSQL/MongoDB Expertise: Work with MongoDB to design efficient document schemas, implement query optimization, and handle large-scale unstructured data.
  • Data Integration: Collaborate with cross-functional teams to ensure seamless data integration between different sources such as databases (both relational and NoSQL), APIs, and external files.
  • Performance Optimization: Implement and monitor performance metrics, optimize data processing performance, and manage ETL job scheduling and dependencies.
  • Data Quality: Ensure data quality and integrity across ETL pipelines, implementing processes for data validation, cleansing, and enrichment.
  • Automation: Automate repeatable ETL tasks and data processing workflows to improve efficiency and accuracy.
  • Collaboration: Work closely with data architects, analysts, and business stakeholders to gather and understand data requirements.
  • Cloud Platforms: Leverage cloud services (Azure, GCP) for data storage, processing, and infrastructure management, ensuring scalability and cost efficiency.
  • Best Practices: Maintain documentation and adhere to data governance and best practices in data management, including security and compliance.
  • Microservices: Build microservice APIs to expose ETL services.
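The data-quality responsibilities above (validation, cleansing, enrichment) can be sketched as a minimal, framework-free pipeline. The field names and rules below are illustrative assumptions, not the bank's actual logic; in this role the equivalent stages would typically run as Databricks/Spark jobs.

```python
# Hedged sketch of a validate -> cleanse -> enrich pipeline.
# Field names ("id", "amount") and rules are invented for the example.

def validate(row):
    """Reject rows missing required keys or with empty values."""
    return all(row.get(k) not in (None, "") for k in ("id", "amount"))

def cleanse(row):
    """Normalize types and trim whitespace."""
    return {"id": str(row["id"]).strip(), "amount": float(row["amount"])}

def enrich(row):
    """Derive a simple categorical field from the cleansed amount."""
    row["tier"] = "high" if row["amount"] >= 1000 else "standard"
    return row

def run_pipeline(rows):
    """Keep only valid rows, then cleanse and enrich each one."""
    return [enrich(cleanse(r)) for r in rows if validate(r)]

raw = [{"id": " a1 ", "amount": "2500"}, {"id": "", "amount": "10"}]
print(run_pipeline(raw))  # the row with an empty id is dropped
```

In a Spark setting the same three stages would map onto DataFrame filters and column transformations rather than per-row Python functions.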


ESSENTIAL QUALIFICATIONS

  • Experience: 5+ years of experience as a Data Engineer or ETL Developer in complex, large-scale data environments.
  • SSIS and Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including Apache Spark for data processing and optimization.
  • ETL Tools: Proficient with various ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
  • Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Hadoop, Kafka, etc.
  • NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization.
  • Programming Skills: Proficient in SQL, Python or Java, and PowerShell for building data pipelines.
  • SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
  • Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based models (NoSQL).
  • Data Warehousing: Data warehousing is a key part of the ETL process; it stores data from multiple sources in an organized manner and is needed to build repeatable ETL workflows that support many different data sources.
  • ETL Modernization: Redesign and refactor legacy custom ETL processes into reusable ETL workflows that can ingest diverse data sources and normalize data to standard JSON/XML output.
  • Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
  • Version Control: Experience with Git or other version control systems for code management.
  • Certification: Databricks certification, MongoDB certification, or other relevant certifications in data engineering, cloud platforms, or big data technologies.
  • Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
  • Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
  • Legacy Systems: Solid understanding of legacy communication protocols and migration strategies.
  • Cloud Platforms: Experience with cloud platforms such as AWS, Azure, Google Cloud, or TKGI.
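As a hedged illustration of the "reusable ETL workflows" qualification above: a minimal Python sketch in which per-source readers register against a common interface, and every source is normalized to one standard JSON shape. The source names and fields are invented for the example, not the team's actual formats.

```python
import json

# Registry of per-source readers; each reader returns the same standard dict.
READERS = {}

def reader(source_type):
    """Decorator that registers a reader for one source format."""
    def register(fn):
        READERS[source_type] = fn
        return fn
    return register

@reader("csv_line")
def read_csv_line(payload):
    account, amount = payload.split(",")
    return {"account": account, "amount": float(amount)}

@reader("kv_pairs")
def read_kv(payload):
    pairs = dict(p.split("=") for p in payload.split(";"))
    return {"account": pairs["acct"], "amount": float(pairs["amt"])}

def normalize(source_type, payload):
    """Dispatch to the registered reader and emit standard JSON."""
    return json.dumps(READERS[source_type](payload), sort_keys=True)

# Two different source formats yield identical standard output:
print(normalize("csv_line", "A-100,42.5"))
print(normalize("kv_pairs", "acct=A-100;amt=42.5"))
```

Adding support for a new source then means registering one reader, rather than writing another one-off ETL process.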


Project Details:

  • Teams use SSIS to transform mainframe-formatted files into standard JSON and XML file formats
  • Converting applications from a hosted platform to a distributed, cloud-hosted environment
  • Evaluating and reengineering custom ETL workflows into reusable ETL microservice APIs
  • Target migration from SSIS to Databricks
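A minimal sketch of the mainframe-to-JSON conversion described above, in plain Python. The fixed-width field layout is an invented stand-in for a real copybook, not the bank's actual record format.

```python
import json

# Hypothetical fixed-width layout (offsets are illustrative, not a real
# copybook): account number (10 chars), name (20), balance (12, implied
# two decimal places, as is common in mainframe numeric fields).
LAYOUT = [
    ("account_id", 0, 10),
    ("name", 10, 30),
    ("balance_cents", 30, 42),
]

def parse_record(line):
    """Slice one fixed-width record into a standard JSON-ready dict."""
    rec = {field: line[start:end].strip() for field, start, end in LAYOUT}
    # Convert the implied-decimal numeric field to a real number.
    rec["balance"] = int(rec.pop("balance_cents")) / 100
    return rec

# Build a sample record by padding each field to its fixed width.
line = "0000123456" + "JANE DOE".ljust(20) + "000000012345"
print(json.dumps(parse_record(line)))
```

In the actual migration, the same layout-driven parsing would be expressed as a Spark job in Databricks rather than a per-line Python loop.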


Role Purpose

The purpose of this role is to provide significant technical expertise in the architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.), as well as to manage its day-to-day operations.

Do

  • Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup)
    • Lead the structural/architectural design of a platform, middleware, database, backup, etc. according to various system requirements to ensure a highly scalable and extensible solution
    • Conduct technology capacity planning by reviewing current and future requirements
    • Utilize and leverage new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
    • Strategize and implement disaster recovery plans; create and implement backup and recovery plans
  • Manage the day-to-day operations of the tower
    • Manage day-to-day operations by troubleshooting issues, conducting root cause analysis (RCA), and developing fixes to prevent recurrence
    • Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration for own tower
    • Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
    • Develop a shift roster for the team to ensure no disruption in the tower
    • Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
    • Provide weekly status reports to the client leadership team and internal stakeholders on database activities, covering progress, updates, status, and next steps
    • Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

  • Team Management
    • Resourcing
      • Forecast talent requirements as per the current and future business needs
      • Hire adequate and right resources for the team
      • Train direct reportees to make the right recruitment and selection decisions
  • Talent Management
    • Ensure 100% compliance to Wipro’s standards of adequate onboarding and training for team members to enhance capability & effectiveness
    • Build an internal talent pool of HiPos and ensure their career progression within the organization
    • Promote diversity in leadership positions
  • Performance Management
    • Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports.
    • Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities such programs present, both for themselves and for the levels below them
  • Employee Satisfaction and Engagement
    • Lead and drive engagement initiatives for the team
    • Track team satisfaction scores and identify initiatives to build engagement within the team
    • Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
    • Exercise employee recognition and appreciation

Stakeholder Interaction


Stakeholder Type | Stakeholder Identification | Purpose of Interaction
Internal | Technology Solutions Group, BU Teams, different infrastructure teams | Understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
Internal | IRMC, QA | Guidance on risk mitigation and quality standards
External | Clients | Understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
External | Vendors/Manufacturers | Development and deployment of platforms, applications, databases, etc.
Display


Lists the competencies required to perform this role effectively:

  • Functional Competencies/ Skill
    • Technical Knowledge - Knowledge of own tower (platform, application, database etc) - Expert
    • Domain Knowledge - Understanding of IT industry and its trends - Competent to Expert

Competency Levels



Foundation: Knowledgeable about the competency requirements. Demonstrates (in parts) frequently with minimal support and guidance.

Competent: Consistently demonstrates the full range of the competency without guidance. Extends the competency to difficult and unknown situations as well.

Expert: Applies the competency in all situations and serves as a guide to others as well.

Master: Coaches others and builds organizational capability in the competency area. Serves as a key resource for that competency and is recognised within the entire organization.



  • Behavioral Competencies
    • Managing Complexity
    • Client centricity
    • Execution Excellence
    • Passion for Results
    • Team Management
    • Stakeholder Management

Deliver


No. | Performance Parameter | Measure
1. | Operations of the tower | SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2. | New projects | Timely delivery; no unauthorised changes; no formal escalations