
Data Engineer/Associate Director

Bangalore, Karnataka, India
Information Technology
Full-Time
HSBC

Overview

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of

Senior Consultant Specialist / Consultant Specialist / Senior Software Engineer / Software Engineer

(title assigned based on years of experience and role)

In this role, you will:

  • We are seeking a highly skilled and experienced Senior Data Engineer with expertise in Java 8+, microservices, Spring Boot 3.x, PostgreSQL, JPA, UI development (React, TypeScript, JavaScript), Apache Flink, Apache Beam, MongoDB, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate will also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting and automation. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.
  • Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam.
  • Implement stateful stream processing, event-time handling, and windowing with Flink.
  • Optimize Flink jobs for performance, scalability, and fault tolerance.
  • Build scalable, high-performance applications using Java.
  • Write clean, maintainable, and efficient code following best practices.
  • Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases.
  • Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution.
  • Monitor and troubleshoot Airflow DAGs to ensure smooth operations.
  • Leverage GCP services to build and deploy cloud-native solutions:
      • Dataflow: Design and deploy real-time and batch data processing pipelines.
      • BigQuery: Perform data analysis and optimize queries for large datasets.
      • Pub/Sub: Implement messaging and event-driven architectures.
      • GCS: Manage and optimize cloud storage for data pipelines.
      • Composer: Orchestrate workflows using Apache Airflow on GCP.
  • Deploy and manage containerized applications on Google Kubernetes Engine (GKE).
  • Design Kubernetes manifests and Helm charts for deploying scalable and fault-tolerant applications.
  • Design and manage NoSQL databases using MongoDB, including schema design, indexing, and query optimization.
  • Ensure data consistency and performance for high-throughput applications.
  • Use Python for scripting, automation, and building utility tools.
  • Write Python scripts to interact with APIs, process data, and manage workflows.
  • Architect distributed systems with a focus on scalability, reliability, and performance.
  • Design fault-tolerant systems with high availability using best practices.
  • Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions.
  • Participate in code reviews, design discussions, and technical decision-making.
  • Monitor production systems using tools like Stackdriver, Prometheus, or Grafana.
  • Optimize resource usage and costs for GCP services and Kubernetes clusters.
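The stateful stream-processing concepts listed above (keyed state, event-time windowing, watermarks, late-event handling) can be sketched in plain Python. This is an illustrative sketch only, not Flink code: Flink's DataStream API expresses the same ideas through operators such as keyed state and window assigners, and all names and parameter values below are hypothetical.

```python
from collections import defaultdict

# Illustrative sketch of event-time tumbling windows with a watermark,
# mirroring concepts from Flink's DataStream API (not the Flink API itself).
WINDOW_MS = 10_000        # 10-second tumbling windows (hypothetical value)
MAX_LATENESS_MS = 2_000   # events later than this relative to the watermark are dropped

def assign_window(event_time_ms):
    """Map an event timestamp to the start of its tumbling window."""
    return (event_time_ms // WINDOW_MS) * WINDOW_MS

def process(events):
    """events: iterable of (event_time_ms, value). Returns per-window sums.

    The watermark trails the maximum observed event time by MAX_LATENESS_MS;
    windows whose end is at or before the watermark are closed and emitted.
    """
    state = defaultdict(int)   # window_start -> running sum ("keyed state")
    emitted = {}
    watermark = float("-inf")
    for ts, value in events:
        watermark = max(watermark, ts - MAX_LATENESS_MS)
        win = assign_window(ts)
        if win + WINDOW_MS <= watermark:
            continue  # event arrived too late: its window is already closed
        state[win] += value
        # close every window whose end has passed the watermark
        for w in [w for w in state if w + WINDOW_MS <= watermark]:
            emitted[w] = state.pop(w)
    emitted.update(state)  # flush remaining open windows at end of stream
    return emitted
```

For example, `process([(1000, 1), (2000, 2), (11000, 3), (25000, 4)])` sums the first two events into the `[0, 10000)` window, while a very late event whose window end already trails the watermark is silently dropped, the behaviour Flink gives by default for late data.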

Qualifications

To be successful in this role, you should meet the following requirements:

  • Strong proficiency in Java (as listed above), with experience in building scalable, high-performance applications.
  • Basic to intermediate knowledge of Python for scripting and automation.
  • Hands-on experience with Apache Flink for real-time stream processing and batch processing.
  • Knowledge of Flink’s state management, windowing, and event-time processing.
  • Experience with Flink’s integration with GCP services.
  • Knowledge of Apache Beam for unified batch and stream data processing.
  • Proficiency in Apache Airflow for building and managing workflows.
  • Experience with Composer on GCP is a plus.
  • Cloud Platform Expertise:
      • Strong experience with Google Cloud Platform (GCP) services: Dataflow, BigQuery, Pub/Sub, GCS, and Composer.
      • Familiarity with GCP IAM, networking, and cost optimization.
  • Hands-on experience with Docker for containerization.
  • Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE).
  • Expertise in MongoDB, including schema design, indexing, and query optimization.
  • Familiarity with other NoSQL or relational databases is a plus.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Ability to work in an agile environment and adapt to changing requirements.
  • Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming.
  • Knowledge of other cloud platforms (AWS, Azure) is a plus.
  • Familiarity with Helm charts for Kubernetes deployments.
  • Experience with monitoring tools like Prometheus, Grafana, or Stackdriver.
  • Knowledge of security best practices for cloud and Kubernetes environments.
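The workflow-orchestration skills above centre on expressing a pipeline as a DAG of dependent tasks. Airflow defines DAGs with operators and `>>` dependency arrows inside its own runtime; the core scheduling idea, deriving a dependency-respecting execution order, can be sketched with only the Python standard library. The task names and graph below are hypothetical, not part of any real pipeline.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Illustrative sketch (not the Airflow API): a DAG of pipeline tasks mapped to
# their upstream dependencies, as an Airflow/Composer scheduler would model it.
dag = {
    "extract_gcs":     set(),                # read raw files from GCS
    "transform_flink": {"extract_gcs"},      # stream/batch transform step
    "load_bigquery":   {"transform_flink"},  # load results into BigQuery
    "publish_pubsub":  {"load_bigquery"},    # notify downstream consumers
}

def execution_order(dag):
    """Return one valid topological ordering of the task graph."""
    return list(TopologicalSorter(dag).static_order())
```

Running `execution_order(dag)` yields the order a scheduler would honour: extraction first, publication last, with each task scheduled only after all of its upstream dependencies have completed.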

You’ll achieve more when you join HSBC.

www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
