Overview
Company: Global Banking Organization
Key Skills: Google Cloud Platform (GCP), AWS Cloud, Azure Cloud, Cloud Computing, Cloud Architect, Cloud Applications, Cloud Infrastructure, Cloud Integration, Cloud Migration, Cloud Security, Cloud Instances, Cloud Foundry, Pivotal Cloud Foundry (PCF), CloudStack, Cloudera, Cloudant, CloudBees, CloudTrail, CloudCraze, Cloudburst, Cloupia, Cloud Fusion, Smart Cloud, Amazon Elastic Compute Cloud (EC2), vCloud Director, Java Cloud, SAP Cloud, Salesforce Sales Cloud, Salesforce Service Cloud, Salesforce Marketing Cloud, Salesforce Experience Cloud, Python, Full-stack Python, wxPython, PyTorch, Pytest, PL/SQL, MySQL, NoSQL, Oracle SQL*Loader, Data Engineering, Manufacturing Data Engineering, Data Streaming, Data Extraction, Data Entry and Documentation
Roles & Responsibilities:
- Architect and implement complex data pipelines for batch and real-time processing using GCP services such as BigQuery, Dataflow, Composer, Dataproc, and Pub/Sub.
- Drive ETL strategy, optimize data ingestion from multiple banking systems, and ensure data quality, lineage, and accuracy.
- Lead data streaming and event-driven architecture initiatives using Kafka and Pub/Sub for real-time analytics.
- Design and review scalable data models supporting regulatory reporting, risk insights, and advanced analytics use cases.
- Automate cloud infrastructure provisioning and deployment workflows using Terraform and CI/CD pipelines.
- Oversee containerized workloads on GKE, ensuring cloud-native scalability and compliance.
- Maintain cloud security standards and collaborate with governance, risk, and security teams to meet financial regulatory requirements.
- Mentor junior engineers, define coding standards, and promote engineering best practices across the team.
- Partner with cross-functional teams to deliver secure, resilient, and high-performance data solutions.
Experience Required:
- 7-14 years of hands-on experience in Data Engineering and Cloud Platforms.
- Strong expertise in GCP (or AWS/Azure) with proven experience in building enterprise-grade data solutions.
- Experience designing ETL/ELT pipelines, real-time data flows, and event-driven systems.
- Proficiency in SQL, PL/SQL, Python, and cloud-native data services.
- Strong understanding of Big Data technologies, data modeling, and scalable data architectures.
- Hands-on experience with Terraform, CI/CD pipelines, and cloud security best practices.
- Prior experience in heavily regulated environments such as banking or financial services is preferred.
- Ability to lead technical initiatives and collaborate across engineering, security, and governance teams.
Education: Any graduate degree (any discipline).