Chennai, Tamil Nadu, India
Information Technology
Full-Time
Virtusa
Overview
Job Description
We are looking for a highly skilled Engineer with solid experience building big data and GCP cloud-based real-time data pipelines and REST APIs with Java frameworks. The Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
This role is focused on delivering innovative solutions that satisfy the needs of our business. As an agile team, we work closely with our business partners to understand their requirements, and we strive to continuously improve as a team.
Technical Skills
- Core Data Engineering Skills
Proficiency in using GCP's big data tools, such as:
BigQuery: for data warehousing and SQL analytics.
Dataproc: for running Spark and Hadoop clusters.
Dataflow: for stream and batch data processing (high-level understanding).
Pub/Sub: for real-time messaging and event ingestion (high-level understanding).
Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.
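To make the real-time ingestion requirement concrete, here is a minimal sketch of publishing an event with the google-cloud-pubsub Java client. The project ID, topic name, and payload are hypothetical placeholders, and error handling is kept to a bare minimum.

    import com.google.api.core.ApiFuture;
    import com.google.cloud.pubsub.v1.Publisher;
    import com.google.protobuf.ByteString;
    import com.google.pubsub.v1.PubsubMessage;
    import com.google.pubsub.v1.TopicName;
    import java.util.concurrent.TimeUnit;

    public class EventPublisher {
        public static void main(String[] args) throws Exception {
            // Hypothetical project and topic names, for illustration only.
            TopicName topic = TopicName.of("my-gcp-project", "order-events");
            Publisher publisher = Publisher.newBuilder(topic).build();
            try {
                // Wrap a JSON payload in a Pub/Sub message.
                PubsubMessage message = PubsubMessage.newBuilder()
                        .setData(ByteString.copyFromUtf8("{\"orderId\":42,\"status\":\"CREATED\"}"))
                        .build();
                // publish() is asynchronous; get() blocks until the server acknowledges.
                ApiFuture<String> messageId = publisher.publish(message);
                System.out.println("Published message " + messageId.get());
            } finally {
                publisher.shutdown();
                publisher.awaitTermination(1, TimeUnit.MINUTES);
            }
        }
    }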
- Programming and Scripting
Familiarity with APIs and SDKs for GCP services to build custom data solutions.
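As an example of working with a GCP SDK directly, the sketch below runs a standard-SQL query with the google-cloud-bigquery Java client. The project, dataset, and table names are hypothetical.

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;
    import com.google.cloud.bigquery.TableResult;

    public class OrderStats {
        public static void main(String[] args) throws InterruptedException {
            // Uses Application Default Credentials for authentication.
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
            // Hypothetical dataset and table; counts orders per status.
            QueryJobConfiguration query = QueryJobConfiguration
                    .newBuilder("SELECT status, COUNT(*) AS n "
                            + "FROM `my-gcp-project.sales.orders` GROUP BY status")
                    .setUseLegacySql(false)
                    .build();
            TableResult result = bigquery.query(query);
            result.iterateAll().forEach(row -> System.out.println(
                    row.get("status").getStringValue() + ": " + row.get("n").getLongValue()));
        }
    }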
- Cloud Infrastructure
Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional, but good to have).
- DevOps and CI/CD
Experience with monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows.
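On the logging side, a minimal sketch of writing an entry with the google-cloud-logging Java client might look like the following; the log name is a hypothetical placeholder.

    import com.google.cloud.MonitoredResource;
    import com.google.cloud.logging.LogEntry;
    import com.google.cloud.logging.Logging;
    import com.google.cloud.logging.LoggingOptions;
    import com.google.cloud.logging.Payload.StringPayload;
    import com.google.cloud.logging.Severity;
    import java.util.Collections;

    public class PipelineLogger {
        public static void main(String[] args) throws Exception {
            // Logging implements AutoCloseable, so try-with-resources releases the client.
            try (Logging logging = LoggingOptions.getDefaultInstance().getService()) {
                LogEntry entry = LogEntry.newBuilder(StringPayload.of("Pipeline run completed"))
                        .setSeverity(Severity.INFO)
                        .setLogName("pipeline-log") // hypothetical log name
                        .setResource(MonitoredResource.newBuilder("global").build())
                        .build();
                // write() batches entries; flush() forces delivery before exit.
                logging.write(Collections.singleton(entry));
                logging.flush();
            }
        }
    }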
- Backend Development (Spring Boot & Java)
Design and develop RESTful APIs and microservices using Spring Boot.
Implement business logic, security, authentication (JWT/OAuth), and database operations.
Work with databases such as MySQL, PostgreSQL, Cloud SQL, and MongoDB.
Optimize backend performance, scalability, and maintainability.
Implement unit testing and integration testing.
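To illustrate the kind of backend work described above, here is a minimal, self-contained Spring Boot sketch exposing one REST resource. The resource name, fields, and in-memory map are hypothetical stand-ins for a real domain model and database, and security (JWT/OAuth) is omitted.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @RestController
    @RequestMapping("/api/v1/customers")
    public class CustomerApi {

        // Hypothetical domain object; Jackson serializes records out of the box.
        record Customer(long id, String name) {}

        // In-memory store standing in for MySQL/PostgreSQL/Cloud SQL.
        private final Map<Long, Customer> store = new ConcurrentHashMap<>();

        // POST /api/v1/customers creates or replaces a customer.
        @PostMapping
        public Customer create(@RequestBody Customer customer) {
            store.put(customer.id(), customer);
            return customer;
        }

        // GET /api/v1/customers/{id} returns 200 with the customer, or 404 if absent.
        @GetMapping("/{id}")
        public ResponseEntity<Customer> get(@PathVariable long id) {
            Customer found = store.get(id);
            return found != null ? ResponseEntity.ok(found) : ResponseEntity.notFound().build();
        }

        public static void main(String[] args) {
            SpringApplication.run(CustomerApi.class, args);
        }
    }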
ETL - Data Warehousing
GCP
Java
REST API
CI/CD
Kubernetes