Hyderabad, Telangana, India
Healthcare & Life Sciences
Full-Time
UPS
Overview
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.
Key Responsibilities:
- Design, implement, and optimize data pipelines using Google Cloud Platform (GCP) services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
- Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
- Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
- Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow (a brief illustrative sketch follows this list).
- Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
- Ensure the scalability, reliability, and security of cloud-based data architectures.
- Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
- Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
- Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
- Provide mentorship and guidance to junior data engineers and collaborate with them to achieve team goals.
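For context on the streaming-pipeline responsibilities above, here is a minimal Python (Apache Beam) sketch of a Pub/Sub-to-Dataflow-to-BigQuery flow. The project, topic, table, and schema names are illustrative assumptions only and are not part of this posting.

```python
# Minimal sketch of a streaming pipeline of the kind described above:
# Pub/Sub -> Dataflow (Apache Beam) -> BigQuery.
# All project, topic, table, and field names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode; the Dataflow runner is chosen at launch time (e.g. --runner=DataflowRunner).
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/shipment-events"
            )
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.shipment_events",
                schema="event_id:STRING,event_ts:TIMESTAMP,status:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

In practice a production pipeline of this shape would also include error handling, dead-lettering, and monitoring; the sketch shows only the core read-transform-write structure.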
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
- Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Strong expertise in SQL for query optimization and performance tuning in large-scale datasets.
- Solid experience in designing data schemas, data pipelines, and ETL processes.
- Strong understanding of data modeling techniques and experience with schema design for both transactional and analytical systems.
- Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost-optimization strategies (see the sketch after this list).
- Experience with managing and processing streaming data and batch data processing workflows.
- Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
- Familiarity with data security, governance, and compliance best practices on GCP.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
- Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.
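As a concrete illustration of the BigQuery optimization skills listed above, the following sketch uses the google-cloud-bigquery Python client to create a day-partitioned, clustered table. The project, dataset, table, and column names are hypothetical examples, not details from this posting.

```python
# Illustrative sketch of BigQuery partitioning and clustering using the
# google-cloud-bigquery client. All names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumes application default credentials

table = bigquery.Table(
    "my-project.analytics.shipment_events_daily",
    schema=[
        bigquery.SchemaField("event_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
    ],
)

# Partition by day on the event timestamp and cluster by customer_id so that
# date-bounded, customer-filtered queries scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```

Partitioning on the event timestamp and clustering on a frequently filtered column keeps date-bounded queries from scanning the full table, which is a common basis of BigQuery cost optimization.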
Preferred Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field.
- Familiarity with infrastructure as code tools like Terraform or Cloud Deployment Manager.
- GCP certifications (e.g., Google Cloud Professional Data Engineer or Cloud Architect).
Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.