Overview
Looking for Immediate Joiners (Core Data Engineers with Programming & SQL Query Optimization on GCP)
Opportunity: Work from Office (On-site)
Experience: 5-7 years
Job Description:
We are seeking a highly skilled and experienced GCP Cloud Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering with a focus on Google Cloud Platform (GCP) services. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP, ensuring data is accessible, reliable, and available for business use.
Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines using GCP services such as Dataflow, Dataproc, BigQuery, and Cloud Storage.
- Data Integration: Work on integrating data from various sources (structured, semi-structured, and unstructured) into GCP environments.
- Data Modeling: Develop and maintain efficient data models in BigQuery to support analytics and reporting needs.
- Data Warehousing: Implement data warehousing solutions on GCP, optimizing performance and scalability.
- ETL/ELT Processes: Build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python.
- Data Quality & Governance: Implement data quality checks, data lineage, and data governance best practices to ensure high data integrity.
- Automation: Automate data pipelines and workflows to reduce manual effort and improve efficiency.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs.
- Optimization: Continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency.
- Security: Ensure data security and compliance with industry standards and best practices.
Required Skills & Qualifications:
- Education: Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 5+ years of experience in data engineering, including at least 2 years working with one of GCP, AWS, or Azure.
- Technical Skills:
- Proficiency in GCP services: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Cloud Functions.
- Strong programming skills in Python, SQL, and PySpark, with familiarity with Java/Scala.
- Experience with orchestration tools like Apache Airflow.
- Knowledge of ETL/ELT processes and tools.
- Experience with data modeling and designing data warehouses in BigQuery.
- Familiarity with CI/CD pipelines and version control systems like Git.
- Understanding of data governance, security, and compliance.
- Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.
Preferred Qualifications:
- Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification.
- Domain Knowledge: Experience in the finance, e-commerce, or healthcare domains is a plus.
Only immediate joiners who are open to working from the office.
Education Qualification Mandate: BE/BTech in Electronics, Computer Science & Engineering, IT, or a circuit branch.
Job Types: Full-time, Permanent
Pay: ₹1,559,734.41 - ₹2,444,720.06 per year
Benefits:
- Food provided
- Provident Fund
Supplemental Pay:
- Yearly bonus
Ability to commute/relocate:
- Bangalore, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
- Are you open to overlapping with US client calls / US shift hours when technical assistance is required?
Education:
- Bachelor's (Preferred)
Experience:
- Cloud Engineering: 5 years (Required)
- Java / Scala Programming: 5 years (Required)
- Python / SQL Developer: 5 years (Required)
- ETL Process: 5 years (Preferred)
Location:
- Bangalore, Karnataka (Preferred)
Work Location: In person
Application Deadline: 12/04/2025
Expected Start Date: 14/04/2025