Hyderabad, Telangana, India
Information Technology
Full-Time
Tredence Inc.
Overview
Role description
As a GCP DBT Manager, you will work with the team to help design, build, and maintain data pipelines and transformations using Google Cloud Platform (GCP) and the Data Build Tool (dbt). This often includes using tools such as BigQuery, Cloud Composer, and Python, and requires strong SQL skills and knowledge of data warehousing concepts. The role also involves ensuring data quality, optimizing performance, and collaborating with cross-functional teams.
Role & responsibilities
Data Pipeline Development:
Designing, building, and maintaining ETL/ELT pipelines using dbt and GCP services like BigQuery and Cloud Composer.
Data Modeling:
Creating and managing data models and transformations using dbt to ensure efficient and accurate data consumption for analytics and reporting.
Data Quality:
Developing and maintaining a data quality framework, including automated testing and cross-dataset validation.
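For illustration, dbt data quality tests are typically declared in a model's YAML file; the model and column names below are hypothetical, and this is only a minimal sketch of the kind of automated testing the role involves:

```yaml
# models/marts/schema.yml -- hypothetical model and column names
version: 2

models:
  - name: fct_orders
    description: "Order fact table built in BigQuery"
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: customer_id
        tests:
          - relationships:
              to: ref('dim_customers')
              field: customer_id
```

Running `dbt test` would then check uniqueness, null values, and cross-dataset referential integrity automatically.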
Performance Optimization:
Writing and optimizing SQL queries for efficient data processing within BigQuery.
Collaboration:
Working with data engineers, analysts, scientists, and business stakeholders to deliver data solutions.
Incident Resolution:
Supporting day-to-day incident and ticket resolution related to data pipelines.
Documentation:
Creating and maintaining comprehensive documentation for data pipelines, configurations, and procedures.
Cloud Platform Expertise:
Utilizing GCP services such as BigQuery, Cloud Composer, and Cloud Functions.
Scripting:
Developing and maintaining SQL/Python scripts for data ingestion, transformation, and automation tasks.
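A minimal sketch of the kind of Python automation script this responsibility might cover: a helper that generates a BigQuery `MERGE` statement for an incremental upsert. Table and column names are hypothetical, and the SQL assumes BigQuery standard SQL dialect:

```python
# Sketch of an ingestion/automation helper, assuming BigQuery standard SQL.
# All table and column names below are hypothetical examples.

def build_merge_sql(target: str, staging: str, key: str, columns: list[str]) -> str:
    """Build a MERGE statement that upserts staged rows into a target table."""
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"S.{c}" for c in [key] + columns)
    return (
        f"MERGE `{target}` T\n"
        f"USING `{staging}` S\n"
        f"ON T.{key} = S.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql(
    target="proj.dwh.fct_orders",
    staging="proj.stg.orders",
    key="order_id",
    columns=["status", "amount"],
)
print(sql)
```

In practice a script like this would be scheduled via Cloud Composer and executed through the BigQuery client library rather than printed.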
Preferred candidate profile
Requirements:
Experience:
Typically requires 7–12 years of experience in data engineering or a related field.
GCP Proficiency:
Strong hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery.
dbt Expertise:
Proficiency in using dbt for data transformation, testing, and documentation.
SQL Proficiency:
Advanced SQL skills for data modeling, performance optimization, and querying large datasets.
Data Warehousing:
Understanding of data warehousing concepts, dimensional modeling, and star schema design.
ETL/ELT:
Experience with ETL/ELT tools and frameworks, including Apache Beam, Cloud Dataflow, Data Fusion, or Airflow/Composer.