Chennai, Tamil Nadu, India
Information Technology
Full-Time
Onix
Overview
Job Description
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL/ELT workflows using GCP-native tools and services.
- Build and optimize data warehouses using Snowflake.
- Write complex and efficient SQL queries for data transformation, analysis, and reporting.
- Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions.
- Implement data governance, security, and monitoring best practices across GCP projects.
- Tune queries and optimize performance of large-scale datasets.
- Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.
Required Skills & Qualifications:
- 3+ years of experience in a data engineering or data platform role.
- Strong hands-on experience with Snowflake data warehousing.
- Expert-level skills in SQL — able to write optimized, scalable, and complex queries.
- Experience with data modeling (star/snowflake schema), partitioning, clustering, and performance tuning in a data warehouse.
- Familiarity with modern ELT tools such as dbt, Fivetran, or Cloud Data Fusion.
- Experience with Python or a similar scripting language for data engineering tasks.
- Understanding of data governance and privacy practices.
- Familiarity with Google Cloud Platform services, especially BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.