Job Title : Data Engineer (DBT, Snowflake)
Company : PibyThree Consulting Pvt. Ltd.
Location : Pune, Maharashtra
Experience : 4+ Years
Notice Period : Immediate to 30 Days Preferred
Company Overview
PibyThree is a cloud transformation company that prepares enterprises for the future. We are a nimble, highly dynamic organization driven by trust, ownership, and deep technical expertise, helping clients build scalable solutions at optimized cost and with reduced risk.
Role Summary
The Data Engineer will be responsible for designing, building, and optimizing scalable data pipelines and analytics solutions using Snowflake and DBT. The role requires strong expertise in SQL, data modeling, and automation, with exposure to cloud-based data platforms.
Key Responsibilities
- Design, build, and manage scalable data pipelines and ETL/ELT workflows on Snowflake
- Develop and maintain DBT models to transform raw data into analytics-ready datasets
- Write, optimize, and troubleshoot complex SQL queries for data extraction and analysis
- Implement automation for data workflows using shell scripting
- Ensure data quality, reliability, and performance across data pipelines
- Collaborate with analytics, product, and engineering teams to support business reporting needs
- Monitor and resolve data pipeline failures and performance issues
- Contribute to documentation and best practices for data engineering processes
Key Deliverables
- Reliable and scalable Snowflake data pipelines delivered on time
- High-quality, well-documented DBT models supporting analytics use cases
- Optimized SQL performance and reduced data processing latency
- Automated and stable data workflows with minimal manual intervention
- Improved data availability and accuracy for business stakeholders
- Adherence to data engineering standards and cloud best practices
Required Skills
- Strong hands-on experience with the Snowflake cloud data platform
- Proficiency in DBT for data modeling and transformations
- Advanced SQL development and performance tuning skills
- Experience with shell scripting for workflow automation
- Solid understanding of cloud data architectures and ETL/ELT concepts
- Strong analytical, debugging, and problem-solving abilities
- Effective communication and collaboration skills
Good to Have
- Experience with Azure Data Factory (ADF)
- Working knowledge of Python for data processing or automation
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Minimum 4 years of experience in data engineering or analytics engineering roles
- Experience working in cloud-based data environments