Bangalore, Karnataka, India
Information Technology
Full-Time
UST
Overview
Role Description
Job Overview:
UST is seeking a skilled Snowflake Engineer with 6 to 10 years of experience to join our team. The ideal candidate will play a key role in developing, implementing, and optimizing data solutions on the Snowflake cloud platform, and should bring strong Snowflake expertise, a solid understanding of ETL processes, and proficiency in data engineering and data processing technologies.
This role is essential for designing and maintaining high-performance data pipelines and data warehouses, focusing on scalability and efficient data storage.
Key Responsibilities
- Snowflake Data Warehouse Development:
- Design, implement, and optimize data warehouses on the Snowflake cloud platform.
- Ensure the effective utilization of Snowflake’s features for scalable, efficient, and high-performance data storage and processing.
- Data Pipeline Development:
- Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform.
- Design and maintain ETL workflows to enable seamless data processing across systems.
- Data Transformation with PySpark:
- Leverage PySpark for data transformations within the Snowflake environment.
- Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.
- Collaboration:
- Work closely with cross-functional teams to design data solutions aligned with business requirements.
- Engage with stakeholders to understand business needs and translate them into technical solutions.
- Optimization:
- Continuously monitor and optimize data storage, processing, and retrieval performance in Snowflake.
- Leverage Snowflake’s scalability and performance features to keep data storage and processing efficient.
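The data cleansing and validation responsibilities above can be illustrated with a minimal sketch. This is plain Python standing in for the kind of rules a PySpark job would apply before loading into Snowflake; the `customer_id`/`email` field names are hypothetical examples, not part of the role description.

```python
# Hypothetical cleansing/validation pass: drop rows missing the key,
# de-duplicate on the key, and normalize a text field. In a real pipeline
# the same rules would typically be expressed as PySpark DataFrame
# transformations before the load into Snowflake.
def cleanse(records):
    seen = set()
    out = []
    for r in records:
        if r.get("customer_id") is None:   # validation: key must be present
            continue
        if r["customer_id"] in seen:       # de-duplication on the key
            continue
        seen.add(r["customer_id"])
        # enrichment/normalization: trim and lowercase the email field
        r["email"] = (r.get("email") or "").strip().lower()
        out.append(r)
    return out
```

In PySpark the same logic would map onto `filter`, `dropDuplicates`, and column expressions, but the validation rules themselves are identical.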
Technical Skills
- Experience:
- 5 to 7 years of experience as a Data Engineer, with a strong emphasis on Snowflake.
- Proven experience in designing, implementing, and optimizing data warehouses on the Snowflake platform.
- Snowflake: Strong knowledge of Snowflake architecture, features, and best practices for data storage and performance optimization.
- ETL: Experience with ETL processes to extract, transform, and load data into Snowflake.
- Programming Languages: Proficiency in Python, SQL, or Scala for data processing and transformations.
- Data Modeling:
- Experience with data modeling techniques and designing efficient data schemas for optimal performance in Snowflake.
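As a small illustration of the data modeling skill listed above, the sketch below splits flat order records into a dimension table and a fact table with surrogate keys (a star-schema style step commonly performed before loading into Snowflake). All table and column names are illustrative assumptions.

```python
# Hypothetical star-schema split: one customer dimension with surrogate
# keys, one fact table referencing it. Plain Python for illustration only.
def to_star_schema(orders):
    dim_customer = {}   # customer_name -> surrogate key
    fact_orders = []
    for o in orders:
        name = o["customer_name"]
        if name not in dim_customer:            # assign surrogate key once
            dim_customer[name] = len(dim_customer) + 1
        fact_orders.append({
            "customer_key": dim_customer[name], # FK into the dimension
            "amount": o["amount"],
        })
    dims = [{"customer_key": k, "customer_name": n}
            for n, k in dim_customer.items()]
    return dims, fact_orders
```

Separating descriptive attributes into dimensions keeps the fact table narrow, which is the usual schema choice for query performance in a columnar warehouse like Snowflake.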
Snowflake, PySpark, SQL, ETL