Data Engineer (Snowflake) | Antal Tech Jobs
5 Days ago

Data Engineer (Snowflake)

Pune, Maharashtra, India
Information Technology
Full-Time
Allata

Overview

Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices.

Allata also empowers clients to unlock the value of their data through analytics and visualization, and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

If you are a smart and passionate team player, then this Senior Data Engineer (Snowflake) opportunity is for you!

We at IMRIEL (An Allata Company) are looking for a Senior Data Engineer to implement methods that improve data reliability and quality. You will combine raw information from different sources into consistent, machine-readable formats, and you will develop and test architectures that enable data extraction and transformation for predictive or prescriptive modeling. Resourcefulness is a necessary skill in this role. If you truly love gaining new technical knowledge and can add more awesomeness to the team, we would love to hear from you!
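Combining raw data from heterogeneous sources into one consistent, machine-readable format is the day-to-day core of this role. A minimal sketch in plain Python, assuming two hypothetical sources (a CRM export and a JSON API) whose field names, types, and date formats disagree:

```python
from datetime import datetime, timezone

# Two hypothetical raw records from different sources with
# inconsistent field names, types, and date representations.
crm_row = {"CustomerID": "42", "SignupDate": "2024-03-01", "Email": "A@EXAMPLE.COM"}
api_row = {"id": 7, "signup_ts": 1709251200, "email": " b@example.com "}

def normalize_crm(row):
    """Map a CRM export row onto the canonical schema."""
    return {
        "customer_id": int(row["CustomerID"]),
        "signup_date": row["SignupDate"],
        "email": row["Email"].strip().lower(),
    }

def normalize_api(row):
    """Map an API row (epoch-second timestamp) onto the same schema."""
    return {
        "customer_id": int(row["id"]),
        "signup_date": datetime.fromtimestamp(
            row["signup_ts"], tz=timezone.utc
        ).date().isoformat(),
        "email": row["email"].strip().lower(),
    }

records = [normalize_crm(crm_row), normalize_api(api_row)]
```

In a real pipeline the per-source normalizers would run inside PySpark transformations, but the schema-unification idea is the same.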

What you'll be doing:

  • Architecting, developing, and maintaining scalable, efficient, and fault-tolerant data pipelines to process, clean, and integrate data from diverse sources using Python and PySpark
  • Designing and implementing modern Data Warehouse and Data Lake solutions on cloud platforms like Azure or AWS to support complex analytical and operational workloads
  • Building and automating ETL/ELT workflows using advanced tools such as Snowflake, Azure Data Factory, or similar platforms, ensuring optimal performance and scalability
  • Leveraging DBT (Data Build Tool) to define, document, and execute data transformation and modeling workflows
  • Writing optimized SQL queries for data retrieval, aggregation, and transformation to support downstream analytics applications
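Much of the warehouse-side work above reduces to SQL aggregation and transformation. A self-contained sketch using SQLite as a stand-in for Snowflake, with a hypothetical `orders` table rolled up into a per-customer summary, the kind of transformation an ELT step would materialize for downstream analytics:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER,
                         amount REAL, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 100, 25.0, '2024-01-05'),
        (2, 100, 40.0, '2024-02-10'),
        (3, 200, 15.0, '2024-01-20');
""")
# Aggregate raw orders into a per-customer summary table shape.
rows = conn.execute("""
    SELECT customer_id,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer_id
    ORDER BY customer_id
""").fetchall()
```

On Snowflake the same statement would typically live in a DBT model so the transformation is versioned, documented, and testable.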


What you need:

Basic Skills:

  • Advanced skills in Python and PySpark for high-performance distributed data processing
  • Proficient in creating data pipelines with orchestration frameworks like Apache Airflow or Azure Data Factory
  • Strong experience with Snowflake, SQL Data Warehouse, and Data Lake architectures
  • Ability to write, optimize, and troubleshoot complex SQL queries and stored procedures
  • Deep understanding of building and managing ETL/ELT workflows using tools such as DBT, Snowflake, or Azure Data Factory
  • Hands-on experience with cloud platforms such as Azure or AWS, including services like S3, Lambda, Glue, or Azure Blob Storage
  • Proficient in designing and implementing data models, including star and snowflake schemas
  • Familiarity with distributed processing systems and concepts such as Spark, Hadoop, or Databricks
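As a concrete illustration of the star-schema modeling listed above, here is a minimal sketch, again using SQLite in place of a cloud warehouse; all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables hold descriptive attributes
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY,
                               name TEXT, region TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY,
                           full_date TEXT, year INTEGER);
    -- The fact table stores measures plus foreign keys to each dimension
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme', 'APAC'), (2, 'Globex', 'EMEA');
    INSERT INTO dim_date VALUES (20240105, '2024-01-05', 2024);
    INSERT INTO fact_sales VALUES (1, 1, 20240105, 120.0), (2, 2, 20240105, 80.0);
""")
# A typical star-schema query: join the fact table to a dimension
# and aggregate a measure by a dimension attribute.
result = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
```

A snowflake schema would further normalize the dimensions (e.g. splitting `region` into its own table); the fact-to-dimension join pattern stays the same.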


Responsibilities:

  • Develop robust, efficient, and reusable pipelines to process and transform large-scale datasets using Python and PySpark
  • Design pipeline workflows for batch and real-time data processing using orchestration tools like Apache Airflow or Azure Data Factory
  • Implement automated data ingestion frameworks to extract data from structured, semi-structured, and unstructured sources such as APIs, FTP, and data streams
  • Architect and optimize scalable Data Warehouse and Data Lake solutions using Snowflake, Azure Data Lake, or AWS S3
  • Implement partitioning, bucketing, and indexing strategies for efficient querying and data storage management
  • Develop ETL/ELT pipelines using tools like Azure Data Factory or Snowflake to handle complex data transformations and business logic
  • Integrate DBT (Data Build Tool) to automate data transformations, ensuring modularity and testability
  • Ensure pipelines are optimized for cost-efficiency and high performance, leveraging features such as pushdown optimization and parallel processing
  • Write, optimize, and troubleshoot complex SQL queries for data manipulation, aggregation, and reporting
  • Design and implement dimensional and normalized data models (e.g., star and snowflake schemas) for analytics use cases
  • Deploy and manage data workflows on cloud platforms (Azure or AWS) using services like AWS Glue, Azure Synapse Analytics, or Databricks
  • Monitor resource usage and costs, implementing cost-saving measures such as data lifecycle management and auto-scaling
  • Implement data quality frameworks to validate, clean, and enrich datasets
  • Build self-healing mechanisms to minimize downtime and ensure the reliability of critical pipelines
  • Optimize distributed data processing workflows for Spark by tuning configurations such as executor memory and partitioning
  • Conduct profiling and debugging of data workflows to identify and resolve bottlenecks
  • Collaborate with data analysts, scientists, and stakeholders to define requirements and deliver usable datasets
  • Maintain clear and comprehensive documentation for pipelines, workflows, and architectural decisions
  • Conduct code reviews to ensure best practices in coding and performance optimization
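The data-quality responsibility above can be sketched as a minimal rule-based validator in plain Python; the rules and field names are hypothetical, and a production setup would typically use DBT tests or a dedicated framework such as Great Expectations instead:

```python
# Each rule maps a column name to a predicate; rows failing any rule
# are quarantined (with the failed rule names) rather than loaded.
RULES = {
    "customer_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def validate(rows):
    """Split rows into clean and quarantined, recording failed rules."""
    clean, quarantined = [], []
    for row in rows:
        failures = [col for col, check in RULES.items()
                    if not check(row.get(col))]
        if failures:
            quarantined.append((row, failures))
        else:
            clean.append(row)
    return clean, quarantined

clean, bad = validate([
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": -5, "email": "not-an-email"},
])
```

Routing failures to a quarantine set rather than dropping them is what makes downstream "self-healing" possible: bad rows can be inspected, fixed, and replayed.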


Good To Have:

  • Experience with real-time data processing frameworks such as Kafka or Kinesis
  • Snowflake certifications
  • Cloud certifications (Azure, AWS, or GCP)
  • Knowledge of data visualization platforms such as Power BI, Tableau, or Looker for integration purposes
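The real-time processing item can be illustrated independently of any broker: a tumbling-window count over a hypothetical event stream, which is the core aggregation a Kafka or Kinesis consumer would perform on each batch of records it polls:

```python
from collections import defaultdict

# Hypothetical events (what a broker consumer would yield),
# each carrying an epoch-second timestamp and a metric key.
events = [
    {"ts": 0, "key": "clicks"},
    {"ts": 30, "key": "clicks"},
    {"ts": 70, "key": "clicks"},
]

def tumbling_window_counts(stream, window_s=60):
    """Count events per key in fixed, non-overlapping time windows."""
    counts = defaultdict(int)
    for event in stream:
        # Align each event to the start of its window.
        window_start = (event["ts"] // window_s) * window_s
        counts[(window_start, event["key"])] += 1
    return dict(counts)

counts = tumbling_window_counts(events)
```

Stream frameworks add delivery guarantees, state checkpointing, and late-arrival handling on top of this windowing idea.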


Personal Attributes:

  • Ability to identify, troubleshoot, and resolve complex data issues effectively
  • Strong teamwork and communication skills, and the intellectual curiosity to work collaboratively and effectively with cross-functional teams
  • Commitment to delivering high-quality, accurate, and reliable data products and solutions
  • Willingness to embrace new tools, technologies, and methodologies
  • Innovative thinker with a proactive approach to overcoming challenges


At Allata, we value differences.

Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability or any other legally protected category.

This policy applies to all terms and conditions of employment, including but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.