Bangalore, Karnataka, India
Information Technology
Contract
Liquidity Lock
Overview
Build and optimize our Snowflake-based analytics platform, manage multi-source data ingestion, and architect cost-efficient query strategies for frontend visualization.
Responsibilities
- Design and maintain Snowflake data warehouse infrastructure
- Build ETL/ELT pipelines from multiple data sources
- Optimize query performance and reduce compute costs
- Architect a computation placement strategy between DuckDB and Snowflake
- Design aggregation layers and materialized views for visualization workloads
- Implement incremental processing and caching strategies
- Monitor warehouse performance and cost metrics
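The incremental processing responsibility above can be illustrated with a minimal high-water-mark sketch. The in-memory source rows and the `load_incremental` helper are hypothetical stand-ins for a real Snowflake stream or staging table; this is an illustration of the pattern, not a prescribed implementation:

```python
from datetime import datetime, timezone

# Hypothetical source rows, each stamped with an updated_at timestamp.
SOURCE = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
    {"id": 3, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]

def load_incremental(source, high_water_mark):
    """Return only rows newer than the last processed timestamp,
    plus the new high-water mark to persist for the next run."""
    fresh = [
        r for r in source
        if high_water_mark is None or r["updated_at"] > high_water_mark
    ]
    new_mark = max((r["updated_at"] for r in fresh), default=high_water_mark)
    return fresh, new_mark

# First run processes everything; a second run against an unchanged
# source sees no new rows because the mark has advanced.
rows, mark = load_incremental(SOURCE, None)
rows2, mark2 = load_incremental(SOURCE, mark)
```

Persisting the high-water mark between runs (in a metadata table or state store) is what keeps each pipeline execution from re-scanning and re-billing the full source.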
Required Skills
- Snowflake: Deep experience with warehouse architecture, clustering, materialized views, query optimization
- SQL: Advanced query tuning, execution plan analysis, cost optimization
- DuckDB: Experience with embedded analytics and local computation
- Data Integration: Multi-source ingestion (databases, APIs, streaming), ETL/ELT patterns
- Data Modeling: Star/snowflake schemas, OLAP design, aggregation strategies
- Performance Engineering: Query optimization, indexing, partitioning, incremental updates
Preferred
- dbt or similar transformation frameworks
- Python for data processing and automation
- Data visualization pipeline experience
Key Challenge
Balance query performance, visualization responsiveness, and Snowflake costs by intelligently distributing computations between local (DuckDB) and cloud (Snowflake) environments.
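One way to frame the placement decision above is as a routing rule driven by estimated scan size: small, frequently queried aggregates run locally in DuckDB against extracted data, while large scans stay in Snowflake. The sketch below is a minimal illustration under assumed inputs; the 500 MB cutoff, the table names, and the size statistics are all hypothetical, and a real router would also weigh data freshness and warehouse credit costs:

```python
# Illustrative threshold for what fits comfortably in a local engine.
LOCAL_SCAN_LIMIT_MB = 500

# Hypothetical table statistics that would come from warehouse metadata.
TABLE_SIZE_MB = {"daily_sales_agg": 120, "raw_events": 48_000}

def choose_engine(tables):
    """Pick 'duckdb' when every referenced table fits the local budget,
    otherwise fall back to 'snowflake'. Unknown tables are treated as
    unbounded so they never route locally."""
    total = sum(TABLE_SIZE_MB.get(t, float("inf")) for t in tables)
    return "duckdb" if total <= LOCAL_SCAN_LIMIT_MB else "snowflake"
```

Under these assumptions, a dashboard query touching only `daily_sales_agg` routes to DuckDB, while anything that scans `raw_events` is pushed to Snowflake.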