Overview
Job Description: Data Engineer

Core Technical Skills

Snowflake & Spark
• Building and managing scalable data pipelines.
• Spark-based transformations and ETL workflows.
• Expertise in PySpark, including optimization techniques and cost management.
• Snowflake-specific capabilities:
  o Performance tuning and query optimization.
  o Partitioning and clustering strategies.
  o Cost control and resource management.
  o Advanced features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks for data engineering workflows.
• Delta Lake concepts (ACID transactions, Z-Ordering, OPTIMIZE, VACUUM) for hybrid architectures.

SQL & Relational Databases
• Advanced SQL query writing.
• PostgreSQL expertise (window functions, CTEs, query plans, indexing strategy).

Streaming & Messaging
• Apache Kafka for real-time ingestion and topic management.
• Understanding of event-driven architecture.

AWS & Cloud Services
• Proficiency with AWS Glue, Lambda, Step Functions, and AWS data analytics services.
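The PostgreSQL requirement above names CTEs and window functions specifically. A minimal sketch of both, using SQLite via the Python standard library purely so the snippet is self-contained; the table and data are illustrative, and this flavor of CTE and `RANK()` syntax is also valid PostgreSQL:

```python
import sqlite3

# Illustrative data: in-memory database, hypothetical "orders" table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 100), ('alice', 250), ('bob', 80), ('bob', 300);
""")

query = """
WITH totals AS (                 -- CTE: per-customer spend totals
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank  -- window function
FROM totals
ORDER BY spend_rank;
"""
for row in conn.execute(query):
    print(row)
```

The same two constructs combine naturally in interview-style exercises: the CTE stages an aggregate, and the window function ranks over it without a self-join.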
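"Event-driven architecture" in the Kafka bullet refers to producers publishing to named topics and decoupled consumers reacting to those events. A toy in-memory broker sketching that pattern; this is not Kafka itself (a real pipeline would use a Kafka client library), and the topic name and event payload are made up for illustration:

```python
from collections import defaultdict

class InMemoryBroker:
    """Toy stand-in for a message broker: topic -> list of handlers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Consumers register interest in a topic, as with a Kafka consumer group.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Producers emit events without knowing who consumes them.
        for handler in self._subscribers[topic]:
            handler(event)

broker = InMemoryBroker()
received = []
broker.subscribe("orders.created", received.append)
broker.publish("orders.created", {"order_id": 42, "amount": 99})
```

The key property Kafka adds on top of this shape is durable, partitioned topics, so producers and consumers can also be decoupled in time, not just in code.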
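For the AWS bullet, Lambda functions in Python follow a fixed handler shape: a function taking an event and a context, returning a serializable result. A hedged sketch of that shape as it might appear in an ingestion pipeline; the event fields (`bucket`, `key`) and the processing step are illustrative assumptions, not a fixed AWS schema:

```python
import json

def lambda_handler(event, context):
    # "bucket"/"key" are hypothetical fields for this sketch; a real S3
    # trigger delivers a nested "Records" structure instead.
    bucket = event.get("bucket")
    key = event.get("key")
    # A real function would process the object here, e.g. read it from S3
    # and stage it for loading into Snowflake.
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": f"s3://{bucket}/{key}"}),
    }

result = lambda_handler({"bucket": "raw-zone", "key": "2024/01/data.csv"}, None)
```

In practice such a handler would sit between a trigger (S3, EventBridge) and downstream orchestration such as Step Functions.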