Thiruvananthapuram, Kerala, India
Information Technology
Full-Time
Minutes to Seconds
Overview
Job Description
MinutestoSeconds is a dynamic organization specializing in outsourcing services, digital marketing, IT recruitment, and custom IT projects. We partner with SMEs, mid-sized companies, and niche professionals to deliver tailored solutions.
We would love the opportunity to work with YOU!!
Requirements
Role: Snowflake Data Engineer
Experience: 8-10 years
Ingestion Methods/Patterns
- Fivetran
- Snowflake Snowpipe (file-based sources; see the sketch after this list)
- Snowflake Secure Data Share
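To illustrate the Snowpipe ingestion pattern named above, here is a minimal sketch using the snowflake-connector-python package; all object names and credentials are hypothetical placeholders, not details from this posting.

```python
# Minimal Snowpipe sketch (hypothetical object names throughout): a pipe
# wraps a COPY INTO statement, and AUTO_INGEST = TRUE tells Snowflake to
# load new files as they land on the external stage (e.g. via S3 event
# notifications delivered through SQS).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="RAW_DB",
    schema="PUBLIC",
)

conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events
      FROM @events_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")
conn.close()
```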
- Solid software development (full SDLC) experience with excellent coding skills:
  - Python (required)
  - Good knowledge of Git and GitHub (required)
  - Good code-management experience/best practices (required)
- Understanding of CI/CD to automate and improve the efficiency, speed, and reliability of software delivery:
  - Best practices/principles
  - GitHub Actions
    - Automating workflows directly from GitHub repositories
    - Automation of building, testing, and deploying code, including code linting, security scanning, and version management
  - Experience with testing frameworks (see the pytest sketch after this list)
  - Good knowledge of IaC (Infrastructure as Code) using Terraform (required); everything we do is IaC
- Strong verbal and written skills are a must, ideally with the ability to communicate in both technical and business language.
- A good level of experience with cloud technologies (AWS), namely S3, Lambda, SQS, SNS, API Gateway (API development), networking (VPCs), PrivateLink, and Secrets Manager.
- Extensive hands-on experience engineering data pipelines and a solid understanding of the full data supply chain, from discovery & analysis through data ingestion and processing & transformation to consumption/downstream data integration.
- A passion for continuous improvement and learning, and for optimization both in terms of cost and efficiency as well as ways of working; obsessed with data observability (aka data reconciliation), ensuring pipeline and data integrity.
- Experience working with large structured/semi-structured datasets
  - A good understanding of Parquet, Avro, JSON/XML
- Experience with Apache Airflow / MWAA or similar orchestration tooling (see the DAG sketch after this list)
- Experience with Snowflake as a Data Platform
  - Solid understanding of Snowflake architecture: compute, storage, partitioning, etc.
  - Key features such as COPY INTO, Snowpipe, object-level tagging, and masking policies
  - RBAC (security model): design and administration (intermediate skill required)
  - Query performance tuning and zero-copy clone (nice to have)
  - Virtual warehouse (compute) sizing
- T-SQL experience: ability to understand complex queries and think about optimisation (advantageous)
- Data modelling experience (advantageous)
- Exposure to dbt (data build tool) for data transformations (advantageous)
- Exposure to Alation or other Enterprise Metadata Management (EMM) tooling (advantageous)
- Documentation: architectural designs, operational procedures, and platform configurations to ensure smooth onboarding and troubleshooting for team members.
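For the orchestration requirement, a minimal Apache Airflow (2.x) DAG sketch; the dag_id, task names, and callables are hypothetical placeholders rather than anything specified in this posting.

```python
# Minimal sketch of a daily ingest-then-transform DAG in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    """Land source files on the stage (e.g. S3) for Snowpipe to pick up."""


def transform() -> None:
    """Run downstream transformations (e.g. MERGE statements or a dbt job)."""


with DAG(
    dag_id="daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```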
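And for the testing-frameworks/CI-CD bullets, a minimal pytest sketch of the kind a GitHub Actions workflow would run on every push; the transformation function under test is a hypothetical example, not part of the role description.

```python
# Minimal pytest sketch: unit-testing a small semi-structured-data transform.
def flatten_event(raw: dict) -> dict:
    """Flatten one semi-structured (JSON) event into tabular form."""
    return {
        "event_id": raw["id"],
        "user_id": raw["user"]["id"],
        "event_type": raw.get("type", "unknown"),
    }


def test_flatten_event_handles_missing_type():
    raw = {"id": 1, "user": {"id": 42}}
    assert flatten_event(raw) == {
        "event_id": 1,
        "user_id": 42,
        "event_type": "unknown",
    }
```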
Talk to us
Feel free to call, email, or hit us up on our social media accounts.
Email: info@antaltechjobs.in