Bangalore, Karnataka, India
Information Technology
Full-Time
algoleap
Overview
Good knowledge of databases such as Snowflake (preferably cloud-hosted), with strong programming experience in SQL.
Strong understanding of and practical working experience with data warehouses (incl. cloud-based DWs like Snowflake, Azure Synapse), ODS, and data marts.
Competence in data preparation and/or ETL tools like SnapLogic, Matillion, Azure Data Factory, AWS Glue, and SSIS (good exposure to one or more preferred) to build and maintain data pipelines and flows.
Good knowledge of databases, stored procedures, and optimization of large data volumes.
In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
Experience building the infrastructure required for data ingestion and analytics.
Ability to fine-tune report-generating queries.
Solid understanding of data normalization and denormalization, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
Understanding of index design and performance-tuning techniques
Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions
Experience in understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting.
Adherence to standards for all database work, e.g., data models, data architecture, and naming conventions.
Exposure to source control tools like Git and Azure DevOps.
Understanding of Agile methodologies (Scrum, Kanban)
Understanding of data modelling techniques and working knowledge of OLTP and OLAP systems.
Experience with NoSQL databases, including migrating data into other database types with real-time replication. (desirable)
Programming experience or exposure in Golang, Python, and shell scripting (bash/zsh, grep/sed/awk, etc.). (desirable)
Experience with CI/CD automation tools (desirable)