Hyderabad, Telangana, India
Information Technology
Full-Time
algoleap
Overview
The Snowflake Data Architect / Snowflake Solution Architect is responsible for leading the design and implementation of scalable, secure, and high-performance data platforms using the Snowflake Data Cloud. The ideal candidate will have strong expertise in enterprise data architecture, dimensional and relational data modelling, performance tuning, and data governance. This role will drive architectural decisions to support analytics, reporting, and real-time business applications, and will design, build, and maintain robust data systems and infrastructure for the Pacific Data Platform. The role involves developing data pipelines, optimizing database performance, and ensuring data integrity and security. The architect collaborates with other data engineers, application engineers, and analysts to support data-driven decision-making and leverages advanced technologies to enhance data processing capabilities.
Key responsibilities include understanding the data model, designing databases and ETL processes, and implementing best practices for data management. Strong programming skills, experience with SQL and NoSQL databases, and proficiency in cloud platforms are essential. He/she must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of automating data processes and optimizing, or even re-designing, the company’s data architecture to support the next generation of products and data initiatives.
Job Title: Snowflake Data Architect / Snowflake Solution Architect
Job Location: Hyderabad
Start Date: As soon as possible
Key Responsibilities
- Design and implement scalable data architectures to support data storage, processing, and analytics.
- Design and implement data schemas within Snowflake to effectively support analytics and reporting needs.
- Establish and enforce data access roles and policies.
- Develop strategies to make data AI-ready, including data cleansing, transformation, and enrichment processes.
- Provide guidance and support for analytical development and modelling to enhance data visualization and reporting capabilities.
- Conduct performance tuning and optimization of data models to improve query efficiency and response times.
- Develop, maintain, and optimize ETL (Extract, Transform, Load) processes for the Pacific Data Analytics Platform to ensure efficient data integration from various sources (both internal and external datasets).
- Manage and optimize database / data warehouse systems such as Snowflake, ensuring high availability and performance.
- Analyze and tune database performance, identifying bottlenecks and implementing improvements to enhance query performance.
- Ensure data integrity, consistency, and accuracy through rigorous data quality checks and validations.
- Work closely with data engineers, application engineers, analysts, and other stakeholders to understand data needs and provide appropriate solutions.
- Leverage cloud technologies (mainly AWS) for data storage, processing, and analytics, ensuring cost-effectiveness and scalability.
- Document data processes, architectures, and workflows while establishing best practices for data management and engineering.
- Set up monitoring solutions to track data pipelines and database performance, ensuring timely maintenance and fault resolution.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
- Implement data security measures and ensure compliance with relevant regulations regarding data protection and privacy.
- Provide guidance and mentorship to junior data engineers, fostering a culture of learning and continuous improvement.
Experience: 15+ years of experience, preferably as a Snowflake Solution Architect.
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field, with 10+ years of software development experience.
- Expert knowledge of databases such as Oracle, PostgreSQL, and SQL Server (preferably cloud-hosted), with strong programming experience in SQL.
- Competence in data preparation and/or ETL tools such as SnapLogic, Azure Data Factory, AWS Glue, or SSIS (strong working experience in one or more preferred) to build and maintain data pipelines and flows.
- Programming experience in Python and shell scripting (bash/zsh, grep/sed/awk, etc.).
- Deep knowledge of databases, stored procedures, and optimization of large data volumes.
- In-depth knowledge of ingestion techniques, data cleansing, de-duplication, and partitioning.
- Experience with building the infrastructure required for data ingestion and analytics
- Solid understanding of normalization and denormalization of data, database exception handling, transactions, profiling queries, performance counters, debugging, database & query optimization techniques
- Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions
- Experience in understanding source data from various platforms and mapping it into Entity-Relationship (ER) models for data integration and reporting
- Good understanding of data models, data architecture, and naming conventions
- Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
- Exposure to source control tools such as Git and Azure DevOps
- Understanding of Agile methodologies (Scrum, Kanban)
- Experience with NoSQL databases and migrating data into other database types with real-time replication is preferred.
- Experience with CI/CD automation tools
- Must have completed the SnowPro Advanced: Architect certification.
- Very good communication skills.
- Ability to fit easily into a distributed development team.
- Ability to manage timelines across multiple initiatives.