Pune, Maharashtra, India
Information Technology
Full-Time
UST
Overview
Role Description
Role Proficiency:
Leverage expertise in a technology area (e.g. Informatica transformations, Teradata data warehouse, Hadoop, Analytics). Responsible for the architecture of small/mid-size projects.
Outcomes
- Implement data extraction and transformation for a data warehouse (ETL, data extracts, data load logic, mappings, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools on any one of the cloud providers (AWS/Azure/GCP); a minimal load sketch follows this list
- Understand business workflows and related data flows. Develop designs for data acquisition, data transformation, or data modelling; apply business intelligence to the data or design data retrieval and dashboards
- Design information structure, work- and dataflow navigation. Define backup, recovery, and security specifications
- Enforce and maintain naming standards and a data dictionary for data models
- Provide estimates or guide the team in preparing them
- Help the team develop proofs of concept (POCs) and solutions relevant to customer problems. Troubleshoot issues encountered while developing POCs
- Obtain an Architect/Big Data specialty certification (AWS/Azure/GCP, a general course on Coursera or a similar learning platform, or any ML certification)
- Percentage of billable time spent in a year on developing and implementing data transformation or data storage
- Number of best practices documented for any new tool or technology emerging in the market
- Number of associates trained on the data service practice
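To make the data warehouse implementation outcome above concrete, here is a minimal sketch of loading an extracted file into a cloud data warehouse. Snowflake is assumed as the target purely for illustration, and the account settings, stage, file path, and table names are hypothetical placeholders.

```python
# Minimal sketch: load an extracted CSV into a cloud data warehouse (Snowflake assumed).
# All identifiers (account, warehouse, database, stage, table) are hypothetical examples,
# and the target table is assumed to already exist.
import os
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",        # hypothetical warehouse
    database="ANALYTICS_DB",    # hypothetical database
    schema="STAGING",           # hypothetical schema
)
try:
    cur = conn.cursor()
    # Stage the local extract file, then bulk-load it with COPY INTO.
    cur.execute("CREATE STAGE IF NOT EXISTS raw_extracts")
    cur.execute("PUT file:///tmp/orders_extract.csv @raw_extracts AUTO_COMPRESS=TRUE")
    cur.execute("""
        COPY INTO orders_raw
        FROM @raw_extracts/orders_extract.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
finally:
    conn.close()
```

The same stage-then-bulk-load pattern applies, with different tooling, on the other cloud warehouses.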
Strategy & Planning:
- Create or contribute to short-term tactical solutions that support long-term objectives and an overall data management roadmap
- Implement methods and procedures for tracking data quality, completeness, redundancy, and improvement
- Ensure that data strategies and architectures meet regulatory compliance requirements
- Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities, or attend conferences related to data in the cloud
- Help Architects establish governance, stewardship, and frameworks for managing data across the organization
- Provide support in implementing the appropriate tools, software, applications, and systems to support data technology goals
- Collaborate with project managers and business teams for all projects involving enterprise data
- Analyse data-related issues with systems integration, compatibility, and multi-platform integration
- Provide advice to teams facing complex technical issues in the course of project delivery
- Define and measure project- and program-specific architectural and technology quality metrics
- Publish and maintain a repository of solutions, best practices, standards, and other knowledge articles for data management
- Conduct and facilitate knowledge sharing and learning sessions across the team
- Gain industry-standard certifications in the technology or area of expertise
- Support technical skill building (including hiring and training) for the team based on inputs from project managers/RTEs
- Mentor new members in the team in technical areas
- Gain and cultivate domain expertise to provide the best and most optimized solutions to customers (delivery)
- Work with customer business owners and other teams to collect, analyze, and understand the requirements, including defining NFRs
- Analyze gaps/trade-offs based on the current system context and industry practices; clarify the requirements by working with the customer
- Define the systems and sub-systems that make up the program
- Set goals and manage performance of team engineers
- Provide career guidance to technical specialists and mentor them
- Identify alliance partners based on the understanding of service offerings and client requirements
- In collaboration with Architects, create a compelling business case around the offerings
- Conduct beta testing of the offerings and relevance to program
- In collaboration with Architects II and III, analyze the application and technology landscape, processes, and tools to arrive at the architecture options that best fit the client program
- Analyze the costs vs. benefits of solution options
- Support Architects II and III to create a technology/ architecture roadmap for the client
- Define the architecture strategy for the program
- Participate in internal and external forums (seminars, paper presentations, etc.)
- Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency
- Identify business opportunities to create reusable components/accelerators and reuse existing components and best practices
- Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies
- Monitor the concerns of internal stakeholders like Product Managers and RTEs and external stakeholders like client architects on architecture aspects. Follow through on commitments to achieve timely resolution of issues
- Conduct initiatives to meet client expectations
- Work to expand professional network in the client organization at team and program levels
- Identify potential opportunities for new service offerings based on customer voice/partner inputs
- Conduct beta testing / POC as applicable
- Develop collateral and guides for GTM (go-to-market)
- Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of Architects
- Use technology knowledge to create Proof of Concept (POC)/reusable assets under the guidance of the specialist. Apply best practices in your own area of work, helping with performance troubleshooting and other complex troubleshooting. Define, decide, and defend the technology choices made; review solutions under guidance
- Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST
- Use independent knowledge of Design Patterns, Tools, and Principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate option for the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by Specialists for efficiency (hardware consumption, memory usage, memory leaks, etc.)
- Use knowledge of Software Development Processes, Tools & Techniques to identify and assess incremental improvements to the software development process, methodology, and tools. Take technical responsibility for all stages of the software development process. Write optimal code with a clear understanding of memory leakage and its impact. Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas
- Use knowledge of Project Management & Agile Tools and Techniques to support, plan, and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies
- Use knowledge of Project Metrics to understand their relevance to the project. Collect and collate project metrics and share them with the relevant stakeholders
- Use knowledge of Estimation and Resource Planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place
Skills:
- Strong proficiency in understanding data workflows and dataflows
- Attention to detail
- High analytical capabilities
- Data visualization
- Data migration
- RDBMSs (relational database management systems) and SQL
- Hadoop technologies like MapReduce, Hive, and Pig
- Programming languages, especially Python and Java
- Operating systems like UNIX and MS Windows
- Backup/archival software
Snowflake Architect Key Responsibilities:
- Solution Design: Designing the overall data architecture within Snowflake, including database/schema structures, data flow patterns (ELT/ETL strategies involving Snowflake), and integration points with other systems (source systems, BI tools, data science platforms).
- Data Modeling: Designing efficient and scalable physical data models within Snowflake. Defining table structures, distribution/clustering keys, data types, and constraints to optimize storage and query performance (see the modeling-and-security sketch after this list).
- Security Architecture: Designing the overall security framework, including the RBAC strategy, data masking policies, encryption standards, and how Snowflake security integrates with broader enterprise security policies.
- Performance and Scalability Strategy: Designing solutions with performance and scalability in mind. Defining warehouse sizing strategies, query optimization patterns, and best practices for development teams. Ensuring the architecture can handle future growth in data volume and user concurrency (see the warehouse-and-pipeline sketch after this list).
- Cost Optimization Strategy: Designing architectures that are inherently cost-effective. Making strategic choices about data storage, warehouse usage patterns, and feature utilization (e.g., when to use materialized views, streams, tasks).
- Technology Evaluation and Selection: Evaluating and recommending specific Snowflake features (e.g., Snowpark, Streams, Tasks, External Functions, Snowpipe) and third-party tools (ETL/ELT, BI, governance) that best fit the requirements.
- Standards and Governance: Defining best practices, naming conventions, development guidelines, and governance policies for using Snowflake effectively and consistently across the organization.
- Roadmap and Strategy: Aligning the Snowflake data architecture with overall business intelligence and data strategy goals. Planning for future enhancements and platform evolution.
- Technical Leadership: Providing guidance and mentorship to developers, data engineers, and administrators working with Snowflake.
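To give a flavour of the data modeling and security architecture responsibilities above, the sketch below shows the kind of Snowflake DDL such decisions translate into: a clustered fact table, a masking policy for PII, and a small RBAC grant chain. It is an illustrative sketch only; every object name (database, schema, table, policy, role) is hypothetical, and a real design would be driven by the client's access patterns and compliance requirements.

```python
# Illustrative Snowflake DDL behind the modeling and security decisions above.
# Every identifier (database, schema, table, policy, role) is a hypothetical example.
MODELING_AND_SECURITY_DDL = [
    # Physical model: a fact table clustered on the columns that dominate
    # range filters, so micro-partition pruning stays effective as data grows.
    """
    CREATE TABLE IF NOT EXISTS analytics_db.analytics.fct_orders (
        order_id    NUMBER       NOT NULL,
        customer_id NUMBER       NOT NULL,
        order_date  DATE         NOT NULL,
        region      VARCHAR(16),
        amount      NUMBER(12,2),
        email       VARCHAR(320)
    )
    CLUSTER BY (order_date, region)
    """,
    # Column-level security: mask PII for all but explicitly privileged roles.
    """
    CREATE MASKING POLICY IF NOT EXISTS analytics_db.analytics.pii_email_mask
    AS (val STRING) RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() IN ('PII_READER', 'SYSADMIN') THEN val
            ELSE '***MASKED***'
        END
    """,
    """
    ALTER TABLE analytics_db.analytics.fct_orders
        MODIFY COLUMN email SET MASKING POLICY analytics_db.analytics.pii_email_mask
    """,
    # RBAC: grant access to a functional role, then roll it up to SYSADMIN.
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE analytics_db TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA analytics_db.analytics TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.analytics TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO ROLE SYSADMIN",
]

def apply_ddl(cursor, statements=MODELING_AND_SECURITY_DDL):
    """Run the illustrative statements using an existing snowflake.connector cursor."""
    for stmt in statements:
        cursor.execute(stmt)
```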
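Likewise, the performance, cost, and feature-selection responsibilities often reduce to concrete platform choices such as a right-sized warehouse with aggressive auto-suspend and an incremental pipeline built from a stream and a task. The sketch below is one hedged example of those choices; the warehouse, stream, and task names, the schedule, and the table references are hypothetical.

```python
# Illustrative warehouse/cost settings plus an incremental pipeline built from a
# stream and a task. All names, schedules, and tables are hypothetical examples;
# statements can be run with a snowflake.connector cursor, e.g.
#     for stmt in COST_AND_PIPELINE_DDL: cursor.execute(stmt)
COST_AND_PIPELINE_DDL = [
    # Right-sized warehouse: start small, suspend quickly when idle,
    # resume on demand, and cap runaway statements.
    """
    CREATE WAREHOUSE IF NOT EXISTS transform_wh
        WAREHOUSE_SIZE = 'XSMALL'
        AUTO_SUSPEND = 60
        AUTO_RESUME = TRUE
        INITIALLY_SUSPENDED = TRUE
        STATEMENT_TIMEOUT_IN_SECONDS = 3600
    """,
    # Change capture: the stream records new rows landing in the raw table.
    "CREATE STREAM IF NOT EXISTS analytics_db.staging.orders_stream ON TABLE analytics_db.staging.orders_raw",
    # Scheduled task that only runs when the stream actually has data,
    # so an idle pipeline consumes no warehouse credits.
    """
    CREATE TASK IF NOT EXISTS analytics_db.staging.merge_orders_task
        WAREHOUSE = transform_wh
        SCHEDULE = '15 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('analytics_db.staging.orders_stream')
    AS
        INSERT INTO analytics_db.analytics.fct_orders
            (order_id, customer_id, order_date, region, amount, email)
        SELECT order_id, customer_id, order_date, region, amount, email
        FROM analytics_db.staging.orders_stream
        WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume only after review.
    "ALTER TASK analytics_db.staging.merge_orders_task RESUME",
]
```

In practice the warehouse size, schedule, and error handling would be tuned against actual workload and cost data rather than fixed up front.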
Key Skills:
- Deep understanding of Snowflake's advanced features and architecture.
- Strong data warehousing concepts and data modeling expertise.
- Solution architecture and system design skills.
- Experience with cloud platforms (AWS, Azure, GCP) and how Snowflake integrates.
- Expertise in performance tuning principles and techniques at an architectural level.
- Strong understanding of data security principles and implementation patterns.
- Knowledge of various data integration patterns (ETL, ELT, Streaming).
- Excellent communication and presentation skills to articulate designs to technical and non-technical audiences.
- Strategic thinking and planning abilities.
We are looking for candidates with 12+ years of experience to join our team.
Snowflake, Data modeling, Cloud platforms, Solution architecture