Overview
Role Summary:
We are seeking an experienced Data Architect to design and implement our enterprise data architecture, with a focus on data warehouse, data lake, and data mesh patterns. This position offers an opportunity to shape the data foundation of our organization, working with cutting-edge technologies and solving complex data challenges in a collaborative environment.
Key Responsibilities:
• Design, develop, and maintain scalable enterprise data architecture incorporating data warehouse, data lake, and data mesh concepts
• Create and maintain data models, schemas, and mappings that support reporting, business intelligence, analytics, and AI/ML initiatives
• Establish data integration patterns for batch and real-time processing using AWS services (Glue, DMS, Lambda) and platforms such as Redshift, Snowflake, or Databricks
• Define technical specifications for data storage, data processing, and data access patterns
• Develop data models and enforce data architecture standards, policies, and best practices
• Partner with business stakeholders to translate requirements into architectural solutions
• Lead data modernization initiatives, including legacy system migrations
• Create roadmaps for evolving data architecture to support future business needs
• Provide expert guidance on complex data problems and architectural decisions
Required Qualifications:
Education & Experience
• Bachelor’s degree in Computer Science, Information Systems, or a related field; Master’s degree preferred
• 8+ years of experience in data architecture, database design, data modeling, or related roles
• 5+ years of experience with cloud data platforms, particularly AWS data services
• 3+ years of experience architecting MPP database solutions (Redshift, Snowflake, etc.)
• Expert knowledge of data warehouse architecture and dimensional modeling
• Strong understanding of AWS data services ecosystem (Redshift, S3, Glue, DMS, Lambda)
• Experience with SQL Server and migration to cloud data platforms
• Proficiency in data modeling, entity-relationship diagrams, and schema design
• Working knowledge of data integration patterns and technologies (ETL/ELT, CDC)
• Experience with one or more programming/scripting languages (Python, SQL, Shell)
• Familiarity with data lake architectures and technologies (Parquet, Delta Lake, Athena)
• Excellent verbal and written communication skills, with the ability to translate complex technical concepts for varied audiences
• Strong stakeholder management and influencing skills
• Experience implementing data warehouse, data lake, and data mesh architectures
• Knowledge of machine learning workflows and feature engineering is a plus
• Understanding of regulatory requirements related to data (FedRAMP, GDPR, CCPA, etc.)
• Experience with big data technologies (Spark, Hadoop)
About Aurigo:
Aurigo is an American technology company founded in 2003 with a mission to help public sector agencies and facility owners plan, deliver, and maintain their capital projects and assets safely and efficiently. With more than $300 billion of capital programs under management, Aurigo's award-winning software solutions are trusted by over 300 customers in transportation, water and utilities, healthcare, higher education, and government on over 40,000 projects across North America. We are a privately held corporation headquartered in Austin, Texas, USA, with software development and support centers in Canada and India. We are proud to be Great Place to Work Certified three times in a row and were recently recognized as one of the Top 25 AI Companies of 2024.