Overview
AuxoAI partners with Fortune 500-scale enterprises to design and deploy Agentic AI systems that drive measurable outcomes, from data modernization to deployed functional products. AuxoAI enables the creation of AI-first enterprises through productized solutions, AI-first engineering, and forward-deployed experts.
We are looking for a strong Data Engineer to join our growing team. The ideal candidate brings solid ETL fundamentals, hands-on pipeline experience, and cloud platform proficiency, ideally with GCP/BigQuery expertise.
Responsibilities:
• Design, build, and maintain scalable data pipelines and ETL/ELT workflows
• Work with Dataform or dbt to implement transformation logic and data models
• Develop and optimize data solutions on GCP (BigQuery, GCS) or AWS/Azure
• Support data migration initiatives and data mesh architecture patterns
• Collaborate with analysts, scientists, and business stakeholders to deliver reliable data products
• Apply data governance and quality best practices across the data lifecycle
• Troubleshoot pipeline issues and drive proactive monitoring and resolution
Requirements:
• 2–5 years of hands-on Data Engineering experience
• Strong ETL/ELT fundamentals — pipeline design, transformation logic, end-to-end ownership
• Proficiency with Dataform or dbt (preferred); strong SQL is a must
• Experience with BigQuery (preferred) or equivalent cloud data warehouse (Redshift, Snowflake, Synapse)
• Cloud platform experience: GCP (preferred), AWS, or Azure — including object storage (GCS, S3, ADLS)
• Exposure to data migration projects and/or data mesh principles
• Strong programming skills in Python; Spark/PySpark is a plus
• Bachelor's or Master's degree in Computer Science, Engineering, or related field