Job Summary
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.
WHAT YOU'LL DO:
- Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
- Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources.
- Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
- Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency.
- Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
- Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
- Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
- Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
- Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.
WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
- Strong proficiency in SQL for data modeling, transformation, and performance optimization.
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio).
- Expertise in Python for data processing, automation, and pipeline development.
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP).
- Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
- Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam.
- Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
- Understanding of data privacy, security, and compliance best practices.
- Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
- Excellent communication and collaboration skills.
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
- Familiarity with machine learning workflows and MLOps best practices.
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
- Familiarity with data integrations involving Contentful, Algolia, Segment, and Talon.One.
About the Company
Hi there! We are Auriga IT.
We power businesses across the globe through digital experiences, data, and insights. From the apps we design to the platforms we engineer, we're driven by an ambition to create world-class digital solutions and make an impact. Our team has helped build solutions for the likes of Zomato, Yes Bank, Tata Motors, Amazon, Snapdeal, Ola, Practo, Vodafone, Meesho, Volkswagen, Droom, and many more.
We are a group of people who just couldn't leave our college life behind; Auriga was founded on a desire to keep working with friends and enjoying an extended college life.
Who Hasn't Dreamt of Working with Friends for a Lifetime?
Come Join In!