
Overview
With more than 1 billion reviews and opinions covering nearly 8 million businesses, travelers turn to our client to find deals on accommodations, book experiences, and reserve tables at great restaurants. As a travel guidance company available in 43 markets and 22 languages, it helps travelers discover great places nearby.
As a member of the Data Platform Enterprise Services Team, you will collaborate with engineering and business stakeholders to build, optimize, maintain, and secure the full data vertical: tracking instrumentation, information architecture, ETL pipelines, and tooling. This work provides key analytics insights for business-critical decisions at the highest levels of product, finance, sales, CRM, marketing, data science, and more. You will operate in a dynamic environment with a continuously modernizing tech stack, including highly scalable architecture, cloud-based infrastructure, and real-time responsiveness.
REQUIREMENTS:
BS/MS in Computer Science or a related field.
4+ years of experience in data engineering or software development.
Proven experience in data design and modeling with large datasets (star/snowflake schemas, slowly changing dimensions, etc.).
Strong SQL skills and ability to query large datasets.
Experience with modern cloud data warehouses: Snowflake, BigQuery, etc.
ETL development experience, including SLAs, performance tuning, and monitoring.
Familiarity with BI tools and semantic layer principles (e.g., Looker, Tableau).
Understanding of CI/CD, testing, documentation practices.
Comfortable in a fast-paced, dynamic environment.
Ability to collaborate cross-functionally and communicate with technical/non-technical peers.
Strong data investigation and problem-solving abilities.
Comfortable with ambiguity and focused on clean, maintainable data architecture.
Detail-oriented with a strong sense of ownership.
NICE TO HAVE
Experience with data governance and data transformation tools.
Prior work with e-commerce platforms.
Experience with Airflow, Dagster, Monte Carlo, or Knowledge Graphs.
RESPONSIBILITIES:
Collaborate with stakeholders from multiple teams to collect business requirements and translate them into technical data model solutions.
Design, build, and maintain efficient, scalable, and reusable data models in cloud data warehouses (e.g., Snowflake, BigQuery).
Transform data from many sources into clean, curated, standardized, and trustworthy data products.
Build data pipelines and ETL processes handling terabytes of data.
Analyze data using SQL and dashboards; ensure models align with business needs.
Ensure data quality through testing, observability tools, and monitoring.
Troubleshoot complex data issues, validate assumptions, and trace anomalies.
Participate in code reviews and help improve data development standards.
WE OFFER
- US and EU projects based on advanced technologies.
- Competitive compensation based on skills and experience.
- Annual performance appraisals.
- Remote-friendly culture and no micromanagement.
- Personalized learning program tailored to your interests and skill development.
- Bonuses for writing articles, giving public talks, and other activities.
- 15 PTO days and 10 national holidays.
- Free webinars, meetups, and conferences organized by Svitla.
- Fun corporate celebrations and activities.
- Awesome team, friendly and supportive community!
If you are interested in this vacancy, please send your CV.
We will be happy to welcome you to our friendly team :)
LET'S MEET IN PERSON
Yuliia Mamitova
Why hesitate? Apply now