Overview
About the Company
At Bluecopa, we’re building the next-generation finance operations platform for high-growth companies. Our mission is to empower finance teams to automate, analyze, and act faster on their data — enabling smarter decisions and accelerating business outcomes. We’re a fast-paced, product-driven team solving real-world data problems at scale. If you love working with modern data stacks, building robust pipelines, and driving automation across complex systems, we’d love to have you on board.
About the Role
We are seeking a Data Engineer / Senior Data Engineer – Data Integration with 4–6 years of hands-on experience in building and maintaining scalable data pipelines, integrations, and transformations. You will work extensively with Python, dbt, modern data warehouses, and integration tools to connect multiple data sources — including SaaS platforms, databases, and file-based or object storage systems such as Google Cloud Storage (GCS), Amazon S3, Azure Blob Storage, SFTP, Google Drive, and OneDrive — ensuring the data foundation is reliable, consistent, and analytics-ready.
Responsibilities
- Design, develop, and maintain data integration pipelines connecting various SaaS platforms, databases, and file-based or object storage systems such as Google Cloud Storage (GCS), Amazon S3, Azure Blob Storage, SFTP, Google Drive, and OneDrive (an illustrative ingestion sketch follows this list).
- Build and manage ELT/ETL workflows using Python and dbt.
- Model, transform, and organize data in data warehouses such as BigQuery, Snowflake, or Databricks.
- Integrate and sync data using iPaaS or data movement tools such as Airbyte, Singer, CData, or Dell Boomi.
- Collaborate with Analytics, Platform, and Product teams to define and maintain data models, transformations, and data contracts.
- Optimize data workflows for scalability, performance, and cost-efficiency.
- Implement data quality checks, validation, and observability mechanisms.
- Follow best practices for version control, CI/CD, and orchestration of data jobs.
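To give candidates a concrete sense of this work, here is a minimal, illustrative sketch in Python (not Bluecopa's actual code): it stages a CSV export from an object store and applies a basic data-quality gate before the data is handed off to dbt. The bucket, key, and column names are hypothetical, and the example assumes boto3 and pandas are available.

```python
# Illustrative only: bucket, key, and column names below are hypothetical.
import io

import boto3
import pandas as pd


def stage_s3_csv(bucket: str, key: str) -> pd.DataFrame:
    """Download a CSV export from S3 and return it as a DataFrame."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=bucket, Key=key)
    frame = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Basic data-quality gate: fail fast on empty extracts or duplicate keys.
    if frame.empty:
        raise ValueError(f"No rows found in s3://{bucket}/{key}")
    if "invoice_id" in frame.columns and frame["invoice_id"].duplicated().any():
        raise ValueError("Duplicate invoice_id values in extract")
    return frame


if __name__ == "__main__":
    df = stage_s3_csv("example-finance-exports", "invoices/2024-01.csv")
    print(f"Staged {len(df)} rows for downstream dbt models")
```

In practice, tools such as Airbyte and dbt tests cover much of this; the sketch is only meant to show the shape of the work.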
Qualifications
- Experience: 4–8 years in data engineering or data integration roles.
- Programming: Strong hands-on experience with Python (mandatory).
- Transformation: Proven experience with dbt (Data Build Tool) for data modeling, transformation, and deployment (mandatory).
- Data Platforms: Hands-on with one or more modern data warehouses / data processing engines like BigQuery, Snowflake, or Databricks (mandatory).
- Integration Tools: Experience with one or more iPaaS / data integration tools such as Airbyte, Singer, CData, or Dell Boomi.
- Change Data Capture (CDC): Experience with Debezium or similar CDC tools for real-time data streaming and replication (a short consumer sketch follows this list).
- SQL: Excellent command over SQL and database performance tuning.
- Cloud: Familiarity with cloud platforms such as GCP, AWS, or Azure.
- Version Control: Experience using Git and collaborative development workflows.
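For the CDC item above, a minimal sketch of consuming Debezium change events in Python, assuming a Debezium connector is already publishing to Kafka and the kafka-python client is installed; the topic, server, and field names are hypothetical.

```python
# Illustrative only: assumes a Debezium Postgres connector publishing to Kafka
# and the kafka-python client; topic and server names are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "dbserver1.public.invoices",          # Debezium topic: <server>.<schema>.<table>
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m) if m else None,
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if event is None:                     # tombstone record emitted after deletes
        continue
    payload = event.get("payload", event)
    op = payload.get("op")                # "c" insert, "u" update, "d" delete, "r" snapshot
    row = payload.get("after") or payload.get("before")
    print(op, row)
```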
Preferred Skills
- Experience with API integrations and RESTful data ingestion.
- Knowledge of data orchestration tools like Airflow or Prefect (a minimal orchestration sketch follows this list).
- Exposure to data governance, cataloging, or metadata management tools.
- Prior experience in SaaS or high-growth startup environments.
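As an illustration of the orchestration point above, a minimal Airflow DAG sketch, assuming Airflow 2.x; the DAG, task, and dbt selector names are hypothetical.

```python
# Illustrative only: assumes Airflow 2.x; DAG, task, and command names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_sources():
    """Placeholder for pulling data from SaaS APIs or object storage."""
    print("extracting source data...")


with DAG(
    dag_id="finance_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_sources", python_callable=extract_sources)

    # Run dbt transformations once raw data has landed in the warehouse.
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select staging+")

    extract >> transform
```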
Pay range and compensation package
Not specified.
Equal Opportunity Statement
We are committed to diversity and inclusivity in our hiring practices.