Overview
Being Different Is Good
We’re bringing “outside of the box” thinking to technology solutions and services. And that takes people who are into that sort of thing. We’re always hiring talented folks who are looking for a place to call home. Reach out - let’s see if we’re a fit!
Along with providing a fun, collaborative work environment and diverse projects to work on, we continuously strive to take the best care of our team members and their overall well-being.
A few perks of joining us:
- Flexible work from home
- Comprehensive health insurance
- Remote work allowance
- Entertainment allowance
- Wellness assistance
- Learning and development assistance
- Weekend getaway
- Special occasion celebration
- Rewards and recognition programs
- Annual retreat
Location: Permanent WFH/Remote
Type: Full-time
About the Role:
We are looking for a Data Engineer who is passionate about building robust data pipelines and architectures that transform raw data into actionable insights. In this role, you'll design, build, and maintain the infrastructure necessary to support data analytics and machine learning efforts. If you enjoy working with large datasets, optimizing ETL processes, and ensuring data quality, you'll fit right in.
What You’ll Be Doing:
Designing and Building Data Pipelines
- Architect and implement data pipelines that efficiently process large volumes of data.
- Develop and maintain ETL processes to transform raw data into structured, meaningful information.
- Collaborate with engineers, analysts, and business stakeholders to understand data requirements and ensure reliable data flows.
Managing Data Infrastructure
- Work with various data storage solutions such as data lakes, data warehouses, and relational databases.
- Ensure data systems are scalable, reliable, and optimized for performance.
- Implement data governance practices to maintain data integrity and security.
Optimizing and Automating Workflows
- Monitor data pipelines to identify and resolve performance issues.
- Automate repetitive tasks and streamline data processing workflows.
- Continuously document and improve data processes to ensure transparency and efficiency.
Collaborating and Innovating
- Partner with cross-functional teams to understand business needs and deliver effective data solutions.
- Stay updated on industry trends and emerging technologies in data engineering.
- Contribute ideas to improve data quality, efficiency, and overall infrastructure.
What We’re Looking For:
Technical Expertise
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong command of SQL and experience with relational databases.
- Experience with big data technologies like Hadoop, Spark, or Kafka is a plus.
- Familiarity with cloud platforms (AWS, GCP, Azure) and their data services.
Problem-Solving and Analytical Skills
- Ability to design efficient data architectures and solve complex data challenges.
- A strong focus on data quality and performance optimization.
- Experience in developing robust, scalable data pipelines.
Communication and Collaboration
- Strong verbal and written communication skills.
- Ability to translate complex technical concepts into clear terms for non-technical stakeholders.
- Experience working in collaborative, cross-functional teams.
Learning Mindset
- A drive to continuously learn and adapt in a fast-paced environment.
- Curiosity about emerging data technologies and best practices.
- Willingness to share knowledge and help the team grow.
Experience Needed:
- 5+ years of experience in data engineering or a related field.
- Proven track record of building and managing data pipelines and infrastructure.
- Experience working in environments that handle large-scale data processing.