
Senior Data Engineer (Product Companies Only)

Anywhere in India/Multiple Locations
4 - 8 Yrs
Internet
Remote
Full-Time
Antal International

Overview

My client is an AI-powered, all-in-one, white-label sales & marketing platform that empowers agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth.

Role: Senior Data Engineer

Location: Remote

They are seeking a talented and motivated data engineer to join their team. This engineer will be responsible for designing, developing, and maintaining the company's data infrastructure, and for building backend systems and solutions that support real-time data processing, large-scale event-driven architectures, and integrations with various data systems.

This role involves collaborating with cross-functional teams to ensure data reliability, scalability, and performance. The candidate will work closely with data scientists, analysts and software engineers to ensure efficient data flow and storage, enabling data-driven decision-making across the organisation.

Requirements:

- 4+ years of experience in software development
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Strong Problem-Solving Skills: ability to debug and optimize data processing workflows
- Programming Fundamentals: solid understanding of data structures, algorithms, and software design patterns
- Software Engineering Experience: demonstrated experience (SDE II/III level) designing, developing, and delivering software solutions using modern languages and frameworks (Node.js, JavaScript, Python, TypeScript, SQL, Scala, or Java)
- ETL Tools & Frameworks: experience with Airflow, dbt, Apache Spark, Kafka, Flink, or similar technologies (see the sketch after this list)
- Cloud Platforms: hands-on experience with GCP (Pub/Sub, Dataflow, Cloud Storage) or AWS (S3, Glue, Redshift)
- Databases & Warehousing: strong experience with PostgreSQL, MySQL, Snowflake, and NoSQL databases (MongoDB, Firestore, Elasticsearch)
- Version Control & CI/CD: familiarity with Git, Jenkins, Docker, Kubernetes, and CI/CD pipelines for deployment
- Communication: excellent verbal and written communication skills, with the ability to work effectively in a collaborative environment
- Experience with data visualization tools (e.g., Superset, Tableau), Terraform, IaC, ML/AI data pipelines, and DevOps practices is a plus
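To give a feel for the ETL orchestration tooling listed above, here is a minimal sketch of a daily Airflow 2.x DAG. The DAG ID, task names, and the extracted data are hypothetical placeholders for illustration only, not details of the client's actual pipelines.

```python
# Minimal sketch of a daily ETL DAG in Apache Airflow 2.x.
# DAG ID, task names, and the extracted rows are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    """Placeholder extract step; a real task would pull from an API or source database."""
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(ti):
    """Placeholder load step; a real task would write to a warehouse such as Snowflake."""
    rows = ti.xcom_pull(task_ids="extract_orders")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="orders_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load                   # run extract before load
```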

Responsibilities:

- Software Engineering Excellence: write clean, efficient, and maintainable code using JavaScript or Python while adhering to best practices and design patterns
- Design, Build, and Maintain Systems: develop robust software solutions and implement RESTful APIs that handle high volumes of data in real time, leveraging message queues (Google Cloud Pub/Sub, Kafka, RabbitMQ) and event-driven architectures (see the sketch at the end of this section)
- Data Pipeline Development: design, develop, and maintain data pipelines (ETL/ELT) to process structured and unstructured data from various sources
- Data Storage & Warehousing: build and optimize databases, data lakes, and data warehouses (e.g., Snowflake) for high-performance querying
- Data Integration: work with APIs and with batch and streaming data sources to ingest and transform data
- Performance Optimization: optimize queries, indexing, and partitioning for efficient data retrieval
- Collaboration: work with data analysts, data scientists, software developers, and product teams to understand requirements and deliver scalable solutions
- Monitoring & Debugging: set up logging, monitoring, and alerting to ensure data pipelines run reliably
- Ownership & Problem-Solving: proactively identify issues or bottlenecks and propose innovative solutions to address them
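To illustrate the event-driven, real-time ingestion described above, here is a minimal sketch of a Google Cloud Pub/Sub consumer. The project ID, subscription ID, and event fields are hypothetical placeholders, not details from the posting; a production worker would run indefinitely and write events to a warehouse or data lake rather than printing them.

```python
# Minimal sketch of an event-driven consumer using Google Cloud Pub/Sub.
# Project ID, subscription ID, and event fields are hypothetical placeholders.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"        # hypothetical
SUBSCRIPTION_ID = "user-events-sub"   # hypothetical

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one event and acknowledge it after processing."""
    event = json.loads(message.data.decode("utf-8"))
    # In a real pipeline the event would be validated and written to a
    # warehouse table or data-lake partition here.
    print("received event:", event.get("type"))
    message.ack()


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        # Block for a short window in this sketch; a real worker runs indefinitely.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # wait for the shutdown to complete
```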

