Bangalore, Karnataka, India
Information Technology
Full-Time
Encardio Rite
Overview
About The Role
As a Data Engineer in the Edge of Technology Center, you will play a critical role in designing and implementing scalable data infrastructure to power advanced analytics, AI/ML, and business intelligence. This position demands a hands-on technologist who can architect reliable pipelines, manage real-time event streams, and ensure smooth data operations across cloud-native environments. You will work closely with cross-functional teams to enable data-driven decision-making and innovation across the organization.
Key Responsibilities
- Design, implement, and maintain robust ETL/ELT pipelines using tools like Argo Workflows or Apache Airflow (see the sketch after this list).
- Manage and execute database schema changes with Alembic or Liquibase, ensuring data consistency.
- Configure and optimize distributed query engines like Trino and AWS Athena for analytics.
- Deploy and manage containerized workloads on AWS EKS or GCP GKE using Docker, Helmfile, and Argo CD.
- Build data lakes/warehouses on AWS S3 and implement performant storage using Apache Iceberg.
- Use Terraform and other IaC tools to automate cloud infrastructure provisioning securely.
- Develop CI/CD pipelines with GitHub Actions to support rapid and reliable deployments.
- Architect and maintain Kafka-based real-time event-driven systems using Apicurio and AVRO.
- Collaborate with product, analytics, and engineering teams to define and deliver data solutions.
- Monitor and troubleshoot data systems for performance and reliability issues using observability tools (e.g., Prometheus, Grafana).
- Document data flows and maintain technical documentation to support scalability and knowledge sharing.
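To make the orchestration work concrete, here is a minimal sketch of a daily extract-and-load DAG using Apache Airflow's TaskFlow API. The DAG name, schedule, and the extract/load bodies are hypothetical placeholders rather than anything specified by the role; an equivalent flow could be declared as an Argo Workflows template instead.

# A hedged sketch, not production code: assumes Airflow 2.x with the TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_etl():
    @task
    def extract() -> list[dict]:
        # Hypothetical source read; a real pipeline would query an API or database.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def load(rows: list[dict]) -> None:
        # Hypothetical sink; a real pipeline would write to S3/Iceberg or Postgres.
        print(f"loaded {len(rows)} rows")

    load(extract())


daily_orders_etl()

Keeping each task small and passing data explicitly between them, as above, is what makes pipelines like this easy to retry, backfill, and monitor.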
Expected Outcomes
- Fully operational ETL/ELT pipelines supporting high-volume, low-latency data processing.
- Zero-downtime schema migrations with consistent performance across environments (illustrated in the sketch after this list).
- Distributed query engines tuned for large-scale analytics with minimal response time.
- Reliable containerized deployments in Kubernetes using GitOps methodologies.
- Kafka-based real-time data ingestion pipelines with consistent schema validation.
- Infrastructure deployed and maintained as code using Terraform and version control.
- Automated CI/CD processes ensuring fast, high-quality code releases.
- Cross-functional project delivery aligned with business requirements.
- Well-maintained monitoring dashboards and alerting for proactive issue resolution.
- Internal documentation and runbooks for operational continuity and scalability.
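As one illustration of the zero-downtime migration outcome above, the following is a minimal Alembic migration sketch. The customers table, email column, and revision ID are hypothetical; it shows the usual additive-first pattern (add a nullable column now, backfill and tighten constraints in later revisions) that keeps migrations safe while the previous application version is still running.

"""Hypothetical Alembic migration: add a nullable email column to customers."""
import sqlalchemy as sa
from alembic import op

revision = "a1b2c3d4e5f6"  # placeholder revision ID
down_revision = None


def upgrade() -> None:
    # Additive and nullable, so it can run without breaking the live application;
    # backfill and NOT NULL enforcement would land in later revisions.
    op.add_column("customers", sa.Column("email", sa.String(length=255), nullable=True))


def downgrade() -> None:
    op.drop_column("customers", "email")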
Qualifications
Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field from a recognized institution.
Technical Skills
- Orchestration Tools: Argo Workflows, Apache Airflow
- Database Migration: Alembic, Liquibase
- SQL Engines: Trino, AWS Athena
- Containers & Orchestration: Docker, AWS EKS, GCP GKE
- Data Storage: AWS S3, Apache Iceberg
- Relational Databases: Postgres, MySQL, Aurora
- Infrastructure Automation: Terraform (or equivalent IaC tools)
- CI/CD: GitHub Actions or similar
- GitOps Tools: Argo CD, Helmfile
- Event Streaming: Kafka, Apicurio, AVRO (see the sketch after this list)
- Languages: Python, Bash
- Monitoring: Prometheus, Grafana (preferred)
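To illustrate the event-streaming stack, here is a minimal sketch of an AVRO-serialized Kafka producer using the confluent-kafka Python client. The broker and registry URLs, the topic, and the schema are all assumptions for illustration, as is the use of Apicurio's Confluent-compatible (ccompat) REST endpoint as the schema registry.

# A hedged sketch: broker/registry URLs, topic, and schema are assumptions.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

SCHEMA = """
{
  "type": "record",
  "name": "SensorReading",
  "fields": [
    {"name": "sensor_id", "type": "string"},
    {"name": "value", "type": "double"}
  ]
}
"""

# Apicurio is assumed to expose its Confluent-compatible endpoint at this path.
registry = SchemaRegistryClient({"url": "http://localhost:8080/apis/ccompat/v7"})
serializer = AvroSerializer(registry, SCHEMA)
producer = Producer({"bootstrap.servers": "localhost:9092"})

payload = serializer(
    {"sensor_id": "s-001", "value": 23.5},
    SerializationContext("sensor-readings", MessageField.VALUE),
)
producer.produce("sensor-readings", value=payload)
producer.flush()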
Soft Skills
- Strong analytical and problem-solving capabilities in complex technical environments.
- Excellent written and verbal communication skills to interact with both technical and non-technical stakeholders.
- Self-motivated, detail-oriented, and proactive in identifying improvement opportunities.
- Team player with a collaborative approach and eagerness to mentor junior team members.
- High adaptability to new technologies and dynamic business needs.
- Effective project management and time prioritization.
- Strong documentation skills for maintaining system clarity.
- Ability to translate business problems into data solutions efficiently.
What We Offer
- Competitive salary and benefits package in a globally operating company.
- Opportunities for professional growth and involvement in diverse projects.
- Dynamic and collaborative work environment.
Encardio offers a thriving environment where innovation and collaboration are essential. You'll be part of a diverse team shaping the future of infrastructure globally. Your work will directly contribute to some of the world's most ambitious and ground-breaking engineering projects.
Encardio is an equal-opportunity employer committed to diversity and inclusion.
How To Apply
Please submit your CV and a cover letter outlining your suitability for the role to humanresources@encardio.com.