Interesting Job Opportunity: GoodWorkLabs - Senior DevOps Engineer - Cloud Infrastructure
Bangalore, Karnataka, India
Information Technology
Full-Time
GoodWorkLabs
Overview
About The Role
We are looking for a Software Development Engineer III specializing in Infrastructure and Data Engineering. You will design, build, and optimize scalable infrastructure and data pipelines that enable our product, analytics, and machine learning teams to deliver reliable insights and services. As a senior IC, you will work on cloud infrastructure, data modeling, and distributed systems while collaborating with product, backend, and analytics stakeholders.
What You Will Do
- Design, build, and maintain scalable cloud infrastructure (AWS/GCP/Azure).
- Implement Infrastructure-as-Code (IaC) using Terraform, CloudFormation, or similar tools.
- Ensure reliability, observability, and security across environments.
- Automate deployment, monitoring, and incident response processes.
- Build robust ETL/ELT pipelines to process structured and unstructured data.
- Design and optimize data models for analytics and product use cases.
- Ensure data integrity, availability, and performance across systems.
- Implement batch and streaming data pipelines (e.g., Kafka, Spark, Flink).
- Partner with product, analytics, and engineering teams to define data needs and infrastructure requirements.
- Contribute to cross-functional system design discussions and architectural reviews.
- Ensure delivery of reliable infrastructure and data solutions that meet SLAs.
- Implement monitoring, logging, and alerting for infrastructure and pipelines.
- Write automated tests for infrastructure and data transformations.
- Drive adoption of best practices for cost optimization and scalability.
Requirements
- 4-6 years of experience in infrastructure and/or data engineering roles.
- Strong expertise in cloud platforms (AWS, GCP, or Azure).
- Proficiency in SQL and experience with relational/NoSQL databases.
- Experience with data pipeline frameworks (Airflow, DBT, Spark, Flink).
- Proficiency in at least one programming language (Python, Go, or Java).
- Familiarity with CI/CD pipelines and containerization (Docker, Kubernetes).
- Experience with real-time data streaming (Kafka, Kinesis, Pub/Sub).
- Exposure to data warehousing solutions (Snowflake, BigQuery, Redshift).
- Background in security, compliance, or cost optimization of infrastructure.
- Experience in fintech or transaction-heavy domains.
(ref:hirist.tech)