Senior Software Engineer | Antal Tech Jobs

Senior Software Engineer

Aurangabad, Maharashtra, India
Information Technology
Full-Time
Circana India

Overview

Senior Software Engineer

Let’s be unstoppable together!

At Circana, we are fueled by our passion for continuous learning and growth, we seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We have a global commitment to diversity, equity, and inclusion as we believe in the undeniable strength that diversity brings to our business, employees, clients, and communities (with us you can always bring your full self to work). Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Learn more at www.circana.com.

We are seeking a skilled and motivated Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on on-premises and cloud platforms. You will leverage your expertise in PySpark, Spark, Python, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply!


Key Responsibilities:

Data Engineering & Data Pipeline Development

  • Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow.
  • Implement real-time and batch data processing using Spark.
  • Enforce best practices for data quality, governance, and security throughout the data lifecycle.
  • Ensure data availability, reliability, and performance through monitoring and automation.
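To illustrate the pipeline-development responsibilities above, here is a minimal, library-free sketch of a batch pipeline with a data-quality gate. The function names (extract, transform, load) and the sample records are illustrative only; in this role, the equivalent logic would typically be written as PySpark transformations and orchestrated by Airflow.

```python
# Minimal sketch of an extract -> transform -> load batch job with a
# data-quality rule. Pure Python; a stand-in for PySpark transformations.

def extract():
    # Stand-in for reading from a source system
    return [
        {"user": "a", "amount": 10.0},
        {"user": "b", "amount": None},   # bad record, fails the quality rule
        {"user": "a", "amount": 5.5},
    ]

def transform(rows):
    # Data-quality rule: drop records with a missing amount, then
    # aggregate amounts per user
    clean = [r for r in rows if r["amount"] is not None]
    totals = {}
    for r in clean:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

def load(totals):
    # Stand-in for writing to a warehouse table; returns a sorted view
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('a', 15.5)] -- the record with a missing amount is dropped
```

In a real deployment, each stage would be a separate, independently monitorable task so that failures surface at the stage that caused them.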

Cloud Data Engineering:

  • Manage cloud infrastructure and cost optimization for data processing workloads.
  • Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments.

Big Data & Analytics:

  • Build and optimize large-scale data processing pipelines using Apache Spark and PySpark.
  • Implement data partitioning, caching, and performance tuning for Spark-based workloads.
  • Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
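The partitioning work mentioned above rests on one core idea: records with the same key are routed to the same partition, so per-key aggregation never crosses partition boundaries. Below is a library-free sketch of that idea, using a deterministic CRC32 hash as a stand-in for Spark's hash partitioner; the record values and partition count are illustrative.

```python
# Library-free sketch of hash partitioning, the idea behind Spark's
# repartition()/partitionBy(): hashing the key modulo the partition count
# guarantees all records sharing a key land in the same partition.
import zlib

NUM_PARTITIONS = 4

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    # Deterministic hash (unlike Python's salted hash()), so placement is
    # stable across runs, as it must be for distributed shuffles
    return zlib.crc32(key.encode()) % num_partitions

def partition(records, num_partitions=NUM_PARTITIONS):
    # Route each (key, value) record to its partition
    parts = [[] for _ in range(num_partitions)]
    for key, value in records:
        parts[partition_for(key, num_partitions)].append((key, value))
    return parts

records = [("a", 1), ("b", 2), ("a", 3), ("c", 4)]
parts = partition(records)
# Both ("a", 1) and ("a", 3) are now co-located in a single partition,
# so a per-key sum needs no cross-partition communication.
```

Tuning work in practice is largely about choosing the partition count and key so partitions stay balanced; a skewed key sends most records to one partition and serializes the job.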

Workflow Orchestration (Airflow):

  • Design and maintain DAGs (Directed Acyclic Graphs) in Airflow to automate complex data workflows.
  • Monitor, troubleshoot, and optimize job execution and dependencies.
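Behind the DAG work described above is a simple invariant: a task runs only after all of its upstream tasks have succeeded, i.e. the scheduler executes tasks in a topological order and rejects cycles. The sketch below shows that dependency-resolution idea in plain Python (the task names are illustrative, not from any real pipeline); Airflow's actual scheduler adds scheduling intervals, retries, and state tracking on top.

```python
# Library-free sketch of DAG dependency resolution: compute an order in
# which each task appears only after everything it depends on, and fail
# loudly if the graph contains a cycle (i.e. is not a DAG).
from collections import deque

def topo_order(deps):
    # deps maps each task to the set of upstream tasks it waits on
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for task, upstreams in deps.items():
        for u in upstreams:
            downstream[u].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for d in sorted(downstream[task]):
            indegree[d] -= 1
            if indegree[d] == 0:    # all upstream tasks finished
                ready.append(d)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}
print(topo_order(pipeline))  # ['extract', 'transform', 'quality_check', 'load']
```

The cycle check matters operationally: a dependency cycle would otherwise leave tasks waiting on each other forever, which is exactly the class of stuck-workflow issue the monitoring responsibility above is meant to catch.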

Required Skills & Experience:

  • 4+ years of experience in data engineering, with expertise in PySpark and Spark.
  • Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code.
  • Deep understanding of Spark internals (RDDs, DataFrames, DAG execution, partitioning, etc.).
  • Experience with Airflow DAGs, scheduling, and dependency management.
  • Knowledge of Git, Docker, and Kubernetes, and the ability to apply DevOps best practices to CI/CD workflows.
  • Experience with cloud platforms such as Azure or AWS is preferred.
  • Excellent problem-solving skills and the ability to optimize large-scale data processing.
  • Experience in Agile/Scrum environments.
  • Strong communication and collaboration skills, with the ability to work effectively with remote teams.

Bonus Points:

  • Experience with data modeling and data warehousing concepts
  • Familiarity with data visualization tools and techniques
  • Knowledge of machine learning algorithms and frameworks

Circana Behaviors

As well as the technical skills, experience and attributes that are required for the role, our shared behaviors sit at the core of our organization. Therefore, we always look for people who can continuously champion these behaviors throughout the business within their day-to-day role:

  • Stay Curious: Being hungry to learn and grow, always asking the big questions
  • Seek Clarity: Embracing complexity to create clarity and inspire action
  • Own the Outcome: Being accountable for decisions and taking ownership of our choices
  • Center on the Client: Relentlessly adding value for our customers
  • Be a Challenger: Never complacent, always striving for continuous improvement
  • Champion Inclusivity: Fostering trust in relationships, engaging with empathy, respect, and integrity
  • Commit to each other: Contributing to making Circana a great place to work for everyone

Location

This position can be located in the following area(s): Bangalore

Prospective candidates may be asked to consent to background checks (in accordance with local legislation and our candidate privacy notice). Your current employer will not be contacted without your permission.
