Data Engineer | Antal Tech Jobs

Data Engineer

Bangalore, Karnataka, India
Information Technology
Other
Yash Technologies

Overview

Date: Mar 25, 2025
Job Requisition Id: 60689
Location: Bangalore, KA, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.


At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change to an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.


We are looking to hire Python professionals in the following areas:


Job Description:



Job Title: Data Engineer / DevOps - Enterprise Big Data Platform


Right-to-Hire requirement
In this role, you will be part of a growing, global team of data engineers who collaborate in DevOps mode to enable the business with state-of-the-art technology, leveraging data as an asset to make better-informed decisions.

The Enabling Functions Data Office Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on the Enabling Functions' data management and analytics platform (Palantir Foundry, AWS, and other components).

The Foundry platform comprises multiple technology stacks, hosted either on Amazon Web Services (AWS) infrastructure or in the company's own data centers. Developing pipelines and applications on Foundry requires:

  • Proficiency in SQL / Scala / Python (Python required; all three are not necessary)
  • Proficiency in PySpark for distributed computation
  • Familiarity with Ontology and Slate
  • Familiarity with the Workshop app, with basic design/visual competency
  • Familiarity with common databases (e.g. Oracle, MySQL, Microsoft SQL Server); not all types are required
  • This position is project-based and may span multiple smaller projects or a single large project, using an agile project methodology.
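As a rough illustration of the kind of pipeline work this involves, here is a minimal read–transform–store step sketched in plain Python (the record shape, field names, and values are invented for the example; a real Foundry pipeline would express this as a PySpark transform):

```python
import json

# Hypothetical batch step: keep completed order records and
# aggregate revenue per region before storing the result.
def transform(records):
    totals = {}
    for rec in records:
        if rec.get("status") != "completed":
            continue  # drop records that are not in a final state
        region = rec.get("region", "unknown")
        totals[region] = totals.get(region, 0.0) + rec["amount"]
    return totals

raw = [
    {"region": "EU", "status": "completed", "amount": 120.0},
    {"region": "EU", "status": "cancelled", "amount": 80.0},
    {"region": "US", "status": "completed", "amount": 50.0},
]

result = transform(raw)
print(json.dumps(result, sort_keys=True))
```

The same filter-then-aggregate shape maps directly onto a PySpark `filter`/`groupBy`/`sum` chain when the data no longer fits on one machine.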


Roles & Responsibilities:

  • B.Tech/B.Sc./M.Sc. in Computer Science or a related field, with 6+ years of overall industry experience
  • Strong experience in Big Data & Data Analytics
  • Experience in building robust ETL pipelines for batch as well as streaming ingestion.
  • Big Data engineers with a firm grounding in Object Oriented Programming and an advanced level knowledge with commercial experience in Python, PySpark and SQL
  • Experience interacting with RESTful APIs, including authentication via SAML and OAuth2
  • Experience with test driven development and CI/CD workflows
  • Knowledge of Git for source control management
  • Experience working in Agile/Scrum environments with tools like Jira
  • Experience in visualization tools like Tableau or Qlik is a plus
  • Experience in Palantir Foundry, AWS or Snowflake is an advantage
  • Basic knowledge of statistics and machine learning is desirable
  • Problem-solving abilities
  • Proficient in English with strong written and verbal communication skills
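To make the REST/OAuth2 requirement concrete, here is a small sketch of building an OAuth2 client-credentials token request with the standard library. The endpoint and credentials are placeholders, and the request is only constructed, not sent:

```python
import urllib.parse
import urllib.request

# Placeholder endpoint and credentials -- not a real auth server.
TOKEN_URL = "https://auth.example.com/oauth2/token"
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"

def build_token_request():
    """Build (but do not send) an OAuth2 client-credentials token request."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode("ascii")
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_token_request()
print(req.method, req.full_url)
```

The access token returned by a real server would then go into an `Authorization: Bearer <token>` header on subsequent API calls.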

Primary Responsibilities

  • Design, develop, test, and support data pipelines and applications
  • Industrialize data pipelines
  • Establish a continuous quality improvement process to systematically optimize data quality
  • Collaborate with various stakeholders, including business and IT

Education

  • Bachelor's (or higher) degree in Computer Science, Engineering, Mathematics, Physical Sciences, or a related field


Professional Experience

  • 6+ years of experience in system engineering or software development
  • 3+ years of engineering experience, including ETL-type work with databases and Hadoop platforms


Skills:

  • Hadoop General: Deep knowledge of distributed file system concepts, map-reduce principles, and distributed computing. Knowledge of Spark and the differences between Spark and MapReduce. Familiarity with encryption and security in a Hadoop cluster.
  • Data management / data structures.
  • Must be proficient in technical data management tasks, i.e. writing code to read, transform and store data
  • XML/JSON knowledge
  • Experience working with REST APIs
  • Spark: Experience launching Spark jobs in client mode and cluster mode. Familiarity with the property settings of Spark jobs and their implications for performance.
  • Application Development: Familiarity with HTML, CSS, and JavaScript, and basic design/visual competency.
  • SCC/Git: Must be experienced in the use of source code control systems such as Git.
  • ETL: Experience developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, MySQL, etc.
  • Authorization: Basic understanding of user authorization (Apache Ranger preferred).
  • Programming: Must be able to code in Python, or be expert in at least one high-level language such as Java, C, or Scala. Must have experience using REST APIs.
  • SQL: Must be an expert in manipulating database data using SQL. Familiarity with views, functions, stored procedures, and exception handling.
  • AWS: General knowledge of the AWS stack (EC2, S3, EBS, …)
  • IT Process Compliance: SDLC experience and formalized change controls
  • Working in DevOps teams, based on Agile principles (e.g. Scrum)
  • ITIL knowledge (especially incident, problem and change management)
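As a compact illustration of the SQL expectations above (data manipulation plus familiarity with views), here is a self-contained sketch using SQLite; the table, view, and column names are invented for the example:

```python
import sqlite3

# In-memory database with an invented orders table, plus a view that
# aggregates revenue per customer -- illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
    INSERT INTO orders (customer, amount) VALUES
        ('acme', 100.0), ('acme', 50.0), ('globex', 75.0);
    CREATE VIEW revenue_per_customer AS
        SELECT customer, SUM(amount) AS revenue
        FROM orders
        GROUP BY customer;
""")

# Query the view as if it were a table.
rows = conn.execute(
    "SELECT customer, revenue FROM revenue_per_customer ORDER BY customer"
).fetchall()
print(rows)
conn.close()
```

On an enterprise RDBMS such as Oracle or MySQL the same pattern applies, with views used to hide aggregation logic from downstream consumers.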

Languages: Fluent English skills

Specific information related to the position:
  • Physical presence in primary work location (Bangalore)
  • Flexible to work CEST and US EST time zones (according to team rotation plan)
  • Willingness to travel to Germany, US and potentially other locations (as per project demand)

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.


Our Hyperlearning workplace is grounded in four principles:

  • Flexible work arrangements, free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and an ethical corporate culture
