Data Platform Engineer – Commodities Technology

Company: Millennium
Category: Information Technology
Employment Type: Full-Time

Overview

About Us

Founded in 1989, Millennium is a global alternative investment management firm. Millennium pursues a diverse array of investment strategies across industry sectors, asset classes and geographies. The firm’s primary investment areas are Fundamental Equity, Equity Arbitrage, Fixed Income, Commodities and Quantitative Strategies. We solve hard and interesting problems at the intersection of computer science, finance, and mathematics. We focus on innovating and rapidly applying those innovations to real-world scenarios, which enables engineers to work on interesting problems, learn quickly and have a deep impact on the firm and the business.

Within Millennium, the Commodities Technology team builds the data and analytics platforms that power our commodities investment strategies. We aggregate and process large volumes of fundamental and alternative data – including weather, supply/demand indicators, storage and transportation data – to provide our Portfolio Managers with a differentiated information edge.

The Role

We are seeking a Data Platform Engineer to help build the next-generation data platform (CFP) for the Commodities business.

In this role, you will design and implement the core platform infrastructure, APIs, and event‑driven services that power ingestion, transformation, cataloging, and consumption of commodities data. You will work across Python, SQL and modern cloud services to build resilient pipelines, orchestration frameworks, and system management tools with a strong focus on reliability, observability, and performance.

You will work closely with engineering teams in the US, Europe, and Singapore as well as with our commodities modelling and research teams in Bangalore to deliver a scalable platform that can support rapid experimentation and production workloads.

Key Responsibilities

  • Platform Engineering: Design and build the foundational data platform components, including event handling, system management tools, and query‑optimized storage for large‑scale commodities datasets.
  • Data Pipelines & Orchestration: Implement and maintain robust batch and streaming pipelines using Python, SQL, Airflow, and Kafka to ingest and transform data from multiple internal and external sources (see the Airflow sketch after this list).
  • Cloud Infrastructure: Develop and manage cloud‑native infrastructure on AWS (S3, SQS, RDS, Terraform), ensuring security, scalability, and cost efficiency.
  • API & Services Development: Build and maintain FastAPI‑based services and APIs for data access, metadata, and platform operations, enabling self‑service consumption by downstream users (see the FastAPI sketch after this list).
  • Performance & Reliability: Optimize queries, workflows, and resource usage to deliver low‑latency data access and high platform uptime; introduce monitoring, alerting, and automated testing (PyTest, CI/CD; see the test sketch after this list).
  • Collaboration & Best Practices: Partner with quantitative researchers, data scientists, and other engineers to understand requirements, translate them into platform capabilities, and promote best practices in code quality, DevOps, and documentation.
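
To give a concrete flavour of the orchestration work in the second bullet, here is a minimal sketch of a daily batch-ingestion DAG. It assumes Airflow 2.4+ (for the schedule argument); the DAG id, dataset, and target table are hypothetical illustrations rather than part of the actual CFP platform, and the Kafka streaming side is omitted.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+). All names here
# (dag_id, dataset, target table) are hypothetical illustrations only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_weather_data(**context):
    # A real pipeline would call an external feed or read from S3;
    # this stub returns a placeholder payload.
    return [{"station": "XYZ", "temp_c": 21.5}]


def load_to_warehouse(**context):
    # Pull the extracted rows from XCom and (pretend to) load them.
    rows = context["ti"].xcom_pull(task_ids="extract_weather_data")
    print(f"would load {len(rows)} rows into commodities.weather_obs")


with DAG(
    dag_id="weather_ingest_daily",  # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(
        task_id="extract_weather_data",
        python_callable=extract_weather_data,
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
    extract >> load
```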
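
In the same spirit, here is a minimal sketch of a FastAPI data-access service like the one described in the fourth bullet, assuming the platform exposes a metadata catalog over REST. The endpoint paths and the Dataset model are illustrative assumptions, not the actual CFP API; the in-memory dict stands in for a Postgres- or Snowflake-backed catalog.

```python
# Minimal FastAPI sketch of a self-service metadata endpoint. The paths,
# model, and catalog contents are hypothetical illustrations only.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="commodities-data-api")  # hypothetical service name


class Dataset(BaseModel):
    name: str
    description: str
    owner: str


# In-memory stand-in for a real catalog backed by Postgres or Snowflake.
CATALOG = {
    "weather_obs": Dataset(
        name="weather_obs",
        description="Daily weather observations",
        owner="commodities-data",
    ),
}


@app.get("/datasets")
def list_datasets() -> list[Dataset]:
    # Return every dataset registered in the catalog.
    return list(CATALOG.values())


@app.get("/datasets/{name}")
def get_dataset(name: str) -> Dataset:
    # Return metadata for one dataset, or 404 if it is unknown.
    dataset = CATALOG.get(name)
    if dataset is None:
        raise HTTPException(status_code=404, detail=f"unknown dataset {name!r}")
    return dataset
```

Saved as commodities_api.py (a hypothetical module name), this runs under any ASGI server, for example uvicorn commodities_api:app.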
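
Finally, on the testing side of the fifth bullet, a short PyTest sketch exercising the hypothetical service above, of the kind a CI pipeline (e.g., GitHub Actions) would run on every change. FastAPI's TestClient requires the httpx package.

```python
# PyTest sketch against the hypothetical commodities_api module above.
# FastAPI's TestClient needs the httpx package installed.
from fastapi.testclient import TestClient

from commodities_api import app  # hypothetical module from the sketch above

client = TestClient(app)


def test_list_datasets_includes_weather_obs():
    response = client.get("/datasets")
    assert response.status_code == 200
    assert any(d["name"] == "weather_obs" for d in response.json())


def test_unknown_dataset_returns_404():
    response = client.get("/datasets/does_not_exist")
    assert response.status_code == 404
```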

Required Qualifications

  • Experience: 4–8 years of software/data engineering experience, preferably building or operating data platforms or large‑scale data pipelines.
  • Programming: Strong proficiency in Python with solid software engineering practices (testing, code review, CI/CD).
  • Data & SQL: Hands‑on experience with SQL and relational databases (Snowflake, Postgres or similar); understanding of data modelling and query optimization.
  • Orchestration & Streaming: Practical experience with Airflow (or similar workflow orchestration tools) and message/streaming systems such as Kafka.
  • Cloud & Infrastructure as Code: Experience with AWS services (S3, SQS, RDS) and infrastructure‑as‑code tools such as Terraform.
  • APIs & Services: Experience building RESTful services, ideally with FastAPI or a similar Python web framework.
  • DevOps: Familiarity with Git‑based workflows, CI/CD tooling (e.g., GitHub Actions), and automated testing frameworks (PyTest).
  • Soft Skills: Strong communication skills, ability to work in a distributed team, and a pragmatic, ownership‑driven mindset.

Preferred Qualifications

  • Experience with columnar/analytic data formats and engines (e.g., Iceberg, ClickHouse, Parquet).
  • Exposure to monitoring/observability stacks (Prometheus, Grafana, OpenTelemetry, etc.).
  • Prior experience in financial markets or commodities data environments.
  • Experience working in high‑impact, globally distributed engineering teams.