1,800,000 INR - Yearly
Bangalore, KA, India
Information Technology
Full-Time
Blurgs Innovation
Key Responsibilities
- Design, implement, and maintain ingestion adapters for diverse real-time data sources, including protocol decoding, structural validation, and early rejection of malformed or incomplete messages.
- Convert decoded data into a stable canonical event envelope containing timestamps, source identity, normalized measurements, quality metadata, and complete lineage information.
- Develop time-synchronization, state-buffering, interpolation, and reference-frame normalization services to align observations across multiple streams.
- Own the central replayable event-transport layer built on Kafka (KRaft mode), so that all downstream components operate from the same event history.
- Build Apache Flink stream-processing applications for event-time windowed correlation, hypothesis management, belief combination, and multi-stream reasoning while preserving uncertainty and conflicts.
- Implement the complete persistence strategy: Redis for hot operational and belief state, PostgreSQL for curated reference data and rules, ClickHouse for long-term history, lineage tables, and replay data.
- Guarantee deterministic recovery and hot-state rebuilds from checkpoints and replay, without silently resetting state or losing internal identities.
- Design and enforce explicit identity lifecycle rules (observation → internal track → hypothesis → fused object) with full audit trail for all merges, splits, and relinks.
- Collaborate with ML/AI engineers to treat model outputs strictly as evidential input (never as final truth) while preserving complete source lineage for every operational conclusion.
- Support first-class handling of partial or unresolved observations with proper uncertainty representation.
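As a rough illustration of the canonical event envelope described above (field names and structure here are hypothetical, not the actual schema):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CanonicalEvent:
    """Hypothetical canonical envelope: one normalized record emitted by an
    ingestion adapter, regardless of the source protocol it was decoded from."""
    event_time: float    # source-reported timestamp (epoch seconds)
    ingest_time: float   # when the adapter received the message
    source_id: str       # stable identity of the producing source
    measurements: dict   # normalized measurement values
    quality: dict        # e.g. confidence, staleness, validation flags
    lineage: tuple       # ordered processing steps applied so far


# An adapter would emit one immutable envelope per validated message:
evt = CanonicalEvent(
    event_time=1700000000.0,
    ingest_time=1700000000.4,
    source_id="sensor-042",
    measurements={"speed_mps": 12.3},
    quality={"confidence": 0.9},
    lineage=("decode", "validate", "normalize"),
)
```

Keeping the envelope immutable and carrying lineage as an ordered tuple is one way to preserve the audit trail the role calls for.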
Mandatory Technical Skills
- 5+ years of production experience building real-time streaming data platforms.
- Deep expertise in Apache Kafka (KRaft mode) and Apache Flink (stateful event-time processing, windowed operations, exactly-once semantics).
- Strong Python skills, including high-performance data parsing and binary protocol handling (ONNX integration experience is a plus).
- Hands-on production experience with Redis, PostgreSQL, and ClickHouse (or equivalent OLAP store).
- Proven track record designing systems with strong data lineage, auditability, and full replay capabilities.
- Solid understanding of temporal data handling, uncertainty management, and event-time versus processing-time semantics.
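To illustrate the event-time versus processing-time distinction in the last bullet, here is a minimal sketch of tumbling event-time window assignment (the helper name and window width are illustrative):

```python
def event_time_window(ts: float, width: float = 10.0) -> tuple:
    """Assign a timestamp to its tumbling event-time window [start, end)."""
    start = ts - (ts % width)
    return (start, start + width)


# A late-arriving event is bucketed by when it happened,
# not by when the system processed it:
on_time = event_time_window(1000.0)  # window (1000.0, 1010.0)
late = event_time_window(1004.0)     # same window, even if processed later
```

Frameworks like Flink apply exactly this idea, combined with watermarks, so that late data still lands in the window where it occurred.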