Overview
#Connections #hiring #Immediate #DataEngineer #Bhopal
Hi Connections,
We are hiring a Data Engineer for our client.
Job Title: Data Engineer – Real-Time Streaming & Integration (Apache Kafka)
Location: Bhopal, Madhya Pradesh
Key Responsibilities:
· Design, develop, and maintain real-time streaming data pipelines using Apache Kafka and Kafka Connect.
· Implement and optimize ETL/ELT processes for structured and semi-structured data from various sources.
· Build and maintain scalable data ingestion, transformation, and enrichment frameworks across multiple environments.
· Collaborate with data architects, analysts, and application teams to deliver integrated data solutions that meet business requirements.
· Ensure high availability, fault tolerance, and performance tuning for streaming data infrastructure.
· Monitor, troubleshoot, and enhance Kafka clusters, connectors, and consumer applications.
· Enforce data governance, quality, and security standards throughout the pipeline lifecycle.
· Automate workflows using orchestration tools and CI/CD pipelines for deployment and version control.
Required Skills & Qualifications:
· Strong hands-on experience with Apache Kafka, Kafka Connect, and Kafka Streams.
· Expertise in designing real-time data pipelines and stream processing architectures.
· Solid experience with ETL/ELT frameworks using tools like Apache NiFi, Talend, or custom Python/Scala-based solutions.
· Proficiency in at least one programming language: Python, Java, or Scala.
· Deep understanding of message serialization formats (e.g., Avro, Protobuf, JSON).
· Strong SQL skills and experience working with data lakes, warehouses, or relational databases.
· Familiarity with schema registry, data partitioning, and offset management in Kafka.
· Experience with Linux environments, containerization, and CI/CD best practices.
Preferred Qualifications:
· Experience with cloud-native data platforms (e.g., AWS MSK, Azure Event Hubs, GCP Pub/Sub).
· Exposure to stream processing engines like Apache Flink or Spark Structured Streaming.
· Familiarity with data lake architectures, data mesh concepts, or real-time analytics platforms.
· Knowledge of DevOps tools like Docker, Kubernetes, Git, and Jenkins.
Work Experience:
· 6+ years of experience in data engineering with a focus on streaming data and real-time integrations.
· Proven track record of implementing data pipelines in production-grade enterprise environments.
Education Requirements:
· Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
· Certifications in data engineering, Kafka, or cloud data platforms are a plus.
Interested candidates, kindly share your updated profile with pavani@sandvcapitals.com or reach us at 7995292089.
Thank you.
Job Type: Full-time
Pay: ₹1,200,000.00 - ₹1,500,000.00 per year
Schedule:
- Day shift
Experience:
- Data Engineer: 6 years (Required)
- ETL: 6 years (Required)
Work Location: In person