Noida, Uttar Pradesh, India
Social Good & Community Development
Full-Time
IDfy
Overview
Experience -
- A strong understanding of and hands-on experience with data warehouses; a minimum of 4-6 years of experience in any data warehousing stack is a must.
- Experience in creating & managing data ingestion pipelines based on the ELT (Extract, Load & Transform) model, at scale (see the sketch after this list).
- You own defining Data Models, Transformation Logic & Data Flow in your current role.
- Working knowledge of Logstash, Apache Beam & Dataflow, Apache Airflow, ClickHouse, Grafana, InfluxDB/VictoriaMetrics, and BigQuery.
- Understanding of Product Development Methodologies; we follow the Agile methodology.
- You have a knack for understanding data & can derive insights from it.
- Experience with any time-series database (we use InfluxDB & VictoriaMetrics) and with alerting / anomaly detection frameworks.
- Visualization Tools: Metabase / Power BI / Tableau.
- Experience in developing software in the cloud, e.g., on GCP / AWS.
- A passion for exploring new technologies and for expressing yourself through technical blogs.
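To make the ELT requirement above concrete, here is a minimal, illustrative sketch of an ingestion DAG in Apache Airflow, where raw data is loaded into the warehouse first and transformed there afterwards. This is a sketch under assumptions, not a description of IDfy's actual pipelines: the DAG id, task names, and callables (load_raw_events, run_transformations) are hypothetical placeholders.

```python
# Minimal illustrative ELT DAG: extract & load raw data first,
# then transform inside the warehouse. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_raw_events():
    # Extract records from a source system and load them unmodified
    # into a staging table in the warehouse (e.g. BigQuery).
    pass


def run_transformations():
    # Run SQL transformations on the already-loaded raw data;
    # in ELT the "T" happens after the "L", inside the warehouse.
    pass


with DAG(
    dag_id="elt_ingestion_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_raw_events", python_callable=load_raw_events)
    transform = PythonOperator(task_id="run_transformations", python_callable=run_transformations)

    # Transform only after the raw load has completed.
    load >> transform
```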
What your day will look like -
- Architecting & implementing Data Pipelines & Frameworks to provide a better developer experience for our dev teams.
- Helping other PODs in IDfy define their data landscape and onboarding them onto our platform.
- Managing the platform and its scale, and optimizing the infrastructure for cost as we grow.
- Providing guidance and support to your team members, helping them develop their skills and grow professionally.
- Tracking project progress, identifying risks, and developing contingency plans to ensure that projects pass the IDfy quality gates and are delivered on time and within budget.
- Keeping abreast of the latest trends and technologies in Data Engineering, GenAI, and Data Science.