Pune, Maharashtra, India
Information Technology
Full-Time
Phenom
Overview
Job Requirements
About PHENOM:
PHENOM is a global HR tech company that is revolutionizing the talent experience. Our AI-powered platform helps companies hire faster, develop better, and retain longer. We're a passionate team building solutions that empower candidates, recruiters, managers, and employees with meaningful, data-driven insights.
Phenom People is looking for bright and motivated Data Engineers to play a key role in building the next-generation Enterprise Data Lake. The ideal candidate will be passionate about building an extremely large, scalable data lake on the cloud and will want to be part of a team committed to delivering data-driven innovations at Phenom. We are one of the fastest-growing global Talent Relationship Marketing Cloud platforms, with a strong technology orientation.
Responsibilities
- Translate business and functional requirements into robust, scalable solutions that work well within the overall data architecture.
- Develop and maintain scalable data pipelines and build new API integrations.
- Design, develop, implement, test, document, and operate large-scale, high-volume, low-latency applications.
- Design data integrations and a data quality framework.
- Participate in the full development life cycle, end to end: design, implementation, testing, documentation, delivery, and support.
- Work on and deliver end-to-end projects independently.
Must have
- Excellent knowledge of Python programming.
- 3+ years of big data development experience using Spark.
- Experience with dimensional modeling, data warehousing, and building ETL pipelines.
- Strong expertise in SQL and experience writing complex queries.
- Knowledge of building stream-processing platforms using Kafka and Spark Streaming.
- Knowledge of job orchestration frameworks such as Airflow, Oozie, or Luigi.
- Experience with AWS services such as S3, EMR, and RDS.
- Good understanding of cloud data warehouses such as Snowflake is an added advantage.
- Good understanding of distributed SQL engines such as Presto and Druid.
- Knowledge of stream-processing frameworks such as Flink.
- Knowledge of NoSQL databases such as HBase and Cassandra.
Benefits
- Competitive salary for a startup
- Gain experience rapidly
- Work directly with executive team
- Fast-paced work environment
Talk to us
Feel free to call, email, or hit us up on our social media accounts.
Email
info@antaltechjobs.in