Dehradun, Uttarakhand, India
Information Technology
Full-Time
NationsBenefits
Overview
About NationsBenefits:
At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization — transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.
Position Overview:
We are seeking a self-driven Data Engineer with 4–7 years of experience to build and optimize scalable ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. The role involves working across scrum teams to develop data solutions, ensure data governance with Unity Catalog, and support real-time and batch processing. Strong problem-solving skills, T-SQL expertise, and hands-on experience with Azure cloud tools are essential. Healthcare domain knowledge is a plus.
Job Description:
- Work with multiple scrum teams to deliver each sprint's database programming requirements to a high standard of quality.
- Hands-on experience with advanced Python programming and Azure cloud services such as Databricks, Azure SQL, Data Factory (ADF), Data Lake, Azure Storage, and SSIS.
- Create and deploy scalable ETL/ELT pipelines on Azure Databricks using PySpark and SQL (an illustrative pipeline sketch follows this list).
- Create Delta Lake tables with ACID transactions and schema evolution to support real-time and batch processing.
- Experience with Unity Catalog for centralized data governance, access control, and data lineage tracking (a short governance sketch follows this list).
- Independently analyse, solve, and correct issues in real time, providing problem resolution end-to-end.
- Develop unit tests so that changes can be verified automatically (a unit-test sketch follows this list).
- Use SOLID development principles to maintain data integrity and cohesiveness.
- Interact with product owner and business representatives to determine and satisfy needs.
- Sense of ownership and pride in your performance and its impact on the company's success.
- Critical thinking and problem-solving skills.
- Team player.
- Good time-management skills.
- Great interpersonal and communication skills.
- 4–7 years of experience as a Data Engineer.
- Self-driven with minimal supervision.
- Proven experience with T-SQL programming, Azure Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog, and ADLS Gen2.
- Exposure to Microsoft TFS, Visual Studio, and DevOps practices.
- Experience with Azure or a comparable cloud platform.
- Analytical, problem-solving mindset.
- Healthcare domain knowledge is a plus.
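For illustration only, a minimal PySpark sketch of the kind of ETL/ELT pipeline and Delta Lake load referenced above, assuming it runs on Azure Databricks; the ADLS Gen2 path, column names, and table name are hypothetical placeholders.

# Minimal PySpark ETL sketch for Azure Databricks: read raw JSON from ADLS Gen2,
# apply basic cleansing, and append to a Delta Lake table with schema evolution.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/claims/"  # hypothetical source path
target_table = "main.benefits.claims_silver"                          # hypothetical Delta table

# Extract: batch read of raw JSON landed in the data lake
raw_df = spark.read.json(raw_path)

# Transform: basic typing, audit column, and de-duplication
clean_df = (
    raw_df
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["claim_id"])
)

# Load: append to a Delta table; ACID guarantees come from Delta Lake, and
# mergeSchema lets new columns evolve the table schema over time
(
    clean_df.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .saveAsTable(target_table)
)

The same transformation could be wired to a streaming read for real-time processing; the batch form is shown here for brevity.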
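A minimal sketch of Unity Catalog-style access control, issued as SQL from a notebook; the catalog, schema, table, and group names are hypothetical.

# Unity Catalog governance sketch: privileges are granted with plain SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # notebook-provided session on Databricks

spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_engineers`")          # hypothetical catalog/group
spark.sql("GRANT USE SCHEMA ON SCHEMA main.benefits TO `analysts`")
spark.sql("GRANT SELECT ON TABLE main.benefits.claims_silver TO `analysts`")
spark.sql("REVOKE MODIFY ON TABLE main.benefits.claims_silver FROM `analysts`")
# Lineage for tables read and written through Unity Catalog is captured by
# Databricks automatically and can be inspected in the catalog UI.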
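A small pytest sketch showing how a PySpark transformation can be unit-tested automatically against a local SparkSession; the function and column names are hypothetical.

# Unit-test sketch (pytest): exercises a small transformation on a local Spark session.
import pytest
from pyspark.sql import SparkSession, functions as F

def flag_valid_claims(df):
    # Hypothetical transformation under test: flags rows with a positive claim amount
    return df.withColumn("is_valid", F.col("claim_amount") > 0)

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

def test_flag_valid_claims(spark):
    df = spark.createDataFrame([(1, 10.0), (2, -5.0)], ["claim_id", "claim_amount"])
    result = {row["claim_id"]: row["is_valid"] for row in flag_valid_claims(df).collect()}
    assert result == {1: True, 2: False}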
Talk to us
Feel free to call, email, or hit us up on our social media accounts.
Email: info@antaltechjobs.in