Sr Data Engineer | Antal Tech Jobs

Sr Data Engineer

Information Technology
Full-Time
Illumina

Overview

What if the work you did every day could impact the lives of people you know? Or all of humanity?
At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients.
Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world changing projects, you will do more and become more than you ever thought possible.
JOB SUMMARY:

This position is an exciting opportunity to join the Data Integration & Analytics team within the GIS Application & Platform Services department. The team’s scope includes data services on enterprise data platforms such as the Snowflake cloud data platform, SAP HANA analytics, and Denodo data virtualization. The team is responsible for managing the full software development lifecycle of data, its quality, and its operations. This role supports strategic solutions such as the enterprise data lake on AWS/Snowflake and the enterprise data warehouse on Snowflake, and is responsible for collaborating with cross-functional teams, planning and coordinating requirements, providing data engineering services, and helping build trust in the data being managed.
JOB DUTIES:
  • Translate business requirements into data requirements, data warehouse design and sustaining data management strategies on enterprise data platforms like Snowflake.
  • Work with project leads, stakeholders, and business SMEs to define technical specifications to develop data modeling requirements and maintain data infrastructure to provide business users with the tools and data needed.
  • Architect, design, and develop large-scale, optimized analytics solutions.
  • Gather and analyze requirements; plan and coordinate development in collaboration with stakeholder teams.
  • Understand data requirements and data latency to design and develop data ingestion pipelines.
  • Design and architect data lake solutions on AWS S3 and Snowflake, considering scalability, automation, security, and performance requirements.
  • Architect, build, and optimize the data lake to store, process, and analyze large volumes of structured and unstructured data.
  • Understand data architecture and solution design; design and develop dimensional/semantic data models in an enterprise data warehouse environment.
  • Develop and automate enterprise data transformation pipelines.
  • Work with cross-functional teams and process owners to develop test cases and scripts; test models and solutions to verify that requirements are met and ensure high levels of data quality.
  • Develop and apply quality assurance best practices.
  • Design and apply data engineering best practices for data lakes and data warehouses.
  • Analyze data and data behaviors to support business user queries.
  • Understand the impact of changes to data platforms, data models, and data behaviors.
  • Apply strong problem-solving skills to troubleshoot complex data engineering issues.
  • Benchmark application operational performance periodically, track metrics, and fix issues.
  • Understand and comply with data governance and compliance practices as defined for risk management. This includes data encryption practices, RBAC and security policies.
  • Promote and apply metadata management best practices supporting enterprise data catalogs.
  • Support change and release management processes.
  • Support incident and response management, including problem solving, root cause analysis, and documentation.
  • Support automation and on-call processes (Tier 1 / Tier 2).
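The testing and data-quality duties above can be sketched as simple rule-based validations over a batch of records. A minimal, illustrative Python example follows; all field names, rules, and values here are hypothetical, not part of Illumina's actual tooling:

```python
# Hypothetical sketch of rule-based data-quality checks: each rule is a
# predicate over a record, and validate() counts failures per rule.
from typing import Callable

Record = dict

def not_null(field: str) -> Callable[[Record], bool]:
    """Check that a field is present and non-empty."""
    return lambda r: r.get(field) not in (None, "")

def in_range(field: str, lo: float, hi: float) -> Callable[[Record], bool]:
    """Check that a numeric field falls within [lo, hi]."""
    return lambda r: isinstance(r.get(field), (int, float)) and lo <= r[field] <= hi

def validate(batch: list[Record], rules: dict[str, Callable[[Record], bool]]) -> dict[str, int]:
    """Return a count of failing records per rule name."""
    return {name: sum(0 if rule(r) else 1 for r in batch) for name, rule in rules.items()}

batch = [
    {"sample_id": "S1", "yield_gb": 120.0},
    {"sample_id": "",   "yield_gb": -5.0},   # fails both rules
]
failures = validate(batch, {
    "sample_id_not_null": not_null("sample_id"),
    "yield_in_range": in_range("yield_gb", 0, 1000),
})
print(failures)  # {'sample_id_not_null': 1, 'yield_in_range': 1}
```

In practice such checks typically run as tests inside the pipeline (e.g., dbt tests), but the counting-failures-per-rule shape is the same.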
SPECIFIC SKILLS OR OTHER REQUIREMENTS:

Requires 6 years of experience.
Primary Experience: Data Integration and Data Warehousing
Data Platforms:
  • Required: Snowflake cloud data platform
  • Preferred: SAP HANA analytics
Data Engineering:
  • Data Integration: 5+ years of experience working with data integration technologies and ETL/ELT patterns of data delivery. Strong understanding of and experience implementing SDLC practices.
  • Snowflake: 2+ years of required expertise with Snowflake SnowSQL, Snowpipe (integrated with AWS S3), Streams and Tasks, stored procedures, MERGE statements, functions, RBAC, security policies, and compute/storage usage and performance optimization techniques.
  • dbt Cloud: 2+ years of expertise with the dbt Cloud platform; very good understanding of data models (views, data materializations, incremental data loads, snapshots), cross-functional references, DAGs and their impact, job scheduling, auditing and monitoring, and working with code repositories and deployments.
  • AWS: 1+ years of required expertise with AWS services like S3, Glue, Lambda, Athena.
  • Apache Software: Preferred expertise in data processing with Spark and Flink, message brokering with Kafka, orchestration with Airflow, and processing high data volumes in open table formats like Iceberg.
  • HVR (Fivetran): Experience or knowledge on data replication with HVR is a plus.
  • SnapLogic: Experience or knowledge on data integrations with SnapLogic is a plus.
  • Certifications: Snowflake and dbt Cloud data engineering certifications are a plus.
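The Snowflake MERGE statements and dbt incremental loads referenced above share one pattern: match staged rows to the target on a key, update matches, and insert the rest. A minimal, engine-agnostic sketch in plain Python (names are hypothetical; real implementations run as SQL inside the warehouse):

```python
# Toy illustration of MERGE (upsert) semantics: the target table is modeled
# as a dict keyed by the merge key; staged rows update matches and insert
# the rest, mirroring WHEN MATCHED / WHEN NOT MATCHED clauses.
def merge(target: dict, staged: list[dict], key: str) -> dict:
    """Upsert staged rows into target (a dict keyed by `key`)."""
    for row in staged:
        # WHEN MATCHED -> update existing columns; WHEN NOT MATCHED -> insert
        target[row[key]] = {**target.get(row[key], {}), **row}
    return target

target = {"A1": {"id": "A1", "status": "new"}}
staged = [
    {"id": "A1", "status": "shipped"},  # matches -> update
    {"id": "B2", "status": "new"},      # no match -> insert
]
merge(target, staged, "id")
print(target["A1"]["status"], len(target))  # shipped 2
```

In Snowflake the same logic is a single `MERGE INTO … USING … ON …` statement, often fed by a Stream and run by a Task; dbt generates an equivalent statement for incremental models with a `unique_key`.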
Source Systems:
  • Required: Knowledge and experience integrating data from SAP ERP (On premises, Cloud), Salesforce CRM, Workday, ServiceNow, Relational databases, REST APIs, Flat Files, Cloud Storage
Data Orchestration:
  • Required: Control-M, Apache Airflow
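The DAG-based scheduling idea behind orchestrators like Airflow and Control-M can be sketched with a topological sort: a task runs only after all of its upstream dependencies have completed. A toy illustration in Python (task names are hypothetical):

```python
# Kahn's topological sort over a task-dependency graph, the core ordering
# rule that DAG orchestrators enforce.
from collections import deque

def run_order(deps: dict[str, set[str]]) -> list[str]:
    """deps maps each task to the set of upstream tasks it waits on."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream: dict[str, list[str]] = {t: [] for t in deps}
    for t, up in deps.items():
        for u in up:
            downstream[u].append(t)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for d in sorted(downstream[t]):
            indegree[d] -= 1
            if indegree[d] == 0:          # all upstreams done -> runnable
                ready.append(d)
    if len(order) != len(deps):
        raise ValueError("cycle detected")  # a valid DAG has no cycles
    return order

print(run_order({"extract": set(), "load": {"extract"}, "transform": {"load"}}))
# ['extract', 'load', 'transform']
```

Real orchestrators add scheduling, retries, and parallelism on top, but the dependency-ordering guarantee is the same.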

Cloud Storage Platforms:
  • Required: Amazon Web Services
  • Preferred: Microsoft Azure

Programming/ Scripting:
  • Snowflake: Scripting (Snowpipes, Tasks, Streams, Merge Statements, Stored Procedures, Functions, Security Policies), SQL, Python, PySpark

Code Management:
  • Required: Excellent understanding of working with code repositories such as GitHub and GitLab, code version management, branching and merging patterns in a central repository, managing cross-functional code, and deployments.

Data Operations:
  • Excellent understanding of Data Ops practices for data management.

Solution Design:
  • Good understanding of end-to-end solution architecture and design practices; ability to document solutions (maintain diagrams).

Stakeholder Engagement:
  • Ability to take the lead, drive project activities in collaboration with Analytics stakeholders, and ensure requirements are completed.

Data Warehousing:
  • Excellent grasp of fundamental dimensional modeling concepts; experience working on data warehouse solutions, requirements gathering, design and build, data analysis, data quality, data validations, and developing data transformations using ELT/ETL patterns.
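As an illustration of the dimensional-modeling fundamentals above, the following toy Python sketch shows Type 2 slowly changing dimension (SCD) history-keeping, the pattern that dbt snapshots automate in the warehouse. All names and dates are hypothetical:

```python
# SCD Type 2 sketch: on an attribute change, close out the current dimension
# row (set valid_to) and append a new open-ended current version.
from datetime import date

def scd2_apply(history: list[dict], incoming: dict, key: str, attr: str, today: date) -> None:
    """Append-only Type 2 update of `attr` for one dimension member."""
    current = next((r for r in history
                    if r[key] == incoming[key] and r["valid_to"] is None), None)
    if current and current[attr] == incoming[attr]:
        return                      # no change: keep the current row open
    if current:
        current["valid_to"] = today  # close out the old version
    history.append({key: incoming[key], attr: incoming[attr],
                    "valid_from": today, "valid_to": None})

history = [{"customer_id": "C1", "region": "EMEA",
            "valid_from": date(2024, 1, 1), "valid_to": None}]
scd2_apply(history, {"customer_id": "C1", "region": "APAC"},
           "customer_id", "region", date(2024, 6, 1))
print(len(history), history[0]["valid_to"], history[1]["region"])
# 2 2024-06-01 APAC
```

Fact tables then join to the dimension version that was valid at the time of the event, which is what makes point-in-time reporting possible.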

Data As-A-Product:
  • Preferred: Knowledge and experience working with data treated as data products. Illumina follows a hybrid data mesh architecture that promotes data as a product for data lifecycle management.

Governance:
  • Good understanding of working in companies with regulated systems and processes for data. Adherence to data protection practices using tagging, security policies, and data security (object-level, column-level, and row-level). Promote and apply best practices for data catalogs, following data classification practices and metadata management for data products within your scope.
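The row-level protection practice described above can be illustrated with a toy sketch: each role maps to a filter predicate, and queries only ever see the rows the caller's role permits. Snowflake row access policies implement this declaratively in SQL; the Python below is only a hypothetical illustration with made-up roles and data:

```python
# Toy role-based row-level security: a role resolves to a row predicate,
# unknown roles are denied by default, and selects apply the predicate.
ROW_POLICIES = {
    "analyst_emea": lambda row: row["region"] == "EMEA",
    "admin":        lambda row: True,
}

def secure_select(rows: list[dict], role: str) -> list[dict]:
    policy = ROW_POLICIES.get(role, lambda row: False)  # deny by default
    return [r for r in rows if policy(r)]

rows = [{"region": "EMEA", "revenue": 10}, {"region": "APAC", "revenue": 20}]
print(len(secure_select(rows, "analyst_emea")), len(secure_select(rows, "admin")))  # 1 2
```

Column-level protection follows the same shape with masking functions applied per field instead of per row.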

Operating Systems:
  • Windows, Linux

EDUCATION & EXPERIENCE:

Bachelor’s degree in Computer Science/Engineering or an equivalent degree.
#LI-HYBRID
#illuminacareers

Illumina believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information.