Vadodara, Gujarat, India
Information Technology
Full-Time
IBM
Overview
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Responsibilities
- Experienced in building data pipelines that ingest, process, and transform data from files, streams, and databases, processing the data with Spark, Python, PySpark, Scala, and Hive, with HBase or other NoSQL databases, on cloud data platforms (AWS) or HDFS
- Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
- Experience in developing streaming pipelines
- Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka
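As a hypothetical illustration of the ingest, process, and transform pattern these responsibilities describe, here is a minimal sketch in plain Python (so it runs anywhere); on the actual platform this logic would be expressed with PySpark's DataFrame API reading from HDFS or S3, and the sample data below is invented for the example:

```python
import csv
import io

# Invented sample data, standing in for a raw CSV file landed on HDFS/S3.
RAW = """user,event,amount
alice,purchase,30
bob,view,0
alice,purchase,12
carol,purchase,5
"""

def ingest(text):
    """Parse raw CSV text into row dicts (the ingest step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Keep purchase events and aggregate amount per user --
    the kind of filter/groupBy a Spark job would express."""
    totals = {}
    for row in rows:
        if row["event"] == "purchase":
            totals[row["user"]] = totals.get(row["user"], 0) + int(row["amount"])
    return totals

result = transform(ingest(RAW))
print(result)  # {'alice': 42, 'carol': 5}
```

In PySpark the same transform would be roughly `df.filter(col("event") == "purchase").groupBy("user").agg(sum("amount"))`, with Spark distributing the work across the cluster.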
Master's Degree
Required Technical And Professional Expertise
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB
- Strong SQL skills
- Exposure to streaming solutions and message brokers such as Kafka
- Certification in AWS or Databricks, or Cloudera Certified Spark Developer