Bangalore, Karnataka, India
Information Technology
Full-Time
Impetus
Overview
Qualification
Need to hire GCP-enabled Module Leads and Leads with proficiency in data engineering technologies and languages. These hires should be able to drive and lead migration from on-premises systems to GCP for Amex use cases.
Role
7-10 years of experience implementing high-end software products.
Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.
Must have: Working knowledge of cloud computing platforms (GCP, especially the BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).
Should be familiar with columnar storage formats, e.g., Parquet, ORC, etc.
Visualize and evangelize next-generation infrastructure in the cloud platform/Big Data space (batch, near-real-time, and real-time technologies).
Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.
Develop and implement an overall organizational data strategy that is in line with business processes. The strategy includes data model design, database development standards, and the implementation and management of data warehouses and data analytics systems.
Expert-level proficiency in at least 4-5 GCP services.
Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
Strong understanding of and experience with distributed computing frameworks, particularly Apache Spark.
Experience working in a Linux environment and with command-line tools, including shell/Python scripting for automating common tasks.
Experience
7 to 10 years
Job Reference Number
12861