Bangalore, Karnataka, India
Information Technology
Full-Time
Koantek
Overview
Location: Mumbai.
Work mode: Hybrid.
Must-have skills: Databricks (hands-on Python, SQL, and PySpark with any cloud platform).
Job Summary
The Databricks AWS/Azure/GCP Architect at Koantek builds secure, highly scalable big data solutions that achieve tangible, data-driven outcomes while keeping simplicity and operational effectiveness in mind.
This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS/Azure/GCP architecture.
This role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.
Requirements
- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake (see the MLflow sketch after this list).
- Expert-level hands-on coding experience in Spark with Scala, Python, or PySpark (a minimal PySpark sketch follows this list).
- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib, as well as IoT/event-driven/microservices architectures in the cloud.
- Experience with private and public cloud architectures, pros/cons, and migration considerations.
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services.
- 9+ years of consulting experience, with a minimum of 7 years in data engineering, data platforms, and analytics.
- Delivered projects with hands-on Databricks development experience.
- Knowledge of at least one cloud platform (AWS, Azure, or GCP).
- Deep experience with distributed computing on Spark, including knowledge of Spark runtime internals.
- Familiarity with CI/CD for production deployments.
- Familiarity with optimization for performance and scalability.
- Completed a data engineering professional certification and the required classes.
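As a minimal illustration of the hands-on PySpark and Delta Lake work described above, the sketch below reads raw events from cloud object storage, cleans them with the DataFrame API, and writes a partitioned Delta table. It assumes a Databricks runtime (or a Spark session configured with delta-spark); the paths and column names are hypothetical, not taken from this posting.

```python
# Minimal PySpark + Delta Lake sketch (illustrative only; paths and
# column names are hypothetical, not from this posting).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lakehouse-ingest-sketch")
    .getOrCreate()
)

# Read raw events from cloud object storage (S3/ADLS/GCS all work the same way).
raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path

# Basic cleansing and enrichment with the DataFrame API.
cleaned = (
    raw
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
)

# Cache when the DataFrame is reused across multiple actions
# (the DataFrame/RDD caching mentioned in the requirements).
cleaned.cache()

# Write a Delta table, partitioned by date for scan performance.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("s3://example-bucket/delta/events/")  # hypothetical path
)
```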
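Similarly, since MLflow is named among the must-have open-source projects, here is a minimal experiment-tracking sketch; the experiment path, parameter, and metric values are hypothetical examples.

```python
# Minimal MLflow tracking sketch (illustrative; the experiment path,
# parameter, and metric values are hypothetical).
import mlflow

mlflow.set_experiment("/Shared/example-experiment")  # hypothetical workspace path

with mlflow.start_run():
    mlflow.log_param("max_depth", 5)   # example hyperparameter
    mlflow.log_metric("rmse", 0.42)    # example evaluation metric
```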