Bangalore, Karnataka, India
Information Technology
Prodapt
Overview:
As a Data Engineering Architect, you will design and implement robust data solutions that leverage both GCP and Hadoop technologies. You will create scalable data pipelines, optimize data storage, ensure data quality and accessibility, lead data migration projects, and provide pre-sales support to help articulate our data solutions to potential clients. Your expertise will be crucial in transforming complex datasets into actionable insights that align with our business objectives, and your domain knowledge will be instrumental in translating business requirements into effective data strategies.
Responsibilities:
- Design and architect data solutions on Google Cloud Platform (GCP) and Hadoop ecosystems to support data ingestion, processing, and storage.
- Develop and maintain data pipelines using tools such as Apache Beam, Dataflow, Hadoop, BigQuery, and Hive (a minimal pipeline sketch follows this list).
- Lead data migration projects, ensuring seamless transition of data from on-premises to cloud environments.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Provide pre-sales support by engaging with potential clients, understanding their data needs, and presenting tailored solutions.
- Implement data governance and data quality frameworks to ensure the integrity and reliability of data.
- Optimize data models and storage solutions for performance and cost efficiency.
- Provide technical leadership and mentorship to junior data engineers and analysts.
- Stay current with the latest trends and technologies in data engineering, GCP, and Hadoop.
- Work closely with stakeholders to understand their data needs and provide actionable insights.
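For illustration, a minimal sketch of the kind of Beam/Dataflow pipeline this role involves, assuming the Apache Beam Python SDK; the bucket path, file layout, and field names are hypothetical, and the local DirectRunner is used by default.

# A minimal batch pipeline sketch: read a two-column CSV from Cloud Storage,
# aggregate amounts per user, and write the totals back out.
# Bucket and file names below are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line):
    # Assumes a simple "user_id,amount" layout with no header row.
    user_id, amount = line.split(",")
    return (user_id, float(amount))

def run():
    options = PipelineOptions()  # DirectRunner locally unless overridden
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/transactions.csv")
            | "Parse" >> beam.Map(parse_csv_line)
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_totals")
        )

if __name__ == "__main__":
    run()

The same pipeline runs on Cloud Dataflow without code changes by passing --runner=DataflowRunner together with the usual project, region, and staging options.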
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field; Master’s degree preferred.
- 10 to 15 years of proven experience as a Data Engineer or Data Architect, with a strong focus on GCP and Hadoop.
- In-depth knowledge of GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Dataproc, as well as Hadoop components like HDFS, MapReduce, and Hive.
- Demonstrated experience in designing data pipelines and executing data migration projects.
- Experience with data modeling, ETL processes, and data warehousing concepts.
- Familiarity with programming languages such as Python, Java, or Scala.
- Strong analytical skills and the ability to work with large data sets.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication and collaboration skills.
- The following skills are desirable:
- Experience with machine learning and data analytics tools.
- Knowledge of data privacy regulations and best practices.
- Certifications in GCP (e.g., Google Cloud Professional Data Engineer) and Hadoop-related technologies are a plus.