About the job:
Company Overview
ALIQAN Technologies is a leading IT services and consulting firm based in New Delhi, offering clients innovative design, software, and web development solutions all in one place. Our comprehensive approach aims to deliver pure satisfaction and convenience for our valued customers.
Job Overview
We are seeking a skilled Data Engineer with expertise in GCP to join our growing team at ALIQAN Technologies. This is a remote, mid-level position attached to our Noida office, open to professionals with 5 to 10 years of relevant experience. The ideal candidate will take our cloud-based data engineering initiatives to the next level.
Qualifications and Skills
Hands-on experience with Google Cloud Platform, including the ability to architect, configure, and operate GCP solutions (Mandatory skill).
Expertise in using Terraform for infrastructure automation and creating scalable cloud resources (Mandatory skill).
Proficiency in Data Fusion for orchestrating and managing ETL processes on Google Cloud (Mandatory skill).
Strong working knowledge of cloud storage concepts and managing large-scale data lakes using cloud-based storage solutions.
Solid hands-on experience in PySpark for large-scale data processing and efficient transformation of datasets in distributed environments.
Expertise in Cloud Dataflow for running data pipelines and handling real-time or batch data processing at scale within GCP.
Experience in orchestrating, scheduling, and monitoring workflows with Cloud Composer (Airflow), ensuring reliable pipeline execution.
Proven experience in utilizing BigQuery for advanced analytics, high-performance querying, and optimizing data warehouse solutions on GCP.
Ability to troubleshoot performance issues, debug failures, and ensure data integrity and compliance with best practices.
Clear written and verbal communication skills to effectively present technical solutions to cross-functional teams and stakeholders.
Roles and Responsibilities
Design, develop, and optimize scalable data pipelines on Google Cloud Platform leveraging Data Fusion and related cloud technologies.
Automate infrastructure deployment using Terraform to promote efficiency, repeatability, and version control across environments.
Build and maintain robust ETL workflows, ensuring timely ingestion, transformation, and loading of high-volume data sets into BigQuery and cloud storage.
Develop advanced transformations and data processing solutions using PySpark and Cloud Dataflow for both batch and real-time requirements.
Implement workflow orchestration with Cloud Composer (Airflow), monitoring runs and addressing failures proactively.
Work collaboratively with data science, analytics, and software engineering teams to define data requirements and technical specifications.
Document data engineering solutions, architecture decisions, and data models to support knowledge sharing and operational support.
Monitor, profile, and tune performance of pipelines and data solutions to ensure scalability, reliability, and optimal cost.
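To illustrate the transform-and-validate step at the heart of the ETL responsibilities above, here is a minimal plain-Python sketch (record shape and field names are purely illustrative, not part of ALIQAN's actual stack; in production this logic would typically run inside a PySpark or Cloud Dataflow job):

```python
from datetime import datetime

# Hypothetical raw event records, as they might arrive from an upstream source.
raw_events = [
    {"user_id": "42", "amount": "19.99", "ts": "2025-01-15T10:30:00+00:00"},
    {"user_id": "43", "amount": "oops", "ts": "2025-01-15T10:31:00+00:00"},  # malformed
]

def transform(record):
    """Parse and type-check one record; return None if it fails validation."""
    try:
        return {
            "user_id": int(record["user_id"]),
            "amount": float(record["amount"]),
            "event_ts": datetime.fromisoformat(record["ts"]),
        }
    except (KeyError, ValueError):
        return None

def run_batch(records):
    """Split a batch into clean rows and rejected rows (a dead-letter pattern)."""
    clean, rejected = [], []
    for r in records:
        out = transform(r)
        if out is not None:
            clean.append(out)
        else:
            rejected.append(r)  # keep the raw record for later inspection
    return clean, rejected

clean, rejected = run_batch(raw_events)
```

Routing malformed rows to a dead-letter list rather than failing the whole batch is one common way to meet the "ensure data integrity" and "address failures proactively" expectations listed above.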
Who can apply:
Only those candidates can apply who:
- have a minimum of 5 years of experience
- are Computer Science Engineering students
Salary:
₹ 9,60,000 - 10,80,000 /year
Experience:
5 year(s)
Deadline:
2025-12-26 23:59:59