Job Title
Senior Data Platform Developer (GCP - BigQuery)
Job Overview
We are seeking an experienced Senior Data Platform Developer to join our team. The ideal candidate will have strong hands-on expertise in AlloyDB (or PostgreSQL) for transactional (OLTP) systems and in BigQuery for large-scale analytics (OLAP), along with exposure to Snowflake or a similar cloud data warehouse. This role suits someone passionate about data and performance, with a solid background in data modeling, performance tuning, data reduction, archiving, and permissioning in cloud-native environments.
Key Responsibilities
Design, develop, and maintain scalable data pipelines to ingest, process, and store data in AlloyDB and BigQuery.
Administer and tune AlloyDB for PostgreSQL instances for high availability, reliability, and OLTP performance.
Build and manage analytical data models in BigQuery, enabling efficient OLAP reporting and dashboarding use cases.
Optimize data architectures with best practices for partitioning, archiving, compression, and cost-efficiency.
Implement and manage data governance, security, and permissioning policies across platforms.
Develop and maintain data validation and monitoring solutions to ensure pipeline reliability and data quality.
Support performance tuning and schema optimization efforts across both transactional and analytical systems.
Collaborate with cross-functional teams including analysts, engineers, and product owners to deliver end-to-end data solutions.
Contribute to documentation and knowledge sharing across the team.
Key Skills & Requirements
7+ years of experience in database development with a strong focus on PostgreSQL or AlloyDB.
Proven expertise in OLTP and OLAP architecture and performance tuning.
Extensive hands-on experience with Google Cloud Platform (GCP), especially BigQuery and AlloyDB.
Solid understanding of data modeling, indexing, and query optimization for large-scale systems.
Experience with Snowflake, Redshift, or similar cloud data warehouses is a strong plus.
Proficient in SQL and one or more programming languages (e.g., Python or Java) for data pipeline and automation tasks.
Strong understanding of data lifecycle management, including archiving, retention, and compliance.
Familiarity with data security, role-based access control (RBAC), and related access control mechanisms in cloud environments.
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.