Job Description – Senior Data Engineer (GCP)
Role Overview
We are looking for a Senior Data Engineer with 4+ years of experience building scalable data pipelines and strong hands-on expertise in Google Cloud Platform (GCP). The ideal candidate will work closely with data science, analytics, and business teams to design, build, and optimize reliable data solutions.
Key Responsibilities
Design, develop, and maintain scalable ETL/ELT data pipelines on GCP
Build and optimize BigQuery datasets, views, and partitioned and clustered tables
Develop batch and near-real-time pipelines using Dataflow / Apache Beam
Ingest data from multiple sources (APIs, databases, files, streaming systems)
Implement data quality checks, validation, and monitoring
Optimize query performance and control GCP cost usage
Work with Cloud Storage, Pub/Sub, and Composer (Airflow) for orchestration
Collaborate with Data Scientists and Analysts to support ML and BI use cases
Ready to Apply?
Take the next step in your AI career. Submit your application to Mindsprint today.