Job Description

  • 8–10 years of experience in data engineering or ETL development, including at least 4 years with Google Cloud Dataflow.
  • Strong hands-on experience with Google Cloud data services (e.g., Dataflow, Cloud Storage, BigQuery, Cloud Composer, Secret Manager).
  • Strong understanding of ETL/ELT concepts and experience migrating data at TB scale.
  • Develop and optimize end-to-end data pipelines using Dataflow.
  • Develop and implement generic, reusable pipelines for integrating incremental data.
  • Expertise in leading a data processing team and delivering high-quality results.
  • Ensure data quality throughout the data pipeline.


Roles & Responsibilities


  • Proficiency in designing, implementing, and optimizing data engineering solutions over large volumes of data (TB to PB scale) using GCP data services.
  • Proven expertise in GCP services including Dataflow, BigQuery, Cloud Storage, and Cloud Composer.

Ready to Apply?

Take the next step in your AI career. Submit your application to Impetus today.
