Job Description

- 8–10 years of experience in data engineering or ETL development, including at least 4 years with Google Cloud Dataflow.
- Strong hands-on experience with Google Cloud data services such as Dataflow, Cloud Storage, BigQuery, Cloud Composer, and Secret Manager.
- Strong understanding of ETL/ELT concepts and experience migrating data at terabyte scale.
- Develop and optimize end-to-end data pipelines using Dataflow.
- Develop and implement generic, reusable pipelines for integrating incremental data.
- Experience leading a data processing team and delivering high-quality results.
- Ensure data quality throughout the data pipeline.
Roles & Responsibilities
- Design, implement, and optimize data engineering solutions over large data volumes (TB to PB scale) using GCP data services.
- Demonstrate proven expertise in GCP services including Dataflow, BigQuery, Cloud Storage, Cloud Composer, and Cloud Functions; build scalable data lakes and pipelines.
- ...

Ready to Apply?

Take the next step in your AI career. Submit your application to Impetus today.
