Job Description

Key Responsibilities

  • Design, build, and maintain end-to-end GCP data pipelines for both batch and streaming workloads.
  • Ensure high availability, reliability, and performance of data platforms in line with defined SLAs and SLOs.
  • Develop and manage ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
  • Manage and enhance data lakes and data warehouses using BigQuery and Cloud Storage.
  • Implement real-time and streaming data solutions using Pub/Sub, Dataflow, or Kafka.
  • Build and expose data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
  • Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
  • Collaborate closely with DevOps teams to implement CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
  • Participate in incident management and root cause analysis for data platform issues.

Ready to Apply?

Take the next step in your data engineering career. Submit your application to VAYUZ Technologies today.