Job Description
Key Responsibilities
Design, build, and maintain end-to-end GCP data pipelines (batch and streaming).
Ensure data platform uptime and performance in alignment with defined SLAs/SLOs.
Develop and optimize ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
Manage and enhance data lakes and warehouses using BigQuery and Cloud Storage.
Implement streaming data solutions using Pub/Sub, Dataflow, or Kafka.
Build data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
Collaborate with DevOps to build CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
Participate in incident management, RCA, and change control processes following ITIL best practices.
Proactively identify opti...
Ready to Apply?
Take the next step in your AI career. Submit your application to Niveus Solutions today.