Job Description

Key Responsibilities:
  • Design, develop, and maintain scalable data pipelines using GCP services.
  • Build ETL/ELT processes to ingest, transform, and load data from multiple sources.
  • Develop custom data processing applications and logic using Java.
  • Optimize data workflows for performance, reliability, and cost-efficiency.
  • Work with cloud-native GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions (an illustrative pipeline sketch follows this list).
  • Collaborate with data analysts, data scientists, and business teams to understand requirements and deliver solutions.
  • Implement data quality checks, error handling, and monitoring for data pipelines.
  • Ensure compliance with data governance, security, and privacy standards.
  • Troubleshoot data issues and production incidents related to pipelines or processing logic.
  • Participate in code reviews, technical design discussions, and continuous improvement initiatives.
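
For illustration only, the sketch below shows the kind of pipeline work described above: a minimal Apache Beam (Dataflow) streaming pipeline in Java that reads messages from Pub/Sub and appends them to an existing BigQuery table. This is not part of the role's actual codebase; the class name, project, subscription, table, and column names are placeholders, and the transform is intentionally trivial.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

/** Minimal streaming sketch: Pub/Sub -> trivial transform -> BigQuery (placeholder names throughout). */
public class PubSubToBigQuery {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // Read raw message payloads from a Pub/Sub subscription (placeholder resource name).
        .apply("ReadFromPubSub",
            PubsubIO.readStrings()
                .fromSubscription("projects/example-project/subscriptions/example-sub"))
        // Wrap each payload in a TableRow; a real pipeline would parse, validate, and enrich here.
        .apply("ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via(payload -> new TableRow().set("payload", payload)))
        .setCoder(TableRowJsonCoder.of())
        // Append rows to an existing BigQuery table (placeholder table reference).
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("example-project:example_dataset.raw_events")
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}
```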

Ready to Apply?

Take the next step in your data engineering career. Submit your application to BCForward today.
