Job Description

Responsibilities:
  • Design and implement end-to-end data pipelines using Cloud Dataflow (Python/Apache Beam) for batch and streaming data.
  • Develop, optimize, and maintain BigQuery stored procedures (SPs), SQL scripts, and user-defined functions (UDFs) for complex transformations and business logic implementation.
  • Build and manage data orchestration workflows using Cloud Composer (Airflow) with appropriate operators and dependencies.
  • Establish secure and efficient connections to source systems for data ingestion and integration.
  • Manage data ingestion workflows from on-premise and cloud sources into Google Cloud Storage (GCS) and BigQuery.
  • Execute historical data migration from legacy data warehouses (preferably Snowflake, Teradata, Netezza, Oracle, or SQL Server) to BigQuery, ensuring accuracy and performance optimization.
  • Design and maintain data validation and testing frameworks fo...

Ready to Apply?

    Take the next step in your AI career. Submit your application to Cynet Systems today.