Job Description

Roles and Responsibilities

  • Data modelling and design
  • Excellent coding skills in ETL tools.
  • Experience running ETL pipelines from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies with real-time monitoring and alerting.
  • Essential experience with distributed systems software development.
  • Critical experience in performance optimization for both data loading and ingestion.
  • Know-how of a critical developer's toolkit such as Linux, GitHub, Docker, VS Code, Jupyter
  • Ability to work in a fast-paced, deadline-driven environment.
  • Experience working on public clouds (AWS / Azure)
  • Experience with migration projects from on-prem to cloud
  • Google Cloud Platform (minimum 2 years' experience with the following services)
  • Data Ingestion – Data Fusion, Cloud Storage, BigQuery
  • Data Preparation – Dataprep, Dataflow, Cloud Functions, BigQu...
Ready to Apply?

    Take the next step in your AI career. Submit your application to Multi Recruit today.