Job Description

Requirement:

  • Mandatory knowledge of Big Data architecture patterns and experience delivering Big Data and Hadoop ecosystem solutions.
  • Strong GCP experience required; must have delivered multiple large projects involving Google BigQuery and ETL.
  • Experience working on GCP-based Big Data deployments (batch/real-time) leveraging components such as BigQuery, Airflow, Google Cloud Storage, Data Fusion, Dataflow, Dataproc, etc.
  • Should have experience with SQL and data warehousing.
  • Expert in programming languages such as Java and Scala, with strong Hadoop ecosystem expertise.
  • Expert in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
  • Should have worked with at least one orchestration tool, such as Oozie, Airflow, Control-M, or Kubernetes.
  • Experience with performance tuning, optimization, and data security.
Preferred Experience and Knowledge:

  • Excellent understanding of the data technologies landscape / ecosystem

Ready to Apply?

Take the next step in your AI career. Submit your application to Tata Consultancy Services Limited today.
