Job Description

Primary Skills: Databricks with PySpark, Python

Secondary Skills: ADF

  • 7+ years of experience with detailed knowledge of data warehouse technical architectures, ETL/ELT, and reporting/analytics tools.
  • 4+ years of work experience on Databricks, PySpark, and Python project development.
  • Expertise in big data ecosystems such as HDFS and Spark.
  • Strong hands-on experience in Python or Scala.
  • Hands-on experience with Azure Data Factory.
  • Able to convert SQL stored procedures to PySpark code using DataFrames.
  • Design and develop SQL Server stored procedures, functions, views, and triggers used during the ETL process.
  • SQL Server development experience with relational databases is a must.
  • Development of stored procedures for transformations in the ETL pipeline.
  • Should have work ...
Ready to Apply?

    Take the next step in your AI career. Submit your application to Hexaware Technologies today.
