Job Description

We are looking for a PySpark developer with an ETL background to design and build solutions on one of our customer programs. The goal is to build a standardized data and curation layer that integrates data across internal and external sources, provides analytical insights, and integrates with the customer's critical systems.

Roles and Responsibilities

  • Ability to design, build, and unit test applications in Spark/PySpark
  • In-depth knowledge of Hadoop, Spark, and similar frameworks
  • Ability to understand existing ETL logic and convert it into Spark/PySpark
  • Solid hands-on experience with OOP concepts
  • Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs
  • Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources
  • Experience in working with Bitbucket a...

Ready to Apply?

    Take the next step in your AI career. Submit your application to Techno Wise today.

    Submit Application