Job Description

About the job RQ08100: Software Developer - ETL

General Responsibilities

This role is responsible for designing, developing, maintaining, and optimizing ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics. The developer will work closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.

  • Use Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
  • Use Azure Databricks/PySpark (strong Python/PySpark knowledge required) to build transformations of raw data into the curated zone of the data lake.
  • Use Azure Databricks/PySpark/SQL (strong SQL knowledge required) to develop and troubleshoot transformations of curated data into FHIR.
  • Understand the requirements and recommend changes to models to support ETL desi...
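The responsibilities above center on handling Change Data Capture while moving data into curated zones. As a rough illustration only (not part of the posting), the core logic of applying CDC events can be sketched in plain Python; on Databricks this would typically be a Delta Lake MERGE written in PySpark, and all record and field names below are hypothetical:

```python
def apply_cdc(current, events):
    """Fold a stream of CDC events into a current-state dict keyed by record id.

    Each event is a dict with an "op" ("insert", "update", or "delete"),
    an "id", and, for upserts, a "data" payload. Mirrors the semantics of
    a Delta Lake MERGE: newest version wins, deletes remove the key.
    """
    state = dict(current)  # copy so the caller's snapshot is untouched
    for ev in events:
        op, key = ev["op"], ev["id"]
        if op in ("insert", "update"):
            state[key] = ev["data"]   # upsert: latest payload wins
        elif op == "delete":
            state.pop(key, None)      # tolerate deletes of unseen keys
    return state


# Hypothetical event stream for two records
events = [
    {"op": "insert", "id": 1, "data": {"name": "Ada"}},
    {"op": "update", "id": 1, "data": {"name": "Ada L."}},
    {"op": "insert", "id": 2, "data": {"name": "Grace"}},
    {"op": "delete", "id": 2},
]
result = apply_cdc({}, events)  # record 1 updated, record 2 deleted
```

In a real Databricks pipeline the same effect is usually achieved with `MERGE INTO` on a Delta table or the `APPLY CHANGES INTO` syntax of Delta Live Tables rather than in-memory Python.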

Ready to Apply?

Take the next step in your AI career. Submit your application to Rubicon Path today.
