Job Description

Job Duties

We are implementing a strict Medallion Architecture to organize petabytes of industrial data. This role is for a Data Engineer who excels at transforming raw chaos into structured, queryable assets.

You will build and maintain the ELT pipelines that move data from Bronze (Raw) through Silver (Cleaned) to Gold (Aggregated). You will work with Delta Lake (on-prem/Databricks), Polars, and Airflow to ensure data quality and availability for Data Scientists and the Knowledge Graph.

What You’ll Do

  • Pipeline Development: Develop and maintain robust Airflow DAGs to orchestrate complex data transformations.
  • Data Transformation: Use Spark (when scale requires) and Polars to clean, enrich, and aggregate data according to business logic.
  • Architecture Implementation: Enforce the Medallion Architecture patterns, ensuring clear separation of concerns between data layers.

Ready to Apply?

Take the next step in your data engineering career. Submit your application to Halliburton today.
