Job Description

Responsibilities

  • Design, build, and maintain end-to-end ETL/ELT pipelines (Python/SQL).

  • Orchestrate workflows using Apache Airflow (scheduling, retries, SLA management, monitoring).

  • Design and implement data models (staging → core → marts) for reporting and application/API consumption.

  • Implement automated data quality controls: completeness, consistency checks, reconciliation, anomaly detection, and alerting.

  • Integrate with source systems (databases, APIs, files) and publish processed datasets to relational databases.

  • Define and maintain data contracts and metrics in collaboration with backend and business stakeholders.

  • Maintain clear documentation (field definitions, lineage, schema evolution).

  • Improve reliability and reduce manual interventions (runbooks, debugging, observability).
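For illustration only, the automated quality controls listed above often reduce to small, testable checks like the following (a minimal sketch; the function names, field names, and thresholds are hypothetical, not part of the posting):

```python
# Minimal sketch of two data-quality controls named in the posting:
# a completeness check and a source-to-target reconciliation.
# All names and sample values are hypothetical.

def completeness(rows, required_fields):
    """Fraction of rows with all required fields present and non-null."""
    if not rows:
        return 0.0
    ok = sum(
        1 for r in rows
        if all(r.get(f) is not None for f in required_fields)
    )
    return ok / len(rows)

def reconcile(source_total, target_total, tolerance=0.0):
    """True if the loaded total matches the source total within tolerance."""
    return abs(source_total - target_total) <= tolerance

if __name__ == "__main__":
    rows = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": None},  # missing value: fails completeness
    ]
    print(completeness(rows, ["id", "amount"]))  # 0.5
    print(reconcile(10.0, 10.0))                 # True
```

In practice checks like these would run as tasks in the orchestration layer (e.g. Airflow) and raise alerts when a threshold is breached, rather than printing results.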

Hybrid work from Warszawa: 3 days in the office, 2 days remote.

Ready to Apply?

Take the next step in your data engineering career. Submit your application to Michael Page today.
