Job Description

Responsibilities

  • Design, develop, and maintain data pipelines and ETL workflows using AWS services.
  • Implement orchestration and automation for data workflows.
  • Work with large datasets to ensure data integrity, scalability, and performance.
  • Collaborate with stakeholders to understand data requirements and deliver solutions.
  • Deploy changes directly to production environments with confidence and accountability.
  • Support migration efforts to Databricks and optimize workflows for performance.

Qualifications

  • Experience with data lake architectures, big data technologies, and data pipeline orchestration.
  • Familiarity with CI/CD practices for data engineering.
  • AWS Certification (e.g., AWS Certified Data Analytics - Specialty or Solutions Architect) is a plus.
  • Strong problem-solving skills and attention to detail.
  • Expert-level fluency in AWS services rele...

Ready to Apply?

Take the next step in your AI career. Submit your application to Michael Page today.
