Job Description

What you'll be doing:

  • Design, develop, and optimize DBT models to support scalable data transformations
  • Architect and implement modern ELT pipelines using DBT and orchestration tools like Apache Airflow and Prefect
  • Lead performance tuning and query optimization for DBT models running on Snowflake, Redshift, or Databricks
  • Integrate DBT workflows and pipelines with AWS services (S3, Lambda, Step Functions, RDS, Glue) and event-driven architectures
  • Implement robust data ingestion processes from multiple sources, including manufacturing execution systems (MES), manufacturing stations, and web applications
  • Manage and monitor orchestration tools (Airflow, Prefect) for automated DBT model execution
  • Implement CI/CD best practices for DBT, ensuring version control, automated testing, and deployment workflows
  • Troubleshoot data pipeline issues and provide solutions for optimizing cost and performance

Ready to Apply?

Take the next step in your data engineering career. Submit your application to Evnek today.

Submit Application