Job Description

  • Designing and implementing data processes (e.g. ADF Pipelines, Databricks)
  • Developing and versioning data models
  • Working with cloud-based data infrastructures (Azure)
  • Development and execution of schema and data migrations
  • Optimizing ETL performance
  • University degree in computer science, mathematics, physics or similar field
  • Fluent English and Polish language skills
  • Technical knowledge (Programming languages, Databases and Frameworks)
  • Data modelling concepts and techniques (data warehouse, transactional)
  • Experience with Databricks & Delta Lake, Python, PowerShell, MS SQL Server, T-SQL
  • Good knowledge of the Azure data engineering stack: Azure Data Factory, Azure Data Lake, Cosmos DB
  • Good working knowledge of DevOps, testing, and operating and optimizing complex data flows
  • Experience working with ...

  Ready to Apply?

    Take the next step in your AI career. Submit your application to Cluster Reply today.