Job Description
Responsibilities:
- Designing and implementing data processes (e.g. ADF pipelines, Databricks)
- Developing and versioning data models
- Working with cloud-based data infrastructure (Azure)
- Developing and executing schema and data migrations
- Optimizing ETL performance

Requirements:
- University degree in computer science, mathematics, physics, or a similar field
- Fluent English and Polish language skills
- Technical knowledge (programming languages, databases, and frameworks)
- Data modelling concepts and techniques (data warehouse, transactional)
- Experience with Databricks & Delta Lake, Python, PowerShell, MS SQL Server, and T-SQL
- Good knowledge of the Azure data engineering stack: Azure Data Factory, Azure Data Lake, Cosmos DB
- Good working knowledge of DevOps, testing, operating, and optimizing complex data flows
- Experience working with ...
Ready to Apply?
Take the next step in your AI career. Submit your application to Cluster Reply today.