Job Description
Responsibilities:
- Collaborate in a cross-functional team to design, optimize, and maintain data pipelines
- Develop and maintain Python applications for a data warehouse environment
- Build and enhance REST APIs interfacing with external data services
- Integrate and process data streams with Kafka, using Postgres for data storage
- Develop and maintain internal tools, improving development and operations workflows
- Contribute to continuous improvement of the data warehouse ecosystem and internal processes
- Establish best practices and development standards for Python-based services

Qualifications:
- Good programming skills in Python and practical experience in software development
- Knowledge of stream-based Big Data platforms using Apache Kafka
- Working experience with relational databases, data warehouses, and advanced SQL
- Experience with REST APIs is a plus
Ready to Apply?
Take the next step in your data engineering career. Submit your application to Confidential today.
Submit Application