Job Description

Responsibilities:

  • Collaborate in a cross-functional team to design, optimize, and maintain data pipelines
  • Develop and maintain Python applications for a data warehouse environment
  • Build and enhance REST APIs interfacing with external data services
  • Integrate and process data streams with Kafka, using Postgres for data storage
  • Develop and maintain internal tools, improving development and operations workflows
  • Contribute to continuous improvement of the data warehouse ecosystem and internal processes
  • Establish best practices and development standards for Python-based services
Qualifications:

  • Good programming skills in Python and practical experience in software development
  • Knowledge of stream-based Big Data platforms such as Apache Kafka
  • Working experience with relational databases, data warehouses and advanced SQL
  • Experience with REST APIs is a plus
Ready to Apply?

Take the next step in your AI career. Submit your application to Confidential today.
