Job Description

Noesis is looking for candidates with the following profile:

Main Tasks and Responsibilities:

  • Develop and maintain data processing workflows using SQL, Python, and PySpark;
  • Design, implement, and optimise data models within Data Warehouse and Data Lake architectures;
  • Work within distributed data ecosystems, supporting platforms such as Hadoop, Hive, Spark, or Databricks;
  • Contribute to the automation of data pipelines and engineering processes;
  • Participate in code reviews and ensure alignment with established development standards.

Requirements:

  • Degree in Computer Science or similar;
  • Minimum of 1 year of experience as a Data Engineer;
  • Foundational knowledge of SQL and Python or similar data processing technologies;
  • Basic understanding of data modelling concepts and data storage architectures, including Data Warehouse and Data Lake;
  • Familiarity with at least...

Ready to Apply?

Take the next step in your data engineering career. Submit your application to Noesis today.
