Job Description

  • Construct and maintain data pipelines (ETL/ELT) from multiple data sources into centralized repositories (data warehouses, lakehouses, databases)
  • Implement physical data models to support efficient storage, analysis, and data integrity
  • Transform raw data into usable formats for reporting and analytics, including data cleansing and deduplication
  • Integrate data from disparate systems while ensuring data quality and consistency
Qualifications

  • Bachelor’s degree in Computer Science, IT, Mathematics, or equivalent
  • At least 6 years of progressive Data Engineering experience
  • Strong background in ETL/ELT, data warehousing, and data modeling
  • Minimum 3 years’ experience on Microsoft Azure, including exposure to Microsoft Fabric and Data Factory
  • Proficient in PySpark or Python, T-SQL, and database technologies
  • Proven e...

Ready to Apply?

Take the next step in your AI career. Submit your application to SYSGEN RPO today.

Submit Application