Job Description

  • Design, develop, and maintain robust and scalable data pipelines using Python and SQL.
  • Analyze and understand source systems and data flows to support accurate data ingestion.
  • Ensure data quality, consistency, and governance across various systems and platforms.
  • Optimize existing pipelines and queries for improved performance and scalability.

Role Requirements and Qualifications

  • Strong proficiency in Python and SQL for data processing, scripting, and analytics.
  • Proven experience in building and maintaining production-level data pipelines.
  • Familiarity with Azure cloud services such as Azure Data Factory, Blob Storage, and SQL Database.
  • Experience working with Databricks for big data processing and collaborative analytics.
  • Exposure to or willingness to learn Exploratory Data Analysis (EDA) techniques.


Skills Required
Python, SQL, Azure, Databricks, Data Governance

Ready to Apply?

Take the next step in your AI career. Submit your application to Acenet today.
