Job Description
Design, build, and maintain reliable, scalable ETL/ELT data pipelines that automate data ingestion, transformation, and validation into the data warehouse.
Develop data integration and data flow systems that manage diverse sources (databases, APIs, files) and ensure data consistency and quality.
Propose and implement the most appropriate architecture for batch and real-time data ingestion and processing.
Implement data quality checks, logging, and monitoring to ensure reliability and transparency of data flows.
Requirements / Qualifications:
At least a Degree in Computer Science, Information Technology, or an equivalent field.
Strong SQL knowledge, with a minimum of one year of working experience with relational databases, including query authoring and query optimization.
Proficient in SQL scripting.
Added advantage if candidate has experience working with big data.
Ready to Apply?
Take the next step in your data career. Submit your application to MR.DIY today.
Submit Application