Job Description
Duties
- ETL/ELT pipeline design.
- Data lake / lakehouse design.
- Querying data using PySpark and T-SQL.
- Performance optimization.
- Preparation of product specifications and supporting documentation.
- Data security (GDPR compliance).
Technical requirements
- 3+ years of experience in a similar role.
- Strong proficiency in SQL and familiarity with its variations across popular RDBMS platforms.
- Python (for Spark data manipulation).
- Hands-on experience with Azure Synapse data engineering is essential.
- Hands-on experience with Microsoft Fabric data engineering is preferred.
- Azure Data Factory pipeline design (ETL/ELT processes).
- Experience with on-premises SQL Server (2016 and above), including SSIS.
- Some knowledge of C# (for Azure Function Apps).
- Microsoft development stack (Visual Studio, Azure DevOps).
Job Type: Contract
Contract length: 12 months
Pay: RM1...
Ready to Apply?
Take the next step in your data engineering career. Submit your application to IT Consulting today.