Job Description
Key Responsibilities
- Design, build, and maintain end-to-end data pipelines using Microsoft Fabric, including Lakehouse, Data Factory, Dataflows, and notebooks.
- Develop and optimize SQL-based transformations, data models, and curated datasets for enterprise reporting and analytics.
- Build and maintain Python-based data engineering logic for ingestion, transformation, validation, and automation.
- Implement and operate data quality controls, including validation rules, reconciliation checks, and exception handling.
- Monitor data pipelines, investigate failures or data quality issues, and implement fixes with minimal escalation.
- Integrate data from multiple enterprise systems, including CRM, ticketing systems, telephony platforms, and operational databases.
- Maintain technical documentation, data lineage, and operational runbooks for owned pipelines and datasets.
- Work closely with the M...
Ready to Apply?
Take the next step in your AI career. Submit your application to Maarut today.