Job Description

Responsibilities

  • Design and develop scalable, production-ready data pipelines on Microsoft Fabric, working closely with business and analytics teams to translate requirements into reliable data solutions.
  • Build and maintain end-to-end data ingestion and transformation workflows using OneLake, Lakehouse/Warehouse, Fabric Data Pipelines, Dataflows Gen2, and Spark Notebooks.
  • Implement and manage medallion architecture (Bronze/Silver/Gold), including Delta tables, schema evolution, partitioning strategies, and performance tuning.
  • Develop batch and near-real-time ingestion pipelines from databases, APIs, and file-based sources, implementing CDC, SCD Type 1/2, audit columns, and upsert logic using Fabric Pipelines and PySpark.
  • Write high-quality SQL and PySpark transformations, ensuring reusability, maintainability, and performance across notebooks and pipelines.
  • Implement data...
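The SCD Type 2 upsert logic named above can be sketched in plain Python. This is only an illustrative sketch: in a Fabric pipeline this would typically be a Delta Lake `MERGE` in a PySpark notebook, and the function name, key, and column names here are assumptions, not part of the role's codebase.

```python
from datetime import date

# Illustrative sketch of SCD Type 2 upsert logic with audit columns
# (start_date / end_date / is_current). Assumes one run per day and
# one current version per business key.
def scd2_upsert(dim, updates, key, tracked, today=date(2024, 1, 1)):
    """Expire changed rows and append new current versions."""
    out = []
    upd = {u[key]: u for u in updates}
    for row in dim:
        u = upd.get(row[key])
        if row["is_current"] and u and any(row[c] != u[c] for c in tracked):
            # Tracked attribute changed: close out the current version.
            row = {**row, "is_current": False, "end_date": today}
        out.append(row)
    existing_keys = {r[key] for r in dim}
    for k, u in upd.items():
        changed = any(
            r[key] == k and not r["is_current"] and r["end_date"] == today
            for r in out
        )
        if k not in existing_keys or changed:
            # Brand-new key, or a new version of a key expired above.
            out.append({**u, "is_current": True,
                        "start_date": today, "end_date": None})
    return out
```

A Type 1 variant would simply overwrite the matched row in place instead of expiring it, which is why the two are usually expressed as different matched-clause actions of a single Delta `MERGE`.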

Ready to Apply?

Take the next step in your AI career. Submit your application to NETSOL Technologies Inc. today.
