About the Role
Responsibilities
- Design and develop scalable, production-ready data pipelines on Microsoft Fabric, working closely with business and analytics teams to translate requirements into reliable data solutions.
- Build and maintain end-to-end data ingestion and transformation workflows using OneLake, Lakehouse/Warehouse, Fabric Data Pipelines, Dataflows Gen2, and Spark Notebooks.
- Implement and manage medallion architecture (Bronze/Silver/Gold), including Delta tables, schema evolution, partitioning strategies, and performance tuning.
- Develop batch and near-real-time ingestion pipelines from databases, APIs, and file-based sources, implementing CDC, SCD Type 1/2, audit columns, and upsert logic using Fabric Pipelines and PySpark.
- Write high-quality SQL and PySpark transformations, ensuring reusability, maintainability, and performance across notebooks and pipelines.
- Implement data...
Ready to Apply?
Submit your application today and take the next step in your career journey with NETSOL Technologies Inc.