Lead the creation of data architectures using Microsoft Fabric
components (e.g., lakehouses, pipelines, and semantic models). Define
scalable solutions for data ingestion, storage, transformation, and
analytics, incorporating principles like data mesh or unified lakehouse
on OneLake.
Build and deploy ETL/ELT processes with Data Factory, Spark jobs
(using PySpark/SQL), and notebooks.
Analyze business, data, and system requirements to perform data design
and data modeling activities.