
Consulting Website + Client Portal Development (Fabric)
- Proposals: 30
- Remote
- #4446890
- OPPORTUNITY
- Awarded

Description
We provide clients with data cleansing, transformation, hosting, and Power BI reporting as an all-in-one solution.
We’re seeking a skilled developer / agency to build:
1. A modern marketing website
2. A secure client portal integrated with Microsoft Fabric for data upload, cleansing, and reporting
All hosting and data processing must remain within UAE (Azure Region) and comply with GDPR + UAE data-residency standards.
---
Clarification Board

Hi Pardip,
Would you like your client portal to enable automated data cleansing and Power BI dashboard access directly within Microsoft Fabric, or would you prefer separate user modules for upload and reporting?
Pardip H. (11 Nov 2025):
Thanks for your question; our vision is:
1️⃣ Client Login
Clients visit portal.datanovalabs.com (hosted in UAE).
They log in securely via Microsoft Entra ID (Azure AD B2C), the same identity layer used for Fabric access.
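As a rough illustration of this step, a server-side login handshake with MSAL (Python) could look like the sketch below; the tenant ID, client ID, secret, and scopes are placeholders, and a B2C tenant would substitute its own b2clogin.com authority for the one shown.

```python
# Hedged sketch: portal sign-in via the OAuth auth-code flow with MSAL.
# All IDs and secrets below are hypothetical placeholders.
import msal

TENANT_ID = "<entra-tenant-id>"
CLIENT_ID = "<portal-app-client-id>"
CLIENT_SECRET = "<portal-app-secret>"   # keep in Azure Key Vault in practice

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# Step 1: build the sign-in URL and redirect the client's browser to it.
flow = app.initiate_auth_code_flow(
    scopes=["User.Read"],
    redirect_uri="https://portal.datanovalabs.com/auth/callback",
)
print("Redirect user to:", flow["auth_uri"])

# Step 2 (in the callback handler): exchange the returned code for tokens.
# `auth_response` is the dict of query parameters Entra ID sends back.
# result = app.acquire_token_by_auth_code_flow(flow, auth_response)
# if "id_token_claims" in result:
#     print("Signed in as:", result["id_token_claims"].get("preferred_username"))
```
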
2️⃣ File Upload
Inside their workspace, clients click “Upload Data”.
They can attach files such as .csv, .xlsx, .json, .zip, and SQL bulk exports.
Each client has their own secure storage container in the Fabric environment — completely isolated from other clients.
Clients can upload multiple files and create new folders as and when required. However, there will be a folder called "Reporting Raw Data": every file within that folder will be cleansed and then reported on using Power BI. Any other folders or files clients create are used only for hosting, not reporting; we will not do anything with them.
OneLake acts as the central data lake for all uploads.
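To make the per-client isolation concrete: OneLake exposes an ADLS Gen2-compatible endpoint, so a backend upload into a client's folder could look like the following sketch using the azure-storage-file-datalake package. The workspace and lakehouse names are illustrative assumptions, not agreed values.

```python
# Hedged sketch: write an uploaded file into a client-isolated OneLake folder.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"
WORKSPACE = "DataNova"                               # hypothetical Fabric workspace
LAKEHOUSE_FILES = "ClientLakehouse.Lakehouse/Files"  # hypothetical lakehouse

service = DataLakeServiceClient(ONELAKE_URL, credential=DefaultAzureCredential())
fs = service.get_file_system_client(WORKSPACE)  # the workspace acts as the "file system"

def upload_client_file(client_name: str, local_path: str, file_name: str) -> None:
    """Upload into the client's "Reporting Raw Data" folder."""
    remote = f"{LAKEHOUSE_FILES}/Clients/{client_name}/Reporting Raw Data/{file_name}"
    with open(local_path, "rb") as data:
        fs.get_file_client(remote).upload_data(data, overwrite=True)

upload_client_file("ClientX", "/tmp/sales.csv", "sales.csv")
```

True isolation would be enforced with per-client permissions (e.g. separate workspaces or OneLake access roles), not folder naming alone.
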
3️⃣ Automated ETL Trigger
Once uploaded:
1. A Fabric Data Pipeline detects the new file.
2. It launches a Fabric Dataflow Gen2 or Fabric Notebook (Python), sketched after this step, to:
   - Clean the data (remove blanks, duplicates, errors).
   - Standardise field names and formats.
   - Merge or join data where needed.
3. The cleansed dataset is saved back to OneLake as:
/OneLake/DataNova/Clients/{ClientName}/Processed/{DatasetName}.parquet
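A minimal sketch of the cleansing step as a Fabric Notebook (PySpark) cell, assuming the pipeline passes the client and dataset names as parameters and that the folder layout matches the structure above:

```python
# Hedged sketch of the cleansing notebook. `spark` is predefined in Fabric
# notebooks; paths are relative to the attached lakehouse.
from pyspark.sql import functions as F

client_name = "ClientX"   # would arrive as a pipeline parameter
dataset_name = "sales"    # likewise a parameter

raw_path = f"Files/Clients/{client_name}/Reporting Raw Data/"
out_path = f"Files/Clients/{client_name}/Processed/{dataset_name}.parquet"

df = spark.read.option("header", True).csv(raw_path)

# Clean: drop fully blank rows and exact duplicates.
df = df.dropna(how="all").dropDuplicates()

# Standardise field names to lower_snake_case.
for col in df.columns:
    df = df.withColumnRenamed(col, col.strip().lower().replace(" ", "_"))

# Trim stray whitespace in all string columns.
for col, dtype in df.dtypes:
    if dtype == "string":
        df = df.withColumn(col, F.trim(F.col(col)))

df.write.mode("overwrite").parquet(out_path)
```
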
4️⃣ Reporting Layer
The processed data feeds into Power BI semantic models within Fabric; these will be developed by an external party.
Each client has a workspace or dataset tagged with their tenant ID.
Reports auto-refresh whenever a new file is uploaded and processed.
Clients can view dashboards directly inside the portal’s embedded Power BI viewer (via Power BI Embedded).
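Behind that embedded viewer, the portal backend would typically mint a short-lived embed token via the Power BI REST API. A hedged sketch, assuming a service principal with the required Power BI permissions and placeholder workspace/report IDs:

```python
# Hedged sketch: generate a Power BI embed token for the portal frontend.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-id>"            # placeholders throughout
CLIENT_ID = "<service-principal-id>"
CLIENT_SECRET = "<secret>"
GROUP_ID = "<client-workspace-id>"   # the client's Fabric/Power BI workspace
REPORT_ID = "<report-id>"

cred = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = cred.get_token("https://analysis.windows.net/powerbi/api/.default").token

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {token}"},
    json={"accessLevel": "View"},
    timeout=30,
)
resp.raise_for_status()
embed_token = resp.json()["token"]   # handed to the Power BI client in the portal
```
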
5️⃣ Notifications & Audit Trail
Power Automate (or an alternative) sends a confirmation email or Teams message to:
- The client (“Your data upload has been successfully processed”).
- Your internal team (“New upload from Client X at [timestamp]”).
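If Power Automate is not used, the Teams ping could be a plain incoming-webhook call; a sketch, with a placeholder webhook URL:

```python
# Hedged sketch: Teams notification via an incoming webhook (one possible
# alternative to Power Automate). The URL is a placeholder.
import requests

TEAMS_WEBHOOK = "https://example.webhook.office.com/webhookb2/..."

def notify_upload(client: str, timestamp: str) -> None:
    requests.post(
        TEAMS_WEBHOOK,
        json={"text": f"New upload from {client} at {timestamp} processed successfully."},
        timeout=10,
    ).raise_for_status()

notify_upload("Client X", "2025-11-11T09:30:00Z")
```
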
Audit log captures:
- Who uploaded
- When it was processed
- Which scripts were executed
- Output file names
All this is stored in Fabric’s activity log and optionally mirrored in Azure Log Analytics.
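For the optional Log Analytics mirror, one route is the Azure Monitor Ingestion client writing to a custom table behind a Data Collection Rule; the endpoint, rule ID, and stream name below are placeholders for resources you would provision:

```python
# Hedged sketch: mirror one audit record into Azure Log Analytics.
from datetime import datetime, timezone
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

ENDPOINT = "https://<dce-name>.<region>.ingest.monitor.azure.com"  # placeholder DCE
RULE_ID = "dcr-<immutable-id>"                                     # placeholder DCR
STREAM = "Custom-PortalAudit_CL"                                   # placeholder stream

client = LogsIngestionClient(endpoint=ENDPOINT, credential=DefaultAzureCredential())

audit_record = {
    "TimeGenerated": datetime.now(timezone.utc).isoformat(),
    "UploadedBy": "client.user@example.com",     # who uploaded
    "ProcessedAt": datetime.now(timezone.utc).isoformat(),
    "Script": "cleanse_notebook_v1",             # which script was executed
    "OutputFile": "Processed/sales.parquet",     # output file name
}
client.upload(rule_id=RULE_ID, stream_name=STREAM, logs=[audit_record])
```
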
6️⃣ Optional Enhancements
✅ Use Fabric Copilot for automated anomaly detection or predictive summaries inside reports.

Envitics Solutions (17 Nov 2025):
Hi Pardip,
Thanks for sharing such a clear end-to-end workflow. Your Fabric setup with Entra ID, OneLake storage isolation and automated ETL is well-defined and definitely achievable. To move ahead, could you share whether you also want us to handle the full portal UI/UX design, or will you provide the frontend screens on your side?