A comprehensive reference table (lookup table) and data mapping service for Microsoft Fabric Extensibility Toolkit that enables data classification, harmonization, and transformation using reference tables and attribute-based configuration.
This project implements a flexible reference table service in C# that integrates with Microsoft Fabric via the Extensibility Toolkit. It provides reference tables for data classification and harmonization, plus attribute-based data mapping. Reference tables act as lookup tables (KeyMapping outports) that structure data consistently and make it comparable across different sources, which makes the service well suited to master data management, data classification, ETL processes, and legacy system modernization.
Reference Tables are lookup tables that help you classify, group, and standardize data values across different systems. They provide a single source of truth for data classification.
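The idea can be sketched in a few lines of plain C# (a simplified illustration of the concept, not the service's actual API): a dictionary plays the role of the reference table, mapping raw codes to one harmonized classification.

```csharp
using System;
using System.Collections.Generic;

public static class ReferenceTableSketch
{
    // Simplified stand-in for a reference table: code -> harmonized classification
    static readonly Dictionary<string, string> Table = new()
    {
        ["VTP001"] = "Insurance/Health",
        ["VTP002"] = "Insurance/Life",
        ["VTP003"] = "Banking/Savings"
    };

    // Look up the harmonized classification for a raw code
    public static string Classify(string code) =>
        Table.TryGetValue(code, out var value) ? value : "Unclassified";

    public static void Main()
    {
        // The same code from any source system resolves to one classification
        Console.WriteLine(Classify("VTP001")); // Insurance/Health
        Console.WriteLine(Classify("XXX999")); // Unclassified
    }
}
```

Whatever label each source system attaches to "VTP001", the lookup always yields the single agreed classification.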
Visual Example:
```
┌───────────────────────────────────────────────────────────────────┐
│ Before: Inconsistent Product Data                                 │
├───────────────────────────────────────────────────────────────────┤
│ System A: VTP001 = "Health Insurance"                             │
│ System B: VTP001 = "Krankenversicherung"                          │
│ System C: VTP001 = "Medical Coverage"                             │
│ → Difficult to analyze across systems                             │
└───────────────────────────────────────────────────────────────────┘
                                │
                                ▼
                ┌─────────────────────────────┐
                │  Reference Table (Lookup)   │
                │  ┌────────┬──────────────┐  │
                │  │ VTP001 │ Insurance    │  │
                │  │        │ /Health      │  │
                │  ├────────┼──────────────┤  │
                │  │ VTP002 │ Insurance    │  │
                │  │        │ /Life        │  │
                │  ├────────┼──────────────┤  │
                │  │ VTP003 │ Banking      │  │
                │  │        │ /Savings     │  │
                │  └────────┴──────────────┘  │
                └─────────────────────────────┘
                                │
                                ▼
┌───────────────────────────────────────────────────────────────────┐
│ After: Harmonized Classification                                  │
├───────────────────────────────────────────────────────────────────┤
│ All Systems: VTP001 → Insurance / Health                          │
│ ✓ Consistent classification                                       │
│ ✓ Comparable analytics                                            │
│ ✓ Automated harmonization                                         │
└───────────────────────────────────────────────────────────────────┘
```
See the Visual Guide with UI Mockups - understand the solution with detailed diagrams and mockups!
See the Quick Reference Guide - a fast overview for first-time users!
This project is created for the Microsoft Fabric Extensibility Toolkit Contest to demonstrate the capabilities of building custom workloads for Microsoft Fabric.
This repository implements a sequential multi-agent workflow for quality assurance and deployment readiness. The workflow models a controlled, stage-gated development pipeline with 7 specialized stages:
- Azure Architect - Architecture validation
- .NET Senior Developer - Build and code analysis
- DevOps Engineer - CI/CD and artifact creation
- Blazor Fluent UI Specialist - Frontend validation
- Test Specialist - Integration/E2E tests
- Unit Test Specialist - Unit tests with coverage
- Orchestrator - Final coordination and merge readiness
Each stage executes sequentially with optional manual approval gates. For complete documentation, see Sequential Multi-Agent Workflow Documentation.
- MappingWorkload: Full Microsoft Fabric workload implementation with the IWorkload interface
- Orchestrated Operations: Execute reference table and mapping operations through a unified workload API
- Configuration Validation: Pre-execution validation of workload configurations
- Health Monitoring: Built-in health checks and status reporting
- Cancellation Support: Graceful handling of operation cancellations
- Item Management: Create and configure mapping items within Fabric workspaces (NEW)
- OneLake Integration: Store and retrieve mapping data from OneLake (NEW)
- Reference Tables (Lookup Tables): Create and manage reference tables for data classification and harmonization
- KeyMapping Outports: Provide reference tables as KeyMapping outports for Fabric data products
- Manual Master Data: Centrally maintained system-independent master data
- Automated Sync: Automatically sync reference tables from source data (outports)
- Data Classification: Group and classify cost types, diagnoses, product categories, etc.
- Label Harmonization: Standardize labels and codes from different data sources
- OneLake Integration: Store and consume reference tables via OneLake
- Lakehouse Storage (NEW): Persist reference table configurations and data in lakehouse as JSON
- Item Creation: Create mapping items directly within Fabric workspaces
- Lakehouse Integration: Reference lakehouse tables as data sources
- Column Mapping Configuration: Define one-to-many column mappings with transformations
- Item Definition Storage: Store item configurations following Fabric Extensibility Toolkit patterns
- OneLake Data Storage: Persist mapping/lookup tables to OneLake for consumption
- Traceability: Full data lineage from lakehouse to mapping items to OneLake
- Attribute-Based Mapping: Use custom attributes to define mappings between source and target properties
- Type Conversion: Automatic conversion between compatible types (string to int, bool, decimal, etc.)
- Flexible Configuration: Configure mapping behavior at both class and property levels
- Batch Operations: Map collections of objects efficiently
- Error Handling: Detailed error reporting and validation
- REST API: Full-featured ASP.NET Core Web API for reference table and mapping operations
- Frontend UI: React-based user interface with Fluent UI components
- Basic Mode: Table-based CRUD editor for non-technical users
- Expert Mode: JSON editor with syntax highlighting for power users
- Configuration Panel: Reference table selection with search and filters
- Extensible: Support for custom converters and mapping profiles
- Microsoft Fabric Integration: Native integration with Fabric workspaces via Extensibility Toolkit
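The automatic type conversion listed above can be illustrated with plain .NET conversions (a sketch of the expected behavior, not the service's converter implementation):

```csharp
using System;
using System.Globalization;

public static class ConversionSketch
{
    // Convert a source string to a compatible target type, as an attribute-based
    // mapper typically does when source and target property types differ.
    public static object ConvertTo(string source, Type targetType) =>
        Convert.ChangeType(source, targetType, CultureInfo.InvariantCulture);

    public static void Main()
    {
        Console.WriteLine(ConvertTo("42", typeof(int)));        // 42
        Console.WriteLine(ConvertTo("true", typeof(bool)));     // True
        Console.WriteLine(ConvertTo("19.99", typeof(decimal))); // 19.99
    }
}
```

Using the invariant culture keeps numeric parsing predictable regardless of the host machine's locale.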
```
FabricMappingService/
├── .ai/                                   # AI assistant context and commands
│   ├── context/                           # Context documentation for AI tools
│   │   ├── fabric-workload.md             # Extensibility Toolkit knowledge
│   │   ├── fabric.md                      # Microsoft Fabric platform context
│   │   └── mapping-service.md             # Custom mapping service context
│   └── commands/                          # Command templates
│       ├── workload/                      # Workload operations
│       └── item/                          # Item operations
├── src/
│   ├── FabricMappingService.Core/         # Core mapping library
│   │   ├── Attributes/                    # Custom mapping attributes
│   │   ├── Converters/                    # Type converters
│   │   ├── Exceptions/                    # Custom exceptions
│   │   ├── Models/                        # Configuration models
│   │   ├── Services/                      # Mapping service implementation
│   │   ├── Workload/                      # Fabric workload implementation
│   │   │   ├── IWorkload.cs               # Workload interface
│   │   │   ├── MappingWorkload.cs         # Main workload class
│   │   │   ├── WorkloadConfiguration.cs   # Configuration models
│   │   │   └── WorkloadExecutionResult.cs # Result models
│   │   └── Examples/                      # Example models
│   ├── FabricMappingService.Api/          # REST API
│   │   ├── Controllers/                   # API controllers
│   │   │   ├── WorkloadController.cs      # Workload endpoints
│   │   │   ├── ItemController.cs          # Mapping item endpoints
│   │   │   ├── ReferenceTableController.cs # Reference table endpoints
│   │   │   └── MappingController.cs       # Mapping endpoints
│   │   ├── Dtos/                          # Data transfer objects
│   │   └── Program.cs                     # API configuration
│   └── FabricMappingService.Frontend/     # Frontend UI
│       ├── src/                           # React components
│       │   ├── components/                # UI components
│       │   ├── services/                  # API client
│       │   └── types/                     # TypeScript types
│       ├── public/                        # Static assets
│       ├── package.json                   # NPM dependencies
│       └── webpack.config.js              # Build configuration
├── tests/
│   └── FabricMappingService.Tests/        # Unit tests
├── fabric-manifest/                       # Fabric workload manifest
│   ├── workload-manifest.json             # Main workload manifest
│   ├── Product.json                       # Frontend metadata and UI configuration
│   ├── items/                             # Item type definitions
│   │   ├── ReferenceTableItem/            # Reference Table item type
│   │   ├── MappingConfigurationItem/      # Mapping Configuration item type
│   │   └── MappingJobItem/                # Mapping Job item type
│   ├── assets/                            # Visual assets
│   │   └── images/                        # Icons and images
│   └── translations/                      # Localization files
├── scripts/                               # Automation scripts
└── docs/                                  # Documentation
```
- .NET 10.0 SDK or later (for local development); for Azure deployment, retarget to .NET 8.0 or 9.0 in the .csproj files if needed
- PowerShell 7 (for automation scripts)
- Visual Studio 2022, VS Code, or GitHub Codespaces
- Microsoft Fabric workspace (for integration)
- Node.js 18+ and npm (for frontend development)
Use the provided setup scripts for automated environment configuration:

```powershell
# Windows PowerShell
.\scripts\Setup\Setup.ps1

# macOS/Linux
pwsh ./scripts/Setup/Setup.ps1
```

```powershell
# Install frontend dependencies
.\scripts\Setup\SetupFrontend.ps1

# Force reinstall packages
.\scripts\Setup\SetupFrontend.ps1 -Force

# Run security audit
.\scripts\Setup\SetupFrontend.ps1 -Audit
```

For a complete setup of both backend and frontend:

```powershell
# Setup backend
.\scripts\Setup\Setup.ps1

# Setup frontend
.\scripts\Setup\SetupFrontend.ps1
```

To work without the scripts, clone and build the solution manually:

```bash
# Clone the repository
git clone https://github.com/philippfrenzel/msfabric-mapping-etk.git
cd msfabric-mapping-etk

# Build the solution
dotnet build

# Run tests
dotnet test

# Run the API
cd src/FabricMappingService.Api
dotnet run
```

For the frontend:

```bash
# Navigate to frontend directory
cd src/FabricMappingService.Frontend

# Install dependencies
npm install

# Start development server
npm start
```

```powershell
# Start backend development server with hot reload
.\scripts\Run\StartDevServer.ps1

# Start on custom port
.\scripts\Run\StartDevServer.ps1 -Port 5500

# Build backend
.\scripts\Build\Build.ps1

# Build in Debug mode
.\scripts\Build\Build.ps1 -Configuration Debug

# Publish for deployment
.\scripts\Build\Publish.ps1
```

```powershell
# Start frontend development server
.\scripts\Run\StartFrontendDevServer.ps1

# Start on custom port
.\scripts\Run\StartFrontendDevServer.ps1 -Port 3001

# Open browser automatically
.\scripts\Run\StartFrontendDevServer.ps1 -Open

# Build frontend for production
.\scripts\Build\BuildFrontend.ps1

# Build for development
.\scripts\Build\BuildFrontend.ps1 -Mode development

# Clean and build
.\scripts\Build\BuildFrontend.ps1 -Clean
```

```powershell
# Start both backend and frontend together
.\scripts\Run\StartFullStack.ps1

# Customize ports
.\scripts\Run\StartFullStack.ps1 -ApiPort 5500 -FrontendPort 3001

# Build both backend and frontend
.\scripts\Build\BuildAll.ps1

# Build with custom configuration
.\scripts\Build\BuildAll.ps1 -Configuration Debug -FrontendMode development

# Clean build
.\scripts\Build\BuildAll.ps1 -Clean

# Skip tests
.\scripts\Build\BuildAll.ps1 -SkipTests

# Build only backend
.\scripts\Build\BuildAll.ps1 -SkipFrontend

# Build only frontend
.\scripts\Build\BuildAll.ps1 -SkipBackend
```

The API will be available at https://localhost:5001 (or the port specified in launchSettings.json).
The frontend will be available at http://localhost:3000 (or custom port if specified).
This project includes a complete dev container configuration. Click "Code" → "Open with Codespaces" to get started instantly with a pre-configured development environment.
This project includes a complete MappingWorkload implementation that follows the Microsoft Fabric Extensibility Toolkit patterns. The workload provides a unified interface for executing all mapping and reference table operations.
- IWorkload Interface: Defines the contract for Fabric workload implementations
- MappingWorkload Class: Orchestrates all mapping operations through a single ExecuteAsync method
- WorkloadController: REST API endpoints for workload operations (/api/workload/*)
The workload supports the following operation types:
- CreateReferenceTable: Create new reference tables for data classification
- SyncReferenceTable: Synchronize reference tables with source data
- ReadReferenceTable: Read reference table data (KeyMapping outports)
- UpdateReferenceTableRow: Update individual rows in reference tables
- DeleteReferenceTable: Delete reference tables
- ExecuteMapping: Execute data mapping operations
- ValidateMapping: Validate mapping configurations
- HealthCheck: Check workload health status
```bash
# Get workload information
curl https://localhost:5001/api/workload/info

# Check workload health
curl https://localhost:5001/api/workload/health

# Execute a workload operation
curl -X POST https://localhost:5001/api/workload/execute \
  -H "Content-Type: application/json" \
  -d '{
    "operationType": "CreateReferenceTable",
    "timeoutSeconds": 60,
    "parameters": {
      "tableName": "produkttyp",
      "columns": "[{\"name\":\"ProductType\",\"dataType\":\"string\",\"order\":1}]",
      "isVisible": true
    }
  }'
```

For detailed instructions on building, deploying, and registering the workload in Microsoft Fabric, see:

- Workload Guide (German): Comprehensive guide covering build, deployment, and Fabric registration
- Fabric Integration Guide: Technical integration details
- RegisterWorkload.ps1: PowerShell script for automated workload registration
Reference tables (Referenztabellen) provide a powerful way to classify, group, and harmonize data values. They act as lookup tables (KeyMapping outports in Fabric) that help structure data consistently and make it comparable across different sources.
A reference table is essentially a list that defines how certain values are grouped, renamed, or standardized. It works like a lookup table that helps with data analysis by providing clear structure and comparability.
Primary Use Cases:
- Manual Master Data: Centrally maintained system-independent master data
- Cost Type Mapping: Classifying and mapping cost categories
- Diagnosis Classification: Standardizing medical or technical diagnoses
- Label Harmonization: Unifying labels from different sources
- Product Grouping: Creating product type hierarchies
- Code Mapping: Mapping external codes to internal classifications
Create a reference table manually for custom classifications that don't directly map to source system data.
Create an empty reference table:

```csharp
var mappingIO = new MappingIO(storage);

var columns = new List<ReferenceTableColumn>
{
    new() { Name = "Category", DataType = "string", Order = 1, Description = "Product category" },
    new() { Name = "Group", DataType = "string", Order = 2, Description = "Product group" }
};

mappingIO.CreateReferenceTable(
    tableName: "vertragsprodukte",
    columns: columns,
    isVisible: true,
    notifyOnNewMapping: false);
```

Add rows to the table:

```csharp
mappingIO.AddOrUpdateRow(
    tableName: "vertragsprodukte",
    key: "VTP001",
    attributes: new Dictionary<string, object?>
    {
        ["Category"] = "Insurance",
        ["Group"] = "Health"
    });
```

Read the mapping:

```csharp
var mappingData = mappingIO.ReadMapping("vertragsprodukte");
// Returns: Dictionary<string, Dictionary<string, object?>>
// Key: "VTP001" -> { "key": "VTP001", "Category": "Insurance", "Group": "Health" }
```

Automatically create reference tables from source data (outports). This approach is more structured and is preferable when codes from data sources need to be classified.
Define your source data model:

```csharp
public class ProductData
{
    public string Produkt { get; set; }  // This will be the key
    public string Name { get; set; }
    public decimal Price { get; set; }
}
```

Sync the reference table with source data:

```csharp
var mappingIO = new MappingIO(storage);

var products = new List<ProductData>
{
    new() { Produkt = "VTP001", Name = "Product A", Price = 100m },
    new() { Produkt = "VTP002", Name = "Product B", Price = 200m },
    new() { Produkt = "VTP003", Name = "Product C", Price = 300m }
};

// Synchronize the reference table
// Creates the table if it doesn't exist, adds new keys only
int newKeysAdded = mappingIO.SyncMapping(
    data: products,
    keyAttributeName: "Produkt",  // Property containing the key values
    mappingTableName: "produkttyp");

Console.WriteLine($"Added {newKeysAdded} new keys");
```

Important notes:
- The key column is automatically named "key"
- SyncMapping only adds NEW keys; existing keys are NOT updated
- Removed values are NOT automatically deleted (this must be done manually)
- The key should not be changed, as it is created automatically
- Notifications for new mappings are sent only once per key
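These sync rules can be made concrete with a plain dictionary (an illustration of the documented semantics, not the MappingIO implementation):

```csharp
using System;
using System.Collections.Generic;

public static class SyncSketch
{
    // Emulates the documented SyncMapping behavior: only NEW keys are added;
    // existing rows are never updated and removed keys are never deleted.
    public static int Sync(Dictionary<string, string> table,
                           IEnumerable<(string Key, string Value)> source)
    {
        int added = 0;
        foreach (var (key, value) in source)
        {
            if (!table.ContainsKey(key))
            {
                table[key] = value;
                added++;
            }
        }
        return added;
    }

    public static void Main()
    {
        var table = new Dictionary<string, string>();
        var source = new[] { ("VTP001", "Product A"), ("VTP002", "Product B") };

        Console.WriteLine(SyncSketch.Sync(table, source)); // 2 new keys added
        Console.WriteLine(SyncSketch.Sync(table, source)); // 0 - existing keys untouched
    }
}
```

Running the sync twice with the same data adds nothing the second time, which is why it is safe to schedule repeatedly against an outport.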
Read and use the mapping:

```csharp
// Read the reference table
var produktMapping = mappingIO.ReadMapping("produkttyp");

// Use the mapping data
foreach (var entry in produktMapping)
{
    Console.WriteLine($"Key: {entry.Key}, Data: {string.Join(", ", entry.Value)}");
}
```

Create a reference table:
```bash
curl -X POST https://localhost:5001/api/reference-tables \
  -H "Content-Type: application/json" \
  -d '{
    "tableName": "vertragsprodukte",
    "columns": [
      {
        "name": "Category",
        "dataType": "string",
        "description": "Product category",
        "order": 1
      },
      {
        "name": "Group",
        "dataType": "string",
        "description": "Product group",
        "order": 2
      }
    ],
    "isVisible": true,
    "notifyOnNewMapping": false
  }'
```

Sync a reference table with data:
```bash
curl -X POST https://localhost:5001/api/reference-tables/sync \
  -H "Content-Type: application/json" \
  -d '{
    "mappingTableName": "produkttyp",
    "keyAttributeName": "Produkt",
    "data": [
      { "Produkt": "VTP001", "Name": "Product A", "Price": 100 },
      { "Produkt": "VTP002", "Name": "Product B", "Price": 200 }
    ]
  }'
```

Response:

```json
{
  "success": true,
  "tableName": "produkttyp",
  "newKeysAdded": 2,
  "totalKeys": 2
}
```

Read a reference table:
```bash
curl https://localhost:5001/api/reference-tables/produkttyp
```

List all reference tables:

```bash
curl https://localhost:5001/api/reference-tables
```

Add or update a row:
```bash
curl -X PUT https://localhost:5001/api/reference-tables/produkttyp/rows \
  -H "Content-Type: application/json" \
  -d '{
    "key": "VTP001",
    "attributes": {
      "Category": "Insurance",
      "Group": "Health",
      "SubGroup": "Basic Coverage"
    }
  }'
```

Delete a reference table:

```bash
curl -X DELETE https://localhost:5001/api/reference-tables/produkttyp
```

When creating outports in the Fabric client:
- Use outport type "KeyMapping" for reference table data
- The key element is automatically designated as "key"
- Reference tables can be consumed as outports by other data products
Example workflow:

- Create the reference table via the API or sync it from source data
- Read the mapping: var mapping = mappingIO.ReadMapping("tableName")
- Provide it as a KeyMapping outport for consumption by analytics

```csharp
// Example: Providing a reference table as a KeyMapping outport
var mapping = mappingIO.ReadMapping("produkttyp");

// Convert to DataFrame/output format
// Write to KeyMapping outport for consumption
outportWriter.Write(mapping, "Produkttyp_Mapping", outportType: "KeyMapping");
```

Create and manage mapping items directly within Fabric workspaces. This feature allows you to:
- Create mapping items that reference lakehouse tables
- Configure which attribute will be used for lookup operations
- Define one-to-many column mappings
- Store mapping/lookup tables to OneLake for consumption
Via REST API:

```bash
curl -X POST https://localhost:5001/api/items \
  -H "Content-Type: application/json" \
  -d '{
    "displayName": "Product Category Mapping",
    "description": "Maps product codes to categories",
    "workspaceId": "workspace-123",
    "lakehouseItemId": "lakehouse-456",
    "lakehouseWorkspaceId": "workspace-123",
    "tableName": "Products",
    "referenceAttributeName": "ProductId",
    "mappingColumns": [
      {
        "columnName": "ProductCode",
        "dataType": "string",
        "isRequired": true,
        "transformation": "uppercase"
      },
      {
        "columnName": "LegacyCode",
        "dataType": "string",
        "isRequired": false
      }
    ],
    "oneLakeLink": "https://onelake.dfs.fabric.microsoft.com/workspace-123/lakehouse-456/Tables/Products"
  }'
```

Response:
```json
{
  "itemId": "item-789",
  "displayName": "Product Category Mapping",
  "workspaceId": "workspace-123",
  "lakehouseItemId": "lakehouse-456",
  "tableName": "Products",
  "referenceAttributeName": "ProductId",
  "mappingColumns": [...],
  "createdAt": "2024-01-15T10:30:00Z",
  "updatedAt": "2024-01-15T10:30:00Z"
}
```

Once you've configured your mapping item, you can store the actual mapping data to OneLake:
```bash
curl -X POST https://localhost:5001/api/items/store-to-onelake \
  -H "Content-Type: application/json" \
  -d '{
    "itemId": "item-789",
    "workspaceId": "workspace-123",
    "tableName": "ProductMapping",
    "data": {
      "PROD001": {
        "key": "PROD001",
        "ProductCode": "PROD001",
        "Category": "Electronics",
        "SubCategory": "Computers"
      },
      "PROD002": {
        "key": "PROD002",
        "ProductCode": "PROD002",
        "Category": "Electronics",
        "SubCategory": "Phones"
      }
    }
  }'
```

Response:

```json
{
  "success": true,
  "oneLakePath": "https://onelake.dfs.fabric.microsoft.com/workspace-123/item-789/Tables/ProductMapping",
  "rowCount": 2
}
```

Read the data back from OneLake:

```bash
curl https://localhost:5001/api/items/read-from-onelake/workspace-123/item-789/ProductMapping
```

Response:
```json
{
  "PROD001": {
    "key": "PROD001",
    "ProductCode": "PROD001",
    "Category": "Electronics",
    "SubCategory": "Computers"
  },
  "PROD002": {
    "key": "PROD002",
    "ProductCode": "PROD002",
    "Category": "Electronics",
    "SubCategory": "Phones"
  }
}
```

You can also use the workload API to manage mapping items:
```bash
# Create mapping item via workload
curl -X POST https://localhost:5001/api/workload/execute \
  -H "Content-Type: application/json" \
  -d '{
    "operationType": "CreateMappingItem",
    "parameters": {
      "displayName": "Product Mapping",
      "workspaceId": "workspace-123",
      "lakehouseItemId": "lakehouse-456",
      "tableName": "Products",
      "referenceAttributeName": "ProductId",
      "mappingColumns": "[]"
    }
  }'

# Store to OneLake via workload
curl -X POST https://localhost:5001/api/workload/execute \
  -H "Content-Type: application/json" \
  -d '{
    "operationType": "StoreToOneLake",
    "parameters": {
      "itemId": "item-789",
      "workspaceId": "workspace-123",
      "tableName": "ProductMapping",
      "data": "{\"PROD001\": {\"key\": \"PROD001\", \"Category\": \"Electronics\"}}"
    }
  }'
```

Benefits:
- Fabric Integration: Native integration with Microsoft Fabric workspaces and lakehouses
- Traceability: Track source lakehouse and table for each mapping item
- OneLake Storage: Store mapping tables directly to OneLake for consumption by other workloads
- Configuration Management: Centrally manage mapping configurations including column transformations
- Data Lineage: Establish clear lineage from lakehouse tables to mapping items to OneLake storage
Reference tables can now store references to lakehouse tables as their data source. This enables integration with the OneLakeView component from the Fabric Extensibility Toolkit in the frontend.
Creating a reference table with a lakehouse source:

```csharp
var mappingIO = new MappingIO(storage);

var columns = new List<ReferenceTableColumn>
{
    new() { Name = "Category", DataType = "string", Order = 1 },
    new() { Name = "SubCategory", DataType = "string", Order = 2 }
};

mappingIO.CreateReferenceTable(
    tableName: "products_ref",
    columns: columns,
    isVisible: true,
    notifyOnNewMapping: true,
    sourceLakehouseItemId: "12345678-1234-1234-1234-123456789012",
    sourceWorkspaceId: "87654321-4321-4321-4321-210987654321",
    sourceTableName: "ProductsTable",
    sourceOneLakeLink: "https://onelake.dfs.fabric.microsoft.com/workspace/lakehouse/Tables/ProductsTable");
```

Via REST API:
```bash
curl -X POST https://localhost:5001/api/reference-tables \
  -H "Content-Type: application/json" \
  -d '{
    "tableName": "products_ref",
    "columns": [
      { "name": "Category", "dataType": "string", "order": 1 },
      { "name": "SubCategory", "dataType": "string", "order": 2 }
    ],
    "isVisible": true,
    "notifyOnNewMapping": true,
    "sourceLakehouseItemId": "12345678-1234-1234-1234-123456789012",
    "sourceWorkspaceId": "87654321-4321-4321-4321-210987654321",
    "sourceTableName": "ProductsTable",
    "sourceOneLakeLink": "https://onelake.dfs.fabric.microsoft.com/workspace/lakehouse/Tables/ProductsTable"
  }'
```

Benefits:
- Traceability: Track the source of reference data back to the original lakehouse table
- Documentation: Automatically document where reference data comes from
- Frontend Integration: The frontend can use the OneLakeView component to allow users to browse and select lakehouse tables visually
- Data Lineage: Establish clear data lineage from source tables to reference tables
Define your source and target models with mapping attributes:

```csharp
using FabricMappingService.Core.Attributes;

// Source model
public class LegacyCustomerModel
{
    [MapTo("CustomerId")]
    public int Id { get; set; }

    [MapTo("FullName")]
    public string CustomerName { get; set; }

    [IgnoreMapping]
    public string InternalNotes { get; set; }
}

// Target model
public class ModernCustomerModel
{
    public int CustomerId { get; set; }
    public string FullName { get; set; }
}
```

Map the objects:

```csharp
using FabricMappingService.Core.Services;

var mapper = new AttributeMappingService();

var legacy = new LegacyCustomerModel
{
    Id = 123,
    CustomerName = "John Doe"
};

var modern = mapper.Map<LegacyCustomerModel, ModernCustomerModel>(legacy);
// modern.CustomerId = 123
// modern.FullName = "John Doe"
```

Map via the REST API:

```bash
curl -X POST https://localhost:5001/api/mapping/customer/legacy-to-modern \
  -H "Content-Type: application/json" \
  -d '{
    "id": 123,
    "customerName": "John Doe",
    "email": "john@example.com",
    "phone": "+1234567890",
    "createdDate": "2024-01-01T00:00:00Z",
    "status": true,
    "country": "USA"
  }'
```

Response:
```json
{
  "success": true,
  "data": {
    "customerId": 123,
    "fullName": "John Doe",
    "emailAddress": "john@example.com",
    "phoneNumber": "+1234567890",
    "registrationDate": "2024-01-01T00:00:00Z",
    "isActive": true,
    "country": "USA"
  },
  "errors": [],
  "warnings": [],
  "mappedPropertiesCount": 7
}
```

Get available mappings:

```bash
curl https://localhost:5001/api/mapping/info
```

Health check:

```bash
curl https://localhost:5001/api/mapping/health
```

Define a mapping profile at the class level:

```csharp
[MappingProfile("StrictMapping", IgnoreUnmapped = true, CaseSensitive = false)]
public class SourceModel
{
    [MapTo("TargetId")]
    public int Id { get; set; }
}
```

Map a collection:

```csharp
var sources = new List<LegacyCustomerModel> { /* ... */ };
var results = mapper.MapCollection<LegacyCustomerModel, ModernCustomerModel>(sources);
```

Map with a detailed result:

```csharp
var result = mapper.MapWithResult<SourceModel, TargetModel>(source);

if (result.Success)
{
    Console.WriteLine($"Mapped {result.MappedPropertiesCount} properties");
    // Use result.Result
}
else
{
    Console.WriteLine($"Errors: {string.Join(", ", result.Errors)}");
}
```

The project includes comprehensive unit tests:

```bash
dotnet test
```

Test coverage includes:
- Attribute mapping functionality
- Reference table operations
- Type conversion
- Error handling
- Batch operations
- Configuration options
Configure the mapping service behavior:

```csharp
var configuration = new MappingConfiguration
{
    CaseSensitive = true,    // Case-sensitive property matching
    IgnoreUnmapped = false,  // Map properties without attributes
    ThrowOnError = true,     // Throw exceptions on errors
    MapNullValues = true,    // Map null values
    MaxDepth = 10            // Max depth for nested mapping
};

var mapper = new AttributeMappingService(configuration);
```

The project includes a complete workload manifest for Fabric integration following the Microsoft Fabric Extensibility Toolkit patterns. The fabric-manifest/ directory contains:

- workload-manifest.json: Main workload manifest with authentication, backend endpoints, and item type definitions
- Product.json: Frontend metadata and UI configuration for the Fabric experience
- items/: Item type definitions for ReferenceTable, MappingConfiguration, and MappingJob
- assets/: Visual assets including icons and images
- translations/: Localization files for internationalization
See fabric-manifest/README.md for detailed documentation.
The .ai/ directory provides context documentation for AI tools and agents:

- context/: Knowledge base about the Fabric platform, the Extensibility Toolkit, and this service
  - fabric-workload.md: Extensibility Toolkit development patterns
  - fabric.md: Microsoft Fabric platform architecture and APIs
  - mapping-service.md: Custom mapping service context and usage
- commands/: Command templates for common workload and item operations
This structure enables AI assistants (like GitHub Copilot) to better understand the project context and provide more accurate code suggestions and assistance.
- Register your application in Microsoft Entra ID
- Update the manifest with your AAD App ID and backend URL in fabric-manifest/workload-manifest.json
- Deploy the API to Azure App Service or your hosting platform
- Register the workload in your Fabric tenant
- Configure permissions for workspace access
The service defines three item types for Fabric:
- ReferenceTable: Reference tables for data classification (KeyMapping outports) - provides lookup tables for data harmonization
- MappingConfiguration: Store and manage attribute-based mapping configurations
- MappingJob: Execute and monitor mapping operations
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/workload/info | Get workload metadata |
| GET | /api/workload/health | Get workload health status |
| POST | /api/workload/execute | Execute workload operation |
| POST | /api/workload/validate | Validate workload configuration |
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/items/{itemId} | Get mapping item by ID |
| GET | /api/items/workspace/{workspaceId} | List all mapping items in workspace |
| POST | /api/items | Create a new mapping item |
| PUT | /api/items/{itemId} | Update an existing mapping item |
| DELETE | /api/items/{itemId} | Delete a mapping item |
| POST | /api/items/store-to-onelake | Store mapping data to OneLake |
| GET | /api/items/read-from-onelake/{workspaceId}/{itemId}/{tableName} | Read mapping data from OneLake |
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/reference-tables | List all reference tables |
| GET | /api/reference-tables/{tableName} | Get reference table data |
| POST | /api/reference-tables | Create a new reference table |
| POST | /api/reference-tables/sync | Sync reference table with data |
| PUT | /api/reference-tables/{tableName}/rows | Add or update a row |
| DELETE | /api/reference-tables/{tableName} | Delete a reference table |
| Method | Endpoint | Description |
|---|---|---|
| GET | / | Service information |
| GET | /api/mapping/info | Available mappings |
| GET | /api/mapping/health | Health check |
| POST | /api/mapping/customer/legacy-to-modern | Map legacy customer |
| POST | /api/mapping/product/external-to-internal | Map external product |
| POST | /api/mapping/customer/batch-legacy-to-modern | Batch map customers |
Maps a source property to a target property with a different name.

```csharp
[MapTo("TargetPropertyName", Optional = false)]
public string SourceProperty { get; set; }
```

Defines mapping behavior at the class level.
```csharp
[MappingProfile("ProfileName", IgnoreUnmapped = true)]
public class MyModel { }
```

Excludes a property from mapping.

```csharp
[IgnoreMapping]
public string InternalField { get; set; }
```

Implement IPropertyConverter for custom type conversions:
```csharp
public class CustomConverter : IPropertyConverter
{
    public object? Convert(object? sourceValue, Type targetType)
    {
        // Your conversion logic (example body: convert everything to string)
        return sourceValue?.ToString();
    }

    public bool CanConvert(Type sourceType, Type targetType)
    {
        // Return true if the conversion is supported (example: string targets only)
        return targetType == typeof(string);
    }
}
```

Use it in attributes:

```csharp
[MapTo("Target", ConverterType = typeof(CustomConverter))]
public string Source { get; set; }
```

- Master Data Management: Centrally maintain system-independent master data accessible as KeyMapping outports
- Data Classification: Group and classify cost types, diagnoses, or product categories with structured hierarchies
- Label Harmonization: Standardize labels and codes from different data sources for consistent analytics
- Product Hierarchies: Create product type classifications and groupings for organized reporting
- Code Mapping: Map external codes to internal classifications for seamless data integration
- Cost Center Mapping: Classify and standardize cost center codes across different systems
- Medical Code Classification: Standardize ICD codes, diagnosis codes, or procedure codes
- Customer Segmentation: Create and maintain customer classification schemes
- Legacy System Modernization: Map data from old systems to modern formats
- API Integration: Transform data between different API schemas
- Data Migration: Convert data during system migrations
- ETL Processes: Transform data in Extract-Transform-Load pipelines
- Multi-tenant Applications: Map data structures across different tenants
- Quick Reference Guide: Fast overview of the Reference Table tool - perfect for first-time users!
- UI Mockups & Visual Guide: Visual mockups and diagrams explaining the reference table (lookup) tool - start here to understand the solution!
- Project Setup Guide: Complete environment setup instructions
- Quick Start: Get up and running quickly
- Workload Guide: Complete guide to build, deploy, and register the workload in Microsoft Fabric
- Workload Guide (German): German-language guide for workload deployment
- Lakehouse Storage Configuration: Configure lakehouse-based storage for reference tables
- API Documentation: Detailed API endpoint reference
- Fabric Integration: Microsoft Fabric integration guide
- Project Structure: Repository organization and conventions
- Scripts Documentation: Automation scripts reference
- Security Policy: Security reporting guidelines
- Support: Getting help and support resources
- Code of Conduct: Community guidelines
This is a competition entry, but suggestions and feedback are welcome! Please read our Code of Conduct before contributing.
For support, see SUPPORT.md.
This project is created for the Microsoft Fabric Extensibility Toolkit Contest.
Philipp Frenzel (@philippfrenzel)
- Microsoft Fabric Team for the Extensibility Toolkit
- Microsoft for hosting the competition
- The .NET community for excellent tools and libraries
- Based on design principles from Microsoft Fabric Tools Workload
For questions or feedback about this project, please open an issue on GitHub or see SUPPORT.md.
Built with ❤️ for the Microsoft Fabric Extensibility Toolkit Contest
This project follows the design principles and best practices established by the Microsoft Fabric Extensibility Toolkit.