Adventures with Dataverse: Migrating Microsoft SQL Server Tables to Dataverse Tables

From time to time, I want to migrate data from Microsoft SQL Server to Microsoft Dataverse.

Dataverse offers a lot of capabilities for working with entities within the Power Platform. Sure, I could use virtual tables, but those come with a lot of limitations.

I also wanted to create a pattern for Microsoft SQL Server to Dataverse migration.

I wanted to keep with the spirit of low code, specifically leveraging the Power Platform where I could, so I decided to use dataflows for the migration.

Dataflows are a self-service, cloud-based, data preparation technology. Dataflows enable customers to ingest, transform, and load data into Microsoft Dataverse environments, Power BI workspaces, or your organization’s Azure Data Lake Storage account. Dataflows are authored by using Power Query, a unified data connectivity and preparation experience already featured in many Microsoft products, including Excel and Power BI. Customers can trigger dataflows to run either on demand or automatically on a schedule; data is always kept up to date.

Microsoft Learn – What are dataflows?

I decided I would try to migrate the AdventureWorks sample database, which can be found at AdventureWorks sample databases – SQL Server | Microsoft Learn.

The sample database includes one-to-many and many-to-many relationships, which are common in most enterprise schemas.

Figure 1 – AdventureWorks tables

Here are the steps I used in my migration:

  1. Document all the SQL tables that need to be imported, including primary keys, alternate keys, and foreign key relationships.
  2. Create a dataflow to create the Dataverse entities and load the initial data; this will not include the relationships.
    • Add column transformations to populate the Name field for Dataverse entities.
    • Run the dataflow.
  3. Fix any errors, such as field type mappings; e.g., in my import I had to increase the maximum size of a text field to more than 100 characters.
  4. Verify alternate keys were created in the Dataverse entities; sometimes the primary keys are not set up correctly as alternate keys.
  5. Add relationships to the Dataverse entities by creating new columns of type lookup pointing to the related tables.
  6. Add choice columns to the Dataverse entities.
  7. Create a dataflow to reload the data with relationships; this will map our lookup columns to the correct Dataverse records based on the alternate keys.
  8. Add calculated columns, e.g., I added a calculated Line Total column to the SalesOrderDetail table that replaces the LineTotal column and calculates the value based on UnitPrice, UnitPriceDiscount, and OrderQty.
  9. Add validation rules to columns, e.g., required fields and minimum and/or maximum values for numeric data.

Document all SQL tables

The tables I would be importing included:

| Table Name | Primary Key Column(s) | Foreign Key Column(s) |
| --- | --- | --- |
| Address | AddressID (int) | |
| Customer | CustomerID (int) | |
| CustomerAddress | CustomerID (int), AddressID (int) | CustomerID (Customer), AddressID (Address) |
| Product | ProductID (int) | ProductCategoryID (ProductCategory), ProductModelID (ProductModel) |
| ProductCategory | ProductCategoryID (int) | ParentProductCategoryID (ProductCategory) |
| ProductDescription | ProductDescriptionID (int) | |
| ProductModel | ProductModelID (int) | |
| ProductModelProductDescription | ProductModelID (int), ProductDescriptionID (int) | ProductModelID (ProductModel), ProductDescriptionID (ProductDescription) |
| SalesOrderDetail | SalesOrderID (int), SalesOrderDetailID (int) | SalesOrderID (SalesOrderHeader), ProductID (Product) |
| SalesOrderHeader | SalesOrderID (int) | CustomerID (Customer), ShipToAddressID (Address), BillToAddressID (Address) |
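If you would rather pull this metadata straight from SQL Server than compile it by hand, a Power Query sketch along these lines lists the foreign key relationships from the catalog views (the server and database names are placeholders; substitute your own):

```
let
    // Placeholder connection details – substitute your own server and database
    Source = Sql.Database("yourserver.database.windows.net", "AdventureWorksLT",
        [Query = "
            SELECT tp.name AS TableName,
                   cp.name AS ForeignKeyColumn,
                   tr.name AS RelatedTable,
                   cr.name AS RelatedColumn
            FROM sys.foreign_keys fk
            JOIN sys.foreign_key_columns fkc ON fk.object_id = fkc.constraint_object_id
            JOIN sys.tables  tp ON fkc.parent_object_id = tp.object_id
            JOIN sys.columns cp ON fkc.parent_object_id = cp.object_id
                                AND fkc.parent_column_id = cp.column_id
            JOIN sys.tables  tr ON fkc.referenced_object_id = tr.object_id
            JOIN sys.columns cr ON fkc.referenced_object_id = cr.object_id
                                AND fkc.referenced_column_id = cr.column_id
            ORDER BY tp.name"])
in
    Source
```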

Create a dataflow to create the Dataverse entities and load the initial data

Navigate to https://make.powerapps.com.

Click Dataflows, and then click New dataflow.

Figure 2 – Create the new dataflow

Set Name to AdventureWorks Data Import (Step 1).

Click Create.

Click SQL Server database.

Figure 3 – Select SQL Server database

Enter Connection settings.

Figure 4 – Enter connection settings

Click Next.

Select all the SQL tables to import.

Figure 5 – Select the tables to import

Click Transform data.

Add column transformations

Select the Address table.

Click the Add column tab.

Click Custom column.

Figure 6 – Add the custom column for name

Set Custom column formula to [AddressLine1] & " " & [City] & ", " & [StateProvince] & " " & [PostalCode].

Click OK.

Right-click the Custom column and click Rename.

Rename the column to Name.

Figure 7 – Rename the column

Right-click the Name column, click Change type, and then click Text.

Repeat for the Customer table.

Set Custom column formula to [FirstName] & " " & [LastName] & " (" & [CompanyName] & ")".
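Behind the scenes, Power Query records these steps as an M query. For reference, here is a sketch of roughly what the Address query looks like in the advanced editor (the server and database names are placeholders):

```
let
    // Placeholder connection – substitute your own server and database
    Source = Sql.Database("yourserver.database.windows.net", "AdventureWorksLT"),
    // Navigate to the SalesLT.Address table
    Address = Source{[Schema = "SalesLT", Item = "Address"]}[Data],
    // Concatenate the address parts into a human-readable primary name
    AddedName = Table.AddColumn(Address, "Name",
        each [AddressLine1] & " " & [City] & ", " & [StateProvince] & " " & [PostalCode]),
    // Ensure the new column is typed as text
    TypedName = Table.TransformColumnTypes(AddedName, {{"Name", type text}})
in
    TypedName
```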

Click Next.

Click on each table and do the following:

  • Set Delete rows that no longer exist in the query output to checked.
  • Set Alternate key columns to the Primary Key of the original table.

For the CustomerAddress, ProductModelProductDescription, and SalesOrderDetail tables, set Unique primary name column to Auto-generated.

Ideally, we would like to leave these columns blank as they are not necessary, but we don’t have that option in the import.

For the ProductDescription table, set Unique primary name column to Description.

Review all settings, and then click Next.

Run the dataflow

Click Publish.

The new dataflow will be published, and then a refresh will be attempted.

Figure 8 – Dataflow publishing and refreshing

Fix any errors

When the dataflow ran the first time, I encountered an error. To see the error details, click the (…) next to the dataflow and click Show refresh history.

Figure 9 – Navigate to refresh history

Click the Start time of the latest run; there should be only one.

Figure 10 – Refresh history

Review the error message.

In my case, the Description column for the ProductDescription entity needed to support more than 100 characters.

Figure 11 – Review error message

To fix the error, click Tables.

Find the ProductDescription table in the list (it should be named SalesLT ProductDescription) and select it.

Click Columns under Schema.

Click the Description column (the Primary name column).

Click Advanced options.

Set Maximum character count to 500.

Click Save.

Navigate back to the dataflow by clicking Dataflows.

Click the (…) next to the AdventureWorks Data Import (Step 1) dataflow and click Refresh.

Figure 12 – Refresh the dataflow

This time the dataflow should run successfully.

Verify alternate keys were created in Dataverse entities

Sometimes the primary keys are not mapped as alternate keys in the Dataverse entities; I’m not sure why this happens.

Click Tables.

Click SalesLT Address table.

Click Keys under Schema.

You should see a key called PrimaryKey; if you don’t, you will need to add it.

Figure 13 – PrimaryKey column

To add a missing key, click New key.

Set Display name to PrimaryKey and select AddressID.

Click Save.

Figure 14 – Add missing PrimaryKey key

Repeat for each of the tables.

| Table Name | Primary Key Column(s) |
| --- | --- |
| Address | AddressID (int) |
| Customer | CustomerID (int) |
| CustomerAddress | CustomerID (int), AddressID (int) |
| Product | ProductID (int) |
| ProductCategory | ProductCategoryID (int) |
| ProductDescription | ProductDescriptionID (int) |
| ProductModel | ProductModelID (int) |
| ProductModelProductDescription | ProductModelID (int), ProductDescriptionID (int) |
| SalesOrderDetail | SalesOrderID (int), SalesOrderDetailID (int) |
| SalesOrderHeader | SalesOrderID (int) |
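If you would rather verify the keys programmatically, the Dataverse Web API exposes key metadata through the EntityDefinitions endpoint. A minimal sketch, assuming an already-authorized connection (the org URL and logical table name are placeholders):

```
let
    // Placeholder org URL and logical table name – substitute your own
    Url = "https://yourorg.api.crm.dynamics.com/api/data/v9.2/"
        & "EntityDefinitions(LogicalName='cr110_salesltaddress')/Keys",
    // Requires an authenticated (OAuth) connection to the environment
    Response = Json.Document(Web.Contents(Url)),
    // The value array holds one record per alternate key defined on the table
    Keys = Response[value]
in
    Keys
```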

Add relationship columns

We need to add the following relationships to our tables:

| Table Name | Foreign Key Column(s) |
| --- | --- |
| CustomerAddress | CustomerID (Customer), AddressID (Address) |
| Product | ProductCategoryID (ProductCategory), ProductModelID (ProductModel) |
| ProductCategory | ParentProductCategoryID (ProductCategory) |
| ProductModelProductDescription | ProductModelID (ProductModel), ProductDescriptionID (ProductDescription) |
| SalesOrderDetail | SalesOrderID (SalesOrderHeader), ProductID (Product) |
| SalesOrderHeader | CustomerID (Customer), ShipToAddressID (Address), BillToAddressID (Address) |

Navigate to Tables and select the SalesLT CustomerAddress table.

Click Columns.

Click New column.

Set Display name to SalesLT Customer.

Set Data type to Lookup.

Set Related table to SalesLT Customer.

Click Save.

Repeat for Address.

Figure 15 – Add relationship column

Repeat for all remaining relationships.

For the SalesLT ProductCategory table, I named the ParentProductCategoryID lookup column SalesLT ProductCategory.

For the SalesLT ProductModelProductDescription table, the relationship to the SalesLT ProductDescription table will generate an error due to the length of the Relationship name.

Set Relationship name to productmodelproductdescription.

Figure 16 – Fix relationship name for SalesLT ProductDescription relationship

For the BillToAddressID and ShipToAddressID fields in the SalesLT SalesOrderHeader table, I named those relationships SalesLT Address (BillTo) and SalesLT Address (ShipTo).

Figure 17 – Address relationships

Add choice columns

The status for the sales order in the SQL database is managed by a user-defined function.

Let’s add this as a choice.
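Before defining the options, it can help to confirm which status values actually occur in the source data. A quick M sketch (placeholder connection details) lists the distinct values:

```
let
    // Placeholder connection – substitute your own server and database
    Source = Sql.Database("yourserver.database.windows.net", "AdventureWorksLT"),
    Orders = Source{[Schema = "SalesLT", Item = "SalesOrderHeader"]}[Data],
    // Keep one row per distinct Status value
    StatusValues = Table.Distinct(Table.SelectColumns(Orders, {"Status"}))
in
    StatusValues
```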

Click Choices.

Click New choice.

Figure 18 – Add new choice

Set Display name to SalesLT SalesOrderHeader Status.

Add the choices, filling in the Label and Value fields for each.

Figure 19 – Set choice properties and options

Click Save.

Click Tables.

Click SalesLT SalesOrderHeader table.

Click Columns under Schema.

Click New column.

Set Display name to Status (Choice).

Set Data type to Choice.

Set Sync this choice with to SalesLT SalesOrderHeader Status.

Figure 20 – Creating choice column in SalesLT SalesOrderHeader table

Click Save.

Create a dataflow to reload the data with relationships and choices

Click Dataflows, and then click New dataflow.

Set Name to AdventureWorks Data Import (Step 2).

Click Create.

Click SQL Server database.

In Connection settings, enter the Server and Database, and then select the connection that was created with our first dataflow.

Figure 21 – Select connection

Click Next.

Select only the SQL tables that have relationships:

  • CustomerAddress
  • Product
  • ProductCategory
  • ProductModelProductDescription
  • SalesOrderDetail
  • SalesOrderHeader

Click Transform data.

Click Next.

Select SalesLT CustomerAddress from Queries.

Select Load to existing table.

Set Destination table to the SalesLT CustomerAddress table; notice the name includes the publisher prefix, e.g., cr110_SalesLTCustomerAddress.

Set Select key (optional) to PrimaryKey.

Click Auto map.

Leave the Source column for Name set to (none).

Set the Destination column for AddressID and CustomerID to their corresponding lookup columns.

Figure 22 – Relationship mappings

Repeat for the remaining tables and then review.

For the SalesLT SalesOrderHeader table, map the source Status column to the Status (Choice) destination column.
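If the numeric values you assigned to the choice options do not match the raw Status codes coming from SQL, you can remap them in the query before mapping the column. A sketch with illustrative value pairs only (substitute the values you actually chose):

```
let
    // Placeholder connection – substitute your own server and database
    Source = Sql.Database("yourserver.database.windows.net", "AdventureWorksLT"),
    Orders = Source{[Schema = "SalesLT", Item = "SalesOrderHeader"]}[Data],
    // Illustrative mapping – substitute the value you gave each choice option
    Remapped = Table.TransformColumns(Orders,
        {{"Status", each if _ = 1 then 100000000
                     else if _ = 5 then 100000001
                     else 100000002, Int64.Type}})
in
    Remapped
```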

Click Next.

Click Publish.

After the dataflow runs, review the tables to ensure the relationships were mapped correctly.

Figure 23 – Verifying relationships mapped correctly for SalesLT CustomerAddress

Add calculated columns

I now want to add a calculated column to the SalesLT SalesOrderDetail table.

Click Tables.

Click the SalesLT SalesOrderDetail table.

Click the (+) in the SalesLT SalesOrderDetail columns and data section.

Figure 24 – Add calculated column

Set Display name to Line Total (calculated).

Set Data type to Formula (preview).

Set Formula to OrderQty * (Decimal(UnitPrice) - Decimal(UnitPriceDiscount)).

Click Save.

Add validation rules to columns

If you know some columns are required, you can update those in column properties.

Figure 25 – Making LastName required in the SalesLT Customer table

You can also add constraints to numeric fields.

Figure 26 – Making OrderQty required in the SalesLT SalesOrderDetail table and assigning minimum and maximum values

Moving to a different environment

Bonus content!

If you want to move this to a different environment, simply create a new solution and then import the dataflows and the Dataverse tables.

Figure 27 – Import into a solution

I know this seems like a lot of work, and one might want an “easy” button, but I think doing it this way is a better practice: it forces you to revisit your database structure, get familiar with the data again, and address any legacy issues in the data, and then you can import it in a way that accelerates the application-building process.

As always, if there is a better way, please share!
