I’ve recently been exploring various analytical options within Dynamics 365 Finance and Operations, and one that I’ve delved deeply into is Link to Fabric.
There is a walkthrough guide available on the Microsoft FastTrack GitHub repo: see Hands-on Lab: Link to Fabric from Dynamics 365 finance and operations apps.
This guide is an excellent starting point and should be one of the first things you try out. However, it's important to understand that this approach has limitations that make it unsuitable for some real-world scenarios. Let's discuss these limitations and what I have been exploring ...
Background
I want to join multiple tables to create denormalised views that I can report on. My goal is to use Direct Lake mode in the semantic model; specifically, I want to avoid the need to reimport data into Power BI for reporting.
Key Limitations
The first limitation you’ll encounter is:
By design, only tables in the semantic model derived from tables in a Lakehouse or Warehouse support Direct Lake mode. Although tables in the model can be derived from SQL views in the Lakehouse or Warehouse, queries using those tables will fall back to DirectQuery mode.
The FinOps data model is highly normalised, so following the Microsoft lab, which uses views, would not work for me: my queries would fall back to DirectQuery mode.
Solutions
A solution to this problem is to create a Delta table that I can load from the query/view.
Here are several ways to do this:
1. Import Data Using a Data Pipeline: This method is easy to configure but can be slow and is not ideal for large volumes of data, and it only works within the same workspace.
2. Import Data Using Dataflow Gen2: Also easy to configure, but the limitation is that the copy only works within the same workspace.
3. Import Using a Stored Procedure: Simple to set up but shares the same limitation as Dataflow Gen2, working only at the SQL analytics endpoint level and not across workspaces.
4. Import Using a Notebook: This method has a higher learning curve but offers the best performance and flexibility; it is the approach I use in the sketch and scenarios below.
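Before the full scenarios, here is a minimal sketch of the notebook approach: a Spark SQL cell that materialises a single source table as a Delta table in the Lakehouse. The target table name and the selected columns are illustrative assumptions; the point is the CREATE TABLE ... AS SELECT pattern in a notebook attached to the Lakehouse.

%%sql
-- Minimal sketch: materialise one source table as a Delta table.
-- Target table and column list are illustrative; swap in your own.
CREATE TABLE IF NOT EXISTS fact_custtable
USING DELTA AS
SELECT
    accountnum AS AccountNumber,
    custgroup  AS CustomerGroup,
    currency   AS Currency
FROM custtable

Because the result is a real table in the Lakehouse rather than a view, the semantic model can read it in Direct Lake mode.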
Scenarios
Select statement with a join
%%sql
SELECT
    party.recid AS PartyId,
    party.name AS Name,
    COALESCE(party.namealias, '') AS ShortName,
    COALESCE(postal.countryregionid, '') AS Country,
    COALESCE(postal.state, '') AS State,
    COALESCE(postal.city, '') AS City,
    COALESCE(postal.street, '') AS Street,
    COALESCE(postal.zipcode, '') AS PostCode,
    COALESCE(phone.locator, '') AS PhoneNumber,
    COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal
    ON postal.location = party.primaryaddresslocation
    AND postal.validto > current_date() -- filters only valid (effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone
    ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email
    ON email.recid = party.primarycontactemail
You should see a table of results displayed below your query.
Create table if not exists
%%sql
CREATE TABLE IF NOT EXISTS fact_dirpartytable
USING DELTA AS
SELECT
    party.recid AS PartyId,
    party.name AS Name,
    COALESCE(party.namealias, '') AS ShortName,
    COALESCE(postal.countryregionid, '') AS Country,
    COALESCE(postal.state, '') AS State,
    COALESCE(postal.city, '') AS City,
    COALESCE(postal.street, '') AS Street,
    COALESCE(postal.zipcode, '') AS PostCode,
    COALESCE(phone.locator, '') AS PhoneNumber,
    COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal
    ON postal.location = party.primaryaddresslocation
    AND postal.validto > current_date() -- filters only valid (effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone
    ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email
    ON email.recid = party.primarycontactemail
This is a one-off copy and will not reload data if the table already exists.
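If you want the table refreshed on every notebook run rather than created once, a sketch of the same load using CREATE OR REPLACE TABLE (standard Spark SQL on Delta, with the same source tables assumed) could look like this:

%%sql
-- Recreate the Delta table on each run so changes in the source are picked up
CREATE OR REPLACE TABLE fact_dirpartytable
USING DELTA AS
SELECT
    party.recid AS PartyId,
    party.name AS Name,
    COALESCE(party.namealias, '') AS ShortName,
    COALESCE(postal.countryregionid, '') AS Country,
    COALESCE(postal.state, '') AS State,
    COALESCE(postal.city, '') AS City,
    COALESCE(postal.street, '') AS Street,
    COALESCE(postal.zipcode, '') AS PostCode,
    COALESCE(phone.locator, '') AS PhoneNumber,
    COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal
    ON postal.location = party.primaryaddresslocation
    AND postal.validto > current_date() -- filters only valid (effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone
    ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email
    ON email.recid = party.primarycontactemail

CREATE OR REPLACE TABLE replaces the table's contents and schema while keeping its Delta history, which is handy while you iterate on the query.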
In the next blog post, I will cover a few more scenarios.