
Event Based Integration using Business Events and Data Management Framework

In this post I will share how I used Business Events for integration.
If you have been in the FinOps space, then you know there is a Recurring integration pattern that uses a dequeue concept. The dequeue pattern requires that you constantly poll a FinOps URL to see if there is a file you can download:
https://{base URL}/api/connector/dequeue/{activityID}
Then you have to acknowledge the dequeue by calling a second URL:
https://{base URL}/api/connector/ack/{activityID}
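
In code, that polling loop looks something like the sketch below (Python; get_token is a hypothetical helper for acquiring an AAD bearer token, and the dequeue/ack body shapes should be verified against the recurring integrations docs for your version):

import time
import requests

BASE_URL = "https://<base URL>"       # your FinOps host
ACTIVITY_ID = "<activityID>"          # the recurring job's activity ID

def get_token():
    # Hypothetical helper: acquire an AAD bearer token for the FinOps host.
    raise NotImplementedError

def poll_once():
    headers = {"Authorization": "Bearer " + get_token()}
    # 1. Ask FinOps whether a file is waiting for this activity.
    r = requests.get(BASE_URL + "/api/connector/dequeue/" + ACTIVITY_ID,
                     headers=headers)
    if r.status_code == 204 or not r.content:
        return                        # nothing queued right now
    message = r.json()
    # 2. Download the file from the location in the dequeue response,
    #    then hand it to your downstream processing.
    payload = requests.get(message["DownloadLocation"], headers=headers).content
    # 3. Acknowledge by echoing the dequeue response back; otherwise the
    #    message stays in the queue and you will receive it again.
    requests.post(BASE_URL + "/api/connector/ack/" + ACTIVITY_ID,
                  headers=headers, json=message)

while True:                           # the constant polling the post describes
    poll_once()
    time.sleep(60)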

Another alternative is the Package export method. The advantage of this one is that the external system does the polling and triggers the export job on request, so there is no need for a FinOps batch job to run.
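
For reference, the package-export flow from the external side looks roughly like this (a Python sketch; the action and parameter names follow the Data management package API docs, so verify them against your version):

import time
import requests

def export_package(base_url, token, project, legal_entity):
    headers = {"Authorization": "Bearer " + token}
    actions = (base_url + "/data/DataManagementDefinitionGroups"
               "/Microsoft.Dynamics.DataEntities")

    # 1. Kick off the export job for the named data project.
    body = {"definitionGroupId": project, "packageName": project + ".zip",
            "executionId": "", "reExecute": False, "legalEntityId": legal_entity}
    execution_id = requests.post(actions + ".ExportToPackage",
                                 headers=headers, json=body).json()["value"]

    # 2. Poll until the job finishes -- note this is still polling,
    #    just initiated by the external system.
    status = "Executing"
    while status in ("NotRun", "Executing"):
        time.sleep(10)
        status = requests.post(actions + ".GetExecutionSummaryStatus",
                               headers=headers,
                               json={"executionId": execution_id}).json()["value"]

    # 3. Ask for the (time-limited) URL of the finished zip package.
    return requests.post(actions + ".GetExportedPackageUrl", headers=headers,
                         json={"executionId": execution_id}).json()["value"]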

In summary:
  1. Both methods use a polling architecture.
  2. Both methods produce a zip file that you have to extract.
  3. Both methods require a second call to check the status or acknowledge the download.
Now, imagine if you could use the same Data Management framework but couple it with Business Events so that FinOps triggers an event whenever there is a new file. That means your other system doesn’t have to constantly poll to find out whether there are new files. On top of that, rather than getting a zip file, you export the actual file (XML, CSV, JSON, etc.). My preference is to export a JSON file. I will do a future post on how to do this (it’s actually very simple).

I have developed such a business event, and it is available on GitHub. It contains three classes, as with any business event. Just add them to your model and build.



Once you have installed the GitHub code, you should see the Data export business event.




How it works:

Create an export in the Data management workspace with the following:
  • Data project operation type = Export
  • Project category = Integration
I added these filter conditions so that not every export causes a business event to trigger.

Just schedule your export to run on a regular basis as an incremental export.

The export will generate a payload that looks like this.
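
To make it concrete, here is a minimal illustrative sketch (Python): only DownloadURL and EntityName are the fields this post relies on; the remaining envelope fields are assumptions and will vary by version.

import json

# Illustrative payload only -- DownloadURL and EntityName are the fields
# used below; the rest of the envelope is an assumption.
raw_body = """
{
    "BusinessEventId": "DataExportBusinessEvent",
    "EventTime": "2019-06-01T10:30:00Z",
    "EntityName": "Customer groups",
    "DownloadURL": "https://<blob storage>/dmf/...?<sas token>"
}
"""
event = json.loads(raw_body)
print(event["EntityName"], event["DownloadURL"])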


The main fields you need are the DownloadURL and the EntityName. The DownloadURL is generated when the event fires; by default it is valid for 1 hour. Keep that expiry in mind when using it in production.

Below is how I am using it in Flow.
The business event fires into Flow. I parse the JSON payload and check whether the EntityName equals “Customer groups”. Then I use an HTTP request with the DownloadURL to get the file.
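
The same logic, expressed as a Python sketch (the handler name and output file are hypothetical; the DownloadURL is a pre-signed link, so no extra auth header is needed):

import requests

def handle_business_event(event):
    # Mirror of the Flow: filter on EntityName, then fetch the exported file.
    if event.get("EntityName") != "Customer groups":
        return                        # ignore events for other entities
    # The DownloadURL is pre-signed and expires (one hour by default),
    # so download promptly after the event arrives.
    response = requests.get(event["DownloadURL"], timeout=60)
    response.raise_for_status()
    with open("customer_groups.json", "wb") as f:
        f.write(response.content)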


You might be wondering how I parse a JSON file using the DownloadURL. :-) You will have to wait until the next blog post.
