
Event Based Integration using Business Events and Data Management Framework

In this post I will share how I used Business Events for integration.
If you have been in the FinOps space, then you know there is a Recurring integrations pattern that uses a dequeue concept. The dequeue pattern requires that you constantly poll the FinOps URL to see if there is a file you can download:
https://{base URL}/api/connector/dequeue/{activityID}
Then you have to acknowledge the dequeue by calling a second URL:
https://{base URL}/api/connector/ack/{activityID}
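
For context, here is a minimal Python sketch of that polling loop from the external system's side. The base URL, activity ID and token are placeholders, token acquisition is omitted, and the response field names (such as DownloadLocation) should be verified against your environment before relying on them.

```python
import requests

BASE_URL = "https://yourenvironment.cloudax.dynamics.com"  # placeholder FinOps base URL
ACTIVITY_ID = "<activity ID of the recurring data job>"    # placeholder
TOKEN = "<Azure AD bearer token>"                          # acquired separately, e.g. via MSAL

headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Poll the dequeue endpoint to see whether a file is waiting.
dequeue = requests.get(f"{BASE_URL}/api/connector/dequeue/{ACTIVITY_ID}", headers=headers)

if dequeue.status_code == 200 and dequeue.text:
    message = dequeue.json()  # data message describing the queued file

    # 2. Download the zip file from the location given in the data message.
    file_response = requests.get(message["DownloadLocation"], headers=headers)
    with open("export.zip", "wb") as f:
        f.write(file_response.content)

    # 3. Acknowledge the dequeue so the same message is not served again.
    requests.post(f"{BASE_URL}/api/connector/ack/{ACTIVITY_ID}", headers=headers, json=message)
else:
    # Nothing queued yet - this is the constant polling the rest of this post avoids.
    pass
```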

Another alternative is to use the Package export method. The advantage of this one is that the external system triggers the export job on request and does the polling itself, so there is no need for a FinOps batch job to run.
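
As a rough sketch of that request-driven flow, assuming the standard data management package API (action names, request fields and token handling should be checked against your environment, and the project name and legal entity below are placeholders):

```python
import time
import requests

BASE_URL = "https://yourenvironment.cloudax.dynamics.com"  # placeholder FinOps base URL
TOKEN = "<Azure AD bearer token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}
actions = f"{BASE_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"

# Kick off the export of an existing data project on request.
export = requests.post(
    f"{actions}.ExportToPackage",
    headers=headers,
    json={
        "definitionGroupId": "MyExportProject",  # placeholder data project name
        "packageName": "MyExportProject.zip",
        "executionId": "",
        "reExecute": False,
        "legalEntityId": "USMF",
    },
)
execution_id = export.json()["value"]

# Poll the execution status until the export finishes - still a polling pattern.
status = "NotRun"
while status in ("NotRun", "Executing"):
    time.sleep(10)
    status = requests.post(
        f"{actions}.GetExecutionSummaryStatus",
        headers=headers,
        json={"executionId": execution_id},
    ).json()["value"]

# Ask for the temporary URL of the generated package (a zip file you still have to extract).
package_url = requests.post(
    f"{actions}.GetExportedPackageUrl",
    headers=headers,
    json={"executionId": execution_id},
).json()["value"]
print(package_url)
```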

In summary:
  1. Both methods use a polling architecture.
  2. Both methods produce a zip file which you have to extract.
  3. Both methods require that you make a second call to check the status or acknowledge the download.
Now, imagine if you could use the same Data management framework but couple it with Business Events so that FinOps triggers an event whenever there is a new file. That means your other system doesn’t have to constantly poll to find out whether there are new files. On top of that, rather than getting a zip file, you export the actual file (XML, CSV, JSON etc.). My preference is to export a JSON file. I will do a future post on how to do this (it’s actually very simple).

I have developed a business event, and it is available on GitHub. It contains three classes, as with any business event. Just add it to your model and build.



Once you have installed the GitHub code, you should see the Data export business event.




How it works:

Create an export in the Data management workspace with the following:
  • Data project operation type = Export
  • Project category = Integration
I added these filter conditions so that not every export triggers a business event.

Just schedule your export to run on a regular basis as an incremental export.

The export will generate a payload that looks like this.


The main fields you need are the DownloadURL and the EntityName. The DownloadURL is generated when the event fires and, by default, it is valid for 1 hour. Keep this in mind when using it in production.
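
Since the payload screenshot is not reproduced here, the shape below is purely illustrative: only DownloadURL and EntityName come from the description above, and the remaining field names are assumptions based on the standard business event envelope, so check the actual payload in your environment.

```python
# Illustrative only - verify against a real payload from your environment.
sample_payload = {
    "BusinessEventId": "<business event class ID>",    # assumption: standard envelope field
    "EventId": "<GUID>",                                # assumption: standard envelope field
    "EventTime": "<event timestamp>",                   # assumption: standard envelope field
    "EntityName": "Customer groups",                    # the exported data entity
    "DownloadURL": "https://<storage>/...<SAS token>",  # temporary link, ~1 hour by default
}
```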

Below is how I am using it in Flow.
The Business Event fires to Flow. I read the JSON payload and check whether the EntityName is equal to “Customer groups”. Then I use an HTTP request with the DownloadURL to get the file.
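
If you prefer to see the same logic as code rather than a Flow screenshot, here is a minimal Python sketch of a handler doing the equivalent. It assumes the payload has already been delivered to your endpoint as parsed JSON; only the DownloadURL and EntityName fields are used, and the output file name is just an example.

```python
import requests

def handle_data_export_event(payload: dict) -> None:
    """Mirror of the Flow steps: filter on EntityName, then fetch the file."""
    if payload.get("EntityName") != "Customer groups":
        return  # not the export we are interested in

    # The DownloadURL is temporary (valid for about an hour by default),
    # so fetch the exported file straight away.
    response = requests.get(payload["DownloadURL"])
    response.raise_for_status()
    with open("customer_groups.json", "wb") as f:  # example output file name
        f.write(response.content)
```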


You might be wondering how I parse a JSON file using the DownloadURL. :-) You will have to wait until the next blog post.
