
Event Based Integration using Business Events and Data Management Framework

In this post I will share how I used Business Events for integration.
If you have been in the FinOps space, then you know there is a Recurring Integrations pattern that uses a dequeue concept. The dequeue pattern requires that you constantly poll the FinOps URL to see if there is a file you can download:
https://{base URL}/api/connector/dequeue/{activityID}
Then you have to acknowledge the dequeue by calling a second URL:
https://{base URL}/api/connector/ack/{activityID}
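
To make the polling overhead concrete, here is a minimal sketch of that dequeue/ack loop in Python. It assumes you have already acquired an OAuth bearer token, and the response field name DownloadLocation is taken from the standard dequeue response, so verify it against your own environment.

```python
# Minimal sketch of the Recurring Integrations dequeue/ack polling loop.
# BASE_URL, ACTIVITY_ID and the bearer token are placeholders.
import requests

BASE_URL = "https://yourenvironment.dynamics.com"   # placeholder
ACTIVITY_ID = "<your-activity-id>"                  # placeholder
HEADERS = {"Authorization": "Bearer <token>"}       # placeholder token

# 1. Poll the dequeue endpoint to see if a file is ready.
resp = requests.get(f"{BASE_URL}/api/connector/dequeue/{ACTIVITY_ID}",
                    headers=HEADERS)

if resp.status_code == 200 and resp.text:
    message = resp.json()
    # DownloadLocation comes from the standard dequeue response;
    # verify the field name in your environment.
    download = requests.get(message["DownloadLocation"], headers=HEADERS)
    with open("package.zip", "wb") as f:
        f.write(download.content)

    # 2. Acknowledge the dequeue (posting the dequeue response back)
    #    so the same message is not returned again.
    requests.post(f"{BASE_URL}/api/connector/ack/{ACTIVITY_ID}",
                  headers=HEADERS, json=message)
```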

Another alternative is the Package export method. The advantage of this one is that the external system does the polling and the export job runs on request. There is no need for a FinOps batch job to run.
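
For comparison, here is a hedged sketch of that request-driven flow using the standard DMF package REST actions (ExportToPackage, GetExecutionSummaryStatus, GetExportedPackageUrl). The data project name, legal entity and polling interval are placeholders, not values from this post.

```python
# Sketch of the Package export method: request the export on demand,
# poll for completion, then fetch the package URL.
import time
import requests

BASE_URL = "https://yourenvironment.dynamics.com"   # placeholder
HEADERS = {"Authorization": "Bearer <token>",       # placeholder token
           "Content-Type": "application/json"}
ACTIONS = (f"{BASE_URL}/data/DataManagementDefinitionGroups"
           ".Microsoft.Dynamics.DataEntities")

# Kick off the export for an existing data project.
resp = requests.post(f"{ACTIONS}.ExportToPackage", headers=HEADERS, json={
    "definitionGroupId": "CustGroupsExport",  # assumed data project name
    "packageName": "CustGroupsExport.zip",
    "executionId": "",
    "reExecute": False,
    "legalEntityId": "USMF",                  # assumed legal entity
})
execution_id = resp.json()["value"]

# Poll until the execution finishes.
while True:
    status = requests.post(f"{ACTIONS}.GetExecutionSummaryStatus",
                           headers=HEADERS,
                           json={"executionId": execution_id}).json()["value"]
    if status not in ("NotRun", "Executing"):
        break
    time.sleep(5)

# Retrieve the temporary URL of the exported zip package and download it.
url = requests.post(f"{ACTIONS}.GetExportedPackageUrl", headers=HEADERS,
                    json={"executionId": execution_id}).json()["value"]
package = requests.get(url)  # the blob URL is pre-authorized via SAS token
```

Note this still produces a zip package that you have to extract, which leads to the summary below.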

In summary:
  1. Both methods use a polling architecture.
  2. Both methods produce a zip file, which you then have to extract.
  3. Both methods require a second call to check the status or acknowledge the download.
Now, imagine if you could use the same Data Management framework but have it coupled with Business Events, so that FinOps triggers an event whenever there is a new file. That means your other system doesn't have to constantly poll to find out if there are new files. On top of that, rather than getting a zip file, you export the actual file (XML, CSV, JSON, etc.). My preference is to export a JSON file. I will do a future post on how to do this (it's actually very simple).

I have developed a business event, and it is available on GitHub. It contains three classes, as with any business event. Just add it to your model and build.



Once you have installed the GitHub code, you should see the Data export business event.




How it works:

Create an export in the Data management workspace with the following:
  • Data project operation type = Export
  • Project category = Integration
I put these filter conditions in so that not every export causes a business event to trigger.

Just schedule your export to run on a regular basis as an incremental export.

The export will generate a payload that looks like this:
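
For reference, here is an illustrative sketch of the payload shape. Only DownloadURL and EntityName are described in this post; every other field and all the values below are assumptions, so compare against the payload your environment actually emits.

```json
{
  "BusinessEventId": "DataExportBusinessEvent",
  "EntityName": "Customer groups",
  "DownloadURL": "https://<storage-account>.blob.core.windows.net/<container>/<file>?<sas-token>",
  "EventId": "00000000-0000-0000-0000-000000000000",
  "EventTime": "/Date(1589420400000)/",
  "MajorVersion": 0,
  "MinorVersion": 0
}
```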


The main fields you need are the DownloadURL and the EntityName. The DownloadURL is formed when the event fires and, by default, it is valid for one hour. Keep that in mind when using it in production.

Below is how I am using it in Flow.
The Business Event fires into Flow. I parse the JSON payload and check whether the EntityName is equal to "Customer groups". Then I use an HTTP request with the DownloadURL to get the file.
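
If you prefer to see the same logic outside Flow, here is a minimal sketch (for example, in an Azure Function or any webhook receiver). It relies only on the DownloadURL and EntityName fields; everything else is an assumption.

```python
# The same logic as the Flow, sketched in Python: parse the business event
# payload, filter on the entity, then fetch the file via the download URL.
import json
import requests

def handle_business_event(body: str) -> None:
    payload = json.loads(body)

    # Only act on the entity we care about, mirroring the Flow condition.
    if payload.get("EntityName") != "Customer groups":
        return

    # The DownloadURL typically carries a SAS token (valid ~1 hour by
    # default), so a plain GET with no extra auth header should suffice.
    resp = requests.get(payload["DownloadURL"])
    resp.raise_for_status()

    with open("customer_groups.json", "wb") as f:
        f.write(resp.content)
```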


You might be wondering how I parse a JSON file using the DownloadURL. :-) You will have to wait until the next blog post.
