
Recurring import General Journal file using Microsoft Flow #MicrosoftFlow #MSDyn365FO

Microsoft Flow is a simple and cost-effective way of building integrations. In this post I will walk through how to use Flow for a recurring file integration. The most common scenario I can think of is the general journal import.

First, I would recommend reading Microsoft's article on recurring integrations.

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/recurring-integrations

The solution below shows you how to use Microsoft Flow to read a file from OneDrive for Business and import it into FinOps. The exact same thing can be done with Logic Apps for a more enterprise-managed solution.

Let's start by setting up the folder structure to drop our general journal files into. Create four folders like so:

  • inbound – drop the files here to be processed
  • processing – Flow will move the file here from the inbound folder while executing
  • success – Flow will move it here when the file has successfully been imported and processed
  • error – Flow will move it here if the file fails to process for any reason

Now, in the Microsoft Flow designer, add a “When a file is created” trigger and select the inbound folder.


In the next step, I used an “Initialize variable” action so that I can quickly change settings later in the flow. Otherwise, the flow can get messy to maintain and move around.

You might ask why I used an Object type rather than a String. With a string, I can set only a single value. With an object, however, I can use a JSON string, which allows me to set up multiple variables. (Let me know if this can be done in a better way.)
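
To make this concrete, here is a rough sketch of what the JSON string in the object variable might look like. The property names and values below are my own illustration, not the ones from the original screenshot:

    {
      "environmentUrl": "https://<your FinOps instance>.operations.dynamics.com",
      "activityId": "<recurring data job activity ID>",
      "entityName": "<entity name for the enqueue call>"
    }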


Next, I parsed the JSON string with a “Parse JSON” action. You can set this up easily by copying the JSON string from above, clicking “Use sample payload to generate schema”, and pasting it in. The schema is generated for you.
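
For a sample payload like the one above, the generated schema would look roughly like this:

    {
      "type": "object",
      "properties": {
        "environmentUrl": { "type": "string" },
        "activityId": { "type": "string" },
        "entityName": { "type": "string" }
      }
    }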


The next two steps move the file from the inbound folder to the processing folder. However, OneDrive for Business doesn't have a Move action, so I had to use a Create file and a Delete file action as two separate steps. If you are using personal OneDrive, you will see a Move file action.

Once it's moved, I use the “Get file content” action to read the file.


Next, I used an HTTP action to POST the file to the enqueue endpoint, using the variables that were initialized earlier.

In the header, I am setting the external identifier to the name of the file. This is done so that in FinOps we can identify the job by name.
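
Put together, the request looks roughly like the sketch below. The enqueue URL pattern comes from the recurring integrations article linked above; the ExternalIdentifier header name and the exact variable wiring are my assumptions, since the original screenshot is not reproduced here:

    POST https://<base URL>/api/connector/enqueue/<activity ID>?entity=<entity name>
    Authorization: Bearer <Azure AD access token>
    ExternalIdentifier: <file name from the trigger>    (header name is an assumption)

    <body: the output of the "Get file content" action>

The base URL, activity ID, and entity name values come from the parsed variables object.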


In FinOps, the enqueued job then shows up under the recurring data job, identified by the file name.

We are done sending the file to FinOps. The next steps do some clean-up by moving the file to the success or error folder. You could also email someone when a file is processed or when it errors out. You can get fancy.


However, I will show an example of what you can do. Below, I check every minute, for up to 60 minutes, whether the job has been processed in FinOps. You can change the interval.

I added a “Do until” action. This is a loop that executes at an interval until a condition is met or a time limit is reached.
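
Under the hood, a “Do until” action in the workflow definition carries a limit object with a count and a timeout. A once-a-minute check for up to an hour would look roughly like this sketch (assuming a one-minute Delay action inside the loop; the action names are illustrative):

    "Until_job_is_finished": {
      "type": "Until",
      "expression": "<the @or expression shown below>",
      "limit": {
        "count": 60,
        "timeout": "PT1H"
      },
      "actions": { "Delay": { ... }, "HTTP_2": { ... } }
    }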


The advanced-mode query used in the Do until condition is pasted below for your convenience. It's just an “or” condition that checks whether the job has reached any of these states.

@or(
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'Processed'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'ProcessedWithErrors'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'PostProcessingError'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'PreProcessingError')
    )
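
For this expression to work, the status-check HTTP action (HTTP_2 above) must return a body shaped roughly like the following. This is inferred from the expression, not copied from the original post:

    {
      "DataJobStatus": {
        "DataJobState": "Processed"
      }
    }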

Initially, the job is Queued, then moves to In Process, and finally goes to Processed if it is successful.

All the statuses are listed here:

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/recurring-integrations

Next, I added a Condition to check the job status. If it is Processed, then I know it was successful and can move the file to the success folder. Otherwise, it goes to the error folder.
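
The condition can reuse the same expression pattern as the loop, for example:

    equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'Processed')

If this evaluates to true, the “If yes” branch moves the file to the success folder; the “If no” branch moves it to the error folder.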


I will stop here, as the next few steps are pretty much a repeat of moving a file in OneDrive.

Why don’t you take it to the next level and check out Ludwig’s blog post on automating the posting of the general journal?

https://dynamicsax-fico.com/2016/08/17/automatic-posting-of-journals/
