
Posts

Showing posts with the label Flow

Business Events and Date format in Flow

Business Events formats dates in the Microsoft JSON format, e.g. "EventTime": "/Date(1560839609000)/". I wish it were in the ISO 8601 standard, e.g. "2019-06-18T05:40Z". Below is what I did using Flow.

First, get the integer part of the string by using the replace function.
Function: int(replace(replace('/Date(1560839609000)/','/Date(',''), ')/', ''))
Output: 1560839609000

To format into a date:
Function: addseconds('1970-1-1', Div(1560839609000,1000), 'yyyy-MM-dd')
Output: 2019-06-18

To format into a datetime:
Function: addseconds('1970-1-1', Div(1560839609000,1000), 'yyyy-MM-dd hh:mm:ss')
Output: 2019-06-18 06:33:29

Using an online converter I was able to validate my output: https://www.epochconverter.com/ After that, you can use the date-time string or the formatDateTime function. For the developers out there: Newtonsoft is great for working with dates and supp...
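
The same logic can be checked outside of Flow. Below is a minimal Python sketch that mirrors the expressions above (strip the /Date( )/ wrapper, divide the milliseconds by 1000, add the seconds to the Unix epoch); only the input format and the example value come from the post, the rest is illustrative.

```python
from datetime import datetime, timedelta, timezone

def ms_json_date_to_iso(value: str) -> str:
    """Convert a Microsoft JSON date like '/Date(1560839609000)/' to ISO 8601."""
    # Strip the '/Date(' prefix and ')/' suffix, as the nested replace() calls do in Flow.
    millis = int(value.replace("/Date(", "").replace(")/", ""))
    # Add whole seconds to the Unix epoch, as addseconds('1970-1-1', Div(ms, 1000)) does.
    dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=millis // 1000)
    return dt.strftime("%Y-%m-%dT%H:%M:%SZ")

print(ms_json_date_to_iso("/Date(1560839609000)/"))  # 2019-06-18T06:33:29Z
```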

Data Management Export - XML to JSON Transformation

In my last post, I wrote about event-based integration using Business Events. I used JSON as my export file type; JSON is a lot easier to work with in Microsoft Flow or Azure Logic Apps. Below is how I achieved it. The Data Management framework doesn’t do JSON by default. However, it does do the XML file format. After a bit of googling and trial and error, I found this XSLT code that transforms XML to JSON: https://gist.github.com/bojanbjelic/1632534 Here is the author’s blog post, to give credit: https://www.bjelic.net/2012/08/01/coding/convert-xml-to-json-using-xslt/#code

Setup
Under the Data management workspace, open the Source data format form. Create a new record called JSON and set the default extension to json.
File format = XML
XML Style = Attribute
Root element = Document (I left this as default)
Create a new Export and select your entity. In the Source data format, select the JSON record that was created in the previous step. Now click on the View map icon. In the mapping...
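
If you want to preview what the linked stylesheet produces before wiring it into the Data Management mapping, a quick local check is possible with lxml. This is only a sketch under assumptions: the XSLT from the gist is saved as xml-to-json.xsl and a sample export is saved as export.xml (both hypothetical file names).

```python
from lxml import etree

# XSLT stylesheet from the linked gist (hypothetical local file name).
stylesheet = etree.parse("xml-to-json.xsl")
transform = etree.XSLT(stylesheet)

# Sample XML export from the Data Management framework (hypothetical file name).
document = etree.parse("export.xml")
result = transform(document)

# Print the JSON text produced by the transform for a quick sanity check.
print(str(result))
```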

Event Based Integration using Business Events and Data Management Framework

In this post I will share how I used Business Events for integration. If you have been in the FinOps space, then you know that there is a Recurring integration pattern that uses a dequeue concept. The dequeue pattern requires that you constantly poll the FinOps URL to see if there is a file you can download: https://{base URL}/api/connector/dequeue/{activityID} Then you have to acknowledge the dequeue by calling a second URL: https://{base URL}/api/connector/ack/{activityID} Another alternative is to use the Package export method. The advantage of this one is that the external system does the polling and executes the export job on request; there is no need for the FinOps batch job to run. In summary:
Both methods use a polling architecture.
Both methods produce a zip file which you have to extract.
Both methods require that you make a second call to check the status or acknowledge the download.
Now, imagine if you could use the same Data Management framework but have it coupled...
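
For context, a rough Python sketch of the dequeue/acknowledge polling pattern described above (not the Business Events approach itself). The base URL, activity ID, and token are placeholders, and the shape of the dequeue response (a DownloadLocation field) is an assumption made only to illustrate the sequence of calls.

```python
import requests

BASE_URL = "https://yourenvironment.cloudax.dynamics.com"   # placeholder
ACTIVITY_ID = "00000000-0000-0000-0000-000000000000"        # placeholder
HEADERS = {"Authorization": "Bearer <token>"}                # obtained via Azure AD

# 1. Poll: ask FinOps whether an export file is ready to download.
dequeue = requests.get(f"{BASE_URL}/api/connector/dequeue/{ACTIVITY_ID}", headers=HEADERS)

if dequeue.status_code == 200 and dequeue.text:
    message = dequeue.json()
    # 2. Download the zip package the export job produced (field name assumed).
    package = requests.get(message["DownloadLocation"], headers=HEADERS)
    with open("export.zip", "wb") as f:
        f.write(package.content)
    # 3. Acknowledge the dequeue so the same file is not handed out again.
    requests.post(f"{BASE_URL}/api/connector/ack/{ACTIVITY_ID}",
                  headers=HEADERS, json=message)
```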

Recurring import General Journal file using Microsoft Flow #MicrosoftFlow #MSDyn365FO

Microsoft Flow is a simple and cost-effective way of integrating. In this post I will walk through how to use Flow for recurring file integration. The most common scenario I can think of is the general journal import. First, I would recommend reading Microsoft’s article on recurring integration: https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/recurring-integrations The solution below shows you how to use Microsoft Flow to read from OneDrive for Business and import to FinOps. The exact same thing can be done using Logic Apps for a more enterprise-managed solution. Let’s start by setting up our folder structure to drop our general journal files in. Create four folders like so:
inbound – drop the files here to be processed
processing – Flow will move the file here from the inbound folder while executing
success – Flow will move it here when the file has successfully been imported and processed
error – Flow will move it here if the file fails to process...
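
To make the folder flow concrete, here is a small local Python sketch of the same state machine, not the Flow itself. The import_to_finops() function is a hypothetical placeholder standing in for the Flow action that posts the file to FinOps.

```python
import shutil
from pathlib import Path

BASE = Path("general-journal")
INBOUND, PROCESSING = BASE / "inbound", BASE / "processing"
SUCCESS, ERROR = BASE / "success", BASE / "error"

def import_to_finops(path: Path) -> bool:
    """Placeholder for the actual import call made by the Flow."""
    raise NotImplementedError

for file in INBOUND.glob("*"):
    # Move the file to 'processing' while the import runs, as the Flow does.
    working = PROCESSING / file.name
    shutil.move(file, working)
    try:
        ok = import_to_finops(working)
        # On success move to 'success', otherwise to 'error'.
        shutil.move(working, (SUCCESS if ok else ERROR) / file.name)
    except Exception:
        # Any failure lands the file in 'error' for investigation.
        shutil.move(working, ERROR / file.name)
```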

Integration - Create a lead in CRM via a web service in 10 minutes

CRM (Microsoft Dynamics Online – not AX CRM) has an SDK which you could use to integrate with. It can be overwhelming sometimes. So, I decided to use Flow to do the communication for me (HTTP Request > Dynamics). It took me 10 minutes from start to finish. I didn’t have to learn the CRM SDK or figure out how to do authentication, etc. I wanted to send a simple JSON message like this:
{
  "Email": "munib@fakeemail.com",
  "FirstName": "Munib",
  "LastName": "Ahmed",
  "Topic": "Health"
}
Go to Flow and create a new HTTP request. Click on “Use sample payload to generate schema” and enter the above JSON message. It will generate a schema as per the screenshot below. Take note of the HTTP POST URL that has been generated; we will use that later to send the message to. In the Actions, select Dynamics > “Create a new record”. Select Leads as the entity name. Map the fields and you are done. Now we just ...
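
An external system could then call the Flow with nothing more than an HTTP POST. A minimal Python sketch, assuming the generated HTTP POST URL has been copied from the trigger (the value below is a placeholder), with the payload taken from the post:

```python
import requests

# Placeholder: paste the "HTTP POST URL" generated by the Flow trigger here.
FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/..."

payload = {
    "Email": "munib@fakeemail.com",
    "FirstName": "Munib",
    "LastName": "Ahmed",
    "Topic": "Health",
}

# Send the lead to the Flow, which creates the record in Dynamics.
response = requests.post(FLOW_URL, json=payload)
print(response.status_code)   # 202 Accepted is typical when the Flow has no Response action
```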