Tuesday, 18 June 2019

Event Based Integration using Business Events and Data Management Framework

In this post I will share how I used Business Events for integration.
If you have been in the FinOps space, then you know there is a recurring integration pattern built on a dequeue concept. The dequeue pattern requires that you constantly poll a FinOps URL to see whether there is a file you can download.
https://{base URL}/api/connector/dequeue/{activityID}
Then you have to acknowledge the dequeue by calling a second URL
https://{base URL}/api/connector/ack/{activityID}
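As a sketch, the polling side of this pattern looks roughly like the code below. This is a simplified illustration, not the exact recurring integrations contract: `http_get` stands in for an authenticated HTTP GET (in practice you would use an OAuth bearer token), and the `DownloadLocation` field name and the plain GET acknowledgement are assumptions for illustration.

```python
import json

# Minimal sketch of the dequeue/ack polling pattern described above.
# `http_get` stands in for an authenticated HTTP GET; the "DownloadLocation"
# field name and the plain-GET acknowledgement are simplifying assumptions.

def dequeue_next_package(http_get, base_url, activity_id):
    """Poll the dequeue URL; if a package is ready, download and acknowledge it."""
    body = http_get(f"{base_url}/api/connector/dequeue/{activity_id}")
    if not body:
        return None                                  # nothing queued yet
    info = json.loads(body)
    package = http_get(info["DownloadLocation"])     # the exported zip file
    http_get(f"{base_url}/api/connector/ack/{activity_id}")
    return package

# In-memory stub standing in for the FinOps endpoints, just to show the flow.
def fake_http_get(url):
    if "/dequeue/" in url:
        return json.dumps({"DownloadLocation": "https://blob/package.zip"})
    if "/ack/" in url:
        return ""
    return b"zip-bytes"

print(dequeue_next_package(fake_http_get, "https://base", "1234"))
```

The stub makes the shape of the conversation visible: one poll, one download, one acknowledgement per package.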

An alternative is to use the package export method. The advantage of this one is that the external system does the polling and executes the export job on request, so there is no need for a FinOps batch job to run.

In summary:
  1. Both methods use a polling architecture.
  2. Both methods produce a zip file that you have to extract.
  3. Both methods require a second call to check the status or acknowledge the download.
Now, imagine if you could use the same Data Management framework but couple it with Business Events so that FinOps triggers an event whenever there is a new file. That means your other system doesn’t have to constantly poll to find out if there are new files. On top of that, rather than getting a zip file, you export the actual file (XML, CSV, JSON etc.). My preference is to export a JSON file. I will do a future post on how to do this (it’s actually very simple).

I have developed a business event, and it is available on GitHub. As with any business event, it contains three classes. Just add them to your model and build.

Once you have installed the GitHub code, you should see the Data export business event.

How it works:

Create an export in the Data management workspace with the following:
  • Data project operation type = Export
  • Project category = Integration
I put these filter conditions so that not all exports cause a business event to trigger.

Just schedule your export to run on a regular basis as an incremental export.

The export will generate a payload that looks like this.

The main fields you need are the DownloadURL and the EntityName. The DownloadURL is generated when the event fires and, by default, is valid for one hour. Keep that in mind when using it in production.

Below is how I am using it in Flow.
The Business Event fires into Flow. I read the JSON payload and check whether the EntityName equals “Customer groups”. Then I use an HTTP request with the DownloadURL to get the file.
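In code terms, the Flow steps above boil down to something like this sketch. The sample payload shows only the two fields used here; real business event payloads carry more fields, and `fetch` stands in for the HTTP request.

```python
import json

# Sketch of what the Flow does: parse the business event payload, filter on
# EntityName, then fetch the exported file from the DownloadURL. `fetch`
# stands in for an HTTP GET; the sample payload below shows only the two
# fields used here -- real event payloads contain more.

def handle_export_event(payload_json, fetch, wanted_entity="Customer groups"):
    event = json.loads(payload_json)
    if event.get("EntityName") != wanted_entity:
        return None                     # some other export; ignore it
    # The DownloadURL is only valid for a limited time (one hour by default),
    # so fetch the file as soon as the event arrives.
    return fetch(event["DownloadURL"])

sample = json.dumps({"EntityName": "Customer groups",
                     "DownloadURL": "https://blob/customergroups.json"})
print(handle_export_event(sample, lambda url: '{"CustomerGroupId": "10"}'))
```

Because the event pushes to you, this handler only runs when there is actually a file, which is the whole point of moving away from polling.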

You might be wondering how I parse a JSON file using the DownloadURL. :-) You will have to wait till next blog post.

Tuesday, 23 April 2019

Detailed guide on creating Business Events with Azure Service Bus

I have been working with the new Business Events feature released in FinOps. Before you start, you should read the docs site first.
This blog post focuses primarily on setting up Azure Service Bus endpoint. Setting up the Azure services can be tricky if you are not familiar with Azure Key Vault and application registrations.

I have sequenced the post so that you don't have to jump around. There are four key elements to this:
  1. Create the app registration
  2. Create the service bus
  3. Create the key vault secret
  4. Configure FinOps

Create an App Registration

In the Azure Portal, navigate to the Azure Active Directory menu. Click on App registrations (there is the old one and the preview menu - they are the same but the UI is a bit different). I will show the original App registrations way.

Create a new Web app/API registration and give it a name. In our case, it doesn’t really matter what the sign-on URL is.

Take note of the Application ID as you will need it later for setting up the Business Event.

Under the Keys menu, create a new secret key. Copy the value and keep it as you will need it later.

Once the setup is done, click on Required permissions and then the Grant permissions button. This has to be done by an administrator. If you don’t grant permissions, you might get an error like “Invalid client secret is provided.”

Create a service bus

Search for service bus in the search bar. Then create a new service bus.

On the create menu, give it a name and select a pricing tier. Take note of both as they will be required later.

Once it is created, click on the Queues to create a new queue.

Give it a name and click Create. Take note of the name as it will be required later.

Next, we need to get the connection string. This is required when setting up business events in FinOps. Click on Shared access policies and then select the RootManageSharedAccessKey policy. Copy the primary connection string.
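The primary connection string follows the standard Service Bus shape, with your own namespace and key substituted in (placeholders shown in braces):

```
Endpoint=sb://{your-namespace}.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey={your-key}
```

This whole string is what goes into the key vault secret in the next section.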

Create key vault secret

Now to create the key vault secret. Key vault will hold our connection string to the Azure service bus.
I usually use the search bar on the Azure Portal to find the key vaults menu.

Create a new key vault and give it a name.

Once it is created, take note of the DNS name. We will need it later.

Under the Secrets menu, click on “Generate/Import”.

Give it a name and paste the connection string to the Azure Service bus. Take note of the name you entered. You will need it later in FinOps.

Give the application registration access to the key vault. Under the key vault, go to Access policies and click on Add new.

Select the template “Key, Secret & Certificate Management”.
Click on “Select principal”, search for the application registration we created earlier, and select it.

You will have something like this. Just click on the save button.

Configure FinOps

The Business events menu has now moved to the System administration menu. Open up the business events form and start your set up.

When the form opens, click on Endpoints to create a new endpoint. Select Azure Service Bus Queue as the endpoint type and give it a name. This is where all those important strings you copied earlier are important.
  • Queue name - the Azure Service bus queue name you gave it
  • Service Bus SKU - the Azure Service bus pricing tier
  • Azure Active Directory application ID - this is the Application ID under the Application registration properties
  • Azure application secret - the secret key that was generated under the application registration
  • Key Vault DNS name - Under the Key vault there was a property DNS name
  • Key Vault secret name - the name you gave the secret
Once all these values are set, click on OK. If something is wrong, you will generally get a meaningful error that you can act on.

Now that the endpoint is created, we will activate a business event against it. Click on Business event catalog and select the event. In my case, I selected the Free text invoice posted event.
The Activate menu will let you select a company and the endpoint we created above.

Schedule the business events batch job

Under System administration > Periodic tasks > Business events, click on Start business events batch job.
You have to schedule it as a recurring batch job; don’t just run it once. Without this, nothing will be sent to the business events endpoint.

Create a free text invoice and post it

I am not going to write anything here. You can figure this one out.

Result of the service bus queue

Using Service Bus Explorer we can see the message.

What happens in the back end

Business events essentially create a record in the BUSINESSEVENTSCOMMITLOG table. The record gets deleted once the batch job picks it up and sends the event to the selected endpoint.

Thursday, 4 April 2019

Regression Suite Automation Tool (RSAT) via command line

The recent RSAT release supports running via the command line: just call the ConsoleApp executable with the right commands. This is great for automation scenarios. Normally, you would have all your test plans prepared and just want to execute them on a regular basis.

Below is a quick summary of what it covers.

To get the list of available commands, just enter ? at the end.

"C:\Program Files (x86)\Regression Suite Automation Tool\Microsoft.Dynamics.RegressionSuite.ConsoleApp.exe" ?

To get the list of test case IDs, just run the list command.
"C:\Program Files (x86)\Regression Suite Automation Tool\Microsoft.Dynamics.RegressionSuite.ConsoleApp.exe" list

To run a set of test cases, use the playbackbyid command followed by the IDs.
"C:\Program Files (x86)\Regression Suite Automation Tool\Microsoft.Dynamics.RegressionSuite.ConsoleApp.exe" playbackbyid 104 105 106
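If you want to schedule these runs (say, from Task Scheduler or a build pipeline), a thin wrapper script is enough. This is a hypothetical sketch, not part of the tool; the executable path is just the default install location shown above.

```python
import subprocess

# Hypothetical wrapper around the RSAT console app for scheduled runs.
RSAT_EXE = (r"C:\Program Files (x86)\Regression Suite Automation Tool"
            r"\Microsoft.Dynamics.RegressionSuite.ConsoleApp.exe")

def rsat_command(action, ids=()):
    """Build the console-app argument list, e.g. playbackbyid 104 105 106."""
    return [RSAT_EXE, action, *[str(i) for i in ids]]

def run_rsat(action, ids=()):
    """Run RSAT and report success via the process exit code."""
    return subprocess.run(rsat_command(action, ids)).returncode == 0
```

A scheduler would then call something like `run_rsat("playbackbyid", [104, 105, 106])` and alert on a False result.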

Monday, 1 April 2019

Embed PowerApps in Modern POS

A feature was released around version 8.1.3 to open a URL in POS.

Follow the guide to add a button to POS via the Screen Layouts. In the Action, select Open URL.
Enter the PowerApps URL. In this case I want it to open embedded rather than popping up in a new window.
Run your distribution job to have it show up in POS.

Below is an example I am working on. We have an external system managing licensing.

Just note that it will pop up with a Microsoft login screen. You can tick the option to stay signed in.

If the screen doesn’t fit in the POS window, you can click “Fit to screen” at the top right.

Hope this gives you ideas to take advantage of in the Retail space. The possibilities are endless.

Wednesday, 20 February 2019

Regression Suite Automation Tool - Create unique values

RSAT doesn’t have a rollback feature, so it can be frustrating to keep updating the Excel file with unique records.

Well, not to worry: RSAT supports Excel formulas. The user guide suggests using the RANDBETWEEN(a,b) Excel function.
However, that isn’t really reliable; you are just hoping you don’t get the same random number twice.

I have been using a date formula instead, which concatenates the current datetime onto the string (for example, something like ="CUST" & TEXT(NOW(), "yyyymmddhhmmss")).

If you want to get started with RSAT, download the tool and the user guide from here.

You can also watch a short video that walks you through how to use it.

Wednesday, 23 January 2019

Azure DevOps Release Pipeline–Walkthrough

It is a great start to 2019. Joris from Microsoft has welcomed the year with the release of the Azure DevOps Release Pipeline task on the marketplace.
I thought I would do a walkthrough for those that haven’t had a chance to play with it yet.

New release pipeline
In Azure DevOps, click on the New release pipeline.

You will get an option to select from a template. Just select “Empty Job”.
In the first stage, make sure the Agent job is using “Hosted VS 2017”.

In the Agent job click on the + icon to add a task. Select the LCS Asset Upload task.
If you don’t see it, then it isn’t installed. Just select the “Dynamics 365 Unified Operations Tools”
link at the bottom, or install it from here.

Now that you have the first task added. Fill in the details. You will have to add a
new connection to LCS. I need to investigate the tags to give it a better name and description.

When you click on the new LCS connection, you will get this dialog. Most of it is defaulted for you. Enter the username and password.
Client ID (Application ID) can be created in Azure Portal.
You need permissions to “Dynamics Lifecycle services”. Make sure it is a
Native application type. Hopefully it looks like the screenshot below.
Don’t forget to click on “Grant permissions”.
Next, we go back to the main Pipeline screen and select our artifact.
Click on the Add an artifact tile. Select build and fill in the details.
What we want to do in this scenario is release the latest build.

To set up a trigger that fires on a build, click on the lightning icon. You can enter
filters to ensure only your release branch gets uploaded to LCS.
You can run it manually to test it out: click on Create a release.
A successful run should look like this.
It is pretty satisfying to see the result on LCS.

Friday, 18 January 2019

Data Entity stuck “In Process”

There are a few reasons a scheduled data job may not execute. I made a silly mistake, and it seemed as though the jobs were stuck on “In process”.
Things I checked:
1. Check the job isn’t disabled. See screenshot below. There is a toggle.
2. Check your recurring batch job is scheduled
3. Check you haven’t made any mistakes when enqueuing the entity (I made a typo here)
I made the mistake of copying the URL and not changing the entity name. See the highlighted part in the URL.
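For reference, the enqueue URL I was copying has this general shape; the entity name sits in the query string, which is exactly where my typo crept in:

```
https://{base URL}/api/connector/enqueue/{activityID}?entity={entity name}
```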

Below is a screenshot of the record info. You can see the Entity field contains the string I passed via the enqueue URL.

Sounds pretty simple, but hopefully it helps someone out there.