Wednesday, 14 November 2018

Recurring import of a General Journal file using Microsoft Flow #MicrosoftFlow #MSDyn365FO

Microsoft Flow is a simple and cost-effective integration tool. In this post I will walk through how to use Flow for recurring file integration. The most common scenario I can think of is the general journal import.

First, I would recommend reading Microsoft's article on recurring integrations:

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/recurring-integrations

The solution below shows how to use Microsoft Flow to read a file from OneDrive for Business and import it into FinOps. The exact same thing can be done with Logic Apps for a more enterprise-managed solution.

Let's start by setting up the folder structure our general journal files will be dropped into. Create four folders like so:

  • inbound – drop the files here to be processed
  • processing – Flow will move the file here from the inbound folder while executing
  • success – Flow will move it here when the file has successfully been imported and processed
  • error – Flow will move it here if the file fails to process for any reason

Now, in the Microsoft Flow designer, add a “When a file is created” trigger and select the inbound folder.

[Screenshot: “When a file is created” trigger pointing at the inbound folder]

In the next step, I used an “Initialize variable” action so that settings can be changed quickly in one place later in the flow. Otherwise, the flow can get messy to maintain and move around.

You might ask why I used an Object type rather than a String. A String variable can hold only a single value, whereas an Object lets me store a JSON string containing multiple settings. (Let me know if this can be done in a better way.)
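For illustration, the settings object might look something like the sketch below. The field names and values here are my own invention (the original screenshot is not shown), so adapt them to your environment and recurring data job:

    {
        "environmentUrl": "https://myenvironment.cloudax.dynamics.com",
        "activityId": "<recurring data job activity ID>",
        "entityName": "General journal",
        "company": "USMF"
    }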

[Screenshot: “Initialize variable” action holding the JSON settings string]

Next, I parsed the JSON string with a “Parse JSON” action. The easiest way to set it up is to copy the JSON string from above, click “Use sample payload to generate schema”, and paste it in. The schema is generated for you.
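Assuming the sample settings above, the generated schema would look something like this:

    {
        "type": "object",
        "properties": {
            "environmentUrl": { "type": "string" },
            "activityId": { "type": "string" },
            "entityName": { "type": "string" },
            "company": { "type": "string" }
        }
    }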

[Screenshot: “Parse JSON” action with the generated schema]

The next two steps move the file from the inbound folder to the processing folder. However, the OneDrive for Business connector doesn't have a Move action, so I had to use a “Create file” and a “Delete file” action as two separate steps. (If you are using personal OneDrive, you will see a “Move file” action.)

Once it's moved, I use the “Get file content” action to read the file.

[Screenshot: Create file, Delete file and Get file content actions]

Next, I used an HTTP action to POST the file to the enqueue endpoint, using the variables that were initialized earlier.

In the header I set the external identifier to the name of the file, so that in FinOps we can identify the job by name.
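As a rough sketch, the request looks something like the following. The URL format comes from Microsoft's recurring integrations documentation; the external-identifier header name shown here is illustrative, so check what your enqueue endpoint accepts:

    POST https://<environment URL>/api/connector/enqueue/<activity ID>?entity=<entity name>
    Authorization: Bearer <Azure AD token>
    external-identifier: <file name>

    <body: the file content from the “Get file content” step>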

[Screenshot: HTTP action posting the file to the enqueue endpoint]

This is what the result looks like in FinOps.

[Screenshot: the resulting job in the FinOps recurring data job queue]

We are done sending the file to FinOps. The next steps do some cleanup by moving the file to the success or error folder. You could also email someone when a file is processed or when it errors out. You can get fancy.


To show an example of what you can do, below I check every minute, for up to 60 minutes, whether the job has been processed in FinOps. You can change the interval.

I added a “Do until” action: a loop that executes at an interval until a condition is met or a time limit is reached.

[Screenshot: “Do until” action polling the job status]

The expression from the advanced mode above is pasted below for your convenience. It is just an “or” condition that checks whether the job has reached any of these states.

@or(
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'Processed'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'ProcessedWithErrors'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'PostProcessingError'),
      equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'PreProcessingError')
    )

Initially the job is Queued, then In Process, and it will move to Processed if it is successful.

All the statuses are listed here:

https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/recurring-integrations

Next, I added a Condition to check the job status. If it is Processed, I know the import succeeded and move the file to the success folder; otherwise it goes to the error folder.
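The condition is just the single-status variant of the loop expression above:

    equals(body('HTTP_2')?['DataJobStatus']?['DataJobState'], 'Processed')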

[Screenshot: Condition checking for the Processed status]

I will stop here, as the next few steps are pretty much a repeat of moving a file in OneDrive.

Why not take it to the next level and check out Ludwig's blog post on automating the posting of the general journal?

https://dynamicsax-fico.com/2016/08/17/automatic-posting-of-journals/

Tuesday, 30 October 2018

Debug Modern POS using .NET Reflector

Respect to all POS developers; it really requires dedication and focus to be one. This is an example of something I would not have figured out without my colleagues.


We were working on a development project and struggled to make sense of an error.

The error complained about a method that validates the unit of measure and quantity. I was sure the object was not null and that I had passed the right thing to it; the unit of measure and quantity fields were populated correctly. The modification was simply overriding the price.

System.NullReferenceException was unhandled by user code
   HResult=-2147467261
   Message=Object reference not set to an instance of an object.
   Source=Microsoft.Dynamics.Commerce.Runtime.Workflow
   StackTrace:
        at Microsoft.Dynamics.Commerce.Runtime.Workflow.CartWorkflowHelper.ValidateCartLineUnitOfMeasureAndQuantity(RequestContext context, Cart newCart, SalesTransaction salesTransaction, Dictionary`2 salesLineByLineId, CartLineValidationResults cartLineValidationResults)
        at Microsoft.Dynamics.Commerce.Runtime.Workflow.CartWorkflowHelper.ValidateUpdateCartRequest(RequestContext context, SalesTransaction salesTransaction, SalesTransaction returnedSalesTransaction, Cart newCart, Boolean isGiftCardOperation, IDictionary`2 productByRecordId)
        at Microsoft.Dynamics.Commerce.Runtime.Workflow.SaveCartRequestHandler.Process(SaveCartRequest request)
        at Microsoft.Dynamics.Commerce.Runtime.SingleRequestHandler`2.Execute(Request request)
        at Microsoft.Dynamics.Commerce.Runtime.CommerceRuntime.Execute[TResponse](Request request, RequestContext context, IRequestHandler handler, Boolean skipRequestTriggers)
        at ECL.Commerce.Runtime.Donation.TriggerHandlers.SaveCartRequestHandler.Process(SaveCartRequest request)
        at Microsoft.Dynamics.Commerce.Runtime.SingleRequestHandler`2.Execute(Request request)
        at Microsoft.Dynamics.Commerce.Runtime.CommerceRuntime.Execute[TResponse](Request request, RequestContext context, IRequestHandler handler, Boolean skipRequestTriggers)
   InnerException

We struggled for many hours over the course of a few days. Then a colleague suggested using .NET Reflector (this is where you need an experienced retail developer). Using this tool, I was able to make sense of the problem.

Below are the steps to use .NET Reflector.

1. Download .NET Reflector

https://www.red-gate.com/products/dotnet-development/reflector/

2. Install it by following the wizard. You can use the 14-day trial or activate it with your serial number.

When you install, I would recommend selecting both the desktop application and the Visual Studio extension.

[Screenshot: .NET Reflector installation options]

3. Select the assembly to debug. There are a couple of ways you can do that.

Use .NET Reflector > Generate PDBs (this pops up a dialog to select the assembly).

[Screenshot: Generate PDBs dialog]

Alternatively, from Solution Explorer, select the referenced DLL and click Enable Debugging.

[Screenshot: Enable Debugging option on the referenced assembly]

4. This will launch the object browser.

Navigate to the method that caused the error, then right-click and choose Go To Decompiled Definition.

[Screenshot: decompiled definition of the validating method]

5. Put a breakpoint in the decompiled code and run through the process.

What I found was that the code used the salesTransaction object rather than the cartLine to get the unit of measure, which caused the cryptic unit of measure/quantity error. In other words, I needed to create the cart line and save it first, so it was committed to the salesTransaction, and only then override the price; i.e. two steps rather than trying to do it all in one process.

[Screenshot: breakpoint hit in the decompiled CartWorkflowHelper code]

I hope I don't get in trouble for advising people to use .NET Reflector by blogging about it here. I just don't see how else I would have figured this out.

Wednesday, 17 October 2018

Send file to temp blob storage in #MSDyn365FO

In this post I will talk about sending a file to temporary blob storage. A little warning before I get into it:

This uses the Microsoft-provisioned blob storage. I haven't given a lot of thought to how it would behave in production, so use it at your own risk.

Let's start by looking back at a couple of posts from last month on printing a report to a byte array.

http://dynamicsnavax.blogspot.com/2018/09/print-report-as-byte-array-via-x-in.html

http://dynamicsnavax.blogspot.com/2018/09/alternate-way-to-print-report-as-byte.html

You can use those code samples to get a report as a stream and feed it to the code below.

// 'stream' is the System.IO.Stream produced by the report-printing code above.
if (stream)
{
    str fileName = 'myfile';
    str contentType = 'application/pdf';
    str fileExtension = SRSPrintDestinationSettings::findFileNameType(SRSReportFileFormat::PDF, SRSImageFileFormat::BMP);
    str downloadUrl;

    // Upload the stream to the temporary (job) blob storage.
    FileUploadTemporaryStorageStrategy fileUploadStrategy = new FileUploadTemporaryStorageStrategy();
    FileUploadTemporaryStorageResult fileUploadResult = fileUploadStrategy.uploadFile(stream, fileName + fileExtension, contentType, fileExtension);

    if (fileUploadResult == null || !fileUploadResult.getUploadStatus())
    {
        warning("@ApplicationPlatform:FileUploadFailed");
    }
    else
    {
        // The result carries a public, time-limited download URL.
        downloadUrl = fileUploadResult.getDownloadUrl();
        if (downloadUrl == "")
        {
            throw Exception::Error;
        }
    }
}

A download URL is generated for you. The URL is public and has an expiration (15 minutes, I believe).

Below is the error you get if you try to access it after the expiration period.

[Screenshot: authentication error when accessing the URL after it has expired]

Monday, 15 October 2018

Find TableId using table browser #MSDyn365FO

I usually do this from the development environment. However, I was debugging a customer environment where I only had access to the front end. A table only gave me a RefTableId, and I needed to find out which table the record related to.
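With development access this is a one-liner in X++; a minimal sketch (table ID 40 is just an example):

    info(strFmt("Table %1 is %2", 40, tableId2Name(40)));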

Without development access, I used the SysTableIdView view from the table browser instead. Here is the link:

http://usnconeboxax1aos.cloud.onebox.dynamics.com/?cmp=usmf&mi=SysTableBrowser&TableName=SysTableIdView

[Screenshot: SysTableIdView in the table browser]

Sunday, 30 September 2018

Send to Azure Service Bus in #MSDyn365FO

Sending a message to Azure Service Bus is really simple in FinOps.

Below is the job (runnable class) I wrote to send a message to the service bus. It takes a connection string and a queue name for connecting; a message string and a list of key-value pairs can be supplied via the message properties.


// Class name is illustrative; any runnable (main) class will do.
internal final class NAVAXServiceBusSendTest
{
    static str connectionString = 'Endpoint=sb://navaxservicebus.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=h5KwXSEFIHxxxxxxxxxxxxxxxxxx';
    static str queueName = 'navaxqueue';

    /// <summary>
    /// Runs the class with the specified arguments.
    /// </summary>
    /// <param name = "_args">The specified arguments.</param>
    public static void main(Args _args)
    {
        if (connectionString && queueName)
        {
            // Connect to the queue and build the message.
            Microsoft.ServiceBus.Messaging.QueueClient queueClient = Microsoft.ServiceBus.Messaging.QueueClient::CreateFromConnectionString(connectionString, queueName);
            Microsoft.ServiceBus.Messaging.BrokeredMessage brokeredMessage = new Microsoft.ServiceBus.Messaging.BrokeredMessage("My Message");

            // Custom key-value pairs travel in the message properties.
            var properties = brokeredMessage.Properties;
            properties.Add("MyKey1", "MyValue1");
            properties.Add("MyKey2", "MyValue2");

            queueClient.Send(brokeredMessage);
        }
    }
}

Using Service Bus Explorer you can see the result.

[Screenshot: the message and its custom properties in Service Bus Explorer]

At the moment, Microsoft is working on event-based triggers. One of the possible designs they presented was writing to the service bus. Looking forward to it.

I took inspiration from one of the standard classes I noticed in 8PU15. It subscribes to the insert/delete of a user record and sends the user details to a service bus. The big bonus is that the Service Bus assemblies are already included on the machine. I won't dive any deeper into this class in this blog post.

[Screenshot: the standard class that sends user details to a service bus]

Thursday, 27 September 2018

Alternate way to print a report as a byte array via X++ in #MSDyn365FO

Earlier this month I posted on how to print a report as a byte array. Here I will do the same thing using an alternative method: the print archive.

You need to create an extension class for SRSPrintArchiveContract to add a parm method for the RecId.

[ExtensionOf(classStr(SRSPrintArchiveContract))]
final class SRSPrintArchiveContract_NAVAX_Extension
{
    public RefRecId navaxPrintJobHeaderRecId;

    public RefRecId parmNAVAXPrintJobHeaderRecId(RefRecId _navaxPrintJobHeaderRecId = navaxPrintJobHeaderRecId)
    {
        navaxPrintJobHeaderRecId = _navaxPrintJobHeaderRecId;
        return navaxPrintJobHeaderRecId;
    }

    // Chain of Command wrapper: capture the RecId of the print archive record as it is saved.
    public RecId savePrintArchiveDetails(container binData)
    {
        RecId recId = next savePrintArchiveDetails(binData);
        this.parmNAVAXPrintJobHeaderRecId(recId);
        return recId;
    }
}

This is the alternative method I wrote.


    public static str printSalesInvoiceBase64StrV2(SalesInvoiceId _salesInvoiceId)
    {
        str                     ret;
        CustInvoiceJour         custInvoiceJour;

        select firstonly custInvoiceJour
            where custInvoiceJour.InvoiceId == _salesInvoiceId;

        if (custInvoiceJour)
        {
            str ext = SRSPrintDestinationSettings::findFileNameType(SRSReportFileFormat::PDF, SRSImageFileFormat::BMP);
            PrintMgmtReportFormatName printMgmtReportFormatName = PrintMgmtDocType::construct(PrintMgmtDocumentType::SalesOrderInvoice).getDefaultReportFormat();

            SalesInvoiceContract salesInvoiceContract = new SalesInvoiceContract();
            salesInvoiceContract.parmRecordId(custInvoiceJour.RecId);

            SrsReportRunController srsReportRunController = new SrsReportRunController();
            srsReportRunController.parmReportName(printMgmtReportFormatName);
            srsReportRunController.parmExecutionMode(SysOperationExecutionMode::Synchronous);
            srsReportRunController.parmShowDialog(false);
            srsReportRunController.parmReportContract().parmRdpContract(salesInvoiceContract);

            // Print to the archive as a PDF.
            SRSPrintDestinationSettings printerSettings = srsReportRunController.parmReportContract().parmPrintSettings();
            printerSettings.printMediumType(SRSPrintMediumType::Archive);
            printerSettings.fileFormat(SRSReportFileFormat::PDF);
            printerSettings.parmFileName(custInvoiceJour.InvoiceId + ext);
            printerSettings.overwriteFile(true);

            srsReportRunController.startOperation();

            // The extension above captured the archive record id during the run.
            RefRecId printJobHeaderRecId = printerSettings.parmSRSPrintArchiveContract().parmNAVAXPrintJobHeaderRecId();

            if (printJobHeaderRecId)
            {
                DocuRef docuRef;

                select firstonly docuRef
                    where docuRef.RefRecId == printJobHeaderRecId &&
                        docuRef.ActualCompanyId == curExt();

                // Read the archived document back and base64-encode it.
                BinData binData = new BinData();
                binData.setData(DocumentManagement::getAttachmentAsContainer(docuRef));
                ret = binData.base64Encode();
            }
        }

        return ret;
    }
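Usage is then a one-liner; the class name and invoice number below are illustrative:

    // Class name and invoice number are made up for this example.
    str invoicePdfBase64 = NAVAXSalesInvoiceHelper::printSalesInvoiceBase64StrV2('CIV-000123');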

Note that an Infolog message will pop up saying the report was sent to the print archive.

[Screenshot: Infolog confirming the report went to the print archive]

If you navigate to the print archive, you will see the record.

[Screenshot: the archived report in the print archive form]

I don't mind either way. The first method looks messy, calling DLLs such as the SRSProxy. The second method adds overhead by writing to the print archive table, which will need some cleanup over time.

Wednesday, 12 September 2018

Gotcha with Extending Retail Channel transaction table

I will start by pointing you to a good article by Andreas Hofmann from Microsoft. It steps you through what you need to do to extend a transactional table in the retail channel and bring the data back to HQ via the P job.

https://dynamicsnotes.com/extending-a-transactional-retail-channel-table-with-additional-data-hq-x-table-extension-cdx-extension-table-crt-data-service/

Now, to summarise the issue I faced recently (being a retail rookie).

Following the blog post, I created a custom Int64 field on RetailTransactionSalesTrans. However, when I ran the P job it failed with this error:

“Failed to convert parameter value from a String to a Int64”

I did some investigating to find out what the job actually does. Essentially, it performs an outer join from RetailTransactionSalesTrans to your custom CustomRetailTransSalesTrans extension table. Even though your custom field defaults to 0, you won't be creating an extension record for every single transaction record, so the outer join produces blank values in the file that comes over to HQ.

The figure below shows what the file looks like.

[Screenshot: upload file with blank values in the custom extension column]

Remember also that the sync happens by downloading and uploading flat files. A blank string can't be converted to an Int64; hence the error.

You can see the files by going to the upload sessions form and downloading the zip file.

[Screenshot: upload sessions form with the zip file download]

As a colleague told me today: the advice is to use a string field and treat it as a string all the way from the channel to HQ.