Part 2 – Build an Azure function to process messages in an Azure Service Bus topic

In this second part of my three-part series, I am going to list the steps to build an Azure function to process the confirmed purchase orders we are receiving in an Azure Service Bus topic. We deployed our topic with two subscriptions to essentially build multiple messaging pipelines for different consuming applications. The idea is to demonstrate how Azure cloud messaging services can help architects and developers build a messaging platform for multiple target applications and hence avoid point-to-point integrations. Microsoft Azure provides cloud messaging as a service (MaaS) that scales up and out on demand and is easy to operate and maintain.

Scenario

In Part 1, we configured an OOTB business event with a Service Bus topic having multiple subscriptions. In this part, we are assuming a real-world scenario. When a PO is confirmed, it is ready to be fulfilled by the warehouse. Suppose the customer has an external warehouse management system that accepts data through its Web API. Here are the high-level steps to achieve our goal.

  • Build an Azure function with a Service Bus trigger that fires when a message arrives in the topic subscription.
  • Parse the message to extract the purchase order details.
  • Extract the purchase order information needed by the external warehousing application.
  • Construct a Web API call to POST the update to the external warehousing application via its Web API.

Note: The external web API that I am using here is outside the scope of this post. I already have a pre-built API that I’ll be utilizing for demo purposes. If you want to learn how to build a web API quickly using Azure Functions, please refer to my earlier blog post here.

How to build REST APIs with Azure functions

Prerequisites:

  • Access to an Azure subscription with permissions to create Azure Functions. The monthly Visual Studio developer subscription would work.
  • Some foundational, working knowledge of Azure Functions. If you are new to this, please refer to the following Microsoft documentation for a quick start. https://docs.microsoft.com/en-us/azure/azure-functions/
  • Visual Studio (Community edition or Visual Studio Code would work)

Build the Azure function

In Visual Studio 2019, ensure you installed the Azure development workload when installing VS. If you haven’t done so, you can always add it by running the installer again. Here is the docs reference.

https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio

Create a new project of type Azure Functions.

Give the project a meaningful name and click Create. I named it BusinessEventsFunctionApp.

Now right-click on the project and add a new Azure function.

Select the Service Bus Topic trigger and click OK.

Now, before we go into the details of the function’s code structure, add a class to parse the JSON that we receive from Service Bus. Define the usual getters and setters.

    public class PurchaseOrder
    {
        public string LegalEntity { get; set; }
        public string PurchaseJournal { get; set; }
        public string PurchaseType { get; set; }
        public DateTime PurchaseOrderDate { get; set; }
        public string PurchaseOrderNumber { get; set; }
        public string TransactionCurrencyCode { get; set; }
        public double TransactionCurrencyAmount { get; set; }
        public string VendorAccount { get; set; }
        public string BusinessEventId { get; set; }
        public string ControlNumber { get; set; }
        public string EventId { get; set; }
        public string EventName { get; set; }
        public string MajorVersion { get; set; }
        public string MinorVersion { get; set; }
    }
  • Now change the code in your function as follows. Here is what we are doing:
    • Deserializing the received message with the JsonConvert.DeserializeObject() method and parsing it into the PurchaseOrder class structure we defined in the earlier step.
    • Logging the vendor account, purchase order number, and purchase order date. We’ll pass this information to the external warehousing system.
    • Constructing an API call and calling the external system’s API.
    • I have this external API also built using Azure Functions, running on localhost at the following address.

http://localhost:7071/api/PurchaseOrderShipment

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class CallExternalAPI
{
    // Reuse a single HttpClient instance across function invocations.
    private static readonly HttpClient newClient = new HttpClient();

    [FunctionName("CallExternalAPI")]
    // Return Task (not async void) so the runtime can track completion.
    public static async Task Run([ServiceBusTrigger("confirmedpos", "ConfirmedPOsExternalAPI",
                            Connection = "ConnectionString")]
                            string mySbMsg, ILogger log)
    {
        log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");

        // Deserialize the received message.
        var obj = JsonConvert.DeserializeObject<PurchaseOrder>(mySbMsg);

        // Log the extracted information.
        log.LogInformation($"{obj.VendorAccount}");
        log.LogInformation($"{obj.PurchaseOrderNumber}");
        log.LogInformation($"{obj.PurchaseOrderDate}");

        try
        {
            // Build the payload for the external warehousing API.
            string myJson = JsonConvert.SerializeObject(new
            {
                VendorAccount = obj.VendorAccount,
                PurchaseOrderID = obj.PurchaseOrderNumber,
                PurchaseOrderDate = obj.PurchaseOrderDate
            });

            // Call the external API.
            var response = await newClient.PostAsync(
                "http://localhost:7071/api/PurchaseOrderShipment",
                new StringContent(myJson, Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();
        }
        catch (Exception ex)
        {
            log.LogError(ex, "An exception occurred while calling the external API.");
        }
    }
}

So the flow is: as soon as the PO is confirmed, a message is dropped on the topic, which auto-triggers the above function. The function then parses the message and pushes the update to the external API. Since we are developing on a local dev machine, we need to configure different ports to avoid a port conflict when running both of these function apps on the same machine. To configure a different port, add the host settings shown below to the local.settings.json file in Visual Studio.
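
A minimal sketch of that local.settings.json, assuming the Service Bus connection string setting is named ConnectionString as in the trigger above; the port number is just an illustration, pick any free port that differs from the other function app’s:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "ConnectionString": "<your Service Bus connection string>"
  },
  "Host": {
    "LocalHttpPort": 7073
  }
}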

  • Let’s run this function. Make sure the D365 FinOps environment where the business event was configured in the previous post is also running.
[Console output: the Azure Functions host starts, loads the single function BusinessEventsFunctionApp.CallExternalAPI.Run, and listens on http://0.0.0.0:7073.]
  • Here is the code base for the fictitious external API that receives the confirmed POs and adds them to an in-memory list.

public static class POConfirmedApi
{
    // In-memory list of received purchase orders (reset when the host restarts).
    static List<PurchaseOrder> orders = new List<PurchaseOrder>();

    [FunctionName("ShipPurchaseOrder")]
    public static async Task<IActionResult> ShipPurchaseOrder(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "PurchaseOrderShipment")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Update the list of purchase orders.");

        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        var order = JsonConvert.DeserializeObject<PurchaseOrder>(requestBody);

        orders.Add(order);

        return new OkObjectResult(order);
    }

    [FunctionName("GetPurchaseOrders")]
    public static IActionResult GetPurchaseOrders(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "PurchaseOrderShipment")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Getting the list of purchase orders.");
        return new OkObjectResult(orders);
    }
}

public class PurchaseOrder
{
    public string VendorAccount { get; set; }

    public string PurchaseOrderID { get; set; }        

    public DateTime PurchaseOrderDate { get; set; }        
}

Add the above code in a separate function app project and then run it.

Log in to the D365 FinOps environment and pick any PO. Edit it and then confirm it.

At this point, you should notice some movement in both functions’ running consoles. In the first window, the message is retrieved from the Service Bus topic, parsed, and pushed to the external API. In the second window, the external API confirms receipt of the message with limited details. As a best practice, only the required details should be pushed to external systems.

Now we can validate that the external system added it to its in-memory list by calling GET on the same API.

The API returns the same PO which we confirmed in D365 FinOps.
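
For illustration, a GET on http://localhost:7071/api/PurchaseOrderShipment returns the in-memory list as JSON, shaped roughly like this (the values here are made up; the default serializer camel-cases the property names):

[
  {
    "vendorAccount": "US-101",
    "purchaseOrderID": "00000123",
    "purchaseOrderDate": "2020-05-23T00:00:00"
  }
]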

Next in the final part of this three-part series, we’ll create another function to trigger on the subscription of the same topic to send an email.

How to build REST APIs with Azure functions

Web APIs have been around for quite a while, and one can now develop REST APIs on almost all technology platforms, certainly including Azure. In this blog post, I am going to describe how to create serverless Azure functions that work as REST APIs, and how easy it is to create and test them locally and then publish them to an Azure subscription.

Prerequisites

  • Access to an Azure subscription with permissions to create Azure Functions. The monthly Visual Studio developer subscription would work.
  • Some foundational, working knowledge of Azure Functions. If you are new to this, please refer to the following Microsoft documentation for a quick start. https://docs.microsoft.com/en-us/azure/azure-functions/
  • Visual Studio (Community edition or Visual Studio Code would work)

Scenario and high-level approach

We are going to assume a scenario where we need an API to create and retrieve customers in an online CRM database. For the sake of time and simplicity, and since the purpose of this post is to build REST APIs via function apps, I am going to maintain an in-memory list of customers in my Azure function. We’ll configure routes for POST and GET API calls.

Create a function app in Visual studio

I am using VS 2019 Enterprise edition. Click File -> New -> Project, select Azure Functions, and then click Next.

Give it a name and click Create.

Select HTTP Trigger. Also provide an Azure storage account. If you don’t have one, you can create it through the Azure portal.

Click Create. This is the main function that will receive HTTP requests, process them, and respond. If you are familiar with the MVC architectural pattern, this is your controller.

Before we dig into the code to make it work like an API, let’s create a model class. I’ll call it Customer. It is quite self-explanatory. I purposely didn’t allow setters for CustomerID and CreatedDateTime because I want the business logic to initialize them.

public class Customer
{
    public string CustomerID { get; } = Guid.NewGuid().ToString("n");
    public DateTime CreatedDateTime { get; } = DateTime.UtcNow;
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

Let’s add one more model class, a stripped-down version of the Customer class. This parses the message we receive through the API POST call. The API only requires three attributes of the customer being created.

public class CustomerCreate
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

Now change the main controller function as depicted here. We add the received customer to a static list, configure the API route for this function, and allow only POST calls. I also changed the authorization level to Anonymous so that we can test quickly; of course, in the real world you don’t want to do that. A sketch of the function follows the route below.

POST /api/customer
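
Here is a minimal sketch of what that function can look like, assuming the two model classes above (the function and class names are illustrative):

public static class CustomerApi
{
    // In-memory list of customers (reset when the host restarts).
    private static readonly List<Customer> customers = new List<Customer>();

    [FunctionName("CreateCustomer")]
    public static async Task<IActionResult> CreateCustomer(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "customer")] HttpRequest req,
        ILogger log)
    {
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        var input = JsonConvert.DeserializeObject<CustomerCreate>(requestBody);

        // Map the incoming payload to a full Customer; CustomerID and
        // CreatedDateTime are initialized by the model itself.
        var customer = new Customer
        {
            FirstName = input.FirstName,
            LastName = input.LastName,
            Email = input.Email
        };

        customers.Add(customer);
        return new OkObjectResult(customer);
    }
}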

Now add one more method for the GET call. This will return all the customers added to the list.

GET /api/customer
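
A sketch of that method, added to the same static class so it can read the same in-memory list:

[FunctionName("GetCustomers")]
public static IActionResult GetCustomers(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "customer")] HttpRequest req,
    ILogger log)
{
    // Return the full in-memory list.
    return new OkObjectResult(customers);
}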

We should add one more function to get a specific customer by customerID, so we can have a call like

GET /api/customer/{customerId}
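
A sketch of that lookup, again in the same class; the {customerId} route segment binds to the customerId method parameter:

[FunctionName("GetCustomerById")]
public static IActionResult GetCustomerById(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "customer/{customerId}")] HttpRequest req,
    string customerId,
    ILogger log)
{
    // Look up the customer by ID; requires System.Linq.
    var customer = customers.FirstOrDefault(c => c.CustomerID == customerId);
    return customer != null
        ? (IActionResult)new OkObjectResult(customer)
        : new NotFoundResult();
}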

Test the API

Although you can publish the function app to an Azure subscription with a consumption-based plan, here we’ll run it on localhost and test it through Postman. This lets developers build and test on their local dev environment.

Postman is a free platform for API development which allows you to test and play with any Web APIs. Download it if you haven’t already.

Let’s run the Visual Studio project; the runtime will publish the REST APIs to localhost on some arbitrary port.

Now run Postman and call the POST method first to add a customer record. Make sure you get a 200 OK status returned.

Now do a GET to retrieve all customers. Since I added a couple more customers, I see three records returned in JSON format. Copy any Customer ID for the next step.

Now make a GET call with the copied Customer ID to retrieve only a specific record.

When you are done with your API development and ready to move to testing or the next phase, you can publish it to your Azure subscription right from Visual Studio.

Create a consumption plan if you don’t have one already.

Provide storage account and other details and create.

Now publish it to the portal.

Once it’s published successfully, you will be able to see and manage this function app in the Azure portal.

I hope you found it helpful.

For more amazing and helpful content, please subscribe to this blog.

Part 1 – Configure Business events with Service Bus Topics

In this blog post, I’ll configure the business events in D365FO with Azure Service Bus topics.

Configure Azure Service Bus

[Screenshot: Azure portal home page showing Azure services, including Create a resource, Virtual machines, Azure Active Directory, Subscriptions, Resource groups, Service Bus, and Key vaults.]
  • Select Service Bus, or click ‘Create a resource’ and search for Service Bus.
  • For the service bus, you need to create a Service Bus namespace, under which you can add queues and topics and manage other relevant resources. After providing the details below, click Create.

Name -> Some meaningful unique name

Pricing tier -> Standard (minimum required for topics support) 

Subscription -> select the subscription you want to utilize here.

Location -> Your closest Azure region

  • Now, in the Service Bus namespace, add a topic.
  • Provide a meaningful name for the topic, leave the default values, and create it. I named it ConfirmedPOs. Once it’s created, click +Subscription.
  • Add two subscriptions. Set the values as shown and, for the purpose of this tutorial, leave all other values at their defaults.
    • ConfirmedPOsExternalAPI
    • ConfirmedPOsTeams
  • On the Service Bus namespace, go to Shared access policies -> RootManageSharedAccessKey and copy the primary connection string; note it down somewhere.
  • Now create an Azure Key Vault. Go to the portal home page -> Create a resource -> Key Vault. Provide a meaningful name, select your region, subscription, and resource group, and then hit Review + Create. Hit Create on the next screen.
  • Once the key vault’s deployment is finished, go to the resource and add a secret.
  • On the Create a secret screen, provide a name and paste the connection string value you copied earlier. Click Create. Note down the secret name, which you’ll use when configuring business events later.
  • The Azure application registered for service-to-service authentication must also be added to the Key Vault under access policies.
  • Click + Add access policy and provide the details. Make sure to select the AAD application and click Add. Once you are back on the Access policies screen, click Save.
  • Now go back to the overview page of the key vault and make a note of the DNS name.

Configure AAD app

  • Now configure an AAD app for service-to-service authentication. Azure portal home page -> Azure Active Directory -> App registrations, and click New registration.
  • On the next screen, give it a name and click Register to create the app.
  • Next, on the app, go to Certificates & secrets -> + New client secret, give it a description, and click Add.
  • Now copy the secret value and keep it safe somewhere. You won’t be able to access it again once you leave this screen.

Configure Business Events

  • Now, in FinOps, navigate to System administration -> Setup -> Business events -> Business events catalog.
  • On the Endpoints tab, click New. Leave the default endpoint type = Azure Service Bus topic and click Next.
  • Add the details gathered in the earlier steps and click OK.
  • Now activate the business event.
  • Leave the Legal entity blank so that it captures events from all companies. Press OK. Now the event is activated.
  • Now confirm a purchase order. Navigate to Accounts payable -> All purchase orders. Select any approved purchase order and click Purchase -> Confirm.
  • Now you can use Service Bus Explorer to receive and visualize the JSON messages arriving in the topic. If you haven’t used Service Bus Explorer before, here is the GitHub link with documentation.

https://github.com/paolosalvatori/ServiceBusExplorer/

  • In Service Bus Explorer, connect to your service bus and you’ll see the topic with two subscriptions. Right-click on any subscription to receive messages from the DLQ (dead-letter queue); the messages land in the DLQ by default, which we’ll discuss in detail in subsequent parts of this series. A sample of the message payload is shown below.
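
For reference, the Purchase order confirmed payload carries the fields we model in the PurchaseOrder class in Part 2; a received message looks roughly like this (all values below are illustrative):

{
  "BusinessEventId": "PurchaseOrderConfirmedBusinessEvent",
  "ControlNumber": "5637144580",
  "EventId": "<GUID>",
  "EventName": "Purchase order confirmed",
  "LegalEntity": "USMF",
  "MajorVersion": "0",
  "MinorVersion": "0",
  "PurchaseJournal": "<journal number>",
  "PurchaseOrderDate": "2020-05-23T00:00:00",
  "PurchaseOrderNumber": "00000123",
  "PurchaseType": "Purch",
  "TransactionCurrencyAmount": 448.00,
  "TransactionCurrencyCode": "USD",
  "VendorAccount": "US-101"
}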

Now you have successfully configured business events with an Azure Service Bus topic. In the next part, we’ll build an Azure function to consume and process these messages.

Next in this three-part series …

Part 2 – Build an Azure function for a topic subscription to update a third-party system via an API call

Avoid the point-to-point integration nightmare by using business events and an Azure-based middle layer

Integrating and communicating through standardized data sets with multiple systems is an indispensable requirement in every ERP project. No business can rely on a single system or service to meet all business needs. Traditionally, as the requirement to use an additional piece of software or a service provider arises, an integration is built at that time to fulfill it. Although this is considered from the overall enterprise architecture standpoint, it is often missed from the integration (or, I should say, technical) architecture perspective. This results in many point-to-point integrations causing a lot of churn and a maintenance nightmare down the road. Oftentimes this can be avoided by utilizing a robust middle layer and following integration best practices.

In this upcoming series of posts, I am going to assume a scenario and build an integration utilizing an Azure-based middle layer to demonstrate the power of Azure Functions and Service Bus, and how they can work together with FinOps to produce highly scalable and robust integrations while avoiding point-to-point touchpoints.

Scenario

Contoso Entertainment Systems has a unique requirement: they want to update a third-party service provider about a posted purchase order in real time through its web API. At the same time, they want an email message sent to a team’s alias.

High level approach/design

  • A purchase order is confirmed, which registers a business event.
  • A message is dropped through Service Bus on a topic with multiple subscriptions.
  • Azure Functions are auto-triggered and process the message as soon as it arrives in the topic/subscription.

We’ll be building it in three parts. 

Part 1 – Configure Business events with Service Bus Topics

Part 2 – Build an Azure function for a topic subscription to update a third-party system via an API call

Part 3 – Build an Azure function for a topic subscription to send an Outlook email

I’ll continue to post these parts and update this post with the links in the coming days. Feel free to subscribe to my blog to get firsthand updates.

Recurring Outbound Integrations using DMF and Azure Logic Apps

In this article, I am going to list the steps to create an Azure Logic App based recurring integration for bulk data export. If Logic Apps are new to you, don’t worry: you can follow the steps, build your logic app quite easily, and learn through the process. For more details about Logic Apps, check this link.

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-overview

We are going to work on a fictitious scenario.

Scenario
Contoso Consulting USA needs to provide a weekly expense report of all of its employees’ travel expenses to their payroll service provider. The payroll service provider can accept information in a variety of formats, including CSV files. They can grab the file from any blob storage or a shared OneDrive. They prefer an incremental export, meaning they support upserts.

High level steps

  • Create a DMF (Data Management Framework) export project.
  • Create and schedule a logic app that triggers every week at a defined time.
    • Add a trigger to execute every Monday night.
    • Export a CSV file or a data package.
    • Make sure the export is successful.
    • Get the URL of the exported file (blob).
    • Upload the CSV file or data package to a shared OneDrive for Business location.
    • Send a notification email.

Prerequisites

  • A valid Azure subscription with permissions to create logic apps and access blob storage. A Visual Studio dev subscription (150 USD monthly credit) would work. Check your country-specific billing/limits if you are outside the US.
  • System admin access on a FinOps environment. Both FinOps and Azure should be on the same tenant.
  • A OneDrive for Business account. Your organizational Microsoft Office credentials come with a OneDrive account which you can use.

Configure Data export project in D365 FinOps
1. In D365FO, navigate to System administration -> Data management workspace.
2. Click on the Data entities tile and look for the entity ‘Expenses’. Select the entity and enable change tracking for the entire entity. This is required for incremental export.

Image1

3. Click on the Export tile.

Image2

4. Provide a meaningful group name and description.

Image3

5. Click on the Add entity button. Select the Expense entity and provide values as mentioned below. The default refresh type should be set to ‘Incremental push only’ if you want to generate an incremental export. Keep in mind that the first export will always be a full push, meaning all the data will be exported. Click Add to add it to the project.

Image4

Develop and Schedule Logic App
6. Log in to your Azure portal. https://portal.azure.com
7. Click on Create a resource and select Integration -> Logic App.

Image5

8. On the Create screen, provide the required details. Use an existing resource group if you want to manage resources and cost under an existing plan; otherwise, create a new one. Click Review + Create and then Create to finish the deployment of the Logic App.

Image6

9. Once your deployment is finished, click Go to resource to access the Logic App web designer.

Image7

10. On the Logic App designer, click on Recurrence to add the trigger for your app.
Image8
A logic app consists of a trigger and some actions. This is the final view of the logic app once it is built.

Image9

11. Recurrence – This is the trigger where you set up when and how frequently the logic app executes. Essentially, this starts the logic app, which orchestrates the data file generation and movement. Click Add parameters to set the time zone, hours, and specific days.
12. Initialize DMFExecutionID – Add the next action and search for ‘Initialize variable’. Here we initialize a string variable with a GUID to use as a unique batch job execution ID. Click on the Value textbox, click Expression in the sidebar window, look for guid() in the fx column, and click OK.
13. Export project – This is a control action that provides a scope for a set of actions to execute together; basically, it’s a way of grouping actions in a single scope. Once you add the control, you can rename it to anything; I named it ‘Export project’. Search for the Scope action to find it.

Image10

14. Now in ‘Export Project’, click Add an action to add some nested actions under it.

Image11

15. Now search for ‘Until’ and add it. This is a loop control. Give it a meaningful name like ‘Until ExportToPackage Succeeded’

Image12

16. Now before you add a value to control the loop, add an action.

Image13

17. Next, search for ‘Dynamics 365 for Finance’ in Choose an action, and select ‘Execute action’.

Image14

18. Next, you’ll have to sign in to your tenant.
19. Once you’ve signed in, select your FinOps environment from the dropdown. Also select the action as shown below. This is an OOTB OData action that will execute the DMF export project we created earlier. Rename the Execute action to ‘ExportPackage’.

Image15

20. Now click on the ‘Choose a value’ text field and set the condition: ‘OutputParameters’ is not equal to 00000000-0000-0000-0000-000000000000.

Image16

21. Add a few parameters (the sketch after the screenshot shows the equivalent raw OData call):
executionId – the GUID initialized above.
definitionGroupId – ‘Expense export’ (the name of the export project created in FinOps).
reExecute – No.
legalEntityId – USSI (select whatever legal entity you want to export from; I chose USSI from the demo data since it already has expense reports).

Image17
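
Under the hood, the connector’s Execute action calls the OOTB ExportToPackage OData action on the DataManagementDefinitionGroups entity. Here is a hedged C# sketch of the equivalent raw call, assuming you already have an AAD bearer token; the host name, token, and package name are placeholders, and the REST version of the action also takes a packageName parameter that the connector fills in for you:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class ExportToPackageDemo
{
    public static async Task Main()
    {
        var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<AAD access token>");

        // The same parameters we set on the Logic App action.
        string body = @"{
            ""definitionGroupId"": ""Expense export"",
            ""packageName"": ""Expenses.zip"",
            ""executionId"": ""<the GUID initialized earlier>"",
            ""reExecute"": false,
            ""legalEntityId"": ""USSI""
        }";

        // OData action that kicks off the DMF export project.
        var response = await client.PostAsync(
            "https://<your-finops-host>/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}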

22. In the same ‘Export project’ scope, add a condition to validate whether the package was exported. It should be inside the scope’s group of actions.

Image18

23. Now define the condition: Value is equal to DMFExecutionID. The Value output parameter will contain the same execution ID we provided when calling the Execute action. This is basically the job ID that DMF maintains in the job history.

Image19

24. The false branch means the package couldn’t be exported, so we can send an email to an alias or an admin to notify them that the logic app couldn’t finish successfully. Add an Office 365 Outlook > Send an email (V2) action.

Image20

25. You need to sign in first. Provide relevant details in Send email action.

Image21

26. On successful completion of the package export, you need to add quite a few actions to move the package from the D365 FinOps Azure blob location to OneDrive. You first need to grab the URL of the blob where the exported file is stored.

Image22

27. Add a condition after the Until loop to check whether the package URL is valid. Select the value of the previous Execute action, which fetched the exported file URL.

Image23

28. If false, repeat steps 24 and 25 to send an email.
29. If true, first add an HTTP GET action to retrieve the content of the file from blob storage. Search for HTTP in the actions.

Image24

30. Set the Method to GET and the URI to the output value of the Execute action added in step 26.

Image25

31. Add an action to create a file on OneDrive. Look for OneDrive in the action search box. Select OneDrive for Business and then select the Create file action.

Image26

32. Sign in to your OneDrive. You can use your organization’s Office 365 sign-in details or the customer org’s.
33. Once signed in, provide the following details.
a. Folder path = / (you can also create and designate a folder on OneDrive and use that path here)
b. File Name = Expenses.csv (you can be creative and use Logic App expressions and built-in date and string functions to dynamically generate the file name; see the example after the screenshot below)
c. File Content = the Body of the HTTP action from the earlier step.

Image27
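
As an example of such a file-name expression, the built-in concat, formatDateTime, and utcNow functions can stamp the export date into the name; one possible expression for the File Name field:

concat('Expenses-', formatDateTime(utcNow(), 'yyyy-MM-dd'), '.csv')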

34. You are now done designing your Logic App. Save it once again and correct any errors you find.
35. Close the designer and manually run the trigger.

Image28

36. Click Refresh to check the run history.
37. If it succeeded, your logic app works perfectly. If it didn’t, drill into the run history to find the step where it failed; the failing step itself provides details of the failure. Fix the issue and try again.

Image29

38. Now navigate to your OneDrive to find the generated file.

Image30

If you want to grab the same file and see the run history in FinOps, follow the steps below.

1. Click on the job history and then click on the job. Notice that this job ID is the same as the executionID initialized in the Logic App.

Image31

2. Now click on Download package and open the package to extract the CSV file. You can get the same file from DMF as well.

Image32