Part 2 – Build an Azure function to process messages in an Azure Service Bus topic

In this second part of my three-part series, I am going to list the steps to build an Azure function to process the confirmed purchase orders that we are receiving in the Azure Service Bus topic. We deployed our topic with two subscriptions to essentially build multiple messaging pipelines for different consuming applications. The idea is to demonstrate how Azure's cloud messaging services can help architects and developers build a messaging platform for multiple target applications and hence avoid P2P integrations. Microsoft Azure provides cloud messaging services (MaaS) that scale up and scale out on demand and are easy to operate and maintain.

Scenario

In Part 1, we configured an OOTB business event with a Service Bus topic having multiple subscriptions. In this part, we are assuming a real-world scenario: when a PO is confirmed, it is ready to be fulfilled by the warehouse. Suppose the customer has an external warehouse management system that accepts data through its Web API. Here are the high-level steps to achieve our goal.

  • Build an Azure function with an input binding that triggers when a message is received in the Service Bus topic subscription.
  • Parse the message to extract the purchase order details.
  • Extract the purchase order information needed by the external warehousing application.
  • Construct a Web API call to post the update to the external warehousing application.

Note: The external web API that I am using here is outside the scope of this post. I already have a pre-built API that I’ll be utilizing for demo purposes. If you want to learn how to build a web API quickly using Azure Functions, please refer to my earlier blog post here.

How to build REST APIs with Azure functions

Prerequisites:

  • Access to an Azure subscription with permissions to create Azure functions. The monthly Visual Studio developer subscription would work.
  • Some foundational and working knowledge of Azure Functions. If you are new to this, please refer to the following Microsoft documentation for a quick start. https://docs.microsoft.com/en-us/azure/azure-functions/
  • Visual Studio (the Community edition or Visual Studio Code would work)

Build the Azure function Web API

In Visual Studio 2019, ensure you installed the Azure workload when you installed VS. If you haven’t done so, you can always add it by running the installer again. Here is the docs reference.

https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio

Create a new project of type Azure Functions.

Give a meaningful name to the project and click Create. I named it BusinessEventsFunctionApp.

Now right-click on the project and add a new Azure function.

Select the Service Bus Topic trigger (the binding that fires when a message arrives in the topic subscription) and click OK.

Now, before we go into the details of the function’s code structure, add a class to parse the JSON that we receive from Service Bus, with the usual getters and setters.

    public class PurchaseOrder
    {
        public string LegalEntity { get; set; }
        public string PurchaseJournal { get; set; }
        public string PurchaseType { get; set; }
        public DateTime PurchaseOrderDate { get; set; }
        public string PurchaseOrderNumber { get; set; }
        public string TransactionCurrencyCode { get; set; }
        public double TransactionCurrencyAmount { get; set; }
        public string VendorAccount { get; set; }
        public string BusinessEventId { get; set; }
        public string ControlNumber { get; set; }
        public string EventId { get; set; }
        public string EventName { get; set; }
        public string MajorVersion { get; set; }
        public string MinorVersion { get; set; }
    }
  • Now change the code in your function as follows. Here is what we are doing:
    • De-serializing the received message with the JsonConvert.DeserializeObject() method and parsing it into the PurchaseOrder class structure we defined in the earlier step.
    • Logging the vendor account, purchase order number and purchase order date. We’ll pass this information to the external warehousing system.
    • Constructing an API call and calling the external system’s API.
    • I have this external API also built using Azure Functions and running on localhost at the following address.

http://localhost:7071/api/PurchaseOrderShipment

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;
    using Newtonsoft.Json;

    public static class CallExternalAPI
    {
        // Reuse a single HttpClient instance across function invocations.
        private static readonly HttpClient newClient = new HttpClient();

        [FunctionName("CallExternalAPI")]
        public static async Task Run([ServiceBusTrigger("confirmedpos", "ConfirmedPOsExternalAPI",
                                Connection = "ConnectionString")]
                                string mySbMsg, ILogger log)
        {
            log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");

            // Deserialize the received message.
            var obj = JsonConvert.DeserializeObject<PurchaseOrder>(mySbMsg);

            // Log the extracted information.
            log.LogInformation($"{obj.VendorAccount}");
            log.LogInformation($"{obj.PurchaseOrderNumber}");
            log.LogInformation($"{obj.PurchaseOrderDate}");

            try
            {
                // Build the payload expected by the external warehousing API.
                string myJson = JsonConvert.SerializeObject(new
                {
                    VendorAccount = obj.VendorAccount,
                    PurchaseOrderID = obj.PurchaseOrderNumber,
                    PurchaseOrderDate = obj.PurchaseOrderDate
                });

                // Call the external API.
                var response = await newClient.PostAsync("http://localhost:7071/api/PurchaseOrderShipment",
                    new StringContent(myJson, Encoding.UTF8, "application/json"));

                log.LogInformation($"External API responded with status code {response.StatusCode}");
            }
            catch (Exception ex)
            {
                log.LogInformation("Some exception occurred: {Message}", ex.Message);
            }
        }
    }

So the flow is that as soon as the PO is confirmed, a message is dropped on the topic, which auto-triggers the above function. The function then parses the message and calls the external API. Since we are developing on a local dev machine, we need to configure different ports for the two function apps so that both can run on the same machine without a port conflict. To configure the port, add the host settings to the local.settings.json file of the function project, as shown below.
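A minimal sketch of what that could look like in local.settings.json is below; the port number 7073 is arbitrary, and the ConnectionString entry is simply the Service Bus connection string referenced by the trigger binding above (treat the exact values as placeholders for your own environment).

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "ConnectionString": "<your Service Bus namespace connection string>"
      },
      "Host": {
        "LocalHttpPort": 7073
      }
    }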

  • Let’s run this function. Make sure the D365 FinOps environment where the business event was configured in the previous post is also running.
(Screenshot: the Azure Functions host console output, showing the CallExternalAPI function loaded, the job host started, and the host listening locally on port 7073.)
  • Here is the code for the fictitious external API that receives the confirmed POs and adds them to an in-memory list.

public static class POConfirmedApi
{
    // In-memory store of received purchase orders (for demo purposes only).
    static List<PurchaseOrder> orders = new List<PurchaseOrder>();

    [FunctionName("ShipPurchaseOrder")]
    public static async Task<IActionResult> ShipPurchaseOrder(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "PurchaseOrderShipment")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Update the list of purchase orders.");

        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        var order = JsonConvert.DeserializeObject<PurchaseOrder>(requestBody);

        orders.Add(order);

        return new OkObjectResult(order);
    }

    [FunctionName("GetPurchaseOrders")]
    public static IActionResult GetPurchaseOrders(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "PurchaseOrderShipment")] HttpRequest req,
    ILogger log)
    {
        log.LogInformation("Getting list of Purchase Orders.");
        return new OkObjectResult(orders);
    }
}

public class PurchaseOrder
{
    public string VendorAccount { get; set; }

    public string PurchaseOrderID { get; set; }        

    public DateTime PurchaseOrderDate { get; set; }        
}

Add the above code in a separate function app project and then Run.

Log in to the D365 FinOps environment, pick any PO, edit it and then confirm it.

At this point, you should notice some activity in both functions’ consoles. In the first window, the message is retrieved from the topic subscription, parsed and pushed to the external API. In the second window, the external API confirms receipt of the message with limited details. As a best practice, only the required details should be pushed to external systems.

Now we can validate that the external system has added it to its in-memory list by calling a GET on the same API.
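Assuming the external API is still running on the same local port used earlier (7071), the call is simply:

GET http://localhost:7071/api/PurchaseOrderShipment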

The API returns the same PO which we confirmed in D365 FinOps.

Next, in the final part of this three-part series, we’ll create another function that triggers on the second subscription of the same topic to send an email.

How to build REST APIs with Azure functions

Web APIs have been around for quite a while and one can now develop REST APIs in almost all technology platforms, certainly including Azure. In this blog post, I am going to describe how to create serverless Azure functions that work as REST APIs, and how easy it is to create and test them locally and then publish them to an Azure subscription.

Prerequisites

  • Access to an Azure subscription with permissions to create Azure functions. The monthly Visual Studio developer subscription would work.
  • Some foundational and working knowledge of Azure Functions. If you are new to this, please refer to the following Microsoft documentation for a quick start. https://docs.microsoft.com/en-us/azure/azure-functions/
  • Visual Studio (the Community edition or Visual Studio Code would work)

Scenario and high level

We are going to assume a scenario where we need an API to create and retrieve customers in an online CRM database. For the sake of time and simplicity, and because the purpose of this post is to build REST APIs via function apps, I am going to keep an in-memory list of customers in my Azure function. We’ll configure routes for POST and GET API calls.

Create a function app in Visual studio

I am using the VS 2019 Enterprise edition. Click File -> New -> Project, select Azure Functions, and then click Next.

Give it a name and Create.

Select HTTP Trigger. Also provide an Azure storage account. If you don’t have one, you can create it through the Azure portal.

Click Create. This is the main function which will receive HTTP requests, process them and respond. If you are familiar with the MVC architectural pattern, this is your controller.

Before we dig into the code to make it work like an API, let’s create a model class. I’ll call it Customer. It is quite self-explanatory. I purposely didn’t allow setters for CustomerID and CreatedDateTime, as I want the business logic to initialize them.

public class Customer
{
    public string CustomerID { get; } = Guid.NewGuid().ToString("n");
    public DateTime CreatedDateTime { get; } = DateTime.UtcNow;
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

Add one more model class, which is a stripped-down version of the Customer class. This is used to parse the message we receive through the API POST call. The API only requires these three attributes of the customer being created.

public class CustomerCreate
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

Now change the main controller function as depicted here. I have highlighted some of the changes so you can look at them closely and understand them. We add the customer received to a static list, configure the API route for this function and allow only POST calls. I also changed the authorization level to Anonymous so that we can test it quickly; of course, in the real world you don’t want to do that.

POST /api/customer

Now add one more method for the GET call. This will return all the customers added to the list.

GET /api/customer

We should add one more function to get a specific customer by customer ID, so we can have a call like

GET /api/customer/{customerId}
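The full code is shown as screenshots in the original post; below is a minimal sketch of what those three functions could look like, using the Customer and CustomerCreate models above. The routes match the calls listed above, but the function names are illustrative, so treat it as a sketch rather than the exact original code.

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class CustomerApi
{
    // In-memory store - fine for a demo, not for production.
    private static readonly List<Customer> customers = new List<Customer>();

    [FunctionName("CreateCustomer")]
    public static async Task<IActionResult> CreateCustomer(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "customer")] HttpRequest req,
        ILogger log)
    {
        // Parse the POST body into the stripped-down CustomerCreate model.
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        var input = JsonConvert.DeserializeObject<CustomerCreate>(requestBody);

        // CustomerID and CreatedDateTime are initialized by the model itself.
        var customer = new Customer
        {
            FirstName = input.FirstName,
            LastName = input.LastName,
            Email = input.Email
        };
        customers.Add(customer);

        return new OkObjectResult(customer);
    }

    [FunctionName("GetCustomers")]
    public static IActionResult GetCustomers(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "customer")] HttpRequest req,
        ILogger log)
    {
        // Return every customer added so far.
        return new OkObjectResult(customers);
    }

    [FunctionName("GetCustomerById")]
    public static IActionResult GetCustomerById(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "customer/{customerId}")] HttpRequest req,
        string customerId,
        ILogger log)
    {
        // Look up a single customer by the route parameter.
        var customer = customers.FirstOrDefault(c => c.CustomerID == customerId);
        if (customer == null)
        {
            return new NotFoundResult();
        }
        return new OkObjectResult(customer);
    }
}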

Test the API

Although you can publish the function app to your Azure subscription with a consumption-based plan, here we’ll run it on localhost and test it through Postman. This allows developers to develop and test on their local dev environment.

Postman is a free platform for API development which allows you to test and play with any Web API. Download it if you haven’t already.

Let’s run the Visual Studio project; the runtime will publish the REST APIs on localhost at some arbitrary port.

Now run Postman and call the POST method first to add a customer record. Make sure you get a 200 OK status returned.
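If you want something to paste into the Postman body (raw, JSON), a request along these lines works against the CustomerCreate model; the values here are made up, and the port will be whatever the local runtime printed when it started:

POST http://localhost:7071/api/customer

    {
      "FirstName": "John",
      "LastName": "Doe",
      "Email": "john.doe@contoso.com"
    }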

Now do a GET to retrieve all customers. Since I added a couple more customers, I see three records returned in JSON format. Copy any Customer ID for the next step.

Now make a GET call with the copied Customer ID to retrieve only a specific record.

When you are done with your API development and ready to move to testing or the next phase, you can publish it right from Visual Studio to your Azure subscription.

Create a consumption plan if you don’t have one already.

Provide storage account and other details and create.

Now publish it to the portal.

Once it’s published successfully, you will be able to see and manage this function app in the Azure portal.

I hope you found it helpful.

For more amazing and helpful content, please subscribe to this blog.

Part 1 – Configure Business events with Service Bus Topics

In this blog post, I’ll configure business events in D365FO with Azure Service Bus topics.

Configure Azure Service Bus

(Screenshot: the Azure portal home page with Service Bus listed under Azure services.)
  • Select Service Bus, or click ‘Create a resource’ and search for Service Bus.
  • For Service Bus, you need to create a service bus namespace, in which you can add queues and topics and manage other relevant resources. After providing the details below, click Create.

Name -> Some meaningful unique name

Pricing tier -> Standard (minimum required for topics support) 

Subscription -> select the subscription you want to utilize here.

Location -> Your closest Azure region

  • Now in service bus namespace, add a topic.
  • Provide a meaningful name for the topic, leave the default values and create it. I named it ConfirmedPOs. Once it’s created, click +Subscription.
  • Add two subscriptions. Set the values as shown and for the purpose of this tutorial, leave all other values as default.
    • ConfirmedPOsExternalAPI
    • ConfirmedPOsTeams
  • On the Service Bus namespace, go to Shared access policies -> RootManageSharedAccessKey, copy the Primary connection string and note it down somewhere.
  • Now create an Azure Key Vault. Go to portal home page -> Create a resource -> Key Vault. Provide meaningful name, select your region, subscription and resource group and then hit Review + Create. Hit Create on the next screen.
  • Once the key vault’s deployment is finished Go to the resource, and add a secret.
  • On the Create a secret screen, provide a name and paste the connection string value that you copied in the earlier step. Click Create. Note down the secret name, which you’ll use while configuring business events later.
  • The Azure application registered for service-to-service authentication must also be added to the Key Vault under access policies.
  • Click on + Add access policy and provide the details. Make sure to select the AAD application and click Add. Once you are back on the Access policies screen, click Save.
  • Now go back to the overview page of the key vault and make a note of the DNS name.

Configure AAD app

  • Now configure an AAD app for service-to-service authentication. Azure portal home page -> Azure Active Directory -> App registrations, then click New registration.
  • On the next screen, give it a name and click Register to create the app.
  • Next, on the app, go to Certificates & secrets -> + New client secret, give it a description and click Add.
  • Now copy the secret value and keep it somewhere safe. You won’t be able to access it again once you exit this screen.

Configure Business Events

  • Now in FinOps, navigate to System administration -> Setup -> Business events -> Business events catalog.
  • On the Endpoints tab, click New. Leave the default endpoint type = Azure Service Bus topic and click Next.
  • Add the details gathered in the earlier steps and click OK.
  • Now activate the business event.
  • Leave the Legal entity blank so that it can capture events from all companies. Press OK. Now the event is activated.
  • Now confirm a purchase order. Navigate to Accounts payable -> All purchase orders. Select any approved purchase order and click Purchase -> Confirm.
  • Now you can use Service Bus Explorer to receive and visualize the JSON messages arriving in the topic. If you haven’t used Service Bus Explorer before, here is the GitHub link with documentation.

https://github.com/paolosalvatori/ServiceBusExplorer/

  • In Service Bus Explorer, connect to your service bus and you’ll see the topic with two subscriptions. Right-click on any subscription to receive messages from the DLQ (Dead Letter Queue). The message will be received in the DLQ by default. We’ll discuss this in detail in subsequent parts of this series.
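If you would rather peek at the messages from code instead of (or in addition to) Service Bus Explorer, a minimal console sketch like the one below works. It assumes the Azure.Messaging.ServiceBus NuGet package and the namespace connection string you noted earlier; the topic and subscription names match the ones created above.

using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class PeekConfirmedPOs
{
    static async Task Main()
    {
        // Connection string copied from the namespace's shared access policy.
        var connectionString = "<your Service Bus namespace connection string>";

        await using var client = new ServiceBusClient(connectionString);

        // Peek (non-destructively) at the next message on one of the two subscriptions.
        ServiceBusReceiver receiver = client.CreateReceiver("confirmedpos", "ConfirmedPOsExternalAPI");
        ServiceBusReceivedMessage message = await receiver.PeekMessageAsync();

        Console.WriteLine(message?.Body.ToString() ?? "No messages on the subscription.");
    }
}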

Now you have successfully configured business events with an Azure Service Bus topic. In the next part, we’ll build an Azure function to consume and process this message.

Next in this three-part series …

Part 2 – Build an Azure function for a topic subscription to update a third-party system via an API call

Avoid the point-to-point integration nightmare by using business events and an Azure-based middle layer

Integrating and communicating through standardized data sets with multiple systems is an indispensable requirement in every ERP project. No business can rely on a single system or service to meet all business needs. Traditionally, as the requirement arises to use an additional piece of software or a service provider, an integration is built at that time to fulfill it. Although this is looked at from an overall enterprise architecture standpoint, it is often missed from an integration (or technical, I should say) architecture perspective. This results in many point-to-point integrations, causing a lot of churn and a maintenance nightmare down the road. Often this could be avoided by utilizing a robust middle layer and following integration best practices.

In this upcoming series of posts, I am going to assume a scenario and build an integration utilizing an Azure-based middle layer to demonstrate the power of Azure Functions and Service Bus, and how they can work together with FinOps to produce highly scalable and robust integrations while avoiding point-to-point touchpoints.

Scenario

Contoso Entertainment Systems has a unique requirement: they want to update a third-party service provider about a confirmed purchase order in real time through its web API. At the same time, they want an email message sent to a team alias.

High level approach/design

  • A purchase order is confirmed, which registers a business event.
  • A message is dropped on a Service Bus topic with multiple subscriptions.
  • Azure functions are auto-triggered and process the message as soon as it arrives in the topic subscription.

We’ll be building it in three parts. 

Part 1 – Configure Business events with Service Bus Topics

Part 2 – Build an Azure function for a topic subscription to update a third-party system via an API call

Part 3 – Build an Azure function for a topic subscription to send an Outlook email

I’ll continue to post these parts and update this post with the links in the coming days. Feel free to subscribe to my blog to get first-hand updates.

Recurring Outbound Integrations using DMF and Azure Logic Apps

In this article, I am going to list the steps to create an Azure Logic App based recurring integration for bulk data export. If Logic Apps are new to you, don’t worry; you can follow the steps, build your logic app quite easily, and learn through the process. For more details about Logic Apps, check this link.

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-overview

We are going to work on a fictitious scenario.

Scenario
Contoso Consulting USA needs to provide a weekly expense report of all of its employees’ travel expenses to their payroll service provider. The payroll service provider can accept information in a variety of formats, including CSV files. They can grab the file from any blob storage or a shared OneDrive. They would prefer an incremental export, meaning they do support upserts.

High level steps

  • Create a DMF (Data Management Framework) export project.
  • Create and schedule a logic app that triggers every week on a defined time.
    • Add trigger to execute every Monday night.
    • Export a CSV file or a data package.
    • Get the URL of the exported file (blob)
    • Make sure the export is successful.
    • Upload the CSV file or data package to shared OneDrive for business location.
    • Send a notification email.

Pre-Req

  • A valid Azure subscription with permissions to create logic apps and access blob storage. A Visual Studio dev subscription (150 USD monthly credit) would work. Check your country-specific billing/limits if you are outside the US.
  • System admin access to a FinOps environment. Both FinOps and Azure should be on the same tenant.
  • A OneDrive for Business account. Your organizational Microsoft Office credentials come with a OneDrive account which you can use.

Configure Data export project in D365 FinOps
1. In D365FO, navigate to System administration -> Data management workspace.
2. Click on the Data entities tile and look for the entity ‘Expenses’. Select the entity and enable change tracking for the entire entity. This is required for incremental export.

Image1

3. Click on Export tile.

Image2

4. Provide some meaningful Group name and Description.

Image3

5. Click on the Add entity button. Select the Expense entity and provide the values shown below. The Default refresh type should be set to ‘Incremental push only’ if you want to generate an incremental export. Keep in mind that the first export will always be a full push, meaning all the data will be exported. Click Add to add it to the project.

Image4

Develop and Schedule Logic App
6. Log in to your Azure portal. https://portal.azure.com
7. Click on Create a resource and select Integration -> Logic App

Image5

8. On the Create screen, provide the required details. You can utilize an existing resource group to manage resources and cost under an existing plan, or create a new one. Click Review + Create and then Create to finish the deployment of the Logic App.

Image6

9. Once your deployment is finished, click Go to resource to access logic app web designer.

Image7

10. On the Logic App designer, click on Recurrence to add the trigger of the app.
Image8
A logic app consists of a trigger and some actions. This is the final view of the logic app when it’s built.

Image9

11. Recurrence – This is the trigger where you set up when and how frequently you want the logic app to execute. Essentially this starts the logic app, which will orchestrate the data file generation and movement. Click Add parameters to add the time zone, hours and specific days.
12. Initialize DMFExecutionID – Add the next action and search for ‘Initialize variable’. Here we are initializing a string-type variable (a GUID) to use as a unique batch job execution ID. Once the action is added, click on the Value textbox, click Expression in the sidebar, look for guid() in the fx box and click OK.
13. Export project – This is a control action that you can add to provide a scope for some actions to execute together. Basically it’s a way of grouping actions in a single scope. Once you add the control, you can rename it to anything; I named it ‘Export project’. Search for the Scope action to find it.

Image10

14. Now in ‘Export Project’, click Add an action to add some nested actions under it.

Image11

15. Now search for ‘Until’ and add it. This is a loop control. Give it a meaningful name like ‘Until ExportToPackage Succeeded’

Image12

16. Now before you add a value to control the loop, add an action.

Image13

17. Next, search for ‘dynamics 365 for finance’ in Choose an action. Select ‘Execute action’

Image14

18. Next, you’ll have to sign in to your tenant.
19. Once you have signed in, you can select your FinOps environment from the drop-down. Also select the Action as shown below. This is an OOTB OData action that will execute the DMF export project we created earlier. Rename the Execute action to ‘ExportPackage’.

Image15

20. Now click on ‘Choose a value’ text field and select ‘OutputParameters’ is not equal to 00000000-0000-0000-0000-000000000000.

Image16

21. Add a few parameters (a sketch of the equivalent raw OData call is shown after the screenshot below):
executionId – the variable initialized above.
definitionGroupId = ‘Expense export’ (this is the name of the export project created in FinOps).
reExecute = No
legalEntityId = USSI (select whatever legal entity you want to export from; I chose USSI from the demo data since it already has expense reports).

Image17
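For context, the Execute action above essentially wraps the DMF ExportToPackage OData action. If you ever need to kick off the same export outside of Logic Apps, a rough C# sketch follows. Treat it as an illustration: the environment URL is a placeholder, the bearer token is assumed to be acquired separately via AAD, and the parameter values mirror the ones entered in the designer.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class DmfExportSketch
{
    static async Task Main()
    {
        var environmentUrl = "https://yourenvironment.cloudax.dynamics.com"; // placeholder
        var accessToken = "<AAD bearer token for the environment>";          // acquired separately

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // Same values the Logic App 'Execute action' passes to ExportToPackage.
        var body = JsonSerializer.Serialize(new
        {
            definitionGroupId = "Expense export",
            packageName = "ExpenseExport.zip",
            executionId = Guid.NewGuid().ToString(),
            reExecute = false,
            legalEntityId = "USSI"
        });

        var response = await client.PostAsync(
            environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}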

22. In the same ‘Export project’ scope, add a condition to validate whether the package was exported. It should be under the Control group of actions.

Image18

23. Now define the condition: if the value is equal to DMFExecutionID. The Value output parameter will contain the same execution ID that we provided while calling the Execute action. This is basically the job ID that DMF maintains in the job history.

Image19

24. For the false branch, it means the package couldn’t be exported, so we can send an email to an alias or an admin to notify them that the logic app didn’t finish successfully. Add an Office 365 Outlook > Send an email (V2) action.

Image20

25. You need to sign in first. Provide relevant details in Send email action.

Image21

26. For successful completion of the package export, you need to add quite a few actions to move the package from the D365 FinOps Azure blob location to OneDrive. You first need to grab the URL of the blob where the exported file is stored.

Image22

27. Add a condition after the Until loop to check whether the package URL is valid. You’ll select the value from the previous Execute action that fetched the exported file URL.

Image23

28. If false, repeat steps 24 and 25 to send an email.
29. If true, first add an HTTP GET action to retrieve the content of the file from the blob. Search for HTTP in the actions.

Image24

30. Set the Method to GET and the URI to the output value of the Execute action that fetched the exported file URL.

Image25

31. Add an action to create a file on OneDrive. Look for OneDrive in the action search box. Select OneDrive for Business and then select the Create file action.

Image26

32. Sign in to your OneDrive. You can use your organization’s Office 365 sign-in details or the customer org’s.
33. Once signed in, provide the following details.
a. Folder path = / (you can create and designate a folder on OneDrive and add that path here as well)
b. File Name = Expenses.csv (you can be creative and use Logic App expressions and built-in date and string functions to dynamically generate the file name; see the expression example after this list)
c. File Content = here you need to provide the Body of the HTTP action from the earlier step.
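For item b above, one possible expression for dynamically generating the file name (using the built-in concat, formatDateTime and utcNow functions) would be something like:

concat('Expenses_', formatDateTime(utcNow(), 'yyyy-MM-dd'), '.csv')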

Image27

34. Now you are done with designing your Logic App. Save it once again and correct any errors you find.
35. Close the designer and manually run the trigger.

Image28

36. Click refresh to check the Run history.
37. If it succeeded, your logic app is working. If it didn’t, drill into the run history to find the step where it failed. The failing step itself provides details of the failure. Fix the issue and try again.

Image29

38. Now navigate to your OneDrive to find the generated file.

Image30

If you want to grab the same file and see the run history in FinOps, follow the steps below.

1. Click on the job history and then click on the job. Notice this Job ID is the same as the executionID initialized in the Logic App.

Image31

2. Now click on Download package and open the package to extract the CSV file. You can get the same file from DMF as well.

Image32

How to setup claim based site on EP in AX2012

Alright. In the last post, I jotted down the steps to set up a public portal on EP. In this post, I’ll extend it for those who want to set up a claims-aware (or secure, if you will) site on top of that. The process of setting up a claims-based site on Enterprise Portal is specific to AX 2012 R2/R3. At the time of writing, the process is manual for the most part. We are looking at some ways to automate it (or some part of it) in the near future.

Pre-Requisites and assumptions:

  1. The SharePoint foundation 2010 or 2013 has already been installed and configured on the box.
  2. The regular EP site must be installed and deployed. (refer to TechNet article for EP installation)
  3. Admin access to Azure subscription.
  4. Live/Google/Yahoo account for logging in as a claim-user.
  5. This document uses the localhost address, and therefore the same machine must act as both server and client

Claim-aware EP sites 

Claims-aware or ACS-enabled EP sites are secured (HTTPS) SharePoint applications which registered claims-based AX users can access outside of the domain environment. This provides another way of authenticating to AX besides the traditional Active Directory based domain users. If properly configured in conjunction with Windows Azure services, an organization implementing AX can offload the authentication mechanism to third-party identity providers such as Live ID, Google and Yahoo.

Getting windows Azure subscription

In order to set up ACS, you need an Azure subscription (note that NOT all teams have access to Azure subscriptions). You might have to get your request approved by somebody in your line of managers. You might need a Windows Live ID to get the subscription. Please check with your team and policies.

Setup ACS site on Azure management portal

Once you are provisioned as an admin, you can access the Azure control panel from this link: https://manage.windowsazure.com

    1. Click Active Directory on the left panel and then select ‘Access control namespace’ in the main header.
    2. If you haven’t already, create a new namespace for your service. This can be shared with other services that you may want to use in Azure (Service Bus, Caching). To create one, click ‘+New’ and then follow App Services -> Active Directory -> Access Control -> Quick create. Your ACS URI will look like https://<acs_namespace>.accesscontrol.windows.net
    3. Once you’ve created a namespace, select the ACS and then click Manage. NOTE: only the Admin of the Azure subscription will be able to access the ACS Management Portal. This person then needs to give access to other co-Admins as Portal administrators for them to be able to access ACS Access Control. Instructions are available here: http://msdn.microsoft.com/en-us/library/windowsazure/gg185956.aspx
    4. We need to setup Identity Providers that we are going to allow. This can be done by clicking Identity providers on the left hand side inside ACS Management Portal.
    5. At the time of writing, the following identity providers are supported and can be added in the new namespace.
        • Google
        • Yahoo
        • Facebook applications
          • FB requires specific Facebook application setup. Please consult the MS link here to setup Facebook as ACS identity provider.
          • Once you are finished with the steps mentioned in the above link, do the following two settings on the Facebook app
            • In the App domain – Add “Windows.net” so you can include any of your relying app site
            • In the Canvas URL – Add AD namespace URL
    6. Next add a Relying party application: select “Relying party applications” from the left side of the ACS Management Portal.
        • Select Add
        • Enter a meaningful name for the relying party (to be used internally in ACS Management Portal). For a regular secure site, it can be named as urn:<host_name>:AzureACS
        • Type in a Realm (where the authentication request will come from). For a regular secure site, it should be urn:<host_name>:AzureACS
        • Type in the return URL (where ACS will redirect the web browser after a successful authentication). For a regular secure site, it should be https://<host_name>:<acs_port>/_trust
          Note: At this point the claim-aware site is not created. Use any port number like 5000 and then use the same port number in section 3.3 while creating the secure site on the host machine
        • (Optional) Type in an Error URL to redirect the user to in case of an unexpected exception. This is a good practice.
        • Token format : For AX 2012: SAML1.1
        • Token encryption policy : None
        • Token lifetime (secs) enter some large number so ACS token doesn’t expire too often, e.g. 24hrs = 86400 seconds
        • Select the Identity providers you would like to enable this application to get authentication from.
        • Rule group: select to create a new rule group.
        • Click Save.
    7. Next, configure the Rule Group just created in the above step. It should be under Rule Groups and named “Default Rule Group for <your_application_name>”
      1. Click Add
      2. On the next page, select the Identity provider that you would like configured, for example “LiveID” or “Yahoo!” or all of those.
      3. Input claim type: leave the default value of ‘Any’
      4. Input claim value: ‘Any’
      5. Output claim type: select “Pass through first input claim type”
      6. Output claim value: select “Pass through first input claim value”
      7. Click Save
    8. Now create and upload the custom token signing certificate to the ACS site
      1. Open up Server Manager and navigate to Web Server (IIS) node and select Server Certificates
      2. Open the “Server Certificates” feature and on the right panel, click “Create Self-Signed Certificate…”
      3. Specify a friendly name for the certificate and click OK
      4. Now right click that certificate on the middle panel and select “Export…”
      5. Set a path to export the file to <path_to_acs_signing_cert> and specify a password.
      6. Open up MMC.exe, File -> Add/Remove Snap-in…  and Add> Certificates node and select “Computer account” \ “Local Computer” and click Finish.
      7. Now expand Certificates (Local Computer)\Trusted Root Certification Authorities\Certificates\ and locate the certificate you just created, right click, All Tasks\Export…
      8. In the Certificate Export Wizard, select “No, do not export the private key” and use all default settings to export the certificate as .cer file. Note down the path.
      9. At this point, you should have both <acs_signing_cert>.pfx and <acs_signing_cert>.cer files.
      10. Now go back to ACS Management Portal, and on the left hand side, click “Certificate and keys”
      11. Click “Add” above the “Token Signing” section, select the relying party created in step 6 above, browse to the <acs_signing_cert>.pfx, type in the password, and click “Save”. Note: The above steps are taken/derived from the AX foundation team link here

Creating claim-aware site

  1. Create and export a self-signed certificate to enable secure browser-based communication. Steps to create the SSLCert are here.
  2. Now import the SSLCert certificate created in the above step. Following are the steps to import it.
    1. On the Windows server that will host the forms-based Enterprise Portal site, click Start > Run, type mmc, and then click OK.
    2. Click File > Add/remove snap-in.
    3. Click Certificates, and then click Add.
    4. When the system prompts you to specify which type of account to manage certificates for, click Computer Account, and then click Next.
    5. Click Local computer, and then click Finish.
    6. In the Add or Remove Snap-ins dialog box, click OK.
    7. In the MMC snap-in, click the Certificates (Local Computer) node.
    8. Right-click Personal, and then click All tasks > Import. The Certificate Import Wizard opens. Click Next.
    9. Browse to the certificate, and then click Next.
    10. Enter the password for the certificate, and then click Next.
    11. Select the Mark this key as exportable option, and then click Next. The Certificate Store dialog box appears. Click Next.
  3. Now run through the following steps to create a claim-aware site on a new SharePoint application.
    1. Open the Microsoft Dynamics AX 2012 Management Shell with administrator privileges. Click Start > Administrative Tools > right-click Microsoft Dynamics AX 2012 Management Shell and click Run as administrator.
    2. Enter the following command and press Enter.
      • $Cred=Get-Credential
    3. When prompted, enter the credentials for the .NET Business Connector proxy account. The credentials must be the .NET Business Connector proxy account and password that were specified when Enterprise Portal binaries were installed earlier in this document. If you specify an account other than the .NET Business Connector proxy account, then the cmdlet overwrites the existing .NET Business Connector account, which can cause existing Enterprise Portal installations to stop working. Also note, this cmdlet designates the .NET Business Connector proxy account as the Enterprise Portal site administrator.
    4. Execute the following command, replacing “<PathToSSLCert>” with the path to the SSLCert .pfx file created in step 1: $SSLCert = Get-PfxCertificate “<PathToSSLCert>”
    5. When prompted, enter the password that you specified when you exported the SSL certificate.
    6. On the Enterprise Portal server, execute the New-AXClaimsAwareEnterprisePortalServer cmdlet. For descriptions of the required parameters and syntax, see New-AXClaimsAwareEnterprisePortalServer on TechNet. The following example shows the cmdlet with the required parameters. Note that the port value of 5000 is a user-defined value. You can specify any available port number. If you specify port 443, then you do not need to specify the port number when you type the web site URL.

      New-AXClaimsAwareEnterprisePortalServer -Credential $Cred -Port 5000 -SSLCertificate $SSLCert

    7. This cmdlet can take several minutes to complete. After the cmdlet completes, you can access a new instance of Enterprise Portal at the following URL: https://<host_name>:<acs_port>/sites/DynamicsAx
    8. In a production environment, a self-signed certificate wouldn’t be used. Since we used a self-signed certificate here, you’ll get a certificate error in your browser. Continue with the ‘Not recommended’ option and it will show you the site.
    9. You can also double-check that the new site has been created properly by navigating to System administration -> Setup -> Enterprise portal -> Web sites

Configure claim-aware site

Pre-Requisites:

  • Claims-aware EP site is deployed successfully on the AX box  ==> https://<host_name>:<acs_port>/Sites/DynamicsAx  (Section 3.3)
  • Access to ACS Management Portal ==> https://<acs_namespace>.accesscontrol.windows.net/v2/wsfederation  (Section 3.2)
  • A certificate file (without the private key) of the signing certificate that was uploaded to ACS (.cer file) ==> <ACS_signing_cert>

Steps

  1. Open up the SharePoint 2010 Management Shell and execute the following three commands one by one to establish the claims mappings

$claim1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier" -IncomingClaimTypeDisplayName "ACS Name Identifier Claim" -LocalClaimType "http://schemas.microsoft.com/custom/claim/type/2013/07/acs-nameidentifier"

$claim2 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider" -IncomingClaimTypeDisplayName "ACS Identity Provider" -LocalClaimType "http://schemas.microsoft.com/custom/claim/type/2013/07/acs-identityprovider"

$claim3 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" -IncomingClaimTypeDisplayName "ACS username" -LocalClaimType "http://schemas.microsoft.com/custom/claim/type/2013/07/acs-username"

Now register the Token.

          $acscert = Get-PfxCertificate <ACS_signing_cert>

          New-SPTrustedIdentityTokenIssuer -Name <name_of_your_SP_Trusted_Identity_Provider> -Description <description_of_your_SP_Trusted_Identity_Provider>
         -Realm <realm_of_your_SP_trusted_identity_provider> -ImportTrustCertificate $acscert -SignInUrl "https://<acs_namespace>.accesscontrol.windows.net/v2/wsfederation"
         -ClaimsMappings $claim1,$claim2,$claim3 -IdentifierClaim $claim1.InputClaimType

In this example, here are the values used:

<host_name> The name of your server box.
<acs_port> 5000
<ACS_signing_cert> .cer file created above
<name_of_your_SP_Trusted_Identity_Provider> “AzureACS”
<description_of_your_SP_Trusted_Identity_Provider> “Azure ACS”
<realm_of_your_SP_trusted_identity_provider> urn:<host_name>:AzureACS
<acs_namespace> <Namespace created above>
After execution, do not close the SharePoint Management Shell window yet.
  • Import <ACS_signing_cert> as a trusted root certificate into the computer account through MMC.exe
  • Now import <ACS_signing_cert> as a trusted root certificate in SharePoint by going back to the SharePoint Management Shell window from step 1 and running the following:

    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($acscert)

    $spcert = New-SPTrustedRootAuthority -Certificate $cert -Name "ACSTokenSigningCert"

  • To set up the claims on the EP Authentication Provider through the SharePoint Central Administration site, add “Azure ACS” to your claims-aware EP site. To do this, navigate to SharePoint Central Administration -> Manage web applications and select the claims-aware application you created in section 3.3. Now click on “Authentication providers” on the ribbon bar and click Default. Make sure ‘Azure ACS’ is checked as shown in the following screenshot.
  • Set up Security -> Users -> Specify web application user policy, Add Users: All Users\All Users (AzureACS) and give Full Read access.
  • Now you are able to sign in to the secure site (https://<host_name>:<acs_port>/Sites/DynamicsAx) using the Windows Live ID credentials from step 6:

How to setup a Public site on Enterprise portal in AX2012

The public site is a custom site collection deployed on top of the regular SharePoint application (SharePoint-80) that we configure while installing EP on AX 2012. This post lists the steps to set up a public site.

Pre-Requisites:

  1. The SharePoint foundation 2010 or 2013 has already been installed and configured on the box.
  2. The regular EP site must be installed and deployed. (refer to TechNet article for EP installation)

Public Site Creation in SharePoint 2010/2013

  1. Launch SharePoint Central Administration
  2. Under Application Management, click Manage web applications
  3. Select the Web Application hosting the public site. It is SharePoint – 80 in our case.
  4. Click Authentication Providers button from the ribbon
  5. Click the Default Zone
  6. Under Anonymous Access, make sure Enable anonymous access check box is unchecked (required for EP Public Site Creation)
  7. Click Save
  8. Under Application Management, click Create Site Collections
  9. On the following page, enter page Title as EP Public Site (or any meaningful name)
  10. Set Web Site Address URL as Public (or anything meaningful)
  11. SP2013 ONLY: Select 2010 from ‘Select experience version’ drop down
  12. Select Microsoft Dynamics Public template under Custom tab
  13. Set Primary Site Collection Administrator as your alias
  14. Press OK

After the site has successfully been created, click on the site URL to verify that you can access it

Configure IIS

  1. Open Internet Information Services (IIS) Manager (Start > Administrative Tools > Internet Information Services Manager).
  2. Select the Web Site SharePoint-80
  3. Double click on Authentication
  4. Enable Anonymous Authentication (if not already)
  5. Enable Windows Authentication (if not already)

Enable Guest Account in AX

  1. From the Microsoft Dynamics AX client, click System Administration > Common -> Users -> Users.
  2. Double click the Guest User
  3. Click on Edit
  4. Select the Enabled check box.
  5. Assign Guest user to Vendor anonymous (external) role. (note: needed to expose vendor portal on Public site, you can add more external roles as needed)

Configuring SharePoint

  1. Launch SharePoint Central Administration
  2. Under Application Management, click Manage web applications
  3. Select Web Application hosting the public site. It is SharePoint – 80 in our case.
  4. Click Authentication Providers button from the ribbon
  5. Click the Default Zone
  6. Under Anonymous Access, check Enable anonymous access (if not already)
  7. Click Save
  8. Now go to the EP Public site http://<host_name>/sites/Public
  9. Click Sign In to login using system account
  10. Navigate to Menu Site Actions > Site Permission
  11. From the Ribbon click Anonymous Access button
  12. In the popup dialog, select Entire Web site and click OK
  13. Launch SharePoint Central Administration
  14. Click Application Management > Web Applications > Configure alternate Access Mapping
  15. Click Add Internal URLs
  16. Choose Alternate Access Mapping collection to be the Web application Sharepoint-80
  17. Type http://localhost as the internal URL and Zone =Default.
  18. Click Save

Setup Internet Explorer for anonymous access

To ensure that you are accessing the page in anonymous access mode and not automatically logged in using Windows authentication:

  1. In Internet Explorer, Go to Tools > Internet Options > Security tab
  2. Select the Trusted zone
  3. Click on Custom Level
  4. Scroll to the bottom, under User Authentication > Login
  5. Select Anonymous logon.
  6. Click OK 

Now you should be able to access your public site at the URL  http://<host_name>/sites/Public


How to launch the internet browser with a data field bound to URL data?

While working on a project, I figured out a way in AX to launch your system’s default internet browser from within AX. Many of you have already seen this while playing with SSRS or EP, when you click a ‘view in browser’ kind of button.

It’s really simple. For the sake of example, suppose we have a stringEdit control (named URL) and a normal button. There is a method urlLookup() on the global infolog instance that, when executed, opens the default browser with the website link provided as the parameter.

Write the following code in the click event of the button ‘View in browser’.

void clicked()
{
    infolog.urlLookup(URL.text());
}

Alternatively, if you have a data-bound field, you can extend it from the WebsiteURL EDT to get a small web lookup icon on the form control that takes you to the browser. No coding needed.