Hooking up to SaaS solutions using the SaaS connectors

A key feature of the SaaS connectors is that they greatly simplify connecting to a SaaS solution.

Consider what would be required to connect to a cloud-based solution without an accelerator such as a Logic App SaaS connector. With Dynamics 365 in the cloud, for example, custom code would be required to manage the process of obtaining an OAuth 2.0 token, caching the token for maximum performance, and managing the steps of obtaining a new token when it expires. This is summarized in the following sequence diagram:
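
To give a feel for the first of these steps, the following is a sketch of the kind of response such custom code would need to parse and cache after POSTing credentials to the Azure AD OAuth 2.0 token endpoint. The resource URL and token values shown are illustrative placeholders, and the exact fields returned vary by endpoint version:

```json
{
  "token_type": "Bearer",
  "expires_in": "3599",
  "resource": "https://contoso.crm.dynamics.com",
  "access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1...",
  "refresh_token": "AQABAAAAAABnfiG-mA6NTae7CdWW7Qfd..."
}
```

The custom code would need to track expires_in and use the refresh_token to obtain a new access token when the current one expires, which is exactly the plumbing the managed connectors take care of.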

An alternative to writing custom code to support the interactions mentioned earlier is to let Microsoft do the heavy lifting and manage this process instead.

Office 365 follows the same approach, and once authenticated with both services, Sunny Electricals can start building a process that ties together these two popular Microsoft SaaS solutions. This will form the basis for our first scenario.

Working with Dynamics 365 and Office 365

In our scenario, we will create a contact, send an e-mail, and create a calendar reminder for when a Lead is added to Dynamics 365.

The purpose of this process is to give some insight into how quickly a workflow can be created with SaaS solutions, along with providing a good overview of a potential sales workflow in which any new leads are followed up by a sales team.

When working with Dynamics 365 and Office 365, it is important to understand how and where user authentication takes place.

When a new Dynamics 365 and Office 365 instance is created, a new Azure Active Directory tenant is also created. This directory and the instances that use it are part of the appeal of using a SaaS solution.

With a traditional application, each customer would be provisioned with its own instance of the application and directory. This makes maintenance and updates more onerous, since they have to be applied to many instances of the application to ensure that every instance is on the same version. If this rolling update is not performed, the vendor is left supporting many versions of a running application, which can be expensive and difficult to control.

However, each instance of the SaaS solution has its own Azure Active Directory tenant for security and content while leveraging a common infrastructure and application solution in Dynamics 365 and Office 365. In this way, it is possible to update and improve the common infrastructure and applications for all customers, while ensuring that their security and content remains their responsibility.

This is one of the key benefits of Software as a Service (SaaS).

After creating a new Logic App, we need to create the workflow that is going to follow the requirements of our process.

In our scenario, first, we create a connection using the Microsoft-managed connector for Dynamics 365:

  1. We choose the Microsoft-managed connector for Dynamics 365 - when a record is created.

  2. We need to authenticate to our Dynamics 365 Active Directory tenant by providing a user and password that has the necessary access rights to create and maintain Leads.

  3. Once authenticated, we choose the Organization we want to monitor, the Entity Name we wish to monitor (in our case, Leads), and the frequency we want to use to check for new leads.

    Note

    Dynamics 365 is polled by the connector and any new leads found initiate a new instance of the Logic App.


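Behind the designer, the steps above produce a polling trigger in the Logic App's JSON definition. The following is a rough sketch of its shape, assuming a hypothetical connection name, dataset path, and a 3-minute polling interval; the exact path and property names depend on the connector version:

```json
"triggers": {
  "When_a_record_is_created": {
    "type": "ApiConnection",
    "recurrence": {
      "frequency": "Minute",
      "interval": 3
    },
    "inputs": {
      "host": {
        "connection": {
          "name": "@parameters('$connections')['dynamicscrmonline']['connectionId']"
        }
      },
      "method": "get",
      "path": "/datasets/contoso.crm/tables/leads/onnewitems"
    }
  }
}
```
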
Once we have created our connection to Dynamics 365, we can continue to build the workflow by adding steps for each of the Office 365 Outlook tasks that need to be completed when a new lead is added.

  1. First, we choose the Microsoft-managed connector for Office 365 Outlook to create a new contact.

    Note

    If the connector does not appear in the list, click on Load More at the bottom.

  2. We need to authenticate to our Office 365 Outlook account; because this is the first time we have connected to Office 365 Outlook, we need to sign in. The authenticated user is the account in which the contact will be created.

  3. Once authenticated, we need to set the information we want to capture to ensure that the contact is created appropriately with the information brought in by the Dynamics 365 connector, for instance:
    • Folder ID
    • Given Name
    • Display Name
    • E-mail Addresses Address
    • Company Name
    • Mobile Phone

  4. Next, we choose the Microsoft-managed connector for Office 365 Outlook to send an e-mail.

  5. We are already authenticated from the previous Office 365 Outlook step, so we can either connect as the same user or create a new connection; the authenticated user is the user from which the e-mail is sent.

  6. Once authenticated, we set the necessary information to send an e-mail based on the information brought in by the Dynamics 365 or Office 365 Outlook connectors from the previous steps, for instance:
    • To - which could in this case be a distribution list
    • Subject
    • Body
    • Importance

  7. Finally, we choose the Microsoft-managed connector for Office 365 Outlook to create a new event.

  8. We choose how to authenticate this connector, with the previously authenticated user being chosen by default; the event is created in the account used to authenticate for the connector.
  9. Once authenticated, we need to set the correct information to ensure that the event is created with the information brought in by the Dynamics 365 or Office 365 Outlook connectors from the previous steps:
    • Calendar id
    • Start time
    • Subject
    • Content
    • End time


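Taken together, the three Office 365 Outlook steps appear as actions in the Logic App definition, each drawing on the output of the Dynamics 365 trigger. The following sketch illustrates their shape; the paths, distribution list address, and trigger property names (such as firstname and fullname) are illustrative assumptions rather than the exact values the designer generates:

```json
"actions": {
  "Create_contact": {
    "type": "ApiConnection",
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['office365']['connectionId']" } },
      "method": "post",
      "path": "/contacts",
      "body": {
        "GivenName": "@triggerBody()?['firstname']",
        "DisplayName": "@triggerBody()?['fullname']",
        "CompanyName": "@triggerBody()?['companyname']",
        "MobilePhone1": "@triggerBody()?['mobilephone']"
      }
    }
  },
  "Send_an_email": {
    "type": "ApiConnection",
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['office365']['connectionId']" } },
      "method": "post",
      "path": "/Mail",
      "body": {
        "To": "sales@sunnyelectricals.com",
        "Subject": "New lead: @{triggerBody()?['fullname']}",
        "Body": "A new lead has been created in Dynamics 365. Please follow up.",
        "Importance": "High"
      }
    },
    "runAfter": { "Create_contact": [ "Succeeded" ] }
  },
  "Create_event": {
    "type": "ApiConnection",
    "inputs": {
      "host": { "connection": { "name": "@parameters('$connections')['office365']['connectionId']" } },
      "method": "post",
      "path": "/calendars/Calendar/events",
      "body": {
        "Subject": "Follow-up call: @{triggerBody()?['fullname']}",
        "Start": "@{utcNow()}",
        "End": "@{addHours(utcNow(), 1)}"
      }
    },
    "runAfter": { "Send_an_email": [ "Succeeded" ] }
  }
}
```

Note how runAfter chains the actions so that each executes only once the previous step has succeeded.
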
We have now completed our simple workflow, and we can save the Logic App. Once saved, it will poll Dynamics 365 based on the interval specified in the Dynamics 365 connector to check for any new Leads.

To ensure that the workflow described by the Logic App executes correctly, we need to test it. First, we need to create a new lead in Dynamics 365.

Since the Logic App is controlled by the interval set when we created the Dynamics 365 connection, we can either wait for the interval to expire or manually start the trigger.

Once the Logic App has triggered and executed, we can examine the details of the run to ensure that it has met our requirements and completed successfully.

Note

More information on monitoring and logging information in Logic Apps is provided in a future chapter.

We can see that the three Office 365 Outlook actions have completed successfully, so we can check in the chosen Office 365 accounts to ensure that the results satisfy our expectations.

First, we can check to see whether a new contact has been created in the account chosen when we authenticated the Office 365 Outlook connector used to create the contact.

Next, we can check to see whether an e-mail has been received by the account chosen when the Office 365 Outlook connector to send an e-mail was created.

Finally, we are expecting a new event to have been created to remind us to follow up with a telephone call to the lead created in Dynamics 365.

We can see that the Logic App has completed successfully, and all tasks have executed and performed the actions required.

At this point, we do not have a production-ready solution, and in a full scenario, we would add in handling for when issues are encountered or have other flows that may branch and perform other actions.

These approaches were discussed previously in the book, but we have shown how simple and codeless it is to set up a basic integration between two different SaaS applications. We have shown that we can chain events together to build a flow that links together what would typically be separate processes, thereby reducing the effort required from our sales team.

It is now time to consider a more complex workflow that uses other SaaS products.

User authorization using the Salesforce connector

In the case of the Salesforce connector, we can select it in the Logic App designer (from the list of Microsoft-managed APIs), and from there, it is possible to authorize the connector to connect to Salesforce on behalf of the user, obtaining and managing the OAuth tokens required.

The first step is to select the Salesforce connector, as shown in the following screenshot:

Here, we can specify what should trigger the connector, based on the addition or modification of data in Salesforce.

Before the connector can be used, a username and a password must be entered into the connector. This allows the connector to act on the user's behalf, connecting to the Salesforce API under the hood. The connector will store the resulting authorization information in blob storage in Azure.

Note

Note that the connector will fail to connect to Salesforce if the password changes or expires. If this occurs, it will be necessary to re-enter credentials into the connector for the connection to be restored.

As shown in the screenshots here, the connector redirects to a Salesforce login page where the Salesforce username and password should be entered. After clicking on the Allow button, the Logic App connector will have the required information to connect to Salesforce on behalf of the user, using a token received from Salesforce as a result of this login process. The connector will neither store the username and password nor know what they are: it will just store the authorization code and token issued by Salesforce instead.

Once authentication and authorization have been set up, it is possible to modify the connection details in the connector by clicking on the Change connection link at the bottom of the Salesforce card, as highlighted here:

Salesforce connector - Under the hood

It is interesting to examine the Logic App code view after adding the connector. If we look at the JSON, we can see that the connector is actually a pointer to an endpoint hosted in Azure API Management. The URL represents the Microsoft-managed API, which in turn wraps the Salesforce API. In this way, we have a layer of abstraction that permits a common mechanism to connect to different SaaS applications (that is, via the standard Microsoft API pattern), and we also enjoy increased stability, since it is the responsibility of Microsoft to manage changes to the Salesforce API within the internal workings of the Microsoft wrapper API. Of course, the addition of a management layer must be weighed against the inevitable increase in response latency that will be incurred.

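As a sketch of what this looks like (the region and connection name here are illustrative assumptions), the host section of the connector in code view points at the Microsoft-managed APIM endpoint rather than directly at Salesforce:

```json
"host": {
  "api": {
    "runtimeUrl": "https://logic-apis-westus.azure-apim.net/apim/salesforce"
  },
  "connection": {
    "name": "@parameters('$connections')['salesforce']['connectionId']"
  }
}
```
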
Leveraging the Salesforce connector: Sunny Electricals automated credit check solution

In the case of our fictitious electrical retail company, Sunny Electricals, the management team decides to sign off allowing the sales team to manage the sales pipeline using Salesforce. The management team expects the investment in Salesforce subscriptions to be outweighed by the increased revenue the software will bring in, by better matching products to customers.

However, one aspect that remains a cause of frustration after purchasing the subscriptions is the customer onboarding process. This is relevant for the larger customers, who typically purchase items in bulk and wish to have an account with Sunny Electricals; such customers are entered into the Sunny Electricals debtor ledger, and the payment is expected on the agreed date and not necessarily on the date of purchase.

In order to have confidence that these customers are good customers (that is, they pay their invoices and pay on time!), Sunny Electricals have a credit check process to ensure that prospective customers have a good credit rating and are likely to pay their account. The credit check process occurs when a new customer is converted from a sales lead to a customer requiring an account (since they wish to make a purchase). Currently, this process is a manual one: the sales consultant completes a form, which is then couriered to a third-party company that then carries out the credit check and informs Sunny Electricals of the credit risk. If the decision is a positive one, the sales consultant may then create an account for the customer and the customer may then make purchases against the account.

This manual process is time consuming and demanding of the sales consultant's time, so the Sunny Electricals IT team is requested to build an automated credit check solution. The credit check company, Credit Checkers Limited (CCL), have an API that can be queried that will return a report for the indicated customer; the IT team decides to leverage this API in the Logic App solution, to determine if an account should be created in Salesforce for the customer (this step will also be automated using the Salesforce connector).

The first step in building the credit check solution is to create a new Logic App (using a blank template) and configure the Salesforce connector as follows:

The connector is a trigger that will fire when all the conditions specified in each field are met, at the configured polling interval. Each field is summarized as follows:

  • Object type: This is a mandatory field and is a drop-down list of all the available entities in Salesforce that can be queried and a Logic App instance triggered on. In the case of Sunny Electricals, the Leads object is selected.
  • Filter Query: As is now becoming increasingly commonplace in APIs from the major vendors, the OData (Open Data Protocol) standard is available in the connector for sorting and filtering the results from Salesforce. The filter specified will ensure that only those leads with the status Closed - Converted will be returned, to trigger an instance of the Logic App.

    Note

    Further information about the OData OASIS standard to build RESTful APIs may be found on the OData website: http://www.odata.org/.

  • Order By: It is possible to provide a sort order to the data returned by specifying an OData orderBy query here. This is applicable only if multiple records will be returned.
  • Skip Count: If multiple records are returned, this value indicates the number of records that should be skipped/ignored. The default value is 0.
  • Maximum Get Count: This specifies the maximum number of records that will be returned for those records matching the trigger criteria. The maximum number of records that may be returned is 256.
  • Frequency: This indicates the unit of time at which the connector will query the Salesforce API. The values that may be selected are as follows: Day, Hour, Minute, or Second. It is also possible to specify a custom value that must match an allowable value as specified in the Logic Apps trigger definition; this defines the following additional values that may be manually typed in: Week, Month, or Year.
  • Interval: This is an integer to specify the polling interval of the trigger.

If we take a look under the hood and view the underlying JSON Logic App workflow definition, we can see our trigger definition with all the parameters specified, as shown here:


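As an approximation of that trigger definition (the connection name and dataset path are assumptions), the OData-style parameters appear as queries on the polling trigger:

```json
"When_a_Lead_is_Closed_Converted": {
  "type": "ApiConnection",
  "recurrence": {
    "frequency": "Minute",
    "interval": 5
  },
  "inputs": {
    "host": {
      "connection": { "name": "@parameters('$connections')['salesforce']['connectionId']" }
    },
    "method": "get",
    "path": "/datasets/default/tables/Lead/onnewitems",
    "queries": {
      "$filter": "Status eq 'Closed - Converted'",
      "$orderby": "LastModifiedDate",
      "$skip": 0,
      "$top": 256
    }
  }
}
```

The $filter, $orderby, $skip, and $top queries correspond to the Filter Query, Order By, Skip Count, and Maximum Get Count fields, respectively.
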
Note that it is possible to overwrite the default trigger name with our own meaningful name, as highlighted in the screenshot, with underscores between each word; the underscores are replaced by spaces in the Logic Apps designer.

Reaching out to the credit check API

At this point, we have quite easily configured secure access to Salesforce through the Microsoft-managed API. Next, the Sunny Electricals IT team builds the next step in the workflow, which is requesting a credit check report on the prospective customer; this should be initiated after the Salesforce connector has triggered the Logic App to execute.

CCL have a credit check API configured in Azure API Management (APIM), and Sunny Electricals signs up to use the API through the APIM Developer portal, where the Sunny Electricals developers are issued with subscription keys to access the credit check API.

For a walkthrough of APIM, please see Chapter 4, What is Azure API Management?

The screenshot here shows the APIM Developer portal view:

The Accounts_GetAccounts operation enables CCL customers to search for accounts and their history, using the endpoint query parameters to specify the search filter.

In order to invoke the custom API from our credit check Logic App, a new step is selected in the designer and an HTTP call endpoint card is added and configured in the workflow as follows:

The HTTP connector allows us to call any custom API, and we don't have to write any code to do this. A summary of each field is explained here:

  • Method: This corresponds to the HTTP verb that is required when calling the URI. This is a mandatory field.
  • Uri: This specifies the URI endpoint that will be called. In this example, the APIM endpoint has been configured, with the accountName query parameter dynamically created by passing in parameters from the Salesforce connector trigger for the customer we wish to create an account for.
  • Headers: HTTP headers may be specified here. In this case, the APIM subscription key has been entered to enable authentication against the API endpoint.
  • Body: A content body may be entered here. In this example, there is no body because this is an HTTP GET request. However, this would be required for an HTTP POST request, for example.

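The resulting HTTP action might look something like the following in the workflow definition; the APIM hostname, API path, and the FirstName/LastName trigger properties are hypothetical stand-ins rather than values from the actual solution:

```json
"Get_Credit_Report": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://ccl.azure-api.net/creditcheck/accounts?accountName=@{triggerBody()?['FirstName']} @{triggerBody()?['LastName']}",
    "headers": {
      "Ocp-Apim-Subscription-Key": "<your APIM subscription key>"
    }
  }
}
```

Ocp-Apim-Subscription-Key is the header APIM uses by default to authenticate subscribers against the published API.
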
It is also possible to use the HTTP + Swagger connector to invoke an API that has a Swagger endpoint defined. This is convenient because URL creation is handled on our behalf and URL query parameters are called out for us in the connector, with a requirement only to input data into the provided fields.

However, there is currently a limitation with the designer where it is not possible to download Swagger documents for authenticated endpoints. The following error will be returned:

In our case, CORS has been enabled, and the endpoint is secured over HTTPS.

What has happened is that since authentication has been configured on the APIM endpoint and there is no facility to configure the authentication required, the Logic Apps designer cannot download the Swagger API definition.

The API definition may, however, be downloaded directly from the APIM Developer portal, as shown here:

A simple workaround for this issue is to download the API definition and expose it via an unauthenticated endpoint in blob storage. The URL for the unauthenticated endpoint can be entered instead and then it is possible to specify the API operation required.

Note that it is also necessary to enable CORS on the Swagger endpoint to allow the Logic Apps designer running in the web browser to access the Swagger document. If this is not implemented, the error in the screenshot earlier will also be returned.

A tool such as Microsoft Azure Storage Explorer can be used to enable CORS on the blob container containing the Swagger file, as shown here:

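For reference, the CORS rule set on the blob service would look something like the following. This is a permissive sketch; in practice, the allowed origin could be restricted to the Logic Apps designer domain rather than "*":

```json
{
  "AllowedOrigins": "*",
  "AllowedMethods": "GET",
  "AllowedHeaders": "*",
  "ExposedHeaders": "*",
  "MaxAgeInSeconds": 3600
}
```
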
Note

The Microsoft Azure Storage Explorer tool may be downloaded from http://storageexplorer.com/.

Here, we can see that we have worked around the limitation of the HTTP + Swagger connector and have successfully loaded the Swagger JSON definition file from the public-facing blob storage endpoint. It is now possible to select the API operation required, as specified in the Swagger file.

Processing the credit check report using an Azure Function

On successfully calling the custom credit check API, CCL customers will receive back a report that contains the account(s) on record for the person that they are interested in doing business with. The following sample JSON report is an array of account records containing the details of payment histories; customers may examine the report and determine whether they wish to do business with the person:

[ 
  { 
    "PaymentHistoryCollection": [ 
      "OK", 
      "PD30", 
      "PD30", 
      "PD30", 
      "PD30", 
      "PD30", 
      "PD30", 
      "PD30", 
      "COLL" 
    ], 
    "AccountName": "Bob Smith", 
    "Balance": 4000, 
    "OpenDateTime": "2016-03-19T16:30:51.9189083+00:00", 
    "Terms": "4 month(s)", 
    "OriginalAmount": 40000, 
    "MonthlyPaymentAmount": 265, 
    "LastPaymentDate": "2016-06-21T08:30:14.1958915+00:00", 
    "AccountStatus": "Active", 
    "CurrentAddress": { 
      "StreetAddress": "3 Emerald Way", 
      "Suburb": "Rosefield", 
      "City": "Townsville", 
      "Postcode": "2344" 
    } 
  } 
] 

Briefly, each record contains the details of the account and an array of payment history codes where, for example, the code PD30 specifies that payment was past due by 30 days and COLL indicates that the account was referred to a debt collection agency.

Sunny Electricals decide that any customer with payments past due by greater than 30 days, or with an account that has been referred to a debt collection agency, will not be promoted to having an account. These rules are captured in an Azure Function. It is the job of the Logic App to pass the credit report into the function, where it is determined whether the customer has a good credit history.

We can see here how Logic Apps may be used to automate entire business processes and stitch disparate systems together to reduce the operational expenditure of a business. Human intervention, for example, is not needed to process the report.

The following C# code is the business logic in the Azure Function:

#r "Newtonsoft.Json"
#r "SunnyElectricalEntities.dll"

using System.Net;
using Newtonsoft.Json;
using SunnyElectricalEntities;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Read the credit report from the request body.
    dynamic data = await req.Content.ReadAsAsync<object>();

    List<RootObject> accounts = JsonConvert.DeserializeObject<List<RootObject>>(data.ToString());

    log.Info(data.ToString());

    // Check the payment history for each account to determine whether late payments disqualify the customer,
    // where "PDnn" == "Past Due nn days" and "COLL" == "Assigned to collection agency".
    string[] pastDueExclusion = new string[] { "PD60", "PD90", "PD120", "PD150", "PD180", "COLL" };

    var results = from account in accounts
                  from ph in account.PaymentHistoryCollection
                  where pastDueExclusion.Contains(ph)
                  select ph;

    // Return 403 Forbidden if any disqualifying code is found; otherwise, return 200 OK.
    return req.CreateResponse(results.Any() ? HttpStatusCode.Forbidden : HttpStatusCode.OK,
            results.Any() ? "Credit check failed due to bad payment history." : "OK");
}

Chapter 7, Azure Functions in Logic Apps, of this book covers Azure Functions in more depth.

Triggering automatic account creation using the Salesforce connector

Depending on the results of the credit check, encapsulated in the Azure Function, we optionally trigger the Salesforce connector to create an account in Salesforce on behalf of the sales consultant. (If the credit check fails, an e-mail containing the error information will be sent to an administrator for processing).

The setup is similar to that of the Salesforce trigger connector; however, authentication details do not need to be re-entered.

In the Logic App designer, the Salesforce - Create object connector is selected from the list of Microsoft-managed APIs, as shown here:

Once the connector has been added to the designer, it is possible to choose the type of object that should be created, along with any associated properties. This is demonstrated here (where the connector has also been renamed to something more meaningful for this workflow).

In this use case, we need to create an account, and here, we can see the account will be created with the account name matching the title and full name of the lead, as passed in from the originating Salesforce connector trigger.

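Sketching this step in the workflow definition (the connection name, path, action names, and trigger fields such as Salutation and Name are assumptions for illustration), the account-creation action is gated on the credit check function step succeeding:

```json
"Create_Salesforce_Account": {
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": { "name": "@parameters('$connections')['salesforce']['connectionId']" }
    },
    "method": "post",
    "path": "/datasets/default/tables/Account/items",
    "body": {
      "Name": "@{triggerBody()?['Salutation']} @{triggerBody()?['Name']}"
    }
  },
  "runAfter": {
    "Check_Credit_History": [ "Succeeded" ]
  }
}
```

Because the Azure Function returns 403 Forbidden on a failed credit check, a runAfter condition of Succeeded is enough to skip account creation for bad credit risks.
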
Testing the credit check solution

At this point, we have a complete solution, as can be seen from the following screenshot:

In order to test it, a lead can be changed to the triggering status via the Salesforce website, as can be seen in the following screenshot. This simulates the action that the sales consultant would carry out:

As mentioned in the previous scenario, it is possible to manually trigger the Logic App to run via the Azure portal or wait for the Salesforce trigger to do its next poll against the Microsoft-managed API. The Salesforce connector should successfully detect the change of status and cause the Logic App workflow to activate.

In this case, the credit check is favorable, and an account is created for our test lead, as can be seen here:

In this more complex scenario, we can see how Logic Apps can orchestrate a flow across multiple SaaS solutions, from the mainstream Salesforce connector to the custom API of the credit check company. In this way, data is being extracted out of the silos of each SaaS provider and utilized to drive the automation of business processes.

Hybrid scenarios

The two earlier scenarios show how easy it is to create both simple and more complex workflows between SaaS products to deliver real-world operational efficiencies.

However, in many situations, it is necessary to provide workflows that connect cloud-based SaaS solutions with key on-premises line of business applications such as an ERP system.

There are many ways to achieve connectivity between the cloud and on-premises infrastructure, including establishing a Virtual Private Network (https://azure.microsoft.com/en-us/services/virtual-network/) or using a technology such as ExpressRoute (https://azure.microsoft.com/en-us/services/expressroute/) for a more guaranteed connection and bandwidth.

Whatever the networking infrastructure solution chosen, Logic Apps provide a number of ways to connect together on-premises and cloud-based assets to provide ongoing business benefit.

In scenarios where an on-premises application has the capability to call out to an Internet-hosted service, Logic Apps can expose an HTTP endpoint that can be used to initiate an instance of a workflow.

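A Logic App exposes such an endpoint by using a Request trigger; a minimal sketch looks like the following, where the leadId property is a hypothetical payload field:

```json
"triggers": {
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "schema": {
        "type": "object",
        "properties": {
          "leadId": { "type": "string" }
        }
      }
    }
  }
}
```

Saving the Logic App generates a callback URL, secured with a Shared Access Signature, that the on-premises application can POST to in order to start a run.
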
When real-time messaging is not required, an Azure Service Bus queue or topic can be used as a store and forward mechanism to a Logic App via a provided connector.

In more complex on-premises workflow scenarios, where message transformation and orchestration is required before invoking any cloud-based workflow service, an on-premises messaging and orchestration broker, such as BizTalk Server, can be used, since with BizTalk Server 2016 it is possible to create a connection between BizTalk Server and a Logic App.

The discussion of BizTalk Server connectivity and Logic Apps is beyond the scope of this chapter, but it is discussed later in the book.
