Building Azure Functions

Azure Functions are created in the main Azure portal in a Resource Group in the same way as other resources.

When creating a Function App, a hosting plan must be selected. This can be either Classic, which allows the creation of a new App Service Plan or the reuse of an existing one, or Consumption, which provides pay-as-you-go, on-demand processing.

The Function App takes a few minutes to be created, but once created, functions can be added to perform the tasks required.

The Function App has a number of settings that can be used to configure it once it has been created.

The settings page provides options for the following:

  • Setting a daily usage quota for the Function App
  • Accessing development resources, including app settings such as connection strings
  • Setting up Continuous Integration
  • Configuring Authentication/Authorization and cross-origin resource sharing
  • Providing a link to a URL that contains the Swagger API definition for a Function App that contains HTTP triggers

The final important option on the settings page provides access to the App Service settings, which allows further configuration of other application configuration settings.

The App Service Settings page is consistent with the normal App Service settings that are part of any App Service Plan used by Web Apps, API Apps, Mobile Apps, or Logic Apps.

The Function App can be configured at any time after creation.

Creating a function

The first step to create a new function within the Function App is to click New Function.

Clicking on New Function displays a set of templates that can be filtered to help find the best function for your needs. Among the template types, it is possible to select samples that create a fully coded function to help with getting started. For our Sunny Electricals scenario later in the chapter, we will use a C# function that is called from a Logic App.

To get a better understanding of Azure Functions, it is useful to start with a simple example. A Timer Trigger function runs on a schedule defined by a Cron expression and is a good way to look at the overall functionality of an individual function.

Note

For more information on Cron expressions, refer to: https://en.wikipedia.org/wiki/Cron.
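Azure Functions timer triggers use a six-field Cron expression that includes a seconds field, in the form {second} {minute} {hour} {day} {month} {day-of-week}. The following example schedules are illustrative:

```
0/10 * * * * *      every 10 seconds
0 */5 * * * *       every 5 minutes
0 0 * * * *         at the top of every hour
0 30 9 * * 1-5      at 09:30 every weekday
```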

First, we need to give the function a name, and this has to be unique within the Function App due to the way a Function App is structured. This will be discussed later in the chapter.

The Cron expression mentioned earlier (0/10 * * * * *) triggers the function to run every 10 seconds. Clicking on Create creates the function.

The function code is displayed after creation and can be edited and saved directly in the browser. For certain function types, it is possible to test the function by passing in a sample message and clicking the Run button.

A log stream that shows function invocations and provides immediate feedback on the state of the function is displayed by clicking on the Logs link. This can be useful when testing the function and making code changes. After a function is saved, there is immediate feedback in the log stream to show any compilation errors.

The files related to a specific function can be viewed and edited directly from the function by clicking on the View Files link.

It is possible to upload files, create new files, and delete them as required.

In order to understand why a function needs to have a unique name and what files make up a function, it is useful to look at the structure of a Function App and the files associated with them.

In this chapter, we will concentrate on C# functions and point out where there are differences for functions written that target Node.js.

The structure of a Function App

Since a Function App is just another part of an App Service, you are able to use the same familiar tools to interact with it, either for development or management.

This includes, but is not limited to, the following:

  • Advanced Tools: This is a web-based interface that is accessed via an amended URL for the App Service
  • App Service Editor: This uses a browser implementation of Visual Studio that allows the user to both manage and develop function code

To access either of the tools, you need to first go to the App Service Settings from the Function App settings page.

Because the experience is richer in the App Service Editor, we will concentrate on the functionality it contains. Kudu, however, supports the same functionality from a more console-driven perspective. Kudu is the engine that powers git deployments in Azure App Service (https://github.com/projectkudu/kudu/wiki).

When functions are created within a Function App, an individual folder is created under the website within the App Service. This is the reason why a function name has to be unique within a Function App instance, but it is possible to reuse function names in different instances.

Code can be edited directly in the browser using the App Service Editor, new files can be uploaded or created, and it is possible to download the entire workspace.

A C# function contains the following files:

  • run.csx: This contains the code for the function and closely follows the rules of C# coding with a few exceptions that are described in detail later when we add some complexity to our function
  • function.json: This contains the definition for the function, including any integrations
  • project.json: This contains the NuGet packages that need to be restored for the function and is optional

A Node.js function contains the following files:

  • index.js: This contains the code for the function
  • function.json: This contains the definition for the function, including any integrations
  • package.json: This contains the npm packages that need to be restored for the function and is optional
  • node_modules: This folder contains the modules required by the function code and is optional

It is possible to share code between functions to help promote reuse. To do this, a folder should be created off the root of the Function App to store the code files, so they can be loaded into individual functions. We will discuss this briefly later in the chapter.
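As an illustrative sketch, a Function App containing a single timer function and a shared code folder might be laid out as follows under the site root (the function and folder names here are hypothetical):

```
wwwroot/
├── host.json
├── TimerTriggerCSharp1/
│   ├── run.csx
│   ├── function.json
│   └── project.json
└── Shared/
    └── common.csx
```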

The function.json file contains all the configuration information required for an individual function. It can be accessed by going to the Integrate tab and clicking on Advanced editor. In our simple example, the file contains the following code:

{ 
  "bindings": [ 
    { 
      "name": "myTimer", 
      "type": "timerTrigger", 
      "direction": "in", 
      "schedule": "0/10 * * * * *" 
    } 
  ], 
  "disabled": false 
} 

The information this contains is related only to the configuration, integration, and operation of the function; it does not contain any code. We will look at a more complex example when we look at how we extend the function to integrate with other sources.

Adding complexity

Any real-world scenario will contain more complex code and will need to include referenced libraries, package restores, and shared code.

A Function App supports a number of .NET assemblies out of the box, both directly and indirectly. For directly addressable assemblies, there is no need to add a reference within the code of the function, although, as usual, providing using statements reduces coding effort.

The following is a list of assemblies that are automatically imported:

  • System
  • System.Collections.Generic
  • System.IO
  • System.Linq
  • System.Net.Http
  • System.Threading.Tasks
  • Microsoft.Azure.WebJobs
  • Microsoft.Azure.WebJobs.Host

There are a number of external assemblies that are automatically added by the Azure Functions runtime:

  • mscorlib
  • System
  • System.Core
  • System.Xml
  • System.Net.Http
  • Microsoft.Azure.WebJobs
  • Microsoft.Azure.WebJobs.Host
  • Microsoft.Azure.WebJobs.Extensions
  • System.Web.Http
  • System.Net.Http.Formatting

There are a number of assemblies that are special cases and do not require the full filename:

  • Newtonsoft.Json
  • Microsoft.WindowsAzure.Storage
  • Microsoft.ServiceBus
  • Microsoft.AspNet.WebHooks.Receivers
  • Microsoft.AspNet.WebHooks.Common

External assemblies are loaded differently to the way we would normally reference them in an environment such as Visual Studio. In order to add a reference to an external assembly you need to use #r "[Assembly Name]", for example, #r "Newtonsoft.Json".

To promote reuse and to provide opportunities for source control and proper application life cycle management, it is possible to add user-created assemblies. A user-created assembly should be uploaded to the bin folder relative to the root folder of the function, and it can then be referenced using the filename, for example, #r "MyCompany.Model.Objects.dll".

An external file is used to perform a package restore. For C# functions using NuGet as the package manager, a project.json file is placed in the root folder of the function. For Node.js functions using npm as their package manager, this is replaced by the package.json file.

For example, to restore the Emotion API code from Cognitive Services, the following project.json file would be used; note the dependency on .NET 4.6:

{ 
  "frameworks": { 
    "net46": { 
      "dependencies": { 
        "Microsoft.ProjectOxford.Emotion": "1.0.251" 
      } 
    } 
  } 
} 

Assemblies that are provided by package restore do not need to be separately referenced using #r as they are included by default.
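For comparison, a Node.js function would declare its npm dependencies in package.json in the same way; a hypothetical file requesting a storage client package might look like the following (the package name and version are illustrative):

```json
{
  "dependencies": {
    "azure-storage": "^2.1.0"
  }
}
```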

The final technique to show is how to share code between functions. Code can be written in .csx files as normal and stored in a folder within the Function App that is easily referenceable from individual functions, for instance, in the same folder as the function, a subfolder of it, or a separate folder under the Function App.

Once the code is complete, the #load "[File name]" command is used to include the file and code within the current function. To reference a file, you need to specify the relative path to the file within the folder structure of the Function App, for example, #load "..\Shared\common.csx".

Adding integration

Now that we know how to add references and reuse code, we can create a more realistic solution using these techniques and external integration.

Integration is added by clicking on the Integrate tab in the function designer.

For our example, we will add a storage queue and send a message. We will use references and external code to show how to bring together some of the ideas detailed so far through the chapter. First, we need to add a New Output.

We then need to choose the type of external integration we would like to use, so we select Azure Storage Queue.

The designer provides the opportunity to update the default information for the queue and to select the storage account that the integration will use.

If a queue does not exist with the chosen name, one will be created when the function first writes a message.

The process automatically creates a connection string for the storage account (or any other integration) in the application settings of the Function App instance.

Finally, clicking on Save finishes the task of adding the integration.

We need to update our code. First, we create a file, common.csx, in the Shared folder of our Function App that will contain reusable code. We will only use this code once, but it shows how easy it is to share code between functions.

public class Message 
{ 
    public string msg; 
    public DateTime msgtime; 
} 

This file needs to be included in our Timer Trigger function using the #load directive described previously. We will serialize our message to JSON before sending to the queue using the Newtonsoft.Json assembly referenced using the #r directive.

#r "Newtonsoft.Json" 
#load "..\Shared\common.csx" 

using System; 

public static void Run(TimerInfo myTimer, out string outputQueueItem, TraceWriter log) 
{ 
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}"); 

    var msg = new Message 
    { 
        msg = "From trigger", 
        msgtime = DateTime.UtcNow 
    }; 

    outputQueueItem = Newtonsoft.Json.JsonConvert.SerializeObject(msg); 
} 

Note that we have referenced the Newtonsoft.Json assembly using the simple name without the extension. We can do this as it is one of the special cases mentioned previously in the chapter.

Also, note that the method signature has been changed to include an out parameter. The parameter name is the name we chose as the message parameter name when we created the integration to the storage queue.

We create the new Message object based on the class that is defined in our external file that is loaded into the function with the use of the #load directive.

When we click on the Save button, the code is checked and compiled, ready for execution. Any errors found are displayed in the log stream window:

Once the code is saved, the next invocation of the function picks up changes immediately and we are able to see the effects of the changes. In this case, the effect is that we should see messages being written to our queue, which should have been created as it did not previously exist.

To look at our queue, we can use Microsoft Azure Storage Explorer (http://storageexplorer.com/), which is a free tool that can be downloaded for most operating systems.

Previously, we looked at the function.json file that is created for a very basic function. Once we have added integration, the file contains more information that defines the structure, integration, and configuration of the function:

{ 
  "bindings": [ 
    { 
      "name": "myTimer", 
      "type": "timerTrigger", 
      "direction": "in", 
      "schedule": "0/10 * * * * *" 
    }, 
    { 
      "type": "queue", 
      "name": "outputQueueItem", 
      "queueName": "outqueue", 
      "connection": "function1f982b298769_STORAGE", 
      "direction": "out" 
    } 
  ], 
  "disabled": false 
} 

We can see that our integration point is defined by type, name, the name of the connection string, and a direction. In cases where more configuration is required, for example, output to DocumentDB, the information contained is more extensive.
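As an illustrative sketch, a DocumentDB output binding carries additional settings such as the database and collection names; the names and connection setting below are hypothetical:

```json
{
  "type": "documentDB",
  "name": "outputDocument",
  "databaseName": "mydb",
  "collectionName": "items",
  "createIfNotExists": true,
  "connection": "my_DOCUMENTDB",
  "direction": "out"
}
```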

Because we now have a trigger that is placing messages on to a queue, it would be useful to create a function that reads these messages. This would be similar to a real-world example where some form of asynchronous process uses a secondary store to offload processing and provide better scalability.

For this, we need to create a function that uses a Queue Trigger to process the messages.

Again, we click on New Function in the Function App, this time choosing a Queue Trigger. We give the function a name and need to provide the storage queue name and the storage connection string, which has been saved in the App Service application settings.

Once created, the function will read messages off the queue and simply log the output to the log stream, as the function is very simple at creation time. If we examine Microsoft Azure Storage Explorer, we would expect to see an empty queue, because when the Timer Trigger function puts a message into the queue, the Queue Trigger picks it up.
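As a sketch, the Queue Trigger function could be extended to deserialize the message back into the shared Message class, reusing the same #r and #load directives shown earlier (this is an illustrative extension, not the code generated by the template):

```csharp
#r "Newtonsoft.Json"
#load "..\Shared\common.csx"

using System;

public static void Run(string myQueueItem, TraceWriter log)
{
    // Deserialize the JSON payload written by the Timer Trigger function
    var message = Newtonsoft.Json.JsonConvert.DeserializeObject<Message>(myQueueItem);
    log.Info($"Queue trigger received '{message.msg}' sent at {message.msgtime}");
}
```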

If we look at the log stream, we can see that the function is working successfully.

Using this simple example, we have been able to show how easy it is to use referenced assemblies, bring in external code, and chain two functions together to create an event-driven pipeline.

Most examples of functions will follow this process. It is possible to mark function signatures as async when you need to call external code that is awaitable; see the Developer Reference for more details (https://azure.microsoft.com/en-us/documentation/articles/functions-reference-csharp/#async).
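As an illustrative sketch (not one of the chapter's examples), an async Timer Trigger function awaiting an external HTTP call might look like the following; the URL is a placeholder:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    using (var client = new HttpClient())
    {
        // Await an external call; the async signature returns Task instead of void
        var body = await client.GetStringAsync("https://example.com/");
        log.Info($"Fetched {body.Length} characters");
    }
}
```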
