Creating Avatars with Azure Functions

Azure Functions is a popular way to handle event-driven workloads without having to think about how to connect input and output data, how to trigger an action on a specific event (such as the arrival of new data), or how to manage the infrastructure that hosts the application. With Azure Functions, developers can focus on what matters: the code. In this blog post, we will build a simple image processing application: we upload a picture to cloud storage, which triggers logic that processes the picture and writes back a resized image with rounded corners and a logo. We will create and debug the Azure Function on macOS (but you can follow the example on any platform), the data will be stored in Azure Storage, and the final product will be hosted on Azure inside a Function App.

So, what do we need to get started? First, you will need an Azure subscription (if you don't have one, create it now). We will be using Visual Studio Code, since there are some issues with the Functions runtime in Visual Studio for Mac (https://github.com/Azure/azure-functions-host/issues/3418).
We will also be using the Azure Functions extension for Visual Studio Code, which helps us create a Functions project and run it in debug mode on the local machine. Please follow this link for installation instructions and additional documentation. To access Azure Storage from our local machine we will use Azure Storage Explorer (https://docs.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer?tabs=macos). On a Windows machine, you can also use the Storage Emulator for development purposes. Now let's talk about the assets and libraries we will be using throughout this example. For the image processing in our code we will use a library from SixLabors, and we have taken the beaver (App Beaver Adsy) logo image from http://www.stickpng.com.

Now that we have everything we need, let's get started. First, we will create a storage account on Azure. Log in to your Azure subscription and go to Storage accounts. You can click All services and filter by "Storage account" to find the service. Populate the project details with the desired values and click Review + create. Once validation has passed, click Create. Please check the Azure Storage documentation (https://docs.microsoft.com/en-us/azure/storage/) to learn more about storage features. When the storage account has been created, go to the resource and open the Access keys section under Settings. You will need these keys for Azure Storage Explorer. Copy key1 and open Storage Explorer. In Storage Explorer, go to Add Account, select Use a storage account name and key, enter the account name and key1, and connect to the store. Now you are connected to the storage account in Storage Explorer. Check the video below that shows all the steps:

In the avatars storage account, let's create two blob containers named profile and templates. In the profile container we will store our input data (the original profile image) and the output data (the avatar image). In templates we will store the logo that will be attached to the avatar image. Create an additional folder (named logos in this example) inside the templates container and upload the logo into it.
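To make the layout concrete, these are the blob paths we will end up with (the folder name under profile can be anything, and any image extension works):

profile/{guid}/original.jpg – the uploaded input image that triggers the function
profile/{guid}/profile.png – the generated avatar written back by the function
templates/logos/beaver.png – the logo that gets stamped onto the avatar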

Now let's create our function. We will use the Azure Functions extension in Visual Studio Code to create our project and add it to the workspace. Then we will create our function from the Blob trigger template as a starting point. The video below demonstrates how to create the Functions project in VS Code.

Now let's adapt the code to fit our needs. You can get the complete code on GitHub: https://github.com/markosaric/azurefunctions-avatars. We will use our blob container as the path that triggers the function and as the output container for the result. This is the function method that we will use:

[FunctionName("AvatarsProcessing")]
public static void Run([BlobTrigger("profile/{guid}/original.{extension}",
    Connection = "AzureWebJobsStorage")] Stream original,
    [Blob("profile/{guid}/profile.png", FileAccess.Write)] Stream avatar,
    string guid,
    Uri uri,
    ILogger log)
{
    log.LogInformation($"Image processing for URI: {uri.AbsolutePath} started...");
}

Observe the path "profile/{guid}/original.{extension}". We have told the trigger to watch the profile container for any kind of file (we named this part extension) with the filename original, placed in a folder directly under the root of the profile container (we named this part of the path guid, and it doesn't matter what the folder is actually called). Also observe that we used the guid parameter from the path as an input parameter to the function. What does this mean for our function? For example, if the following path triggered the function:
profile/1152fd7f-a004-48cc-9822-7937737b7f81/original.jpg
then the guid input parameter of the function will contain the value
1152fd7f-a004-48cc-9822-7937737b7f81
With the next parameter, which carries the Blob attribute, we have told the function that a new blob named profile.png will be created from the avatar stream variable and placed at the path profile/{guid}; the guid value is inserted automatically. We also add the URI and a logger as parameters so that we can easily track the logs in the debugger console and in the Azure portal. So now we have a way to trigger the function when new data arrives in Azure Storage and to store the processed data back into storage. But what about the additional data, namely the logo? For this we will use dynamic binding. The implementation is quite simple: just inject the IBinder interface and bind to the desired location in storage.

[FunctionName("AvatarsProcessing")]
public static void Run([BlobTrigger("profile/{guid}/original.{extension}",
    Connection = "AzureWebJobsStorage")] Stream original,
    [Blob("profile/{guid}/profile.png", FileAccess.Write)] Stream avatar,
    string guid,
    Uri uri,
    IBinder binder,
    ILogger log)
{
    log.LogInformation($"Image processing for URI: {uri.AbsolutePath} started...");
    Stream logo = binder.Bind<Stream>(new BlobAttribute("templates/logos/beaver.png", FileAccess.Read));
}
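A quick note on the Connection = "AzureWebJobsStorage" setting used by both bindings: when running and debugging locally, the Functions runtime typically reads this connection string from the project's local.settings.json file, so point it at the avatars storage account. A minimal example (the connection string below is a placeholder; use the one from the Access keys blade):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<connection string of the avatars storage account>",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}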

And now the fun part: let's implement the processing logic. First we will create an interface, IImageProcessing, that defines the method for processing an image into an avatar. The method will return a stream and accept the original image stream and the logo image stream as input parameters. Then we will create a class that implements the interface. We will need the ImageSharp packages from SixLabors, so let's include them in our project. You can add the package reference directly in the .csproj file or through the terminal.
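The interface itself is small; a minimal sketch (the real definition lives in the GitHub repository and may differ slightly) could look like this:

using System.IO;

public interface IImageProcessing
{
    // Takes the original profile image and the logo as streams and
    // returns the finished avatar as a new stream.
    Stream ProcessToAvatar(Stream original, Stream logo);
}

If you prefer the terminal, the package can be added with a command such as dotnet add package SixLabors.ImageSharp (pick the version that matches the repository).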

We will create some extension methods on the SixLabors image processing types to use in our image processing.
Add a SixLaborsHelpers static class and implement building the rounded corners and applying them to the avatar, as sketched below.
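The rounded-corner logic is the trickiest part. The sketch below is adapted from the avatar sample in the ImageSharp documentation and assumes a recent SixLabors.ImageSharp plus SixLabors.ImageSharp.Drawing release; namespaces and method names have moved between the beta and 1.x versions, so treat it as a guide rather than a drop-in copy of the repository's helper:

using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Drawing;
using SixLabors.ImageSharp.Drawing.Processing;
using SixLabors.ImageSharp.Processing;

public static class SixLaborsHelpers
{
    // Punches the four corners out of the image so that it ends up with rounded corners.
    public static IImageProcessingContext ApplyRoundedCorners(this IImageProcessingContext ctx, float cornerRadius)
    {
        Size size = ctx.GetCurrentSize();
        IPathCollection corners = BuildCorners(size.Width, size.Height, cornerRadius);

        ctx.SetGraphicsOptions(new GraphicsOptions
        {
            Antialias = true,
            // Any pixel drawn by the fills below is removed from the image instead of painted on top of it.
            AlphaCompositionMode = PixelAlphaCompositionMode.DestOut
        });

        foreach (IPath corner in corners)
        {
            ctx = ctx.Fill(Color.Red, corner); // the color is irrelevant, the shape is cut out
        }

        return ctx;
    }

    // Builds the four corner shapes: a square with a quarter circle clipped out,
    // rotated and translated to each corner of the image.
    private static IPathCollection BuildCorners(int imageWidth, int imageHeight, float cornerRadius)
    {
        var rect = new RectangularPolygon(-0.5f, -0.5f, cornerRadius, cornerRadius);
        IPath cornerTopLeft = rect.Clip(new EllipsePolygon(cornerRadius - 0.5f, cornerRadius - 0.5f, cornerRadius));

        float rightPos = imageWidth - cornerTopLeft.Bounds.Width + 1;
        float bottomPos = imageHeight - cornerTopLeft.Bounds.Height + 1;

        IPath cornerTopRight = cornerTopLeft.RotateDegree(90).Translate(rightPos, 0);
        IPath cornerBottomLeft = cornerTopLeft.RotateDegree(-90).Translate(0, bottomPos);
        IPath cornerBottomRight = cornerTopLeft.RotateDegree(180).Translate(rightPos, bottomPos);

        return new PathCollection(cornerTopLeft, cornerBottomLeft, cornerTopRight, cornerBottomRight);
    }
}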
Finally, create an ImageProcessing class and implement the processing.
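With the helper in place, ImageProcessing mainly resizes the original, rounds the corners, stamps the logo in a corner, and saves the result as a PNG. Again, this is a minimal sketch: the avatar size, corner radius, logo position, and the exact ImageSharp overloads are assumptions, and the actual implementation is in the GitHub repository.

using System.IO;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Processing;

public class ImageProcessing : IImageProcessing
{
    public Stream ProcessToAvatar(Stream original, Stream logo)
    {
        using (Image image = Image.Load(original))
        using (Image logoImage = Image.Load(logo))
        {
            // Crop the original picture to a square avatar and round its corners.
            image.Mutate(x => x
                .Resize(new ResizeOptions { Size = new Size(256, 256), Mode = ResizeMode.Crop })
                .ApplyRoundedCorners(60));

            // Scale the logo down and draw it over the bottom-right corner of the avatar.
            logoImage.Mutate(x => x.Resize(64, 64));
            image.Mutate(x => x.DrawImage(logoImage, new Point(image.Width - 64, image.Height - 64), 1f));

            // Save as PNG to keep the transparent rounded corners, then rewind the stream.
            var output = new MemoryStream();
            image.SaveAsPng(output);
            output.Position = 0;
            return output;
        }
    }
}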
To inject my IImageProcessing implementation into the function I will use the DI approach described in the following blog post. So let's add the NuGet package using this command:

dotnet add package AzureFunctions.Autofac --version 3.0.5

Let's create a DIConfig class and register our interface to its implementation. As the last step, inject the interface into the function and call the ProcessToAvatar method. This is the final implementation of the AvatarsProcessing class. Now let's check whether everything works as expected. We will run the code in debug mode and add an image through Storage Explorer into the Azure Storage container.
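As a sketch of what this wiring could look like with AzureFunctions.Autofac (the attribute and registration pattern follows the package documentation; the actual final class is in the GitHub repository):

using System;
using System.IO;
using Autofac;
using AzureFunctions.Autofac;
using AzureFunctions.Autofac.Configuration;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class DIConfig
{
    public DIConfig(string functionName)
    {
        // Register the ImageSharp-based implementation for the IImageProcessing interface.
        DependencyInjection.Initialize(builder =>
        {
            builder.RegisterType<ImageProcessing>().As<IImageProcessing>();
        }, functionName);
    }
}

[DependencyInjectionConfig(typeof(DIConfig))]
public static class AvatarsProcessing
{
    [FunctionName("AvatarsProcessing")]
    public static void Run([BlobTrigger("profile/{guid}/original.{extension}",
        Connection = "AzureWebJobsStorage")] Stream original,
        [Blob("profile/{guid}/profile.png", FileAccess.Write)] Stream avatar,
        string guid,
        Uri uri,
        IBinder binder,
        [Inject] IImageProcessing imageProcessing,
        ILogger log)
    {
        log.LogInformation($"Image processing for URI: {uri.AbsolutePath} started...");

        // Bind to the logo blob and let the injected implementation build the avatar.
        Stream logo = binder.Bind<Stream>(new BlobAttribute("templates/logos/beaver.png", FileAccess.Read));
        using (Stream processed = imageProcessing.ProcessToAvatar(original, logo))
        {
            processed.CopyTo(avatar);
        }
    }
}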

The code works as expected, and our last step is to deploy the function to Azure. We can do this through Visual Studio Code, and after deployment our Function is ready to go.