Scenario

I host a client’s application, including an Azure SQL database, in an Azure subscription that I own. The client wanted to know whether it would be possible for them to have a regular backup of their data sent to them for contingency’s sake. It’s worth noting that Azure SQL performs daily backups and full weekly backups out of the box, but this is more about mitigating the risk of loss of service by transferring some of those valuable assets into their own domain.

To do this, I want to automate a weekly export of the client database (to be discussed in a later post) into blob storage and then automate the transfer of that blob over to the client’s account. It’s important to note that, as the client has subscribed to Office 365, they already have an Azure account, so we can set up a blob container for them in their domain which can be accessed via a URL.

Plan

I want to use Azure Functions for the blob transfer: firstly, it decouples the work from any web application and can be hosted in its own plan; secondly, it lets me make use of the triggers and bindings that make Azure Functions so appealing for both automation and integration purposes. I may well experiment with alternative approaches using Logic Apps or WebJobs in the future, but will focus on Azure Functions for now. I’ll also look at improvements that can be made in the areas of security (i.e. replacing the use of account keys) and automation (scripts for creating resources without the need for portal access).

Requirements & Scope

  1. The project handles the transfer of a blob from a source container in one Azure account to a target container in another Azure account
  2. The source must be a blob and the target account should be a Blob storage account
  3. The target should set the blob to the cool access tier (for cheaper storage, given an expected drop in access requests)
  4. The target container should be created if it does not already exist
  5. The copy should overwrite blobs of the same name
  6. The copy should happen automatically on creation or update of a blob in the source container (see the trigger sketch below)
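
To make requirement 6 concrete, below is a minimal sketch of the trigger wiring, assuming the Azure Functions v2 in-process model and the WindowsAzure.Storage SDK; the container name "exports" and connection setting "SourceStorage" are placeholders of mine rather than names taken from the project:

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

public static class CrossAccountBlobTransfer
{
    [FunctionName("CrossAccountBlobTransfer")]
    public static async Task RunAsync(
        // Fires on creation or update of any blob in the source container.
        [BlobTrigger("exports/{name}", Connection = "SourceStorage")] CloudBlockBlob sourceBlob,
        string name,
        ILogger log)
    {
        log.LogInformation("Blob {Name} created or updated; starting transfer.", name);

        // The copy itself is elided here; see the discussion of CopyState
        // and the BlobCopyService below.
        await Task.CompletedTask;
    }
}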

Source Code

All the source for this project can be found on GitHub:

https://github.com/agiletea/azure-blobstoragetransfer

Test Coverage

Azure Functions are best used as lightweight automation solutions and, as such, in my opinion, are not well placed to perform complex chains of tasks. That said, there is no reason why they should not be subject to unit tests, but this can cause a few headaches because many of the binding parameters you need to work with are hard to mock.

When it comes to blobs, there are several properties that use sealed classes with read-only properties, meaning that you cannot mock them. As a result, certain methods used against blobs can fail unit tests, either because null reference exceptions are encountered, or because certain states are never set, preventing code execution from completing. Here are a couple of examples:

CopyState

This is often used when performing a copy task, where you need to verify the copy has completed before continuing execution. All the public properties of this class are read-only and it has a single public parameterless constructor, so there is no way to put it into a completed state from a test.
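
For context, this is roughly where CopyState gets involved; a minimal sketch, assuming the WindowsAzure.Storage SDK and a hypothetical helper class of mine:

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static class CopyHelpers
{
    // Starts a service-side copy and polls CopyState until it leaves Pending.
    public static async Task CopyAndWaitAsync(CloudBlockBlob targetBlob, Uri sourceUri)
    {
        await targetBlob.StartCopyAsync(sourceUri);

        while (targetBlob.CopyState.Status == CopyStatus.Pending)
        {
            await Task.Delay(500);
            await targetBlob.FetchAttributesAsync(); // refreshes CopyState from the service
        }
    }
}

Because CopyState cannot be mocked, a loop like this can never be driven to completion in a unit test.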

AccountProperties

This is often used when determining whether the target account supports certain actions, such as setting the blob access tier. Again, all the public properties of this class are read-only and it has a single public parameterless constructor.
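
Again for context, a sketch of the kind of check involved, assuming the same SDK (blob access tiers are only supported on Blob storage and general-purpose v2 accounts):

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

public static class TierHelpers
{
    // Sets the cool tier only where the target account kind supports tiering.
    public static async Task TrySetCoolTierAsync(CloudBlockBlob targetBlob)
    {
        AccountProperties accountProperties =
            await targetBlob.Container.ServiceClient.GetAccountPropertiesAsync();

        if (accountProperties.AccountKind == AccountKind.BlobStorage ||
            accountProperties.AccountKind == AccountKind.StorageV2)
        {
            await targetBlob.SetStandardBlobTierAsync(StandardBlobTier.Cool);
        }
    }
}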

Being able to unit test my copy function without being able to mock these properties makes life difficult, so the best remedy for now is to consider where they fit within the function and whether they can be refactored out into a separate unit of logic that can then itself be mocked. This brings up the question of dependency injection within Azure Functions, which is quite a hot topic, with changes in the pipeline to improve the DI experience. At the time of writing, one option is to use the WebJobsStartup assembly attribute and implement IWebJobsStartup on a bootstrapper class to create an injectable service collection:

using System.Diagnostics.CodeAnalysis;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

[assembly: WebJobsStartup(typeof(BlobStorageTransfer.Startup))]

namespace BlobStorageTransfer
{
    [ExcludeFromCodeCoverage]
    public class Startup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            // Pass every log level through to the functions host.
            builder.Services.AddLogging(logging =>
            {
                logging.AddFilter(level => true);
            });

            // Register the copy service so it can be constructor-injected
            // into the function class.
            builder.Services.AddTransient<IBlobCopyService, BlobCopyService>();
        }
    }
}

The BlobCopyService can then take responsibility for performing an async copy and not returning until it completes, thereby encapsulating the dependency on checking CopyState. It can also be responsible for setting the access tier on the newly created blob in a separate method, encapsulating the dependency on AccountProperties. This then allows me to mock the service when unit testing the Azure Function itself.
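
A hypothetical shape for that service; the method names here are my illustration rather than necessarily those used in the repository:

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public interface IBlobCopyService
{
    // Copies the source blob into the target account, waits for the
    // service-side copy to complete, and returns the new target blob.
    Task<CloudBlockBlob> CopyAsync(CloudBlockBlob sourceBlob);

    // Sets the cool access tier where the target account kind supports it.
    Task TrySetCoolTierAsync(CloudBlockBlob targetBlob);
}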

So that I can inject my dependencies into a constructor on my Function class, I need to change both the class and the RunAsync method to no longer be static.

public class CrossAccountBlobTransfer
{
    private readonly IBlobCopyService blobCopyService;

    public CrossAccountBlobTransfer(IBlobCopyService blobCopyService)
    {
        this.blobCopyService = blobCopyService;
    }
}
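
With the service injected, the function itself becomes straightforward to unit test. A sketch using Moq and xUnit, assuming RunAsync simply delegates to the injected service:

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;
using Moq;
using Xunit;

public class CrossAccountBlobTransferTests
{
    [Fact]
    public async Task RunAsync_DelegatesCopyToService()
    {
        var copyService = new Mock<IBlobCopyService>();
        var function = new CrossAccountBlobTransfer(copyService.Object);

        // CloudBlockBlob can be constructed directly from a URI for test purposes.
        var sourceBlob = new CloudBlockBlob(
            new Uri("https://source.blob.core.windows.net/exports/export.bacpac"));

        await function.RunAsync(sourceBlob, "export.bacpac", Mock.Of<ILogger>());

        copyService.Verify(s => s.CopyAsync(sourceBlob), Times.Once);
    }
}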

Copy Permissions

Using the account keys for both source and target containers led me to believe initially that I could call StartCopyAsync on my target blob, simply pass in the source blob, and all would be fine. However, my first test failed with a "resource not found" error, suggesting the function could not access the source blob. I then realised that the copy method does not actually use the function host as the intermediary to perform the copy. Instead, the copy is completed within the cloud under the context of the target blob, which does not automatically have rights to read the source.

As a result, I needed to create a temporary Shared Access Signature for the source blob with read permissions for a short period of time (defaulting to five minutes). This can then be used to create a shared access URI for the source blob to pass into the copy method.
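
A sketch of that SAS creation, assuming the same SDK; the helper name is mine, and the window matches the five-minute default mentioned above:

using System;
using Microsoft.WindowsAzure.Storage.Blob;

public static class SasHelpers
{
    // Builds a short-lived, read-only URI for the source blob so that the
    // service-side copy can authenticate against the source account.
    public static Uri CreateReadOnlySourceUri(CloudBlockBlob sourceBlob)
    {
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-1), // allow for clock skew
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(5)
        };

        string sasToken = sourceBlob.GetSharedAccessSignature(policy);
        return new Uri(sourceBlob.Uri + sasToken);
    }
}

The resulting URI is what gets passed to StartCopyAsync in place of the raw source blob.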

I also tripped up when setting the target blob access tier to cool, as I had not paid attention to the storage account I created for testing and had allowed it to default to general-purpose v1. This account kind does not allow you to set blob access tiers, as covered in my post here.

Conclusions

Overall, Azure Functions showed promise in their simplicity, allowing very little code to react to any one of a number of events, provide context and perform subsequent actions. The complexity grows with how much you need to do, and it is split between the domain you are working in – in this case, blob storage and understanding the requirements of working with blobs – and the work of turning an Azure Functions development solution into a robust, testable piece of software that can be improved on for production environments.

I’m keen to hear more on dependency injection for Azure Functions as version 3 comes out, and to consider alternatives, both within Azure Functions itself and in other Azure products.


Ben

