 

Best Way to Pass Multiple Blob Inputs to a QueueTrigger Azure Function

The Problem

Upon a trigger, generate three XML files and, once they are complete, FTP them to a site.

The Current Approach

I have an HTTP Trigger Azure Function that, when run, constructs 3 XML files and saves them to an Azure Storage blob container. Because of the multiple outputs, and the need to control the output paths/filenames, I use the imperative binding approach with an IBinder outputBinder in my function. This all works just fine. An example output path in blob storage is export/2017-03-22/file-2017-03-22T12.03.02.54.xml: the file sits in a folder named for the date, and each filename carries a timestamp to ensure uniqueness.
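For context, here is a minimal sketch of what that imperative binding looks like (the helper name and timestamp format are illustrative, not my exact code):

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static async Task WriteXmlBlobAsync(IBinder outputBinder, string xmlContent)
{
    // Build the dynamic path: a date folder plus a timestamped file name.
    var now = DateTime.UtcNow;
    var path = $"export/{now:yyyy-MM-dd}/file-{now:yyyy-MM-ddTHH.mm.ss.ff}.xml";

    // Bind to the blob at runtime and write the XML content to it.
    using (var writer = await outputBinder.BindAsync<TextWriter>(
        new BlobAttribute(path, FileAccess.Write)))
    {
        await writer.WriteAsync(xmlContent);
    }
}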

When all 3 files are generated, I want to trigger another function that will sFTP them to a site. I initially thought that I should use a blob trigger, but I couldn't figure out how to trigger on inputs whose filenames and paths are dynamic, and I couldn't find such an example in the blob trigger documentation.

So then I thought I could have my HTTP Trigger also write the XML files, via a declarative binding, into an outgoing container in my blob storage that a blob trigger could watch. This also works; however, because my function is on the Consumption plan, there can be up to a 10-minute delay in processing new blobs.

So the documented alternative is to use a queue trigger. I can output a message to my queue and have the queue trigger fire just fine, but how do I also pass the 3 XML streams to my QueueTrigger function?

I suppose, as a fallback, I could post an object containing the Azure Storage paths of the constructed XML files and then use the Storage SDK to fetch the streams and post them to the FTP site, but would it be more efficient to also pass those blob streams as inputs to my QueueTrigger?
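If I went that route, I imagine the fetch side would look roughly like this (the connection string setting and container name here are my assumptions):

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

// Fallback sketch: look the blobs up by path with the Storage SDK.
public static async Task<Stream> OpenBlobAsync(string blobPath)
{
    var account = CloudStorageAccount.Parse(
        Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
    var client = account.CreateCloudBlobClient();
    var container = client.GetContainerReference("export");

    // blobPath is the path carried in the queue message, e.g. "2017-03-22/file-....xml".
    var blob = container.GetBlockBlobReference(blobPath);
    return await blob.OpenReadAsync();
}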

asked Mar 24 '17 by flyte


People also ask

Can a azure function have multiple triggers?

There are no plans to support multiple triggers per Function. You will have to create a Function for each EventHub. If there is common code that may be shared between Functions, you may move it into a helper method that can be called from each Function.

How many triggers can an azure function have?

Note that because any Azure Function can have exactly one trigger, the trigger type you intend to use is what differentiates your Azure Functions when you create them in Visual Studio.

How does Azure function blob trigger work?

The function is triggered by the creation of a blob in the test-samples-trigger container. It reads a text file from the test-samples-input container and creates a new text file in an output container based on the name of the triggered file.


1 Answer

I think your approach with Queue Trigger makes sense. I would construct a message like this

public class QueueItem
{
    public string FirstBlobPath { get; set; }
    public string SecondBlobPath { get; set; }
    public string ThirdBlobPath { get; set; }
}

and then use declarative binding in the queue processing function, something like

{
  "bindings": [
    {
      "type": "queueTrigger",
      "name": "item",
      "direction": "in",
      "queueName": "myqueue",
      "connection":"...",    
    },
    {
      "type": "blob",
      "name": "file1",
      "path": "mycontainer/{FirstBlobPath}",
      "connection": "...",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "file2",
      "path": "mycontainer/{SecondBlobPath}",
      "connection": "...",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "file3",
      "path": "mycontainer/{ThirdBlobPath}",
      "connection": "...",
      "direction": "in"
    }
  ],
  "disabled": false
}

and the function

public static void Run(QueueItem item, Stream file1, Stream file2, Stream file3)
{
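    // file1, file2 and file3 are the blob streams resolved from the paths
    // carried in the queue message; sFTP them to the target site here.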
}
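
On the producing side, your HTTP-triggered function would just drop this message onto the queue, for example via a queue output binding declared in its own function.json (the parameter names and paths below are only illustrative). The POCO is serialized to JSON, which is what lets the {FirstBlobPath}-style expressions above resolve:

using System.Net.Http;

// Sketch only: assumes the HTTP function declares a queue output binding
// named "outputItem" on the same "myqueue" queue.
public static void Run(HttpRequestMessage req, out QueueItem outputItem)
{
    // ... generate the three XML blobs (e.g. via IBinder) and record their paths ...

    outputItem = new QueueItem
    {
        FirstBlobPath = "2017-03-22/file-2017-03-22T12.03.02.54.xml",
        SecondBlobPath = "...",
        ThirdBlobPath = "..."
    };
}
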
answered Oct 13 '22 by Mikhail Shilkov