Azure Functions: develop locally with input and output bindings - C#

I'm looking at the examples for developing Azure Functions in Visual Studio 2017 and can see that a new function template can be set up with a trigger.
So for a queue, the template would be the following:
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

namespace FunctionApp1
{
    public static class Function1
    {
        [FunctionName("QueueTriggerCSharp")]
        public static void Run([QueueTrigger("myqueue-items", Connection = "QueueStorage")]string myQueueItem, TraceWriter log)
        {
            log.Info($"C# Queue trigger function processed: {myQueueItem}");
        }
    }
}
Are you able to add and run other input and output bindings locally such as:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage.Table;

namespace FunctionApp1
{
    public static class Function1
    {
        [FunctionName("QueueTriggerCSharp")]
        public static async Task Run(
            [QueueTrigger("myqueue-items", Connection = "QueueStorage")] string myQueueItem,
            CloudTable inputTable,
            IAsyncCollector<string> outputEventHubMessages,
            TraceWriter log)
        {
            log.Info($"C# Queue trigger function processed: {myQueueItem}");

            // FailedEventEntity is a custom TableEntity defined elsewhere
            TableQuery<FailedEventEntity> query = new TableQuery<FailedEventEntity>().Where(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "helloWorld"));
            List<FailedEventEntity> entities = inputTable.ExecuteQuery(query).ToList();

            await outputEventHubMessages.AddAsync(myQueueItem);
        }
    }
}
Do they need to be configured in local.settings.json?

Sure you can. You need to decorate them with attributes too:
[Table("table-name")] CloudTable inputTable,
[EventHub("event-hub-name")] IAsyncCollector<string> outputEventHubMessages
The config values for the local environment are indeed taken from local.settings.json, so you need to add them there (connection strings etc.).

For anyone looking for information about function binding attributes:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-dotnet-class-library
And here is a completed example based on my question:
Function1.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Azure.WebJobs.ServiceBus; // INCLUDE THIS FOR EVENT HUB ATTRIBUTE
using Microsoft.WindowsAzure.Storage.Table;

namespace FunctionApp1
{
    public static class Function1
    {
        [FunctionName("QueueTriggerCSharp")]
        public static async Task Run(
            [QueueTrigger("myqueue-items", Connection = "QueueStorageConnectionString")] string myQueueItem,
            [Table("tableName", Connection = "StorageAccountConnectionString")] CloudTable inputTable,
            [EventHub("eventHubName", Connection = "EventHubConnectionString")] IAsyncCollector<string> outputEventHubMessages,
            TraceWriter log)
        {
            log.Info($"C# Queue trigger function processed: {myQueueItem}");

            // FailedEventEntity is a custom TableEntity defined elsewhere
            TableQuery<FailedEventEntity> query = new TableQuery<FailedEventEntity>().Where(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "helloWorld"));
            List<FailedEventEntity> entities = inputTable.ExecuteQuery(query).ToList();

            await outputEventHubMessages.AddAsync(myQueueItem);
        }
    }
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "your_storage_account_connection_string",
    "AzureWebJobsDashboard": "your_storage_account_connection_string",
    "QueueStorageConnectionString": "your_queue_storage_connection_string",
    "StorageAccountConnectionString": "your_storage_account_connection_string",
    "EventHubConnectionString": "your_event_hub_connection_string"
  }
}

@Chris: This is strange, "my" version of EventHubAttribute doesn't have a Connection property. I am using Microsoft.Azure.WebJobs.ServiceBus 2.0.0.
What version are you using? As far as I can see, the latest available version is 2.0.0.

Related

Azure Function Event Hub Output Binding not working when deployed

I am using an Azure Function to get messages from a RabbitMQ broker to an Event Hub.
The function works perfectly when I run it locally.
Here is the code of the function:
using System.Text;
using System.Dynamic;
using System.Threading.Tasks;
using CaseOnline.Azure.WebJobs.Extensions.Mqtt;
using CaseOnline.Azure.WebJobs.Extensions.Mqtt.Messaging;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class Test
{
    [FunctionName("EventHubOutput")]
    public static async Task Run(
        [MqttTrigger("topic/#")] IMqttMessage message,
        [EventHub("eventhubname", Connection = "EventHubConnectionAppSetting")] IAsyncCollector<string> outputEvents,
        ILogger log)
    {
        var body = message.GetMessage();
        var bodyString = Encoding.UTF8.GetString(body);

        dynamic obj = JsonConvert.DeserializeObject<ExpandoObject>(bodyString);
        obj.Topic = message.Topic;

        await outputEvents.AddAsync(JsonConvert.SerializeObject(obj));
    }
}
When deployed and run in the Azure portal, I get the following error messages:
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: EventHubOutput
---> System.InvalidOperationException: Error while handling parameter outputEvents after function returned:
---> System.Net.Sockets.SocketException (0xFFFDFFFF): Name or service not known
at (...)
Any idea what the problem might be?
Thank you.
You are using the bindings incorrectly. Check out RabbitMQ bindings for Azure Functions overview.
The following example shows a C# function that reads and logs the RabbitMQ message as a RabbitMQ Event:
[FunctionName("RabbitMQTriggerCSharp")]
public static void RabbitMQTrigger_BasicDeliverEventArgs(
[RabbitMQTrigger("queue", ConnectionStringSetting = "rabbitMQConnectionAppSetting")] BasicDeliverEventArgs args,
ILogger logger
)
{
logger.LogInformation($"C# RabbitMQ queue trigger function processed message: {Encoding.UTF8.GetString(args.Body)}");
}
The following example shows how to read the message as a POCO:
namespace Company.Function
{
    public class TestClass
    {
        public string x { get; set; }
    }

    public class RabbitMQTriggerCSharp
    {
        [FunctionName("RabbitMQTriggerCSharp")]
        public static void RabbitMQTrigger_BasicDeliverEventArgs(
            [RabbitMQTrigger("queue", ConnectionStringSetting = "rabbitMQConnectionAppSetting")] TestClass pocObj,
            ILogger logger
        )
        {
            logger.LogInformation($"C# RabbitMQ queue trigger function processed message: {pocObj}");
        }
    }
}
I recommend checking out this complete guide to setting up the RabbitMQ trigger in Azure Functions: RabbitMQ trigger for Azure Functions overview
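To map this back to the original scenario (RabbitMQ in, Event Hub out), a rough sketch combining the two bindings could look like the following. This is my own combination, not taken from the linked docs; the queue name and the rabbitMQConnectionAppSetting/EventHubConnectionAppSetting setting names are placeholders:
[FunctionName("RabbitMQToEventHub")]
public static async Task Run(
    [RabbitMQTrigger("queue", ConnectionStringSetting = "rabbitMQConnectionAppSetting")] string message,
    [EventHub("eventhubname", Connection = "EventHubConnectionAppSetting")] IAsyncCollector<string> outputEvents,
    ILogger log)
{
    // Forward each RabbitMQ message to the Event Hub output binding.
    log.LogInformation($"Forwarding RabbitMQ message to Event Hub: {message}");
    await outputEvents.AddAsync(message);
}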

How to set blob properties in an Azure Function?

This example shows how to set blob properties such as ContentType using C#. How can this be done in the following Azure Function? The method signature does not use a CloudBlob object, but rather a Stream object to read the blob.
[FunctionName("MyFunction")]
public static async Task Run([BlobTrigger("container-name/folder-name/{name}", Connection = "ConnectionString")]Stream myBlob, string name, ILogger log, Binder binder)
{
// How to change the ContentType property?
}
Please use the code below (I'm using Visual Studio 2017 and creating a v2 function):
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp3
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task Run([BlobTrigger("container-name/folder-name/{name}", Connection = "AzureWebJobsStorage")]ICloudBlob myBlob, string name, ILogger log)
        {
            log.LogInformation("...change blob property...");

            // specify the property here
            myBlob.Properties.ContentType = "text/html";

            // commit the property
            await myBlob.SetPropertiesAsync();
        }
    }
}
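If you want to keep the Stream parameter from the original signature for reading the content, one variation (my own sketch, not from the answer above) is to bind the same blob a second time as an ICloudBlob input and set the properties through that binding:
[FunctionName("MyFunction")]
public static async Task Run(
    [BlobTrigger("container-name/folder-name/{name}", Connection = "ConnectionString")] Stream myBlob,
    [Blob("container-name/folder-name/{name}", Connection = "ConnectionString")] ICloudBlob blobReference,
    string name,
    ILogger log)
{
    // Read the content from the trigger stream as before...

    // ...then set and commit the property through the extra ICloudBlob binding to the same blob.
    blobReference.Properties.ContentType = "text/html";
    await blobReference.SetPropertiesAsync();
}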

Azure WebJob BlobTrigger - Output Blob makes trigger not fire

I have a v3 WebJob that successfully fires when my function method signature is as follows:
public static void ProcessQueueMessage(
    [BlobTrigger("process/{name}", Connection = "storage-connection")] Stream blob,
    ILogger log
)
However when I add an output blob the BlobTrigger never fires.
public static void ProcessQueueMessage(
    [BlobTrigger("process/{name}", Connection = "storage-connection")] Stream blob,
    [Blob("output/{name}", FileAccess.Write, Connection = "storage-connection")] Stream processedBlob,
    ILogger log
)
The documentation I'm following is here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#output
If you want to use an Azure WebJob BlobTrigger together with an output binding, you can follow my steps.
On my side, it works fine.
1. Create a console app and install everything needed. You can follow this doc, which explains how to use the WebJobs SDK.
This is my code:
Functions.cs:
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using System.IO;

namespace ConsoleApp1
{
    public class Functions
    {
        public static void ProcessQueueMessage(
            [BlobTrigger("images/{name}")] Stream myBlob,
            [Blob("form/{name}", FileAccess.Write)] Stream imageSmall,
            string name,
            ILogger log
        )
        {
            log.LogInformation("webjobs blob trigger works!");
        }
    }
}
Program.cs:
using System;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
namespace ConsoleApp1
{
class Program
{
static void Main(string[] args)
{
var builder = new HostBuilder();
builder.ConfigureWebJobs(b =>
{
b.AddAzureStorage();
b.AddAzureStorageCoreServices();
});
builder.ConfigureLogging((context, b) =>
{
b.AddConsole();
});
builder.ConfigureWebJobs(b =>
{
b.AddAzureStorageCoreServices();
b.AddAzureStorage();
});
var host = builder.Build();
using (host)
{
host.Run();
}
}
}
}
appsettings.json:
{
  "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
}
You can see that I have already added the output binding alongside the BlobTrigger, just following the doc you linked. I uploaded an image to the images container and the console showed the log information; the image was also uploaded to the form container.
It works on my side; if you have more questions, please share more details. I hope my answer helps.
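As a side note, the function above only logs the trigger. If you also want the content of the incoming blob written to the output blob, a minimal sketch (my assumption about the intent, not part of the original code) would be to copy the trigger stream into the output stream (this variant is async, so it also needs using System.Threading.Tasks;):
public static async Task ProcessQueueMessage(
    [BlobTrigger("images/{name}")] Stream myBlob,
    [Blob("form/{name}", FileAccess.Write)] Stream imageSmall,
    string name,
    ILogger log)
{
    log.LogInformation($"Copying blob {name} to the form container");

    // Whatever is written to the output stream becomes the blob in the "form" container.
    await myBlob.CopyToAsync(imageSmall);
}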

How to catch an exception thrown by Azure Table in an async HTTP-triggered Azure Function

I have an HTTP-triggered Azure Function which writes to Azure Table Storage, and the writes may end in duplicate entries. I noticed that even if I wrap the whole function in try/catch, an exception is still "leaked" to the function runner, which then returns HTTP 500. Is there any way to catch this kind of exception?
Here's a minified version of the code:
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage;

namespace FunctionTest
{
    public class Entry
    {
        public string PartitionKey { get; set; }
        public string RowKey { get; set; }
    }

    public static class Debug
    {
        [FunctionName("Debug")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
            HttpRequest req,
            [Table("Debug")]
            IAsyncCollector<Entry> tableBinding,
            ILogger log)
        {
            try
            {
                await tableBinding.AddAsync(new Entry()
                {
                    PartitionKey = "1111",
                    RowKey = "1111",
                });
                await tableBinding.FlushAsync();
            }
            catch (StorageException)
            {
                // we expect an Exception "The specified entity already exists"
                return new OkObjectResult("This passes test");
            }

            return new OkObjectResult("This passes test too");
        }
    }
}
The code is written under Azure Function runtime 2.0 (the .NET Core one).
Trigger /api/debug twice or more and you will see:
HTTP 500
The catch{} code is entered, and it still returns an HTTP 500(!)
In Application Insights, two table dependency calls per request (which shouldn't happen; the documentation says Table Storage does not have automatic retry)
I guess that using IAsyncCollector<> breaks things here. If you want to avoid such problems, try to exchange the following binding:
[Table("Debug")] IAsyncCollector<Entry> tableBinding
to:
[Table("Debug")] CloudTable tableBinding
Then, instead of using tableBinding.AddAsync() use the following snippet:
// Entry needs to implement ITableEntity (e.g. derive from TableEntity) to be used with TableOperation
var op = TableOperation.Insert(new Entry { PartitionKey = "1111", RowKey = "1111" });
await tableBinding.ExecuteAsync(op);
With that approach, you should be able to catch the exception, without leaking it to the Functions runtime.
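Putting it together, here is a rough sketch of the rewritten function (my reconstruction, not the original answer's full code). It assumes Entry is changed to derive from TableEntity so it can be passed to TableOperation, and that using Microsoft.WindowsAzure.Storage.Table; is added:
public class Entry : TableEntity
{
    // PartitionKey and RowKey are inherited from TableEntity
}

public static class Debug
{
    [FunctionName("Debug")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        [Table("Debug")] CloudTable tableBinding,
        ILogger log)
    {
        try
        {
            // The insert now runs inside the function body, so the
            // StorageException surfaces here rather than after the function returns.
            var op = TableOperation.Insert(new Entry { PartitionKey = "1111", RowKey = "1111" });
            await tableBinding.ExecuteAsync(op);
        }
        catch (StorageException)
        {
            // "The specified entity already exists"
            return new OkObjectResult("This passes test");
        }

        return new OkObjectResult("This passes test too");
    }
}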
Your try/catch block should look like the following to catch all errors:
try
{
}
catch (StorageException)
{
    return new OkObjectResult("This passes test");
}
catch (Exception ex)
{
    // return different error code
}

How do you track custom events from Azure WebJobs in Application Insights?

I have an Azure WebJobs (v2.2.0) project that I would like to monitor with Application Insights (AI), and there are events that I would like to be able to track. In a normal web app that is configured to use AI you can just use this:
TelemetryClient tc = new TelemetryClient();
tc.TrackEvent("EventName");
However this seems not to work in the context of a WebJob! I have configured my WebJob project as per the instructions on the WebJob SDK repo which ends up looking like this:
Program
using System.Configuration;
using System.Diagnostics;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace WebJobs
{
    public class Program
    {
        public static void Main()
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.UseTimers();

            using (LoggerFactory loggerFactory = new LoggerFactory())
            {
                string key = ConfigurationManager.AppSettings["webjob-instrumentation-key"];
                loggerFactory.AddApplicationInsights(key, null);
                loggerFactory.AddConsole();

                config.LoggerFactory = loggerFactory;
                config.Tracing.ConsoleLevel = TraceLevel.Off;

                if (config.IsDevelopment)
                    config.UseDevelopmentSettings();

                JobHost host = new JobHost(config);
                host.RunAndBlock();
            }
        }
    }
}
Functions
This is just a test function that will run every minute for half an hour.
using Core.Telemetry;
using Microsoft.ApplicationInsights;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Timers;
using System;
using System.Collections.Generic;

namespace WebJobs.Functions
{
    public class TestFunctions
    {
        public void TelemetryTest([TimerTrigger(typeof(Schedule))] TimerInfo timer)
        {
            TelemetryClient tc = new TelemetryClient();
            tc.TrackEvent("TelemetryTestEvent");
        }

        // schedule that will run every minute
        public class Schedule : DailySchedule
        {
            private static readonly string[] times =
            {
                "12:01","12:02","12:03","12:04","12:05","12:06","12:07","12:08","12:09","12:10",
                "12:11","12:12","12:13","12:14","12:15","12:16","12:17","12:18","12:19","12:20",
                "12:21","12:22","12:23","12:24","12:25","12:26","12:27","12:28","12:29","12:30"
            };

            public Schedule() : base(times) { }
        }
    }
}
This seems to partially work in that I can see some telemetry in AI but not the custom events. For example I can see a Request show up each time TestFunctions.TelemetryTest() runs and various Traces during the initialisation of the WebJob.
I have probably not configured something properly or am not getting the TelemetryClient in the correct manner, but I cannot find any documentation on tracking custom events in WebJobs.
Any help would be appreciated.
Try setting the instrumentation key explicitly:
tc.Context.InstrumentationKey = "<your_key>";
According to the docs you should be able to get the key using
System.Environment.GetEnvironmentVariable(
"APPINSIGHTS_INSTRUMENTATIONKEY", EnvironmentVariableTarget.Process)
if you have set up Application Insights integration.
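Putting both suggestions together, a minimal sketch of the function with an explicitly keyed TelemetryClient (assuming the APPINSIGHTS_INSTRUMENTATIONKEY setting is present in the WebJob's app settings) could look like this:
public void TelemetryTest([TimerTrigger(typeof(Schedule))] TimerInfo timer)
{
    TelemetryClient tc = new TelemetryClient();

    // Explicitly set the instrumentation key, reading it from the app setting
    // that the Application Insights integration exposes as an environment variable.
    tc.Context.InstrumentationKey = Environment.GetEnvironmentVariable(
        "APPINSIGHTS_INSTRUMENTATIONKEY", EnvironmentVariableTarget.Process);

    tc.TrackEvent("TelemetryTestEvent");
}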
