Azure Function BlobTrigger doesn't capture all incoming events - C#

With a very simple Azure Function program, I want to test Azure Event Grid. My goal: when a file is uploaded to a Storage account, my Azure Function should be triggered and log a message like "Hello World". I have this block of code in my Azure Function:
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json.Linq;

namespace BlobTrigger
{
    public static class BlobEventGrid
    {
        [FunctionName("BlobEventGrid")]
        public static void Run(
            [EventGridTrigger] JObject eventGridEvent,
            [Blob("{data.url}", Connection = "BlobConnection")] string file,
            TraceWriter log)
        {
            log.Info("Hello World");
        }
    }
}
I have set up my Event Grid subscription following this article.
If I upload more than 50 files to my container and then check the Live Metrics of my Azure Function, I can only see 4 incoming events.
Checking the metrics on the event subscription, I see the following:
Delivered Events: 66
Matched Events: 51
Do you have any idea why only 4 events are tracked by my Azure Function?

Related

Run program.cs file of console application in Azure Function?

I have a console application which I want to convert to an Azure Function timer-trigger app that will run every hour after some data processing and uploads are done. The data processing and uploads are done via classes that are injected into the Program.cs file of the console application. Somewhere in the classes I have a Task.Delay of 1 hour, so that new data is queried after the data has been queried and uploaded for the first time.

I copied the entire code of the console application, with its packages, to the Azure Function timer-trigger app. What I am trying to do is run the Program.cs logic of the console application first in the Azure Function app, so it does its job (data processing, querying data, uploading data to Azure, ...), and then initiate the timer trigger. Is that doable? What line of code can I add to the Run method of the Azure Function app to execute the Program.cs logic first and then initiate the trigger? You can find the startup code of the Azure Function timer-trigger app below.
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace ExportServiceFunctionApp
{
    public static class ExportServiceFunctionApp
    {
        [FunctionName("ExportServiceFunctionApp")]
        public static void Run([TimerTrigger("0 0 */1 * * *")] TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
        }
    }
}
There are a few solutions to achieve this.
Solution 1: Temporarily replace the timer trigger with an HTTP trigger
While debugging the app, comment out the timer trigger line of the original Run function and add an HTTP trigger instead, like this:
public static async Task Run([HttpTrigger] Microsoft.AspNetCore.Http.HttpRequest req, ILogger log)
// public static async Task Run([TimerTrigger("0 0 * * * *")] TimerInfo myTimer, ILogger log)
{
    // YOUR REGULAR CODE HERE
}
Then, when you run the app, the console will show an endpoint for the function.
Just open the endpoint in a browser (or Postman) and the function will get called.
Right before pushing the code to the repo, bring back the original Run signature and remove the HTTP trigger one.
Solution 2: Add another HTTP trigger that calls the timer function
Add the following function to your app to expose an HTTP trigger.
[FunctionName("Test")]
public static async Task Test([HttpTrigger] Microsoft.AspNetCore.Http.HttpRequest req, ILogger log)
{
Run(null, log);
}
The function basically calls the Run function.
So when you run the app, you'll again get an endpoint from the console that can be used from the browser to trigger the function.
The URL will look like this:
http://localhost:7071/api/Test
Azure Functions is event-driven by nature: a function runs only when its trigger fires, and the Run method is the entry point for that invocation.
If you want any processing or code execution to happen before that, you need to put those steps in one more function, perform them there, and then trigger the timer-based (or any other type of) function, as sketched below.
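For example, a minimal sketch of that idea, assuming a hypothetical RunExportAsync helper holding the shared logic lifted from the console app's Program.cs: a one-off HTTP-triggered function performs the initial run, and the timer-triggered function reuses the same logic every hour.
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ExportFunctions
{
    // Hypothetical helper holding the shared logic from the console app's Program.cs.
    private static Task RunExportAsync(ILogger log)
    {
        log.LogInformation("Querying, processing and uploading data...");
        return Task.CompletedTask;
    }

    // One-off bootstrap: call this HTTP endpoint once to do the initial run.
    [FunctionName("InitialExport")]
    public static Task InitialExport([HttpTrigger] HttpRequest req, ILogger log)
        => RunExportAsync(log);

    // Recurring run: fires every hour, replacing the console app's Task.Delay loop.
    [FunctionName("HourlyExport")]
    public static Task HourlyExport([TimerTrigger("0 0 */1 * * *")] TimerInfo myTimer, ILogger log)
        => RunExportAsync(log);
}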

Understanding Azure Functions HttpTrigger scaling algorithm

I'm trying to understand how an HTTP-triggered function decides when it should be scaled.
I found that for queue-style triggers, IScaleMonitor implementations are used. There are implementations for: RabbitMQ, Blob trigger, Event Hub (1, 2), Kafka, Service Bus (1, 2), Cosmos DB, and Storage queue.
But I couldn't find any code that does this for HttpTriggers. Does anyone know where to look for the HTTP scaling algorithm?
An Azure Function that uses HTTP triggers is scaled based on the maxConcurrentRequests setting in host.json:
{
  "extensions": {
    "http": {
      "maxConcurrentRequests": 100
    }
  }
}
Documentation is here: Azure Functions HTTP output bindings
An Azure Function that uses a Service Bus trigger is scaled based on the maxConcurrentCalls setting in the host.json file.
Example host.json file with maximum concurrent calls set to 10:
{
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxConcurrentCalls": 10
      }
    }
  }
}
Documentation is here: Azure Service Bus bindings for Azure Functions

Azure Function Trigger Name/Connections from App Configuration

Is there now a way to set the trigger properties (Name/Connection) using a value from Azure App Configuration?
I added a startup class that reads the data from Azure App Configuration, but it seems the trigger sets its properties earlier than that and is therefore not able to bind the data that came from App Configuration.
I also found these threads about it, but I'm not sure if there is a new update:
https://github.com/MicrosoftDocs/azure-docs/issues/63419
https://github.com/Azure/AppConfiguration/issues/203
You can do this. The following code gets the name of the queue to monitor from an app setting, and it gets the queue message creation time in the insertionTime parameter:
public static class BindingExpressionsExample
{
    [FunctionName("LogQueueMessage")]
    public static void Run(
        [QueueTrigger("%queueappsetting%")] string myQueueItem,
        DateTimeOffset insertionTime,
        ILogger log)
    {
        log.LogInformation($"Message content: {myQueueItem}");
        log.LogInformation($"Created at: {insertionTime}");
    }
}
Similarly, you can use this approach for other triggers; a sketch for a blob trigger follows.
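For instance, a minimal sketch for a blob trigger, assuming a hypothetical app setting blobpathsetting (resolved at runtime via the %...% binding expression) and an app setting BlobConnectionSetting (the Connection property already takes an app setting name, so it needs no percent signs):
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobBindingExpressionExample
{
    [FunctionName("LogBlobUpload")]
    public static void Run(
        // %blobpathsetting% could resolve to e.g. "samples-container/{name}".
        [BlobTrigger("%blobpathsetting%", Connection = "BlobConnectionSetting")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Blob uploaded: {name}, size: {blob.Length} bytes");
    }
}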

How to connect from .NET .dll file to Azure using Linked Service

I want to write code similar to the code at the bottom of this link (https://azure.microsoft.com/en-us/blog/automating-azure-analysis-services-processing-with-azure-functions/) in Visual Studio and build a DLL file. However, instead of using the connection string, I would like to use an existing Linked Service from my Azure portal.
The goal is to create a DLL that refreshes my cube while using an existing Linked Service which is already in my Azure portal.
Is this possible?
Thanks.
#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"
using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;
public static void Run(TimerInfo myTimer, TraceWriter log)
{
log.Info($"C# Timer trigger function started at: {DateTime.Now}");
try
{
Microsoft.AnalysisServices.Tabular.Server asSrv = new Microsoft.AnalysisServices.Tabular.Server();
var connStr = ConfigurationManager.ConnectionStrings["AzureASConnString"].ConnectionString; // Change this to a Linked Service connection
asSrv.Connect(connStr);
Database db = asSrv.Databases["AWInternetSales2"];
Model m = db.Model;
db.Model.RequestRefresh(RefreshType.Full); // Mark the model for refresh
//m.RequestRefresh(RefreshType.Full); // Mark the model for refresh
m.Tables["Date"].RequestRefresh(RefreshType.Full); // Mark only one table for refresh
db.Model.SaveChanges(); //commit which will execute the refresh
asSrv.Disconnect();
}
catch (Exception e)
{
log.Info($"C# Timer trigger function exception: {e.ToString()}");
}
log.Info($"C# Timer trigger function finished at: {DateTime.Now}");
}
So I guess you're using Data Factory and you want to process your Analysis Services model from your pipeline. I don't see what your question actually has to do with the Data Lake Store.
To trigger Azure Functions from Data Factory (v2 only), you'll have to use a Web activity. It is possible to pass a linked service as part of your payload, as shown in the documentation. It looks like this:
{
  "body": {
    "myMessage": "Sample",
    "linkedServices": [{
      "name": "MyService1",
      "properties": {
        ...
      }
    }]
  }
}
However, there is no Analysis Services linked service in Data Factory; at least, I haven't heard of such a thing. Passing in a connection string from the pipeline seems like a good idea, however. You could pass it as a pipeline parameter in the body of your web request:
Create a parameter in your pipeline
Add it to your Web Activity payload
{
  "body": {
    "AzureASConnString": "@pipeline().parameters.AzureASConnString"
  }
}
You can retrieve this value in your function as described here; a sketch of the receiving side follows.
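A minimal sketch of the receiving side, assuming an HTTP-triggered v2 function and the AzureASConnString property from the payload above (all names are illustrative):
using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class ProcessCube
{
    [FunctionName("ProcessCube")]
    public static void Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Parse the Web activity's POST body and pull out the connection string.
        var body = JObject.Parse(new StreamReader(req.Body).ReadToEnd());
        var connStr = (string)body["AzureASConnString"];
        log.LogInformation($"Received a connection string of length {connStr?.Length ?? 0}.");
        // ...connect to Analysis Services with connStr, as in the code above...
    }
}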

Azure WebJobs QueueTrigger not triggering

I'm trying to find what I'm doing wrong regarding an Azure WebJobs QueueTrigger method that should be triggered from an Azure Storage Queue.
I've read a couple of documents (blog posts, MSDN articles), but I'm still not clear on one thing.
Main question / misunderstood aspect:
What should the connection string for Azure Storage be named, in both the console app's App.config and the Windows Azure Configuration (portal)? So far I have the following names set in both places:
AzureJobsStorage
AzureWebJobsStorage
AzureJobsRuntime
AzureJobsDashboard
AzureJobsData
Here's my WebJobs console app code.
static void Main()
{
JobHost host = new JobHost();
host.RunAndBlock();
}
public static void CreateLeague([QueueTrigger("temp")] string msg)
{
var task = JsonConvert.DeserializeObject<QueueTask>(msg);
if (task.TaskType == QueueTask.TaskTypes.Pdf)
RenderPdf(task.Id);
}
This console app is continuously running on my Azure Website.
I can access its "debug" page, where I can toggle output, and I see it is started/running.
Here is my code to add a message to the queue (from my ASP.NET MVC app):
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference("temp");
queue.CreateIfNotExists();

Common.QueueTask task = new Common.QueueTask();
task.TaskType = Common.QueueTask.TaskTypes.Pdf;
task.Id = p.Id;

CloudQueueMessage msg = new CloudQueueMessage(JsonConvert.SerializeObject(task));
queue.AddMessage(msg);
This code executes, and messages are added to the queue in my Storage Account, but they never get dequeued or read by the WebJob.
Hmm, the WebJobs class had to be public:
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;
using Proceed.Common;
using System;
using System.Configuration;
using System.IO;

public class WebJobsTask
{
    public static void Main()
    {
        JobHost host = new JobHost();
        host.RunAndBlock();
    }

    public static void CreateLeague([QueueTrigger("temp")] string msg)
    {
        var task = JsonConvert.DeserializeObject<QueueTask>(msg);
        if (task.TaskType == QueueTask.TaskTypes.Pdf)
            RenderPdf(task.Id);
    }
}
Also found a quick way to explore my queues: https://azurestorageexplorer.codeplex.com/.
In my case, I had assumed that QueueTrigger was referring to Service Bus queues instead of Azure Storage queues, and I actually needed to use ServiceBusTrigger; see the sketch below.
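A minimal sketch of the difference, with hypothetical queue names (QueueTrigger binds to Azure Storage queues; ServiceBusTrigger, from the Microsoft.Azure.WebJobs.ServiceBus package, binds to Service Bus queues):
using System;
using Microsoft.Azure.WebJobs;

public class Triggers
{
    // Fires for messages in an Azure Storage queue named "storage-queue".
    public static void OnStorageMessage([QueueTrigger("storage-queue")] string msg)
    {
        Console.WriteLine(msg);
    }

    // Fires for messages in a Service Bus queue named "servicebus-queue".
    public static void OnServiceBusMessage([ServiceBusTrigger("servicebus-queue")] string msg)
    {
        Console.WriteLine(msg);
    }
}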
You can use Server Explorer in Visual Studio to explore the contents of your Storage queues.
The queue triggers for the WebJobs SDK will exponentially back off if there is no work to do, so there might be a delay between the moment a message is put in a queue and the moment it is picked up. You can configure the maximum back-off through the JobHostConfiguration.Queues.MaxPollingInterval property, as in the sketch below.
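A minimal sketch, assuming the classic WebJobs SDK JobHostConfiguration API:
using System;
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration();
        // Cap the exponential back-off: poll the queue at least every 10 seconds.
        config.Queues.MaxPollingInterval = TimeSpan.FromSeconds(10);

        var host = new JobHost(config);
        host.RunAndBlock();
    }
}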
For the latest SDK you need two storage connection strings, AzureWebJobsStorage and AzureWebJobsDashboard; an App.config sketch follows.
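A minimal App.config sketch (the connection string values are placeholders for your real account credentials):
<configuration>
  <connectionStrings>
    <add name="AzureWebJobsStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
    <add name="AzureWebJobsDashboard" connectionString="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
  </connectionStrings>
</configuration>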
This is a great place for more resources: https://learn.microsoft.com/en-us/azure/app-service-web/websites-webjobs-resources
