I have a working Azure Function which puts a message on a Service Bus queue.
public static void Run(
    [TimerTrigger("0 * * * *")] TimerInfo myTimer,
    [ServiceBus("queueName", Connection = "ServiceBusConnection")] ICollector<Message> queue,
    TraceWriter log)
{
    // function logic here
}
The connection string is currently in plain text in the app settings. Is it possible to have it encrypted and still use the built-in integration between Azure Functions and Service Bus?
I have tried creating a ServiceBusAttribute at runtime, but it doesn't look like you can pass it a connection string.
Any help is much appreciated.
This is currently not possible. There is a feature request to retrieve secrets used in bindings from KeyVault: https://github.com/Azure/azure-webjobs-sdk/issues/746
The GitHub issue also describes a workaround to retrieve the secrets from KeyVault at build time within VSTS.
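(A note for later readers: Azure App Service has since added Key Vault references for app settings, which let a binding's Connection setting resolve from Key Vault without code changes. A sketch, where the vault name and secret name are placeholders:)

```
ServiceBusConnection = @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/sb-connection/)
```

The function's identity needs read access to the secret for this to resolve.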
Related
I am getting the below error when I run my timer Function App in Azure. It is just basic code, and I want the logs to show up in Application Insights.
public static class Function1
{
    [FunctionName("Function1")]
    public static void Run([TimerTrigger("0 * * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
    }
}
Here are the steps I took to identify the cause of this issue:
1. Created an Azure Function (stack: .NET 3.1) of type Timer Trigger with your timer expression "0 * * * * *" and the connection string from a storage account created in the Azure Portal; it runs successfully locally.
2. Deleted the storage account from the Azure Portal and tried to run the function locally, which gave me the error: The listener for function Function1 was unable to start.
3. Recreated the storage account and then deployed to the Azure Function App; it runs successfully in the cloud as well.
In Azure, yes, as @Skin said, it would be a storage account configuration issue.
A few steps to resolve this issue:
Check that the AzureWebJobsStorage value contains the correct storage account connection string.
Check that the storage account has not been deleted.
Check the Networking options in the Function App; the firewall might be blocking or restricting access to the associated storage account.
It was a firewall issue.
Adding the appropriate Virtual Network and Subnet under Storage Account > Networking > Firewalls and Virtual Networks resolved it.
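For reference when checking the first step above, a valid AzureWebJobsStorage value is a full storage connection string of this shape (account name and key are placeholders):

```
DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net
```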
I have a WebJob on SDK 3.
public class NotificationsFunction
{
    public async Task CustomerNotificationFunctionAsync([TimerTrigger("0 * * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation("Running job");
    }
}
It used to run correctly. Now, if I try to debug it locally, it just hangs: it finds the function but never triggers it.
Now, if I just change the method name:
public class NotificationsFunction
{
    public async Task CustomerNotificationFunctionAsyncTest([TimerTrigger("0 * * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation("Running job");
    }
}
It runs perfectly.
I have the same exact problem when I deploy to Azure.
I have no idea why this happens (and it took me a while to track it down).
Has anyone ever had this problem? If so, what can I do?
Thanks
From this document:
TimerTrigger uses the Singleton feature of the WebJobs SDK to ensure that only a single instance of your triggered function is running at any given time. When the JobHost starts up, for each of your TimerTrigger functions a blob lease (the Singleton Lock) is taken. This distributed lock ensures that only a single instance of your scheduled function is running at any time. If the blob for that function is not currently leased, the function will acquire the lease and start running on schedule immediately. If the blob lease cannot be acquired, it generally means that another instance of that function is running, so the function is not started in the current host.
The lock ID is based on the fully qualified function name.
Since your WebJob is on SDK 3, you could use AddAzureStorageCoreServices:
var builder = new HostBuilder()
    .ConfigureWebJobs(b =>
    {
        b.AddTimers();
        b.AddAzureStorageCoreServices();
    })
    .Build();
I have the same exact problem when I deploy to Azure.
Also note that if you're sharing the same storage account between your local development and production deployment, the Singleton locks (blob leases) will be shared. To get around this, you can either use a separate storage account for local development or perhaps stop the job running in Azure.
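As a sketch of the separate-storage-account option: load a development-only settings file that overrides AzureWebJobsStorage (the file name is an assumption, and this needs the Microsoft.Extensions.Configuration.Json package):

```csharp
var builder = new HostBuilder()
    // appsettings.Development.json can override AzureWebJobsStorage with a
    // development storage account, so local runs take their singleton blob
    // lease in a different account than the deployed job.
    .ConfigureAppConfiguration(c =>
        c.AddJsonFile("appsettings.Development.json", optional: true))
    .ConfigureWebJobs(b =>
    {
        b.AddTimers();
        b.AddAzureStorageCoreServices();
    })
    .Build();
```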
I have created a Service Bus-triggered Azure Function and want to log custom events to Application Insights.
private static string key = TelemetryConfiguration.Active.InstrumentationKey =
    System.Environment.GetEnvironmentVariable(
        "APPINSIGHTS_INSTRUMENTATIONKEY", EnvironmentVariableTarget.Process);

private static TelemetryClient telemetryClient =
    new TelemetryClient() { InstrumentationKey = key };

[FunctionName("Function1")]
public static void Run([ServiceBusTrigger("xxxxx", "xxxxx", AccessRights.Manage, Connection = "SBConnectionKey")] string mySbMsg, ILogger logger, TraceWriter log)
{
    log.Info($"C# ServiceBus topic trigger function processed message: {mySbMsg}");
    telemetryClient.Context.Cloud.RoleName = "AIFunction";
    logger.LogMetric("test", 123);
    telemetryClient.TrackEvent("Ack123 Received");
    telemetryClient.TrackMetric("Test Metric", DateTime.Now.Millisecond);
}
Only the log.Info($"C# ServiceBus topic trigger function processed message: {mySbMsg}"); line shows up in the trace, but the custom events and metrics are not logged to Application Insights.
Any ideas what could be going on?
Answering your explicit question:
What is wrong with the telemetry that I send or where to find it in Application Insights Portal?
I created the function with almost the same code and tested it. You can analyse the repo. I deployed the function and got the following results:
Answering your implicit question:
How to use Application Insights?
It is tricky at the beginning to use the App Insights query language; I found this succinct file helpful. Other things to consider when working with this monitoring tool:
There is a lag between when telemetry is sent and when you see it in the Application Insights portal; real-time monitoring would be an expensive feature.
In the past I faced the same issue, and the problem was that the event/metric name was not found in the name field of the telemetry but somewhere in its details. This issue might be referring to it. So what we decided to do was put in more detail, using this method and the MetricTelemetry class.
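As a hedged sketch of that approach, applied to the question's code (assuming the Microsoft.ApplicationInsights and Microsoft.ApplicationInsights.DataContracts namespaces; the property name is illustrative):

```csharp
// Event with extra detail: Properties show up under customDimensions.
var evt = new EventTelemetry("Ack123 Received");
evt.Properties["Message"] = mySbMsg;
telemetryClient.TrackEvent(evt);

// Metric built explicitly via MetricTelemetry instead of TrackMetric(name, value).
var metric = new MetricTelemetry("Test Metric", DateTime.Now.Millisecond);
telemetryClient.TrackMetric(metric);

// Flush before the function returns; buffered telemetry can otherwise be
// lost when the host recycles.
telemetryClient.Flush();
```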
Although Application Insights might seem confusing at first, it is a powerful tool, and it is worth investing the time to learn it well.
TL;DR: This example is not working for me in VS2017.
I have an Azure Cosmos DB and want to fire some logic when something adds or updates there. For that, CosmosDBTrigger should be great.
The tutorial demonstrates creating the trigger in the Azure Portal, and that works for me. However, doing just the same thing in Visual Studio (15.5.4, latest at the time) does not.
I use the default Azure Functions template, predefined Cosmos DB trigger and nearly default code:
[FunctionName("TestTrigger")]
public static void Run(
    [CosmosDBTrigger("Database", "Collection", ConnectionStringSetting = "myCosmosDB")]
    IReadOnlyList<Document> input,
    TraceWriter log)
{
    log.Info("Documents modified " + input.Count);
    log.Info("First document Id " + input[0].Id);
}
The app runs without errors, but nothing happens when I actually do things in the database, so I cannot debug or implement the required logic.
The connection string is specified in local.settings.json and is picked up; if I deliberately corrupt it, the trigger throws runtime errors.
It looks as if the connection string points to the wrong database, but it is exactly the same string, copy-pasted, that I used in the trigger made via the Azure Portal.
Where could I go wrong? What else can I check?
Based on your comment, you were running both the portal and the local app at the same time, against the same collection and the same lease collection.
That means both apps were competing with each other for locks (leases) on collection processing. The portal app won in your case and took the lease, so the local app was sitting doing nothing.
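If you do need both apps to watch the same collection, one option (a sketch; the prefix and lease collection name here are arbitrary) is to give each app its own LeaseCollectionPrefix so they keep separate leases:

```csharp
[FunctionName("TestTrigger")]
public static void Run(
    [CosmosDBTrigger("Database", "Collection",
        ConnectionStringSetting = "myCosmosDB",
        LeaseCollectionName = "leases",
        LeaseCollectionPrefix = "local")] // use a different prefix per app
    IReadOnlyList<Document> input,
    TraceWriter log)
{
    log.Info("Documents modified " + input.Count);
}
```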
I'm building an ASP.NET Core MVC site which uses an EF Core Sql Server database.
In part of the site I need to add the ability to upload files, which then get processed and applied to the database. I thought I would use WebJobs for this.
What I'm not clear on is how I share the database connection string between my website and the webjob. Is there something analogous to the ConfigureServices() method in the website's Startup.cs which I can use in the webjob's program Main()?
Program Main
Per request, here is my current WebJobs code in all its glory:
public static void Main(string[] args)
{
    var config = new JobHostConfiguration();

    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
        config.DashboardConnectionString = "...";
        config.StorageConnectionString = "...";
    }

    JobHost host = new JobHost(config);
    host.RunAndBlock();
}
Not a lot there, but I'm just starting :)
As far as I know, an Azure Web App and its WebJobs share the app settings and connection strings set in the Azure Portal. So we can define the connection string in the portal and then use ConfigurationManager to get it. I did a small test to verify this for you.
1) Set a connection string 'myconnection' with the value 'test'.
2) Used a WebJob project to output the connection string:
public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
{
    log.WriteLine(message);
    string connectionString = ConfigurationManager.ConnectionStrings["myconnection"].ToString();
    Console.WriteLine("this is my webjob project console write " + connectionString);
}
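Locally, the same ConfigurationManager lookup resolves from App.config; a minimal sketch matching the test values above:

```xml
<configuration>
  <connectionStrings>
    <!-- In Azure, a connection string named "myconnection" set in the
         portal overrides this value at runtime. -->
    <add name="myconnection" connectionString="test" />
  </connectionStrings>
</configuration>
```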
3) Then I saw the result in the Azure WebJobs dashboard.