Azure Functions Logging - C#

In Azure Functions you use host.json to configure logging. However, the logging options/sections are unclear to me. Can someone help?
Per this: https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json
"logging": {
// FILE CONFIG?
"fileLoggingMode": "debugOnly"
"logLevel": {
"Function.MyFunction": "Information", // where is this log?
"default": "None"
},
// INSIGHTS CONFIG
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes" : "Dependency;Event",
"includedTypes" : "PageView;Trace"
},
Is the FILE CONFIG section used for logging in "Log stream", or is it the INSIGHTS CONFIG section?
What is the difference between the "default" and the "Function" log level?
I assume filesystem logs are what is controlled via the FILE CONFIG section, correct?
Which configuration is used for the "Monitor" section of Azure Functions? Is it FILE CONFIG?
If so, and I wanted to see only errors under the "Monitor" section, would I set "Function.MyFunction": "Error" or "default": "Error"?

fileLoggingMode controls whether log files are generated, both on Azure and in a local environment. The different modes of "fileLoggingMode" are:
"debugOnly": log files are generated only while the function app is being debugged from the Azure portal. This is the default mode.
"always": log files are generated in the local environment as well as when running on Azure. This code reference can be helpful to understand it better.
"never": no log files are generated at all.
When "fileLoggingMode" is set to "always", the log file generated when running in the local environment is stored under "C:\Users\{user}\AppData\Local\Temp\LogFiles\Application\Functions\Function\{Function_Name}"; the path can be referenced in the Host Settings code. If you are running the function app as a Docker container on your local premises, you can change the path by updating the host configuration code.
Note that fileLoggingMode still has an open issue; see here.
As for "default" vs "Function": both accept the same set of log levels. Under logLevel you map categories ("default", "Function", "Function.<FunctionName>", ...) to the minimum level you want captured (Error, Information, Trace, ...). Check the JSON below:
{
  "logging": {
    "fileLoggingMode": "always",
    "logLevel": {
      "default": "Information",
      "Function": "Error"
    }
  }
}
Refer: Configure monitoring for Azure Functions | Microsoft Docs
"default" sets the minimum level for every category you have not listed explicitly in host.json. If you use "default": "Error", you will get only Error-level logs by default. With the configuration below, all categories log at Information by default, while the Function category logs only errors:
"default": "Information",
"Function": "Error"

Related

Use json app settings instead of system environment variables

I am working with an ASP.NET Core project that has two environments deployed to a single server: staging and production. My aim is to make staging use the appsettings.json file instead of the Windows environment variable.
I have set up an environment variable on the server (screenshot: Environment variable).
In my appsettings.json file, I have the Process setting with a different value.
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"Process": "Not ABC"
}
This is the code to retrieve it in my project.
string process = _config.GetValue<string>("Process");
The issue is that both staging and production are on the same server, meaning they use the same system environment variable. In both cases, the environment variable overrides the one in appsettings.json.
I want staging to use the appsettings.json value instead. Is there a way to do this?
I tried looking over the internet and can't find any solution.
Leveraging the information from documentation: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-6.0
I propose the following:
Create appsettings files for both Production and Staging (appsettings.Production.json and appsettings.Staging.json).
Create two releases, one for staging and one for production.
In the Production configuration file, omit the values you want to load from environment variables.
Add the environment-variables provider first and the JSON file provider second; later providers override earlier ones, so the JSON files win whenever they supply a value.
In the Staging configuration file, add your values.
This way, the Staging value will come from the app settings file and the Production value from the environment variable.
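The provider ordering described above can be sketched in Program.cs. This is a minimal sketch assuming .NET 6-style minimal hosting; the file names are the conventional environment-specific ones:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Start from a clean set of sources so the order is fully explicit.
builder.Configuration.Sources.Clear();

// 1) Environment variables first: lowest precedence.
builder.Configuration.AddEnvironmentVariables();

// 2) JSON files second: later sources override earlier ones,
//    so these win over the environment variables above.
builder.Configuration
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .AddJsonFile($"appsettings.{builder.Environment.EnvironmentName}.json",
                 optional: true, reloadOnChange: true);

var app = builder.Build();
string process = app.Configuration.GetValue<string>("Process");
```

Note that clearing the sources also drops the command-line and user-secrets providers the default builder adds; in a real app you may prefer to re-add only what you need.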

adding placeholder variables to the appSettings.json file

So when generating a new .NET Core web project, there are an appSettings.json and an appSettings.Development.json for configuration. By default these are not ignored by the .gitignore file, so I think they should stay in the repository.
I'm using a database and want to store my connection string in those files. Obviously the parameters contain sensitive information and shouldn't be in the repository. If the environment is Development, I can add the local connection string to appSettings.Development.json (but then every developer would have to use the same settings, unless I add the Development file to the .gitignore):
"Database": {
"Server": "localhost",
"Port": 5432,
"UserId": "admin",
"Password": "admin",
"Name": "myDb"
}
How can I set-up the appSettings.json for production or other purposes? Is there a way for something like
"Database": {
"Server": "$DB_SERVER",
"Port": "$DB_PORT",
"UserId": "$DB_USER_ID",
"Password": "$DB_PASSWORD",
"Name": "$DB_NAME"
}
and those placeholders would be replaced with the values from the environment variables?
If you include any sort of sensitive information (like production connection strings) in source control, it is usually considered compromised.
You have two options:
The first option would be to use appsettings file override values. This is supported by all the established CI/CD tools; that step usually happens in the release pipeline right before you deploy. In this scenario you store your values encrypted in the CD tool.
The second option is to use environment variables. In this case, for development purposes, you can just pass these variables in the launchSettings.json file, and you set the values of the environment variables on the server running your application.
If you want to use environment variables, you do not need placeholders in the appsettings file. You can read them directly:
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT")
Use user secrets
User secrets keep the configuration in a local file outside the project tree, so it isn't in the repository.
When you go to production, you store this information in an environment variable.
In Visual Studio, right-click the project, select Manage User Secrets, and add your connection:
{
  "ConnectionStrings": {
    "MyConnection": "the connection"
  }
}
In Startup, use IConfiguration:
configuration.GetConnectionString("MyConnection")
And in production, in the server's environment variables, you add a new environment variable:
ConnectionStrings:MyConnection
So only the administrator knows the connection string.
If you use Azure, you could use Azure Key Vault.
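As a sketch of setting that environment variable (the connection-string value is a placeholder; the double-underscore form is the cross-platform equivalent of the : separator, which the configuration system maps back to :):

```shell
# Windows (cmd): the colon separator works directly.
setx ConnectionStrings:MyConnection "Server=...;Database=...;"

# Linux/macOS (bash): ':' is not valid in a variable name, so use '__'.
export ConnectionStrings__MyConnection="Server=...;Database=...;"
```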

Serilog in ASP.NET Core Windows Service cannot write file as Local System

I am running an ASP.NET Core web server as a Windows service and am using Serilog to log to a file in %PROGRAMDATA%. When I run the service as Local System, nothing is logged.
I am using .Net Core 2.2 on Windows 10. I am testing by triggering an error in my service that writes a log event at the error level. I've tried running the service as my own administrative account and the logging works fine; the issue only occurs when running as Local System.
I have other Windows Services using .Net Framework that run as Local System and have no problem logging to %PROGRAMDATA% with Serilog, but this is the first time I have tried it on .Net Core. I can manually create and write to the log file within the service with Directory.CreateDirectory and File.AppendText and that works while running as Local System, but the Serilog logging does not.
Here is my Program.Main:
public static async Task Main(string[] args)
{
var isService = !(Debugger.IsAttached || args.Contains("--console"));
if (isService)
{
var pathToExe = Process.GetCurrentProcess().MainModule.FileName;
var pathToContentRoot = Path.GetDirectoryName(pathToExe);
Directory.SetCurrentDirectory(pathToContentRoot);
}
var host = WebHost.CreateDefaultBuilder(args.Where(arg => arg != "--console").ToArray())
.UseStartup<Startup>()
.UseSerilog((hostingContext, loggerConfiguration) => loggerConfiguration
.ReadFrom.Configuration(hostingContext.Configuration)
.Enrich.FromLogContext())
.Build();
// additional async initialization omitted
if (isService)
{
host.RunAsService();
}
else
{
host.Run();
}
}
And here is the Serilog section of my appsettings.json:
"Serilog": {
"MinimumLevel": {
"Default": "Verbose",
"Override": {
"Microsoft": "Warning",
"System": "Warning"
}
},
"WriteTo": [
{
"Name": "File",
"Args": {
"path": "%PROGRAMDATA%/foo/bar baz/logs/qux.log",
"fileSizeLimitBytes": 1048576,
"rollOnFileSizeLimit": "true",
"retainedFileCountLimit": 99,
"flushToDiskInterval": "00:00:01",
"outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff} {Level:u3}] {Message:lj} [{SourceContext}]{NewLine}{Exception}"
}
},
{
"Name": "Console",
"Args": {
"outputTemplate": "[{Timestamp:HH:mm:ss.fff} {Level:u3}] {Message:lj} [{SourceContext}]{NewLine}{Exception}"
}
}
]
}
I expected logging to be written to the file in %PROGRAMDATA% when the service is running as Local System, but nothing happens. Logging is written without issue when the service is run as any other administrative account.
If I haven't misunderstood, this is interesting with regard to account and system permissions:
A Local System account with "local admin" rights is effectively the same as an admin account.
If you have a domain admin and a local admin, both pretty much have the same function.
If you are part of a domain, you typically do not want to log into your computer as the domain admin.
You always want to use the local admin account if you can. The reason is that if your computer has a virus on it and you log in as domain admin, you have just opened the door for the virus across your entire network.
If you need to write to %PROGRAMDATA%, then you should grant the permission and use it like this: https://stackoverflow.com/a/30792263/914284

Azure function implemented locally won't work in the cloud

I have the following function, which I define locally and am able to debug it normally.
[FunctionName("QueueTrigger")]
public static void DUMMYFUNCTION(
[QueueTrigger("myqueue", Connection = "AzureWebJobsStorage")]string myQueueItem, TraceWriter log)
{
log.Info($"C# function processed: {myQueueItem}");
}
Locally, "AzureWebJobsStorage" is defined in the local.settings.json file to use the storage account which has "myqueue". In the function settings on Azure "AzureWebJobsStorage" is also set to the correct connection string (same as the one set locally). That means, I do not have the same problem as in Azure Function does not execute in Azure (No Error)
Now, I use Visual Studio Team Service to host my source code in a git repository. I've configured the deployment to use the source code and deploy the functions contained in it.
I don't think the issue is related to VSTS because the deployment is performed successfully and the function is displayed in my functions list:
After the deployment, the file function.json is generated and has the content below:
{
"generatedBy": "Microsoft.NET.Sdk.Functions.Generator-1.0.8",
"configurationSource": "attributes",
"bindings": [
{
"type": "queueTrigger",
"connection": "AzureWebJobsStorage",
"queueName": "myqueue",
"name": "myQueueItem"
}],
"disabled": false,
"scriptFile": "../bin/myAssembly.dll",
"entryPoint": "myAssembly.MyClass.DUMMYFUNCTION"
}
The problem is that, when I add an item to the queue while debugging it locally, the function is executed, but when the function is running on azure it does not.
What do I need to change in the code to have it work on azure as well? I thought it would work out-of-the-box.
Is your function running at all? If you go into Kudu, do you see any log showing that your function actually ran?
If your function is not running at all: Azure Functions 2 (using the .NET Standard 2.0 framework) is still in preview (beta). So when you deploy your function, make sure to go into the Application Settings of your function app and set the FUNCTIONS_EXTENSION_VERSION value to beta.
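The same app setting can be changed from the Azure CLI instead of the portal. A sketch, with the app and resource-group names as placeholders:

```shell
az functionapp config appsettings set \
  --name <your-function-app> \
  --resource-group <your-resource-group> \
  --settings FUNCTIONS_EXTENSION_VERSION=beta
```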

Trying to host a test site with Azure, but SQL Database won't transfer

I am a new web developer and am trying to host a test site with Azure test services.
I can see the test site (you can access this to test) at: http://kencast20160830102548.azurewebsites.net/
However, if you go to the Services -> Fazzt -> Equipment and Applications pages, I get this error:
Error.
An error occurred while processing your request.
Development Mode
Swapping to Development environment will display more detailed information about the error that occurred.
Development environment should not be enabled in deployed applications, as it can result in sensitive information from exceptions being displayed to end users. For local debugging, development environment can be enabled by setting the ASPNETCORE_ENVIRONMENT environment variable to Development, and restarting the application.
These pages are relying on a SQL database, so I think this is where the problem is.
I've been trying to follow along with the directions published here: https://docs.asp.net/en/latest/tutorials/publish-to-azure-webapp-using-vs.html
however I cannot find the "Configure SQL Database" pop-up box when logged into my Microsoft Azure account.
The directions do not seem to go along with what exists in Azure.
Update- 8/31/2016
I have researched and learned a bit more:
I have one project with two DBContexts.
When I publish to Azure, it publishes tables from ApplicationDBContext but not the tables from MyCompanyContext. I can verify using SQL Server Object Explorer.
I can see my local connection strings in appsettings.json file for both ApplicationDB and MyCompanyDB. Here is the code from appsettings.json:
{
"ConnectionStrings": {
"DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=aspnet-MyCompany-3097a012-5e00-4c25-ad83-2730b4d73b4b;Trusted_Connection=True;MultipleActiveResultSets=true"
},
"Logging": {
"IncludeScopes": false,
"LogLevel": {
"Default": "Debug",
"System": "Information",
"Microsoft": "Information"
}
},
"Data": {
"MyCompanyContext": {
"ConnectionString": "Server=(localdb)\\mssqllocaldb;Database=MyCompanyContext-f9de2588-77e8-41dd-abb3-84341610b31a;Trusted_Connection=True;MultipleActiveResultSets=true"
}
}
}
However, when I look at the "SQL Server Object Explorer" window, I see that the database hosted on Azure, mycompanydb.database.windows.net(SQL Server 11.0.9231 -mycompanydb, MyCompany_db) only has the tables from the "DefaultConnection" database, and nothing from "MyCompanyContext".
How do I get the tables from the second database (MYCompanyContext) to Azure?
I have been studying this Stack Overflow response, but it uses Enable-Migrations in the PMC. When I do that, I get an error that Enable-Migrations is obsolete.
Locally, I have always done migrations with this:
Add-Migration -Context MyCompanyContext
Any help you could give me would be greatly appreciated.
Thanks!
From the comments, I am guessing you need to ensure both your contexts are registered in the Startup class, like this:
services.AddEntityFramework().AddSqlServer()
    .AddDbContext<MyCompanyContext>(options => options.UseSqlServer(Configuration.Get("Data:SQL")))
    .AddDbContext<ApplicationDBContext>(options => options.UseSqlServer(Configuration.Get("Data:SQL")));
The above assumes two things:
1) You have the connection string set up like this in your appsettings.json:
"Data": {
  "SQL": "Server=tcp:yourDbServer.database.windows.net,1433;Initial Catalog=yourDb;Persist Security Info=False;User ID=youId;Password=YourPswd;MultipleActiveResultSets=True;TrustServerCertificate=False;Connection Timeout=30;"
}
2) Both contexts share the same database. If not, substitute the appropriate connection string (make sure it matches the key in your appsettings.json) for each context, like this sample:
.AddDbContext<MyCompanyContext>(options => options.UseSqlServer(Configuration.Get("Data:SQL2")));
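Since Enable-Migrations is obsolete in EF Core, the per-context migration workflow the question is reaching for can be sketched with the EF Core command-line tools (the migration name is a placeholder; Add-Migration and Update-Database with -Context are the PMC equivalents):

```shell
# Create a migration for the second context and apply it to the
# database its connection string points at (e.g. the Azure SQL db).
dotnet ef migrations add InitialMyCompany --context MyCompanyContext
dotnet ef database update --context MyCompanyContext
```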
