I am writing a UWP app in C# and I would like to use Serilog to write log files.
When I do the same in a .NET Core console application, it creates the file with the expected content.
In the UWP app I can show the log messages in the GUI, but no log file is created (yes, I have broadFileSystemAccess).
My code:
Log.Logger = new LoggerConfiguration().WriteTo.File("log.txt").CreateLogger();
Log.Debug("TEST");
Log.CloseAndFlush();
I would expect it to create the log.txt file in the directory that can be obtained via Windows.Storage.ApplicationData.Current.LocalFolder, like
storageFolder.CreateFileAsync("test.txt", CreationCollisionOption.ReplaceExisting);
does.
Can anyone tell me what I have to change, or what I have to consider, to log to a file with Serilog in my UWP app?
One of the great things about Serilog is that it's open-source and you can look at its source code and see what it is doing / how it works.
If you peek at the source code of the File sink, you'll notice that it simply opens a file via System.IO.File.Open in the folder where your app is running, which, when running from Visual Studio, is probably something like C:\Users\augustoproiete\MyApp\MyUwpApp\bin\x64\Debug\AppX\, where you don't have permission to write files.
That means you have to be explicit about where to store the file, e.g. in the application data folder of your app:
var logFilePath = Path.Combine(ApplicationData.Current.LocalFolder.Path, "log.txt");
Log.Logger = new LoggerConfiguration().WriteTo.File(logFilePath).CreateLogger();
Log.Debug("TEST");
Log.CloseAndFlush();
I'd recommend you read the Debugging and Diagnostics page in Serilog's docs, as it explains how to surface error messages from Serilog; with that enabled you would have seen that it was failing to create the file, and the path it was trying to use.
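As a concrete sketch of what that page describes, enabling SelfLog is a one-liner; where you route the messages is up to you (the debugger Output window here is just one option):

```csharp
using System.Diagnostics;
using Serilog.Debugging;

// Surface Serilog's internal errors (such as "unable to open log file")
// in the debugger's Output window; by default they are swallowed silently.
SelfLog.Enable(message => Debug.WriteLine(message));
```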
Thank you for your answer!
I thought that the local folder path was used automatically, so I did not try that.
I just had to modify the code as follows:
var logFilePath = Path.Combine(ApplicationData.Current.LocalFolder.Path, "log.txt");
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Verbose() // otherwise Debug messages are not logged
.WriteTo.File(logFilePath)
.CreateLogger();
Log.Debug("TEST");
Log.CloseAndFlush();
then it worked and wrote the Debug message to the log file :)
Thank you very much!
Related
I am using the Serilog library in a C# console application to build a scheduler, which logs messages to a log file for each day. The file names are created dynamically using the code below.
Log.Logger = new LoggerConfiguration()
.WriteTo.Console()
.WriteTo.File("logs\\log-scheduler-.txt", rollingInterval: RollingInterval.Day)
.CreateLogger();
How can I find out the name of the file that my console application has created? At the end of my program, I want to email the log file as an attachment.
Serilog doesn't expose a way to inspect the name of the file(s) that were created, as of this writing.
If you don't use a RollingInterval and have the log messages written to a single file, then you know exactly what the name of the file is, because you already specified it in the log pipeline configuration: logs\\log-scheduler-.txt.
But if you want to use a RollingInterval, what you could do is inspect the folder your log files are written to and find the file(s) updated since your application started: capture the current timestamp when your app starts, then look at the LastWriteTimeUtc attribute of the log files in the file system to see what has changed since that time.
e.g.
DateTime appStartTime = DateTime.UtcNow;
// ... (app code)
// Ensure that all messages are flushed to the file
Log.CloseAndFlush();
// Get all log files modified since the app started
var logFilesUpdatedInThisSession = new DirectoryInfo(@"C:\Temp\logs\")
.EnumerateFileSystemInfos("log-scheduler-*.txt")
.Where(f => f.LastWriteTimeUtc >= appStartTime)
.OrderBy(f => f.LastWriteTimeUtc)
.ToList();
if (logFilesUpdatedInThisSession.Any())
{
// Send all files via e-mail ...
}
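To flesh out that "send via e-mail" comment, attaching the discovered files with System.Net.Mail could look roughly like this. The host and addresses are placeholders, and note that Microsoft now steers new code toward MailKit rather than SmtpClient:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Mail;

// Collect the session's log files; the directory and pattern match the
// configuration above, and the SMTP details below are purely illustrative.
var logDir = new DirectoryInfo(@"C:\Temp\logs\");
var logFiles = logDir.Exists
    ? logDir.EnumerateFiles("log-scheduler-*.txt").OrderBy(f => f.LastWriteTimeUtc).ToList()
    : new List<FileInfo>();

if (logFiles.Count > 0)
{
    using (var message = new MailMessage("scheduler@example.com", "ops@example.com"))
    using (var client = new SmtpClient("smtp.example.com"))
    {
        message.Subject = $"Scheduler logs {DateTime.UtcNow:yyyy-MM-dd}";
        message.Body = "Log files from the latest run are attached.";
        foreach (var file in logFiles)
            message.Attachments.Add(new Attachment(file.FullName));
        client.Send(message); // requires a real SMTP host
    }
}
```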
Another alternative (better, IMO) would be to not send e-mails from your app at all, and instead ship the logs to a server such as a Seq server, which allows you to easily browse the logs, apply filters, etc.; that is a much better experience than opening a log file attached to an e-mail in a text editor.
I want to be able to send application logs to CloudWatch Logs, and I learned that there is a CloudWatch Agent service that runs in the background, reads logs from a log file, and sends only the delta (the new entries) to CloudWatch Logs. All this makes sense to me. Then I got to know about NLog, a C# logging framework, and wrote the POC below to send logs.
static void Main(string[] args)
{
ConfigureNLog();
var logger = NLog.LogManager.GetCurrentClassLogger();
logger.Info("Hello World");
logger.Log(LogLevel.Info, "Sample informational message");
}
static void ConfigureNLog()
{
var accessKey = ConfigurationManager.AppSettings.Get("AWSAccessKey");
var secretKey = ConfigurationManager.AppSettings.Get("AWSSecretKey");
var config = new LoggingConfiguration();
var awsTarget = new AWSTarget()
{
LogGroup = "NLog.ProgrammaticConfigurationExample",
Region = "us-east-1",
Credentials = new BasicAWSCredentials(accessKey, secretKey)
};
config.AddTarget("aws", awsTarget);
config.LoggingRules.Add(new LoggingRule("*", LogLevel.Debug, awsTarget));
LogManager.Configuration = config;
}
Now when I run the above code, I am able to send logs to CloudWatch. But I am confused now: where does the CloudWatch Agent fit in?
Since I am sending log data directly, does that mean I don't need the CloudWatch Agent in my scenario?
If I wanted to use the CloudWatch Agent, would I need to use a file as the NLog target and then tell the CloudWatch Agent to send that log file to CloudWatch Logs?
Is my understanding correct? Please help me understand the flow.
Is the flow below correct?
NLog writes log to file -> CloudWatch Agent reads log from there -> sends log
to CloudWatch
Question: How do I use the CloudWatch Agent in the above POC to send data via NLog?
The CloudWatch Agent runs on your server and can watch log files that are produced. These log files can be anything: IIS logs, time logs, the Event Log, etc. When a log file is updated, the CWA grabs the updates and sends them to CloudWatch. This is the generic behavior of the CWA and is great for Event Logs and OS-level logging.
By modifying the CWA's AWS.EC2.Windows.CloudWatch.json configuration file, you can have it watch log files in particular formats and send changes to CloudWatch beyond the standard/example ones it handles by default. You can update the json to match your NLog entry layout and have it watch for that specific format in the file. Note that the CWA does introduce a delay before entries reach CloudWatch.
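For reference, a custom-log component entry in AWS.EC2.Windows.CloudWatch.json looks roughly like the sketch below. The Id, directory path and timestamp format are hypothetical and must match your NLog layout, so treat this as a starting point and check it against the CloudWatch Agent documentation:

```json
{
  "Id": "NLogCustomLogs",
  "FullName": "AWS.EC2.Windows.CloudWatch.CustomLog.CustomLogInputComponent,AWS.EC2.Windows.CloudWatch",
  "Parameters": {
    "LogDirectoryPath": "C:\\MyApp\\logs",
    "TimestampFormat": "yyyy-MM-dd HH:mm:ss.ffff",
    "Encoding": "UTF-8",
    "Filter": "",
    "CultureName": "en-US",
    "TimeZoneKind": "Local"
  }
}
```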
Now, you have NLog, which writes log files. You can have NLog send the log entries to a file and have the CloudWatch Agent watch that file, pick up the changes, and send them; or you can have NLog send the entries directly to CloudWatch. Since you are writing directly to CloudWatch through an NLog target, you don't need the agent for your NLog files. I suggest keeping the CWA for other log files such as IIS or event logs.
I guess it is a matter of preference how you do it. I think NLog targets with layouts are easier than wrestling with the CloudWatch json file to match the log format. I only use the CWA to ship log files I have no control over, and use an NLog target to send my own NLog entries.
I can post an example CWA json snippet for a third-party log file I monitor with the CWA if you need one.
When an application just has to write to a file, it lives a very simple life with few problems.
When an application suddenly has to handle network traffic (with timeouts, disconnects, retries, connectivity, latency, etc.), it will suddenly have issues with things queuing up, consuming memory, using sockets, causing garbage collections, stalls, etc. (and losing all pending log events on a crash).
Depending on the lifetime and criticality of your application, it can be useful to give it a simple life, and let a friend like the CloudWatch Agent worry about the network stuff.
See also https://github.com/NLog/NLog.Extensions.Logging/wiki/NLog-cloud-logging-with-Azure-function-or-AWS-lambda
I created a new .NET Core web project using Visual Studio and integrated Serilog. It reads settings from appsettings.json (via .UseSerilog((ctx, config) => { config.ReadFrom.Configuration(ctx.Configuration); })).
In appsettings.json, the path to write the log is specified at Serilog/WriteTo/Args/pathFormat. If I set that value to log.txt, it will attempt to write the file to c:\program files\iisexpress\log.txt.
How can I get it to write to the content root of my web app?
Ideally, you don't want to write log files to the content root of your web app, since they could end up being accessible over the web, which would be a serious security concern.
The log file path configuration can include environment variables, so something like %TMP%\log.txt, or a path based on %LOCALAPPDATA%, etc., is a good option (as long as the web site's worker process has permission to write to that location).
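To check what such a value expands to at runtime, .NET's own helper performs the same %VAR% substitution (the variable used here is just an illustration):

```csharp
using System;

// Illustration only: expand a %VAR%-style log path the way the settings
// reader would; TMP is assumed to be set, as it normally is on Windows.
var path = Environment.ExpandEnvironmentVariables(@"%TMP%\log.txt");
Console.WriteLine(path);
```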
You can write to a path that's relative to your web app by changing the current directory before setting up the logger:
Environment.CurrentDirectory = AppContext.BaseDirectory;
Or, you can combine these approaches and set your own environment variable to do it:
Environment.SetEnvironmentVariable("BASEDIR", AppContext.BaseDirectory);
Then use the following config:
"%BASEDIR%\logs\log.txt"
I'm trying to implement Serilog in a .NET Core library, to have a good abstraction over this third party and be able to use it in the different projects in my solution.
So I configured Serilog like the example on their GitHub:
if(Log.Logger == null){
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Information()
.WriteTo.LiterateConsole()
.WriteTo.RollingFile("logs/myUsefullLogs.txt")
.CreateLogger();
}
And I log an information like this :
Log.Information(message,ex,source);
Log.CloseAndFlush();
If I put a breakpoint, everything seems to work perfectly, but when I search for the file, I'm not able to find it.
Has anybody already faced this on macOS?
Log.Logger will never have the value null, so your configuration code above will never run.
When logging is not configured, Log.Logger is assigned an instance of the (internal) SilentLogger class: https://github.com/serilog/serilog/blob/dev/src/Serilog/Log.cs#L43.
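If the goal was to configure the logger only once, guard it with your own flag instead of a null check. A minimal sketch, assuming the current Console and File sinks in place of the older LiterateConsole and RollingFile ones:

```csharp
using Serilog;

// In a real library this flag would be a static field; the point is that
// Log.Logger itself always has a value (a silent logger until you assign one).
var loggerConfigured = false;

if (!loggerConfigured)
{
    loggerConfigured = true;
    Log.Logger = new LoggerConfiguration()
        .MinimumLevel.Information()
        .WriteTo.Console()
        .WriteTo.File("logs/myUsefullLogs.txt", rollingInterval: RollingInterval.Day)
        .CreateLogger();
}

Log.Information("Logger configured once");
Log.CloseAndFlush();
```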
I have a recent problem. I can upload files to my inetpub/wwwroot/ folder,
but I can't write a log file to that same folder...
I have all the permissions for the Network Service account. Everything is on my server.
DirectoryInfo di = new DirectoryInfo(Server.MapPath("~/"));
// Get a reference to each file in that directory.
FileInfo[] fiArr = di.GetFiles();
string strLogText = di.FullName;
// Create a writer and open the file:
StreamWriter log;
if (!System.IO.File.Exists("C:\\inetpub\\wwwroot\\logfile.txt"))
{
log = new StreamWriter("C:\\inetpub\\wwwroot\\logfile.txt");
}
else
{
log = System.IO.File.AppendText("C:\\inetpub\\wwwroot\\logfile.txt");
}
// Write to the file:
log.WriteLine(DateTime.Now);
log.WriteLine(strLogText);
log.WriteLine();
// Close the stream:
log.Close();
The error is: access is denied!
It works locally, but on my server it doesn't. In the Inetpub folder, do I just need to allow writing for Network Service? That is strange, because I can upload files, so writing is already enabled.
Emged, in case of exceptions your code does not close the streams on the log file, and this is surely not good.
You should use a using statement around the streams, so that the streams are closed and disposed in any case, even when exceptions occur.
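For illustration, the appending logic from the question wrapped in a using statement; File.AppendText creates the file when it is missing, so the existence check disappears as well (the path is the one from the question):

```csharp
using System;
using System.IO;

var logPath = @"C:\inetpub\wwwroot\logfile.txt";
var strLogText = "some log text";

// AppendText creates the file if needed and appends otherwise; the using
// block guarantees the writer is flushed and closed even on exceptions.
using (StreamWriter log = File.AppendText(logPath))
{
    log.WriteLine(DateTime.Now);
    log.WriteLine(strLogText);
    log.WriteLine();
}
```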
As Chris has suggested, I would absolutely opt for a logging framework, and I would also avoid writing to that wwwroot folder.
ELMAH, NLog and log4net are good and easy alternatives, far better than any custom logging like you are doing right now, and the big advantage of these technologies/libraries is that you can change the behaviour at runtime simply by editing the configuration file; no need to rebuild or redeploy anything...
My favourite is actually log4net; check these for a simple example of how to use it:
http://logging.apache.org/log4net/release/manual/configuration.html
Log4Net in App object?
Depending on the version of your server (Windows 2008 and above), that directory has additional protection against writes.
I'd highly recommend you look into ELMAH to do your logging. It gives you a number of options, including in-memory or database-backed storage, and it collects a LOT of additional data you might want.
Further, opening up various physical directory locations for write access is a HUGE security no-no.
On the server, is the web app running under an application pool that has alternate credentials, other than the normal Network Service account? If you haven't done so already, try turning on auditing to see which user is trying to access the file.