I want to be able to send application logs to CloudWatch Logs. I learned that there is a CloudWatch Agent service that runs in the background, reads logs from a log file, and sends only the delta (the new log lines) to CloudWatch Logs. All this makes sense to me. Then I learned about NLog, a C# logging framework, and wrote the POC below to send logs.
static void Main(string[] args)
{
    ConfigureNLog();
    var logger = NLog.LogManager.GetCurrentClassLogger();

    logger.Info("Hello World");
    logger.Log(LogLevel.Info, "Sample informational message");

    // Flush and close targets so pending events reach CloudWatch before the process exits
    LogManager.Shutdown();
}
static void ConfigureNLog()
{
    var accessKey = ConfigurationManager.AppSettings.Get("AWSAccessKey");
    var secretKey = ConfigurationManager.AppSettings.Get("AWSSecretKey");

    var config = new LoggingConfiguration();

    // AWSTarget comes from the NLog.AWS.Logger NuGet package
    var awsTarget = new AWSTarget()
    {
        LogGroup = "NLog.ProgrammaticConfigurationExample",
        Region = "us-east-1",
        Credentials = new BasicAWSCredentials(accessKey, secretKey)
    };
    config.AddTarget("aws", awsTarget);
    config.LoggingRules.Add(new LoggingRule("*", LogLevel.Debug, awsTarget));

    LogManager.Configuration = config;
}
Now when I run the above code, I am able to send logs to CloudWatch. But I am confused: where does the CloudWatch Agent fit in?
Since I am sending log data directly, does that mean I don't need the CloudWatch Agent in my scenario?
If I want to use the CloudWatch Agent, do I need to have NLog write to a FILE target and then tell the CloudWatch Agent to ship that log file to CloudWatch Logs?
Is my understanding correct? Please help me understand the flow.
Is the flow below correct?
NLog writes logs to a file -> CloudWatch Agent reads new entries from that file -> sends them to CloudWatch Logs
Question: how can the CloudWatch Agent be used in the above POC to send data written via NLog?
The CloudWatch Agent (CWA) runs on your server and can watch log files that are produced there. These log files can be anything: IIS logs, text logs, the Windows Event Log, etc. When a log file is updated, CWA grabs the new entries and sends them to CloudWatch. This is the generic behavior of the CWA and is great for Event Logs and OS-level logging.
By modifying the AWS.EC2.Windows.CloudWatch.json CWA configuration file, you can configure it to watch log files in formats beyond the standard/example ones it handles by default. You can update the json to match your NLog entry layout and have it watch for that specific format in the file. Note that CWA does have some delay before entries appear in CloudWatch.
Now, NLog writes log files. You can have NLog send the log entries to a file and have the CloudWatch Agent watch that file, pick up the changes, and send them, or you can have NLog send the entries directly to CloudWatch. Since you are writing directly to CloudWatch through an NLog target, you don't need the CloudWatch Agent for your NLog files. I suggest keeping CWA for other log files like IIS or Event Logs.
I guess it is a matter of preference. I think NLog targets with layouts are easier than wrestling the CloudWatch json file to match the log format. I only use CWA to send log files I have no control over, and use an NLog target to send my NLog entries.
I can post an example CWA json snippet for a 3rd-party log file I monitor with CWA if you need an example; a rough sketch of the general shape is below.
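For illustration only, a sketch of the legacy AWS.EC2.Windows.CloudWatch.json layout; the paths, timestamp format, and component names here are placeholders rather than a working config, so check the AWS documentation for the exact parameters your agent version supports:
{
  "EngineConfiguration": {
    "Components": [
      {
        "Id": "NLogCustomLogs",
        "FullName": "AWS.EC2.Windows.CloudWatch.CustomLog.CustomLogInputComponent,AWS.EC2.Windows.CloudWatch",
        "Parameters": {
          "LogDirectoryPath": "C:\\MyApp\\logs\\",
          "TimestampFormat": "yyyy-MM-dd HH:mm:ss",
          "Encoding": "UTF-8",
          "TimeZoneKind": "Local"
        }
      },
      {
        "Id": "CloudWatchLogs",
        "FullName": "AWS.EC2.Windows.CloudWatch.CloudWatchLogsOutput,AWS.EC2.Windows.CloudWatch",
        "Parameters": {
          "Region": "us-east-1",
          "LogGroup": "MyApp-LogGroup",
          "LogStream": "{instance_id}"
        }
      }
    ],
    "Flows": {
      "Flows": [ "NLogCustomLogs,CloudWatchLogs" ]
    }
  }
}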
When an application just has to write to a file, it lives a very simple life with few problems.
When an application suddenly has to handle network traffic (timeouts, disconnects, retries, connectivity, latency, etc.), it can run into issues with events queuing up, consuming memory, using sockets, causing garbage collections, stalls, etc. (and losing all pending log events on a crash).
Depending on the lifetime and criticality of your application, it can be useful to give it a simple life, and let a friend like the CloudWatch Agent worry about the network stuff.
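In that setup, NLog only needs a plain file target for the agent to watch. A minimal sketch, assuming NLog's FileTarget with a placeholder path and layout:
var config = new LoggingConfiguration();

// Write plain log lines to a file; the CloudWatch Agent tails this file and ships the new lines
var fileTarget = new FileTarget("file")
{
    FileName = "C:\\MyApp\\logs\\app.log",
    Layout = "${longdate}|${level:uppercase=true}|${logger}|${message}"
};

config.AddTarget(fileTarget);
config.LoggingRules.Add(new LoggingRule("*", LogLevel.Debug, fileTarget));
LogManager.Configuration = config;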
See also https://github.com/NLog/NLog.Extensions.Logging/wiki/NLog-cloud-logging-with-Azure-function-or-AWS-lambda
I am adding Application Insights (AI) to my web API service by following this page: Application Insights Instructions. I managed to get my service to connect to AI, and I am able to see when my service performs a POST, GET, etc. I also placed log calls throughout my service, but none of them are being written to my AI's Traces log.
I made sure to set up my Startup.cs and appsettings.json files to contain the new code needed to run AI throughout my service, and configured the logging settings so AI captures logs at Debug level and up.
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();
}
appsettings.json
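A typical shape for this scenario, assuming the standard Application Insights logging provider section (a sketch; the exact file may differ):
{
  "Logging": {
    "LogLevel": {
      "Default": "Information"
    },
    "ApplicationInsights": {
      "LogLevel": {
        "Default": "Debug"
      }
    }
  },
  "ApplicationInsights": {
    "InstrumentationKey": "<your instrumentation key>"
  }
}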
Logging Example
public async Task ProcessQueueAsync(dBData dbContext)
{
    // _logger is of type ILogger<[INSERT CLASS NAME]>
    _logger.LogDebug("This is a test log by Lotzi11.");
    await ProcessQueueAsyncSingle(dbContext, CancellationToken.None);
}
Can someone help me figure out why my logs are not being sent to AI?
Your code configuration and appsettings.json are correct. With your settings, I can see these logs in AI on my side.
One thing you should know is that it may take a few minutes for the data to arrive at the AI server. Please wait a few minutes, say 5 or more, then query the data again from the Azure portal -> Application Insights.
A simple way to check whether the data is being sent to AI: in Visual Studio, when running the project, search for the logs in the Output window. If you can find the logs there, they should be being sent to AI.
If you still cannot see these logs in AI, you should also check whether you have set up something like filtering or sampling in your code.
While I was working on solving my problem, I found out that my company uses Serilog to handle logging, so I had to alter my project so Serilog would also send logs to AI. I modified my code following this page: serilog-sinks-applicationinsights.
This led me to realize that even though I followed Microsoft's instructions on setting up AI, my ILogger was not properly set up to send logs to AI. To fix that, I altered my Startup.cs constructor:
public Startup(IHostEnvironment environment, IConfiguration configuration)
{
    var env = new Environment(environment.EnvironmentName);
    _systemConfiguration = new SystemConfiguration(env, configuration);
    _systemConfiguration.Validate();

    Log.Logger = new LoggerConfiguration()
        .Enrich.FromLogContext()
        .WriteTo.ApplicationInsights(
            _systemConfiguration.BaseConfiguration["APPINSIGHTS_INSTRUMENTATIONKEY"],
            TelemetryConverter.Traces)
        .CreateLogger();

    using var provider = new SerilogLoggerProvider(Log.Logger);
    _logger = provider.CreateLogger(nameof(Startup));
}
After adding AI to Log.Logger, my logs began showing up on my AI page.
I am using the Serilog library in a C# console application to build a scheduler, which logs messages to a log file for each day. The filenames are created dynamically using the code below.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.File("logs\\log-scheduler-.txt", rollingInterval: RollingInterval.Day)
    .CreateLogger();
How can I find out the name of the file that my console application has created? At the end of my program, I want to email the log file as an attachment.
Serilog doesn't expose a way to inspect the name of the file(s) that were created, as of this writing.
If you don't use a RollingInterval and have the log messages written to a single file, then you know exactly what the name of the file is, because you specified it in the log pipeline configuration: logs\\log-scheduler-.txt.
But if you do use a RollingInterval, then what you could do is inspect the folder your log files are written to and find the log file(s) updated since your application started: capture the current timestamp when your app starts, then look at the LastWriteTimeUtc of the log files in the file system to see which ones changed after that time.
e.g.
DateTime appStartTime = DateTime.UtcNow;

// ... (app code)

// Ensure that all messages are flushed to the file
Log.CloseAndFlush();

// Get all log files modified since the app started
var logFilesUpdatedInThisSession = new DirectoryInfo(@"C:\Temp\logs\")
    .EnumerateFileSystemInfos("log-scheduler-*.txt")
    .Where(f => f.LastWriteTimeUtc >= appStartTime)
    .OrderBy(f => f.LastWriteTimeUtc)
    .ToList();

if (logFilesUpdatedInThisSession.Any())
{
    // Send all files via e-mail ...
}
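Alternatively, since RollingInterval.Day appends the date to the file name in yyyyMMdd format, you could construct the expected name yourself. A sketch (this assumes the app doesn't run across midnight, otherwise more than one file may exist):
// With RollingInterval.Day, Serilog produces e.g. logs\log-scheduler-20240101.txt
string expectedLogFile = Path.Combine("logs", $"log-scheduler-{DateTime.Now:yyyyMMdd}.txt");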
Another alternative (better, IMO) would be not to send e-mails from your app at all, and instead ship the logs to a server such as a Seq server, which lets you easily browse the logs, apply filters, etc. That's a much better experience than opening a log file attached to an e-mail in a text editor.
I am an experienced windows C# developer, but new to the world of Azure, and so trying to figure out a "best practice" as I implement one or more Azure Cloud Services.
I have a number of (external, and outside of my control) sources that can all save files to a folder (or possibly a set of folders). In the current state of my system under Windows, I have a FileSystemWatcher set up to monitor a folder and raise an event when a file appears there.
In the world of Azure, what is the equivalent way to do this? Is there one?
I am aware I could create a timer (or sleep) to pass some time (say 30 seconds), and poll the folder, but I'm just not sure that's the "best" way in a cloud environment.
It is important to note that I have no control over the inputs - in other words the files are saved by an external device over which I have no control; so I can't, for example, push a message onto a queue when the file is saved, and respond to that message...
Although, in the end, that's the goal... So I intend to have a "Watcher" service which will (via events or polling) detect the presence of one or more files, and push a message onto the appropriate queue for the next step in my workflow to respond to.
It should be noted that I am using VS2015, and the latest Azure SDK stuff, so I'm not limited by anything legacy.
What I have so far is basically this (a snippet of a larger code base):
storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create a CloudFileClient object for credentialed access to File storage.
fileClient = storageAccount.CreateCloudFileClient();

// Obtain the file share name from the config file
string sharenameString = CloudConfigurationManager.GetSetting("NLRB.Scanning.FileSharename");

// Get a reference to the file share.
share = fileClient.GetShareReference(sharenameString);

// Ensure that the share exists.
if (share.Exists())
{
    Trace.WriteLine("Share exists.");

    // Get a reference to the root directory for the share.
    rootDir = share.GetRootDirectoryReference();

    // Here is where I want to start watching the folder represented by rootDir...
}
Thanks in advance.
If you're using an attached disk (or the local scratch disk), the behavior is like on any other Windows machine, so you'd just set up a FileSystemWatcher accordingly and deal with callbacks as you normally would.
There's Azure File Service, which is SMB as-a-service and supports any action you'd be able to do on a regular SMB volume on your local network.
And there's Azure blob storage. Blobs cannot be watched; you'd have to poll for changes to, say, a blob container.
You could create a loop that polls the root directory periodically using the CloudFileDirectory.ListFilesAndDirectories method (https://msdn.microsoft.com/en-us/library/dn723299.aspx). You could also write a small recursive method to call this in subdirectories.
To detect differences, you can build up an in-memory hash map of all files and directories. If you want something like a persistent distributed cache, you can use e.g. Redis to keep this list of files/directories. Every time you poll, if a file or directory is not in your list, you have detected a new file/directory under the root.
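A minimal polling sketch along these lines, reusing the rootDir reference from the question's snippet (the 30-second interval and the knownFiles set are illustrative choices):
var knownFiles = new HashSet<string>();
while (true)
{
    // Only the root level is checked here; recurse into CloudFileDirectory items for subdirectories
    foreach (var item in rootDir.ListFilesAndDirectories())
    {
        if (item is CloudFile file && knownFiles.Add(file.Name))
        {
            // New file since the last poll - e.g. push a message onto a queue here
            Trace.WriteLine("New file detected: " + file.Name);
        }
    }
    Thread.Sleep(TimeSpan.FromSeconds(30));
}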
You could also separate the responsibility of detection from the business logic: e.g. one worker role keeps polling the directory and writes new files to a queue, and a consumer on the other end (another worker role / web service) does the processing with that information.
Azure Blob Storage pushes events through Azure Event Grid. Blob storage has two event types, Microsoft.Storage.BlobCreated and Microsoft.Storage.BlobDeleted, so instead of long polling you can simply react to the created event.
See this link for more information:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview
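For example, an Azure Function reacting to these events might look roughly like this sketch (assuming the Event Grid trigger extension for Azure Functions; the function name is arbitrary):
[FunctionName("OnBlobCreated")]
public static void Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
{
    // Fires for whatever events the Event Grid subscription routes here
    if (eventGridEvent.EventType == "Microsoft.Storage.BlobCreated")
    {
        log.LogInformation($"Blob created: {eventGridEvent.Subject}");
    }
}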
I had a very similar requirement. I used the BOX application; it has a webhook feature for events occurring on files or folders, such as add, move, delete, etc.
There are also some newer alternatives with Azure Automation.
I'm pretty new to Azure too, and I'm actually investigating a file-watcher type thing myself. I'm considering something involving Azure Functions, which look like a way of triggering some code when a blob is created or updated. There's a way of specifying a name pattern too: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob
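A minimal sketch of such a blob-triggered function (the container name incoming and the function name are placeholders):
[FunctionName("BlobWatcher")]
public static void Run(
    [BlobTrigger("incoming/{name}")] Stream blobStream,
    string name,
    ILogger log)
{
    // Runs whenever a blob is added or updated in the "incoming" container
    log.LogInformation($"Blob changed: {name}, size: {blobStream.Length} bytes");
}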
I am trying to create an event log for a Windows Universal Application.
Earlier we had System.Diagnostics.EventLog to log events, but I could not find anything similar on the Windows 10 Universal Apps platform.
Is it possible to create logs for Windows 10 and can these logs be written to a file for accessing it later?
I searched a lot, but could not find anything.
FileLoggingSession
Since Windows 8.1 there are the FileLoggingSession and LoggingChannel classes in the Windows.Foundation.Diagnostics namespace, which can log to files when configured to do so. You can read more in the official documentation.
Initialization, usage, and retrieval of the log file can be done as in the following snippet; of course, you will need to create interfaces, singletons, etc. to make it usable:
// Initialization
FileLoggingSession fileLoggingSession = new FileLoggingSession("session");
var loggingChannel = new LoggingChannel("channel");
fileLoggingSession.AddLoggingChannel(loggingChannel);
// Log messages
loggingChannel.LogMessage("error message", LoggingLevel.Error);
// When file is needed
var file = await fileLoggingSession.CloseAndSaveToFileAsync();
// Do anything with file
LoggingSession
LoggingSession also writes logs to a file, but the main difference is that FileLoggingSession writes logs to the file immediately, while LoggingSession does not: you have to explicitly request that the logs be written to a file with the SaveToFileAsync method. From the documentation:
The FileLoggingSession class sends logged messages to disk files as they are logged. The FileLoggingSession class uses sequential logging, which means that all messages are sent to a disk file, and a sequential history of messages is retained. This is distinct from the LoggingSession class, which sends logged messages to disk on-demand, and this happens when there's a problem and the immediate history of in-memory messages is needed for analysis.
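A rough equivalent of the earlier snippet using LoggingSession instead (assuming an async context; the folder and file name are arbitrary):
// Initialization
var loggingSession = new LoggingSession("session");
var loggingChannel = new LoggingChannel("channel");
loggingSession.AddLoggingChannel(loggingChannel);

// Log messages (kept in memory rather than written immediately)
loggingChannel.LogMessage("error message", LoggingLevel.Error);

// On demand, e.g. when a problem occurs, dump the in-memory history to a file
var file = await loggingSession.SaveToFileAsync(ApplicationData.Current.LocalFolder, "log.etl");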
MetroLog
You have other alternatives if you don't want to use the FileLoggingSession or LoggingSession classes. One good solution is MetroLog, which has a FileStreamingTarget that makes it very simple to log in a Windows/Phone app.
You create the logger when you need it, for example in a page:
public sealed partial class LogSamplePage : Win8Sample.Common.LayoutAwarePage
{
    private ILogger Log = LogManagerFactory.DefaultLogManager.GetLogger<LogSamplePage>();
}
Then you can use it in the page like this:
// flat strings...
if (this.Log.IsInfoEnabled)
    this.Log.Info("I've been navigated to.");

// formatting...
if (this.Log.IsDebugEnabled)
    this.Log.Debug("I can also format {0}.", "strings");

// errors...
try
{
    this.DoMagic();
}
catch (Exception ex)
{
    if (this.Log.IsWarnEnabled)
        this.Log.Warn("You can also pass in exceptions.", ex);
}
MetroEventSource
The second solution is this logging sample in the MSDN sample gallery by Can Bilgin, which provides the MetroEventSource class. You can log a message, for example an error, like this:
MetroEventSource.Log.Error("Here is the error message");
If you use this logger, don't forget to initialize it when the application starts, as described in the sample project.
I have a recent problem. I can upload a file to my inetpub/wwwroot folder, but I can't write a log file to that same folder.
I have all the permissions for the Network Service account. Everything is on my server.
// Get a reference to the web root directory (Server.MapPath resolves "~/" in ASP.NET)
DirectoryInfo di = new DirectoryInfo(Server.MapPath("~/"));
// Get a reference to each file in that directory.
FileInfo[] fiArr = di.GetFiles();
string strLogText = di.FullName;

// Create a writer and open the file:
StreamWriter log;
if (!System.IO.File.Exists("C:\\inetpub\\wwwroot\\logfile.txt"))
{
    log = new StreamWriter("C:\\inetpub\\wwwroot\\logfile.txt");
}
else
{
    log = System.IO.File.AppendText("C:\\inetpub\\wwwroot\\logfile.txt");
}

// Write to the file:
log.WriteLine(DateTime.Now);
log.WriteLine(strLogText);
log.WriteLine();

// Close the stream:
log.Close();
The error is "access is denied"!
It works locally, but on my server it doesn't. On the Inetpub folder, do I just need to allow writing for Network Service? That is strange, because I can already upload files, so writing is evidently enabled.
Emged, in case of exceptions your code does not close the stream on the log file, and this is surely not good.
You should use a using statement around the stream so that it is closed and disposed in any case, even when an exception is thrown.
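For example, your snippet collapses to something like this (File.AppendText creates the file if it doesn't exist, so the existence check also goes away):
using (StreamWriter log = System.IO.File.AppendText("C:\\inetpub\\wwwroot\\logfile.txt"))
{
    log.WriteLine(DateTime.Now);
    log.WriteLine(strLogText);
    log.WriteLine();
} // the stream is flushed, closed and disposed here, even if an exception is thrown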
As Chris has suggested, I would absolutely opt for a logging framework, and I would also avoid writing to that wwwroot folder.
ELMAH, NLog, and log4net are good and easy alternatives, far better than the custom logging you are doing right now. The big advantage of these libraries is that you can change their behaviour at runtime simply by editing the configuration file; no need to rebuild or redeploy anything...
My favourite is actually log4net; check these links for a simple example of how to use it:
http://logging.apache.org/log4net/release/manual/configuration.html
Log4Net in App object?
Depending on the version of your server (Windows 2008 and above), that directory has additional protection against writes.
I'd highly recommend you look into ELMAH for your logging. It gives you a number of options, including in-memory or database-backed storage, and it collects a LOT of additional data you might want.
Further, opening up various physical directory locations for write access is a HUGE security no-no.
On the server, is the web app running under an application pool that has alternate credentials, rather than the normal Network Service account? If you haven't done so already, try turning on auditing to see which user is trying to access the file.