I am developing an app to capture (security) event logs from multiple Windows systems. I have a handler for the EntryWritten event, and I am able to map most fields from Event Viewer to the EntryWrittenEventArgs entry in .NET. However, I cannot seem to find the mappings for the Level, OpCode and Task Category fields which show up in Event Viewer. Any ideas on how to get these in VB.NET or C#? Thanks
The EventLog class in the System.Diagnostics namespace does not contain fields for Level, OpCode or Task. There is, however, the EventRecord class in the System.Diagnostics.Eventing.Reader namespace which is capable of returning those fields. Note that this namespace is mainly used for retrieving event logs from a remote machine. Even though you could use it to get logs on the local machine as well, it opens a local pipe to the system, which makes it slower than the EventLog class. If you really need to access those fields though, this is how this class is generally used:
private void LoadEventLogs()
{
List<EventRecord> eventLogs = new List<EventRecord>();
EventLogSession session = new EventLogSession();
foreach (string logName in session.GetLogNames())
{
EventLogQuery query = new EventLogQuery(logName, PathType.LogName);
query.TolerateQueryErrors = true;
query.Session = session;
EventLogWatcher logWatcher = new EventLogWatcher(query);
logWatcher.EventRecordWritten +=
new EventHandler<EventRecordWrittenEventArgs>(LogWatcher_EventRecordWritten);
try
{
logWatcher.Enabled = true;
}
catch (EventLogException) { }
// This is how you'd read the logs
//using (EventLogReader reader = new EventLogReader(query))
//{
// for (EventRecord eventInstance = reader.ReadEvent(); eventInstance != null; eventInstance = reader.ReadEvent())
// {
// eventLogs.Add(eventInstance);
// }
//}
}
}
And the LogWatcher_EventRecordWritten event handler:
private void LogWatcher_EventRecordWritten(object sender, EventRecordWrittenEventArgs e)
{
var level = e.EventRecord.Level;
var task = e.EventRecord.TaskDisplayName;
var opCode = e.EventRecord.OpcodeDisplayName;
// Other properties
}
Note that I wrapped the logWatcher.Enabled = true; statement in a try-catch block, because not all sources allow entry-written listeners (security should work fine). The commented-out section shows you an example of reading all the logs, if you need it.
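If you only care about the Security log (as in your case), a reduced sketch along these lines should also work; it assumes the process has permission to read the Security log:
using System;
using System.Diagnostics.Eventing.Reader;

// Minimal sketch: read only the Security log and pull the display names that
// correspond to Event Viewer's Level, Task Category and OpCode columns.
static void DumpSecurityEvents()
{
    var query = new EventLogQuery("Security", PathType.LogName);
    using (var reader = new EventLogReader(query))
    {
        for (EventRecord record = reader.ReadEvent(); record != null; record = reader.ReadEvent())
        {
            using (record)
            {
                Console.WriteLine("{0} | {1} | {2} | {3}",
                    record.Id,
                    record.LevelDisplayName,    // "Level" column
                    record.TaskDisplayName,     // "Task Category" column
                    record.OpcodeDisplayName);  // "OpCode" column
            }
        }
    }
}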
I have been told to write a software to burn a CD synchronously/asynchronously as per user choice. I am using IMAPIv2 with C# for the project, and it does not provide the functionality explicitly to write the data asynchronously.
In order to design the functionality, I have researched online resources, but in vain.
Can someone explain what Synchronous/Asynchronous I/O is, in terms of burning an image on a disc?
Any help is appreciated.
IMAPI does not provide a built-in class or method to write data asynchronously, but it is designed in a way that makes this possible with any technology that supports asynchronous programming. The one you are using (C#, as you mentioned in the comments) does support it.
IMAPI exposes interfaces that report status for progress and actions. All you need to do is use threading to run the activity asynchronously; this frees up your UI so you can perform other activities. You can then subscribe to the events that report status back to you.
Refer to this project on CodeProject, which uses BackgroundWorker for the same purpose:
Multithreading
Burning or formatting media can take some time, so we do not want to perform these actions on the main UI thread. I use the BackgroundWorker class to handle the multithreading of these lengthy tasks. The BackgroundWorker class allows you to set values within the thread and then call the ReportProgress method which fires a ProgressChanged event in the calling thread. When you are finished with your worker thread, it fires the RunWorkerCompleted event to notify the calling thread that it is finished.
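For completeness, the worker itself has to be created, wired up and started before any of the handlers below run. A minimal sketch of that wiring (labelStatus and selectedRecorderId are placeholders I made up; backgroundBurnWorker and BurnData come from the sample):
// BackgroundWorker lives in System.ComponentModel.
backgroundBurnWorker = new BackgroundWorker
{
    WorkerReportsProgress = true,
    WorkerSupportsCancellation = true
};
backgroundBurnWorker.DoWork += backgroundBurnWorker_DoWork;
backgroundBurnWorker.ProgressChanged += (s, e) =>
{
    // Runs back on the UI thread, so it is safe to touch controls here.
    var burnData = (BurnData)e.UserState;
    labelStatus.Text = burnData.task.ToString();
};
backgroundBurnWorker.RunWorkerCompleted += (s, e) =>
{
    labelStatus.Text = "Burn finished with result " + e.Result;
};

// Start the burn asynchronously; the UI stays responsive while it runs.
backgroundBurnWorker.RunWorkerAsync(new BurnData { uniqueRecorderId = selectedRecorderId });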
Following are the DoWork and Update event handlers:
private void backgroundBurnWorker_DoWork(object sender, DoWorkEventArgs e)
{
MsftDiscRecorder2 discRecorder = null;
MsftDiscFormat2Data discFormatData = null;
try
{
//
// Create and initialize the IDiscRecorder2 object
//
discRecorder = new MsftDiscRecorder2();
var burnData = (BurnData)e.Argument;
discRecorder.InitializeDiscRecorder(burnData.uniqueRecorderId);
//
// Create and initialize the IDiscFormat2Data
//
discFormatData = new MsftDiscFormat2Data
{
Recorder = discRecorder,
ClientName = ClientName,
ForceMediaToBeClosed = _closeMedia
};
//
// Set the verification level
//
var burnVerification = (IBurnVerification)discFormatData;
burnVerification.BurnVerificationLevel = _verificationLevel;
//
// Check if media is blank, (for RW media)
//
object[] multisessionInterfaces = null;
if (!discFormatData.MediaHeuristicallyBlank)
{
multisessionInterfaces = discFormatData.MultisessionInterfaces;
}
//
// Create the file system
//
IStream fileSystem;
if (!CreateMediaFileSystem(discRecorder, multisessionInterfaces, out fileSystem))
{
e.Result = -1;
return;
}
//
// add the Update event handler
//
discFormatData.Update += discFormatData_Update;
//
// Write the data here
//
try
{
discFormatData.Write(fileSystem);
e.Result = 0;
}
catch (COMException ex)
{
e.Result = ex.ErrorCode;
MessageBox.Show(ex.Message, "IDiscFormat2Data.Write failed",
MessageBoxButtons.OK, MessageBoxIcon.Stop);
}
finally
{
if (fileSystem != null)
{
Marshal.FinalReleaseComObject(fileSystem);
}
}
//
// remove the Update event handler
//
discFormatData.Update -= discFormatData_Update;
if (_ejectMedia)
{
discRecorder.EjectMedia();
}
}
catch (COMException exception)
{
//
// If anything happens during the format, show the message
//
MessageBox.Show(exception.Message);
e.Result = exception.ErrorCode;
}
finally
{
if (discRecorder != null)
{
Marshal.ReleaseComObject(discRecorder);
}
if (discFormatData != null)
{
Marshal.ReleaseComObject(discFormatData);
}
}
}
void discFormatData_Update([In, MarshalAs(UnmanagedType.IDispatch)] object sender,
[In, MarshalAs(UnmanagedType.IDispatch)] object progress)
{
//
// Check if we've cancelled
//
if (backgroundBurnWorker.CancellationPending)
{
var format2Data = (IDiscFormat2Data)sender;
format2Data.CancelWrite();
return;
}
var eventArgs = (IDiscFormat2DataEventArgs)progress;
_burnData.task = BURN_MEDIA_TASK.BURN_MEDIA_TASK_WRITING;
// IDiscFormat2DataEventArgs Interface
_burnData.elapsedTime = eventArgs.ElapsedTime;
_burnData.remainingTime = eventArgs.RemainingTime;
_burnData.totalTime = eventArgs.TotalTime;
// IWriteEngine2EventArgs Interface
_burnData.currentAction = eventArgs.CurrentAction;
_burnData.startLba = eventArgs.StartLba;
_burnData.sectorCount = eventArgs.SectorCount;
_burnData.lastReadLba = eventArgs.LastReadLba;
_burnData.lastWrittenLba = eventArgs.LastWrittenLba;
_burnData.totalSystemBuffer = eventArgs.TotalSystemBuffer;
_burnData.usedSystemBuffer = eventArgs.UsedSystemBuffer;
_burnData.freeSystemBuffer = eventArgs.FreeSystemBuffer;
//
// Report back to the UI
//
backgroundBurnWorker.ReportProgress(0, _burnData);
}
I'm running Windows version 10.0.16299.0, and building on Visual Studio C# 2017. I can successfully connect to an unpaired BLE device from a Windows Forms app, and get ValueChanged events (1 per second), but not for long. I usually stop receiving those events in 40 seconds or less - usually less.
I realize this is likely a dispose/GC issue, but I don't see how. The device, service, characteristics, and descriptors are all stored as member variables in the main form and should not get collected:
public partial class Form1 : Form
{
private BluetoothLEDevice _device;
private List<GattDeviceService> _services;
private List<GattDescriptor> _descriptors = new List<GattDescriptor>();
private List<GattCharacteristic> _characteristics = new List<GattCharacteristic>();
private async void button1_Click(object sender, EventArgs e)
{
_device = await BluetoothLEDevice.FromIdAsync("BluetoothLE#BluetoothLE00:xx:xx:xx:xx:xx:xx:xx:xx:xx");
var services = await _device.GetGattServicesAsync();
foreach (var service in services.Services)
{
var chars = await service.GetCharacteristicsAsync();
foreach (var ch in chars.Characteristics)
{
var descriptors = await ch.GetDescriptorsAsync();
foreach (var desc in descriptors.Descriptors)
{
if (desc.AttributeHandle == 15 || desc.AttributeHandle == 26)
{
_services.Add(service);
_descriptors.Add(desc);
_characteristics.Add(ch);
var writer = new DataWriter();
writer.WriteBytes(new byte[] { 1, 0 });
var buf = writer.DetachBuffer();
await desc.WriteValueAsync(buf);
}
ch.ValueChanged += ChOnValueChanged;
}
}
}
}
In my sample, I click a button to establish a connection and subscribe to events. Before you say that writing to the descriptor is not how you would do it - I know. The device uses non-standard descriptor IDs which is why I must write to them directly.
Note that everything works, including the writes - I get no errors. It's just that the ValueChanged event is no longer fired after a short duration, and I can't figure out what else I must "cache" in order to prevent objects from being disposed, assuming that's what the problem is.
The problem is that, because of the nested foreach iterations, you attach the ValueChanged handler to multiple characteristics, which leads to unwanted behaviour.
The best way is to select the service that contains the wanted characteristic by UUID, then select the characteristic by UUID from that service.
If you insist on filtering by the wanted descriptor attribute handle, finish all of the foreach loops before attaching the ValueChanged handler. The characteristic to attach to is probably the first one in the _characteristics list.
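If you know the service and characteristic UUIDs, a sketch of that approach (the UUID parameters and field names are placeholders; it uses the same Windows.Devices.Bluetooth / Windows.Devices.Bluetooth.GenericAttributeProfile APIs as your code):
private BluetoothLEDevice _device;
private GattDeviceService _service;
private GattCharacteristic _characteristic;

private async Task SubscribeAsync(string deviceId, Guid serviceUuid, Guid characteristicUuid)
{
    _device = await BluetoothLEDevice.FromIdAsync(deviceId);

    // Select the one service and the one characteristic you care about by UUID.
    var serviceResult = await _device.GetGattServicesForUuidAsync(serviceUuid);
    _service = serviceResult.Services[0];               // check Status/Count in real code

    var charResult = await _service.GetCharacteristicsForUuidAsync(characteristicUuid);
    _characteristic = charResult.Characteristics[0];

    // Attach the handler exactly once, and keep the objects in fields so they stay alive.
    _characteristic.ValueChanged += ChOnValueChanged;

    // Enable notifications. If your device really requires its non-standard descriptor
    // to be written directly, keep your DataWriter/WriteValueAsync code here instead.
    await _characteristic.WriteClientCharacteristicConfigurationDescriptorAsync(
        GattClientCharacteristicConfigurationDescriptorValue.Notify);
}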
I'm new to C# and am writing a program that monitors a folder for .xml files using a FileSystemWatcher, set up in a method called folderWatch. The .xml files contain an email address and a path to an image which, once read, will be emailed. The code I have works fine if I add only a few XML files at a time; however, when I try to dump a large number into the folder, the FileSystemWatcher does not process all of them. Please help point me in the right direction.
private System.IO.FileSystemWatcher m_Watcher;
public string folderMonitorPath = Properties.Settings.Default.monitorFolder;
public void folderWatch()
{
if(folderMonitorPath != "")
{
m_Watcher = new System.IO.FileSystemWatcher();
m_Watcher.Filter = "*.xml*";
m_Watcher.Path = folderMonitorPath;
m_Watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName;
m_Watcher.Created += new FileSystemEventHandler(OnChanged);
m_Watcher.EnableRaisingEvents = true;
}
}
public void OnChanged(object sender, FileSystemEventArgs e)
{
displayText("File Added " + e.FullPath);
xmlRead(e.FullPath);
}
read xml
public void xmlRead(string path)
{
XDocument document = XDocument.Load(path);
var photo_information = from r in document.Descendants("photo_information")
select new
{
user_data = r.Element("user_data").Value,
photos = r.Element("photos").Element("photo").Value,
};
foreach (var r in photo_information)
{
if (r.user_data != "")
{
var attachmentFilename = folderMonitorPath + @"\" + r.photos;
displayText("new user data " + r.user_data);
displayText("attemting to send mail");
sendemail(r.user_data, attachmentFilename);
}
else
{
displayText("no user data moving to next file");
}
}
}
send mail
public void sendemail(string email, string attachmentFilename)
{
//myTimer.Stop();
MailMessage mail = new MailMessage();
SmtpClient SmtpServer = new SmtpClient(smtpClient);
mail.From = new MailAddress(mailFrom);
mail.To.Add(email);
mail.Subject = "test";
mail.Body = "text";
SmtpServer.Port = smtpPort;
SmtpServer.Credentials = new System.Net.NetworkCredential("username", "password");
SmtpServer.EnableSsl = true;
// SmtpServer.UseDefaultCredentials = true;
if (attachmentFilename != null)
{
Attachment attachment = new Attachment(attachmentFilename, MediaTypeNames.Application.Octet);
ContentDisposition disposition = attachment.ContentDisposition;
disposition.CreationDate = File.GetCreationTime(attachmentFilename);
disposition.ModificationDate = File.GetLastWriteTime(attachmentFilename);
disposition.ReadDate = File.GetLastAccessTime(attachmentFilename);
disposition.FileName = Path.GetFileName(attachmentFilename);
disposition.Size = new FileInfo(attachmentFilename).Length;
disposition.DispositionType = DispositionTypeNames.Attachment;
mail.Attachments.Add(attachment);
}
try
{
SmtpServer.Send(mail);
displayText("mail sent");
}
catch (Exception ex)
{
displayText(ex.Message);
}
}
First, FileSystemWatcher has a limited internal buffer for storing pending notifications. Per the documentation:
The system notifies the component of file changes, and it stores those
changes in a buffer the component creates and passes to the APIs. Each
event can use up to 16 bytes of memory, not including the file name.
If there are many changes in a short time, the buffer can overflow.
This causes the component to lose track of changes in the directory.
You can increase that buffer by setting InternalBufferSize to 64 * 1024 (64KB, max allowed value).
Next (and maybe even more important) is how this buffer is cleared. Your OnChanged handler is called, and only when it has finished is the notification removed from that buffer. That means that if you do a lot of work in the handler, the buffer has a much higher chance of overflowing. To avoid this, do as little work as possible in the OnChanged handler and do all of the heavy work on a separate thread, for example (not production-ready code, just for illustration purposes):
var queue = new BlockingCollection<string>(new ConcurrentQueue<string>());
new Thread(() => {
foreach (var item in queue.GetConsumingEnumerable()) {
// do heavy stuff with item
}
}) {
IsBackground = true
}.Start();
var w = new FileSystemWatcher();
// other stuff
w.Changed += (sender, args) =>
{
// takes no time, so overflow chance is drastically reduced
queue.Add(args.FullPath);
};
You are also not subscribed to the Error event of FileSystemWatcher, so you have no idea when (and if) something goes wrong.
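Both points are only a few lines to address, for example (a sketch, meant to be combined with the queue above):
// Enlarge the notification buffer and get told when events are still dropped.
m_Watcher.InternalBufferSize = 64 * 1024;   // 64 KB, the maximum allowed
m_Watcher.Error += (sender, args) =>
{
    // An InternalBufferOverflowException here means notifications were lost,
    // so the folder should be re-scanned for files that were missed.
    Console.WriteLine("FileSystemWatcher error: " + args.GetException().Message);
};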
FSW's documentation warns that if event processing takes too long, some events may be lost. That's why it's always used with a queue and/or background processing.
One option is to use Task.Run to perform processing in the background:
public void OnChanged(object sender, FileSystemEventArgs e)
{
_logger.Info("File Added " + e.FullPath);
Task.Run(()=>xmlRead(e.FullPath));
}
Notice that I use logging instead of whatever displayText does. You can't access UI controls from another thread; if you want to log progress, use a logging library.
You can also use the IProgress<T> interface to report the progress of a long-running job, or anything else that you want to publish through it. The Progress<T> implementation takes care of marshalling the progress object back to its parent thread, typically the UI thread.
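A minimal sketch of that pattern, assuming a Windows Forms app with a textBox1 control (both names are placeholders):
// Progress<T> captures the synchronization context it is created on (the UI
// thread here), so the callback can touch controls safely.
private IProgress<string> _progress;

public Form1()
{
    InitializeComponent();
    _progress = new Progress<string>(msg => textBox1.AppendText(msg + Environment.NewLine));
}

public void OnChanged(object sender, FileSystemEventArgs e)
{
    Task.Run(() =>
    {
        xmlRead(e.FullPath);                          // heavy work off the UI thread
        _progress.Report("Processed " + e.FullPath);  // marshalled back to the UI thread
    });
}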
An even better solution is to use ActionBlock<T> from the TPL Dataflow library. An ActionBlock has an input buffer that can queue incoming messages and a degree-of-parallelism setting that lets you specify how many operations can run concurrently. The default is 1:
ActionBlock<string> _mailerBlock;
public void Init()
{
var options=new ExecutionDataflowBlockOptions {
MaxDegreeOfParallelism = 5
};
_mailerBlock = new ActionBlock<string>(path => xmlRead(path), options);
}
public void OnChanged(object sender, FileSystemEventArgs e)
{
_logger.Info("File Added " + e.FullPath);
_mailerBlock.Post(e.FullPath);
}
Better yet, you can create different blocks for reading and emailing, and connect them in a pipeline. In this case the file reader generates multiple emails per file, which means a TransformManyBlock is needed:
class EmailInfo
{
public string Data { get; set; }
public string Attachment { get; set; }
}
var readerBlock = new TransformManyBlock<string,EmailInfo>(path=>infosFromXml(path));
var mailBlock = new ActionBlock<EmailInfo>(info=>sendMailFromInfo(info));
readerBlock.LinkTo(mailBlock,new DataflowLinkOptions{PropagateCompletion=true});
The xmlRead method should be changed into an iterator
public IEnumerable<EmailInfo> infosFromXml(string path)
{
// Same as before ...
foreach (var r in photo_information)
{
if (r.user_data != "")
{
...
yield return new EmailInfo{
Data=r.user_data,
Attachment=attachmentFilename};
}
...
}
}
And sendmail changes to:
public void sendMailFromInfo(EmailInfo info)
{
string email=info.Data;
string attachmentFilename=info.Attachment;
}
When you want to terminate the pipeline, you call Complete() on the head block and await the tail's completion. This ensures that all remaining files will be processed:
readerBlock.Complete();
await mailBlock.Completion;
I learned the hard way that if you need a reliable file monitor, you should use USN journals.
https://msdn.microsoft.com/en-us/library/windows/desktop/aa363798(v=vs.85).aspx
Here is a way you could access it from .NET if you have sufficient privileges: https://stackoverflow.com/a/31931109/612717
You could also implement it yourself with timer polling, using the file Length + LastModifiedDate.
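A rough sketch of that polling idea (not production code; the interval is arbitrary and the xmlRead call is just a placeholder to match the question):
// Remember each file's length and last-write time; anything new or changed
// since the last pass is treated as work to process.
// Note: the Elapsed handler fires on a thread-pool thread, not the UI thread.
var knownFiles = new Dictionary<string, string>();   // path -> "length|lastWriteTicks"
var timer = new System.Timers.Timer(5000);           // poll every 5 seconds
timer.Elapsed += (s, e) =>
{
    foreach (var path in Directory.EnumerateFiles(folderMonitorPath, "*.xml"))
    {
        var info = new FileInfo(path);
        var stamp = info.Length + "|" + info.LastWriteTimeUtc.Ticks;
        string previous;
        if (!knownFiles.TryGetValue(path, out previous) || previous != stamp)
        {
            knownFiles[path] = stamp;
            xmlRead(path);   // or enqueue for background processing instead
        }
    }
};
timer.Start();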
EDIT: The issue here wasn't the fact that the machine was locked via Group Policy; it was that the application was being run as a service under a service account that didn't have access to the interactive desktop.
I have a C# application that needs to check for when a user's session is locked, I'm using Microsoft.Win32.SystemEvents.SessionSwitch and this works fine when a user manually locks the machine.
The problem is that when the machine is locked via a group policy (User Configuration > Policies > Administrative Templates > Personalization > Screen saver timeout) the application doesn't pick up the switch.
Is there another way to check for a machine being locked? Or is there another way to lock machines via group policy that will be picked up by the application?
N.B. The application is running on Windows 7 as a service with full admin rights.
Here's my code, Thanks in advance!!! :)
public void OnStart(string[] args)
{
Microsoft.Win32.SystemEvents.SessionSwitch += new Microsoft.Win32.SessionSwitchEventHandler(SystemEvents_SessionSwitch);
}
void SystemEvents_SessionSwitch(object sender, Microsoft.Win32.SessionSwitchEventArgs e)
{
if (e.Reason == SessionSwitchReason.SessionLock)
{
//DO STUFF
}
}
I managed to resolve this by enabling 'Other Logon/Logoff Events' in Windows Event Viewer and searching for the lock and unlock events.
//Define strings for searching the eventlog.
string lockEvent = "4800";
string unlockEvent = "4801";
//Define the Eventlog source you want (in this case it's Security)
string LogSource = @"Security";
//Add these together to make the full query for Lock and Unlock
string LockQuery = " *[System/EventID=" + lockEvent + "]";
string UnlockQuery = "*[System/EventID=" + unlockEvent + "]";
//Return true if there is any locked events found.
private bool CheckForLock()
{
//Create Eventlog Reader and Query
var elQuery = new EventLogQuery(LogSource, PathType.LogName, LockQuery);
var elReader = new System.Diagnostics.Eventing.Reader.EventLogReader(elQuery);
//Create a list of Eventlog records and add the found records to this
List<EventRecord> eventList = new List<EventRecord>();
for (EventRecord eventInstance = elReader.ReadEvent();
null != eventInstance; eventInstance = elReader.ReadEvent())
{
eventList.Add(eventInstance);
}
if (eventList.Count > 0)
{
return true;
}
else
{
return false;
}
}
N.B. This will check the whole event log, so you need to put a qualifier on how far into the past you want to be looking.
If you check for lock/unlock sessions every ten seconds, you only want to deal with an EventRecord if it's from the same time period.
You can check each record's TimeCreated value and do something like...
if (eventInstance.TimeCreated > DateTime.Now - TimeSpan.FromSeconds(10))
{
eventList.Add(eventInstance);
}
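Alternatively, you can push the time window into the XPath query itself, so that only recent lock events are returned at all (a sketch; timediff works in milliseconds):
private bool CheckForRecentLock()
{
    // timediff(@SystemTime) is the age of the event in milliseconds, so this
    // query only matches 4800 (lock) events from the last 10 seconds.
    string recentLockQuery =
        "*[System[(EventID=4800) and TimeCreated[timediff(@SystemTime) <= 10000]]]";

    var elQuery = new EventLogQuery("Security", PathType.LogName, recentLockQuery);
    using (var elReader = new EventLogReader(elQuery))
    {
        return elReader.ReadEvent() != null;   // any hit means a recent lock event
    }
}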
Is it elegant? No. Does it work? Yes.
I am writing a simple service and logging exceptions and other notable items to the EventLog. Below is the code for the service. Somehow, although I can see the "FDaemon" log, I don't see any events in it. My started and stopped events are nowhere in the log; the log lists 0 events.
using System;
using System.ComponentModel;
using System.Diagnostics;
using System.ServiceProcess;
using System.Threading;
namespace FDaemon
{
public class EmailDigester : ServiceBase, IDebuggableService
{
private Timer digestTimer;
private EmailDigesterWorker worker;
private EventLog eventLog;
public EmailDigester()
{
// fire up the event log
this.eventLog = new System.Diagnostics.EventLog();
((ISupportInitialize)(this.eventLog)).BeginInit();
this.eventLog.Source = "EmailDigester";
this.eventLog.Log = "FDaemon";
if (!EventLog.SourceExists(this.eventLog.Source))
{
EventLog.CreateEventSource(this.eventLog.Source, this.eventLog.Log);
}
this.AutoLog = false;
this.ServiceName = this.eventLog.Source;
((ISupportInitialize)(this.eventLog)).EndInit();
}
public void DebugStart(string[] args)
{
this.OnStart(args);
}
protected override void OnStart(string[] args)
{
this.worker = new EmailDigesterWorker(1, eventLog);
// no need to multithread, so use a simple Timer
// note: do not take more time in the callback delegate than the repetition interval
if (worker.RunTime.HasValue)
{
worker.ServiceStarted = true;
TimerCallback work = new TimerCallback(this.worker.ExecuteTask);
TimeSpan daily = new TimeSpan(24, 0, 0); // repeat every 24 hrs
TimeSpan startIn; // how much time till we start timer
if (worker.RunTime <= DateTime.Now)
startIn = (worker.RunTime.Value.AddDays(1.00) - DateTime.Now); // runTime is earlier than now. we missed, so add a day to runTime
else
startIn = (worker.RunTime.Value - DateTime.Now);
this.digestTimer = new Timer(work, null, startIn, daily);
}
eventLog.WriteEntry("EmailDigester started.", EventLogEntryType.Information);
}
public void DebugStop()
{
this.OnStop();
}
protected override void OnStop()
{
worker.ServiceStarted = false;
if (this.digestTimer != null)
{
this.digestTimer.Dispose();
}
eventLog.WriteEntry("EmailDigester stopped.", EventLogEntryType.Information);
}
}
}
First: I am assuming that you've stepped through and the WriteEntry() function does, in fact, execute.
If your source "EmailDigester" is registered with any other event log (e.g. Application, Security, etc.), then the messages will appear in that log no matter what you do. In fact, I believe only the first eight characters of the log name are significant.
You can check this by going to the registry at HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Eventlog\ and checking each log's sources.
You might also consider changing your source to a random value (that you know won't be registered) and seeing if your log entries show up.
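A quick way to check (and, if necessary, re-register) the source from code, assuming you run it elevated:
// See which log the "EmailDigester" source is actually registered to, and
// re-register it against the intended "FDaemon" log if it points elsewhere.
// Note: deleting/creating event sources needs admin rights, and a remapped
// source may not take effect until the machine is restarted.
string source = "EmailDigester";
string registeredLog = EventLog.LogNameFromSourceName(source, ".");
Console.WriteLine("Source '{0}' is registered to log '{1}'", source, registeredLog);

if (registeredLog != "FDaemon")
{
    if (EventLog.SourceExists(source))
        EventLog.DeleteEventSource(source);
    EventLog.CreateEventSource(source, "FDaemon");
}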