I am setting out to create an app that will watch a directory for any files created. Pretty straightforward: time to use a FileSystemWatcher. My question relates to how best to host it. Is it common practice to use a Windows service to ensure the application is always running?
I have been trying to get away from building Windows services if I don't have to, but I really don't see an alternative in this instance. Normally I would convert my service to a console app and schedule it with the Windows Task Scheduler, but that doesn't really apply in this situation.
Can anyone recommend a better way of hosting the FileSystemWatcher than a Windows service?
Thanks for any thoughts.
EDIT
In response to the comments below: more specifically, I just have to watch a directory on a server, and when a new file is created there, move a copy of that file into a different directory on the same server, perhaps renaming it in the process.
The frequency and number of files will be quite small, perhaps 5-10 at most in a day.
I'm not sure yet exactly how the file watcher works, but this is my understanding: the file system (NTFS) raises change notifications, and the file watcher hooks into those. The watcher effectively suspends the thread it is running on until a notification arrives, and the notification wakes the thread back up. A suspended thread consumes essentially no CPU cycles, so waiting for a file event costs nothing. A polling approach, by contrast, wastes a great many CPU cycles; the file watcher does not. You could look at PerfMon to check whether this is true.
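For illustration, here is a minimal sketch of that blocking behaviour (the folder path is a placeholder): FileSystemWatcher.WaitForChanged parks the calling thread until the OS reports a matching change, with no polling involved.

using System;
using System.IO;

class WatchDemo
{
    static void Main()
    {
        using (var watcher = new FileSystemWatcher(@"C:\WatchedFolder"))   // placeholder path
        {
            // Blocks the calling thread until a file is created in the directory.
            WaitForChangedResult result = watcher.WaitForChanged(WatcherChangeTypes.Created);
            Console.WriteLine("Created: " + result.Name);
        }
    }
}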
You should describe more about what you want to do, but if you have something that needs to run in the background and does not require direct user interaction, a service typically makes sense.
You can use Remoting to connect a front-end to your service as needed if you'd like.
Yes, use a service for this kind of operation, but don't use FileSystemWatcher. If you poll for files in your service, don't use the Timer class either.
Do check to make sure the file has finished being written and is no longer locked before trying to move it (a sketch of such a check follows the loop below).
It's trivial to poll for file changes (syntax may be off), and it eliminates much of the extra programming associated with FileSystemWatcher events.
While True ' or your exit condition
    Dim _files() As FileInfo = New DirectoryInfo(yourTargetDirectory).GetFiles()
    For Each _file As FileInfo In _files
        ' Only touch files that have not been written to for at least a minute,
        ' so you don't grab a file that is still being written.
        If _file.LastWriteTime < DateTime.Now.AddMinutes(-1) Then
            ' move your files
        End If
    Next
    Threading.Thread.Sleep(10000) ' wait a bit before polling again
End While
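For the locking point above, a common approach (a C# sketch; the method name is mine) is to try opening the file exclusively and treat an IOException as "the writer isn't done yet":

// Returns true once the file can be opened exclusively,
// i.e. whatever was writing it has finished and released it.
static bool IsFileReady(string path)
{
    try
    {
        using (var stream = new System.IO.FileStream(path, System.IO.FileMode.Open,
                                                     System.IO.FileAccess.Read, System.IO.FileShare.None))
        {
            return true;
        }
    }
    catch (System.IO.IOException)
    {
        // Still locked by the process that is writing it.
        return false;
    }
}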
Using a Windows service to wrap FileSystemWatcher is perfectly fine for what you need.
FSW is the right tool for the job (native code for filesystem watching is a bear to get right), and a service is the right mechanism to deploy it given you need 'always on' operation.
The service credentials will be independent of logged-in user too, which may be useful to you.
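As a rough sketch of what that service wrapper might look like (the class name, folder paths, and copy-on-create behaviour are all placeholders for your own logic):

using System.IO;
using System.ServiceProcess;

public class FileMoverService : ServiceBase
{
    private FileSystemWatcher _watcher;

    static void Main()
    {
        ServiceBase.Run(new FileMoverService());
    }

    protected override void OnStart(string[] args)
    {
        _watcher = new FileSystemWatcher(@"D:\Incoming");   // directory to watch (placeholder)
        _watcher.Created += (s, e) =>
            File.Copy(e.FullPath, Path.Combine(@"D:\Processed", Path.GetFileName(e.FullPath)));
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher.Dispose();
    }
}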
Okay, so I'm in a dilemma here. What I need is a solution to the following problem:
I basically want to automatically upload some files to my server as soon as they are created in a specific folder on the client's system. Files will be created in a specific path, say:
C:\User\Desktop\Scanned Files. As soon as a user saves a file to this path, some Windows-service-like process should pick the file up and upload it to my server. The process should be fast, no longer than 1 minute.
I can go with either a Windows service or a scheduled task. As far as I know, a Windows service is always running in the background, but it needs a timer that fires periodically to do the allotted operation, whereas a scheduled task runs at a particular point in time. I'm quite confused about which approach I should go with and which would fulfill my requirement more efficiently and feasibly. Or is there a better way to do this than a Windows service or a scheduled task?
You can make use of the FileSystemWatcher class and use its events to upload any new file in that folder to the server, like below:
FileSystemWatcher folderWatcher = new FileSystemWatcher(@"C:\User\Desktop\Scanned Files");
folderWatcher.Created += new FileSystemEventHandler(folderWatcher_Created);
folderWatcher.EnableRaisingEvents = true;

void folderWatcher_Created(object sender, FileSystemEventArgs e)
{
    // Your code to upload the file to the server
}
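For example (the endpoint URL and the fixed delay are placeholders, not part of the original answer), the handler body could give the scanner a moment to release the file and then push it up with WebClient:

void folderWatcher_Created(object sender, FileSystemEventArgs e)
{
    // Give the scanning application a moment to finish writing the file.
    System.Threading.Thread.Sleep(2000);

    using (var client = new System.Net.WebClient())
    {
        // Placeholder URL -- replace with your real upload endpoint.
        client.UploadFile("https://example.com/upload", e.FullPath);
    }
}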
You should without doubt use a Scheduled Task.
The right solution for scheduling a simple process like yours (file uploading) is the Windows Task Scheduler. It is simply better and more stable.
A service performs a specific system function to support other programs, particularly at a low level.
Services offer no scheduling, because they are designed to run all the time!
On the other hand, Windows Scheduled Tasks have advanced scheduling features which can be accessed administratively. This is an advantage.
A file upload is a bad candidate for a service implementation, even if you write it with great care and stability.
You can, however, start with a simple timer, and then migrate to a Scheduled Task.
I hope this serves you.
If you need more arguments, see Jon Galloway's thoughts here: http://weblogs.asp.net/jongalloway//428303
There is also a related question: windows service vs scheduled task
I have an existing Windows service that uses a FileSystemWatcher object to monitor one folder. Now I would like to create another Windows service that uses a FileSystemWatcher to monitor the same folder. Can someone please clarify the questions below?
Is there a restriction on the number of FileSystemWatchers monitoring the same folder?
How do we handle file lock or access issues, such as when one service is writing a file to the directory while the other is trying to read the same file?
Are there any other implications of this set-up?
Regards,
Ram
I am dealing with a similar situation. I have the same service installed and running on 2 machines. Each is monitoring the same collection of folders on a network.
In my experience:
Having 2 services each monitoring the same network share location(s) is not a problem - both will react.
My FSW triggers a file copy from the monitored location to another network location. Since both instances of the service react, both will attempt the copy. One of my services throws an error (which I try..catch); the other does the work. This is a satisfactory solution for me, although I appreciate it's not very "tidy".
The FSW is remarkably unreliable. I'm currently dealing with a situation where the network connection fails and I have to restart my file watchers. There are plenty of examples of ways to restart an FSW on SO... Personally, I'm setting off a timer in the OnError event and recreating my FSW(s) after 30 seconds (sketched below). Yes, I'm recreating it - I read advice that a simple ".EnableRaisingEvents = true;" would not work well enough.
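A sketch of that restart pattern (CreateWatcher() is a stand-in for your own factory method that news up and configures the FSW):

private FileSystemWatcher _watcher;
private System.Timers.Timer _restartTimer;

void Watcher_Error(object sender, ErrorEventArgs e)
{
    // The watcher is broken (e.g. the network share dropped); throw it away,
    // wait 30 seconds, then build a fresh one.
    _watcher.Dispose();
    _restartTimer = new System.Timers.Timer(30000) { AutoReset = false };
    _restartTimer.Elapsed += (s, args) => _watcher = CreateWatcher();
    _restartTimer.Start();
}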
HTH
I've used FileSystemWatcher in the past. However, I am hoping someone can explain how it actually works behind the scenes.
I plan to utilize it in an application I am making and it would monitor about 5 drives and maybe 300,000 files.
Does the FileSystemWatcher actually do "checking" on the drive - as in, will it cause wear and tear on the drive? Also, does it impact the hard drive's ability to "sleep"?
This is the part I do not understand - whether it scans the drives on a timer, or waits for some kind of notification from the OS before it does anything.
I just do not want to implement something that is going to cause extra reads on a drive and keep the drive from sleeping.
Nothing like that. The file system driver simply compares the normal file operations requested by other programs running on the machine against the filters you've selected. If there's a match, it adds an entry to an internal buffer that records the operation and the filename; that completes the driver request and causes an event to run in your program. You get the details of the operation passed to you from that buffer.
So nothing extra happens beyond the operations themselves; there is no additional disk activity at all. It is all just software that runs. The overhead is minimal, and nothing slows down noticeably.
The short answer is no. The FileSystemWatcher calls the ReadDirectoryChangesW API, passing it an asynchronous flag. Basically, Windows stores data in an allocated buffer when changes to a directory occur. This function returns the data in that buffer, and the FileSystemWatcher converts it into nice notifications for you.
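That buffer is exposed as FileSystemWatcher.InternalBufferSize (8 KB by default); if changes arrive faster than you consume them it can overflow, in which case the Error event fires and notifications are lost. A small sketch (the path is a placeholder):

var watcher = new FileSystemWatcher(@"C:\SomeFolder");   // placeholder path
watcher.InternalBufferSize = 64 * 1024;                  // enlarge the buffer Windows fills (default is 8 KB)
watcher.Error += (s, e) =>
{
    // Raised, among other reasons, when the internal buffer overflows and changes are dropped.
    Console.WriteLine(e.GetException());
};
watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
watcher.EnableRaisingEvents = true;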
Hello guys, I have been given the task of creating a web app that allows an end user to upload a file to one of our servers. The server then manipulates the file and gives it back to the end user. The server has IIS 7 and .NET Framework 4.
Here is my problem.
I have everything figured out, except that I'm having a hard time getting rid of the file from the server once it has been manipulated. Is there a way I can set a timer, say for 30 minutes, to have the file removed from the server in my code? Or is there another solution?
I know the timer suggestion perhaps does not make sense; however, I can't think of another way to do it, so I'm looking for an opinion or another method.
A timer is a good method to schedule something in the future. You could even reset the timer if the user requests the file again. Just give the timer a delegate that deletes the file when the timer fires.
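A minimal sketch of that (the path and the 30-minute delay are placeholders); keep a reference to the timer, e.g. in a dictionary keyed by file path, so it isn't garbage collected and so you can reset it if the user requests the file again:

string path = @"C:\Temp\uploaded.dat";   // placeholder
var deleteTimer = new System.Threading.Timer(
    _ => { if (System.IO.File.Exists(path)) System.IO.File.Delete(path); },
    null,
    (int)TimeSpan.FromMinutes(30).TotalMilliseconds,   // fire once, 30 minutes from now
    System.Threading.Timeout.Infinite);                // do not repeat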
Are you able to set up a scheduled task on the server?
This sort of thing is perfect for a simple console app that deletes files whose modified date/time is older than, say, DateTime.Now.AddMinutes(-10).
The task can run every 10 minutes or so, if you like.
It's sometimes best to keep this sort of thing away from your website. Let your site serve your users, and create something else to serve your server. :)
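Such a console app could be as small as the sketch below (the folder path and the age threshold are placeholders):

using System;
using System.IO;

class CleanupOldFiles
{
    static void Main()
    {
        DateTime cutoff = DateTime.Now.AddMinutes(-10);              // anything older than 10 minutes
        foreach (string file in Directory.GetFiles(@"C:\Uploads"))   // placeholder folder
        {
            if (File.GetLastWriteTime(file) < cutoff)
                File.Delete(file);
        }
    }
}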
Update
If it's a heavy-traffic site, you could simply delete all old files the next time someone uploads a file.
So:
User selects a file to upload and clicks Upload -> you get the file -> you delete old files (regardless of who they belong to) -> you manipulate the file -> the file will be deleted by the next user's upload...
Why not delete it after manipulating it, or after whatever the last step in the process is? That would seem to be the best and easiest way.
Depending on volume, it's probably not a great idea to do a single task for each file - rather you should batch them into a queue and have a single thread process the queue.
For instance, you can spin up a background thread in global.asax (perhaps using a Timer) to handle cleanup tasks by comparing file times or something; see the sketch below.
Or, as step 1 of the process you could clean any old files. Not exactly the same thing, but may be close enough.
Or, you could abuse the Cache remove callback as a timer.
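A sketch of that global.asax/Timer idea (folder, interval, and age threshold are placeholders; note that IIS may still recycle the application pool, so the timer is not guaranteed to run forever):

// Global.asax.cs -- a static timer lives as long as the AppDomain does.
using System;
using System.IO;
using System.Web;

public class Global : HttpApplication
{
    private static System.Threading.Timer _cleanupTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        _cleanupTimer = new System.Threading.Timer(
            CleanUpOldFiles, null,
            TimeSpan.FromMinutes(10), TimeSpan.FromMinutes(10));   // run every 10 minutes
    }

    private static void CleanUpOldFiles(object state)
    {
        DateTime cutoff = DateTime.Now.AddMinutes(-30);              // age threshold (placeholder)
        foreach (string file in Directory.GetFiles(@"C:\Uploads"))   // placeholder folder
        {
            if (File.GetLastWriteTime(file) < cutoff)
                File.Delete(file);
        }
    }
}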
If you can ensure the app stays running all the time, you can skip scheduled tasks and use Quartz.NET. Even if the app shuts down, using Quartz would not be that bad -- unless there is more to this than described, having a few old files hanging around while the app is idle wouldn't hurt.
As for handling this, I would store, somewhere appropriate (e.g. your database), a list of the files with markers for whether the job is complete and whether the file has been deleted. Your Quartz task could then grab the files that are marked done but not marked deleted and clean those up. Bonus points for using transactions around the file deletes and updates, and for logging, so you know what happened while the world was sleeping.
I currently have a multithreaded application which runs in the following order:
Start up and change XML file
Do work
Change XML to default values
Step 3 is very important, and I have to ensure that it always happens. But if the application crashes, I might end up with the wrong XML.
The scenario where I am using it is:
My application is a small utility that connects to a remote device, but on the same machine there is a running service connected to the same remote device, which I want to connect to. The service exposes a restartService method, and during startup, depending on the XML data, it will or will not connect to the remote device. So in the end I have to ensure that, whatever happens to my application, the XML is set back to the default state.
I thought having a thread running as a separate process, checking every n seconds whether the main process is alive and responding, would solve this issue. But I have found very few examples of multiprocess applications in C#. So if someone could show an example of how to create a thread which runs as a separate process, that would be great.
What if I create a separate project - a console application? It would be compiled into a separate executable and launched from within the main application. I could then use an IpcChannel for communication between the two processes, or create a WCF application. Would one of these approaches work?
A thread belongs to a process, so if the process dies, then so do all its threads. Each application is expected to be a single process, and while you can launch additional processes, that sounds like a complex solution to what might be a simple problem.
Rather than changing and reverting the file could you just read it into memory and leave the filesystem alone?
You can subscribe to an event called DispatcherUnhandledException, so that whenever an unhandled exception is thrown you can safely revert your XML settings.
public partial class App : Application
{
    public App()
    {
        this.DispatcherUnhandledException += new System.Windows.Threading.DispatcherUnhandledExceptionEventHandler(App_DispatcherUnhandledException);
    }

    void App_DispatcherUnhandledException(object sender, System.Windows.Threading.DispatcherUnhandledExceptionEventArgs e)
    {
        // Whenever an unhandled exception is thrown,
        // you can change your XML files back to their default values here.
    }
}
// Raised when the process exits normally (note: it is not raised if the
// process is killed forcefully, e.g. via Task Manager's End Process)
AppDomain.CurrentDomain.ProcessExit += new EventHandler(CurrentDomain_ProcessExit);

void CurrentDomain_ProcessExit(object sender, EventArgs e)
{
    // Change your settings here.
}
// If a Windows shutdown is initiated
this.SessionEnding += new SessionEndingCancelEventHandler(App_SessionEnding);

void App_SessionEnding(object sender, SessionEndingCancelEventArgs e)
{
    // XML changes
}
What you are talking about is usually called "supervision" in mainframe computing and other large-ish computing infrastructures. A supervised process is a process that runs under the scrutiny of a supervisor process, which restarts it or otherwise "fixes" the problem if the former crashes or is unable to finish its job.
You can see a glimpse of this in the way that Windows restarts services automatically if they stop working; that is a very simplistic version of a supervision subsystem.
As far as I understand, this is a complex area of computer engineering, and I don't think that Windows or .NET provide a programmatic interface to it. Depending on your specific needs, you might be able to develop a simple approach to it.
Consider setting a "dirty" flag in your config file and storing a backup of the default XML in another file. When your application starts, it changes the XML and sets the flag. If it completes successfully, it resets the flag and restores the XML. Your service checks the flag to see whether it needs to use the last XML written by your app or switch to the backup file.
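A sketch of that flag-plus-backup idea (all file names are placeholders, and WriteModifiedXml stands in for your own code):

// On startup: back up the defaults, write the modified XML, and set the flag.
File.Copy("settings.xml", "settings.backup.xml", true);
File.WriteAllText("app.dirty", DateTime.UtcNow.ToString("o"));   // the "dirty" flag
WriteModifiedXml("settings.xml");                                // your own method

// On successful completion: restore the defaults and clear the flag.
File.Copy("settings.backup.xml", "settings.xml", true);
File.Delete("app.dirty");

// The service, before using the XML, checks File.Exists("app.dirty") and falls
// back to settings.backup.xml if the flag was left behind by a crash.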
I think that whether the application is multithreaded or multiprocess or whatever is not actually the problem you need to solve. The real problem is: how do I make this operation atomic?
When you say that you have to ensure that step 3 always happens, what you're really saying is that your program needs to perform an atomic transaction: either all of it happens, or none of it happens.
To accomplish this, your process should be designed the way that database transactions are designed. It should begin the transaction, do the work, and then either commit the transaction or roll it back. The process should be designed so that if, when it starts up, it detects that a transaction was begun and not committed or rolled back by an earlier run, it should start by rolling back that transaction.
Crucially, the commit method should have as little instability as possible. For instance, a typical way to design a transactional process is to use the file system: create a temporary file to indicate that the transaction has begun, write the output to temporary files, and then have the commit method rename the temporary files to their final names and delete the transaction-flag file. There's still a risk that the file system will go down in between the time you've renamed the files and the time you've deleted the flag file (and there are ways to mitigate that risk too), but it's a much smaller risk than the kind you've been describing.
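A file-system version of that commit step might look roughly like this (file names are illustrative, and GenerateOutput stands in for the real work):

// Begin: leave a marker saying a transaction is in flight.
File.WriteAllText("job.inprogress", DateTime.UtcNow.ToString("o"));

// Do the work against a temporary file.
File.WriteAllText("output.tmp", GenerateOutput());   // GenerateOutput(): your own work

// Commit: the rename is the near-atomic part; then remove the marker.
if (File.Exists("output.dat")) File.Delete("output.dat");
File.Move("output.tmp", "output.dat");
File.Delete("job.inprogress");

// Recovery on startup: if job.inprogress still exists, a previous run died
// mid-transaction -- roll back (delete output.tmp) before starting again.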
If you design the process so that it implements the transactional model, whether it uses multiprocessing or multithreading or single-threading is just an implementation detail.