I have an existing Windows service that uses a FileSystemWatcher object to monitor one folder. Now I would like to create another Windows service that uses a FileSystemWatcher to monitor the same folder. Can someone please clarify the questions below?
Is there a restriction on the number of FileSystemWatchers monitoring the same folder?
How do we handle file lock or access issues, such as when one process is writing a file to the directory while the other service is trying to read the same file?
Any other implications on this set-up?
Regards,
Ram
I am dealing with a similar situation. I have the same service installed and running on 2 machines. Each is monitoring the same collection of folders on a network.
In my experience:
Having 2 services each monitoring the same network share location(s) is not a problem - both will react.
My FSW triggers a file-copy from the monitored location to another network location. Since both instances of the service react, both will attempt the copy. One of my services throws an error (which I try..catch) and the other does the work. This is a satisfactory solution for me, although I appreciate it's not very "tidy".
The FSW is remarkably unreliable. I'm currently dealing with a situation where the network connection fails and I'm having to restart my file watchers. There are plenty of examples of ways to restart an FSW on SO. Personally, I'm setting off a timer in the OnError event and recreating my FSW(s) after 30 seconds. Yes, I'm recreating it - I read advice that a simple ".EnableRaisingEvents=true;" would not work well enough.
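For illustration, here is roughly what that recreate-on-error pattern looks like in C#. It's a minimal sketch - the share path and the 30-second delay are placeholders:

using System.IO;
using System.Timers;

class WatcherHost
{
    private FileSystemWatcher _fsw;
    private Timer _restartTimer;

    public void CreateWatcher()
    {
        _fsw = new FileSystemWatcher(@"\\server\share\watched");
        _fsw.Created += (s, e) => { /* react to the new file here */ };
        _fsw.Error += OnError;              // raised when the watcher breaks, e.g. network loss
        _fsw.EnableRaisingEvents = true;
    }

    private void OnError(object sender, ErrorEventArgs e)
    {
        _fsw.Dispose();                     // throw the dead watcher away completely
        _restartTimer = new Timer(30000) { AutoReset = false };
        _restartTimer.Elapsed += (s, args) => CreateWatcher();   // rebuild after 30 seconds
        _restartTimer.Start();
    }
}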
HTH
I was curious how the FileSystemWatcher worked and found the answer here very helpful. Since Windows raises a flag, I wonder if I can effectively use FileSystemWatcher on a mapped drive that is on a remote machine? If so, what kind of permissions do I need? I only have access to part of the hard drive (the manufacturer of the machine set it up this way so I can copy log files off the hard drive). I have no access to the OS on the remote machine.
FileSystemWatcher is not 100% reliable under any circumstances, although it is usually acceptable with local folders. Network shares, however, can disconnect, add latency, aren't completely monitored by your local Windows client, etc.
Polling is about the only reliable way to check the folder. "Wear and tear" is not a problem, since plenty of other processes, including Windows itself, do a much higher amount of I/O. Also, drives are cheap.
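If you do go the polling route, it needn't be much code. A bare-bones sketch (the share path and interval are placeholders) that diffs directory snapshots:

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

class Poller
{
    static void Main()
    {
        var seen = new HashSet<string>();
        while (true)
        {
            foreach (var path in Directory.GetFiles(@"\\server\share\watched"))
            {
                if (seen.Add(path))                 // Add returns true only for unseen paths
                    Console.WriteLine("New file: " + path);
            }
            Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
        }
    }
}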
Please don't get confused by the title of this question; I don't know the exact technical term for what I want to accomplish :). My requirement may be a little strange, and I have already implemented it, but I need some best practice/method to do it properly.
Here is my situation.
I am developing a client-system-monitoring Windows application (tracking software on the client side, monitoring software on my system). I have many systems connected to a LAN, plus one monitoring system. If certain actions happen on a client system, I get notified. I cannot use any databases on my network, so here is what I am doing: since my system is also connected to the LAN, I shared one folder on it. Whenever an action happens on a client system, the tracking software creates a file describing the event in the shared folder on my system. The monitoring software uses a timer that continuously checks for new files in the shared folder at a certain interval (15 minutes). If any file is found, the monitoring system knows some event has happened and shows the event.
But the problem is that I only get notified after 15 minutes. I also don't think this is the best way; there may be better methods. Is there any way to register an event directly with my monitoring application from the client machine?
Please NOTE: I cannot use any Database for this purpose.
Any suggestions will be appreciated.
Take a look at SignalR - it provides real-time notification and can be used exactly as you describe.
You would not require a database (but remember that if your server isn't running you will miss events - this may or may not be acceptable).
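To make that concrete, here is a rough sketch using the SignalR 2 hub API; the hub name, method names, and URL are all invented for illustration, and the server side assumes you have self-hosting (or IIS hosting) already set up:

using System;
using Microsoft.AspNet.SignalR;
using Microsoft.AspNet.SignalR.Client;

// Server side, hosted on the monitoring machine.
public class TrackingHub : Hub
{
    public void ReportEvent(string message)
    {
        Clients.All.showEvent(message);     // pushed to every connected monitor
    }
}

// Client side, called by the tracking software on each machine.
public static class Tracker
{
    public static void Report(string message)
    {
        var connection = new HubConnection("http://monitoring-pc:8080/signalr");
        var hub = connection.CreateHubProxy("TrackingHub");
        connection.Start().Wait();          // blocking connect, for brevity
        hub.Invoke("ReportEvent", message).Wait();
    }
}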
Take a look at FileSystemWatcher. This will monitor directories and raise events. IME it works well, but it can fail with large amounts of traffic.
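The basic usage is only a few lines. A sketch, with the shared-folder path made up:

using System;
using System.IO;

class Watcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"\\monitoring-pc\events");
        watcher.Created += (s, e) => Console.WriteLine("Event file arrived: " + e.FullPath);
        watcher.EnableRaisingEvents = true;   // nothing fires without this
        Console.ReadLine();                   // keep the process alive
    }
}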
This sounds like a perfect candidate for MSMQ (MS Message Queue) and Triggers.
Create an MSMQ queue that all your tracking software instances can write to. Then have an MSMQ trigger (perhaps connecting to a front-end through WCF/named pipes) display an alert in your monitoring software.
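A rough sketch of the queue plumbing, where the queue path is made up and the queue must be created once (e.g. via MessageQueue.Create); for cross-machine sends, the tracking side would address the queue with a remote format name instead:

using System;
using System.Messaging;

static class TrackingQueue
{
    const string QueuePath = @".\private$\trackingEvents";

    // Tracking software: drop an event message onto the queue.
    public static void Report(string message)
    {
        using (var queue = new MessageQueue(QueuePath))
            queue.Send(message, "TrackingEvent");
    }

    // Monitoring software: block until the next event arrives.
    public static string NextEvent()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            return (string)queue.Receive().Body;    // Receive() blocks
        }
    }
}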
You may want to use WCF Framework.
Here are two links that can help you:
wcf-tutorial-events-and-callbacks
wcf-tutorial-basic-interprocess-communication
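For a feel of what the callback approach looks like, here is a bare WCF duplex contract; all the names are invented, and the service implementation would grab the callback channel via OperationContext.Current.GetCallbackChannel:

using System.ServiceModel;

[ServiceContract(CallbackContract = typeof(IMonitorCallback))]
public interface ITrackingService
{
    [OperationContract]
    void Subscribe();                       // monitor registers for callbacks

    [OperationContract]
    void ReportEvent(string message);       // client reports an event
}

public interface IMonitorCallback
{
    [OperationContract(IsOneWay = true)]
    void OnEvent(string message);           // pushed back to the monitor
}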
I am setting out to create an app that will watch a directory for any files created. Pretty straightforward: time to use a FileSystemWatcher. My question relates to how to utilize it. Is it common practice to use a Windows service to ensure the application is always running?
I have been trying to get away from building Windows services if I don't have to, but I really don't see an alternative in this instance. Normally, I would convert my service to a console app and schedule it using Windows scheduler, but that doesn't really apply in this situation.
Can anyone recommend a better way of implementing the FileSystemWatcher than a Windows service?
Thanks for any thoughts.
EDIT
In response to the comments below: more specifically, I just have to watch a directory on a server, and when a new file is created, I have to move a copy of that file into a different directory on the same server, perhaps renaming it in the process.
The frequency and number of files will be quite small - perhaps 5-10 at most in a day.
I'm not sure exactly how the file watcher works, but this is what I'm thinking: the file system fires events - NTFS must be doing something like that - and your file watcher hooks into those events. The file watcher probably suspends the thread it's running in until an event occurs, and the event somehow wakes the thread up. A suspended thread uses essentially no CPU cycles while it waits, so waiting for a file event costs nothing. A polled approach, on the other hand, wastes mucho beaucoup (that's French, it means 'a shit load') of CPU cycles that the file watcher does not. You could probably look at PerfMon to see if this is likely true.
You should describe more about what you want to do, but typically if you have something that needs to run in the background and does not require direct user interaction, then a service often makes sense.
You can use Remoting to connect a front-end to your service as needed if you'd like.
Yes, use a service for this kind of operation, but don't use FileSystemWatcher. If you poll for files in your service, don't use the Timer class either.
Do check to make sure the file is completed writing and is no longer locked before trying to move it.
It's trivial to poll for file changes (syntax may be off), and it eliminates much of the extra programming associated with FileSystemWatcher events.
While True 'or your exit condition'
    Dim _files() As FileInfo = New DirectoryInfo(yourTargetDirectory).GetFiles()
    For Each _file As FileInfo In _files
        'only move files untouched for at least a minute, so the writer is done'
        If _file.LastWriteTime < DateTime.Now.AddMinutes(-1) Then
            'move your files'
        End If
    Next
    Threading.Thread.Sleep(5000) 'pause between polls'
End While
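If the one-minute age heuristic isn't strict enough, you can probe the lock directly. A C# sketch (the helper name is mine): try to open the file exclusively and treat failure as "still being written":

using System.IO;

static bool IsFileReady(string path)
{
    try
    {
        using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
            return true;    // exclusive access granted, the writer is done
    }
    catch (IOException)
    {
        return false;       // still locked by the writing process
    }
}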
Using a Windows service to wrap FileSystemWatcher is perfectly fine for what you need.
FSW is the right tool for the job (native code for filesystem watching is a bear to get right), and a service is the right mechanism to deploy it given you need 'always on' operation.
The service credentials will be independent of logged-in user too, which may be useful to you.
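A minimal shape for such a service, with both paths as placeholders (in production you'd also check that the file has finished writing before copying):

using System.IO;
using System.ServiceProcess;

public class FileMoverService : ServiceBase
{
    private FileSystemWatcher _watcher;

    public static void Main() => ServiceBase.Run(new FileMoverService());

    protected override void OnStart(string[] args)
    {
        _watcher = new FileSystemWatcher(@"D:\Incoming");
        _watcher.Created += (s, e) =>
            File.Copy(e.FullPath, Path.Combine(@"D:\Copies", Path.GetFileName(e.FullPath)));
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher.Dispose();
    }
}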
I would like to be able to do an "in-place" update of my program. Basically, I want to be able to log in remotely to where the software is deployed, install the update while other users are still using the program (in a thin-client way), and have it update their program.
Is this possible without too much of a hassle? I've looked into ClickOnce technology, but I don't think that's really what I'm looking for.
What about the way Firefox does its updates? It just waits for you to restart the program, and notifies you when it's been updated.
UPDATE: I'm not remoting into the users' PCs. This program runs on a server; I remote in and update it, and the users run it directly off the server through remote access.
ClickOnce won't work because it requires a webserver.
I had some example code that I can't find right now but you can do something similar to Firefox with the System.Deployment.Application namespace.
If you use the ApplicationDeployment class, you should be able to do what you want.
From MSDN, this class...
Supports updates of the current deployment programmatically, and handles on-demand downloading of files.
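A sketch of the check-and-prompt flow with that class (this only works when the app was actually deployed via ClickOnce):

using System.Deployment.Application;
using System.Windows.Forms;

static class Updater
{
    public static void CheckAndPrompt()
    {
        if (!ApplicationDeployment.IsNetworkDeployed)
            return;                             // not a ClickOnce install

        var deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())        // is a newer version published?
        {
            deployment.Update();                // download and stage the new version
            MessageBox.Show("An update was installed. Please restart the application.");
            // or call Application.Restart() to apply it immediately
        }
    }
}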
Consider the MS APIs for BITS - even just using bitsadmin.exe in a script - or Windows Update Services.
Some questions:
1. Are the users running the software locally, but with the files located on a networked share on your server?
2. Are they remoting into the same server you want to remote into, and executing it there?
3. If 2, are they executing the files where they are placed on the server, or are they copying them down to a "private folder"?
If you cannot change the location of the files, and everyone is remoting in, and everyone is executing the files in-place, then you have a problem. As long as even 1 user is running the program, the files will be locked. You can only update the files once everyone is out.
If, on the other hand, the users are able to run their own private copy of the files, then I would set up a system where you have a central folder with the latest version of the files, and when a user starts his program, it checks if the central folder has newer versions than the user is about to execute. If it does, copy the new version down first.
Or, if that will take too long, and the user will get impatient (what, huh, users getting impatient?), then having the program check the versions after startup, and remind the user to exit would work instead. In this case, the program would set a flag that upon next startup would do the copying, only now the user is aware of it happening.
The copying part could easily be handled either by having a separate executable that does the actual copying and executing that instead, or by having the program copy itself temporarily to another location and run that copy with parameters that say "update the original files".
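A sketch of that version check in the launcher, with made-up folder and file names:

using System.Diagnostics;
using System.IO;

static class Launcher
{
    static void Main()
    {
        const string central = @"\\server\app\latest";
        const string local = @"C:\AppCache";

        foreach (var source in Directory.GetFiles(central))
        {
            var target = Path.Combine(local, Path.GetFileName(source));
            // copy only if the central copy is newer than the user's copy
            if (!File.Exists(target) ||
                File.GetLastWriteTimeUtc(source) > File.GetLastWriteTimeUtc(target))
            {
                File.Copy(source, target, true);
            }
        }
        Process.Start(Path.Combine(local, "App.exe"));   // run the private copy
    }
}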
While you can design your code to modify itself (maybe not in C#?), this is generally a bad idea. It means that you must restart something to get the update. (On Linux you are able to replace files that are in use; however, the update does not take effect until the new data is loaded into memory, i.e. on application restart.)
The strategy used by Firefox (I never actually looked into it) is to store the updated executable in a different file, which is checked for when the program starts to load. This allows the program to overwrite itself with the update before the resource is locked by the OS. You can also design your program to be more modular, so that portions of it can be "restarted" without requiring a restart of the entire program.
How you actually do this is probably provided by the links given by others.
Edit: In light of a response given to Lasse V. Karlsen
You can have your main program look for the latest version of the program files to load (the loader itself wouldn't be able to get updates without everyone out). You can then remove older versions once people are no longer using them. Depending on how frequently people restart their programs, you may end up with a number of older program versions hanging around.
ClickOnce and Silverlight (out of browser) both support your scenario, if we're talking about upgrades. Remote login to your users' machines? Nope. And no, Firefox doesn't do that either, as far as I can tell.
Please double-check both methods and add them to your question, explaining why they might not do what you need. Otherwise it's hard to move on and suggest better alternatives.
Edit: This "I just updated, please restart" thing you seem to like is one method call for Silverlight applications running outside of the browser. At this point I'm fairly certain that this might be the way to go for you.
ClickOnce doesn't require a web server; it will let you publish updates while users are running the software. You can code your app to check for a new update every few minutes and prompt the user to restart the app if a new version is found, which will then take them through the upgrade process.
Another option is a Silverlight OOB application, but this would be more work if your app is already built as a WinForms/WPF client app.
Various deployment/update scenarios (for .NET applications) are discussed, with their pros and cons, in Microsoft's Smart Client Architecture and Design Guide. Though a little bit old, most of it still holds today, as it describes basic architectural principles rather than technical details. There is a PDF version, but you can find it online as well:
Deploying and Updating Smart Client Applications
Is this possible without too much of a hassle?
Considering the concurrency issues with thin clients and the complexity of Windows installations: yes, hot updates will be a hassle unless you do them the way the system demands.
I need to process large image files into smaller image files. I would like to distribute the work to many "slave" servers, rather than tasking my main server with this. I am using Windows Server 2005/2008, C#, and ASP.NET. I have a lot of web application development experience but have not developed distributed systems. I had a notion that this could be designed as follows:
1) Files would be placed in a shared network drive
2) Slave servers would periodically poll the drive for new content
3) Slave servers would rename newly found files to something like UNPROCESSED_appIDXXXX_jidXXXXX_photoidXXXXX.tif and begin processing that file.
4) Other slave servers would avoid trying to process files that are in process by examining file name, i.e. if something has been named "UNPROCESSED" they will not attempt to process.
I am wondering a few things:
1) Will there be issues with two slave servers trying to "grab" and rename the file at once, or will Windows Server automatically lock the file?
2) What do you think the best mechanism for notification of new content for processing should be? One simple idea is to write a basic ASPX page on each slave system and have it run on a timer. A better idea might be to write a Windows service that utilizes FileSystemWatcher and have it running on each slave system. A third idea is to have a central server somehow dispatch instructions to a given slave server to attempt a processing job, but I do not know of ways of invoking that kind of communication beyond the rather hack-ish approach of having the master server pass a message via HTTP.
I'd much appreciate any guidance you have to offer.
Cheers,
-KF
If you don't want to go all the way with a compute-cluster-type solution, you should consider having a job manager running somewhere that will parcel out the work. That way, when a server becomes available to do work, it asks the job manager for a new bit of work. It can then tell the job manager when it's finished, and the job manager can inform your "client" when the work on the whole job is complete. That way, it's easy to register work and know it's complete, and the job manager can parcel out the work without worrying about race conditions on file renames. :)
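And if you do stay with the shared folder: the rename itself can serve as the claim, since when two slaves race to move the same file, only one File.Move succeeds and the loser gets an exception. A sketch, where the INPROCESS_ prefix is just for illustration:

using System.IO;

static bool TryClaim(string path, out string claimedPath)
{
    claimedPath = Path.Combine(Path.GetDirectoryName(path),
                               "INPROCESS_" + Path.GetFileName(path));
    try
    {
        File.Move(path, claimedPath);   // only one server can win this rename
        return true;
    }
    catch (IOException)
    {
        return false;                   // another slave grabbed it first
    }
}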