Okay, so I'm in a dilemma here. What I need is a solution to the following problem:
I basically want to automatically upload some files to my server as soon as they are created in a specific folder on the client's system. Files will be created at a specific path, say:
C:\User\Desktop\Scanned Files. As soon as the user saves a file to this path, some Windows-service-like process should pick the file up and upload it to my server. The process should be fast enough, I mean no longer than one minute.
I can go with either a Windows service or a scheduled task. As far as I know, Windows services are always running in the background, but they need a timer that fires periodically to do the allotted operation, whereas a scheduled task runs at a particular point in time.
First of all, I'm quite confused about which approach I should go with and which would fulfil my requirement more efficiently and feasibly. Or is there a better way to do this other than a Windows service or a scheduled task?
You can make use of the FileSystemWatcher class and utilize its events to upload any new file in that folder to the server, like below:

// Note the verbatim string (@) so the backslashes in the path aren't treated as escapes.
FileSystemWatcher folderWatcher = new FileSystemWatcher(@"C:\User\Desktop\Scanned Files");
folderWatcher.Created += new FileSystemEventHandler(folderWatcher_Created);
folderWatcher.EnableRaisingEvents = true;

void folderWatcher_Created(object sender, FileSystemEventArgs e)
{
    // Your code to upload the file (e.FullPath) to the server
}
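One caveat: the Created event can fire while the scanner software is still writing the file, so an immediate upload may fail with a sharing violation. A minimal wait-and-retry sketch (WaitForFileReady is a hypothetical helper name):

using System.IO;
using System.Threading;

// Returns true once the file can be opened exclusively, i.e. the writer has finished with it.
static bool WaitForFileReady(string path, int maxAttempts = 10)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return true; // exclusive open succeeded; safe to upload now
            }
        }
        catch (IOException)
        {
            Thread.Sleep(1000); // still locked by the writer; wait and retry
        }
    }
    return false; // gave up after maxAttempts; log and handle as appropriate
}

Call it at the top of folderWatcher_Created before starting the upload.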
You should without doubt use a scheduled task.
The right solution for scheduling a simple process like yours (file uploading) is the Windows Task Scheduler. It is simply better and more stable.
A service performs a specific system function to support other programs, particularly at a low level.
Services offer no scheduling, because they are designed to run all the time!
Windows scheduled tasks, on the other hand, have advanced scheduling features which can be accessed administratively. This is an advantage.
A file upload is a bad candidate for a service implementation, even if you write it with great care and stability.
You can, however, start with a simple timer and then migrate to a scheduled task.
I hope this serves you.
If you need more arguments, see Jon Galloway's thoughts here: http://weblogs.asp.net/jongalloway//428303
There is also a related question here: windows service vs scheduled task
I have a simple web application module which basically accepts requests to save a zip file, on PageLoad, from a mobile client app.
Now, what I want to do is unzip the file, read the file inside it, and process it further, including making entries in a database.
Update: The zip file and its contents will be fairly small, so the server shouldn't be burdened with much load.
Update 2: I just read about how IIS queues requests (at the global/app level). Does that mean I don't need to implement a complex request-handling mechanism, and IIS can take care of the app by itself?
Update 3: I am looking to offload the processing of the downloaded zip not only to minimize the overhead (in terms of performance) but also to avoid table locking while the file is processed and records are updated in the same table. With multiple devices requesting the page and the background task updating the database in parallel, an exception would result.
As of now I have zeroed in on two solutions:
To implement a concurrent/message queue
To implement the file-processing code in a separate tool and schedule a job on the server to check for unprocessed file(s) and process them serially
I'm inclined towards the queuing mechanism and will try to implement it, as it seems less dependent on configuration compared with manually setting up the job/schedule on the server side.
So, what do you guys recommend for this purpose?
Moreover, after the zip file is requested and saved on the server side, the client/server connection is released. I'm not looking to burden my IIS.
Imagine a couple of hundred clients simultaneously requesting the page...
I actually haven't used either of them before, so any samples or how-tos would be much appreciated.
I'd recommend the TPL and Rx Extensions: make your unzipped file list an observable collection, and for each item start a new task asynchronously.
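A minimal sketch of the TPL half of that idea (the folder path and ProcessFile are assumptions, and plain tasks are used here rather than Rx):

using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

// One task per extracted file; Task.WaitAll blocks until every file is processed.
string[] extractedFiles = Directory.GetFiles(@"C:\temp\extracted"); // hypothetical unzip output folder
Task[] tasks = extractedFiles
    .Select(f => Task.Factory.StartNew(() => ProcessFile(f))) // ProcessFile is your own processing logic
    .ToArray();
Task.WaitAll(tasks);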
I'd suggest a queue system.
When you receive a file, save its path into a thread-synchronized queue. Meanwhile a background worker (or preferably another machine) checks this queue for new files and dequeues entries to handle them.
This way you won't launch an unknown number of threads (one per zip file), and you can handle the zip files in one location. It also makes it easier to move your zip-handling code to another machine when the load gets too heavy: you just need access to a common queue.
The easiest approach would probably be a static Queue with a lock object. It is the easiest to implement and does not require external resources, but the queue will be lost when your application recycles.
You mentioned that losing zip files is not an option, so this approach is not the best if you don't want to rely on external resources. Depending on your load it may be worth utilizing external resources: upload the zip file to common storage on another machine and add a message to a queue on yet another machine.
Here's an example with a local queue:
using System;
using System.Collections.Concurrent;
using System.Threading;

ConcurrentQueue<string> queue = new ConcurrentQueue<string>();

void GotNewZip(string pathToZip)
{
    queue.Enqueue(pathToZip); // Add a new work item to the queue
}

void MethodCalledByWorker()
{
    while (true)
    {
        if (queue.IsEmpty)
        {
            // No work to be done; wait a few seconds and check again (next iteration)
            Thread.Sleep(TimeSpan.FromSeconds(5));
            continue;
        }

        string pathToZip;
        if (queue.TryDequeue(out pathToZip)) // If TryDequeue returns false, another thread has already dequeued the last element
        {
            HandleZipFile(pathToZip);
        }
    }
}
This is a very rough example. Whenever a zip arrives, you add its path to the queue. Meanwhile a background worker (or multiple workers; the example is thread-safe) handles one zip after another, taking paths from the queue. The zip files are handled in the order they arrive.
You need to make sure your application does not recycle in the meantime. But that's the case for any resource you keep on the local machine: it will be lost if the machine crashes.
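If it helps, wiring the worker up is a one-liner once the methods above exist; a sketch:

using System.Threading;

// Start the worker on a background thread so it doesn't keep the process alive on shutdown.
Thread worker = new Thread(MethodCalledByWorker) { IsBackground = true };
worker.Start();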
I believe you are optimising prematurely.
You mentioned table-locking: what kind of db are you using? If you add new rows or update existing ones, most modern databases in most configurations will:
use row-level locking; and
be fast enough without you needing to worry about locking.
I suggest starting with a simple method:
//Unzip
//Do work
//Save results to database
and get some proof it's too slow.
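A rough sketch of that simple method (HandleUpload, Parse and SaveResults are hypothetical names; ZipFile lives in System.IO.Compression and needs a reference to System.IO.Compression.FileSystem):

using System.IO;
using System.IO.Compression;

void HandleUpload(string zipPath, string extractDir)
{
    // Unzip
    ZipFile.ExtractToDirectory(zipPath, extractDir);

    // Do work
    foreach (string file in Directory.GetFiles(extractDir))
    {
        var records = Parse(file); // your parsing logic (hypothetical)

        // Save results to database
        SaveResults(records); // e.g. parameterised INSERTs (hypothetical)
    }
}

Time it under realistic load before adding any queuing machinery.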
I am looking to program a simple "data push" service that extracts data out of a SQL Server database and deposits a CSV file on a remote FTP site every ten minutes. This service will run on a remote server, managed over TeamViewer.
There are a few ways I've thought to do this, but I would like a bit of advice as to which is the best and most reliable method. A few pros and cons would also be very helpful from people who have experience with this type of work.
Possible solutions:
A Windows service using Thread.Sleep(...) to run the task every ten minutes
A simple EXE console project that runs as a Windows Scheduler task
A Windows service using a Timer class
Any other methods?
The program will be written in C#, but I am very flexible in terms of project type, design etc.
The main requirement of this service is to be reliable, and I'd also look to build in an alerts system to notify on failure.
Many thanks
I would favour a scheduled task for this kind of application, as it's far easier to make changes to the schedule at a later date.
There's a previous question along a similar line here: Windows Service or Task Scheduler for maintenance tasks?
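If you go the scheduled-task route, the moving parts stay small: a console project whose Main does one export run and exits, with Task Scheduler set to fire it every ten minutes. A rough sketch (ExportCsv and UploadToFtp are hypothetical placeholders):

using System;
using System.IO;

static int Main(string[] args)
{
    try
    {
        string csvPath = ExportCsv(); // query SQL Server and write the CSV (hypothetical)
        UploadToFtp(csvPath);         // push the file to the remote FTP site (hypothetical)
        return 0;
    }
    catch (Exception ex)
    {
        // A non-zero exit code shows up in Task Scheduler's history,
        // and the log file gives your alert mechanism something to watch.
        File.AppendAllText("errors.log", DateTime.Now + ": " + ex + Environment.NewLine);
        return 1;
    }
}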
I have seen a lot of posts about configuring a Windows service for daily/weekly etc. schedules, but if I want a schedule that is not uniform, how do I manage that from a Windows service perspective? I have an app that I want to run at particular times. Running it on a uniform schedule wouldn't do me any good and would just waste resources. Can I configure it using some XML file, or Windows service configuration?
You have three options.
Take a look at Quartz.net
Use Windows scheduler. Just have a different "schedule" for each date/time you need the app to run.
Write your own.
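For option 1, a rough sketch of what a non-uniform schedule looks like in Quartz.NET (this assumes Quartz.NET 3.x, where the API is async; the cron expression is just an example):

using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class MyJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // do the actual work here
        return Task.CompletedTask;
    }
}

public static async Task StartSchedulerAsync()
{
    IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
    await scheduler.Start();

    IJobDetail job = JobBuilder.Create<MyJob>().Build();
    ITrigger trigger = TriggerBuilder.Create()
        .WithCronSchedule("0 0 1,8,13 * * ?") // e.g. run at 01:00, 08:00 and 13:00 every day
        .Build();

    await scheduler.ScheduleJob(job, trigger);
}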
Here's one way to do it that is something of a hybrid approach:
Create a text file that has the dates and times you want the program to run. For example, it might contain:
2011-03-01
0100
0312
0815
0945
1340
2011-03-02
0220
...
Then write your program to do whatever task it needs to do; the last thing it does before exiting is read the file, find the next time it needs to run, and schedule itself (by issuing an AT command, calling schtasks.exe, or calling the equivalent Task Scheduler API functions).
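As a sketch, the self-rescheduling step might look like this (the schtasks arguments are from memory and the /sd date format is locale-dependent, so verify both on your system):

using System;
using System.Diagnostics;

// Registers a one-shot task that runs this program again at the next scheduled time.
// nextRun would come from parsing the schedule file shown above.
static void ScheduleNextRun(DateTime nextRun, string exePath)
{
    string args = string.Format(
        "/create /f /tn \"MyRecurringJob\" /tr \"{0}\" /sc once /st {1:HH:mm} /sd {1:MM/dd/yyyy}",
        exePath, nextRun);
    Process.Start("schtasks.exe", args);
}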
The Task Scheduler API is not for the timid. I would suggest looking into a wrapper. A search for "windows task scheduler C#" returns several.
I am setting out to create an app that will watch a directory for any files created. Pretty straightforward: time to use a FileSystemWatcher. My question relates to how to utilize it. Is it common practice to use a Windows service to ensure the application is always running?
I have been trying to get away from building Windows services if I don't have to, but I really don't see an alternative in this instance. Normally, I would convert my service to a console app and schedule it using the Windows scheduler, but that doesn't really apply in this situation.
Can anyone recommend a better way of implementing the FileSystemWatcher than a Windows service?
Thanks for any thoughts.
EDIT
In response to the comments below: more specifically, I just have to watch a directory on a server, and when a new file is created, move a copy of that file into a different directory on the same server, perhaps renaming it in the process.
The frequency and number of files will be quite small, perhaps 5-10 at most in a day.
I'm not sure exactly how the file watcher works, but this is my understanding: the file system fires events (NTFS must be doing that), and the file watcher hooks into those events. The file watcher probably suspends the thread it's running in until an event occurs, and the event somehow wakes the thread up. A suspended thread uses essentially no CPU cycles while it is suspended, so waiting for a file event costs nothing. A polled approach, by contrast, wastes mucho beaucoup CPU cycles; the file watcher does not. You could probably look at PerfMon to verify this.
You should describe more about what you want to do, but typically if you have something that needs to run in the background and does not require direct user interaction, then a service often makes sense.
You can use Remoting to connect a front-end to your service as needed if you'd like.
Yes, use a service for this kind of operation, but don't use FileSystemWatcher. If you poll for files in your service, don't use the Timer class either.
Do check that the file has finished being written and is no longer locked before trying to move it.
It's trivial to poll for file changes, and polling eliminates much of the extra programming associated with FileSystemWatcher events:
While True ' or your exit condition
    ' DirectoryInfo.GetFiles returns FileInfo objects (Directory.GetFiles would return strings)
    Dim _files() As FileInfo = New DirectoryInfo(yourTargetDirectory).GetFiles()
    For Each _file As FileInfo In _files
        ' Only pick up files that haven't been written to for at least a minute,
        ' so we don't move a file that is still being written
        If _file.LastWriteTime < DateTime.Now.AddMinutes(-1) Then
            ' move your files
        End If
    Next
    Thread.Sleep(TimeSpan.FromSeconds(30)) ' pause between polls
End While
Using a Windows service to wrap FileSystemWatcher is perfectly fine for what you need.
FSW is the right tool for the job (native code for filesystem watching is a bear to get right), and a service is the right mechanism to deploy it given you need 'always on' operation.
The service credentials will be independent of logged-in user too, which may be useful to you.
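For reference, a bare-bones sketch of such a service (the paths and the copy/rename logic are placeholders; note the file may still be locked when Created fires, so a retry loop around File.Copy is advisable):

using System.IO;
using System.ServiceProcess;

public class WatcherService : ServiceBase
{
    private FileSystemWatcher watcher;

    protected override void OnStart(string[] args)
    {
        watcher = new FileSystemWatcher(@"C:\WatchedFolder"); // assumed source directory
        watcher.Created += (s, e) =>
        {
            // Copy the new file into the target directory, renaming it on the way.
            string target = Path.Combine(@"C:\TargetFolder", "copy_" + e.Name);
            File.Copy(e.FullPath, target);
        };
        watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        watcher.Dispose();
    }
}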
I need to process large image files into smaller image files. I would like to distribute the work to many "slave" servers, rather than tasking my main server with this. I am using Windows Server 2005/2008, C#, and ASP.NET. I have a lot of web application development experience but have not developed distributed systems. I had a notion that this could be designed as follows:
1) Files would be placed in a shared network drive
2) Slave servers would periodically poll the drive for new content
3) Slave servers would rename newly found files to something like UNPROCESSED_appIDXXXX_jidXXXXX_photoidXXXXX.tif and begin processing that file.
4) Other slave servers would avoid trying to process files that are already in process by examining the file name, i.e. if something has been named "UNPROCESSED" they will not attempt to process it.
I am wondering a few things:
1) Will there be issues with two slave servers trying to "grab" and rename the file at once, or will Windows Server automatically lock the file?
2) What do you think the best mechanism for notification of new content for processing should be? One simple idea is to write a basic aspx page on each slave system and have it run on a timer. A better idea might be to write a Windows service that utilizes FileSystemWatcher and have it running on each slave system. A third idea is to have a central server somehow dispatch instructions to a given slave server to attempt a processing job, but I don't know of ways to invoke that kind of communication beyond the rather hackish approach of having the master server pass a message via HTTP.
I'd much appreciate any guidance you have to offer.
Cheers,
-KF
If you don't want to go all the way to a compute-cluster-type solution, you should consider having a job manager running somewhere that parcels out the work. That way, when a server becomes available to do work, it asks the job manager for a new piece of work. It can then tell the job manager when it's finished, and the job manager can inform your "client" when the work on the whole job is complete. This makes it easy to register work and know when it's complete, and the job manager can parcel out the work without worrying about race conditions on file renames. :)
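A hedged sketch of the heart of such a job manager (kept in-process for brevity; in practice it would sit behind a small web service or similar so the slaves can reach it, and JobManager is a hypothetical name):

using System.Collections.Concurrent;
using System.Threading;

// Central work registry: slaves claim a file to process, then report completion.
public class JobManager
{
    private readonly ConcurrentQueue<string> pending = new ConcurrentQueue<string>();
    private int outstanding;

    public void AddJob(string filePath)
    {
        pending.Enqueue(filePath);
        Interlocked.Increment(ref outstanding);
    }

    // A slave calls this to claim work; no rename-based locking is needed.
    public bool TryGetWork(out string filePath)
    {
        return pending.TryDequeue(out filePath);
    }

    // A slave calls this when its file is done; returns true once the whole batch is complete.
    public bool ReportDone()
    {
        return Interlocked.Decrement(ref outstanding) == 0;
    }
}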