In a Windows Forms window, multiple events can trigger an asynchronous method. This method downloads a file and caches it. My problem is that I want that method to be executed only once; in other words, I want to prevent the file from being downloaded multiple times.
If the method downloading the file is triggered twice, I want the second call to wait for the file (or wait for the first call to finish).
Does someone have an idea on how to achieve that?
UPDATE: I am simply trying to prevent unnecessary downloads. In my case, when a user keeps the mouse over an item in a ListBox for more than a couple of milliseconds, we start to download, on the assumption that the user will click and request the file. What can potentially happen is that the user keeps the mouse over the item for one second and then clicks; in that case two downloads start. I am looking for the best way to handle this scenario.
UPDATE 2: There is a possibility that the user will move the mouse over multiple items, in which case multiple downloads will occur. I hadn't really thought of this scenario, but right now if it happens we don't abandon the download. The file is downloaded anyway (files are usually around 50-100 KB) and then cached.
Maintain the state of what's happening in a form variable and have your async method check that state before it does anything. Make sure you synchronize access to it, though! Mutexes and semaphores are good for this kind of thing.
If you can download different files simultaneously, you'll need to keep track of what's being downloaded in a list for reference.
If only one file can be downloaded at a time, and you don't want to queue things up, you could just unhook the event while something is being downloaded, too, and rehook it when the download is complete.
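For the unhook/rehook approach, a minimal sketch might look like this (listBox1, url and localPath are placeholders for whatever your form actually uses; WebClient is just one way to run the download asynchronously):
using System;
using System.ComponentModel;
using System.Net;

// inside the form class
private readonly WebClient client = new WebClient();

private void ListBox_MouseHover(object sender, EventArgs e)
{
    // Unhook so a second hover cannot start another download.
    listBox1.MouseHover -= ListBox_MouseHover;
    client.DownloadFileCompleted += Client_DownloadFileCompleted;
    client.DownloadFileAsync(new Uri(url), localPath); // url and localPath are placeholders
}

private void Client_DownloadFileCompleted(object sender, AsyncCompletedEventArgs e)
{
    // Rehook once the download has finished (or failed).
    client.DownloadFileCompleted -= Client_DownloadFileCompleted;
    listBox1.MouseHover += ListBox_MouseHover;
}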
Here is a dummy implementation that supports multiple file downloads:
Dictionary<string, object> downloadLocks = new Dictionary<string, object>();

void DownloadFile(string localFile, string url)
{
    object fileLock;
    lock (downloadLocks)
    {
        if (!downloadLocks.TryGetValue(url, out fileLock))
        {
            fileLock = new object();
            downloadLocks[url] = fileLock;
        }
    }

    lock (fileLock)
    {
        // check if file is already downloaded
        // if not then download file
    }
}
You can simply wrap your method call within a lock statement like this
private static readonly Object padLock = new Object();
...
lock (padLock)
{
    YourMethod();
}
I'm not sure how it would be done in C#, but in Java you would synchronize on a private static final object in the class before downloading the file. This would block any further requests until the current one completed. You could then check whether the file had already been downloaded and act appropriately.
private static final Object lock = new Object();
private File theFile;

public void method() {
    synchronized (lock) {
        if (theFile == null) {
            // download the file and assign it to theFile
        }
    }
}
In general I agree with Michael: use a lock around the code that actually gets the file. However, if there's a single event that always occurs first and you can always load the file then, consider using Futures. In the initial event, start the future running
Future<String> file = InThe.Future<String>(delegate { return LoadFile(); });
and in every other event, wait on the future's value
DoSomethingWith(file.Value);
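In C#, the same idea can be sketched with the Task Parallel Library instead of the InThe.Future helper used above; LoadFile and DoSomethingWith are the same placeholders:
// In the initial event, kick off the load exactly once.
Task<string> fileTask = Task.Factory.StartNew(() => LoadFile());

// In every other event, Result blocks until the load has completed
// and then returns the value.
DoSomethingWith(fileTask.Result);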
If you want one thread to wait for another thread to finish a task, you probably want to use a ManualResetEvent. Maybe something like this:
private ManualResetEvent downloadCompleted = new ManualResetEvent(false);
private bool downloadStarted = false;

public void Download()
{
    bool doTheDownload = false;
    lock (downloadCompleted)
    {
        if (!downloadStarted)
        {
            downloadCompleted.Reset();
            downloadStarted = true;
            doTheDownload = true;
        }
    }

    if (doTheDownload)
    {
        // Code to do the download

        lock (downloadCompleted)
        {
            downloadStarted = false;
        }
        // When finished, notify anyone waiting.
        downloadCompleted.Set();
    }
    else
    {
        // Wait until it is done...
        downloadCompleted.WaitOne();
    }
}
I'm working on a Windows Forms application and fighting a very harsh error. The application is supposed to run on a local machine and handle requests from a server application. The client application looks like this:
public Reader mr_obj;

public Form1()
{
    mr_obj = new MyReader.Reader(7137);
    mr_obj.UserEvent += new ReaderEvent(UserEvent);
}

private void UserEvent(UserEvent e, long threadID)
{
    Thread.Sleep(1000);
    SafeSomethingToDB();
}
The Reader() object connects the client application to the server application, so the server application is able to trigger the UserEvent() method in the client application. The problem is that the client application, which handles the UserEvents, crashes if the UserEvent() method gets triggered twice within one second.
(It's actually not crashing, just hanging until you kill the task; a try/catch won't return an error.)
What I've tried so far is to delegate the Thread.Sleep() and SafeSomethingToDB() calls to another thread. This doesn't work because the server application does not wait until the thread is finished, so it does not find the data in the DB, because it isn't waiting one second...
The same problem happens when I do it with background workers.
Is there a possibility to handle these two triggers, which come from the same server application, in parallel at the same time?
Any suggestions very appreciated.
EDIT: I think locking the method does not cause the application to process both triggers at the same time. To make this visible I've tried this:
private void UserEventHandler(UserEvent e, long threadID)
{
    lock (_lockObject)
    {
        MessageBox.Show("Messagebox 1");
        MessageBox.Show("Messagebox 2");
    }
}
When the first request triggers UserEvent(), "Messagebox 1" appears. If you press OK, "Messagebox 2" appears. But if UserEvent() gets triggered a second time while "Messagebox 2" is still open, "Messagebox 1" does not appear; instead the application starts hanging. Shouldn't "Messagebox 1" appear again for the second trigger of UserEvent() if the two triggers were really being processed at the same time? So the two triggers are not being performed in parallel, or am I mistaken here?
Without knowing why you do the Sleep or what exactly SafeSomethingToDB does and what causes your problems, try to synchronize the calls:
private readonly object _lockObject = new object();

private void UserEvent(UserEvent e, long threadID)
{
    lock (_lockObject)
    {
        Thread.Sleep(1000);
        SafeSomethingToDB();
    }
}
I think a simple lock for synchronization will work for you, try this
public Reader mr_obj;
private static readonly object sync = new object();

public Form1()
{
    mr_obj = new MyReader.Reader(7137);
    mr_obj.UserEvent += new ReaderEvent(UserEvent);
}

private void UserEvent(UserEvent e, long threadID)
{
    lock (sync)
    {
        SafeSomethingToDB();
    }
}
As you write in the comments, if SafeSomethingToDB() is called a second time before the first call has finished, then it crashes. So in other words: SafeSomethingToDB() is not re-entrant.
What you can do is use a Mutex (which stands for mutual exclusion), which defines a "critical section" in your code, meaning a piece of code that only one thread can be executing at any one time.
For instance:
private static Mutex mutex = new Mutex();

public void SafeSomethingToDB()
{
    mutex.WaitOne(); // wait until it is safe to enter the critical section
    try
    {
        // Critical section begins here
        DoWorkAndStuff();
    }
    finally
    {
        mutex.ReleaseMutex(); // indicate the end of the critical section
    }
}
For more about System.Threading.Mutex, see http://msdn.microsoft.com/en-us/library/system.threading.mutex(v=vs.110).aspx.
I am writing a WCF service that has source data from multiple sources. These are large files in various formats.
I have implemented Caching and set-up a polling interval so these files are kept up to date with fresh data.
I have constructed a manager class that basically is responsible for returning XDocument objects back to the caller. The manager class first checks the cache for existence. If it doesn't exist - it makes the call to retrieve fresh data. Nothing big here.
What I would like to do to keep the response snappy is serialize the file previously downloaded and pass that back to the caller - again nothing new...however...I want to spawn a new thread as soon as the serialization is complete to retrieve the fresh data and overwrite the old file. This is my problem...
Admittedly an intermediate programmer, I came across a few examples of multithreading (here, for that matter). The problem is that they introduced the concept of delegates, and I am really struggling with it.
Here is some of my code:
//this method invokes another object that is responsible for making the
//http call, decompressing the file and persisting to the hard drive.
private static void downloadFile(string url, string LocationToSave)
{
    using (WeatherFactory wf = new WeatherFactory())
    {
        wf.getWeatherDataSource(url, LocationToSave);
    }
}

//A new thread variable
private static Thread backgroundDownload;

//the delegate...but I am so confused on how to use this...
delegate void FileDownloader(string url, string LocationToSave);

//The method that should be called in the new thread....
//right now the compiler is complaining that I don't have the arguments from
//the delegate (Url and LocationToSave)...
//the problem is I don't pass URL and LocationToSave here...
static void Init(FileDownloader download)
{
    backgroundDownload = new Thread(new ThreadStart(download));
    backgroundDownload.Start();
}
I'd like to implement this the correct way...so a bit of education on how to make this work would be appreciated.
I would use the Task Parallel library to do this:
//this method invokes another object that is responsible for making the
//http call, decompressing the file and persisting to the hard drive.
private static void downloadFile(string url, string LocationToSave)
{
    using (WeatherFactory wf = new WeatherFactory())
    {
        wf.getWeatherDataSource(url, LocationToSave);
    }
    //Update cache here?
}

private void StartBackgroundDownload()
{
    //Things to consider:
    // 1. what if we are already downloading, start new anyway?
    // 2. when/how to update your cache
    var task = Task.Factory.StartNew(() => downloadFile(url, LocationToSave));
}
I have a process that needs to read and write to a file. The application has a specific order for its reads and writes, and I want to preserve this order. What I would like to do is let the first operation start and make the second operation wait until the first is done, with a first-come, first-served kind of queue for access to the file. From what I have read, file locking seems like it might be what I am looking for, but I have not been able to find a very good example. Can anyone provide one?
Currently I am using a TextReader/Writer with .Synchronized but this is not doing what I hoped it would.
Sorry if this is a very basic question, threading gives me a headache :S
It should be as simple as this:
public static readonly object LockObj = new object();

public void AnOperation()
{
    lock (LockObj)
    {
        using (var fs = File.Open("yourfile.bin", FileMode.OpenOrCreate))
        {
            // do something with file
        }
    }
}

public void SomeOperation()
{
    lock (LockObj)
    {
        using (var fs = File.Open("yourfile.bin", FileMode.OpenOrCreate))
        {
            // do something else with file
        }
    }
}
Basically, define a lock object, then whenever you need to do something with your file, make sure you get a lock using the C# lock keyword. On reaching the lock statement, execution will block indefinitely until a lock has been obtained.
There are other constructs you can use for locking, but I find the lock keyword to be the most straightforward.
If you're using a current version of the .Net Framework, you can benefit from Task.ContinueWith.
If your units of work are logically always, "read some, then write some", the following expresses that intent succinctly and should scale:
string path = "file.dat";
// Start a reader task
var task = Task.Factory.StartNew(() => ReadFromFile(path));
// Continue with a writer task
task.ContinueWith(tt => WriteToFile(path));
// We're guaranteed that the read will occur before the write
// and that the write will occur once the read completes.
// We also can check the antecedent task's result (tt.Result in our
// example) for any special error logic we need.
I have multiple threaded operations running, and each one appends some info to my log file. The problem is that the log file is sometimes locked for writing by one thread while another thread tries to access it, which throws an exception. How can I make sure the log is properly written?
Here is the snippet:
try
{
    File.AppendAllText(fileName, appendString);
}
catch (System.Exception)
{
}
For now, I just ignore the exception, which causes some log entries not to be written.
You need to synchronize your log writes.
What happens is that two threads are appending to the log file at the same time.
Try the following:
class Program
{
    public static readonly object LogWriteLock = new object();

    // The rest of your Program class.
}
Then, when writing the log:
lock (Program.LogWriteLock)
{
    File.AppendAllText(fileName, appendString);
}
What this does is the following: you create an object (Program.LogWriteLock) that you use to synchronize your log operations on. Then, when one thread is writing to the log file, the next thread will simply wait for the first thread to complete and write after that.
You can even wrap this into a nice little helper class and you get something like this:
public static class LogHelper
{
    private static readonly object _syncRoot = new object();

    public static void AppendToLog(string appendString)
    {
        lock (_syncRoot)
        {
            File.AppendAllText("log.txt", appendString);
        }
    }
}
Replace "log.txt" with your actual log file location.
Have you tried creating a thread whose sole purpose is to update the log, then having your other threads just add to a thread-safe queue that the log updater thread will process?
This seems like a better idea than writing to the file from all of the other threads.
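A rough sketch of that idea, assuming .NET 4 so that BlockingCollection<T> is available as the thread-safe queue (the "log.txt" path is a placeholder):
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

public static class QueuedLogger
{
    private static readonly BlockingCollection<string> queue = new BlockingCollection<string>();

    static QueuedLogger()
    {
        // One long-running consumer: the only code that ever touches the file.
        Task.Factory.StartNew(() =>
        {
            foreach (string line in queue.GetConsumingEnumerable())
            {
                File.AppendAllText("log.txt", line);
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Producers on any thread just enqueue; there is no file contention.
    public static void Append(string line)
    {
        queue.Add(line);
    }
}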
I'm using FileSystemWatcher to check when a file is modified or deleted, but I'm wondering if there is any way to check when a file is read by another application.
Example:
I have the file C:\test.txt on my harddrive and am watching it using FileSystemWatcher. Another program (not under my control) goes to read that file; I would like to catch that event and, if possible, check what program is reading the file then modify the contents of the file accordingly.
It sounds like you want to write to your log file when your log file is read externally, or something to that effect. If that is the case, there is a NotifyFilters value, LastAccess. Make sure this is set as one of the flags in your FileSystemWatcher.NotifyFilter property. A change in the last access time will then fire the Changed event on FileSystemWatcher.
Currently, FileSystemWatcher does not allow you to directly differentiate between a read and a change; they both fire the Changed event based on the "change" to LastAccess. So, it would be infeasible to watch for reads to a large number of files. However, you seem to know which file you're watching, so if you had a FileInfo object for that file, and FileSystemWatcher fired its Changed event, you could get a new one and compare LastAccessTime values. If the access time changed, and LastWriteTime didn't, your file is only being read.
Now, in simplest terms, changes you make to the file while it is being read are not going to immediately show up in the other app, nor are you going to be able to "get there first", lock the file and write to it before they see it. So, you cannot use FileSystemWatcher to "intercept" a read request and show the content you want that app to see. The only way the user of another application can see what you just wrote is if the application is also watching the file and re-loads the file. That will fire another Changed event, causing an infinite loop as long as the other application continues to reload the file.
You will also get a Changed event for a read and a write. Opening a file in a text editor (virtually any will do), making some changes, then saving will fire two Changed events if you're looking for changes to Last Access Time. The first one will go off when the file is opened by the editor; at that time, you may not be able to tell that a write will happen, so if you are looking for pure read-only accesses to the file then you're SOL.
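A short sketch of the setup described above, using the C:\test.txt path from the question; the read-versus-write check is the LastAccessTime/LastWriteTime comparison just mentioned:
using System;
using System.IO;

class ReadWatcher
{
    private readonly FileSystemWatcher watcher = new FileSystemWatcher(@"C:\", "test.txt");
    private FileInfo last = new FileInfo(@"C:\test.txt");

    public void Start()
    {
        watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite;
        watcher.Changed += OnChanged;
        watcher.EnableRaisingEvents = true;
    }

    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        FileInfo current = new FileInfo(e.FullPath);
        // Access time moved but write time did not: the file was only read.
        if (current.LastAccessTime != last.LastAccessTime &&
            current.LastWriteTime == last.LastWriteTime)
        {
            Console.WriteLine("Read detected: " + e.FullPath);
        }
        last = current;
    }
}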
The easiest way I can think of to do this would be with a timer (System.Threading.Timer) whose callback checks and stores the last
System.IO.File.GetLastAccessTime(path)
Something like (maybe with a bit more locking...)
// Note: EventArgs<string> is assumed to be a small custom EventArgs class
// carrying the file path; ToList() requires a using for System.Linq.
public class FileAccessWatcher
{
    public Dictionary<string, DateTime> _trackedFiles = new Dictionary<string, DateTime>();
    private Timer _timer;

    public event EventHandler<EventArgs<string>> FileAccessed = delegate { };

    public FileAccessWatcher()
    {
        _timer = new Timer(OnTimerTick, null, 500, Timeout.Infinite);
    }

    public void Watch(string path)
    {
        _trackedFiles[path] = File.GetLastAccessTime(path);
    }

    public void OnTimerTick(object state)
    {
        foreach (var pair in _trackedFiles.ToList())
        {
            var accessed = File.GetLastAccessTime(pair.Key);
            if (pair.Value != accessed)
            {
                _trackedFiles[pair.Key] = accessed;
                FileAccessed(this, new EventArgs<string>(pair.Key));
            }
        }

        // Re-arm the one-shot timer for the next poll.
        _timer.Change(500, Timeout.Infinite);
    }
}
There is SysInternals' program FileMon... it can trace every file access in the system. If you can find its source and understand what Win32 hooks it uses, you might marshal those functions in C# and get what you want.
You could use FileInfo.LastAccessTime and FileInfo.Refresh() in a polling loop.
http://msdn.microsoft.com/en-us/library/system.io.fileinfo_members.aspx
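For example, a minimal polling loop along those lines (the path and interval are placeholders):
using System;
using System.IO;
using System.Threading;

static void PollLastAccess(string path)
{
    FileInfo info = new FileInfo(path);
    DateTime lastAccess = info.LastAccessTime;

    while (true)
    {
        Thread.Sleep(500);   // poll interval
        info.Refresh();      // re-read the cached file metadata
        if (info.LastAccessTime != lastAccess)
        {
            lastAccess = info.LastAccessTime;
            Console.WriteLine("File was accessed: " + path);
        }
    }
}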
Yes, using a file system filter driver you can catch all read requests, analyze them, and even substitute the data being read. Developing such a driver yourself is possible, but very time-consuming and complicated. We offer a product called CallbackFilter, which includes a ready-to-use driver and lets you implement your filtering business logic in user mode.
A little snippet that I found useful for detecting when another process has a lock:
static bool IsFileUsedbyAnotherProcess(string filename)
{
    try
    {
        using (var file = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None))
        {
        }
    }
    catch (System.IO.IOException)
    {
        return true;
    }
    return false;
}