I need a synchronizing class that behaves exactly like the AutoResetEvent class, but with one minor exception:
A call to the Set() method must release all waiting threads, and not just one.
How can I construct such a class? I am simply out of ideas.
Martin.
So you have multiple threads doing a .WaitOne() and you want to release them?
Use the ManualResetEvent class and all the waiting threads should release...
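For illustration, a minimal ManualResetEvent sketch (the class and variable names here are made up, not from your code) might look like this:
using System;
using System.Threading;
class ManualResetEventDemo
{
    // Starts non-signaled; a single Set() releases every waiting thread at once.
    static readonly ManualResetEvent gate = new ManualResetEvent(false);
    static void Main()
    {
        for (int i = 0; i < 3; i++)
        {
            int id = i;
            new Thread(() =>
            {
                gate.WaitOne();                       // all three threads block here
                Console.WriteLine("Thread {0} released", id);
            }).Start();
        }
        Thread.Sleep(500);
        gate.Set();                                   // releases all waiting threads
        // Unlike AutoResetEvent, the event stays signaled until Reset() is called.
    }
}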
Thank you very much for all your thoughts and inputs, which I have read with great interest. I did some more searching here on Stack Overflow, and suddenly I found this, which turned out to be just what I was looking for. By cutting it down to just the two methods I need, I ended up with this small class:
public sealed class Signaller
{
public void PulseAll()
{
lock (_lock)
{
Monitor.PulseAll(_lock);
}
}
public bool Wait(TimeSpan maxWaitTime)
{
lock (_lock)
{
return Monitor.Wait(_lock, maxWaitTime);
}
}
private readonly object _lock = new object();
}
and it does exactly what it should! I'm amazed that a solution could be that simple, and I love such simplicity. It's beautiful. Thank you, Matthew Watson!
Martin.
Two things you might try.
One is to use a Barrier object, conditionally adding threads to it and signalling them.
The other is a publisher/subscriber setup like in Rx: each thread waits on an object that it passes to a collection, and when you want to call 'set' you loop over a snapshot of that collection, calling Set on each member (see the sketch below).
Or you could try bears.
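A rough sketch of the publisher/subscriber idea (purely illustrative; the class and member names are assumptions, not an existing API):
using System.Collections.Generic;
using System.Threading;
public sealed class MultiReleaseSignal
{
    // Each waiter registers its own event; Set() signals a snapshot of them all.
    private readonly List<ManualResetEventSlim> waiters = new List<ManualResetEventSlim>();
    private readonly object sync = new object();
    public void Wait()
    {
        var handle = new ManualResetEventSlim(false);
        lock (sync) { waiters.Add(handle); }
        handle.Wait();
        lock (sync) { waiters.Remove(handle); }
        // Disposal is deliberately omitted to keep the sketch short and to
        // avoid racing with a concurrent Set().
    }
    public void Set()
    {
        ManualResetEventSlim[] snapshot;
        lock (sync) { snapshot = waiters.ToArray(); }
        foreach (var handle in snapshot)
            handle.Set();
    }
}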
If the event is being referenced by all threads in a common field or property, you could replace the common field or property with a new non-signaled event and then signal the old one. It has some cost to it since you'll be regularly creating new synchronization objects, but it would work. Here's an example of how I would do that:
public static class Example
{
private static volatile bool stopRunning;
private static ReleasingAutoResetEvent myEvent;
public static void RunExample()
{
using (Example.myEvent = new ReleasingAutoResetEvent())
{
WaitCallback work = new WaitCallback(WaitThread);
for (int i = 0; i < 5; ++i)
{
ThreadPool.QueueUserWorkItem(work, i.ToString());
}
Thread.Sleep(500);
for (int i = 0; i < 3; ++i)
{
Example.myEvent.Set();
Thread.Sleep(5000);
}
Example.stopRunning = true;
Example.myEvent.Set();
}
}
private static void WaitThread(object state)
{
while (!Example.stopRunning)
{
Example.myEvent.WaitOne();
Console.WriteLine("Thread {0} is released!", state);
}
}
}
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile ManualResetEvent manualResetEvent = new ManualResetEvent(false);
public void Set()
{
ManualResetEvent eventToSet = this.manualResetEvent;
this.manualResetEvent = new ManualResetEvent(false);
eventToSet.Set();
eventToSet.Dispose();
}
public bool WaitOne()
{
return this.manualResetEvent.WaitOne();
}
public bool WaitOne(int millisecondsTimeout)
{
return this.manualResetEvent.WaitOne(millisecondsTimeout);
}
public bool WaitOne(TimeSpan timeout)
{
return this.manualResetEvent.WaitOne(timeout);
}
public void Dispose()
{
this.manualResetEvent.Dispose();
}
}
Another more lightweight solution you could try that uses the Monitor class to lock and unlock objects is below. However, I'm not as happy with the cleanup story for this version of ReleasingAutoResetEvent since Monitor may hold a reference to it and keep it alive indefinitely if it is not properly disposed.
There are a few limitations/gotchas with this implementation. First, the thread that creates this object will be the only one that will be able to signal it with a call to Set; other threads that attempt to do the same thing will receive a SynchronizationLockException. Second, the thread that created it will never be able to wait on it successfully since it already owns the lock. This will only be an effective solution if you have exactly one controlling thread and several other waiting threads.
public static class Example
{
private static volatile bool stopRunning;
private static ReleasingAutoResetEvent myEvent;
public static void RunExample()
{
using (Example.myEvent = new ReleasingAutoResetEvent())
{
WaitCallback work = new WaitCallback(WaitThread);
for (int i = 0; i < 5; ++i)
{
ThreadPool.QueueUserWorkItem(work, i.ToString());
}
Thread.Sleep(500);
for (int i = 0; i < 3; ++i)
{
Example.myEvent.Set();
Thread.Sleep(5000);
}
Example.stopRunning = true;
Example.myEvent.Set();
}
}
private static void WaitThread(object state)
{
while (!Example.stopRunning)
{
Example.myEvent.WaitOne();
Console.WriteLine("Thread {0} is released!", state);
}
}
}
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile object lockObject = new object();
public ReleasingAutoResetEvent()
{
Monitor.Enter(this.lockObject);
}
public void Set()
{
object objectToSignal = this.lockObject;
object objectToLock = new object();
Monitor.Enter(objectToLock);
this.lockObject = objectToLock;
Monitor.Exit(objectToSignal);
}
public void WaitOne()
{
object objectToMonitor = this.lockObject;
Monitor.Enter(objectToMonitor);
Monitor.Exit(objectToMonitor);
}
public bool WaitOne(int millisecondsTimeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, millisecondsTimeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public bool WaitOne(TimeSpan timeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, timeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public void Dispose()
{
Monitor.Exit(this.lockObject);
}
}
Is it possible to lock a method for one thread and force another to go further rather than waiting until the first thread finishes? Can this problem be resolved with a static thread, or with some proper pattern using one instance of the service mentioned below?
For presentation purposes, it can be done with a static boolean like below.
public class SomeService
{
    private readonly IRepository _repo;
    private Thread threadOne;

    public static bool isLocked { get; set; }

    public SomeService(IRepository repo)
    {
        _repo = repo;
    }

    public void StartSomeMethod()
    {
        if (!isLocked)
        {
            isLocked = true;
            threadOne = new Thread(SomeMethod);
            threadOne.Start();
        }
    }

    public void SomeMethod()
    {
        // ... long-running work (takes lots of time) ...
        isLocked = false;
    }
}
I want to avoid the situation where a user clicks start twice by accident and a second thread starts immediately after the first one finishes.
You can use lock :)
object locker = new object();
void MethodToLockForAThread()
{
lock(locker)
{
//put method body here
}
}
Now the result will be that when this method is called by a thread (any thread), it puts something like a flag at the beginning of the lock: "STOP! You are not allowed to go any further, you must wait!" Like a red light at a crossroads.
When the thread that called this method first leaves the scope, this "red light" at the beginning of the scope changes to green.
If you want the method not to be called when it is already being called by another thread, one way to do this is with a bool value. For example:
object locker = new object();
bool canAccess = true;
void MethodToLockForAThread()
{
if(!canAccess)
return;
lock(locker)
{
if(!canAccess)
return;
canAccess = false;
//put method body here
canAccess = true;
}
}
The second check of canAccess inside the lock scope is there because of what was said in the comments. Now it really is thread safe. This is the kind of protection that is advisable in a thread-safe singleton.
EDIT
After some discussion with mjwills I have changed my mind and lean more towards Monitor.TryEnter. You can use it like this:
object locker = new object();
void ThreadMethod()
{
    if (Monitor.TryEnter(locker, TimeSpan.FromMilliseconds(1)))
    {
        try
        {
            //do the thread code
        }
        finally
        {
            Monitor.Exit(locker);
        }
    }
    else
    {
        return; // means that the lock has not been acquired
    }
}
Now, the lock might not be acquired because of some exception or because some other thread has already acquired it. In the second parameter you pass the time that a thread will wait to acquire the lock. I gave a short time here because you don't want the other thread to do the job while the first one is doing it.
So this solution seems the best.
When the other thread cannot acquire the lock, it will go further instead of waiting (well, it will wait for 1 millisecond).
Since lock is a language-specific wrapper around the Monitor class, you need Monitor.TryEnter:
public class SomeService
{
private readonly object lockObject = new object();
public void StartSomeMethod()
{
if (Monitor.TryEnter(lockObject))
{
// start new thread
}
}
public void SomeMethod()
{
try
{
// ...
}
finally
{
Monitor.Exit(lockObject);
}
}
}
You can use an AutoResetEvent instead of your isLocked flag.
AutoResetEvent autoResetEvent = new AutoResetEvent(true);
public void StartSomeMethod()
{
if(autoResetEvent.WaitOne(0))
{
//start thread
}
}
public void SomeMethod()
{
try
{
//Do your work
}
finally
{
autoResetEvent.Set();
}
}
I have a wrapper class around a serial port which looks something like this:
static class HASPClass
{
    private static SerialPort m_port;
    private static bool m_initialized;
    private static int m_baudRate;
    static readonly object _syncObject = new object();

    public static void DoInitialization(int baudRate /*also could be other params*/)
    {
        lock (_syncObject)
        {
            if (!m_initialized)
            {
                Initialize(baudRate);
            }
        }
    }

    private static void Initialize(int baudRate /*also could have other params*/)
    {
        m_baudRate = baudRate;
        m_port = new SerialPort { BaudRate = m_baudRate };
        m_port.Open();
        m_initialized = true;
    }

    private static void Uninitialize()
    {
        m_port.Close();
        m_initialized = false;
    }

    public static void Read(byte[] buff)
    {
        lock (_syncObject)
        {
            //Other custom read stuff
            m_port.Read(buff, 0, buff.Length);
        }
    }

    public static void Write(byte[] buff)
    {
        lock (_syncObject)
        {
            //Other write related code
            m_port.Write(buff, 0, buff.Length);
        }
    }

    public static void Close()
    {
        lock (_syncObject)
        {
            if (m_initialized)
            {
                Uninitialize();
            }
        }
    }
}
I tried to make this class thread safe. Someone initializes it, reads and writes may come from other threads, and in the end Close is called.
Now imagine I have two additional static methods in another class which do something like this:
public static void function1()
{
HASPClass.Read(...);
// Some other code
HASPClass.Write(...);
}
public static void function2()
{
HASPClass.Read(...);
// Some other code
HASPClass.Write(...);
}
For overall thread safety I also enclosed these functions in locks:
public static void function1()
{
lock(otherlock1)
{
HASPClass.Read(...);
// Some other code
HASPClass.Write(...);
}
}
public static void function2()
{
lock(otherlock1)
{
HASPClass.Read(...);
// Some other code
HASPClass.Write(...);
}
}
This is because the order in which reads and writes are called might be relevant for the HASP.
My question is: is now my final approach (of using function1 and function2) correct/thread safe?
Since you are essentially using a singleton, you are fine without additional locks, as long as the functions do not use resources that have to be locked in // Some other code.
The class itself is thread safe because it guards all uses of the variables with the same lock. This is as tight as it gets. But make sure not to introduce deadlocks in the code that lies behind the comments.
In general, you should make sure no one closes your object before all threads are done with it.
Besides, the code example as originally posted was more or less inconsistent: the methods were not declared static and had no return types.
Edit: From the higher-level perspective of needing to issue commands in a specific order, I correct my statement and say yes, you do need to lock it.
But beware of deadlocks.
A more explicit example of how this can go wrong (though I don't see it happening in your example code):
There are two threads that can hold the lock. Your device will always send you 1, except that if you transmit 2 to it, it will send you 2.
Thread 1 is trying to first read a 1 and then a 2 from the device without releasing the lock.
Now suppose that, somehow, the actions taken after receiving 1 start Thread 2, which wants to transmit 2 to the device. But it cannot, because Thread 1 still holds the lock, and Thread 1 will wait forever because Thread 2 cannot transmit.
The most common case for this is GUI events used with Invoke (which leads to another thread executing code).
Imagine I have two additional static methods from other class ... To ensure thread safety do I have to put additional locks ... ?
No.
A lock does not care about the calling method or the stack trace - it only concerns the current thread. Since you already put locks in the critical sections, there is no point in putting higher level locks in your case.
You don't want a thread-safe class, you want a message queue.
From the comments, I see your concern is that reads and writes could be interleaved: one thread writes, and another issues a read before the writing thread reads the response.
In that scenario the best you can do is to create a queue of operations. When a write must be followed by a read, you add the Read and Write as a single WriteRead operation; this way the sequence is guaranteed to follow the correct order, and you only need to lock the queue.
Something like this:
Queue:
public class SerialQueue
{
SerialPort sp;
ManualResetEvent processQueue = new ManualResetEvent(false);
Queue<QueueCommand> queue = new Queue<QueueCommand>();
public event EventHandler<ReadEventArgs> ReadSuccess;
public event EventHandler<IdEventArgs> WriteSuccess;
public SerialQueue()
{
ThreadPool.QueueUserWorkItem(ProcessQueueThread);
sp = new SerialPort(); //Initialize it according to your needs.
sp.Open();
}
void ProcessQueueThread(object state)
{
while (true)
{
processQueue.WaitOne();
QueueCommand cmd;
while(true)
{
lock (queue)
{
if (queue.Count > 0)
cmd = queue.Dequeue();
else
{
processQueue.Reset();
break;
}
}
if (cmd.Operation == SerialOperation.Write || cmd.Operation == SerialOperation.WriteRead)
{
sp.Write(cmd.BytesToWrite, 0, cmd.BytesToWrite.Length);
if (WriteSuccess != null)
WriteSuccess(this, new IdEventArgs { Id = cmd.Id });
}
if(cmd.Operation == SerialOperation.Read || cmd.Operation == SerialOperation.WriteRead)
{
byte[] buffer = new byte[cmd.BytesToRead];
sp.Read(buffer, 0, buffer.Length);
if (ReadSuccess != null)
ReadSuccess(this, new ReadEventArgs { Id = cmd.Id, Data = buffer });
}
}
}
}
public void EnqueueCommand(QueueCommand Command)
{
lock(queue)
{
queue.Enqueue(Command);
processQueue.Set();
}
}
}
QueueCommand:
public class QueueCommand
{
public QueueCommand()
{
Id = Guid.NewGuid();
}
public Guid Id { get; set; }
public SerialOperation Operation { get; set; }
public int BytesToRead { get; set; }
public byte[] BytesToWrite { get; set; }
}
Enums:
public enum SerialOperation
{
Read,
Write,
WriteRead
}
Event arguments:
public class IdEventArgs : EventArgs
{
public Guid Id { get; set; }
}
public class ReadEventArgs : IdEventArgs
{
public byte[] Data{ get; set; }
}
To use the queue you instantiate it and hook to the WriteSucces and ReadSucces.
SerialQueue queue = new SerialQueue();
queue.ReadSuccess += (o, args) => { /*Do whatever you need to do with the read data*/ };
queue.WriteSuccess += (o, args) => { /*Do whatever you need to do after the write */ };
Note that each QueueCommand has a property named Id which is a unique Guid, it allows you to track when the commands are executed.
Now, when you want to perform a read you do:
QueueCommand cmd = new QueueCommand { Operation = SerialOperation.Read, BytesToRead = 1024 };
queue.Enqueue(cmd);
At this moment the queue will add the command and set the reset event; when the reset event is set, the thread processing the commands will continue its execution (if it wasn't already executing) and process all the pending commands in the queue.
For a write you will do:
QueueCommand cmd = new QueueCommand { Operation = SerialOperation.Write, BytesToWrite = new byte[]{ 1, 10, 40 } };
And for a write followed by a read you will do:
QueueCommand cmd = new QueueCommand { Operation = SerialOperation.WriteRead, BytesToWrite = new byte[]{ 1, 10, 40 }, BytesToRead = 230 };
I have been working with serial ports for years in multi-threaded environments, and this is the only way to ensure sequencing between sent commands and received responses; otherwise you will mix responses from different commands.
Remember this is just a base implementation, you need to add error handling and customize it to your needs.
The thread safety of a method has nothing to do with serial port operations (see this interesting discussion: What Makes a Method Thread-safe? What are the rules?).
In the end, I think the lock(_syncObject) in your first class is not necessary (but I don't know the rest of your code!) if you call the methods the way you did, because the Read() and Write() calls are already enclosed in a sync-lock on the same object (I'm assuming your lock object is declared like private static readonly object otherlock1 = new object();).
In my opinion, if you only call function1 and function2 in the rest of your code, your approach is definitely thread safe (provided that your // Some other code doesn't spawn another thread that performs thread-unsafe operations on the same variables that function1 and function2 are working on...).
Talking about the serial port protocol: what happens if your // Some other code fails for some reason? For example, a computation error between your HASPClass.Read(...) and HASPClass.Write(...). This might not affect the thread safety itself, but it could damage the sequence of the read-write operations (only you can know the details of that).
First of all, using singletons in such a manner is bad practice. You should consider using something like this instead.
public sealed class SerialPortExt
{
private readonly SerialPort _serialPort;
private readonly object _serialPortLock = new object();
public SerialPortExt(SerialPort serialPort)
{
_serialPort = serialPort;
}
public void DoSomething()
{
}
public IDisposable Lock()
{
return new DisposableLock(_serialPortLock);
}
}
Where DisposableLock looks like this.
public sealed class DisposableLock : IDisposable
{
private readonly object _lock;
public DisposableLock(object @lock)
{
    _lock = @lock;
Monitor.Enter(_lock);
}
#region Implementation of IDisposable
public void Dispose()
{
Monitor.Exit(_lock);
}
#endregion
}
Then you can work with your instance in the following way.
class Program
{
static void Main()
{
var serialPortExt = new SerialPortExt(new SerialPort());
var tasks =
new[]
{
Task.Run(() => DoSomething(serialPortExt)),
Task.Run(() => DoSomething(serialPortExt))
};
Task.WaitAll(tasks);
}
public static void DoSomething(SerialPortExt serialPortExt)
{
using (serialPortExt.Lock())
{
serialPortExt.DoSomething();
Thread.Sleep(TimeSpan.FromSeconds(5));
}
}
}
Since I cannot try out your code (and as originally posted it wouldn't compile), I would just advise that you make your wrapper into a singleton and perform the locking from there.
Here is an example of your sample code converted to a singleton class based on MSDN Implementing Singleton in C#:
public class HASPClass
{
private static SerialPort m_port;
private static bool m_initialized;
private static int m_baudRate;
static readonly object _syncObject = new object();
private static HASPClass _instance;
public static HASPClass Instance
{
get
{
if(_instance == null)
{
lock(_syncObject)
{
if (_instance == null)
{
_instance = new HASPClass();
}
}
}
return _instance;
}
}
public void DoInitialization(int baudRate /*also could be other params*/)
{
if (!m_initialized)
{
Initialize(baudRate);
}
}
private void Initialize(int baudrate /*also could have other params*/)
{
m_port.Open();
m_baudRate = baudrate;
m_initialized = true;
}
private void Uninitialize()
{
m_port.Close();
m_initialized = false;
}
public void Read(byte[] buff)
{
m_port.Read(buff, 0, buff.Length);
}
public void Write(byte[] buff)
{
m_port.Write(buff, 0, buff.Length);
}
public void Close()
{
if (m_initialized)
{
Uninitialize();
}
}
}
Notice that locking is only applied when obtaining the HASPClass instance.
if(_instance == null)
This check is added because, when multiple threads try to access the singleton concurrently, the instance will initially be null; the lock makes them wait, and the second check verifies whether another thread has already created the instance. These modifications have already made your HASPClass thread safe! Now consider adding more functions, such as setting the port name and other properties, as needed.
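For example, a hypothetical additional member (not part of the original code) that sets the port name under the same lock might look like this:
public void SetPortName(string portName)
{
    lock (_syncObject)
    {
        // Assumes m_port has been constructed; SerialPort.PortName can only
        // be changed while the port is closed (i.e. before initialization).
        if (!m_initialized)
        {
            m_port.PortName = portName;
        }
    }
}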
Generally, in this kind of situation, you have to use a Mutex.
A mutex permits mutual exclusion to shared resources.
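As a minimal sketch (illustrative names only), a Mutex is used like this:
using System;
using System.Threading;
class MutexExample
{
    private static readonly Mutex mutex = new Mutex();
    static void Worker()
    {
        mutex.WaitOne();              // acquire exclusive access
        try
        {
            Console.WriteLine("Thread {0} owns the resource",
                Thread.CurrentThread.ManagedThreadId);
        }
        finally
        {
            mutex.ReleaseMutex();     // always release, even on exceptions
        }
    }
    static void Main()
    {
        for (int i = 0; i < 3; i++)
            new Thread(Worker).Start();
    }
}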
I found this helper class:
public sealed class QueuedLock
{
private object innerLock;
private volatile int ticketsCount = 0;
private volatile int ticketToRide = 1;
public QueuedLock()
{
innerLock = new Object();
}
public void Enter()
{
int myTicket = Interlocked.Increment(ref ticketsCount);
Monitor.Enter(innerLock);
while (true)
{
if (myTicket == ticketToRide)
{
return;
}
else
{
Monitor.Wait(innerLock);
}
}
}
public void Exit()
{
Interlocked.Increment(ref ticketToRide);
Monitor.PulseAll(innerLock);
Monitor.Exit(innerLock);
}
}
here:
Is there a synchronization class that guarantees FIFO order in C#?
It works nicely for me, but I've just read some comments on this topic about the use of those volatile variables, and I am not sure whether there is an issue here or not. I understand the code and have implemented it in my application to do the right job, but I admit I can't tell whether this approach will have a negative impact in the future, if there is one at all.
I haven't seen many ways of doing this so it would be nice to have a complete and safe version ready to be used in any situation.
So basically, the question is, can this approach be improved? Does it have any subtle issues?
For a multithreaded FIFO (queue), .NET already provides ConcurrentQueue; I would recommend using ConcurrentQueue instead of low-level locks.
MSDN: ConcurrentQueue
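A minimal usage sketch (illustrative only):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
class ConcurrentQueueDemo
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();
        // Multiple producers can enqueue concurrently without explicit locking.
        Parallel.For(0, 100, i => queue.Enqueue(i));
        // Consumers use TryDequeue; items come out in FIFO order.
        int item;
        while (queue.TryDequeue(out item))
            Console.WriteLine(item);
    }
}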
You can also review the TPL Dataflow BufferBlock class for this purpose:
// Hand-off through a BufferBlock<T>
private static BufferBlock<int> m_buffer = new BufferBlock<int>();
// Producer
private static void Producer()
{
while(true)
{
int item = Produce();
// storing the messages in FIFO queue
m_buffer.Post(item);
}
}
// Consumer
private static async Task Consumer()
{
while(true)
{
int item = await m_buffer.ReceiveAsync();
Process(item);
}
}
// Main
public static void Main()
{
var p = Task.Factory.StartNew(Producer);
var c = Consumer();
Task.WaitAll(p,c);
}
How do you share data between different threads in C# without using static variables?
Can we create such a mechanism using attributes?
Will aspect-oriented programming help in such cases?
To achieve this, should all the different threads work on a single object?
You can't beat the simplicity of a locked message queue. I say don't waste your time with anything more complex.
Read up on the lock statement.
EDIT
Here is an example of the Microsoft Queue object wrapped so all actions against it are thread safe.
public class Queue<T>
{
/// <summary>Used as a lock target to ensure thread safety.</summary>
private readonly object _Locker = new object();
private readonly System.Collections.Generic.Queue<T> _Queue = new System.Collections.Generic.Queue<T>();
/// <summary>Adds an item to the end of the queue.</summary>
public void Enqueue(T item)
{
lock (_Locker)
{
_Queue.Enqueue(item);
}
}
/// <summary>Enqueues a collection of items into this queue.</summary>
public virtual void EnqueueRange(IEnumerable<T> items)
{
lock (_Locker)
{
if (items == null)
{
return;
}
foreach (T item in items)
{
_Queue.Enqueue(item);
}
}
}
/// <summary>Removes and returns the item at the front of the queue.</summary>
public T Dequeue()
{
lock (_Locker)
{
return _Queue.Dequeue();
}
}
/// <summary>Removes all items from the queue.</summary>
public void Clear()
{
lock (_Locker)
{
_Queue.Clear();
}
}
/// <summary>Gets the number of items currently in the queue.</summary>
public Int32 Count
{
get
{
lock (_Locker)
{
return _Queue.Count;
}
}
}
/// <summary>Attempts to dequeue an item, returning false if the queue is empty.</summary>
public Boolean TryDequeue(out T item)
{
lock (_Locker)
{
if (_Queue.Count > 0)
{
item = _Queue.Dequeue();
return true;
}
else
{
item = default(T);
return false;
}
}
}
}
EDIT 2
I hope this example helps.
Remember this is bare bones.
Using these basic ideas you can safely harness the power of threads.
public class WorkState
{
private readonly Object _Lock = new Object();
private Int32 _State;
public Int32 GetState()
{
lock (_Lock)
{
return _State;
}
}
public void UpdateState()
{
lock (_Lock)
{
_State++;
}
}
}
public class Worker
{
private readonly WorkState _State;
private readonly Thread _Thread;
private volatile Boolean _KeepWorking;
public Worker(WorkState state)
{
_State = state;
_Thread = new Thread(DoWork);
_KeepWorking = true;
}
public void DoWork()
{
while (_KeepWorking)
{
_State.UpdateState();
}
}
public void StartWorking()
{
_Thread.Start();
}
public void StopWorking()
{
_KeepWorking = false;
}
}
private void Execute()
{
WorkState state = new WorkState();
Worker worker = new Worker(state);
worker.StartWorking();
while (true)
{
if (state.GetState() > 100)
{
worker.StopWorking();
break;
}
}
}
You can pass an object as an argument to Thread.Start and use it as shared data storage between the current thread and the initiating thread.
You can also just directly access (with the appropriate locking, of course) your data members, if you started the thread using the instance form of the ThreadStart delegate.
You can't use attributes to create shared data between threads. You could use the attribute instances attached to your class as a data store, but I fail to see how that is better than using static or instance data members.
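As an illustration of the first option (passing an argument to Thread.Start), here is a small sketch; the type and member names are made up:
using System;
using System.Threading;
class SharedState
{
    public int Counter;                    // shared between the two threads
}
class StartWithArgumentDemo
{
    static void Work(object arg)
    {
        var shared = (SharedState)arg;     // both threads receive the same instance
        Interlocked.Increment(ref shared.Counter);
    }
    static void Main()
    {
        var shared = new SharedState();
        var t1 = new Thread(Work);         // ParameterizedThreadStart overload
        var t2 = new Thread(Work);
        t1.Start(shared);                  // the argument is handed to Work
        t2.Start(shared);
        t1.Join();
        t2.Join();
        Console.WriteLine(shared.Counter); // prints 2
    }
}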
Look at the following example code:
public class MyWorker
{
    public SharedData state;

    public MyWorker(SharedData someData)
    {
        this.state = someData;
    }

    public void DoWork()
    {
        while (true) ;
    }
}
public class SharedData
{
    private int myX;
    public int GetX() { return myX; }
    public void SetX(int anX) { myX = anX; }
}
public class Program
{
    public static void Main()
    {
        SharedData data = new SharedData();
        MyWorker work1 = new MyWorker(data);
        MyWorker work2 = new MyWorker(data);
        Thread thread = new Thread(new ThreadStart(work1.DoWork));
        thread.Start();
        Thread thread2 = new Thread(new ThreadStart(work2.DoWork));
        thread2.Start();
    }
}
In this case, the thread class MyWorker has a variable state. We initialise it with the same object. Now you can see that the two workers access the same SharedData object. Changes made by one worker are visible to the other.
You have quite a few remaining issues. How does worker 2 know when changes have been made by worker 1 and vice-versa? How do you prevent conflicting changes? Maybe read: this tutorial.
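As a hedged sketch (not part of the original answer), one way to prevent conflicting changes is to have SharedData guard its state with a lock:
public class SharedData
{
    private readonly object myLock = new object();
    private int myX;
    public int GetX()
    {
        lock (myLock) { return myX; }
    }
    public void SetX(int anX)
    {
        lock (myLock) { myX = anX; }
    }
}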
When you start a thread you are executing a method of some chosen class. All attributes of that class are visible.
Worker myWorker = new Worker( /* arguments */ );
Thread myThread = new Thread(new ThreadStart(myWorker.doWork));
myThread.Start();
Your thread is now in the doWork() method and can see any attributes of myWorker, which may themselves be other objects. Now you just need to be careful about the cases where several threads hit those attributes at the same time.
I have a small background thread which runs for the application's lifetime - however, when the application is shut down, the thread should exit gracefully.
The problem is that the thread runs some code at an interval of 15 minutes - which means it sleeps a lot.
Now, in order to get it out of its sleep, I toss an interrupt at it - my question is, however, whether there's a better approach to this, since interrupts generate a ThreadInterruptedException.
Here's the gist of my code (somewhat pseudo):
public class BackgroundUpdater : IDisposable
{
private Thread myThread;
private const int intervalTime = 900000; // 15 minutes
public void Dispose()
{
myThread.Interrupt();
}
public void Start()
{
myThread = new Thread(ThreadedWork);
myThread.IsBackground = true; // To ensure against app waiting for thread to exit
myThread.Priority = ThreadPriority.BelowNormal;
myThread.Start();
}
private void ThreadedWork()
{
try
{
while (true)
{
Thread.Sleep(intervalTime); // 15 minutes
DoWork();
}
}
catch (ThreadInterruptedException)
{
}
}
}
There's absolutely a better way - either use Monitor.Wait/Pulse instead of Sleep/Interrupt, or use an Auto/ManualResetEvent. (You'd probably want a ManualResetEvent in this case.)
Personally I'm a Wait/Pulse fan, probably due to it being like Java's wait()/notify() mechanism. However, there are definitely times where reset events are more useful.
Your code would look something like this:
private readonly object padlock = new object();
private volatile bool stopping = false;
public void Stop() // Could make this Dispose if you want
{
stopping = true;
lock (padlock)
{
Monitor.Pulse(padlock);
}
}
private void ThreadedWork()
{
while (!stopping)
{
DoWork();
lock (padlock)
{
Monitor.Wait(padlock, TimeSpan.FromMinutes(15));
}
}
}
For more details, see my threading tutorial, in particular the pages on deadlocks, waiting and pulsing, and the page on wait handles. Joe Albahari also has a tutorial which covers the same topics and compares them.
I haven't looked in detail yet, but I wouldn't be surprised if Parallel Extensions also had some functionality to make this easier.
You could use an event to check whether the process should end, like this:
var eventX = new AutoResetEvent(false);
while (true)
{
if(eventX.WaitOne(900000, false))
{
break;
}
DoWork();
}
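To stop the loop, some other part of the code (for example a hypothetical Stop or Dispose method not shown above) would simply signal the event:
eventX.Set(); // WaitOne then returns true and the loop breaks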
There is CancellationTokenSource class in .NET 4 and later which simplifies this task a bit.
private readonly CancellationTokenSource cancellationTokenSource =
new CancellationTokenSource();
private void Run()
{
while (!cancellationTokenSource.IsCancellationRequested)
{
DoWork();
cancellationTokenSource.Token.WaitHandle.WaitOne(
TimeSpan.FromMinutes(15));
}
}
public void Stop()
{
cancellationTokenSource.Cancel();
}
Don't forget that CancellationTokenSource is disposable, so make sure you dispose it properly.
One method might be to add a cancel event or delegate that the thread will subscribe to. When the cancel event is invoked, the thread can stop itself.
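A rough sketch of that idea (the class and member names are illustrative, not from the question):
using System;
using System.Threading;
public class Owner
{
    // The worker subscribes to this; raising it asks the worker to stop.
    public event EventHandler CancelRequested;
    public void RequestCancel()
    {
        var handler = CancelRequested;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}
public class Worker
{
    private volatile bool cancelled;
    public Worker(Owner owner)
    {
        owner.CancelRequested += (s, e) => cancelled = true;
    }
    public void Run()
    {
        while (!cancelled)
        {
            // DoWork();
            Thread.Sleep(1000); // stand-in for the real interval / wait
        }
    }
}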
I absolutely like Jon Skeet's answer. However, this might be a bit easier to understand and should also work:
public class BackgroundTask : IDisposable
{
private readonly CancellationTokenSource cancellationTokenSource;
private bool stop;
public BackgroundTask()
{
this.cancellationTokenSource = new CancellationTokenSource();
this.stop = false;
}
public void Stop()
{
this.stop = true;
this.cancellationTokenSource.Cancel();
}
public void Dispose()
{
this.cancellationTokenSource.Dispose();
}
private void ThreadedWork(object state)
{
using (var syncHandle = new ManualResetEventSlim())
{
while (!this.stop)
{
syncHandle.Wait(TimeSpan.FromMinutes(15), this.cancellationTokenSource.Token);
if (!this.cancellationTokenSource.IsCancellationRequested)
{
// DoWork();
}
}
}
}
}
Or, including waiting for the background task to actually have stopped (in this case, Dispose must be invoked by a thread other than the one the background task is running on, and of course this is not perfect code; it requires the worker thread to actually have started):
using System;
using System.Threading;
public class BackgroundTask : IDisposable
{
private readonly ManualResetEventSlim threadedWorkEndSyncHandle;
private readonly CancellationTokenSource cancellationTokenSource;
private bool stop;
public BackgroundTask()
{
this.threadedWorkEndSyncHandle = new ManualResetEventSlim();
this.cancellationTokenSource = new CancellationTokenSource();
this.stop = false;
}
public void Dispose()
{
this.stop = true;
this.cancellationTokenSource.Cancel();
this.threadedWorkEndSyncHandle.Wait();
this.cancellationTokenSource.Dispose();
this.threadedWorkEndSyncHandle.Dispose();
}
private void ThreadedWork(object state)
{
try
{
using (var syncHandle = new ManualResetEventSlim())
{
while (!this.stop)
{
syncHandle.Wait(TimeSpan.FromMinutes(15), this.cancellationTokenSource.Token);
if (!this.cancellationTokenSource.IsCancellationRequested)
{
// DoWork();
}
}
}
}
finally
{
this.threadedWorkEndSyncHandle.Set();
}
}
}
If you see any flaws or disadvantages compared to Jon Skeet's solution, I'd like to hear them, as I always enjoy learning ;-)
I guess this is slower and uses more memory, and should thus not be used at large scale or over short timeframes. Anything else?