I have an object in C# on which I need to execute a method on a regular basis. I would like this method to be executed only while other people are using my object; as soon as people stop using my object, I would like this background operation to stop.
So here is a simple example (which is broken):
class Fish
{
public Fish()
{
Thread t = new Thread(new ThreadStart(BackgroundWork));
t.IsBackground = true;
t.Start();
}
public void BackgroundWork()
{
while(true)
{
this.Swim();
Thread.Sleep(1000);
}
}
public void Swim()
{
Console.WriteLine("The fish is Swimming");
}
}
The problem is that if I new up a Fish object anywhere, it never gets garbage collected, because there is a background thread referencing it. Here is an illustration of the broken usage:
public void DoStuff()
{
Fish f = new Fish();
}
// after exiting this method my Fish object keeps on swimming.
I know that the Fish object should be disposable and that I should clean up the thread on Dispose, but I have no control over my callers and cannot ensure Dispose is called.
How do I work around this problem and ensure the background threads are automatically disposed even if Dispose is not called explicitly?
Here is my proposed solution to this problem:
class Fish : IDisposable
{
class Swimmer
{
Thread t;
WeakReference fishRef;
public ManualResetEvent terminate = new ManualResetEvent(false);
public Swimmer(Fish fish)
{
this.fishRef = new WeakReference(fish);
t = new Thread(new ThreadStart(BackgroundWork));
t.IsBackground = true;
t.Start();
}
public void BackgroundWork()
{
bool done = false;
while(!done)
{
done = Swim();
if (!done)
{
done = terminate.WaitOne(1000, false);
}
}
}
// this is pulled out into a helper method to ensure
// the Fish object is referenced for the minimal amount of time
private bool Swim()
{
bool done;
Fish fish = Fish;
if (fish != null)
{
fish.Swim();
done = false;
}
else
{
done = true;
}
return done;
}
public Fish Fish
{
get { return fishRef.Target as Fish; }
}
}
Swimmer swimmer;
public Fish()
{
swimmer = new Swimmer(this);
}
public void Swim()
{
Console.WriteLine("The third fish is Swimming");
}
volatile bool disposed = false;
public void Dispose()
{
if (!disposed)
{
swimmer.terminate.Set();
disposed = true;
GC.SuppressFinalize(this);
}
}
~Fish()
{
if(!disposed)
{
Dispose();
}
}
}
I think the IDisposable solution is the correct one.
If the users of your class don't follow the guidelines for using classes that implement IDisposable, it's their fault - and you can make sure that the documentation explicitly mentions how the class should be used.
Another, much messier, option would be a "KeepAlive" DateTime field that each method called by your client updates. The worker thread checks the field periodically and exits if it hasn't been updated for a certain amount of time; when a client method next sets the field, the thread is restarted if it has exited. A rough sketch of that idea is below.
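A rough sketch of that keep-alive idea, reusing the Fish example (the KeepAliveTimeout value and the KeepAlive and Feed members are my own illustrative names, not part of the original code):
class Fish
{
    private static readonly TimeSpan KeepAliveTimeout = TimeSpan.FromSeconds(5);
    private DateTime lastKeepAlive = DateTime.UtcNow;
    private Thread worker;
    private readonly object sync = new object();
    // Every public method the client calls should go through this.
    private void KeepAlive()
    {
        lock (sync)
        {
            lastKeepAlive = DateTime.UtcNow;
            if (worker == null || !worker.IsAlive)
            {
                worker = new Thread(BackgroundWork) { IsBackground = true };
                worker.Start();
            }
        }
    }
    // Example client-facing method (hypothetical).
    public void Feed()
    {
        KeepAlive();
        // ... normal work ...
    }
    private void BackgroundWork()
    {
        while (true)
        {
            Swim();
            Thread.Sleep(1000);
            lock (sync)
            {
                // No client call for a while: let the thread exit, so the
                // object becomes collectable again.
                if (DateTime.UtcNow - lastKeepAlive > KeepAliveTimeout)
                    return;
            }
        }
    }
    public void Swim()
    {
        Console.WriteLine("The fish is Swimming");
    }
}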
This is how I would do it:
class Fish3 : IDisposable
{
Thread t;
private ManualResetEvent terminate = new ManualResetEvent(false);
private int disposed = 0; // accessed only through Interlocked
public Fish3()
{
t = new Thread(new ThreadStart(BackgroundWork));
t.IsBackground = true;
t.Start();
}
public void BackgroundWork()
{
while(!terminate.WaitOne(1000, false))
{
Swim();
}
}
public void Swim()
{
Console.WriteLine("The third fish is Swimming");
}
public void Dispose()
{
if(Interlocked.Exchange(ref disposed, 1) == 0)
{
terminate.Set();
t.Join();
GC.SuppressFinalize(this);
}
}
~Fish3()
{
    // Don't exchange the flag here as well, or Dispose would see it already
    // set and never signal the thread; Dispose does its own check.
    Dispose();
}
}
Related
Is it possible to lock a method for one thread and force another thread to go further rather than wait until the first thread finishes? Can this problem be solved with a static thread, or with some proper pattern using one instance of the service mentioned below?
For presentation purposes, it can be done with a static boolean like below.
public class SomeService
{
private readonly IRepository _repo;
public SomeService(IRepository repo)
{
_repo = repo;
}
private Thread threadOne;
public static bool isLocked { get; set; }
public void StartSomeMethod()
{
if(!isLocked)
{
threadOne = new Thread(SomeMethod);
isLocked = true;
threadOne.Start();
}
}
public void SomeMethod()
{
while(true)
{
// ... work that takes lots of time ...
}
...
isLocked = false;
}
}
I want to avoid the situation where the user accidentally clicks start twice and a second thread starts immediately after the first one finishes.
You can use lock :)
object locker = new object();
void MethodToLockForAThread()
{
lock(locker)
{
//put method body here
}
}
Now the result is that when this method is called by a thread (any thread), it puts something like a flag at the beginning of the lock: "STOP! You are not allowed to go any further, you must wait!" Like a red light at a crossroads.
When the thread that called this method first leaves the scope, that "red light" at the beginning of the scope changes to green.
If you don't want the method to be called at all while another thread is already executing it, you can add a bool flag. For example:
object locker = new object();
bool canAccess = true;
void MethodToLockForAThread()
{
if(!canAccess)
return;
lock(locker)
{
if(!canAccess)
return;
canAccess = false;
//put method body here
canAccess = true;
}
}
The second check of canAccess inside the lock scope is there because of what was pointed out in the comments. Now it really is thread safe. This is the kind of double-check protection that is advisable in a thread-safe singleton.
EDIT
After some discussion with mjwills I have to change my mind and lean more toward Monitor.TryEnter. You can use it like this:
object locker = new object();
void ThreadMethod()
{
if(Monitor.TryEnter(locker, TimeSpan.FromMilliseconds(1)))
{
try
{
//do the thread code
}
finally
{
Monitor.Exit(locker);
}
} else
return; //means that the lock has not been aquired
}
Now, the lock might not be acquired because of some exception or because some other thread has already acquired it. In the second parameter you can pass the time that a thread will wait to acquire the lock. I gave a short time here because you don't want the other thread to do the job while the first one is doing it.
So this solution seems the best.
When the other thread cannot acquire the lock, it will go further instead of waiting (well, it will wait for 1 millisecond).
Since lock is a language-level wrapper around the Monitor class, you need Monitor.TryEnter:
public class SomeService
{
    private readonly object lockObject = new object();
    public void StartSomeMethod()
    {
        // The lock is taken (and released) on the worker thread itself,
        // since a Monitor must be exited by the thread that entered it.
        new Thread(SomeMethod).Start();
    }
    public void SomeMethod()
    {
        if (!Monitor.TryEnter(lockObject))
        {
            return; // another thread is already running the method
        }
        try
        {
            // ...
        }
        finally
        {
            Monitor.Exit(lockObject);
        }
    }
}
You can use an AutoResetEvent instead of your isLocked flag.
AutoResetEvent autoResetEvent = new AutoResetEvent(true);
public void StartSomeMethod()
{
if(autoResetEvent.WaitOne(0))
{
//start thread
}
}
public void SomeMethod()
{
try
{
//Do your work
}
finally
{
autoResetEvent.Set();
}
}
I need a synchronizing class that behaves exactly like the AutoResetEvent class, but with one minor exception:
A call to the Set() method must release all waiting threads, and not just one.
How can I construct such a class? I am simply out of ideas.
Martin.
So you have multiple threads doing a .WaitOne() and you want to release them?
Use the ManualResetEvent class and all the waiting threads should release...
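A minimal sketch of that (illustrative only, not taken from the question's code): several threads block on one ManualResetEvent, and a single Set releases all of them.
ManualResetEvent gate = new ManualResetEvent(false);
for (int i = 0; i < 3; i++)
{
    int id = i;
    new Thread(() =>
    {
        gate.WaitOne();                              // all three threads block here
        Console.WriteLine("Thread {0} released", id);
    }).Start();
}
Thread.Sleep(500);
gate.Set();      // releases every waiter and stays signalled
// gate.Reset(); // call this if you want the event to block threads again later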
Thank you very much for all your thoughts and inputs, which I have read with great interest. I did some more searching here on Stack Overflow, and suddenly I found this, which turned out to be just what I was looking for. By cutting it down to just the two methods I need, I ended up with this small class:
public sealed class Signaller
{
public void PulseAll()
{
lock (_lock)
{
Monitor.PulseAll(_lock);
}
}
public bool Wait(TimeSpan maxWaitTime)
{
lock (_lock)
{
return Monitor.Wait(_lock, maxWaitTime);
}
}
private readonly object _lock = new object();
}
and it does exactly what it should! I'm amazed that a solution could be that simple, and I love such simplicity. It's beautiful. Thank you, Matthew Watson!
Martin.
Two things you might try.
One is to use a Barrier object, conditionally adding threads to it and signalling them.
The other might be a publisher/subscriber setup like in Rx: each thread waits on an object that it adds to a collection, and when you want to call "Set" you loop over a snapshot of that collection, calling Set on each member. A rough sketch of that second idea follows.
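A rough sketch of that publisher/subscriber idea (the Broadcaster name and its methods are my own, not an existing API; ManualResetEventSlim needs .NET 4 or later):
public class Broadcaster
{
    private readonly List<ManualResetEventSlim> waiters = new List<ManualResetEventSlim>();
    private readonly object sync = new object();
    public void Wait()
    {
        var handle = new ManualResetEventSlim(false);
        lock (sync) { waiters.Add(handle); }
        handle.Wait();                       // released when SetAll signals this handle
        lock (sync) { waiters.Remove(handle); }
        handle.Dispose();
    }
    public void SetAll()
    {
        ManualResetEventSlim[] snapshot;
        lock (sync) { snapshot = waiters.ToArray(); }
        foreach (var handle in snapshot)
            handle.Set();                    // every thread currently waiting is released
    }
}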
Or you could try bears.
If the event is being referenced by all threads in a common field or property, you could replace the common field or property with a new non-signaled event and then signal the old one. It has some cost to it since you'll be regularly creating new synchronization objects, but it would work. Here's an example of how I would do that:
public static class Example
{
private static volatile bool stopRunning;
private static ReleasingAutoResetEvent myEvent;
public static void RunExample()
{
using (Example.myEvent = new ReleasingAutoResetEvent())
{
WaitCallback work = new WaitCallback(WaitThread);
for (int i = 0; i < 5; ++i)
{
ThreadPool.QueueUserWorkItem(work, i.ToString());
}
Thread.Sleep(500);
for (int i = 0; i < 3; ++i)
{
Example.myEvent.Set();
Thread.Sleep(5000);
}
Example.stopRunning = true;
Example.myEvent.Set();
}
}
private static void WaitThread(object state)
{
while (!Example.stopRunning)
{
Example.myEvent.WaitOne();
Console.WriteLine("Thread {0} is released!", state);
}
}
}
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile ManualResetEvent manualResetEvent = new ManualResetEvent(false);
public void Set()
{
ManualResetEvent eventToSet = this.manualResetEvent;
this.manualResetEvent = new ManualResetEvent(false);
eventToSet.Set();
eventToSet.Dispose();
}
public bool WaitOne()
{
return this.manualResetEvent.WaitOne();
}
public bool WaitOne(int millisecondsTimeout)
{
return this.manualResetEvent.WaitOne(millisecondsTimeout);
}
public bool WaitOne(TimeSpan timeout)
{
return this.manualResetEvent.WaitOne(timeout);
}
public void Dispose()
{
this.manualResetEvent.Dispose();
}
}
Another, more lightweight, solution that uses the Monitor class to lock and unlock objects is below. However, I'm not as happy with the cleanup story for this version of ReleasingAutoResetEvent, since Monitor may hold a reference to it and keep it alive indefinitely if it is not properly disposed.
There are a few limitations/gotchas with this implementation. First, the thread that creates this object will be the only one that will be able to signal it with a call to Set; other threads that attempt to do the same thing will receive a SynchronizationLockException. Second, the thread that created it will never be able to wait on it successfully since it already owns the lock. This will only be an effective solution if you have exactly one controlling thread and several other waiting threads.
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile object lockObject = new object();
public ReleasingAutoResetEvent()
{
Monitor.Enter(this.lockObject);
}
public void Set()
{
object objectToSignal = this.lockObject;
object objectToLock = new object();
Monitor.Enter(objectToLock);
this.lockObject = objectToLock;
Monitor.Exit(objectToSignal);
}
public void WaitOne()
{
object objectToMonitor = this.lockObject;
Monitor.Enter(objectToMonitor);
Monitor.Exit(objectToMonitor);
}
public bool WaitOne(int millisecondsTimeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, millisecondsTimeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public bool WaitOne(TimeSpan timeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, timeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public void Dispose()
{
Monitor.Exit(this.lockObject);
}
}
I'm trying to implement a basic Future class (yes, I know about Task, but this is for educational purposes) and ran into strange behavior of the Monitor class. The class enters the lock in its constructor and queues an action to the thread pool which exits the lock when it finishes. The Result getter checks an instance variable to see whether the action has completed; if it hasn't, it enters the lock and then returns the result. The problem is that in practice the Result getter doesn't wait for the queued action to finish and proceeds anyway, leading to incorrect results. Here's the code.
// The class itself
public class Future<T>
{
private readonly Func<T> _f;
private volatile bool _complete = false;
private T _result;
private Exception _error = new Exception("WTF");
private volatile bool _success = false;
private readonly ConcurrentStack<Action<T>> _callbacks = new ConcurrentStack<Action<T>>();
private readonly ConcurrentStack<Action<Exception>> _errbacks = new ConcurrentStack<Action<Exception>>();
private readonly object _lock = new object();
public Future(Func<T> f)
{
_f = f;
Monitor.Enter(_lock);
ThreadPool.QueueUserWorkItem(Run);
}
public void OnSuccess(Action<T> a)
{
_callbacks.Push(a);
if (_complete && _success)
a(_result);
}
public void OnError(Action<Exception> a)
{
_errbacks.Push(a);
if (_complete && !_success)
a(_error);
}
private void Run(object state)
{
try {
_result = _f();
_success = true;
_complete = true;
foreach (var cb in _callbacks) {
cb(_result);
}
} catch (Exception e) {
_error = e;
_complete = true;
foreach (var cb in _errbacks) {
cb(e);
}
} finally {
Monitor.Exit(_lock);
}
}
public T Result {
get {
if (!_complete) {
Monitor.Enter(_lock);
}
if (_success) {
return _result;
} else {
Console.WriteLine("Throwing error complete={0} success={1}", _complete, _success);
throw _error;
}
}
}
// Failing test
public void TestResultSuccess() {
var f = new Future<int>(() => 1);
var x = f.Result;
Assert.AreEqual (1, x);
}
I'm using Mono 3.2.3 on Mac OS X 10.9.
Only the thread that took the lock can exit the lock. You can't Enter it in the constructor on the calling thread and then Exit from the thread pool when the work completes; the thread-pool worker does not own the lock.
And conversely: presumably it is the same thread that created the future that is accessing the getter, and that thread is allowed to Enter again because Monitor is re-entrant. Also, you need to Exit the same number of times that you Enter, otherwise the lock isn't actually released.
Basically, I don't think Monitor is the right approach here.
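For what it's worth, a wait handle is usually a more natural fit here than Monitor, because any thread may wait on it and the pool thread may signal it. A stripped-down sketch of the same Future (my own rewrite, omitting the callback lists from the question):
public class Future<T>
{
    private readonly ManualResetEventSlim _done = new ManualResetEventSlim(false);
    private T _result;
    private Exception _error;
    public Future(Func<T> f)
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try { _result = f(); }
            catch (Exception e) { _error = e; }
            finally { _done.Set(); }   // signalling from the worker thread is fine
        });
    }
    public T Result
    {
        get
        {
            _done.Wait();              // any caller thread may block here until Set
            if (_error != null) throw _error;
            return _result;
        }
    }
}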
I have one thread that is sending data stored in a buffer of type List<string> via TCP. Another thread is writing into the buffer. As I am not very familiar with C#, I'd like to know how to use lock or Mutex correctly.
This is the code I'd like to use eventually:
while(buffer.isLocked())
{
buffer.wait();
}
buffer.lockBuffer();
buffer.add(tcpPacket);
buffer.unlockBuffer();
buffer.notify();
This is my current code. I hope someone can help me complete it.
public class Buffer
{
private Mutex mutex;
private List<string> buffer;
private bool locked = false;
public Buffer()
{
mutex = new Mutex(false);
buffer = new List<string>();
}
public bool isLocked()
{
return locked;
}
public void lockBuffer()
{
if (!locked)
{
//...
locked = true;
}
}
public void unlockBuffer()
{
if(locked)
{
mutex.ReleaseMutex();
locked = false;
}
}
public void wait()
{
mutex.WaitOne();
}
public void notify()
{
//...
}
}
It would be better to use System.Collections.Concurrent.BlockingCollection. It doesn't require any external synchronization.
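For example, roughly (tcpPacket and Send stand in for whatever your receiving code and TCP send routine provide):
var buffer = new System.Collections.Concurrent.BlockingCollection<string>();
// Producer thread:
buffer.Add(tcpPacket);
// Consumer (sending) thread; blocks until items are available:
foreach (string packet in buffer.GetConsumingEnumerable())
{
    Send(packet);
}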
For those who don't use .NET 4.0:
using System;
using System.Collections.Generic;
using System.Threading;
namespace MyCollections
{
public class BlockingQueue<T> : IDisposable
{
Queue<T> _Queue = new Queue<T>();
SemaphoreSlim _ItemsInQueue = null;
SemaphoreSlim _FreeSlots = null;
int _MaxItems = -1;
public BlockingQueue(int maxItems=Int32.MaxValue)
{
_MaxItems = maxItems;
_ItemsInQueue = new SemaphoreSlim(0, maxItems);
_FreeSlots = new SemaphoreSlim(maxItems, maxItems);
}
public void Dispose()
{
if (_ItemsInQueue != null) _ItemsInQueue.Dispose();
if (_FreeSlots != null) _FreeSlots.Dispose();
}
public int Count
{
get { return _ItemsInQueue.CurrentCount; }
}
public void Add(T item)
{
if(_MaxItems != Int32.MaxValue) _FreeSlots.Wait();
lock (this)
{
_Queue.Enqueue(item);
_ItemsInQueue.Release();
}
}
public T Take()
{
T item = default(T);
_ItemsInQueue.Wait();
lock (this)
{
item = _Queue.Dequeue();
if (_MaxItems != Int32.MaxValue) _FreeSlots.Release();
}
return item;
}
}
}
The following code is not thread-safe. If two threads are entering this method at the same time, both might pass the if condition successfully.
public void lockBuffer()
{
if (!locked)
{
//...
locked = true;
}
}
You simply might want to do something like this:
lock (_sycnObject)
{
buffer.lockBuffer();
buffer.add(tcpPacket);
buffer.unlockBuffer();
buffer.notify();
}
I don't think you're doing something sophisticated that requires more than the simple-to-use lock statement.
I wouldn't use a Mutex, since I suppose you aren't dealing with cross-process synchronization. Locks are perfectly fine here and simpler to use:
class Buffer
{
private readonly object syncObject = new object();
private readonly List<string> buffer = new List<string>();
public void AddPacket(string packet)
{
lock (syncObject)
{
buffer.Add(packet);
}
}
public void Notify()
{
// Do something, if needed lock again here
// lock (syncObject)
// {
// Notify Implementation
// }
}
}
The usage is then simply (as you requested):
var myBuffer = new Buffer();
myBuffer.AddPacket("Hello, World!");
myBuffer.Notify();
How do I share data between different threads in C# without using static variables?
Can we create such a mechanism using attributes?
Will aspect-oriented programming help in such cases?
To achieve this, should all the different threads work on a single object?
You can't beat the simplicity of a locked message queue. I say don't waste your time with anything more complex.
Read up on the lock statement.
EDIT
Here is an example of the Microsoft Queue object wrapped so all actions against it are thread safe.
public class Queue<T>
{
/// <summary>Used as a lock target to ensure thread safety.</summary>
private readonly Object _Locker = new Object();
private readonly System.Collections.Generic.Queue<T> _Queue = new System.Collections.Generic.Queue<T>();
/// <summary></summary>
public void Enqueue(T item)
{
lock (_Locker)
{
_Queue.Enqueue(item);
}
}
/// <summary>Enqueues a collection of items into this queue.</summary>
public virtual void EnqueueRange(IEnumerable<T> items)
{
lock (_Locker)
{
if (items == null)
{
return;
}
foreach (T item in items)
{
_Queue.Enqueue(item);
}
}
}
/// <summary></summary>
public T Dequeue()
{
lock (_Locker)
{
return _Queue.Dequeue();
}
}
/// <summary></summary>
public void Clear()
{
lock (_Locker)
{
_Queue.Clear();
}
}
/// <summary></summary>
public Int32 Count
{
get
{
lock (_Locker)
{
return _Queue.Count;
}
}
}
/// <summary></summary>
public Boolean TryDequeue(out T item)
{
lock (_Locker)
{
if (_Queue.Count > 0)
{
item = _Queue.Dequeue();
return true;
}
else
{
item = default(T);
return false;
}
}
}
}
EDIT 2
I hope this example helps.
Remember this is bare bones.
Using these basic ideas you can safely harness the power of threads.
public class WorkState
{
private readonly Object _Lock = new Object();
private Int32 _State;
public Int32 GetState()
{
lock (_Lock)
{
return _State;
}
}
public void UpdateState()
{
lock (_Lock)
{
_State++;
}
}
}
public class Worker
{
private readonly WorkState _State;
private readonly Thread _Thread;
private volatile Boolean _KeepWorking;
public Worker(WorkState state)
{
_State = state;
_Thread = new Thread(DoWork);
_KeepWorking = true;
}
public void DoWork()
{
while (_KeepWorking)
{
_State.UpdateState();
}
}
public void StartWorking()
{
_Thread.Start();
}
public void StopWorking()
{
_KeepWorking = false;
}
}
private void Execute()
{
WorkState state = new WorkState();
Worker worker = new Worker(state);
worker.StartWorking();
while (true)
{
if (state.GetState() > 100)
{
worker.StopWorking();
break;
}
}
}
You can pass an object as an argument to Thread.Start and use it as shared data storage between the started thread and the initiating thread. A sketch of that is below.
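Something along these lines works (a minimal sketch; the dictionary is just a stand-in for whatever you want to share, and it needs System.Threading and System.Collections.Generic):
var shared = new Dictionary<string, int>();
Thread t = new Thread(obj =>
{
    var data = (Dictionary<string, int>)obj;   // same instance the main thread holds
    lock (data) { data["count"] = 1; }
});
t.Start(shared);
t.Join();
lock (shared) { Console.WriteLine(shared["count"]); }   // prints 1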
You can also just directly access (with the appropriate locking of course) your data members, if you started the thread using the instance form of the ThreadStart delegate.
You can't use attributes to create shared data between threads. You could use attribute instances attached to your class as data storage, but I fail to see how that would be better than using static or instance data members.
Look at the following example code:
public class MyWorker
{
    public SharedData state;
    public MyWorker(SharedData someData)
    {
        this.state = someData;
    }
    public void DoWork()
    {
        while (true)
        {
            // both workers read and write the same SharedData instance here
            state.MyX++;
        }
    }
}
public class SharedData
{
    // the value the workers share (X in the original sketch; an int here
    // so the example compiles)
    public int MyX { get; set; }
}
public class Program
{
    public static void Main()
    {
        SharedData data = new SharedData();
        MyWorker work1 = new MyWorker(data);
        MyWorker work2 = new MyWorker(data);
        Thread thread = new Thread(new ThreadStart(work1.DoWork));
        thread.Start();
        Thread thread2 = new Thread(new ThreadStart(work2.DoWork));
        thread2.Start();
    }
}
In this case, the worker class MyWorker has a field state, and we initialise both workers with the same object. Now you can see that the two workers access the same SharedData object: changes made by one worker are visible to the other.
You have quite a few remaining issues. How does worker 2 know when changes have been made by worker 1 and vice-versa? How do you prevent conflicting changes? Maybe read: this tutorial.
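One common answer to the "conflicting changes" question is to put the lock inside the shared object itself; roughly, a locked variant of the SharedData above (a sketch, not the only way to do it):
public class SharedData
{
    private readonly object gate = new object();
    private int myX;
    public int GetX()
    {
        lock (gate) { return myX; }
    }
    public void Increment()       // compound updates stay atomic behind the lock
    {
        lock (gate) { myX++; }
    }
}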
When you start a thread you are executing a method of some chosen object. All members of that object are visible to the thread.
Worker myWorker = new Worker( /* arguments */ );
Thread myThread = new Thread(new ThreadStart(myWorker.doWork));
myThread.Start();
Your thread is now in the doWork() method and can see any members of myWorker, which may themselves be other objects. Now you just need to be careful to handle the cases where several threads hit those members at the same time.