I have a System.Timers.Timer object that I want to use, but I don't want the processing tied to the Timer to interfere with normal to high priority threads. In other words, I'd like to say that I want to process X every 5 seconds as long as nothing else is running.
How could I ensure that my Timer operations are running in a low-priority manner?
The nice thing about System.Timers.Timer is that you can assign a synchronizing object via the SynchronizingObject property and use it to run the Elapsed event on a thread whose priority you control.
Just assign an instance of the ElapsedEventReceiver to the SynchronizingObject property of your timer.
Disclaimer: I whipped this up pretty fast so you will need to add your own finishing touches to make it more robust.
using System;
using System.Collections.Concurrent;
using System.ComponentModel;
using System.Threading;
public class ElapsedEventReceiver : ISynchronizeInvoke
{
    private Thread m_Thread;
    private BlockingCollection<Message> m_Queue = new BlockingCollection<Message>();
    public ElapsedEventReceiver()
    {
        // All marshalled calls are executed on this single low-priority background thread.
        m_Thread = new Thread(Run);
        m_Thread.Priority = ThreadPriority.BelowNormal;
        m_Thread.IsBackground = true;
        m_Thread.Start();
    }
    private void Run()
    {
        while (true)
        {
            Message message = m_Queue.Take();
            message.Return = message.Method.DynamicInvoke(message.Args);
            message.Finished.Set();
        }
    }
    public IAsyncResult BeginInvoke(Delegate method, object[] args)
    {
        Message message = new Message();
        message.Method = method;
        message.Args = args;
        m_Queue.Add(message);
        return message;
    }
    public object EndInvoke(IAsyncResult result)
    {
        Message message = result as Message;
        if (message != null)
        {
            message.Finished.WaitOne();
            return message.Return;
        }
        throw new ArgumentException("result");
    }
    public object Invoke(Delegate method, object[] args)
    {
        Message message = new Message();
        message.Method = method;
        message.Args = args;
        m_Queue.Add(message);
        message.Finished.WaitOne();
        return message.Return;
    }
    public bool InvokeRequired
    {
        get { return Thread.CurrentThread != m_Thread; }
    }
    // Ferries a single delegate invocation (and its result) across to the worker thread.
    private class Message : IAsyncResult
    {
        public Delegate Method;
        public object[] Args;
        public object Return;
        public object State;
        public ManualResetEvent Finished = new ManualResetEvent(false);
        public object AsyncState
        {
            get { return State; }
        }
        public WaitHandle AsyncWaitHandle
        {
            get { return Finished; }
        }
        public bool CompletedSynchronously
        {
            get { return false; }
        }
        public bool IsCompleted
        {
            get { return Finished.WaitOne(0); }
        }
    }
}
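For completeness, here is a minimal sketch of how it might be wired up (DoLowPriorityWork is just a placeholder for your own method):
var receiver = new ElapsedEventReceiver();
var timer = new System.Timers.Timer(5000);
timer.SynchronizingObject = receiver; // Elapsed now runs on the receiver's below-normal thread
timer.Elapsed += (sender, e) => DoLowPriorityWork();
timer.Start();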
Probably the easiest way would be to have a "busy" flag (or count), and ignore the timer ticks as long as it's non-zero.
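A minimal sketch of that idea, using Interlocked rather than a plain bool so overlapping ticks are handled safely (OnElapsed and DoWork are placeholder names; assumes using System.Threading and System.Timers):
private static int _busy; // 0 = idle, non-zero = work in progress
private static void OnElapsed(object sender, ElapsedEventArgs e)
{
    // Skip this tick entirely if the previous one is still running.
    if (Interlocked.CompareExchange(ref _busy, 1, 0) != 0)
        return;
    try
    {
        DoWork(); // the actual periodic work
    }
    finally
    {
        Interlocked.Exchange(ref _busy, 0);
    }
}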
P.S. Changing thread priorities is not recommended. It's hardly ever necessary.
I have a class that constantly refreshes the devices physically connected to the PC via USB. The monitoring method runs on a thread that checks a _monitoring flag, and the Start and Stop methods just set and clear that flag.
My current problem is: while the thread is running I get the expected "busy" and "not busy" console prints, but when I call the Stop method it spins in while(_busy) forever, because somehow the _monitoringThread seems to have stopped running!
I suspect it stops because the last print is always "busy", that is, ExecuteMonitoring runs partway and then nothing more happens (at least nothing I can see).
Pausing the debugger and looking at the stack trace didn't help either, because execution is always sitting in the while(_busy) loop inside the Stop() method, forever.
public class DeviceMonitor
{
bool _running;
bool _monitoring;
bool _busy = false;
MonitoringMode _monitoringMode;
Thread _monitoringThread;
readonly object _lockObj = new object();
// CONSTRUCTOR
public DeviceMonitor()
{
_monitoringThread = new Thread(new ThreadStart(ExecuteMonitoring));
_monitoringThread.IsBackground = true;
_running = true;
_monitoringThread.Start();
}
public void Start()
{
_monitoring = true;
}
public void Stop()
{
_monitoring = false;
while (_busy)
{
Thread.Sleep(5);
}
}
void ExecuteMonitoring()
{
while (_running)
{
Console.WriteLine("ExecuteMonitoring()");
if (_monitoring)
{
lock (_lockObj)
{
_busy = true;
}
Console.WriteLine("busy");
if (_monitoringMode == MonitoringMode.SearchDevices)
{
SearchDevices();
}
else
if (_monitoringMode == MonitoringMode.MonitorDeviceConnection)
{
MonitorDeviceConnection();
}
lock (_lockObj)
{
_busy = false;
}
Console.WriteLine("not busy");
}
Thread.Sleep(1000);
_busy = false;
}
}
private void SearchDevices()
{
var connected = ListDevices();
if (connected.Count > 0)
{
Device = connected.First();
ToggleMonitoringMode();
}
else
Device = null;
}
void MonitorDeviceConnection()
{
if (Device == null)
{
ToggleMonitoringMode();
}
else
{
bool responding = Device.isConnected;
Console.WriteLine("responding " + responding);
if (!responding)
{
Device = null;
ToggleMonitoringMode();
}
}
}
void ToggleMonitoringMode()
{
if (_monitoringMode == MonitoringMode.SearchDevices)
_monitoringMode = MonitoringMode.MonitorDeviceConnection;
else
if (_monitoringMode == MonitoringMode.MonitorDeviceConnection)
_monitoringMode = MonitoringMode.SearchDevices;
}
enum MonitoringMode
{
SearchDevices,
MonitorDeviceConnection
}
}
The most likely explanation is optimization: the compiler sees that _busy is never changed inside the Stop method, so it is allowed to turn this into an endless loop by reading _busy only once. This is valid because the _busy field is not marked as volatile, and so the optimizer does not have to assume changes made on another thread.
So, try marking _busy as volatile. Or, even better - actually A LOT BETTER - use a ManualResetEvent:
ManualResetEvent _stopMonitoring = new ManualResetEvent(false);
ManualResetEvent _monitoringStopped = new ManualResetEvent(false);
ManualResetEvent _stopRunning = new ManualResetEvent(false);
public void Stop()
{
    _stopMonitoring.Set();
    _monitoringStopped.WaitOne();
}
void ExecuteMonitoring()
{
    while (!_stopRunning.WaitOne(0))
    {
        Console.WriteLine("ExecuteMonitoring()");
        if (!_stopMonitoring.WaitOne(0))
        {
            _monitoringStopped.Reset();
            // ...
        }
        _monitoringStopped.Set();
        Thread.Sleep(1000);
    }
}
Code is from memory, might contain some typos.
I need a synchronizing class that behaves exactly like the AutoResetEvent class, but with one minor exception:
A call to the Set() method must release all waiting threads, and not just one.
How can I construct such a class? I am simply out of ideas.
Martin.
So you have multiple threads doing a .WaitOne() and you want to release them?
Use the ManualResetEvent class and all the waiting threads should be released...
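A minimal sketch of that, assuming the waiters only need to be released together and re-armed manually (assumes using System and System.Threading):
var gate = new ManualResetEvent(false);
for (int i = 0; i < 3; i++)
{
    new Thread(() =>
    {
        gate.WaitOne(); // all of these block here...
        Console.WriteLine("released");
    }).Start();
}
Thread.Sleep(100); // demo only: give the threads time to reach WaitOne
gate.Set();   // ...and all of them are released at once
gate.Reset(); // re-arm manually if you need to wait again later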
Thank you very much for all your thoughts and inputs, which I have read with great interest. I did some more searching here on Stack Overflow, and suddenly I found this, which turned out to be just what I was looking for. By cutting it down to just the two methods I need, I ended up with this small class:
public sealed class Signaller
{
public void PulseAll()
{
lock (_lock)
{
Monitor.PulseAll(_lock);
}
}
public bool Wait(TimeSpan maxWaitTime)
{
lock (_lock)
{
return Monitor.Wait(_lock, maxWaitTime);
}
}
private readonly object _lock = new object();
}
and it does exactly what it should! I'm amazed that a solution could be that simple, and I love such simplicity. It's beautiful. Thank you, Matthew Watson!
Martin.
Two things you might try.
Use a Barrier object, conditionally adding threads to it and signaling them.
The other option is a publisher/subscriber setup like in Rx: each thread waits on an object that it adds to a collection, and when you want to call 'set' you loop over a snapshot of that collection, calling set on each member.
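A rough sketch of that publisher/subscriber shape (not Rx itself, just the idea; assumes using System.Collections.Concurrent and System.Threading):
public class Broadcaster
{
    private readonly ConcurrentQueue<ManualResetEventSlim> _waiters =
        new ConcurrentQueue<ManualResetEventSlim>();
    public void Wait()
    {
        var waiter = new ManualResetEventSlim(false);
        _waiters.Enqueue(waiter);
        waiter.Wait(); // released when SetAll signals it
    }
    public void SetAll()
    {
        // Work through the waiters registered so far; later arrivals
        // will be picked up by the next SetAll call.
        ManualResetEventSlim waiter;
        while (_waiters.TryDequeue(out waiter))
            waiter.Set();
    }
}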
Or you could try bears.
If the event is being referenced by all threads in a common field or property, you could replace the common field or property with a new non-signaled event and then signal the old one. It has some cost to it since you'll be regularly creating new synchronization objects, but it would work. Here's an example of how I would do that:
public static class Example
{
private static volatile bool stopRunning;
private static ReleasingAutoResetEvent myEvent;
public static void RunExample()
{
using (Example.myEvent = new ReleasingAutoResetEvent())
{
WaitCallback work = new WaitCallback(WaitThread);
for (int i = 0; i < 5; ++i)
{
ThreadPool.QueueUserWorkItem(work, i.ToString());
}
Thread.Sleep(500);
for (int i = 0; i < 3; ++i)
{
Example.myEvent.Set();
Thread.Sleep(5000);
}
Example.stopRunning = true;
Example.myEvent.Set();
}
}
private static void WaitThread(object state)
{
while (!Example.stopRunning)
{
Example.myEvent.WaitOne();
Console.WriteLine("Thread {0} is released!", state);
}
}
}
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile ManualResetEvent manualResetEvent = new ManualResetEvent(false);
public void Set()
{
ManualResetEvent eventToSet = this.manualResetEvent;
this.manualResetEvent = new ManualResetEvent(false);
eventToSet.Set();
eventToSet.Dispose();
}
public bool WaitOne()
{
return this.manualResetEvent.WaitOne();
}
public bool WaitOne(int millisecondsTimeout)
{
return this.manualResetEvent.WaitOne(millisecondsTimeout);
}
public bool WaitOne(TimeSpan timeout)
{
return this.manualResetEvent.WaitOne(timeout);
}
public void Dispose()
{
this.manualResetEvent.Dispose();
}
}
Below is another, more lightweight solution you could try, which uses the Monitor class to lock and unlock objects. However, I'm not as happy with the cleanup story for this version of ReleasingAutoResetEvent, since Monitor may hold a reference to it and keep it alive indefinitely if it is not properly disposed.
There are a few limitations/gotchas with this implementation. First, the thread that creates this object will be the only one that will be able to signal it with a call to Set; other threads that attempt to do the same thing will receive a SynchronizationLockException. Second, the thread that created it will never be able to wait on it successfully since it already owns the lock. This will only be an effective solution if you have exactly one controlling thread and several other waiting threads.
public static class Example
{
private static volatile bool stopRunning;
private static ReleasingAutoResetEvent myEvent;
public static void RunExample()
{
using (Example.myEvent = new ReleasingAutoResetEvent())
{
WaitCallback work = new WaitCallback(WaitThread);
for (int i = 0; i < 5; ++i)
{
ThreadPool.QueueUserWorkItem(work, i.ToString());
}
Thread.Sleep(500);
for (int i = 0; i < 3; ++i)
{
Example.myEvent.Set();
Thread.Sleep(5000);
}
Example.stopRunning = true;
Example.myEvent.Set();
}
}
private static void WaitThread(object state)
{
while (!Example.stopRunning)
{
Example.myEvent.WaitOne();
Console.WriteLine("Thread {0} is released!", state);
}
}
}
public sealed class ReleasingAutoResetEvent : IDisposable
{
private volatile object lockObject = new object();
public ReleasingAutoResetEvent()
{
Monitor.Enter(this.lockObject);
}
public void Set()
{
object objectToSignal = this.lockObject;
object objectToLock = new object();
Monitor.Enter(objectToLock);
this.lockObject = objectToLock;
Monitor.Exit(objectToSignal);
}
public void WaitOne()
{
object objectToMonitor = this.lockObject;
Monitor.Enter(objectToMonitor);
Monitor.Exit(objectToMonitor);
}
public bool WaitOne(int millisecondsTimeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, millisecondsTimeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public bool WaitOne(TimeSpan timeout)
{
object objectToMonitor = this.lockObject;
bool succeeded = Monitor.TryEnter(objectToMonitor, timeout);
if (succeeded)
{
Monitor.Exit(objectToMonitor);
}
return succeeded;
}
public void Dispose()
{
Monitor.Exit(this.lockObject);
}
}
I'm trying to implement a basic Future class (yes, I know about Task, but this is for educational purposes) and ran into strange behavior of the Monitor class. The class enters the lock in its constructor and queues an action to the thread pool that exits the lock when it finishes. The Result getter checks an instance variable to see whether the action has completed; if it hasn't, it enters the lock and then returns the result. The problem is that the Result getter doesn't actually wait for the queued action to finish and proceeds anyway, leading to incorrect results. Here's the code.
// The class itself
public class Future<T>
{
private readonly Func<T> _f;
private volatile bool _complete = false;
private T _result;
private Exception _error = new Exception("WTF");
private volatile bool _success = false;
private readonly ConcurrentStack<Action<T>> _callbacks = new ConcurrentStack<Action<T>>();
private readonly ConcurrentStack<Action<Exception>> _errbacks = new ConcurrentStack<Action<Exception>>();
private readonly object _lock = new object();
public Future(Func<T> f)
{
_f = f;
Monitor.Enter(_lock);
ThreadPool.QueueUserWorkItem(Run);
}
public void OnSuccess(Action<T> a)
{
_callbacks.Push(a);
if (_complete && _success)
a(_result);
}
public void OnError(Action<Exception> a)
{
_errbacks.Push(a);
if (_complete && !_success)
a(_error);
}
private void Run(object state)
{
try {
_result = _f();
_success = true;
_complete = true;
foreach (var cb in _callbacks) {
cb(_result);
}
} catch (Exception e) {
_error = e;
_complete = true;
foreach (var cb in _errbacks) {
cb(e);
}
} finally {
Monitor.Exit(_lock);
}
}
public T Result {
get {
if (!_complete) {
Monitor.Enter(_lock);
}
if (_success) {
return _result;
} else {
Console.WriteLine("Throwing error complete={0} success={1}", _complete, _success);
throw _error;
}
}
}
// Failing test
public void TestResultSuccess() {
var f = new Future<int>(() => 1);
var x = f.Result;
Assert.AreEqual (1, x);
}
I'm using Mono 3.2.3 on Mac OS X 10.9.
Only the thread that took the lock can exit the lock. You can't Enter it in the constructor on the calling thread then Exit from the thread-pool when it completes - the thread-pool worker does not have the lock.
And conversely: presumably the thread that created the future is the same one accessing the getter, and that thread is allowed to Enter again - the lock is re-entrant. Also, you need to Exit the same number of times that you Enter, otherwise the lock isn't actually released.
Basically, I don't think Monitor is the right approach here.
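One sketch of how the waiting could be done instead, using a ManualResetEventSlim that the worker sets when it finishes (field names follow the question's code; the callback handling is omitted for brevity, so this is an outline rather than a drop-in fix):
private readonly ManualResetEventSlim _done = new ManualResetEventSlim(false);
private void Run(object state)
{
    try
    {
        _result = _f();
        _success = true;
    }
    catch (Exception e)
    {
        _error = e;
    }
    finally
    {
        _complete = true;
        _done.Set(); // wake up anyone blocked in the Result getter
    }
}
public T Result
{
    get
    {
        _done.Wait(); // safe to call from any thread, any number of times
        if (_success)
            return _result;
        throw _error;
    }
}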
All the examples I have seen of SynchronizationContext.Post use it within the same class. What I have is the UI thread passing some by-ref arguments to a thread-wrapper class, which updates the arguments; I then want it to update some labels etc. on the UI thread.
internal class ConnThreadWrapper
{
....
public event EventHandler<MyEventArgs<String, Boolean>> updateConnStatus =
delegate { };
public void updateUIThread(string conn, bool connected)
{
uiContext.Post(new SendOrPostCallback((o) =>
{
updateConnStatus(this,
new MyEventArgs<String, Boolean>(conn,
connected));
}),
null);
}
}
//on ui thread
public void updateConnStatus(object sender, MyEventArgs<String, Boolean> e)
{
switch (e.val1)
{
case "CADS" :
if (e.val2 == true)
{
// update the labels for the CADS connection here ...
}
break;
}
}
The event seems to fire without any errors, but nothing is ever received on the UI thread. I'm not sure whether my signature for the updateConnStatus handler is correct or whether it can even work like this. I obviously want the event to be handled on the UI thread so I can update the labels from that handler.
In a previous VB.NET project I referenced the form directly from the thread and used a delegate to invoke a callback, but apparently that was bad design because I was mixing application layers. I wanted to use the synchronization context since it is meant to be thread safe, but most of the examples I've seen use Invoke.
Any ideas what I'm missing? Thanks
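(For context, uiContext in the wrapper is assumed to have been captured on the UI thread, roughly as sketched below; the constructor shape is illustrative. If it were captured on the worker thread instead, Post would not marshal back to the UI.)
// In the form, on the UI thread:
var wrapper = new ConnThreadWrapper(SynchronizationContext.Current);
wrapper.updateConnStatus += updateConnStatus; // subscribe the UI handler
// In ConnThreadWrapper:
private readonly SynchronizationContext uiContext;
public ConnThreadWrapper(SynchronizationContext context)
{
    uiContext = context; // must be the UI thread's context
}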
I wrote this helper class, which works for me. Before using it, call InitializeUiContext() on the UI thread somewhere during application start-up.
public static class UiScheduler
{
private static TaskScheduler _scheduler;
private static readonly ConcurrentQueue<Action> OldActions =
new ConcurrentQueue<Action>();
public static void InitializeUiContext()
{
_scheduler = TaskScheduler.FromCurrentSynchronizationContext();
}
private static void ExecuteOld()
{
if(_scheduler != null)
{
while(OldActions.Count > 0)
{
Action a;
if(OldActions.TryDequeue(out a))
{
UiExecute(_scheduler, a);
}
}
}
}
private static void UiExecute(TaskScheduler scheduler,
Action a,
bool wait = false)
{
//1 is usually UI thread, dunno how to check this better:
if (Thread.CurrentThread.ManagedThreadId == 1)
{
a();
}
else
{
Task t = Task.Factory.StartNew(a,
CancellationToken.None,
TaskCreationOptions.LongRunning,
scheduler);
if (wait) t.Wait();
}
}
public static void UiExecute(Action a, bool wait = false)
{
if (a != null)
{
if (_scheduler != null)
{
ExecuteOld();
UiExecute(_scheduler, a, wait);
}
else
{
OldActions.Enqueue(a);
}
}
}
}
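Usage would then look something like this (statusLabel and progressBar are just illustrative control names):
// On the UI thread, e.g. in the form's constructor or application start-up:
UiScheduler.InitializeUiContext();
// Later, from any worker thread:
UiScheduler.UiExecute(() => statusLabel.Text = "Connected");
// Or block until the UI update has actually run:
UiScheduler.UiExecute(() => progressBar.Value = 100, wait: true);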
In the end I ditched the ThreadWrapper and the attempt to marshal the event to the UI thread, and used a Task instead. In fact I think I can use Tasks for most of the work in this project, so happy days.
Task<bool> t1 = new Task<bool>(() => testBB(ref _bbws_wrapper));
t1.Start();
Task cwt1 = t1.ContinueWith(task => { if (t1.Result == true) { this.ssi_bb_conn.BackColor = Color.Green;} else { this.ssi_bb_conn.BackColor = Color.Red; } }, TaskScheduler.FromCurrentSynchronizationContext());
.....
private static bool testBB(ref BBWebserviceWrapper _bbwsw)
{
try
{
//test the connections
if (_bbwsw.initialize_v1() == true)
{
if (_bbwsw.loginUser("XXXXXXXX", "XXXXXXXXX") == true)
{
return true;
}
else
{
return false;
}
}
else
{
return false;
}
}
catch
{
return false;
}
}
How can I share data between different threads in C# without using static variables?
Can we create such a mechanism using attributes?
Will aspect-oriented programming help in such cases?
To achieve this, should all the different threads work on a single object?
You can't beat the simplicity of a locked message queue. I say don't waste your time with anything more complex.
Read up on the lock statement.
EDIT
Here is an example of the Microsoft Queue object wrapped so all actions against it are thread safe.
public class Queue<T>
{
/// <summary>Used as a lock target to ensure thread safety.</summary>
private readonly object _Locker = new object();
private readonly System.Collections.Generic.Queue<T> _Queue = new System.Collections.Generic.Queue<T>();
/// <summary></summary>
public void Enqueue(T item)
{
lock (_Locker)
{
_Queue.Enqueue(item);
}
}
/// <summary>Enqueues a collection of items into this queue.</summary>
public virtual void EnqueueRange(IEnumerable<T> items)
{
lock (_Locker)
{
if (items == null)
{
return;
}
foreach (T item in items)
{
_Queue.Enqueue(item);
}
}
}
/// <summary></summary>
public T Dequeue()
{
lock (_Locker)
{
return _Queue.Dequeue();
}
}
/// <summary></summary>
public void Clear()
{
lock (_Locker)
{
_Queue.Clear();
}
}
/// <summary></summary>
public Int32 Count
{
get
{
lock (_Locker)
{
return _Queue.Count;
}
}
}
/// <summary></summary>
public Boolean TryDequeue(out T item)
{
lock (_Locker)
{
if (_Queue.Count > 0)
{
item = _Queue.Dequeue();
return true;
}
else
{
item = default(T);
return false;
}
}
}
}
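A quick sketch of how two threads might share it (producer/consumer names are illustrative; note this is the wrapper above, not System.Collections.Generic.Queue):
var queue = new Queue<string>();
var producer = new Thread(() =>
{
    for (int i = 0; i < 10; i++)
        queue.Enqueue("message " + i);
});
var consumer = new Thread(() =>
{
    string item;
    for (int received = 0; received < 10; )
    {
        if (queue.TryDequeue(out item))
        {
            Console.WriteLine(item);
            received++;
        }
        else
        {
            Thread.Sleep(10); // nothing queued yet, back off briefly
        }
    }
});
producer.Start();
consumer.Start();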
EDIT 2
I hope this example helps.
Remember this is bare bones.
Using these basic ideas you can safely harness the power of threads.
public class WorkState
{
private readonly Object _Lock = new Object();
private Int32 _State;
public Int32 GetState()
{
lock (_Lock)
{
return _State;
}
}
public void UpdateState()
{
lock (_Lock)
{
_State++;
}
}
}
public class Worker
{
private readonly WorkState _State;
private readonly Thread _Thread;
private volatile Boolean _KeepWorking;
public Worker(WorkState state)
{
_State = state;
_Thread = new Thread(DoWork);
_KeepWorking = true;
}
public void DoWork()
{
while (_KeepWorking)
{
_State.UpdateState();
}
}
public void StartWorking()
{
_Thread.Start();
}
public void StopWorking()
{
_KeepWorking = false;
}
}
private void Execute()
{
WorkState state = new WorkState();
Worker worker = new Worker(state);
worker.StartWorking();
while (true)
{
if (state.GetState() > 100)
{
worker.StopWorking();
break;
}
}
}
You can pass an object as an argument to Thread.Start and use it as shared data storage between the newly started thread and the initiating thread.
You can also just directly access (with the appropriate locking of course) your data members, if you started the thread using the instance form of the ThreadStart delegate.
You can't use attributes to create shared data between threads. You can use the attribute instances attached to your class as a data storage, but I fail to see how that is better than using static or instance data members.
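A quick sketch of the first option, passing the shared object through Thread.Start (using the SharedData type from the example below):
SharedData shared = new SharedData(); // one instance, shared by both threads
Thread worker = new Thread(obj =>
{
    SharedData data = (SharedData)obj; // the argument passed to Start
    // ... read and write data here, with locking where needed ...
});
worker.Start(shared); // hands the same instance to the new thread
// the initiating thread still holds 'shared' and can use it as well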
Look at the following example code:
public class MyWorker
{
    public SharedData state;
    public MyWorker(SharedData someData)
    {
        this.state = someData;
    }
    public void DoWork()
    {
        while (true) ; // work against this.state here
    }
}
public class SharedData
{
    X myX; // X is a placeholder for whatever type you are sharing
    public X GetX() { return myX; }
    public void SetX(X anX) { myX = anX; }
}
public class Program
{
    public static void Main()
    {
        SharedData data = new SharedData();
        MyWorker work1 = new MyWorker(data);
        MyWorker work2 = new MyWorker(data);
        Thread thread = new Thread(new ThreadStart(work1.DoWork));
        thread.Start();
        Thread thread2 = new Thread(new ThreadStart(work2.DoWork));
        thread2.Start();
    }
}
In this case, the thread class MyWorker has a variable state. We initialise it with the same object. Now you can see that the two workers access the same SharedData object. Changes made by one worker are visible to the other.
You have quite a few remaining issues. How does worker 2 know when changes have been made by worker 1 and vice-versa? How do you prevent conflicting changes? Maybe read: this tutorial.
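One common pattern for the "how does the other worker know" part is to guard the shared state with a lock and use Monitor.Wait/PulseAll to signal changes. A sketch of a SharedData variant doing that (X is still the placeholder type from above and is assumed to be a reference type):
public class SignallingSharedData
{
    private readonly object _sync = new object();
    private X _myX;
    public void SetX(X value)
    {
        lock (_sync)
        {
            _myX = value;
            Monitor.PulseAll(_sync); // wake up any workers waiting for a change
        }
    }
    public X WaitForX()
    {
        lock (_sync)
        {
            while (_myX == null)
                Monitor.Wait(_sync); // releases the lock while waiting
            return _myX;
        }
    }
}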
When you start a thread you are executing a method of some chosen object, and all the members of that object are visible to the thread.
Worker myWorker = new Worker( /* arguments */ );
Thread myThread = new Thread(new ThreadStart(myWorker.doWork));
myThread.Start();
Your thread is now in the doWork() method and can see all the members of myWorker, which may themselves be other objects. Now you just need to be careful to handle the cases where several threads all hit those members at the same time.
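For instance, the usual way to deal with that is to guard the shared members with a lock (a hypothetical worker with a shared counter):
public class CountingWorker
{
    private readonly object _sync = new object();
    private int _counter; // shared state hit by several threads
    public void doWork()
    {
        lock (_sync)
        {
            _counter++; // only one thread at a time gets in here
        }
    }
    public int Counter
    {
        get { lock (_sync) { return _counter; } }
    }
}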