I have a class which implements an endless worker thread, like the example below; in my case it represents a body. At runtime I will have between 0 and ~8 instances live at any time, with instances constantly being created and destroyed.
Most of the time an instance has a lifetime of 30 seconds to 5 minutes, but occasionally a number of instances may be created and destroyed in a relatively short period of time. This is where I tend to run into performance issues, given the low-spec hardware this code runs on.
I would now like to rewrite the behavior so that I use a ThreadPool for my collection of running workers and I am struggling to find the correct way to structure the code.
Basically the code I have at the moment is something like
public class BodyCollection : IReadOnlyDictionary<ulong, TrackedBody>
{
public void Update()
{
if (createNew)
{
var body = new TrackedBody();
body.BeginTracking();
this.Add(1234, body);
}
if (remove)
{
TrackedBody body = this[1234];
body.StopTracking();
this.Remove(body);
}
}
}
public class TrackedBody
{
private readonly Thread _BiometricsThread;
private volatile bool _Continue = true;
public TrackedBody()
{
_BiometricsThread = new Thread(RunBiometricsThread);
}
public void BeginTracking()
{
_BiometricsThread.Start();
}
public void StopTracking()
{
_Continue = false;
}
private void RunBiometricsThread()
{
while(_Continue)
{
System.Threading.Thread.Sleep(1000);
}
}
}
So how do I rewrite the above to utilize a ThreadPool correctly, so that I can cancel running work on the ThreadPool as required? Do I use CancellationTokens or ManualResetEvents to control the threads?
I strongly believe you should be using more modern methods of asynchronous programming. We are going to use the Task Parallel Library here because it gives you the features you want for free:
Tracking completion
Cancellation
Thread pool
public class TrackedBody
{
public Task BeginTrackingAsync(CancellationToken cancellation)
{
return Task.Run(() => RunBiometricsThread(cancellation));
}
private void RunBiometricsThread(CancellationToken cancellation)
{
while(!cancellation.IsCancellationRequested)
{
cancellation.WaitHandle.WaitOne(1000); // waits up to 1 second, returns early once cancellation is requested
}
}
}
Note that I have removed the async keyword. This was doing nothing on its own.
You can use the task to track the state of the ongoing work. You can use the cancellation token to stop all work.
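For illustration, a minimal sketch (an assumption, not part of the original answer) of how the collection side could drive this: keep a CancellationTokenSource and the returned Task alongside each body, and cancel the token when the body is removed. The createNew and remove flags are the placeholders from the original question.
// Sketch only: the collection stores one CancellationTokenSource and one
// tracking Task per body (requires System.Threading and System.Threading.Tasks).
private readonly Dictionary<ulong, (TrackedBody Body, CancellationTokenSource Cts, Task Tracking)> _bodies
    = new Dictionary<ulong, (TrackedBody, CancellationTokenSource, Task)>();
public void Update()
{
    if (createNew)
    {
        var body = new TrackedBody();
        var cts = new CancellationTokenSource();
        var tracking = body.BeginTrackingAsync(cts.Token); // queued onto the thread pool
        _bodies.Add(1234, (body, cts, tracking));
    }
    if (remove)
    {
        var entry = _bodies[1234];
        entry.Cts.Cancel();   // the loop observes the token and exits
        entry.Cts.Dispose();
        _bodies.Remove(1234); // entry.Tracking can be awaited or observed for errors if needed
    }
}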
Related
The code below receives messages over TCP and passes them to the appropriate message handler. Depending on the message type, a message handler may take many minutes or just several seconds to process a message.
I chose the design of having a separate handler for each message type. But, now I'm thinking:
Even though I have an async producer-consumer (await _messages.Reader.WaitToReadAsync()), I still have a Task.Run with a loop for each message handler, meaning it will hold a whole thread from the thread pool for the duration of the whole program, right? So, if I have 3 message handlers, I'm holding 3 threads from the thread pool, right?
Is there any benefit at all to using an async producer-consumer the way the code is currently implemented? Again, since each message handler starts a Task.Run for the duration of the whole program, I think there is no benefit and I could just use a synchronous collection like BlockingCollection instead, right?
What is a better way to do this? Should I just have one message handler with a single Task.Run that loops, checks for new messages, and spawns other Task.Runs for each new message? But then I need a way to wait for the previous Task to complete without blocking the check for new messages. Maybe I should have some cancellable-execution task so I could cancel the previous one and start a new one for the same message type?
CODE
public class Distributor
{
private readonly Dictionary<string, MessageHandler> _messageHandlers = new Dictionary<string, MessageHandler>();
public void StartReceive()
{
// tcpClient PSEUDO CODE
while (_tcpClient.NewMessageAvailable)
{
var data = _tcpClient.GetNewMessage();
_messageHandlers[data.MsgType].Add(data.Data);
}
}
private void RegisterMessageHandlers()
{
_messageHandlers["msgType1"] = new MyMessageHandler1(...);
_messageHandlers["msgType2"] = new MyMessageHandler2(...);
_messageHandlers["msgType3"] = new MyMessageHandler3(...);
...
}
}
public abstract class MessageHandler
{
private readonly Channel<string> _messages;
private Task _task;
public MessageHandler()
{
_messages = Channel.CreateBounded<string>(new BoundedChannelOptions(1)
{
SingleReader = true,
SingleWriter = true,
FullMode = BoundedChannelFullMode.DropOldest,
});
}
public void Start()
{
_task = Task.Run(async () =>
{
try
{
while (await _messages.Reader.WaitToReadAsync())
{
try
{
_messages.Reader.TryRead(out var msg);
await Task.Run(async () => await HandleAsync(msg));
}
catch (Exception ex)
{
}
}
}
catch { } // OperationCanceledException
});
}
public void Add(string msg)
{
_messages.Writer.TryWrite(msg);
}
protected abstract Task HandleAsync(string msg);
}
public class MyMessageHandler1 : MessageHandler
{
protected override async Task HandleAsync(string msg)
{
// DO SOME LONG WORK
await _service1.DoWork();
}
}
public class MyMessageHandler2 : MessageHandler
{
protected override async Task HandleAsync(string msg)
{
// DO SOME WORK
await _service2.DoWork();
}
}
I still have a Task.Run with a loop for each message handler, meaning it will hold a whole thread from the thread pool for the duration of the whole program, right? So, if I have 3 message handlers, I'm holding 3 threads from the thread pool, right?
I'll answer just this question. Your assumption is wrong. You are using Task.Run with an asynchronous delegate:
_task = Task.Run(async () =>
{
while (await _messages.Reader.WaitToReadAsync())
{
//...
}
});
The _task is not running on a single thread from start to finish, unless all the awaiting inside the delegate is happening on completed awaitables, which is unlikely. Initially a ThreadPool thread is used for invoking the _messages.Reader.WaitToReadAsync method, and when the method returns that thread is released back to the ThreadPool. There is no thread involved during the await periods, and after each await a different thread might run the continuation until the next await.
Theoretically you could have thousands of tasks similar to the _task running concurrently, using only a handful of threads. The ratio tasks/threads depends on how much of the work inside the loop is synchronous, and how much is asynchronous.
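To see this in action, here is a small self-contained sketch (not from the original post) in the same shape as the handler loop: it prints the managed thread id for each handled message. The continuations after each await typically land on different ThreadPool threads, and no thread is occupied while the reader is idle.
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

var channel = Channel.CreateUnbounded<string>();

// Same pattern as above: an async delegate passed to Task.Run.
var task = Task.Run(async () =>
{
    while (await channel.Reader.WaitToReadAsync())
    {
        channel.Reader.TryRead(out var msg);
        Console.WriteLine($"Handled '{msg}' on thread {Environment.CurrentManagedThreadId}");
    }
});

for (int i = 1; i <= 3; i++)
{
    channel.Writer.TryWrite($"message {i}");
    await Task.Delay(500); // while the reader awaits, it holds no thread at all
}

channel.Writer.Complete(); // completes WaitToReadAsync with false, ending the loop
await task;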
To better understand Task.Run as a mechanism, make sure to read this article by Stephen Toub: Task.Run vs Task.Factory.StartNew.
I have an Asp.Net Core 6 Web Api.
I have a Singleton class with several methods in it.
I want only 1 thread to enter any of the methods of the class at a time.
Is it ok to initialize the SemaphoreSlim class in the constructor and use it in every method in the following way? Are there any dangers from it?
Is there a better way to achieve what I am looking for?
public class Foo
{
private readonly SemaphoreSlim _sm;
public Foo()
{
_sm = new SemaphoreSlim(1, 1);
}
public async Task FirstMethod()
{
await _sm.WaitAsync();
//Do some work
_sm.Release();
}
public async Task SecondMethod()
{
await _sm.WaitAsync();
//Do some work
_sm.Release();
}
}
You should adopt the try/finally pattern - if you do so, you're mostly safe: all code after the WaitAsync will execute under the limit the semaphore defines, and the semaphore is always released. Only if the code inside the semaphore blocks forever will you get a deadlock - you could consider using a timeout, but typically this risk is accepted.
public async Task SecondMethod()
{
await _sm.WaitAsync();
try
{
//Do some work
}
finally
{
//release in case of errors
_sm.Release();
}
}
Something else to consider, especially if it is a long-running process, is using a cancellation token to signal application shutdown.
Also keep in mind that if you call a lot of these methods from various threads, the order in which they acquire the semaphore is not guaranteed - but they will all be handled.
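If you add that cancellation token, a minimal sketch (an assumption, not part of the original answer) could look like this; SemaphoreSlim.WaitAsync accepts the token directly:
public async Task FirstMethod(CancellationToken cancellationToken)
{
    // Throws OperationCanceledException if the application is shutting down
    // while this call is still waiting to acquire the semaphore.
    await _sm.WaitAsync(cancellationToken);
    try
    {
        //Do some work
    }
    finally
    {
        _sm.Release();
    }
}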
Consider the following abstract class:
public abstract class Worker {
protected bool shutdown;
protected Thread t;
/// <summary>
/// defines that we have an auto unpause scheduled
/// </summary>
private bool _unpauseScheduled;
/// <summary>
/// when paused; schedule an automatic unpause when we
/// reach this datetime
/// </summary>
private DateTime pauseUntil;
private bool _isStopped = true;
public bool IsStopped {
get {
return t.ThreadState == ThreadState.Stopped;
}
}
private bool _isPaused = false;
public bool IsPaused {
get {
return _isPaused;
}
}
private string stringRepresentation;
public Worker() {
t = new Thread(ThreadFunction);
stringRepresentation = "Thread id:" + t.ManagedThreadId;
t.Name = stringRepresentation;
}
public Worker(string name) {
t = new Thread(ThreadFunction);
stringRepresentation = name;
t.Name = stringRepresentation;
}
public void Start() {
OnBeforeThreadStart();
t.Start();
}
public void ScheduleStop() {
shutdown = true;
}
public void SchedulePause() {
OnPauseRequest();
_isPaused = true;
}
public void SchedulePause(int seconds) {
_unpauseScheduled = true;
pauseUntil = DateTime.Now.AddSeconds(seconds);
SchedulePause();
}
public void Unpause() {
_isPaused = false;
_unpauseScheduled = false;
}
public void ForceStop() {
t.Abort();
}
/// <summary>
/// The main thread loop.
/// </summary>
private void ThreadFunction() {
OnThreadStart();
while (!shutdown) {
OnBeforeLoop();
if (!IsPaused) {
if (!OnLoop()) {
break;
}
} else {
// check for auto-unpause;
if (_unpauseScheduled && pauseUntil < DateTime.Now) {
Unpause();
}
}
OnAfterLoop();
Thread.Sleep(1000);
}
OnShutdown();
}
public abstract void OnBeforeThreadStart();
public abstract void OnThreadStart();
public abstract void OnBeforeLoop();
public abstract bool OnLoop();
public abstract void OnAfterLoop();
public abstract void OnShutdown();
public abstract void OnPauseRequest();
public override string ToString() {
return stringRepresentation;
}
}
I use this class to create Threads that are designed to run for the lifetime of the application, but also with the ability to pause and stop the threads as needed.
I can't shake the feeling that my implementation is naive, though. My use of Thread.Sleep() gives me pause. I am still learning the ins and outs of threads, and I am looking to see what others might do instead.
The Worker derived objects need to be able to do the following:
Run for the lifetime of the application (or as long as needed)
Be able to stop safely (finish what it was doing in OnLoop())
Be able to stop unsafely (disregard what is happening in OnLoop())
Be able to pause execution for a certain amount of time (or indefinitely)
Now, my implementation works, but that is not good enough for me. I want to use good practice, and I could use some review of this to help me with that.
I can't shake the feeling that my implementation is naive, though. My use of Thread.Sleep() gives me pause. I am still learning the ins and outs of threads, and I am looking to see what others might do instead.
Your intuitions are good here; this is a naive approach, and any time you sleep a thread in production code you should think hard about whether you're making a mistake. You're paying for that worker; why are you paying for it to sleep?
The right way to put a thread to sleep until it is needed is not to sleep and poll in a loop. Use an appropriate wait handle instead; that's what wait handles are for.
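As a rough illustration of the wait-handle idea (a sketch under assumptions, not the reviewed code), the pause could be a gate the worker blocks on instead of a flag it polls once a second:
// Sketch only: replaces the _isPaused flag with a ManualResetEventSlim gate.
// While the gate is set, Wait() returns immediately; while it is reset, the
// thread blocks without polling and wakes the moment Unpause() is called.
private readonly ManualResetEventSlim _unpaused = new ManualResetEventSlim(true);

public void SchedulePause() => _unpaused.Reset();
public void Unpause() => _unpaused.Set();

private void ThreadFunction()
{
    OnThreadStart();
    while (!shutdown)
    {
        _unpaused.Wait();   // blocks here while paused, costs nothing otherwise
        OnBeforeLoop();
        if (!OnLoop())
            break;
        OnAfterLoop();
        // Per-iteration pacing (the original Thread.Sleep(1000)) is a separate
        // concern; a timed wait on a "stop requested" event would let a shutdown
        // wake the thread immediately instead of sleeping through it.
    }
    OnShutdown();
}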
But a better approach still would be to put an idle thread back into a pool of threads; if the work needs to be started up again in the future, schedule it onto a new worker thread. A thread that can sleep forever is a huge waste of resources; remember, a thread is a million bytes of memory by default. Would you allocate a bunch of million-byte arrays and then never use them?
You should study the design of the Task Parallel Library for additional inspiration. The insight of the TPL is that threads are workers, but what you care about is getting tasks completed. Your approach puts a thin layer on top of threads, but it does not get past the fact that threads are workers; managing workers is a pain. State your tasks, and let the TPL assign them to workers.
You might also examine the assumptions around the up-to-date-ness of your various flags. They have no locks and are not volatile, and therefore reads and writes can be moved forwards and backwards in time basically at the whim of the CPU.
You also have some non-threading bugs to think about. For example, suppose you decide to pause for thirty minutes, but at five minutes before clocks "spring forward" for daylight savings time. Do you pause for half an hour, or five minutes? Which do you actually intend?
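Following the advice above about the Task Parallel Library, here is a minimal task-based sketch (an assumption, not part of the answer) that states the work as a task with cooperative cancellation instead of managing a dedicated thread:
using System;
using System.Threading;
using System.Threading.Tasks;

public sealed class TaskWorker
{
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();

    // The work is expressed as a Task; the TPL decides which pool thread runs it.
    // After Stop(), awaiting the returned task throws TaskCanceledException,
    // which callers can treat as a normal shutdown signal.
    public Task RunAsync(Func<bool> onLoop) =>
        Task.Run(async () =>
        {
            while (!_cts.Token.IsCancellationRequested && onLoop())
            {
                await Task.Delay(TimeSpan.FromSeconds(1), _cts.Token); // no thread is held while waiting
            }
        }, _cts.Token);

    public void Stop() => _cts.Cancel();
}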
I am working on some interesting concepts related to wrapping threads.
I have called it Fiber for now.
http://net7mma.codeplex.com/SourceControl/latest#Concepts/Classes/Threading/Threading.cs
Eric Lippert is correct about paying a worker to sleep, in some regard; although if you imagine Eric Lippert is paid a salary as opposed to by the hour, then technically he is paid to sleep, just as any other salaried employee is.
How does this relate to the concept at hand?
What about Priority? The CPU(s) executing your code are contending with their own pipelines for execution context as well as with requests from the scheduler.
No one has mentioned reducing the Priority, which will reduce the amount of time the scheduler gives to executing that context.
Changing the Priority will thus increase the number of cycles given to other contexts, and it will additionally reduce the power consumption of your processor, making your application run longer if it has a limited source of power (unless, of course, you're using the excess heat to provide additional power to your system).
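For reference, a minimal sketch (an assumption, not from the post) of what lowering a worker thread's priority looks like:
using System.Threading;

// A low-priority background worker: the scheduler favours normal-priority
// threads, so this thread yields cycles to the rest of the application.
var worker = new Thread(() =>
{
    // long-running background work goes here
})
{
    IsBackground = true,
    Priority = ThreadPriority.Lowest
};
worker.Start();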
My Unity project is a procedural environment on Android that creates terrains and other content at runtime. The overall workflow I use is to calculate anything non-Unity in a worker thread, and when the data is calculated, I call the Unity API from the main thread.
The problem is that sometimes (like every 200 frames) the worker thread affects the main thread's performance. That could show itself with a nasty spike in rendering time.
So what should I do about multithreading in Unity?
EDIT: The android device is quad core.
EDIT 2: I believe what Sam said in the comments is exactly what's happening.
I wonder if you can determine what core the code is running on, or if you can set the thread affinity? I found some links where people described similar situations: here and here. Sounds like it's picking the same core as the main thread occasionally.
So maybe this is not about Unity but about every real-time interactive multithreaded application. I believe it is possible to set thread affinity on some versions of some platforms, but that destroys the cross-platform status of the system.
PS: Not exactly the main issue but to make the question more concrete, I will include the worker thread implementation. It is a modified version of the worker thread implementation suggested in this post. The modified version is like this:
public static class JobScheduler
{
private static Queue<Job> Jobs = new Queue<Job>();
private static volatile bool isBusy;
private static ManualResetEvent _workAvailable = new ManualResetEvent(false);
static JobScheduler()
{
var backgroundWorkThread = new Thread(BackgroundThread)
{
IsBackground = true,
Priority = ThreadPriority.Lowest,
Name = "BasicBackgroundWorker Thread"
};
backgroundWorkThread.Start();
}
private static void BackgroundThread()
{
int jobCnt;
while (true)
{
Job? workItem=null;
lock (Jobs)
{
jobCnt = Jobs.Count;
if (jobCnt != 0 && !isBusy)
{
workItem = Jobs.Dequeue();
}
}
if (workItem!=null)
{
isBusy = true;
workItem.Value.callback(workItem.Value.param);
}
else
{
_workAvailable.WaitOne();
_workAvailable.Reset();
}
}
}
public static void AddJob(Job Job)
{
lock (Jobs)
{
Jobs.Enqueue(Job);
}
_workAvailable.Set();
}
public static void JobDone()
{
isBusy = false;
_workAvailable.Set();
}
}
And the Job struct:
public struct Job
{
public object param;
public WaitCallback callback;
public Job(WaitCallback callBackP, object parameter)
{
callback = callBackP;
param = parameter;
}
}
Whenever needed I call JobScheduler.AddJob to enqueue a job and call JobScheduler.JobDone after the job is done to allow the next job to run.
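For illustration, a minimal usage sketch (an assumption; CalculateTerrainData and chunkCoords are hypothetical names):
// The callback runs on the background thread; JobDone() must be called when it
// finishes so the scheduler is allowed to dequeue the next job.
void CalculateTerrainData(object state)
{
    try
    {
        // heavy, non-Unity computation using 'state' goes here
    }
    finally
    {
        JobScheduler.JobDone();
    }
}

// Enqueue from the Unity main thread:
JobScheduler.AddJob(new Job(CalculateTerrainData, chunkCoords));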
Also the ThreadPool is not an option since it produces unnecessary garbage and is not very flexible to use.
I have a single-threaded VB.NET service that checks a database for specific information. If the info does not exist, it needs to wait 15 minutes and try again. What is the best method to have the service wait during this 15-minute period? I considered a Do loop with Threading.Thread.Sleep, but I keep reading that this is bad practice, and I do not know an alternative. Any suggestions for a better method would be appreciated.
As an alternative to Thread.Sleep, I suggest you:
1 - Make your application single-instance (see the properties of your project).
2 - Add a scheduled task in Windows Task Scheduler to run your application every 15 minutes.
3 - Your program will terminate normally and will be started again by Windows (or manually by the user).
4 - Since it is single-instance, you won't have several instances of the application running at the same time - just one. So even if the Task Scheduler starts a new instance, you can be sure that just one instance will be running.
Using a BackgroundWorker and a ManualResetEvent I think you can do what you have in mind.
public class LibraryBackgroundTimer : BackgroundWorker
{
private ManualResetEvent intervalManualReset;
public int Interval { get; set; }
public LibraryBackgroundTimer()
{
this.WorkerSupportsCancellation = true;
this.Interval = 1000;
}
protected override void OnDoWork(DoWorkEventArgs e)
{
while (!this.CancellationPending)
{
base.OnDoWork(e);
this.Sleep();
}
}
public void Start()
{
if (this.IsBusy)
return;
this.intervalManualReset = new ManualResetEvent(false);
this.RunWorkerAsync();
}
public void Stop()
{
this.CancelAsync();
this.WakeUp();
this.Dispose(true);
}
public void WakeUp()
{
if (this.intervalManualReset != null)
this.intervalManualReset.Set();
}
private void Sleep()
{
if (this.intervalManualReset != null)
{
this.intervalManualReset.Reset();
this.intervalManualReset.WaitOne(this.Interval);
}
}
}
Using this class, your timer sleeps for the desired interval, and it is also capable of being woken up during the sleep time.
I hope this helps.
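For example, a minimal usage sketch for the 15-minute database check (an assumption; CheckDatabase is a hypothetical method), shown in C# to match the class above:
var timer = new LibraryBackgroundTimer
{
    Interval = (int)TimeSpan.FromMinutes(15).TotalMilliseconds
};

// base.OnDoWork raises the DoWork event once per loop iteration,
// i.e. every 15 minutes (or sooner if WakeUp() is called).
timer.DoWork += (sender, e) => CheckDatabase();
timer.Start();

// later, on service shutdown:
timer.Stop();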