Thread-safe queue with tasks - C#

I have a kind of work manager in C# which receives tasks to do and executes them. Tasks arrive from different threads, but they must be executed one at a time, in the order they were received.
I don't want a while loop that runs all the time, checking whether there are new tasks in the queue. Is there a built-in queue, or an easy way to implement one, that waits for tasks and executes them synchronously without busy-waiting?

As per the comments, you should look into ConcurrentQueue, and also BlockingCollection with GetConsumingEnumerable() instead of the while loop you want to avoid:
BlockingCollection<YourClass> _collection =
    new BlockingCollection<YourClass>(new ConcurrentQueue<YourClass>());
_collection.Add() can be called from multiple threads.
On a separate thread you can use:
foreach (var message in _collection.GetConsumingEnumerable())
{
}
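For reference, a minimal self-contained sketch of that pattern; the WorkItem type and its Execute property are hypothetical stand-ins for whatever your tasks look like:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical work item; substitute your own task type.
public class WorkItem
{
    public Action Execute { get; set; }
}

public class WorkManager
{
    // ConcurrentQueue keeps FIFO order; BlockingCollection adds blocking consume semantics.
    private readonly BlockingCollection<WorkItem> _queue =
        new BlockingCollection<WorkItem>(new ConcurrentQueue<WorkItem>());

    public WorkManager()
    {
        // Single long-running consumer: items run one at a time, in arrival order.
        // GetConsumingEnumerable blocks until an item is available, so there is no busy-waiting.
        Task.Factory.StartNew(() =>
        {
            foreach (var item in _queue.GetConsumingEnumerable())
            {
                item.Execute();
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Safe to call from multiple producer threads.
    public void Enqueue(WorkItem item) => _queue.Add(item);

    // Call when no more work will arrive; the consumer loop drains the queue and exits.
    public void Complete() => _queue.CompleteAdding();
}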

You can use a SemaphoreSlim (https://msdn.microsoft.com/en-us/library/system.threading.semaphoreslim(v=vs.110).aspx) and a ConcurrentQueue
Example:
private delegate void TaskBody();

private class TaskManager
{
    private ConcurrentQueue<TaskBody> TaskBodyQueue = new ConcurrentQueue<TaskBody>();
    private readonly SemaphoreSlim TaskBodySemaphoreSlim = new SemaphoreSlim(1, 1);

    public async void Enqueue(TaskBody body)
    {
        TaskBodyQueue.Enqueue(body);
        await TaskBodySemaphoreSlim.WaitAsync();
        Console.WriteLine($"Cycle ...");
        if (TaskBodyQueue.TryDequeue(out body) == false)
        {
            throw new InvalidProgramException($"TaskBodyQueue is empty!");
        }
        body();
        Console.WriteLine($"Cycle ... done ({TaskBodyQueue.Count} left)");
        TaskBodySemaphoreSlim.Release();
    }
}

public static void Main(string[] args)
{
    var random = new Random();
    var tm = new TaskManager();
    Parallel.ForEach(Enumerable.Range(0, 30), async number =>
    {
        await Task.Delay(100 * number);
        tm.Enqueue(delegate
        {
            Console.WriteLine($"Print {number}");
        });
    });
    Task.Delay(4000).Wait();
    WaitFor(action: "exit");
}

public static void WaitFor(ConsoleKey consoleKey = ConsoleKey.Escape, string action = "continue")
{
    Console.Write($"Press {consoleKey} to {action} ...");
    var consoleKeyInfo = default(ConsoleKeyInfo);
    do
    {
        consoleKeyInfo = Console.ReadKey(true);
    }
    while (Equals(consoleKeyInfo.Key, consoleKey) == false);
    Console.WriteLine();
}

Related

Cancel Token from infinite Parallel.Foreach loop from an event

I have written some code where I am using Parallel.ForEach for a few items to work in parallel in an infinite loop, i.e. it runs fine every 60 seconds.
But the message can be changed by the user at any time, and I need to re-process with the new message.
For this, I need to cancel the infinite Parallel.ForEach loop to reprocess the updated message.
When I try to reprocess the main method, it works fine for the new message, but it runs twice because the previously scheduled tasks were not cancelled. I am assuming I need to cancel the process from the Parallel.ForEach loop and re-run it for the updated message with a new schedule.
So can anyone help me cancel the queued task that is already scheduled for the next 60 seconds?
static void Main(string[] args)
{
List<RealTimeMessage> messages = GetRealTimeMessage();
Parallel.ForEach(messages, (message) =>
{
processMessage(message);
});
Console.ReadLine();
}
private static async void processMessage(RealTimeMessage message)
{
try
{
while (true)
{
await Task.Delay(TimeSpan.FromSeconds(60));
await Task.Run(() => ProceesRequest(message));
}
}
catch (Exception)
{
Console.WriteLine("Critical error");
}
}
private static List<RealTimeMessage> GetRealTimeMessage()
{
List<RealTimeMessage> realTimeMessages = new List<RealTimeMessage>();
realTimeMessages.Add(new RealTimeMessage { MessageText = "Message 4", IntervalTime = "1", MessageType = "AIDX", TimeOfDay = "" });
realTimeMessages.Add(new RealTimeMessage { MessageText = "Message 5", IntervalTime = "2", MessageType = "AMSX", TimeOfDay = "" });
return realTimeMessages;
}
private static void ProceesRequest(RealTimeMessage message)
{
// do domething
}
This is a misuse of Parallel.ForEach; use Task.WhenAll instead.
Don't start a Task in ProcessMessage (this could be intentional, but it looks like a mistake).
Use a CancellationToken to cancel a task.
Don't use async void unless it's for an event handler.
Use standard casing for method names.
Don't use while (true); use while (!token.IsCancellationRequested).
With all of that considered, it would look something like this:
static async Task Main(string[] args)
{
var ts = new CancellationTokenSource();
var messages = GetRealTimeMessage();
var tasks = messages.Select(x => ProcessMessage(x, ts.Token));
Console.WriteLine("Press any key to cancel tasks")
Console.ReadKey();
ts.Cancel();
await Task.WhenAll(tasks);
Console.WriteLine("All finished");
Console.ReadKey();
}
private static async Task ProcessMessage(RealTimeMessage message, CancellationToken token)
{
try
{
while (!token.IsCancellationRequested)
{
await Task.Delay(TimeSpan.FromSeconds(60), token);
ProcessRequest(message);
}
}
catch (OperationCanceledException)
{
Console.WriteLine("Operation Cancelled");
}
catch (Exception ex)
{
Console.WriteLine("Critical error: " + ex.Message);
}
}
To cancel your tasks, just call ts.Cancel().

Perform async operations with Hangfire instead of Tasks

I have 4 classes that perform different actions; they are split into 2 tasks, and within each task the processors should run sequentially.
My current implementation works fine, but sometimes there are strange lags and some of the processors do not run. I suppose this could be a threading issue.
I would like to refactor the code to use the Hangfire library, or any other approach that keeps the program working correctly. I'm not sure how to properly do that and would appreciate any help.
public void Run()
{
var processors1 = new CommonProcessor[] { new AProcessor(), new BProcessor() };
//AProcessor should be first!
var processors2 = new CommonProcessor[] { new CProcessor(), new DProcessor() };
//CProcessor should be first!
Task task1 = Task.Run(() => RunSyncProcess(processors1));
Task task2 = Task.Run(() => RunSyncProcess(processors2));
Task.WaitAll(task1, task2);
}
private void RunSyncProcess(CommonProcessor[] processors)
{
while (true)
{
foreach (var processor in processors)
{
// do some job
}
Thread.Sleep(frequency);
}
}
You are using Tasks the wrong way. Tasks are supposed to be non-blocking or short-lived items.
Basically, what happens here is that you launch Tasks which never end and never release their thread. This ends up blocking threads of the ThreadPool.
There are multiple ways to change this:
1) Non-blocking tasks:
public async Task Run()
{
var processors1 = new CommonProcessor[] { new AProcessor(), new BProcessor() };
//AProcessor should be first!
var processors2 = new CommonProcessor[] { new CProcessor(), new DProcessor() };
//CProcessor should be first!
Task task1 = RunSyncProcess(processors1);
Task task2 = RunSyncProcess(processors2);
await Task.WhenAll(task1, task2);
}
private async Task RunSyncProcess(CommonProcessor[] processors)
{
while (true)
{
foreach (var processor in processors)
{
// do some job
}
await Task.Delay(TimeSpan.FromMilliseconds(frequency)); // frees the thread pool thread while waiting
}
}
2) Using dedicated blocking threads, without impacting the thread pool:
public void Run()
{
var processors1 = new CommonProcessor[] { new AProcessor(), new BProcessor() };
//AProcessor should be first!
var processors2 = new CommonProcessor[] { new CProcessor(), new DProcessor() };
//CProcessor should be first!
Thread t1 = new Thread(() => RunSyncProcess(processors1));
t1.Start();
Thread t2 = new Thread(() => RunSyncProcess(processors2));
t2.Start();
t1.Join();
t2.Join();
}
private void RunSyncProcess(CommonProcessor[] processors)
{
while (true)
{
foreach (var processor in processors)
{
// do some job
}
Thread.Sleep(frequency);
}
}
Hangfire is primarily used for fire-and-forget jobs that you can queue, schedule, and requeue. If that's what you want to achieve, you can install it via NuGet and then use the following syntax:
BackgroundJob.Enqueue(() => RunSyncProcess(processors1));
So in order to refactor your code, you would need to decide whether you want to schedule a job, or whether you want to wait for the successful completion of a previous task before waiting for a new one; it really depends on what you want to achieve.
public void Run()
{
var processors1 = new CommonProcessor[] { new AProcessor(), new BProcessor() };
//AProcessor should be first!
var processors2 = new CommonProcessor[] { new CProcessor(), new DProcessor() };
//CProcessor should be first!
BackgroundJob.Enqueue(() => RunSyncProcess(processors1));
BackgroundJob.Enqueue(() => RunSyncProcess(processors2));
}
public void RunSyncProcess(CommonProcessor[] processors)
{
while (true)
{
foreach (var processor in processors)
{
// do some job
}
Thread.Sleep(frequency);
}
}
You won't have to await them all, as these will kick off behind the scenes and your UI will stay responsive. Remember that the methods need to be public when you want to use Hangfire.
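For completeness, a rough sketch of the bootstrap Hangfire needs before BackgroundJob.Enqueue will execute anything; the in-memory storage here is an assumption for illustration (Hangfire.MemoryStorage package), real setups typically use SQL Server or another persistent storage:
using System;
using Hangfire;
using Hangfire.MemoryStorage; // assumed NuGet package, used only for this demo

class HangfireBootstrap
{
    static void Main()
    {
        // Storage must be configured before jobs are enqueued.
        GlobalConfiguration.Configuration.UseMemoryStorage();

        // The server pulls jobs from storage and runs them on its own worker threads.
        using (var server = new BackgroundJobServer())
        {
            // Methods must be public so Hangfire can serialize the call expression.
            BackgroundJob.Enqueue(() => Console.WriteLine("Job executed by a Hangfire worker"));

            Console.WriteLine("Hangfire server running; press any key to exit...");
            Console.ReadKey();
        }
    }
}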

Stop hanging synchronous method

There is a method HTTP_actions.put_import() in XenAPI, which is synchronous and it supports cancellation via its delegate.
I have the following method:
private void UploadImage(.., Func<bool> isTaskCancelled)
{
try
{
HTTP_actions.put_import(
cancellingDelegate: () => isTaskCancelled(),
...);
}
catch (HTTP.CancelledException exception)
{
}
}
It so happens that in some cases HTTP_actions.put_import hangs and doesn't react to isTaskCancelled(). In that case the whole application hangs as well.
I can run this method in a separate thread and kill it forcefully once I receive a cancellation signal, but the method doesn't always hang, and sometimes I want to cancel it gracefully. Only when it is really hanging do I want to kill it myself.
What is the best way to handle such a situation?
I wrote a blog post about this: http://pranayamr.blogspot.in/2017/12/abortcancel-task.html
I have tried a lot of solutions over the last couple of hours and came up with the working solution below; please try it out.
class Program
{
// Captures the thread running the request so it can be aborted
// if it takes too long.
static Thread threadToCancel = null;
static async Task<string> DoWork(CancellationToken token)
{
var tcs = new TaskCompletionSource<string>();
//Enable this block for your real use:
//await Task.Factory.StartNew(() =>
//{
// //Capture the thread
// threadToCancel = Thread.CurrentThread;
// HTTP_actions.put_import(...);
//});
//tcs.SetResult("Completed");
//return tcs.Task.Result;
//Comment out the block below; it is only here to simulate work for testing:
await Task.Factory.StartNew(() =>
{
//Capture the thread
threadToCancel = Thread.CurrentThread;
//Simulate work (usually from 3rd party code)
for (int i = 0; i < 100000; i++)
{
Console.WriteLine($"value {i}");
}
Console.WriteLine("Task finished!");
});
tcs.SetResult("Completed");
return tcs.Task.Result;
}
public static void Main()
{
var source = new CancellationTokenSource();
CancellationToken token = source.Token;
DoWork(token);
Task.Factory.StartNew(()=>
{
while(true)
{
if (token.IsCancellationRequested && threadToCancel!=null)
{
threadToCancel.Abort();
Console.WriteLine("Thread aborted");
}
}
});
// Here 1000 can be replaced by the number of milliseconds after which you want
// to abort the thread calling your long-running method.
source.CancelAfter(1000);
Console.ReadLine();
}
}
Here is my final implementation (based on Pranay Rana's answer).
public class XenImageUploader : IDisposable
{
public static XenImageUploader Create(Session session, IComponentLogger parentComponentLogger)
{
var logger = new ComponentLogger(parentComponentLogger, typeof(XenImageUploader));
var taskHandler = new XenTaskHandler(
taskReference: session.RegisterNewTask(UploadTaskName, logger),
currentSession: session);
return new XenImageUploader(session, taskHandler, logger);
}
private XenImageUploader(Session session, XenTaskHandler xenTaskHandler, IComponentLogger logger)
{
_session = session;
_xenTaskHandler = xenTaskHandler;
_logger = logger;
_imageUploadingHasFinishedEvent = new AutoResetEvent(initialState: false);
_xenApiUploadCancellationReactionTime = new TimeSpan();
}
public Maybe<string> Upload(
string imageFilePath,
XenStorage destinationStorage,
ProgressToken progressToken,
JobCancellationToken cancellationToken)
{
_logger.WriteDebug("Image uploading has started.");
var imageUploadingThread = new Thread(() =>
UploadImageOfVirtualMachine(
imageFilePath: imageFilePath,
storageReference: destinationStorage.GetReference(),
isTaskCancelled: () => cancellationToken.IsCancellationRequested));
imageUploadingThread.Start();
using (new Timer(
callback: _ => WatchForImageUploadingState(imageUploadingThread, progressToken, cancellationToken),
state: null,
dueTime: TimeSpan.Zero,
period: TaskStatusUpdateTime))
{
_imageUploadingHasFinishedEvent.WaitOne(MaxTimeToUploadSvm);
}
cancellationToken.PerformCancellationIfRequested();
return _xenTaskHandler.TaskIsSucceded
? new Maybe<string>(((string) _xenTaskHandler.Result).GetOpaqueReferenceFromResult())
: new Maybe<string>();
}
public void Dispose()
{
_imageUploadingHasFinishedEvent.Dispose();
}
private void UploadImageOfVirtualMachine(string imageFilePath, XenRef<SR> storageReference, Func<bool> isTaskCancelled)
{
try
{
_logger.WriteDebug("Uploading thread has started.");
HTTP_actions.put_import(
progressDelegate: progress => { },
cancellingDelegate: () => isTaskCancelled(),
timeout_ms: -1,
hostname: new Uri(_session.Url).Host,
proxy: null,
path: imageFilePath,
task_id: _xenTaskHandler.TaskReference,
session_id: _session.uuid,
restore: false,
force: false,
sr_id: storageReference);
_xenTaskHandler.WaitCompletion();
_logger.WriteDebug("Uploading thread has finished.");
}
catch (HTTP.CancelledException exception)
{
_logger.WriteInfo("Image uploading has been cancelled.");
_logger.WriteInfo(exception.ToDetailedString());
}
_imageUploadingHasFinishedEvent.Set();
}
private void WatchForImageUploadingState(Thread imageUploadingThread, ProgressToken progressToken, JobCancellationToken cancellationToken)
{
progressToken.Progress = _xenTaskHandler.Progress;
if (!cancellationToken.IsCancellationRequested)
{
return;
}
_xenApiUploadCancellationReactionTime += TaskStatusUpdateTime;
if (_xenApiUploadCancellationReactionTime >= TimeForXenApiToReactOnCancel)
{
_logger.WriteWarning($"XenApi didn't cancel for {_xenApiUploadCancellationReactionTime}.");
if (imageUploadingThread.IsAlive)
{
try
{
_logger.WriteWarning("Trying to forcefully abort uploading thread.");
imageUploadingThread.Abort();
}
catch (Exception exception)
{
_logger.WriteError(exception.ToDetailedString());
}
}
_imageUploadingHasFinishedEvent.Set();
}
}
private const string UploadTaskName = "Xen image uploading";
private static readonly TimeSpan TaskStatusUpdateTime = TimeSpan.FromSeconds(1);
private static readonly TimeSpan TimeForXenApiToReactOnCancel = TimeSpan.FromSeconds(10);
private static readonly TimeSpan MaxTimeToUploadSvm = TimeSpan.FromMinutes(20);
private readonly Session _session;
private readonly XenTaskHandler _xenTaskHandler;
private readonly IComponentLogger _logger;
private readonly AutoResetEvent _imageUploadingHasFinishedEvent;
private TimeSpan _xenApiUploadCancellationReactionTime;
}
HTTP_actions.put_import calls HTTP_actions.put, which calls HTTP.put, which calls HTTP.CopyStream.
The cancellation delegate is passed down to CopyStream, which checks both that the delegate isn't null (i.e. one was passed) and whether it returns true. However, it only does this check at the while statement, so the chances are it is the Read on the Stream that is the blocking operation, though the block could also occur in the progressDelegate if one is used.
To get around this, put the call to HTTP_actions.put_import() inside a task or background thread and then separately check for cancellation or a return from the task/thread.
Interestingly enough, a quick glance at that CopyStream code revealed a bug to me: if the function that works out whether the process has been cancelled returns a different value depending on some check it is making, you can actually get the loop to exit without a CancelledException being thrown. The result of the cancellation delegate call should be stored in a local variable.
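A rough sketch of that workaround, with the real put_import call stood in by an Action parameter; the helper name and the timeout handling are illustrative, not part of XenAPI:
using System;
using System.Threading.Tasks;

static class UploadWrapper
{
    // Runs a blocking call on a pool thread so the caller can stop waiting after a timeout,
    // even if the call itself ignores its cancellation delegate.
    public static async Task<bool> RunWithTimeoutAsync(Action blockingUpload, TimeSpan timeout)
    {
        var uploadTask = Task.Run(blockingUpload);
        var finished = await Task.WhenAny(uploadTask, Task.Delay(timeout));
        if (finished == uploadTask)
        {
            await uploadTask; // propagates HTTP.CancelledException or other failures
            return true;      // finished (or cancelled gracefully) in time
        }
        // Still hanging: the caller can now abandon or forcefully abort the operation.
        // Note the orphaned task keeps running on its thread until the call returns.
        return false;
    }
}
Whether to then abort the orphaned thread (as the answers above do) or simply abandon it and report failure is a judgment call; aborting is very much a last resort.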

BrokeredMessage disposed after accessing from different thread

This might be a duplicate of this question but that's confused with talk about batching database updates and still has no proper answer.
In a simple example using Azure Service Bus queues, I can't access a BrokeredMessage after it's been placed on a queue; it's always disposed if I read the queue from another thread.
Sample code:
class Program {
private static string _serviceBusConnectionString = "XXX";
private static BlockingCollection<BrokeredMessage> _incomingMessages = new BlockingCollection<BrokeredMessage>();
private static CancellationTokenSource _cancelToken = new CancellationTokenSource();
private static QueueClient _client;
static void Main(string[] args) {
// Set up a few listeners on different threads
Task.Run(async () => {
while (!_cancelToken.IsCancellationRequested) {
var msg = _incomingMessages.Take(_cancelToken.Token);
if (msg != null) {
try {
await msg.CompleteAsync();
Console.WriteLine($"Completed Message Id: {msg.MessageId}");
} catch (ObjectDisposedException) {
Console.WriteLine("Message was disposed!?");
}
}
}
});
// Now set up our service bus reader
_client = GetQueueClient("test");
_client.OnMessageAsync(async (message) => {
await Task.Run(() => _incomingMessages.Add(message));
},
new OnMessageOptions() {
AutoComplete = false
});
// Now start sending
Task.Run(async () => {
int sent = 0;
while (!_cancelToken.IsCancellationRequested) {
var msg = new BrokeredMessage();
await _client.SendAsync(msg);
Console.WriteLine($"Sent {++sent}");
await Task.Delay(1000);
}
});
Console.ReadKey();
_cancelToken.Cancel();
}
private static QueueClient GetQueueClient(string queueName) {
var namespaceManager = NamespaceManager.CreateFromConnectionString(_serviceBusConnectionString);
if (!namespaceManager.QueueExists(queueName)) {
var settings = new QueueDescription(queueName);
settings.MaxDeliveryCount = 10;
settings.LockDuration = TimeSpan.FromSeconds(5);
settings.EnableExpress = true;
settings.EnablePartitioning = true;
namespaceManager.CreateQueue(settings);
}
var factory = MessagingFactory.CreateFromConnectionString(_serviceBusConnectionString);
factory.RetryPolicy = new RetryExponential(minBackoff: TimeSpan.FromSeconds(0.1), maxBackoff: TimeSpan.FromSeconds(30), maxRetryCount: 100);
var queueClient = factory.CreateQueueClient(queueName);
return queueClient;
}
}
I've tried playing around with settings but can't get this to work. Any ideas?
Answering my own question with the response from Serkant Karaca @ Microsoft here:
Very basic rule, and I am not sure if this is documented: the received message needs to be processed within the callback function's lifetime. In your case, messages will be disposed when the async callback completes, which is why your complete attempts are failing with ObjectDisposedException on another thread.
I don't really see how queuing messages for further processing helps throughput. It will surely add more burden on the client. Try processing the message in the async callback; that should be performant enough.
Bugger.
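In other words, do the work and complete the message inside the OnMessageAsync callback itself; a minimal sketch reusing _client and OnMessageOptions from the code above:
// Process and complete within the callback's lifetime; the pump disposes the
// BrokeredMessage once this async callback returns.
_client.OnMessageAsync(async message =>
{
    try
    {
        // Do the actual work here (or await it) before completing.
        Console.WriteLine($"Processing Message Id: {message.MessageId}");
        await message.CompleteAsync();
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Processing failed: {ex.Message}");
        await message.AbandonAsync(); // return the message to the queue for redelivery
    }
},
new OnMessageOptions { AutoComplete = false });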

await async lambda in ActionBlock

I have a class Receiver with an ActionBlock:
public class Receiver<T> : IReceiver<T>
{
private ActionBlock<T> _receiver;
public Task<bool> Send(T item)
{
if(_receiver!=null)
return _receiver.SendAsync(item);
//Do some other stuff here
}
public void Register (Func<T, Task> receiver)
{
_receiver = new ActionBlock<T> (receiver);
}
//...
}
The Register action for the ActionBlock is an async method with an await statement:
private static async Task Writer(int num)
{
Console.WriteLine("start " + num);
await Task.Delay(500);
Console.WriteLine("end " + num);
}
Now what I want to do is wait synchronously (if a condition is set) until the action method has finished, in order to get exclusive behavior:
var receiver = new Receiver<int>();
receiver.Register((Func<int, Task>)Writer);
receiver.Send(5).Wait(); // does not wait for the action's await here!
The problem is that when the "await Task.Delay(500);" statement is executed, the "receiver.Send(5).Wait();" does not wait anymore.
I have tried several variants (TaskCompletionSource, ContinueWith, ...) but it does not work.
Does anyone have an idea how to solve this problem?
ActionBlock by default will enforce exclusive behavior (only one item is processed at a time). If you mean something else by "exclusive behavior", you can use TaskCompletionSource to notify your sender when the action is complete:
... use ActionBlock<Tuple<int, TaskCompletionSource<object>>> and Receiver<Tuple<int, TaskCompletionSource<object>>>
var receiver = new Receiver<Tuple<int, TaskCompletionSource<object>>>();
receiver.Register(t => Writer(t.Item1, t.Item2));
var tcs = new TaskCompletionSource<object>();
receiver.Send(Tuple.Create(5, tcs));
tcs.Task.Wait(); // if you must
private static async Task Writer(int num, TaskCompletionSource<object> tcs)
{
Console.WriteLine("start " + num);
await Task.Delay(500);
Console.WriteLine("end " + num);
tcs.SetResult(null);
}
Alternatively, you could use AsyncLock (included in my AsyncEx library):
private static AsyncLock mutex = new AsyncLock();
private static async Task Writer(int num)
{
using (await mutex.LockAsync())
{
Console.WriteLine("start " + num);
await Task.Delay(500);
Console.WriteLine("end " + num);
}
}
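For reference, a compile-ready sketch tying the TaskCompletionSource variant together end to end; the Receiver<T> here is a pared-down stand-in for the class in the question, and the System.Threading.Tasks.Dataflow package is assumed:
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

public class Receiver<T>
{
    private ActionBlock<T> _receiver;

    public Task<bool> Send(T item) => _receiver.SendAsync(item);

    public void Register(Func<T, Task> action) => _receiver = new ActionBlock<T>(action);
}

class Demo
{
    static async Task Main()
    {
        var receiver = new Receiver<Tuple<int, TaskCompletionSource<object>>>();
        receiver.Register(t => Writer(t.Item1, t.Item2));

        var tcs = new TaskCompletionSource<object>();
        await receiver.Send(Tuple.Create(5, tcs)); // item accepted by the block
        await tcs.Task;                            // completes only after Writer has finished
        Console.WriteLine("Writer has completed");
    }

    private static async Task Writer(int num, TaskCompletionSource<object> tcs)
    {
        Console.WriteLine("start " + num);
        await Task.Delay(500);
        Console.WriteLine("end " + num);
        tcs.SetResult(null);
    }
}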
