WaitHandle.WaitOne() on a SemaphoreSlim does not work as expected - C#

I have a complex situation, but I will try to keep it short and mention only the important details. I am trying to implement task-based job handling. Here is the class for that:
internal class TaskBasedJob : IJob
{
    private readonly Timer _timer;

    public WaitHandle WaitHandle { get; }
    public JobStatus Status { get; private set; }

    public TaskBasedJob(Func<Task<JobStatus>> action, TimeSpan interval, TimeSpan delay)
    {
        Status = JobStatus.NotExecuted;
        var semaphore = new SemaphoreSlim(0, 1);
        WaitHandle = semaphore.AvailableWaitHandle;

        _timer = new Timer(async x =>
        {
            // Return to prevent duplicate executions.
            // The semaphore starts locked (count 0) so the WaitHandle only
            // becomes signaled after the first Release().
            if (semaphore.CurrentCount == 0 && Status != JobStatus.NotExecuted)
            {
                Status = JobStatus.Failure;
                return;
            }

            if (Status != JobStatus.NotExecuted)
                await semaphore.WaitAsync();

            try
            {
                await action();
            }
            finally
            {
                semaphore.Release();
            }
        }, null, delay, interval);
    }
}
Below is the scheduler class:
internal class Scheduler : IScheduler
{
    private readonly ILogger _logger;
    private readonly ConcurrentDictionary<string, IJob> _timers = new ConcurrentDictionary<string, IJob>();

    public Scheduler(ILogger logger)
    {
        _logger = logger;
    }

    public IJob ScheduleAsync(string jobName, Func<Task<JobStatus>> action, TimeSpan interval, TimeSpan delay = default(TimeSpan))
    {
        if (!_timers.ContainsKey(jobName))
        {
            lock (_timers)
            {
                if (!_timers.ContainsKey(jobName))
                    _timers.TryAdd(jobName, new TaskBasedJob(action, interval, delay));
            }
        }
        return _timers[jobName];
    }

    public IReadOnlyDictionary<string, IJob> GetJobs()
    {
        return _timers;
    }
}
Inside this library I have a service like the one below. The idea of this service is only to fetch some data into the dictionary called _accessInfos via an async method. As you can see, in the constructor I already schedule the job that fetches the data.
internal class AccessInfoStore : IAccessInfoStore
{
    private readonly ILogger _logger;
    private readonly Func<HttpClient> _httpClientFunc;
    private volatile Dictionary<string, IAccessInfo> _accessInfos;
    private readonly IScheduler _scheduler;
    private static string JobName = "AccessInfoProviderJob";

    public AccessInfoStore(IScheduler scheduler, ILogger logger, Func<HttpClient> httpClientFunc)
    {
        _accessInfos = new Dictionary<string, IAccessInfo>();
        _logger = logger;
        _httpClientFunc = httpClientFunc;
        _scheduler = scheduler;
        scheduler.ScheduleAsync(JobName, FetchAccessInfos, TimeSpan.FromMinutes(1));
    }

    public IJob FetchJob => _scheduler.GetJobs()[JobName];

    private async Task<JobStatus> FetchAccessInfos()
    {
        using (var client = _httpClientFunc())
        {
            var accessIds = // calling a web service
            _accessInfos = accessIds;
            return JobStatus.Success;
        }
    }
}
All of this code lives in another library that I have referenced from my ASP.NET Core 2.1 project. In the Startup class I have a call like this:
//adding services
...
services.AddScoped<IScheduler, Scheduler>();
services.AddScoped<IAccessInfoStore, AccessInfoStore>();
var accessInfoStore = services.BuildServiceProvider().GetService<IAccessInfoStore>();
accessInfoStore.FetchJob.WaitHandle.WaitOne();
The first time, WaitOne() does not seem to work, so the data are not loaded (_accessInfos is empty), but if I refresh the page I can see the loaded data (_accessInfos is populated). As far as I know, WaitOne() is supposed to block the current thread until my job has completed.
Does anybody know why WaitOne() does not work as expected, or what I might be doing wrong?
EDIT 1:
The Scheduler only stores all IJob instances in a concurrent dictionary so they can be retrieved later, mainly for a health page. Every time a new TaskBasedJob is inserted into the dictionary its constructor runs, and at the end a Timer is used to re-execute the job after some interval. To make this thread-safe I use a SemaphoreSlim, and from it I expose the WaitHandle. The handle is only for those rare cases where I need to turn an async method into a sync one; in the normal case the job simply executes asynchronously.
What I expect: WaitOne() should block the current thread until my scheduled job has executed, and then let the thread continue. In my case the current thread is the one running the Configure method in the Startup class.
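For context, here is a minimal standalone sketch (not from my project) of how I understand SemaphoreSlim.AvailableWaitHandle to behave: with an initial count of 0 the handle starts unsignaled and should only become signaled after the first Release():

var semaphore = new SemaphoreSlim(0, 1);
var handle = semaphore.AvailableWaitHandle;

Console.WriteLine(handle.WaitOne(0)); // false - count is 0, so the handle is not signaled yet

semaphore.Release();                  // count becomes 1

Console.WriteLine(handle.WaitOne(0)); // true - the handle is signaled while the count is > 0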

Colleague of Rajmond here. I figured out our issue. Waiting actually works fine; the problem is simply that every call to IServiceCollection.BuildServiceProvider() creates a new service provider, so you get a different instance each time even for a singleton registration. A simple way to see this:
var serviceProvider1 = services.BuildServiceProvider();
var hashCode1 = serviceProvider1.GetService<IAccessInfoStore>().GetHashCode();
var hashCode2 = serviceProvider1.GetService<IAccessInfoStore>().GetHashCode();
var serviceProvider2 = services.BuildServiceProvider();
var hashCode3 = serviceProvider2.GetService<IAccessInfoStore>().GetHashCode();
var hashCode4 = serviceProvider2.GetService<IAccessInfoStore>().GetHashCode();
hashCode1 and hashCode2 are the same, as are hashCode3 and hashCode4 (because of the singleton registration), but hashCode1/hashCode2 differ from hashCode3/hashCode4 (because they come from different service providers).
The real fix will probably be some check inside IAccessInfoStore that blocks internally until the job has finished the first time.
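A rough sketch of what that internal check could look like, assuming a TaskCompletionSource that the job completes after the first successful fetch (the member and method names below are illustrative, not from our actual code):

internal class AccessInfoStore : IAccessInfoStore
{
    // Completed once the first fetch has run (illustrative).
    private readonly TaskCompletionSource<bool> _firstFetchDone =
        new TaskCompletionSource<bool>(TaskCreationOptions.RunContinuationsAsynchronously);

    private async Task<JobStatus> FetchAccessInfos()
    {
        // ... call the web service and assign _accessInfos as before ...
        _firstFetchDone.TrySetResult(true);
        return JobStatus.Success;
    }

    // Callers that need the data synchronously block here instead of on the WaitHandle.
    public void WaitForFirstFetch() => _firstFetchDone.Task.Wait();
}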
Cheers!

Related

Force a periodic timer to run every second

I created a periodic timer which runs under a background service:
public class PeriodicHostedService : BackgroundService
{
    private readonly TimeSpan period = TimeSpan.FromSeconds(1);
    private readonly ILogger<PeriodicHostedService> logger;
    private readonly IServiceScopeFactory factory;
    private int executionCount = 0;

    public PeriodicHostedService(ILogger<PeriodicHostedService> logger, IServiceScopeFactory factory)
    {
        this.logger = logger;
        this.factory = factory;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        using PeriodicTimer timer = new(period);
        using var scope = factory.CreateScope();
        ITimerJob job = scope.ServiceProvider.GetRequiredService<ITimerJob>();

        while (!stoppingToken.IsCancellationRequested &&
               await timer.WaitForNextTickAsync(stoppingToken))
        {
            try
            {
                await job.ProcessAsync();
                executionCount++;
                logger.LogInformation($"Executed PeriodicHostedService - Count: {executionCount}");
            }
            catch (Exception ex)
            {
                logger.LogInformation($"Failed to execute PeriodicHostedService with exception message {ex.Message}. Good luck next round!");
            }
        }
    }
}
I have set the timer to run every second; however, a job inside the timer needs more than one second to run. Just as an example:
internal class TimerJob : ITimerJob
{
    private int runningID;

    public async Task ProcessAsync()
    {
        runningID++;
        Console.WriteLine($"{DateTime.Now} > Current Running ID : {runningID}");
        await LongTimeJob();
    }

    private async Task LongTimeJob()
    {
        Console.WriteLine($"{DateTime.Now} > Step1 Async Job End ID : {runningID}");
        await Task.Delay(3000).ConfigureAwait(false);
    }
}
How can I write the timer so that it is forced to fire every second (and lets the long-running job continue working)?
Thank you
You can choose not to await job.ProcessAsync(), which would allow your code to continue waiting for the next tick:
_ = job.ProcessAsync();
I must admit, firing off jobs every second that are likely to run long might eventually become a resource hog. You should check your design for any unwanted side effects.
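If you go that route, one possible refinement (just a sketch, not part of the original code) is to wrap the fire-and-forget call in a small local function inside ExecuteAsync, so exceptions are still caught and logged rather than lost:

while (!stoppingToken.IsCancellationRequested &&
       await timer.WaitForNextTickAsync(stoppingToken))
{
    // Fire and forget; the wrapper keeps the try/catch so a faulted
    // job cannot surface as an unobserved task exception.
    _ = RunJobSafelyAsync(job);
}

async Task RunJobSafelyAsync(ITimerJob timerJob)
{
    try
    {
        await timerJob.ProcessAsync();
        var count = Interlocked.Increment(ref executionCount);
        logger.LogInformation($"Executed PeriodicHostedService - Count: {count}");
    }
    catch (Exception ex)
    {
        logger.LogInformation($"Failed to execute PeriodicHostedService with exception message {ex.Message}.");
    }
}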

Pattern that offers mutual exclusion, re-entrancy, distributed locking towards one (of many) APIs with HttpClient

I'm looking for an approach to locking that, by default, makes sure that all calls to a single API run mutually exclusively, using distributed locking. However, at the same time I need the option to instead lock larger blocks of code (critical procedures) containing several calls to that API; those calls should still run mutually exclusively. In those cases the approach should be re-entrant, so that each call isn't blocked just because the block of code it is in already holds the lock. It should also support re-entrancy when several methods that lock sections of code are nested.
Examples of use:
// Should have lock registered by default (f.ex. in HttpMessageHandler)
await _deviceClient.PerformAction();

async Task CriticalProcedure()
{
    // Should only use one lock that is reused in nested code (re-entrant)
    await using (await _reentrantLockProvider.AcquireLockAsync())
    {
        await _deviceClient.TriggerAction();
        await SharedCriticalProcedure();
    }
    // Should only dispose lock at this point
}

async Task SharedCriticalProcedure()
{
    await using (await _customLockProvider.AcquireLockAsync())
    {
        await _deviceClient.HardReset();
        await _deviceClient.Refresh();
    }
}

// Should be forced to run sequentially even though they are not awaited (mutex)
var task1 = _deviceClient.PerformAction1();
var task2 = _deviceClient.PerformAction2();
await Task.WhenAll(task1, task2);
Background:
My team is working on a WebAPI that is responsible for making calls to hardware devices. When an endpoint in our API is called, we get a header that identifies the hardware device in question, used in startup to configure the baseUrl of our HttpClients, and we make one or more calls to that API. We have the following limitations:
A device shouldn't be called when it is already busy with a request (mutual exclusion)
Some procedures against the device (blocks of code containing several calls) are critical and shouldn't be interrupted by other calls to the device (why I want re-entry)
A user may run multiple requests to our API simultaneously, so the locking should work across requests
Our WebAPI may have multiple deployments, so the locking should be distributed
We use Refit to describe the API of our hardware devices and create HttpClients
I have created the following solution which I believe works. However, it seems clumsy and overengineered, mostly because HttpMessageHandlers have unpredictable lifetimes, not scoped to request, so I needed to use TraceIdentifier and a dictionary to enable re-entry during the request lifecycle.
// In Startup
services.AddSingleton<IReentrantLockProvider, ReentrantLockProvider>();
services
    .AddHttpClient(nameof(IDeviceClient))
    .AddTypedClient(client => RestService.For<IDeviceClient>(client, refitSettings))
    .ConfigureHttpClient((provider, client) => ConfigureHardwareBaseUrl())
    .AddHttpMessageHandler<HardwareMutexMessageHandler>();
public class HardwareMutexMessageHandler : DelegatingHandler
{
    private readonly IReentrantLockProvider _reentrantPanelLockProvider;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly ConcurrentDictionary<string, object> _locks = new ConcurrentDictionary<string, object>();

    public HardwareMutexMessageHandler(IReentrantLockProvider reentrantPanelLockProvider, IHttpContextAccessor httpContextAccessor)
    {
        _reentrantPanelLockProvider = reentrantPanelLockProvider;
        _httpContextAccessor = httpContextAccessor;
    }

    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        await using (await _reentrantPanelLockProvider.AcquireLockAsync(cancellationToken))
        {
            var hardwareId = _httpContextAccessor.HttpContext.Request.Headers["HardwareId"];
            var mutex = _locks.GetOrAdd(hardwareId, _ => new());

            // This is only used to handle cases where a developer chooses to batch calls or forgets to await a call
            lock (mutex)
            {
                return base.SendAsync(request, cancellationToken).Result;
            }
        }
    }
}
public class ReentrantLockProvider : IReentrantLockProvider
{
    private readonly IDistributedLockProvider _distributedLockProvider;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly ConcurrentDictionary<string, ReferenceCountedDisposable> _lockDictionary;
    private readonly object _lockVar = new();

    public ReentrantLockProvider(IDistributedLockProvider distributedLockProvider, IHttpContextAccessor httpContextAccessor)
    {
        _distributedLockProvider = distributedLockProvider;
        _httpContextAccessor = httpContextAccessor;
        _lockDictionary = new ConcurrentDictionary<string, ReferenceCountedDisposable>();
    }

    public async Task<IAsyncDisposable> AcquireLockAsync(CancellationToken cancellationToken = default)
    {
        var hardwareId = _httpContextAccessor.HttpContext.Request.Headers["HardwareId"];
        var requestId = _httpContextAccessor.HttpContext.TraceIdentifier;

        lock (_lockVar)
        {
            if (_lockDictionary.TryGetValue(requestId, out var referenceCountedLock))
            {
                referenceCountedLock.RegisterReference();
                return referenceCountedLock;
            }

            var acquireLockTask = _distributedLockProvider.AcquireLockAsync(hardwareId, timeout: null, cancellationToken);
            referenceCountedLock = new ReferenceCountedDisposable(async () =>
                await RemoveLock(acquireLockTask.Result, requestId)
            );
            _lockDictionary.TryAdd(requestId, referenceCountedLock);
            return referenceCountedLock;
        }
    }

    private async Task RemoveLock(IDistributedSynchronizationHandle acquiredLock, string correlationId)
    {
        ValueTask disposeAsyncTask;
        lock (_lockVar)
        {
            disposeAsyncTask = acquiredLock.DisposeAsync();
            _ = _lockDictionary.TryRemove(correlationId, out _);
        }
        await disposeAsyncTask;
    }
}
public class ReferenceCountedDisposable : IAsyncDisposable
{
    private readonly Func<Task> _asyncDispose;
    private int _refCount;

    public ReferenceCountedDisposable(Func<Task> asyncDispose)
    {
        _asyncDispose = asyncDispose;
        _refCount = 1;
    }

    public void RegisterReference()
    {
        Interlocked.Increment(ref _refCount);
    }

    public async ValueTask DisposeAsync()
    {
        var references = Interlocked.Decrement(ref _refCount);
        if (references == 0)
        {
            await _asyncDispose();
        }
        else if (references < 0)
        {
            throw new InvalidOperationException("Can't dispose multiple times");
        }
        else
        {
            GC.SuppressFinalize(this);
        }
    }
}

Sharing a thread-safe field between instances of a Transient service in C# / .NET Core?

I need to share some locks (aka "object" fields) and regular data fields between functions called from a Transient service in .NET Core.
Naturally, these locks and fields should be declared in a thread-safe manner.
What would be the best way to approach it? I am thinking of a separate Singleton service. Do I have to add some keywords to the fields and locks declared so these are thread-safe?
I am well familiar with Java multithreading but never done it so far in C#.
This is the simplest example I can think of. It uses lock; you can also use Monitor. Basically I resolve a transient service three times and then start a number of tasks which increase the value of a shared service.
class Program
{
    static async Task Main(string[] args)
    {
        var serviceCollection = new ServiceCollection();
        serviceCollection.AddSingleton<ValueService>();
        serviceCollection.AddTransient<Service>();
        var provider = serviceCollection.BuildServiceProvider();

        var serviceOne = provider.GetRequiredService<Service>();
        var serviceTwo = provider.GetRequiredService<Service>();
        var serviceThree = provider.GetRequiredService<Service>();

        // Manipulate the same object 1500 times, from different threads.
        var task1 = serviceOne.DoStuff(500);
        var task2 = serviceTwo.DoStuff(500);
        var task3 = serviceThree.DoStuff(500);

        // Wait for all the threads to complete.
        await Task.WhenAll(task1, task2, task3);

        // Verify the result.
        var valueService = provider.GetRequiredService<ValueService>();
        Console.WriteLine(valueService.SomeValue);
        Console.ReadKey();
    }
}

internal class Service
{
    private readonly ValueService _service;

    public Service(ValueService service)
    {
        _service = service;
    }

    public Task DoStuff(int noOfTimes)
    {
        var tasks = new Task[noOfTimes];
        for (int i = 0; i < noOfTimes; i++)
        {
            tasks[i] = Task.Run(() =>
            {
                Thread.Sleep(100);
                _service.Increase();
            });
        }
        return Task.WhenAll(tasks);
    }
}
internal class ValueService
{
    private readonly object _lock = new object();

    public int SomeValue { get; private set; }

    public void Increase()
    {
        // Use lock to make sure that only one thread changes the value at a time.
        // Remove the lock statement and you will notice some "unwanted" behaviour.
        lock (_lock)
        {
            SomeValue++;
        }
        // Alternatively, keep the value in a plain int field and use Interlocked.Increment(ref field).
    }
}

.net core - Passing an unknown number of IProgress<T> to class library

I have a console app which uses a class library to execute some long-running tasks. This is a .NET Core console app and uses the .NET Core Generic Host. I also use the ShellProgressBar library to display some progress bars.
My hosted service looks like this:
internal class MyHostedService : IHostedService, IDisposable
{
    private readonly ILogger _logger;
    private readonly IMyService _myService;
    private readonly IProgress<MyCustomProgress> _progress;
    private readonly IApplicationLifetime _appLifetime;
    private readonly ProgressBar _progressBar;
    private readonly IProgressBarFactory _progressBarFactory;

    public MyHostedService(
        ILogger<MyHostedService> logger,
        IMyService myService,
        IProgressBarFactory progressBarFactory,
        IApplicationLifetime appLifetime)
    {
        _logger = logger;
        _myService = myService;
        _appLifetime = appLifetime;
        _progressBarFactory = progressBarFactory;
        _progressBar = _progressBarFactory.GetProgressBar(); // this just returns an instance of ShellProgressBar
        _progress = new Progress<MyCustomProgress>(progress =>
        {
            _progressBar.Tick(progress.Current);
        });
    }

    public void Dispose()
    {
        _progressBar.Dispose();
    }

    public Task StartAsync(CancellationToken cancellationToken)
    {
        _myService.RunJobs(_progress);
        _appLifetime.StopApplication();
        return Task.CompletedTask;
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        return Task.CompletedTask;
    }
}
Where MyCustomProgress looks like this
public class MyCustomProgress
{
public int Current {get; set;}
public int Total {get; set;}
}
and MyService looks something like so (Job1, Job2, Job3 implement IJob)
public class MyService : IMyService
{
    private readonly List<IJob> _jobsToRun = new List<IJob>();

    public MyService()
    {
        _jobsToRun.Add(new Job1());
        _jobsToRun.Add(new Job2());
        _jobsToRun.Add(new Job3());
    }

    public void RunJobs(IProgress<MyCustomProgress> progress)
    {
        _jobsToRun.ForEach(job =>
        {
            job.Execute();
            progress.Report(new MyCustomProgress { Current = _jobsToRun.IndexOf(job) + 1, Total = _jobsToRun.Count() });
        });
    }
}
And IJob is
public interface IJob
{
void Execute();
}
This setup works well and I'm able to display the progress bar from my HostedService by creating a ShellProgressBar instance and using the one IProgress instance I have to update it.
However, I have another implementation of IMyService that I also need to run that looks something like this
public class MyService2 : IMyService
{
    private readonly List<IJob> _sequentialJobsToRun = new List<IJob>();
    private readonly List<IJob> _parallelJobsToRun = new List<IJob>();

    public MyService2()
    {
        _sequentialJobsToRun.Add(new Job1());
        _sequentialJobsToRun.Add(new Job2());
        _sequentialJobsToRun.Add(new Job3());
        _parallelJobsToRun.Add(new Job4());
        _parallelJobsToRun.Add(new Job5());
        _parallelJobsToRun.Add(new Job6());
    }

    public void RunJobs(IProgress<MyCustomProgress> progress)
    {
        _sequentialJobsToRun.ForEach(job =>
        {
            job.Execute();
            progress.Report(new MyCustomProgress { Current = _sequentialJobsToRun.IndexOf(job) + 1, Total = _sequentialJobsToRun.Count() });
        });

        Parallel.ForEach(_parallelJobsToRun, job =>
        {
            job.Execute();
            // Report progress here
        });
    }
}
This is the one I'm struggling with. When _parallelJobsToRun is executed, I need to be able to create new child ShellProgressBars (ShellProgressBar.Spawn) and display them as child progress bars of, let's say, 'Parallel Jobs'.
This is where I'm looking for some help as to how I can achieve this.
Note: I don't want to take a dependency on ShellProgressBar in my class library containing MyService
Any help much appreciated.
I am a little confused by your description, but let's see if I understand what you are up to. So if you wrap all of this in a class, then taskList1 and taskList2 could be class variables. (By the way taskList1/2 should be named better: say parallelTaskList and whatever . . . anyway.) Then you could write a new method on the class CheckTaskStatus() and just iterate over the two class variables. Does that help or have I completely missed your question?
Can you modify it like this?
public Task<ICollection<IProgress<int>>> StartAsync(CancellationToken cancellationToken)
{
    var progressList = _myServiceFromLibrary.RunTasks();
    return Task.FromResult(progressList);
}

public ICollection<IProgress<int>> RunTasks()
{
    var taskList1 = new List<ITask> { Task1, Task2 };
    var plist1 = taskList1.Select(t => t.Progress).ToList();
    var taskList2 = new List<ITask> { Task3, Task4, Task5 };
    var plist2 = taskList2.Select(t => t.Progress).ToList();

    taskList1.ForEach(task => task.Run());
    Parallel.ForEach(taskList2, task => { task.Run(); });

    return plist1.Concat(plist2).ToList();
}
Task.Progress here is presumably a progress getter; realistically, IProgress should probably be injected via the tasks' constructors. But the point is that your public interface doesn't accept a list of tasks, so it should just return a collection of progress reports.
How to inject progress reporters into your tasks is a different story that depends on the task implementations; it may or may not be supported out of the box.
However, what you probably should do is supply a progress callback or progress factory, so that progress reporters of your choice are created:
public Task StartAsync(CancellationToken cancellationToken, Action<Task,int> onprogress)
{
_myServiceFromLibrary.RunTasks(onprogress);
return Task.CompletedTask;
}
public class SimpleProgress : IProgress<int>
{
    private readonly Task task;
    private readonly Action<Task, int> action;

    public SimpleProgress(Task task, Action<Task, int> action)
    {
        this.task = task;
        this.action = action;
    }

    public void Report(int progress)
    {
        action(task, progress);
    }
}
public void RunTasks(Action<Task, int> onprogress)
{
    var taskList1 = new List<ITask> { Task1, Task2 };
    taskList1.ForEach(t => t.Progress = new SimpleProgress(t, onprogress));
    var taskList2 = new List<ITask> { Task3, Task4, Task5 };
    taskList2.ForEach(t => t.Progress = new SimpleProgress(t, onprogress));

    taskList1.ForEach(task => task.Run());
    Parallel.ForEach(taskList2, task => { task.Run(); });
}
You may see here that it really is mostly a question of how your tasks are going to call the IProgress<T>.Report(T value) method.
Honestly I would just use an event in your task prototype.
It's not really clear exactly what you want because the code you posted doesn't match the names you then reference in your question text... It would be helpful to have all the code (the RunTasks function for example, your IProgress prototype, etc).
Nevertheless, an event exists specifically to signal calling code. Let's go back to basics and say you have a library called MyLib, with a method DoThings().
Create a new class that inherits from EventArgs, and that will carry your task's progress reports...
public class ProgressEventArgs : EventArgs
{
    private int _taskId;
    private int _percent;
    private string _message;

    public int TaskId => _taskId;
    public int Percent => _percent;
    public string Message => _message;

    public ProgressEventArgs(int taskId, int percent, string message)
    {
        _taskId = taskId;
        _percent = percent;
        _message = message;
    }
}
Then on your library's class definition, add an event like so:
public event EventHandler<ProgressEventArgs> Progress;
And in your console application, create a handler for progress events:
void ProgressHandler(object sender, ProgressEventArgs e)
{
// Do whatever you want with your progress report here, all your
// info is in the e variable
}
And subscribe to your class library's event:
var lib = new MyLib();
lib.Progress += ProgressHandler;
lib.DoThings();
When you are done, unsubscribe from the event:
lib.Progress -= ProgressHandler;
In your class library, now you can send back progress reports by raising the event in your code. First create a stub method to invoke the event:
protected virtual void OnProgress(ProgressEventArgs e)
{
    var handler = Progress;
    if (handler != null)
    {
        handler(this, e);
    }
}
And then add this to your task's code where you want it:
OnProgress(new ProgressEventArgs(2452343, 10, "Reindexing google..."));
The only thing to be careful about is to report progress sparingly, because each time your event fires it interrupts your console application, and you can really bog it down hard if you send 10 million events all at once. Be logical about it.
Alternate way: if you own the IProgress<T> and Progress code, you could shape them roughly like this:
public interface IProgress<T>
{
    IProgress<T> CreateNew();
    void Report(T progress);
}

public class Progress<T> : IProgress<T>
{
    public Progress(ShellProgressClass progressBar)
    {
        // initialize the progress bar, or spawn a new child bar
    }

    // ....

    public IProgress<T> CreateNew()
    {
        return new Progress<T>(/* spawn a child progress bar here */);
    }

    public void Report(T progress)
    {
        // tick the underlying progress bar
    }
}
You can later improvise to have one big progress bar (a collection of sequential or parallel ones) and whatnot.
Your MyService could have a dependency similar to:
public interface IJobContainer
{
void Add(IJob job);
void RunJobs(IProgress<MyProgress> progress, Action<IJob>? callback = null); // Using an action for extra work you may want to do
}
This way you don't have to worry about reporting progress in MyService (which doesn't feel like it should be MyService's job anyway). The implementation could look something like this for the parallel job container:
public class MyParallelJobContainer : IJobContainer
{
    private readonly IList<IJob> parallelJobs = new List<IJob>();

    public MyParallelJobContainer()
    {
    }

    public void Add(IJob job) { ... }

    public void RunJobs(IProgress<MyProgress> progress, Action<IJob>? callback = null)
    {
        using (var progressBar = new ProgressBar(options...))
        {
            Parallel.ForEach(parallelJobs, job =>
            {
                callback?.Invoke(job);
                job.Execute();
                progressBar.Tick();
            });
        }
    }
}
MyService would then look like this:
public class MyService : IMyService
{
    private readonly IJobContainer sequentialJobs;
    private readonly IJobContainer parallelJobs;

    public MyService(
        IJobContainer sequentialJobs,
        IJobContainer parallelJobs)
    {
        this.sequentialJobs = sequentialJobs;
        this.parallelJobs = parallelJobs;
        this.sequentialJobs.Add(new DoSequentialJob1());
        this.sequentialJobs.Add(new DoSequentialJob2());
        this.sequentialJobs.Add(new DoSequentialJob3());
        this.parallelJobs.Add(new DoParallelJobA());
        this.parallelJobs.Add(new DoParallelJobB());
        this.parallelJobs.Add(new DoParallelJobC());
    }

    public void RunJobs(IProgress<MyCustomProgress> progress)
    {
        sequentialJobs.RunJobs(progress, job =>
        {
            // do something with the job if necessary
        });
        parallelJobs.RunJobs(progress, job =>
        {
            // do something with the job if necessary
        });
    }
}
The advantage of this approach is that MyService has only one job and doesn't have to worry about what you do once a job is completed.
From my understanding of your issue, the question is how to display progress across both the sequential jobs and the parallelized jobs.
In theory the parallel jobs could start and finish at the same time, so you could treat the parallel jobs as a single job. Instead of using the count of sequential jobs as your total, increase that number by one. This might be satisfactory for a small number of parallel jobs.
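For example, treating the whole parallel batch as one extra step could look roughly like this (a sketch reusing the names from the question):
public void RunJobs(IProgress<MyCustomProgress> progress)
{
    // One progress slot per sequential job, plus one for the whole parallel batch.
    var total = _sequentialJobsToRun.Count + 1;

    for (var i = 0; i < _sequentialJobsToRun.Count; i++)
    {
        _sequentialJobsToRun[i].Execute();
        progress.Report(new MyCustomProgress { Current = i + 1, Total = total });
    }

    Parallel.ForEach(_parallelJobsToRun, job => job.Execute());

    // Report the parallel batch as a single completed step.
    progress.Report(new MyCustomProgress { Current = total, Total = total });
}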
If you want to add progress between the parallel jobs, you will need to handle multi-threading in your code because the parallel jobs will be running concurrently.
object pJobLock = new object();
int numProcessed = 0;

Parallel.ForEach(parallelJobs, parallelJob =>
{
    parallelJob.DoWork();
    lock (pJobLock)
    {
        numProcessed++;
        progress.Report(new MyCustomProgress { Current = numProcessed, Total = parallelJobs.Count() });
    }
});

Array of ManualResetEvent objects

Here's my story: I have a WCF service. It receives requests with work to do. Each task is inserted into a blocking queue. The server takes items from this queue periodically and does the work (completely async, on a different thread). In my "Do" service method I need to know when "my" task is done. Like this:
public bool Do(int input)
{
// 1. Add task to the BlockingCollection queue
// 2. Block this thread from returning and observe/wait til my task is finished
return true;
}
Here's my suggestion/solution:
public bool Do(int input)
{
// 1. Create a ManualResetEvent object
// 2. Add this object to task
// 3. Add task to the BlockingCollection queue
// 4. Block this thread from returning - wait for ManualResetEvent object
return true;
}
So there will be as many ManualResetEvent objects as there are tasks to do; I will literally have an array of sync objects. Is that a good solution for my problem?
Or is there a better synchronization class to use in my case, like Wait and Pulse?
Thanks for the help.
I'm sorry for the title; I didn't know how to phrase this question better.
Your plan is good; however, I would suggest not tying up a dedicated thread waiting for the work to be done. Switching from a new ManualResetEvent(false) to a new SemaphoreSlim(0, 1) will let you use WaitAsync(), which allows you to use async/await in your Do method, freeing up the thread to do other work. (UPDATE: This really should be a TaskCompletionSource instead of a SemaphoreSlim, but I will not update this example; see the second part below.)
public async Task<bool> Do(int input)
{
    using (var completion = new SemaphoreSlim(0, 1))
    {
        var job = new JobTask(input, completion);
        _workQueue.Add(job);
        await completion.WaitAsync().ConfigureAwait(false);
        return job.ResultData;
    }
}
private void ProcessingLoop()
{
    foreach (var job in _workQueue.GetConsumingEnumerable())
    {
        job.PerformWork(); // Inside PerformWork there is a _completion.Release(); call.
    }
}
To make everything self-contained, you can move the SemaphoreSlim / TaskCompletionSource inside the job and then just return the job itself.
public JobTask Do(int input)
{
    var job = new JobTask(input);
    _workQueue.Add(job);
    return job;
}

public class JobTask
{
    private readonly int _input;
    private readonly TaskCompletionSource<bool> _completionSource;

    public JobTask(int input)
    {
        _input = input;
        _completionSource = new TaskCompletionSource<bool>();
    }

    public void PerformWork()
    {
        try
        {
            // Do stuff here with _input.
            _completionSource.TrySetResult(true);
        }
        catch (Exception ex)
        {
            _completionSource.TrySetException(ex);
        }
    }

    public Task<bool> Work { get { return _completionSource.Task; } }
}
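On the calling side you would then await the job's Work task to observe completion (a small usage sketch, not part of the original answer):

// Somewhere in the service method (sketch):
var job = Do(42);               // enqueue the work item
bool result = await job.Work;   // completes once PerformWork() has run (or rethrows its exception)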
