C# queue and dequeue at the same time

I am working on a project that requires reading a QR code and queuing it; as it is queued, another task dequeues the data and sends it to a machine via socket messaging. This is all done in C#.
My question is: how would I keep queuing while another task is dequeuing? I have the QR code working, I can queue and dequeue, and the socket messaging portion works, but I don't know how to run the queue and dequeue at the same time.
I have looked into threading, specifically multithreading, and I am more confused than before I started reading.
Any help will be greatly appreciated.
EDIT: Based on your comments and a little bit of research, I started writing some code. No matter what I did, the thread only runs once; it is supposed to keep running.
public partial class Form1 : Form
{
    private BlockingCollection<string> _queue = new BlockingCollection<string>(30);
    //private string item = "A";
    //private int count = 0;
    private Thread _th1;

    public Form1()
    {
        InitializeComponent();
        _th1 = new Thread(thread_example);
        _th1.Start();
    }

    private void Form1_Load(object sender, EventArgs e)
    {
    }

    private void thread_example()
    {
        if (_queue.Count > 0)
        {
            _queue.Take();
            Console.WriteLine("Removed 1 Item from queue!");
        }
        else
        {
            Console.WriteLine("Queue Empty!");
        }
        Thread.Sleep(500);
    }

    private void btnProduce_Click(object sender, EventArgs e)
    {
        _queue.Add("test_string");
        Console.WriteLine("Added 1 item to the queue");
    }
}

I would highly recommend using BlockingCollection. The problem you have is known as the Producer-Consumer problem,
and BlockingCollection provides an implementation that handles it.
Specifically, you also have to think about what happens when the dequeuing thread gets slow and can't keep up with the scanning thread, say due to network slowness at that time.
The BlockingCollection will block the queuing thread to keep the whole process in sync, based on the capacity specified when the BlockingCollection is constructed.
Also, you can get either FIFO or LIFO behavior by using ConcurrentQueue or ConcurrentStack as the underlying storage. BlockingCollection just provides the "bounded-ness" properties on top of the underlying synchronized collections.
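A minimal consumer sketch of that pattern (using a console host for brevity; the `qr_payload` strings are illustrative stand-ins for scanned codes). The consumer loops over GetConsumingEnumerable(), which blocks while the collection is empty instead of checking once and returning; that is the likely fix for the thread in the question that "only runs once":

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class Program
{
    // Bounded to 30 items: Add blocks when full, iteration blocks when empty.
    static readonly BlockingCollection<string> _queue = new BlockingCollection<string>(30);

    static void Main()
    {
        var consumer = new Thread(Consume) { IsBackground = true };
        consumer.Start();

        // Producer side: each scanned QR code gets queued here.
        for (int i = 0; i < 3; i++)
            _queue.Add("qr_payload_" + i);

        _queue.CompleteAdding(); // Signal that no more items will arrive.
        consumer.Join();
    }

    static void Consume()
    {
        // Blocks (rather than exiting) while the queue is empty; the loop
        // only ends after CompleteAdding() is called and the queue drains.
        foreach (string item in _queue.GetConsumingEnumerable())
            Console.WriteLine("Dequeued: " + item);
    }
}
```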

Re-synchronize Process.RedirectStandardOutput

Background
I'm writing a C# wrapper for a node.js application. In this wrapper I continuously read the standard output via Process.RedirectStandardOutput. The event is bound to the function onOutputDataReceived in an instance of the class ProcessManager. In this same instance there is also an instance of a custom event system.
[ProcessManager]
EventSystem eventSystem;

private void Start()
{
    [...]
    process.OutputDataReceived += onOutputDataReceived;
    [...]
}

private void onOutputDataReceived(object sender, DataReceivedEventArgs e)
{
    [...]
    eventSystem.call(eventName, args);
}

[EventSystem]
List<EventHandler> eventList;

public Boolean call(String eventName, dynamic args)
{
    [...]
    foreach (EventHandler handler in eventList)
    {
        handler(args);
    }
    [...]
}
The problem occurs when the event is being called. Here is an example from a winforms application using my wrapper.
Wrapper.ProcessManager procMan;
procMan.eventSystem.on(eventName, (a) =>
{
    button1.Text = someValue;
});
When run, the application crashes with the message
Cross-thread operation not valid: Control 'button1' accessed from a thread other than the thread it was created on
My issue, as I understand it, is this:
onOutputDataReceived is being executed asynchronously, on its own thread. As this same thread, which is only meant to be handling the output, goes on to call the events, I'm unintentionally multithreading my wrapper, making life harder for anyone implementing it.
Basically,
I need to run the line eventSystem.call() on the same thread that maintains the rest of the ProcessManager instance, as soon as possible after new output data has been received. Any ideas on how this can best be achieved?
A solution I've thought of is something like this
[ProcessManager]
Queue<string> waiting = new Queue<string>();
EventSystem eventSystem;

private void onOutputDataReceived(object sender, DataReceivedEventArgs e)
{
    [...]
    waiting.Enqueue(eventName);
}

private void WhenReady()
{
    while (waiting.Count > 0)
        eventSystem.call(waiting.Dequeue());
}
As far as I can see, this would involve some kind of polling every x milliseconds, which doesn't feel like a clean solution. Also, it seems to me as if such a solution would be way too expensive for when no messages are being received and too slow for when some are.
The code that executes the nodejs process and reads its output should not need to know about the threading requirements of event subscribers. Make the subscriber satisfy its own requirements:
(a) =>
{
    Invoke(new Action(() => button1.Text = someValue)); // marshal to UI thread
}
Your tentative solution would not work because it would block the UI thread.
Also, waiting is being used in an unsynchronized way; that is a separate, unrelated bug.

How to create a certain number of threads dynamically and assign a method when any of the threads completes in C#?

I have a scenario in which I need to create a number of threads dynamically, based on a configurable variable. I can only run that number of threads at a time, and as soon as one of the threads completes, I need to assign the next queued method to that same thread.
Can anyone help me resolve the above scenario with an example?
I have been researching for a week but have not been able to find a concrete solution.
There are many ways to approach this, but which is best depends on your specific problem.
However, let's assume that you have a collection of items that you want to do some work on, with a separate thread processing each item - up to a maximum number of simultaneous threads that you specify.
One very simple way to do that is to use PLINQ via AsParallel() and WithDegreeOfParallelism(), as the following console application demonstrates:
using System;
using System.Linq;
using System.Threading;

namespace Demo
{
    static class Program
    {
        static void Main()
        {
            int maxThreads = 4;
            var workItems = Enumerable.Range(1, 100);
            var parallelWorkItems = workItems.AsParallel().WithDegreeOfParallelism(maxThreads);
            parallelWorkItems.ForAll(worker);
        }

        static void worker(int value)
        {
            Console.WriteLine($"Worker {Thread.CurrentThread.ManagedThreadId} is processing {value}");
            Thread.Sleep(1000); // Simulate work.
        }
    }
}
If you run this and inspect the output, you'll see that multiple threads are processing the work items, but the maximum number of threads is limited to the specified value.
You should have a look at thread pooling; see this link for more information: Threadpooling in .NET. You will most likely have to work with callbacks in order to call a method as soon as the work in one thread is done.
There might be a smarter solution for you using async/await, depending on what you are trying to achieve. But since you explicitly ask about threads, here is a short class that does what you want:
public class MultiThreadWorker : IDisposable
{
    private readonly ConcurrentQueue<Action> _actions = new ConcurrentQueue<Action>();
    private readonly List<Thread> _threads = new List<Thread>();
    private bool _disposed;

    private void ThreadFunc()
    {
        while (true)
        {
            Action action;
            while (!_actions.TryDequeue(out action)) Thread.Sleep(100);
            action();
        }
    }

    public MultiThreadWorker(int numberOfThreads)
    {
        for (int i = 0; i < numberOfThreads; i++)
        {
            Thread t = new Thread(ThreadFunc);
            _threads.Add(t);
            t.Start();
        }
    }

    public void Dispose()
    {
        Dispose(true);
    }

    protected virtual void Dispose(bool disposing)
    {
        _disposed = true;
        foreach (Thread t in _threads)
            t.Abort();
        if (disposing)
            GC.SuppressFinalize(this);
    }

    public void Enqueue(Action action)
    {
        if (_disposed)
            throw new ObjectDisposedException("MultiThreadWorker");
        _actions.Enqueue(action);
    }
}
This class starts the required number of threads when instantiated like this:
int requiredThreadCount = 16; // your configured value
MultiThreadWorker mtw = new MultiThreadWorker(requiredThreadCount);
It then uses a ConcurrentQueue<T> to keep track of the tasks to do. You can add methods to the queue via
mtw.Enqueue(() => DoThisTask());
I made it IDisposable to make sure the threads are stopped in the end. Of course this would need a little improvement, since aborting threads like this is not best practice.
The ThreadFunc itself repeatedly checks whether there are queued actions and executes them. This could also be improved a little by patterns using Monitor.Pulse and Monitor.Wait, etc.
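For comparison, here is a sketch of the same worker with the polling loop replaced by a BlockingCollection (the approach recommended earlier on this page): idle threads park on the collection instead of sleeping in 100 ms slices, and Dispose drains and joins rather than aborting. This is an illustrative variant, not the original answer's code:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

public class MultiThreadWorker : IDisposable
{
    private readonly BlockingCollection<Action> _actions = new BlockingCollection<Action>();
    private readonly List<Thread> _threads = new List<Thread>();

    public MultiThreadWorker(int numberOfThreads)
    {
        for (int i = 0; i < numberOfThreads; i++)
        {
            var t = new Thread(ThreadFunc) { IsBackground = true };
            _threads.Add(t);
            t.Start();
        }
    }

    private void ThreadFunc()
    {
        // Blocks while the queue is empty; the loop exits cleanly after
        // Dispose() calls CompleteAdding() and the queue drains.
        foreach (Action action in _actions.GetConsumingEnumerable())
            action();
    }

    // Throws InvalidOperationException if called after Dispose().
    public void Enqueue(Action action) => _actions.Add(action);

    public void Dispose()
    {
        _actions.CompleteAdding(); // No Thread.Abort needed.
        foreach (Thread t in _threads)
            t.Join();
    }
}
```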
And as I said, async/await may lead to better solutions, but you asked for threads explicitly.

C# Is it safe to call 'BackgroundWorker' inside 'using' statement?

I have developed a Windows service. In the service I was using a BackgroundWorker to post data to my database.
I declared a BackgroundWorker inside my database constructor class and used it whenever needed.
During testing I got this error:
This BackgroundWorker is currently busy and cannot run multiple tasks concurrently
I tried to find a solution, and many people suggested using a new instance for each task. I changed my code like this:
...
using (BackgroundWorker bw = new BackgroundWorker())
{
    bw.DoWork += new DoWorkEventHandler(bkDoPost);
    bw.RunWorkerAsync(dbobj);
}
...
and my 'bkDoPost' is:
void bkDoPost(object sender, DoWorkEventArgs e)
{
    try
    {
        dbObject dbj = e.Argument as dbObject;
        this.db.Insert(dbj.tableName, dbj.data);
    }
    catch (Exception ex)
    {
        logs.logMessage("There was an error in data post. See the ErrorLog");
        logs.logError(ex);
    }
}
The code works fine during testing.
My question is: am I doing this the correct way,
or is there any issue with doing it that way?
Thanks
Don't do that. Your background worker will be disposed before your work completes.
It is better to call Dispose manually after the work completes.
Better still, consider using a different scheme for handling asynchronous work. Background worker is becoming obsolete and is targeted at UI applications, rather than services. The restriction on parallel operations highlights the intention of the class.
Don't put the BackgroundWorker into a using statement. Instead put the Dispose() call into the RunWorkerCompleted event.
Nevertheless, BackgroundWorker is maybe not the best thing to use in your case, because its primary use is to run some business code while the UI stays responsive and to update the UI automatically within the RunWorkerCompleted event.
If you don't need to interact with the UI when the job is finished, or you have a lot of smaller jobs to be done, it would be more efficient to encapsulate your jobs in Tasks.
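A short sketch of the Dispose-in-RunWorkerCompleted approach (bkDoPost and dbobj are taken from the question; the database call is replaced with a console write for illustration):

```csharp
using System;
using System.ComponentModel;

class Poster
{
    public void Post(object dbobj)
    {
        var bw = new BackgroundWorker();
        bw.DoWork += bkDoPost;
        bw.RunWorkerCompleted += (s, e) =>
        {
            // The worker has finished (or faulted) by the time this fires,
            // so it is safe to dispose it here rather than in a using block.
            ((BackgroundWorker)s).Dispose();
        };
        bw.RunWorkerAsync(dbobj);
    }

    void bkDoPost(object sender, DoWorkEventArgs e)
    {
        // Stand-in for the database insert in the question.
        Console.WriteLine("Posting: " + e.Argument);
    }
}
```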
If you have many updates, creating one BackgroundWorker for each could be very time- and memory-consuming.
I would use an independent thread that I wake up each time an update has to be done:
Queue<DbWork> dbUpdates = new Queue<DbWork>();
// AutoReset, so each successful wait consumes one signal and the
// worker blocks again once the queue has been drained.
EventWaitHandle waiter = new EventWaitHandle(false, EventResetMode.AutoReset);
...
// Init (note the thread must actually be started):
new Thread(new ThreadStart(DbUpdateWorker)).Start();
...
private void DbUpdateWorker()
{
    while (true)
    {
        DbWork currentWork = null;
        lock (dbUpdates)
        {
            if (dbUpdates.Count > 0)
                currentWork = dbUpdates.Dequeue();
        }
        if (currentWork != null)
        {
            currentWork.DoWork();
        }
        else
        {
            waiter.WaitOne(); // Sleep until AddWork signals new work.
        }
    }
}

public void AddWork(DbWork work)
{
    lock (dbUpdates)
    {
        dbUpdates.Enqueue(work);
    }
    waiter.Set();
}

C# application: handles in Task Manager increase and the system hangs

In our application we use threads and delegates to get the best performance. To gate our operations we use the System.Timers.Timer class.
After the application starts, within an hour Task Manager shows that the number of handles has increased, and CPU usage increases as well.
What can we do to dispose of the Thread and delegate objects?
The code below reproduces the issue.
public partial class Form1 : Form
{
    System.Timers.Timer MainTimer;

    public Form1()
    {
        InitializeComponent();
        MainTimer = new System.Timers.Timer();
        MainTimer.Elapsed += new System.Timers.ElapsedEventHandler(MainTimer_Elapsed);
    }

    Thread MainThread;

    private void button1_Click(object sender, EventArgs e)
    {
        MainTimer.Interval = 10;
        MainTimer.Start();
    }

    void MainTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
    {
        MainThread = new Thread(new ThreadStart(DoStart));
        MainThread.Start();
    }

    void DoStart()
    {
        PrintInfo();
    }

    delegate void PrintInfo_Delegate();

    void PrintInfo()
    {
        if (textBox1.InvokeRequired)
        {
            Invoke(new PrintInfo_Delegate(PrintInfo));
        }
        else
        {
            textBox1.Text += "Test\r\n";
        }
    }
}
Apologies if I'm misparsing something, but it looks like once the button is clicked, you're starting a new thread (in MainTimer_Elapsed) every 10 milliseconds. Without knowing more about the context / versions, it's hard to give specific advice on an alternative (async/await, TPL, etc.), but offhand it seems like anything creating new threads (especially non-background threads) like that is doing so unnecessarily.
For processing non-UI work, perhaps use ThreadPool.QueueUserWorkItem instead of creating your own threads. For UI work, maybe use BackgroundWorker. If you're exhausting the thread pool, you could also consider setting things like the max thread count, but using the TPL and avoiding the creation/management of threads yourself would be better/simpler IMHO if you're able to target .NET 4.x.
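A sketch of the ThreadPool suggestion (illustrative only; the counter stands in for the question's PrintInfo work). The Elapsed handler hands work to the pool instead of constructing a new Thread, and its OS handle, on every tick:

```csharp
using System;
using System.Threading;

class TimerExample
{
    static int _processed;

    static void Main()
    {
        var timer = new System.Timers.Timer(10); // 10 ms interval, as in the question
        timer.Elapsed += (s, e) =>
        {
            // Reuses pooled threads; no per-tick Thread object is created.
            ThreadPool.QueueUserWorkItem(_ => DoStart());
        };
        timer.Start();

        Thread.Sleep(100); // Let a few ticks fire.
        timer.Stop();
        timer.Dispose();

        Console.WriteLine("Items processed: " + _processed);
    }

    static void DoStart() => Interlocked.Increment(ref _processed);
}
```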

What's a useful pattern for waiting for all threads to finish?

I have a scenario where I will have to kick off a ton of threads (possibly up to 100), then wait for them to finish, then perform a task (on yet another thread).
What is an accepted pattern for doing this type of work? Is it simply .Join? Or is there a higher level of abstraction nowadays?
Using .NET 2.0 with VS2008.
In .NET 3.5sp1 or .NET 4, the TPL would make this much easier. However, I'll tailor this to .NET 2 features only.
There are a couple of options. Using Thread.Join is perfectly acceptable, especially if the threads are all ones you are creating manually. This is very easy, reliable, and simple to implement. It would probably be my choice.
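In .NET 2 terms, the Thread.Join pattern is just a loop over the threads you started; a minimal sketch (the Work body is a placeholder):

```csharp
using System;
using System.Threading;

class JoinExample
{
    static void Main()
    {
        Thread[] threads = new Thread[10];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(new ThreadStart(Work));
            threads[i].Start();
        }

        // Block until every worker has finished, in any order.
        foreach (Thread t in threads)
            t.Join();

        Console.WriteLine("All threads done; run the follow-up task here.");
    }

    static void Work()
    {
        Thread.Sleep(100); // Simulate work.
    }
}
```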
However, the other option would be to create a counter for the total amount of work, and to use a reset event when the counter reaches zero. For example:
class MyClass {
    int workToComplete;   // Total number of work items
    ManualResetEvent mre; // For waiting

    void StartThreads()
    {
        this.workToComplete = 100;
        mre = new ManualResetEvent(false);
        int total = workToComplete;
        for (int i = 0; i < total; ++i)
        {
            Thread thread = new Thread(new ThreadStart(this.ThreadFunction));
            thread.Start(); // Kick off the thread
        }
        mre.WaitOne(); // Will block until all work is done
    }

    void ThreadFunction()
    {
        // Do your work
        if (Interlocked.Decrement(ref this.workToComplete) == 0)
            this.mre.Set(); // Allow the main thread to continue here...
    }
}
Did you look at ThreadPool? See this link for more information: ThreadPool tutorial, where the author solves the same task you're asking about.
What's worked well for me is to store each thread's ManagedThreadId in a dictionary as I launch it, and then have each thread pass its id back through a callback method when it completes. The callback method deletes the id from the dictionary and checks the dictionary's Count property; when it's zero you're done. Be sure to lock around the dictionary both for adding to and deleting from it.
I am not sure that any kind of standard thread locking or synchronization mechanisms will really work with so many threads. However, this might be a scenario where some basic messaging might be an ideal solution to the problem.
Rather than using Thread.Join, which will block (and could be very difficult to manage with so many threads), you might try setting up one more thread that aggregates completion messages from your worker threads. When the aggregator has received all expected messages, it completes. You could then use a single WaitHandle between the aggregator and your main application thread to signal that all of your worker threads are done.
public class WorkerAggregator
{
    public WorkerAggregator(EventWaitHandle completionEvent)
    {
        m_completionEvent = completionEvent;
        m_workers = new Dictionary<int, Thread>();
    }

    // EventWaitHandle rather than the base WaitHandle, which has no Set().
    private readonly EventWaitHandle m_completionEvent;
    private readonly Dictionary<int, Thread> m_workers;

    public void StartWorker(Action worker)
    {
        // Use Thread.CurrentThread inside the lambda; the 'thread' local
        // cannot be referenced from within its own initializer.
        var thread = new Thread(() =>
        {
            worker();
            notifyComplete(Thread.CurrentThread.ManagedThreadId);
        });
        lock (m_workers)
        {
            m_workers.Add(thread.ManagedThreadId, thread);
        }
        thread.Start();
    }

    private void notifyComplete(int threadID)
    {
        bool done = false;
        lock (m_workers)
        {
            m_workers.Remove(threadID);
            done = m_workers.Count == 0;
        }
        if (done) m_completionEvent.Set();
    }
}
Note: I have not tested the code above, so it might not be 100% correct. However, I hope it illustrates the concept enough to be useful.
