C# windows service multiple threads high CPU usage - c#

I am working on a Windows service.
I need to run 380 methods on threads in the program.
Here is my code.
threadChunkList is a list of 20 entries, each entry being an array of threads, declared like this:
List<Thread[]> threadChunkList = new List<Thread[]>();
When the following loop runs the methods, CPU usage in Task Manager goes over 90%.
foreach (Thread[] mainthread in threadChunkList)
{
    Thread OneFinalThread = new Thread(() =>
    {
        foreach (Thread chunkthread in mainthread)
        {
            chunkthread.Start();
        }
        foreach (var thread in mainthread)
        {
            thread.Join();
        }
    });
    OneFinalThread.Priority = ThreadPriority.Lowest;
    OneFinalThread.Start();
    for (; ; )
    {
        if (OneFinalThread.IsAlive)
        {
        }
        else
        {
            break;
        }
    }
}
Can anyone please let me know what is going wrong with the above code? Also, I don't know whether this is the proper way to manage threads or not. How can I use a sleep on the thread to reduce CPU usage, and how do I remove the previous threads from memory to free memory?
Thank you
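For context, the for (; ; ) loop that polls IsAlive is a busy-wait: it spins on one core without yielding, which by itself drives CPU usage up. A minimal sketch of the same outer loop with the spin replaced by a blocking Join() call (everything else left as in the question):

foreach (Thread[] mainthread in threadChunkList)
{
    Thread oneFinalThread = new Thread(() =>
    {
        foreach (Thread chunkthread in mainthread)
        {
            chunkthread.Start();
        }
        foreach (var thread in mainthread)
        {
            thread.Join();
        }
    });
    oneFinalThread.Priority = ThreadPriority.Lowest;
    oneFinalThread.Start();

    // Block until the wrapper thread finishes instead of spinning on IsAlive;
    // Join() puts the waiting thread to sleep, so it burns essentially no CPU.
    oneFinalThread.Join();
}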

Related

Instantiating a Thread within a foreach loop in C#

I have a function that connects to multiple SQL Server instances to fetch a set of data from each server, so the data can be compared across environments.
I have a collection of connection strings, and I am calling this method in a foreach loop for each connection string in the collection.
Since the data is fetched from the servers one at a time, this takes a lot of time.
I would like to know: if I use threads to call this method each time, what would be the best way to do it?
There are a couple of ways to do this:
1.) Create a set of tasks and then "await Task.WhenAll(listOfTasks)"
2.) Use Parallel.ForEach
3.) Manage the threads yourself
Managing threads
I do this in 2 steps:
1.) Create a list of threads and start them:
List<Thread> threads = new List<Thread>();
foreach (var connectionString in ConnectionStrings)
{
    var thread = new Thread(DoWork);
    thread.Start(connectionString);
    threads.Add(thread);
}
2.) Join the threads to the current thread, which blocks until all of them are complete:
foreach (var thread in threads)
{
    thread.Join();
}
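For comparison, a rough sketch of option 1 above (tasks plus Task.WhenAll). FetchData and SomeTypeDto are placeholder names for the per-server work and its result, not from the original post, and the snippet has to live inside an async method:

// Queue one task per connection string; Task.Run pushes the work to the thread pool.
var tasks = new List<Task<SomeTypeDto>>();
foreach (var connectionString in ConnectionStrings)
{
    tasks.Add(Task.Run(() => FetchData(connectionString)));   // FetchData is hypothetical
}
// Wait for every fetch to finish without blocking the calling thread.
SomeTypeDto[] results = await Task.WhenAll(tasks);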
You could Join the threads and make the program wait for all of them until they are finished. It is good practice to do this before moving to the next step. Look at the code below with the comments, for example:
// list of threads
List<Thread> threads = new List<Thread>();
// list of the results for each thread
List<SomeTypeDto> results = new List<SomeTypeDto>();
foreach (var connectionString in connectionStringList) {
    // create the thread
    Thread thread = new Thread(() => {
        // access the database
        SomeTypeDto result = /* process some result data */;
        lock (results) {
            results.Add(result);
        }
    });
    threads.Add(thread);
}
// start all threads
foreach (Thread thread in threads) {
    thread.Start();
}
// Wait for all the threads to finish
foreach (Thread thread in threads) {
    thread.Join();
}

Sporadic memory bloat using Toub's thread pool for long running tasks?

I have read that Toub's thread pool is a good solution for longer-running tasks, so I implemented it in the following code. I'm not even sure if my implementation is a good one, because I seem to have sporadic memory bloat. The process runs at around 50 MB most of the time, then spikes to almost a GB and stays there.
The thread pool implementation is as follows (should I even be doing this?):
private void Run()
{
    while (!_stop)
    {
        // Create new threads if we have room in the pool
        while (ManagedThreadPool.ActiveThreads < _runningMax)
        {
            ManagedThreadPool.QueueUserWorkItem(new WaitCallback(FindWork));
        }
        // Pause for a second so we don't run the CPU to death
        Thread.Sleep(1000);
    }
}
The method FindWork looks like this:
private void FindWork(object stateInfo)
{
    bool result = false;
    bool process = false;
    bool queueResult = false;
    Work_Work work = null;
    try
    {
        using (Queue workQueue = new Queue(_workQueue))
        {
            // Look for work on the work queue
            workQueue.Open(Queue.Mode.Consume);
            work = workQueue.ConsumeWithBlocking<Work_Work>();
            // Do some work with the message from the queue ...
            return;
The ConsumeWithBlocking method blocks if there is nothing in the queue. Then we call return to exit the thread if we successfully retrieve a message and process it.
Typically we run 10 threads, most of which are usually in the blocking state (WaitSleepJoin). The whole point of this is to have 10 threads running at all times.
Am I going about this all wrong?
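For comparison, the "keep 10 consumers alive" goal described above is often handled with a fixed set of long-lived threads blocking on a shared queue, instead of repeatedly queuing thread-pool work items every second. A rough, self-contained sketch of that pattern; the string payload and Console output are placeholders, not the poster's queue types:

using System;
using System.Collections.Concurrent;
using System.Threading;

class ConsumerPoolSketch
{
    // Shared queue; GetConsumingEnumerable() blocks while the queue is empty,
    // so idle workers consume no CPU.
    static readonly BlockingCollection<string> Queue = new BlockingCollection<string>();

    static void Main()
    {
        var workers = new Thread[10];
        // Start a fixed number of long-lived consumer threads once,
        // instead of re-queuing thread-pool work items in a loop.
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = new Thread(() =>
            {
                foreach (string item in Queue.GetConsumingEnumerable())
                {
                    Console.WriteLine("processing " + item);  // placeholder for real work
                }
            });
            workers[i].Start();
        }

        // Producers just add work; a service would keep doing this until it stops.
        Queue.Add("example work item");

        // On shutdown: let the workers drain the queue and exit, then wait for them.
        Queue.CompleteAdding();
        foreach (var w in workers) w.Join();
    }
}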

Does EventWaitHandle.Set wake another waiting thread?

I'm not too sure about something regarding EventWaitHandle.Set.
When it is called from the current thread and there is another thread waiting on the event, does the current thread get put to sleep so that the other thread gets to run (ASAP)?
I'm asking because in some of my code I have to add an object to a queue shared between threads, and that operation really has to go as quickly as possible. But in the other thread, where that queue is consumed, speed is not required.
So I'm proceeding like this:
// Speed "not required"
mailThread = new Task(() =>
{
    for (; ; )
    {
        MailMessage mail;
        pushMailLockMREvt.WaitOne();
        {
            if (mails.Count == 0)
            {
                mail = null;
            }
            else
            {
                mail = mails.Dequeue();
            }
        }
        pushMailLockMREvt.Set(); // Does this put the current thread to sleep or lower its priority??
        if (mail != null)
        {
            try
            {
                MailClient.Send(mail);
            }
            catch (Exception exe)
            {
            }
        }
        else
        {
            mailSem.WaitOne();
        }
    }
});
[...]
// Speed required
var task = new Task(() =>
{
    pushMailLockMREvt.WaitOne(); // ASAP please...
    {
        mails.Enqueue(mailMessage);
        if (mails.Count == 1)
        {
            mailSem.Set();
        }
    }
    pushMailLockMREvt.Set();
});
No, the current thread will not sleep just because it signals the wait handle. But it may relinquish the rest of its timeslice (that's pretty subtle and low-level and isn't something you should rely on). You should not need to take any special action.
It is probably the case that the thread that has just finished waiting will get a brief boost in thread priority.
See this documentation for the Windows API function SetThreadPriorityBoost() from which I quote:
When a thread is running in one of the dynamic priority classes, the system temporarily boosts the thread's priority when it is taken out of a wait state.
Also see this documentation.
So the thread that just woke up from the wait should (normally) get a small boost. I'm not totally sure that this also applies to managed threads, but I seem to remember reading somewhere that it does. I can't find the source of that, though.

Loop through list and create multiple threads

I want to loop through a list of URLs and check each URL if the website is down or not using multiple threads.
My approach:
while (_lURLs.Count > 0)
{
    while (_iRunningThreads < _iNumThreads)
    {
        Thread t = new Thread(new ParameterizedThreadStart(CheckWebsite));
        string strUrl = GetNextURL();
        if (!string.IsNullOrEmpty(strUrl))
        {
            t.Start(strUrl);
            _iRunningThreads++;
        }
        else
        {
            break;
        }
    }
}
private string GetNextURL()
{
    lock (_lURLs)
    {
        if (_lURLs.Count > 0)
        {
            string strRetVal = _lURLs[0];
            _lURLs.RemoveAt(0);
            return strRetVal;
        }
        else
        {
            return string.Empty;
        }
    }
}
When a thread finishes, the _iRunningThreads counter gets decremented.
My problem is that the outer while loop, "while (_lURLs.Count > 0)", blocks everything.
Adding Application.DoEvents() in the outer while loop helps, but I want to use the code in a C# library where Application.DoEvents() is not available.
Thank you for your help.
Instead of managing the threads yourself, you can use the TPL.
Also, if you're using .NET Framework 4.5 you can even use async/await and the WhenAll method to prevent blocking...
Here is a small example:
private async Task CheckUrl()
{
    List<Task> tasks = new List<Task>();
    string url = GetNextUrl();
    while (!String.IsNullOrEmpty(url))
    {
        // Copy the loop variable so each task captures its own URL,
        // not the variable that gets reassigned on the next iteration.
        string currentUrl = url;
        tasks.Add(Task.Run(() => CheckWebSite(currentUrl)));
        url = GetNextUrl();
    }
    await Task.WhenAll(tasks);
    // All tasks have finished...
}
I think using the .NET ThreadPool would be a good idea in this case, if the tasks take quite a short time to complete.
Check out: http://msdn.microsoft.com/en-us/library/4yd16hza.aspx
This allows you to simplify your code a bit, as the ThreadPool automatically manages the number of worker threads. You just have to call ThreadPool.QueueUserWorkItem for each URL you have and increment a running-task counter. Queuing items into the ThreadPool won't block the UI thread.
Have the ThreadPool tasks decrement the counter (as you do now), and when the counter reaches zero (all tasks have run), call a callback function so that your main code knows when all the URLs have been processed. You can update the UI or whatever else you want to do from that callback.
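A rough sketch of that counter-plus-callback idea, reusing the field and method names from the question where possible (it assumes _iRunningThreads is an int field; OnAllUrlsProcessed is a hypothetical callback):

// Set the counter before queuing anything.
_iRunningThreads = _lURLs.Count;

foreach (string url in _lURLs)
{
    string currentUrl = url;                    // per-item copy for the closure
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            CheckWebsite(currentUrl);           // existing worker method from the question
        }
        finally
        {
            // The last worker to finish fires the completion callback.
            if (Interlocked.Decrement(ref _iRunningThreads) == 0)
            {
                OnAllUrlsProcessed();           // hypothetical callback
            }
        }
    });
}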

Multi-Threading - waiting for all threads to be signalled

I have scenarios where I need a main thread to wait until every one of a set of possibly more than 64 threads has completed its work, and for that I wrote the following helper utility (to avoid the 64-handle limit on WaitHandle.WaitAll()):
public static void WaitAll(WaitHandle[] handles)
{
    if (handles == null)
        throw new ArgumentNullException("handles",
            "WaitHandle[] handles was null");
    foreach (WaitHandle wh in handles) wh.WaitOne();
}
With this utility method, however, each wait handle is only examined after every preceding one in the array has been signalled, so the waiting is in effect sequential, and it will not work if the wait handles are AutoResetEvent wait handles (which clear as soon as a waiting thread has been released).
To fix this issue I am considering changing the code to the following, but I would like others to check it and see whether it looks like it will work, whether anyone sees any issues with it, or whether anyone can suggest a better way...
Thanks in advance:
public static void WaitAllParallel(WaitHandle[] handles)
{
    if (handles == null)
        throw new ArgumentNullException("handles",
            "WaitHandle[] handles was null");
    int actThreadCount = handles.Length;
    object locker = new object();
    foreach (WaitHandle wh in handles)
    {
        WaitHandle qwH = wh;
        ThreadPool.QueueUserWorkItem(
            delegate
            {
                try { qwH.WaitOne(); }
                finally { lock (locker) --actThreadCount; }
            });
    }
    while (actThreadCount > 0) Thread.Sleep(80);
}
If you know how many threads you have, you can use an interlocked decrement. This is how I usually do it:
{
    eventDone = new AutoResetEvent(false);
    totalCount = 128;
    for (int i = 0; i < totalCount; i++) { ThreadPool.QueueUserWorkItem(ThreadWorker, ...); }
}
void ThreadWorker(object state)
{
    try
    {
        // ... work and more work
    }
    finally
    {
        int runningCount = Interlocked.Decrement(ref totalCount);
        if (0 == runningCount)
        {
            // This is the last thread, notify the waiters
            eventDone.Set();
        }
    }
}
Actually, most of the time I don't even signal; instead I invoke a callback that continues the processing from where the waiter would have continued. Fewer blocked threads, more scalability.
I know this is different and may not apply to your case (for example, it surely will not work if some of those handles are not threads but I/O or events), but it may be worth thinking about.
I'm not sure what exactly you're trying to do, but would a CountdownEvent (.NET 4.0) conceptually solve your problem?
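For reference, a minimal self-contained sketch of what the CountdownEvent approach could look like (the worker body is a placeholder):

using System;
using System.Threading;

class CountdownSketch
{
    static void Main()
    {
        int workerCount = 100;                         // can be well over 64
        using (var countdown = new CountdownEvent(workerCount))
        {
            for (int i = 0; i < workerCount; i++)
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    try
                    {
                        // ... do the real work here (placeholder)
                    }
                    finally
                    {
                        countdown.Signal();            // one signal per finished worker
                    }
                });
            }
            countdown.Wait();                          // blocks until every worker has signalled
            Console.WriteLine("All workers finished.");
        }
    }
}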
I'm not a C# or .NET programmer, but you could use a semaphore that is posted when one of your worker threads exits. The monitoring thread would simply wait on the semaphore n times where n is the number of worker threads. Semaphores are traditionally used to count resources in use but they can be used to count jobs completed by waiting on the same semaphore for n times.
When working with lots of simultaneous threads, I prefer to add each thread's ManagedThreadId into a Dictionary when I start the thread, and then have each thread invoke a callback routine that removes the dying thread's id from the Dictionary. The Dictionary's Count property tells you how many threads are active. Use the value side of the key/value pair to hold info that your UI thread can use to report status. Wrap the Dictionary with a lock to keep things safe.
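A loose sketch of that Dictionary-based bookkeeping; the class and method names (ThreadTrackerSketch, StartWorker) and the status text are made up purely for illustration:

using System.Collections.Generic;
using System.Threading;

class ThreadTrackerSketch
{
    // key = ManagedThreadId, value = status text the UI thread can display
    private readonly Dictionary<int, string> _activeThreads = new Dictionary<int, string>();
    private readonly object _threadLock = new object();

    public void StartWorker(ThreadStart work)
    {
        var thread = new Thread(() =>
        {
            int id = Thread.CurrentThread.ManagedThreadId;
            lock (_threadLock)
            {
                _activeThreads[id] = "running";
            }
            try
            {
                work();
            }
            finally
            {
                // On the way out, remove the dying thread's id.
                lock (_threadLock)
                {
                    _activeThreads.Remove(id);
                }
            }
        });
        thread.Start();
    }

    // Number of threads still active; safe to read from the UI thread.
    public int ActiveThreadCount
    {
        get { lock (_threadLock) { return _activeThreads.Count; } }
    }
}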
ThreadPool.QueueUserWorkItem(o =>
{
    try
    {
        using (var h = (o as WaitHandle))
        {
            if (!h.WaitOne(100000))
            {
                // Alert main thread of the timeout
            }
        }
    }
    finally
    {
        Interlocked.Decrement(ref actThreadCount);
    }
}, wh);
