static void Main(string[] args)
{
    Test c = new Test();
    Thread oThread = new Thread(new ThreadStart(c.Lock));
    oThread.Start();
    Thread oThread2 = new Thread(new ThreadStart(c.AfterLock));
    oThread2.Start();
    Console.ReadLine();
}
public class Test
{
    public Dictionary<string, string> dic = new Dictionary<string, string>();

    public void Lock()
    {
        lock (((IDictionary)dic).SyncRoot)
        {
            for (var i = 3; i < 200; i++)
            {
                Console.WriteLine(i.ToString());
                dic.Add(i.ToString(), i.ToString());
            }
        }
    }

    public void AfterLock()
    {
        Console.WriteLine(dic["100"]);
    }
}
AfterLock throws an exception: "The given key was not present in the dictionary".
dic was locked by the first thread, so why doesn't AfterLock wait for the first thread's lock?
You need to lock around access to the object both when reading the data and when writing it. Synchronizing around only the writes is not safe.
Nothing prevents your AfterLock method from running before your Lock method. It can just as easily run before it, or, since you're not synchronizing around the read, the two can even run in parallel. A lock only blocks other threads while it is held; it does not order who acquires it first. You need to add a synchronization mechanism that ensures the read runs only after the write, since its execution clearly depends on the write having completed first.
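One way to enforce that ordering is an event that the writer signals when it is done. This is a minimal sketch, not from the original post; the `_written` event and `_sync` field are names introduced here for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public class Test
{
    private readonly object _sync = new object();
    private readonly ManualResetEvent _written = new ManualResetEvent(false);
    public Dictionary<string, string> dic = new Dictionary<string, string>();

    public void Lock()
    {
        lock (_sync)
        {
            for (var i = 3; i < 200; i++)
                dic.Add(i.ToString(), i.ToString());
        }
        _written.Set(); // signal waiting readers that the data is ready
    }

    public void AfterLock()
    {
        _written.WaitOne(); // block until Lock() has finished writing
        lock (_sync)        // still lock the read: it protects against later concurrent writes
        {
            Console.WriteLine(dic["100"]);
        }
    }
}
```

With this version, AfterLock cannot observe the dictionary until Lock has finished populating it, regardless of which thread is scheduled first.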
You should lock on a dedicated object, not the one you're trying to use.
class Example
{
    // The lock object must be initialized, and must be static here since Load() is static.
    private static readonly object cacheLock = new object();

    public static void Load()
    {
        // . . .
        lock (cacheLock)
        {
            CacheTable = (ThreadSafeDictionary<String,
                ThreadSafeDictionary<Object, Object>>)CacheTableTmp.Clone();
        }
    }
}
I have a method that is used by multiple threads at the same time. Each of these threads calls another method to receive the data it needs from a List (each one should get different data, not the same).
I wrote this code to get data from the list and use it in the threads:
public static List<string> ownersID;
static int idIdx = 0;

public static string[] GetUserID()
{
    if (idIdx < ownersID.Count - 1)
    {
        string[] ret = { ownersID[idIdx], idIdx.ToString() };
        idIdx++;
        return ret;
    }
    else if (idIdx >= ownersID.Count)
    {
        string[] ret = { "EndOfThat" };
        return ret;
    }
    return new string[0];
}
Then each thread uses this code to receive the data and remove it from the list:
string[] arrOwner = GetUserID();
string id = arrOwner[0];
ownersID.RemoveAt(Convert.ToInt32(arrOwner[1]));
But sometimes two or more threads end up with the same data.
Is there a better way to do this?
If you want to do it with a List, just add a little locking:
private object _lock = new object();
private List<string> _list = new List<string>();

public void Add(string someStr)
{
    lock (_lock)
    {
        if (_list.Any(s => s == someStr)) // already added (inside lock)
            return;
        _list.Add(someStr);
    }
}

public void Remove(string someStr)
{
    lock (_lock)
    {
        if (!_list.Any(s => s == someStr)) // already removed (inside lock)
            return;
        _list.Remove(someStr);
    }
}
With that, no thread will be adding or removing anything while another thread does the same. Your list is protected from multi-threaded access, and you ensure you only ever have one of each item. Alternatively, you can achieve the same thing with ConcurrentDictionary<TKey, TValue>.
Update: I removed the pre-lock check because of this MSDN thread-safety statement:
It is safe to perform multiple read operations on a List<T>, but issues can occur if the collection is modified while it's being read.
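As a sketch of another option (my own illustration, not from the answer above): hand out a unique index to each thread with Interlocked.Increment, so no two threads can ever receive the same slot, and no RemoveAt is needed afterwards:

```csharp
using System.Collections.Generic;
using System.Threading;

public static class IdDispenser
{
    public static List<string> ownersID = new List<string>();

    // Start at -1 because Interlocked.Increment returns the *new* value.
    private static int idIdx = -1;

    // Returns a distinct ID per call, or null when the list is exhausted.
    public static string GetUserID()
    {
        int myIndex = Interlocked.Increment(ref idIdx); // atomic: each caller gets a unique index
        return myIndex < ownersID.Count ? ownersID[myIndex] : null;
    }
}
```

Because the index is claimed atomically and never reused, the callers no longer need to remove entries from the list at all, which removes the race entirely.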
On a larger scale of application you can use a .NET queue to communicate between threads.
The benefit of using a concurrent queue is that you don't need to take an explicit lock on a shared object, which decreases latency. From the main thread to Thread A, Thread B and Thread C, the data is added and received through the queue, with no explicit locking.
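A small sketch of that queue-based design (my own, assuming .NET 4.5+): BlockingCollection<T> handles the synchronization internally, so producer and consumer never take an explicit lock:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Pipeline
{
    static void Main()
    {
        var queue = new BlockingCollection<string>();

        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 10; i++)
                queue.Add("item " + i);
            queue.CompleteAdding(); // signal: no more items will arrive
        });

        var consumer = Task.Run(() =>
        {
            // GetConsumingEnumerable blocks until items arrive,
            // and the loop ends once CompleteAdding has been called.
            foreach (var item in queue.GetConsumingEnumerable())
                Console.WriteLine(item);
        });

        Task.WaitAll(producer, consumer);
    }
}
```

The consumer never busy-waits: it sleeps inside the collection until the producer adds an item or completes.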
The code below is an example of multi-threading that the prof presented in class. I am new to coding (first course). I have read up on multi-threading and using locks. Reading the theory is fun. var fun = Theory.Read(multiThreading); Actually coding threads and locks seems to baffle me.
I am trying to understand how the two threads in the code below will behave. From testing the code, it looks like Lock1 is never released and message2 is never enqueued, but I might be wrong. It looks like there is a synchronization issue. Is this an example of a deadlock?
I am also wondering why locks and threads are required when two different queues are used; I don't see a shared resource.
Is there a way to fix this code to prevent the synchronization issue?
private static object Lock1 = new object(); // Protect MessageQueueOne
private static object Lock2 = new object(); // Protect MessageQueueTwo

private static Queue<string> MessageQueueOne = new Queue<string>();
private static Queue<string> MessageQueueTwo = new Queue<string>();

private static void AddMessages(string message1, string message2)
{
    lock (Lock1)
    {
        // (1) Thread 1 is here...
        MessageQueueOne.Enqueue(message1);
        lock (Lock2)
        {
            MessageQueueTwo.Enqueue(message2);
        }
    }
}

private static void RemoveMessages()
{
    lock (Lock2)
    {
        if (MessageQueueTwo.Count > 0)
        {
            // (2) Thread 2 is here...
            Console.WriteLine(MessageQueueTwo.Dequeue());
        }
        lock (Lock1)
        {
            if (MessageQueueOne.Count > 0)
            {
                Console.WriteLine(MessageQueueOne.Dequeue());
            }
        }
    }
}
private static void Main()
{
    Task taskOne = Task.Run(() =>
    {
        for (int i = 0; i < 100; ++i)
        {
            AddMessages($"Message One: {DateTime.Now}", $"Message Two: {DateTime.UtcNow}");
            Thread.Sleep(25);
        }
    });

    Task taskTwo = Task.Run(() =>
    {
        for (int i = 0; i < 100; ++i)
        {
            RemoveMessages();
            Thread.Sleep(25);
        }
    });

    taskOne.Wait();
    taskTwo.Wait();

    Console.Write("Tasks are finished");
    Console.ReadKey();
}
The code in the post is a classic example of a deadlock, and it is expected to deadlock most of the time. See the links in the Wikipedia article on deadlocks for more.
What leads to the deadlock: one thread locks Lock1 and waits for Lock2, while at the same time the other thread holds Lock2 and will only release it after acquiring Lock1, which the waiting thread will never release.
Standard solutions:
listen in your class to learn the answer
read existing examples
if the above fails, one option is to acquire the resources in a fixed order (i.e. if you need to lock more than one resource, take Lock1 first, then Lock2, and so on) in all threads (see "Would you explain lock ordering?").
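The fixed-order fix can be sketched directly against the code above (my own illustration): make RemoveMessages take the locks in the same order as AddMessages, so the circular wait can no longer occur.

```csharp
// Both methods now acquire Lock1 before Lock2, breaking the deadlock cycle.
private static void RemoveMessages()
{
    lock (Lock1)       // same order as AddMessages: Lock1 first...
    {
        lock (Lock2)   // ...then Lock2
        {
            if (MessageQueueTwo.Count > 0)
            {
                Console.WriteLine(MessageQueueTwo.Dequeue());
            }
            if (MessageQueueOne.Count > 0)
            {
                Console.WriteLine(MessageQueueOne.Dequeue());
            }
        }
    }
}
```

With a consistent acquisition order, a thread holding Lock1 can never be blocked by a thread that holds Lock2 and wants Lock1.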
In the following code I am using two threads to share a resource, in this example a queue. Do I need to use a lock while enqueuing or dequeuing? If yes, why? The program seems to work fine without one.
class Program
{
    static Queue<string> sQ = new Queue<string>();

    static void Main(string[] args)
    {
        Thread prodThread = new Thread(ProduceData);
        Thread consumeThread = new Thread(ConsumeData);
        prodThread.Start();
        consumeThread.Start();
        Console.ReadLine();
    }

    private static void ProduceData()
    {
        for (int i = 0; i < 100; i++)
        {
            sQ.Enqueue(i.ToString());
        }
    }

    private static void ConsumeData()
    {
        while (true)
        {
            if (sQ.Count > 0)
            {
                string s = sQ.Dequeue();
                Console.WriteLine("DEQUEUE::::" + s);
            }
        }
    }
}
Yes you do; System.Collections.Generic.Queue<T> is not thread-safe for being written to and read from at the same time. You either need to lock on the same object before enqueuing or dequeuing, or, if you are using .NET 4/4.5, use the System.Collections.Concurrent.ConcurrentQueue<T> class instead and use the TryDequeue method.
The reason your current implementation has not caused you a problem so far is the Thread.Sleep(500) call (not something you should be using in production code), which means that prodThread doesn't write to the queue while consumeThread reads from it, since the read operation takes less than 500 ms. If you remove the Thread.Sleep, odds are it will throw an exception at some point.
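A sketch of the ConcurrentQueue-based version of the program above (my own, assuming .NET 4+): TryDequeue atomically checks for and removes an item, so the Count-then-Dequeue race in the original consumer disappears.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class Program
{
    static ConcurrentQueue<string> sQ = new ConcurrentQueue<string>();

    static void Main(string[] args)
    {
        new Thread(ProduceData).Start();
        new Thread(ConsumeData).Start();
        Console.ReadLine();
    }

    private static void ProduceData()
    {
        for (int i = 0; i < 100; i++)
            sQ.Enqueue(i.ToString());
    }

    private static void ConsumeData()
    {
        while (true)
        {
            string s;
            if (sQ.TryDequeue(out s))   // atomic check-and-remove
                Console.WriteLine("DEQUEUE::::" + s);
            else
                Thread.Sleep(1);        // back off instead of spinning hot
        }
    }
}
```

Note that with the original Queue<T>, even checking Count before Dequeue is racy: another thread could drain the queue between the check and the call.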
I have a main thread, and I create another n threads. Each of the n threads populates a List<String>. When all the threads are finished they are joined, and I would like to have all those n lists in a List<List<String>> in the main thread. The main thread should be able to operate on that List<List<String>>; each of the n threads contributes one List<String>.
I have C# .NET 3.5 and I would like to avoid a static List<List<String>>.
Thread t = new Thread(someObjectForThreading.goPopulateList);
list_threads.Add(t);
list_threads.Last().Start();
All those threads in list_threads go on to populate their lists, and when they are finished I would like to have something like:
//this = mainThread
this.doSomethingWith(List<List<String>>)
Edit: Hmmm, is there no "standard concept" for solving such a task? Many threads operate on a list, and when all are joined, the main thread can proceed to operate on the list.
Edit 2: the List<List<String>> listOfLists must not be static; it can be public or private. First I need the n threads to operate on (and lock) listOfLists and insert their lists; after all n threads are done inserting, I would join the threads, and the main thread could proceed with the business logic and operate on listOfLists.
I think I will re-read some parts of http://www.albahari.com/threading/ and report back.
Here's a simple implementation using wait handles (in this case ManualResetEvent) to let each worker thread signal the main thread that it's done with its work. I hope this is somewhat self-explanatory:
private List<List<string>> _listOfLists = new List<List<string>>();

public void CreateListOfLists()
{
    var waitHandles = new List<WaitHandle>();
    foreach (int count in Enumerable.Range(1, 5))
    {
        var t = new Thread(CreateListOfStringWorker);
        var handle = new ManualResetEvent(false);
        t.Start(handle);
        waitHandles.Add(handle);
    }
    // wait until every thread has signalled completion by calling Set()
    WaitHandle.WaitAll(waitHandles.ToArray());
    // do something with _listOfLists
    // ...
}

public void CreateListOfStringWorker(object state)
{
    var list = new List<string>();
    lock (_listOfLists)
    {
        _listOfLists.Add(list);
    }
    list.Add("foo");
    list.Add("bar");
    ((ManualResetEvent)state).Set(); // I'm done
}
Note how I'm only locking while adding each thread's List to the main list of lists. There is no need to lock for every string added, as each thread has its own List. Make sense?
Edit:
The point of using the wait handles is to wait for each thread to complete before working on your list of lists. If you don't wait, then you run the risk of trying to enumerate one of the List instances while its worker is still adding strings to it. That will cause an InvalidOperationException to be thrown, and your thread(s) will die: you cannot enumerate a collection and simultaneously modify it.
Rather than making the List<List<String>> static, make a local List<List<String>> and pass it to the object the thread will be running. Of course, you'll need to lock around access to the list, since it's being shared by multiple threads. (The original answer used Java's ArrayList.synchronized here; the C# idiom is a lock statement.)
List<List<String>> list = new List<List<String>>();
// later
SomeObject o = new SomeObjectForThreading(list);
Thread t = new Thread(o.goPopulateList);
list_threads.Add(t);
list_threads.Last().Start();
// even later
this.doSomethingWith(list);
In o.goPopulateList, you might have:
List<String> temp = new List<String>();
temp.Add("random text");
temp.Add("other random text");
lock (this.list) // this.list was passed in at construction time
{
    this.list.Add(temp);
}
I would provide each thread with a call-back method that updates the list in the main thread, protected with a lock statement.
Edit:
class Program
{
    static List<string> listOfStuff = new List<string>();

    static void Main(string[] args)
    {
        List<Thread> threads = new List<Thread>();
        for (int i = 0; i < 20; i++)
        {
            var thread = new Thread(() => { new Worker(new AppendToListDelegate(AppendToList)).DoWork(); });
            thread.IsBackground = true;
            threads.Add(thread);
        }
        threads.ForEach(n => n.Start());
        threads.ForEach(n => n.Join());
        Console.WriteLine("Count: " + listOfStuff.Count());
        Console.ReadLine();
    }

    static void AppendToList(string arg)
    {
        lock (listOfStuff)
        {
            listOfStuff.Add(arg);
        }
    }
}

public delegate void AppendToListDelegate(string arg);

class Worker
{
    AppendToListDelegate Appender;

    public Worker(AppendToListDelegate appenderArg)
    {
        Appender = appenderArg;
    }

    public void DoWork()
    {
        for (int j = 0; j < 10000; j++)
        {
            Appender(Thread.CurrentThread.ManagedThreadId.ToString() + "." + j.ToString());
        }
    }
}
private void GoPopulateList()
{
    // Do threaded stuff...
    // populate this thread's own list...
    // All done; merge it into the main list:
    foreach (List<string> list in myThreadedList)
    {
        lock (myMainList) // other threads may be merging at the same time
        {
            myMainList.Add(list);
        }
    }
}
Here is some code that perpetually generates GUIDs. I've written it to learn about threading. In it you'll notice that I've put a lock around the section where I generate GUIDs and enqueue them, even though the ConcurrentQueue is thread-safe. That's because my actual code will need to use NHibernate, so I must make sure that only one thread gets to fill the queue.
While I monitor this code in Task Manager, I notice the process drops the number of threads from 18 (on my machine) to 14, but no lower. Is this because my code isn't good?
Also, can someone refactor this if they see fit? I love shorter code.
class Program
{
    ConcurrentNewsBreaker Breaker;

    static void Main(string[] args)
    {
        new Program().Execute();
        Console.Read();
    }

    public void Execute()
    {
        Breaker = new ConcurrentNewsBreaker();
        QueueSome();
    }

    public void QueueSome()
    {
        ThreadPool.QueueUserWorkItem(DoExecute);
    }

    public void DoExecute(Object State)
    {
        String Id = Breaker.Pop();
        Console.WriteLine(String.Format("- {0} {1}", Thread.CurrentThread.ManagedThreadId, Id));
        if (Breaker.Any())
            QueueSome();
        else
            Console.WriteLine(String.Format("- {0} XXXX ", Thread.CurrentThread.ManagedThreadId));
    }
}
public class ConcurrentNewsBreaker
{
    static readonly Object LockObject = new Object();
    ConcurrentQueue<String> Store = new ConcurrentQueue<String>();

    public String Pop()
    {
        String Result = null;
        if (Any())
            Store.TryDequeue(out Result);
        return Result;
    }

    public Boolean Any()
    {
        if (!Store.Any())
        {
            Task FillTask = new Task(FillupTheQueue, Store);
            FillTask.Start();
            FillTask.Wait();
        }
        return Store.Any();
    }

    private void FillupTheQueue(Object StoreObject)
    {
        ConcurrentQueue<String> Store = StoreObject as ConcurrentQueue<String>;
        lock (LockObject)
        {
            for (Int32 i = 0; i < 100; i++)
                Store.Enqueue(Guid.NewGuid().ToString());
        }
    }
}
You are using .NET's ThreadPool, so .NET/Windows manages the number of threads based on the amount of work waiting to be processed.
While I monitor this code in Task Manager, I notice the process drops the number of threads from 18 (on my machine) to 14 but no less. Is this because my code isn't good?
This does not indicate a problem. 14 is still high, unless you've got a 16-core CPU.
The thread pool will try to adjust and do the work with as few threads as possible.
You should start to worry when the number of threads goes up significantly.
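To watch what the pool is doing from inside the process rather than from Task Manager, the ThreadPool introspection APIs can help. A small sketch (my own illustration):

```csharp
using System;
using System.Threading;

class PoolInfo
{
    static void Main()
    {
        int workerMin, ioMin, workerMax, ioMax, workerAvail, ioAvail;
        ThreadPool.GetMinThreads(out workerMin, out ioMin);
        ThreadPool.GetMaxThreads(out workerMax, out ioMax);
        ThreadPool.GetAvailableThreads(out workerAvail, out ioAvail);

        Console.WriteLine("Worker threads: min={0} max={1} available={2}",
            workerMin, workerMax, workerAvail);
        // Threads currently in use = maximum minus available
        Console.WriteLine("Busy worker threads: {0}", workerMax - workerAvail);
    }
}
```

Sampling these counters while the work items run shows the pool growing and shrinking its worker count, which is the behavior the answer describes.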