Is Queue.Peek thread safe?

A read operation on a 32-bit field is atomic. So if the queue holds object references, the Queue.Peek method should be thread safe, right?

No. And even if it were, that misses the point. Let's assume a thread-safe Peek for a moment. You typically end up writing code that does something like this:
if (MyQueue.Peek() != null)
{
    var item = MyQueue.Dequeue();
}
That's a bug in multi-threaded code even if Peek() and Dequeue() are themselves thread-safe, because the queue can change between the moment you check with Peek() and the moment you act on that information with Dequeue(). You need to make sure you lock around both parts.
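For example, a minimal sketch of locking around both parts (the field names are illustrative, assuming a plain Queue<object> shared between threads and using System.Collections.Generic):
private readonly object queueLock = new object();
private readonly Queue<object> myQueue = new Queue<object>();

public bool TryTakeItem(out object item)
{
    lock (queueLock)
    {
        // The check and the dequeue happen under the same lock, so no
        // other thread can empty the queue between the two calls.
        if (myQueue.Count > 0)
        {
            item = myQueue.Dequeue();
            return true;
        }
        item = null;
        return false;
    }
}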

No, you should still lock around each Peek() call.
Since Queue internally uses an array, its instance methods are not thread safe: the array could be changed by a different thread at any time.
Peek() also checks the queue length to see whether there are any elements before returning the actual value, and another thread might remove those elements before the method returns.

If you look at the implementation in .NET Reflector, it looks like this:
public virtual object Peek()
{
    if (this._size == 0)
    {
        throw new InvalidOperationException(Environment.GetResourceString("InvalidOperation_EmptyQueue"));
    }
    return this._array[this._head];
}
So no. Not thread safe.

It is not thread safe.
But to synchronize it you may find ReaderWriterLockSlim to be the best fit: only the Enqueue() and Dequeue() methods would require a write lock, while Peek() would only require a read lock.
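A minimal sketch of that scheme (SynchronizedQueue is an illustrative name, not a framework type):
using System.Collections.Generic;
using System.Threading;

public class SynchronizedQueue<T>
{
    private readonly Queue<T> queue = new Queue<T>();
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public void Enqueue(T item)
    {
        rwLock.EnterWriteLock(); // writers are exclusive
        try { queue.Enqueue(item); }
        finally { rwLock.ExitWriteLock(); }
    }

    public T Dequeue()
    {
        rwLock.EnterWriteLock(); // Dequeue mutates the queue, so it writes too
        try { return queue.Dequeue(); }
        finally { rwLock.ExitWriteLock(); }
    }

    public T Peek()
    {
        rwLock.EnterReadLock(); // any number of threads may peek concurrently
        try { return queue.Peek(); }
        finally { rwLock.ExitReadLock(); }
    }
}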

Related

List thread safe?

Can the following be considered thread safe, given that each operation in the code appears atomic?
My main concern is that if the list needs to be re-sized, it becomes non-thread-safe during the re-sizing.
List<int> list = new List<int>(10);

public List<int> GetList()
{
    var temp = list;
    list = new List<int>(10);
    return temp;
}

public void TimerElapsed(int number)
{
    list.Add(number);
}
No. List<T> is explicitly documented not to be thread-safe:
It is safe to perform multiple read operations on a List<T>, but issues can occur if the collection is modified while it's being read. To ensure thread safety, lock the collection during a read or write operation. To enable a collection to be accessed by multiple threads for reading and writing, you must implement your own synchronization. For collections with built-in synchronization, see the classes in the System.Collections.Concurrent namespace. For an inherently thread-safe alternative, see the ImmutableList<T> class.
Neither your code nor the List<T> is thread-safe.
The list isn't thread-safe according to its documentation. Your code is not thread-safe because it lacks synchronization.
Consider two threads calling GetList concurrently. Say the first thread is pre-empted right after assigning temp. The second thread then assigns its own temp, replaces the list, and runs GetList to completion. When the first thread continues, it returns the same list instance that the second thread has just returned.
But that's not all. If a third thread calls TimerElapsed after the second thread has completed but before the first thread has, it places a value in a list reference that is about to be overwritten without a trace. So not only can multiple threads return the same data, some of your data can also disappear.
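A lock-based sketch of the fix (class and field names are illustrative): the swap and the Add take the same lock, so no two callers can receive the same instance, and no value can land in a list that has already been handed out.
using System.Collections.Generic;

public class Collector
{
    private readonly object gate = new object();
    private List<int> list = new List<int>(10);

    public List<int> GetList()
    {
        lock (gate)
        {
            var temp = list;          // hand out the current buffer...
            list = new List<int>(10); // ...and start a fresh one atomically
            return temp;
        }
    }

    public void TimerElapsed(int number)
    {
        lock (gate)
        {
            list.Add(number);
        }
    }
}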
No, it is not thread safe.
Try using the collections in the System.Collections.Concurrent namespace instead.
As already mentioned, a List<T> is not thread safe. You can look at alternatives in the System.Collections.Concurrent namespace, possibly ConcurrentBag<T>, or there is an article by Dean Chalk, Fast Parallel ConcurrentList<T> Implementation.
It is not thread safe, since a context switch can occur between the first line of the GetList method and the Add in the TimerElapsed method, which will produce inconsistent results in different scenarios. Also, as other users have already mentioned, the List<T> class is not thread safe, and you should use its System.Collections.Concurrent equivalent.
It is thread safe for reading only, not for writing.

Does ConcurrentQueue.IsEmpty require a memory barrier?

When looking into IsEmpty, I noticed this on MSDN:
However, as this collection is intended to be accessed concurrently, it may be the case that another thread will modify the collection after IsEmpty returns, thus invalidating the result.
Sure this is true, but does this also imply that ConcurrentQueue doesn't use a read barrier when checking if the queue is empty?
I want to have a piece of code that checks in another thread if the concurrent queue is empty. Something like this:
while (!queue.IsEmpty)
{
}
However, if ConcurrentQueue doesn't use a read barrier, I'd say we need to add our own memory barrier to ensure we read the right data, like this:
Thread.MemoryBarrier();
while (!queue.IsEmpty)
{
    Thread.MemoryBarrier();
}
(BTW: this is just a minimal example to illustrate the case; there's more code in reality.)
Is my observation correct, or does ConcurrentQueue handle this so that the first implementation works (i.e., what I would expect from 'Concurrent')?
And what about Count? I can't find the answer on MSDN... same story?
No, it's nothing to do with memory barriers. It's simply that something else could nip in and add or remove something to or from the queue just after your test.
You shouldn't really use IsEmpty. Use TryDequeue instead. Or use a BlockingCollection.
You really don't want to start writing code that "locks" the queue while you mess around with IsEmpty or Count.
(I almost never use ConcurrentQueue nowadays, since BlockingCollection is so much better, although that does of course depend on what you're trying to do.)
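For illustration, a sketch of the BlockingCollection approach (the producer/consumer split here is an assumption about your scenario): the consumer blocks until items arrive and ends cleanly when the producer is done, so there is no need to poll IsEmpty at all.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var items = new BlockingCollection<int>();

        // Producer: adds work, then signals that no more items will arrive.
        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 10; i++)
                items.Add(i);
            items.CompleteAdding();
        });

        // Consumer: blocks on each item, exits when adding is complete.
        foreach (var item in items.GetConsumingEnumerable())
            Console.WriteLine(item);

        producer.Wait();
    }
}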

Is this use of the generic List thread safe?

I have a System.Collections.Generic.List<T> to which I only ever add items in a timer callback. The timer is restarted only after the operation completes.
I have a System.Collections.Concurrent.ConcurrentQueue<T> which stores indices of added items in the list above. This store operation is also always performed in the same timer callback described above.
Is a read operation that iterates the queue and accesses the corresponding items in the list thread safe?
Sample code:
private List<Object> items;
private ConcurrentQueue<int> queue;
private Timer timer;

private void callback(object state)
{
    int index = items.Count;
    items.Add(new object());
    if (true) // some condition here
        queue.Enqueue(index);
    timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
}

// This can be called from any thread
public IEnumerable<object> AccessItems()
{
    foreach (var index in queue)
    {
        yield return items[index];
    }
}
My understanding:
Even if the list is resized while it is being indexed, I am only accessing an item that already exists, so it does not matter whether it is read from the old array or the new array. Hence this should be thread-safe.
Is a read operation that iterates the queue and accesses the corresponding items in the list thread safe?
Is it documented as being thread safe?
If no, then it is foolish to treat it as thread safe, even if it is in this implementation by accident. Thread safety should be by design.
Sharing memory across threads is a bad idea in the first place; if you don't do it then you don't have to ask whether the operation is thread safe.
If you have to do it then use a collection designed for shared memory access.
If you can't do that then use a lock. Locks are cheap if uncontended.
If you have a performance problem because your locks are contended all the time then fix that problem by changing your threading architecture rather than trying to do dangerous and foolish things like low-lock code. No one writes low-lock code correctly except for a handful of experts. (I am not one of them; I don't write low-lock code either.)
Even if the list is resized when it is being indexed, I am only accessing an item that already exists, so it does not matter whether it is read from the old array or the new array.
That's the wrong way to think about it. The right way to think about it is:
If the list is resized then the list's internal data structures are being mutated. It is possible that the internal data structure is mutated into an inconsistent form halfway through the mutation, that will be made consistent by the time the mutation is finished. Therefore my reader can see this inconsistent state from another thread, which makes the behaviour of my entire program unpredictable. It could crash, it could go into an infinite loop, it could corrupt other data structures, I don't know, because I'm running code that assumes a consistent state in a world with inconsistent state.
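If you want to see that unpredictability rather than take it on faith, here is a deliberately broken sketch: multiple unsynchronized writers mutating one List<int>. Depending on timing it silently loses items, throws from inside an internal resize, or appears to work.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var list = new List<int>();

        // DO NOT do this in real code: List<T>.Add is not thread-safe.
        Parallel.For(0, 100000, i => list.Add(i));

        // Frequently prints less than 100000; sometimes the loop above
        // throws instead, because a thread observed state mid-resize.
        Console.WriteLine(list.Count);
    }
}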
Big edit
The ConcurrentQueue is only safe with regard to its own individual operations (Enqueue(T), TryDequeue(out T), and so on).
You're doing a foreach on it, and that doesn't get synchronized at the level you require.
The biggest problem in your particular case is the enumeration: you are adding things to the queue after you've added the corresponding objects to the list, and a reader may enumerate the queue and index into the list while the writer is still mutating it. (The List also badly needs synchronization, but that and the biggest problem get solved with just one "bullet".) While enumerating a collection, it is not easy to tolerate another thread modifying it, even if on a microscopic level the modification itself is safe, which is what ConcurrentQueue guarantees.
Therefore you absolutely need to synchronize access to the queues (and the central List while you're at it) using another means of synchronization (and by that I mean you can also forget about ConcurrentQueue and use a simple Queue or even a List, since you never Dequeue anything).
So just do something like:
public void Writer(object toWrite) {
    this.rwLock.EnterWriteLock();
    try {
        int tailIndex = this.list.Count;
        this.list.Add(toWrite);
        if (..condition1..)
            this.queue1.Enqueue(tailIndex);
        if (..condition2..)
            this.queue2.Enqueue(tailIndex);
        if (..condition3..)
            this.queue3.Enqueue(tailIndex);
        ..etc..
    } finally {
        this.rwLock.ExitWriteLock();
    }
}
and in the AccessItems:
public IEnumerable<object> AccessItems(int queueIndex) {
    Queue<int> whichQueue = null; // the queues hold int indexes into the list
    switch (queueIndex) {
        case 1: whichQueue = this.queue1; break;
        case 2: whichQueue = this.queue2; break;
        case 3: whichQueue = this.queue3; break;
        ..etc..
        default: throw new NotSupportedException("Invalid queue disambiguating params");
    }
    List<object> results = new List<object>();
    this.rwLock.EnterReadLock();
    try {
        foreach (var index in whichQueue)
            results.Add(this.list[index]);
    } finally {
        this.rwLock.ExitReadLock();
    }
    return results;
}
And, based on my entire understanding of the cases in which your app accesses the List and the various Queues, it should be 100% safe.
End of big edit
First of all, read What is this thing you call "thread safe"? by Eric Lippert.
In your particular case, I guess the answer is no.
It is not that inconsistencies might arise in the global context (the actual list).
Rather, it is possible that the individual readers (which may very well "collide" with the unique writer) end up with inconsistencies in themselves: their own stacks (local variables of all methods, parameters) and their logically isolated portions of the heap.
The possibility of such per-thread inconsistencies (the Nth thread wants to learn the number of elements in the List and finds that the value is 39404999, although in reality you only added 3 values) is enough to declare that, generally speaking, this architecture is not thread-safe, even though you never actually change the globally accessible List; you simply read it in a flawed manner.
I suggest you use the ReaderWriterLockSlim class.
I think you will find it fits your needs:
private ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim(LockRecursionPolicy.SupportsRecursion);
private List<Object> items;
private ConcurrentQueue<int> queue;
private Timer timer;

private void callback(object state)
{
    this.rwLock.EnterWriteLock();
    try {
        // in this place, right here, there can be only ONE writer,
        // and while the writer is between EnterWriteLock and ExitWriteLock
        // there can exist no readers in the following method
        // (between EnterReadLock and ExitReadLock).
        // We read the count, add the item to the List AND do the enqueue
        // "atomically" (as loose a term as thread-safe).
        int index = items.Count;
        items.Add(new object());
        if (true) // some condition here
            queue.Enqueue(index);
    } finally {
        this.rwLock.ExitWriteLock();
    }
    timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
}
// This can be called from any thread
public IEnumerable<object> AccessItems()
{
    List<object> results = new List<object>();
    this.rwLock.EnterReadLock();
    try {
        // in this place there can exist a thousand readers
        // (doing these actions right here, between EnterReadLock
        // and ExitReadLock) all at the same time, but NO writers
        foreach (var index in queue)
        {
            results.Add(this.items[index]);
        }
    } finally {
        this.rwLock.ExitReadLock();
    }
    return results; // or foreach/yield return if you like that more :)
}
No, because you are reading and writing to/from the same object concurrently. This is not documented to be safe, so you can't be sure it is safe. Don't do it.
The fact that it happens to be unsafe as of .NET 4.0 means nothing, btw. Even if it were safe according to Reflector, that could change at any time. You can't rely on the current version to predict future versions.
Don't try to get away with tricks like this. Why not just do it in an obviously safe way?
As a side note: Two timer callbacks can execute at the same time, so your code is doubly broken (multiple writers). Don't try to pull off tricks with threads.
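On that side note: a common guard against overlapping timer callbacks (sketched with an illustrative field name, assuming using System.Threading) is an Interlocked flag that makes extra invocations bail out early:
private int running; // 0 = idle, 1 = a callback is executing

private void Callback(object state)
{
    // Only one invocation may run at a time; an overlapping
    // invocation sees 1 and returns immediately.
    if (Interlocked.Exchange(ref running, 1) == 1)
        return;
    try
    {
        // ... the actual timer work goes here ...
    }
    finally
    {
        Interlocked.Exchange(ref running, 0);
    }
}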
It is thread-safish. The foreach statement uses the ConcurrentQueue.GetEnumerator() method. Which promises:
The enumeration represents a moment-in-time snapshot of the contents of the queue. It does not reflect any updates to the collection after GetEnumerator was called. The enumerator is safe to use concurrently with reads from and writes to the queue.
Which is another way of saying that your program isn't going to blow up randomly with an inscrutable exception message like the kind you'll get when you use the Queue class. Beware of the consequences though, implicit in this guarantee is that you may well be looking at a stale version of the queue. Your loop will not be able to see any elements that were added by another thread after your loop started executing. That kind of magic doesn't exist and is impossible to implement in a consistent way. Whether or not that makes your program misbehave is something you will have to think about and can't be guessed from the question. It is pretty rare that you can completely ignore it.
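A small demonstration of that snapshot guarantee (safe to run on its own; it deliberately ignores the List problem below):
using System;
using System.Collections.Concurrent;

class Program
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();
        queue.Enqueue(1);
        queue.Enqueue(2);

        foreach (var item in queue)
        {
            // Enqueueing while iterating neither throws nor shows up in
            // this loop: the enumerator walks a moment-in-time snapshot.
            queue.Enqueue(item + 10);
            Console.WriteLine(item); // prints 1, then 2 (never 11 or 12)
        }
    }
}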
Your usage of the List<> is however utterly unsafe.

Lock scope in C#: is returned object still "locked"?

Assuming I have an object A containing
// ...
private List<double> someList = new List<double>();
// ...
public List<double> SomeList
{
    get { lock (this) { return someList; } }
}
// ...
would it be thread safe to perform operations on the list as in the code below, knowing that several operations could be executed simultaneously by different threads?
A.SomeList.Add(2.0);
or
A.SomeList.RemoveAt(0);
In other words, when is the lock released?
There is no thread safety here.
The lock is released as soon as the block it protects finishes, just before the property returns, so the calls to Add and RemoveAt are not protected by the lock.
The lock shown in the question isn't of much use.
To make list operations thread safe you need to implement your own Add/Remove/etc. methods wrapping those of the list:
public void Add(double item)
{
    lock (_list)
    {
        _list.Add(item);
    }
}
Also, it's a good idea to hide the list itself from the consumers of your class, i.e. make the field private.
The lock is released when you exit the body of the lock statement. That means that your code is not thread-safe.
In other words, you can be sure that two threads won't be executing return someList on the same object at the same time. But it's certainly possible that one thread will execute Add() at the same time as another thread will execute RemoveAt(), which is what makes it non thread-safe.
The lock is released when the code inside the lock is finished executing.
Also, locking on this will only affect the current instance of the object; it's also discouraged, since code outside the class can take the same lock.
OK, just for the hell of it:
there IS a way to make your object thread-safe, by using architectures that are already thread-safe.
For example, you could make your object a single-threaded COM object. The COM object will be thread safe, but you'll pay with performance (the price of being lazy and not managing your own locks).
See: Create a COM Object in C#
...others have said this already, but just to formalize the problem a bit:
First, lock (this) { ... } defines a 'scope', much like using () { }: it holds the lock (on this, in this case) only for the code inside. And that's a good thing; if you couldn't rely on that, the whole locking/synchronization concept would be fairly useless.
lock (this) { return something; } is an oxymoron of a sort: it returns something that is unlocked at the very moment it is returned.
The underlying problem, I think, is a misunderstanding of how lock works. A lock is not 'persisted' in the state of the object, so you cannot return it from a property. Take a look at How does lock work exactly? for how it's implemented. It's really a 'critical section': you protect certain parts of the code that use the variable, not the variable itself. Any kind of synchronization requires 'synchronization objects' to hold the locks, which are released once the lock is no longer needed. Take a look at this answer, https://stackoverflow.com/a/251668/417747, where Esteban formulates it very well:
"Finally, there is the common misconception that lock(this) actually modifies the object passed as a parameter, and in some way makes it read-only or inaccessible. This is false. The object passed as a parameter to lock merely serves as a key." (this is a quote)
So you either (usually) lock the 'private code' of a class method or property, to synchronize access to something you're doing inside, with the member itself usually private so that nobody can access it without going through your synchronized code,
or you make a thread-safe 'structure', like a list, that is already 'synchronized' internally, so that it can be accessed in a thread-safe manner. But there is (almost) no such thing as passing locks around, or having one place lock a variable while another part of the code unlocks it (in such scenarios it's usually some kind of EventWaitHandle that synchronizes 'distant' pieces of code, where one fires off when another signals).
In your case, I think the choice is to go with a 'synchronized structure', i.e. a list whose locking is handled internally.
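A minimal sketch of that 'synchronized structure' idea (the class name is illustrative): the list stays private, and every public operation takes the same internal lock.
using System.Collections.Generic;

public class SynchronizedDoubleList
{
    private readonly List<double> items = new List<double>();
    private readonly object gate = new object();

    public void Add(double value)
    {
        lock (gate) { items.Add(value); }
    }

    public void RemoveAt(int index)
    {
        lock (gate) { items.RemoveAt(index); }
    }

    // Hand out a copy, never the inner list itself.
    public double[] Snapshot()
    {
        lock (gate) { return items.ToArray(); }
    }
}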

Is Queue.Count thread safe?

I need one thread to modify a Queue (both adding and removing elements) and another thread only to call Queue.Count. Would that be safe, or do I need to use locks or a ConcurrentQueue?
The Count property is not thread-safe, as per the docs.
But reading it is an atomic read of an int; the worst that can happen is that you read a stale (outdated) value. Which may or may not be a problem.
But since you'll have to do something to prevent your reading thread from caching the value anyway, you might as well lock().
Queue does not provide thread-safety guarantees, so yes, you do need one of the two alternatives you mention. From the documentation:
Public static (Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe.

A Queue(Of T) can support multiple readers concurrently, as long as the collection is not modified. Even so, enumerating through a collection is intrinsically not a thread-safe procedure. To guarantee thread safety during enumeration, you can lock the collection during the entire enumeration. To allow the collection to be accessed by multiple threads for reading and writing, you must implement your own synchronization.
It's not guaranteed to be threadsafe.
The current implementation of Count is threadsafe. It's not likely to change, but there's no promise.
Most of the time, this isn't very useful though. If you were doing something like outputting a current estimate of the size to the UI, then that's perfectly safe. If you make any decision based on it, that is not safe:
if (queue.Count != 0)
    return queue.Dequeue(); // not thread-safe, as Dequeue isn't thread-safe

if (queue.Count != 0)
{
    lock (queue)
        return queue.Dequeue(); // not thread-safe; won't corrupt the queue,
                                // but may throw as Count could now be zero
}

lock (queue)
{
    if (queue.Count != 0)
        return queue.Dequeue(); // thread-safe
}

ConcurrentQueue<int> cQueue = new ConcurrentQueue<int>();
/* ... */
int val;
if (cQueue.TryDequeue(out val))
    return val; // perfectly thread-safe and lock-free, but more expensive
                // than single-threaded use of Queue<int>
From the Queue MSDN documentation, under the Thread Safety heading:
Public static (Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe.

To guarantee the thread safety of the Queue, all operations must be done through the wrapper returned by the Synchronized method.

Enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads can still modify the collection, which causes the enumerator to throw an exception. To guarantee thread safety during enumeration, you can either lock the collection during the entire enumeration or catch the exceptions resulting from changes made by other threads.
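In practice, the Synchronized wrapper the docs mention looks like this (a short sketch; note it exists only on the non-generic System.Collections.Queue, not on Queue<T>):
using System;
using System.Collections;

class Program
{
    static void Main()
    {
        Queue queue = Queue.Synchronized(new Queue()); // thread-safe wrapper

        queue.Enqueue(42); // individual operations are now synchronized

        // Enumeration still needs an explicit lock, exactly as the docs warn:
        lock (queue.SyncRoot)
        {
            foreach (var item in queue)
                Console.WriteLine(item);
        }
    }
}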
MSDN has pretty good documentation. I advise you to look there next time.
