I want to create a thread-safe collection that can be modified while being enumerated.
The sample ActionSet class stores Action handlers. It has an Add method that adds a new handler to the list and an Invoke method that enumerates and invokes all of the collected action handlers. The intended usage involves very frequent enumerations with occasional modifications while enumerating.
Normal collections throw an exception if you modify them with the Add method while an enumeration is still in progress.
There is an easy but slow solution to the problem: just clone the collection before enumerating:
class ThreadSafeSlowActionSet {
    List<Action> _actions = new List<Action>();
    public void Add(Action action) {
        lock (_actions) {
            _actions.Add(action);
        }
    }
    public void Invoke() {
        List<Action> actionsClone;
        lock (_actions) {
            actionsClone = _actions.ToList();
        }
        foreach (var action in actionsClone) {
            action();
        }
    }
}
The problem with this solution is the cloning overhead on every enumeration, and I want enumeration to be very fast.
I've created a rather fast "recursion-safe" collection that allows adding new values even while enumerating. If you add new values while the main _actions collection is being enumerated, they are added to a temporary _delta collection instead of the main one. After all enumerations have finished, the _delta values are merged into the _actions collection. If you add new values while _actions is being enumerated (creating the _delta collection) and then re-enter the Invoke method, a new merged collection (_actions + _delta) is created and replaces _actions (which is still being iterated).
So, this collection looks "recursion-safe", but I want to make it thread-safe. I think I need the Interlocked.* constructs, classes from System.Threading, or other synchronization primitives to make this collection thread-safe, but I don't have a good idea of how to do that.
How to make this collection thread-safe?
class RecursionSafeFastActionSet {
    List<Action> _actions = new List<Action>(); //The main store
    List<Action> _delta; //Temporary buffer for storing added values while the main store is being enumerated
    int _lock = 0; //The number of concurrent Invoke enumerations
    public void Add(Action action) {
        if (_lock == 0) { //_actions list is not being enumerated and can be modified
            _actions.Add(action);
        } else { //_actions list is being enumerated and cannot be modified
            if (_delta == null) {
                _delta = new List<Action>();
            }
            _delta.Add(action); //Storing the new values in the _delta buffer
        }
    }
    public void Invoke() {
        if (_delta != null) { //Re-entering Invoke after calling Add: Invoke->Add,Invoke
            Debug.Assert(_lock > 0);
            var newActions = new List<Action>(_actions); //Creating a new list for merging delta
            newActions.AddRange(_delta); //Merging the delta
            _delta = null;
            _actions = newActions; //Replacing the original list (which is still being iterated)
        }
        _lock++;
        foreach (var action in _actions) {
            action();
        }
        _lock--;
        if (_lock == 0 && _delta != null) {
            _actions.AddRange(_delta); //Merging the delta
            _delta = null;
        }
    }
}
Update: Added the ThreadSafeSlowActionSet variant.
A simpler approach (used, for example, by ConcurrentBag) is to have GetEnumerator() return an enumerator over a snapshot of the collection's contents. In your case this might look like:
public IEnumerator<Action> GetEnumerator()
{
    lock (_sync)
    {
        return _actions.ToList().GetEnumerator();
    }
}
If you do this, you don't need a _delta field and the complexity it adds.
Here is your class modified for thread safety:
class SafeActionSet
{
    Object _sync = new Object();
    List<Action> _actions = new List<Action>(); //The main store
    List<Action> _delta = new List<Action>(); //Temporary buffer for storing added values while the main store is being enumerated
    int _lock = 0; //The number of concurrent Invoke enumerations
    public void Add(Action action)
    {
        lock (_sync)
        {
            if (0 == _lock)
            { //_actions list is not being enumerated and can be modified
                _actions.Add(action);
            }
            else
            { //_actions list is being enumerated and cannot be modified
                _delta.Add(action); //Storing the new values in the _delta buffer
            }
        }
    }
    public void Invoke()
    {
        lock (_sync)
        {
            if (0 < _delta.Count)
            { //Re-entering Invoke after calling Add: Invoke->Add,Invoke
                Debug.Assert(0 < _lock);
                var newActions = new List<Action>(_actions); //Creating a new list for merging delta
                newActions.AddRange(_delta); //Merging the delta
                _delta.Clear();
                _actions = newActions; //Replacing the original list (which is still being iterated)
            }
            ++_lock;
        }
        foreach (var action in _actions)
        {
            action();
        }
        lock (_sync)
        {
            --_lock;
            if ((0 == _lock) && (0 < _delta.Count))
            {
                _actions.AddRange(_delta); //Merging the delta
                _delta.Clear();
            }
        }
    }
}
I made a few other tweaks, for the following reasons:
reversed the IF expressions to have the constant value first, so if I make a typo and put "=" instead of "==" or "!=" etc., the compiler will instantly tell me about the typo.
(: a habit I got into because my brain and fingers are often out of sync :)
preallocated _delta and called .Clear() instead of setting it to null, because I find that easier to read.
the various lock (_sync) { ... } blocks give you thread safety on all instance variable access...
:( with the exception of the access to _actions in the enumeration itself. ):
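One way to close that last gap (a sketch, not something I have load-tested): capture the _actions reference while still holding the lock and enumerate the captured local instead of re-reading the field. Since the merge paths either replace _actions with a brand new list or only mutate it when _lock has dropped back to 0, the captured list instance is never modified while a thread is still walking it:
public void Invoke()
{
    List<Action> current;
    lock (_sync)
    {
        if (0 < _delta.Count)
        { //Re-entering Invoke after calling Add: Invoke->Add,Invoke
            Debug.Assert(0 < _lock);
            var newActions = new List<Action>(_actions);
            newActions.AddRange(_delta);
            _delta.Clear();
            _actions = newActions;
        }
        ++_lock;
        current = _actions; //Capture the reference under the lock; enumerate the local below
    }
    foreach (var action in current)
    {
        action();
    }
    lock (_sync)
    {
        --_lock;
        if ((0 == _lock) && (0 < _delta.Count))
        {
            _actions.AddRange(_delta);
            _delta.Clear();
        }
    }
}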
Since I actually also needed to delete items from the collection, the implementation that I ultimately used was based on a rewritten LinkedList that locks adjacent nodes on deletion/insertion and doesn't complain if the collection was changed during enumeration.
I also added a Dictionary to make the element search fast.
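For reference, here is a much-simplified sketch of that idea (illustrative names, and a single coarse lock instead of the per-node locking described above): a LinkedList gives stable node-by-node traversal and O(1) removal, while a Dictionary maps each action to its node so the node to delete can be found quickly. If the node currently being visited is removed concurrently, this simplified version just ends the walk early instead of throwing.
class LinkedActionSet
{
    readonly object _sync = new object();
    readonly LinkedList<Action> _list = new LinkedList<Action>();
    readonly Dictionary<Action, LinkedListNode<Action>> _index =
        new Dictionary<Action, LinkedListNode<Action>>();

    public void Add(Action action)
    {
        lock (_sync)
        {
            if (!_index.ContainsKey(action))
            {
                _index.Add(action, _list.AddLast(action));
            }
        }
    }

    public bool Remove(Action action)
    {
        lock (_sync)
        {
            LinkedListNode<Action> node;
            if (!_index.TryGetValue(action, out node)) return false;
            _list.Remove(node); //O(1) once the node is known
            _index.Remove(action);
            return true;
        }
    }

    public void Invoke()
    {
        LinkedListNode<Action> node;
        lock (_sync) { node = _list.First; }
        while (node != null)
        {
            Action action = node.Value;
            LinkedListNode<Action> next;
            lock (_sync) { next = node.Next; } //null if this node was just removed
            action();
            node = next;
        }
    }
}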
I want to know if the following code is thread safe, which I assume it is not. And how I could possibly make it thread safe?
Basically I have a ConcurrentDictionary which acts as a cache for a database table. I want to query the DB every 10 seconds and update the db cache. There will be other threads querying this dictionary the whole time.
I can't just use TryAdd, as there may also be elements which have been removed. So instead of searching through the entire dictionary to work out what to update, add or remove, I decided to just reinitialize the whole dictionary. Please do tell me if this is a silly idea.
My concern is that when I reinitialize the dictionary, the querying threads will no longer be thread safe at the instant the reinitialization takes place. For that reason I have used a lock on the dictionary when updating it; however, I am not sure if this is correct, since the object itself changes inside the lock?
private static System.Timers.Timer updateTimer;
private static volatile Boolean _isBusyUpdating = false;
private static ConcurrentDictionary<int, string> _contactIdNames;

public Constructor()
{
    // Setup Timers for data updater
    updateTimer = new System.Timers.Timer();
    updateTimer.Interval = new TimeSpan(0, 0, 0, 10).TotalMilliseconds; // every 10 seconds
    updateTimer.Elapsed += OnTimedEvent;
    // Start the timer
    updateTimer.Enabled = true;
}

private void OnTimedEvent(Object source, System.Timers.ElapsedEventArgs e)
{
    if (!_isBusyUpdating)
    {
        _isBusyUpdating = true;
        // Get new data values and update the list
        try
        {
            var tmp = new ConcurrentDictionary<int, string>();
            using (var db = new DBEntities())
            {
                foreach (var item in db.ContactIDs.Select(x => new { x.Qualifier, x.AlarmCode, x.Description }).AsEnumerable())
                {
                    int key = (item.Qualifier * 1000) + item.AlarmCode;
                    tmp.TryAdd(key, item.Description);
                }
            }
            if (_contactIdNames == null)
            {
                _contactIdNames = tmp;
            }
            else
            {
                lock (_contactIdNames)
                {
                    _contactIdNames = tmp;
                }
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine("Error occurred in update ContactId db store", ex);
        }
        _isBusyUpdating = false;
    }
}
/// Use the dictionary from another Thread
public int GetIdFromClientString(string name)
{
    try
    {
        // The dictionary is keyed by id, so a lookup by name has to scan the values
        foreach (var pair in _contactIdNames)
        {
            if (pair.Value == name)
            {
                return pair.Key;
            }
        }
    }
    catch { }
    // If all else fails return -1
    return -1;
}
You're right, your code is not thread safe.
You need to lock the _isBusyUpdating variable.
You need to lock _contactIdNames every time, not only when it's not null.
Also, this code is similar to the singleton pattern, and it has the same problem with initialization. You can solve it with double-checked locking. However, you also need double-checked locking when accessing the entries.
In the case where you update the whole dictionary at once, you need to lock the current value every time you access it. Otherwise you can access it while it's still changing and get an error. So you either need to lock the variable each time, or use Interlocked.
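A minimal sketch of the Interlocked route (the method names here are mine, not from the question): build the new dictionary off to the side, publish it with an atomic reference swap, and have readers capture the current reference once before using it.
private static ConcurrentDictionary<int, string> _contactIdNames =
    new ConcurrentDictionary<int, string>();

private static void PublishNewCache(ConcurrentDictionary<int, string> freshCopy)
{
    // Atomically swap in the freshly built cache
    Interlocked.Exchange(ref _contactIdNames, freshCopy);
}

private static string TryGetDescription(int key)
{
    // Read the field once so the whole lookup runs against one consistent snapshot
    var snapshot = Volatile.Read(ref _contactIdNames);
    string description;
    return snapshot.TryGetValue(key, out description) ? description : null;
}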
As MSDN says, volatile should do the trick with _isBusyUpdating; that part should be thread safe.
If you don't want to keep track of _contactIdNames thread safety, try updating each entry of the same dictionary instead. The problem then becomes difference detection between the DB and the current values (which entries have been removed or added; the others can simply be rewritten), not thread safety, since ConcurrentDictionary is already thread safe.
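A rough sketch of that in-place update (again, the names are illustrative): upsert everything that came back from the DB, then drop the keys that are no longer present. Readers keep using the same dictionary instance the whole time.
private static void SyncCache(ConcurrentDictionary<int, string> cache,
                              Dictionary<int, string> fromDb)
{
    // Add new entries and rewrite existing ones
    foreach (var pair in fromDb)
    {
        cache[pair.Key] = pair.Value; //thread-safe upsert
    }
    // Remove entries that disappeared from the database
    foreach (var key in cache.Keys) //Keys returns a snapshot, so removing while iterating is fine
    {
        if (!fromDb.ContainsKey(key))
        {
            string removed;
            cache.TryRemove(key, out removed);
        }
    }
}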
You seem to be making a lot of work for yourself. Here's how I would tackle this task:
public class Constructor
{
    private volatile Dictionary<int, string> _contactIdNames;

    public Constructor()
    {
        Observable
            .Interval(TimeSpan.FromSeconds(10.0))
            .StartWith(-1)
            .Select(n =>
            {
                using (var db = new DBEntities())
                {
                    return db.ContactIDs.ToDictionary(
                        x => x.Qualifier * 1000 + x.AlarmCode,
                        x => x.Description);
                }
            })
            .Subscribe(x => _contactIdNames = x);
    }

    public string TryGetValue(int key)
    {
        string value = null;
        _contactIdNames.TryGetValue(key, out value);
        return value;
    }
}
I'm using Microsoft's Reactive Extensions (Rx) Framework - NuGet "Rx-Main" - for the timer to update the dictionary.
The Rx should be fairly straightforward. If you haven't seen it before in very simple terms it's like LINQ meets events.
If you don't like Rx then just go with your current timer model.
All this code does is create a new dictionary every 10 seconds from the DB. I'm just using a plain dictionary since it is only ever written from one thread. Since reference assignment is atomic, you can re-assign the dictionary whenever you like with complete thread-safety.
Multiple threads can safely read from a dictionary as long as the elements don't change.
I want to know if the following code is thread safe, which I assume it is not. And how I could possibly make it thread safe?
I believe it's not. First of all, I'd create a property for the ConcurrentDictionary and check whether an update is underway inside the getter; if it is, I'd return the previous version of the object:
private object obj = new object();
private ConcurrentDictionary<int, string> _contactIdNames;
private ConcurrentDictionary<int, string> _contactIdNamesOld;
private volatile bool _isBusyUpdating = false;

public ConcurrentDictionary<int, string> ContactIdNames
{
    get
    {
        if (!_isBusyUpdating) return _contactIdNames;
        return _contactIdNamesOld;
    }
    private set
    {
        // Keep a copy of the previous version while an update is in progress
        if (_isBusyUpdating && _contactIdNames != null)
        {
            _contactIdNamesOld = new ConcurrentDictionary<int, string>(_contactIdNames);
        }
        _contactIdNames = value;
    }
}
And your method can be:
private void OnTimedEvent(Object source, System.Timers.ElapsedEventArgs e)
{
    if (_isBusyUpdating) return;
    lock (obj)
    {
        _isBusyUpdating = true;
        // Get new data values and update the list
        try
        {
            ContactIdNames = new ConcurrentDictionary<int, string>();
            using (var db = new DBEntities())
            {
                foreach (var item in db.ContactIDs.Select(x => new { x.Qualifier, x.AlarmCode, x.Description }).AsEnumerable())
                {
                    int key = (item.Qualifier * 1000) + item.AlarmCode;
                    _contactIdNames.TryAdd(key, item.Description);
                }
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine("Error occurred in update ContactId db store", ex);
            _contactIdNames = _contactIdNamesOld;
        }
        finally
        {
            _isBusyUpdating = false;
        }
    }
}
P.S.
My concern is that when I reinitialize the dictionary, the querying threads will no longer be thread safe at the instant the reinitialization takes place. For that reason I have used a lock on the dictionary when updating it; however, I am not sure if this is correct, since the object itself changes inside the lock?
It's the ConcurrentDictionary<TKey, TValue> type that makes each instance thread safe, not your particular reference to an instance, so even if you create a new instance and change the reference to point at it, that by itself is not something to worry about.
I have a background worker in a web page which processes a large file import. I have a static property containing a Dictionary of values which I need my background worker to access. To prevent issues with garbage collection, I stringify the Dictionary when passing it into the background worker. The problem is, 1 out of 20 or so times, the Dictionary appears to be garbage collected before it is stringified.
static readonly Dictionary<int, int> MyDictionary = new Dictionary<int, int>();

// When button is clicked, I fire the background worker
// Assume for posterity, I've filled the Dictionary with a list of values and those values exist at the time the worker is queued up.
protected void OnProcessClickHandler(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(ProcessInBackground, new object[] {
        DictionaryIntIntToCsv(MyDictionary)
    });
}

// Example of the Background Process
private void ProcessInBackground(object state)
{
    object[] parms = state as object[];
    if (parms != null && parms.Length > 0)
    {
        var MyNewDictionary = DictionaryIntIntFromCsv(parms[0] as string);
        //... Doing something with the Dictionary
    }
}

// Here are some helper methods I am using to stringify the Dictionary. You can ignore these unless you think they have something to do with the issue at hand.
public static Dictionary<int, int> DictionaryIntIntFromCsv(string csv)
{
    var dictionary = new Dictionary<int, int>();
    foreach (var pair in csv.Split(','))
    {
        var arrNameValue = pair.Split(':');
        if (arrNameValue.Count() != 2) continue;
        var key = 0;
        var val = 0;
        int.TryParse(arrNameValue[0], out key);
        int.TryParse(arrNameValue[1], out val);
        if (key > 0 && val > 0)
        {
            dictionary.Add(key, val);
        }
    }
    return dictionary;
}

public static string DictionaryIntIntToCsv(Dictionary<int, int> dictionary)
{
    var str = "";
    foreach (var key in dictionary.Keys)
    {
        var value = 0;
        dictionary.TryGetValue(key, out value);
        if (key == 0 || value == 0) continue;
        var item = key + ":" + value;
        str += (item + ",");
    }
    return str;
}
I know there is an issue with Garbage Collection. My theory is sometimes the main thread completes and garbage collection is run before the background worker has a chance to stringify the Dictionary. Would I be correct in assuming I could avoid issues with Garbage Collection if I stringify the Dictionary before queuing the background worker? Like so:
protected void OnProcessClickHandler(object sender, EventArgs e)
{
    var MyString = DictionaryIntIntToCsv(MyDictionary);
    ThreadPool.QueueUserWorkItem(ProcessInBackground, new object[] {
        MyString
    });
}
NOTE: The page is interactive and does several postbacks before firing off the background worker.
There is really a lot of misinformation and bizarre implementation in this question, so much so that it cannot actually be answered without further clarification.
What leads you to believe the dictionary will be collected once you've "stringified" it? What are you actually doing with the dictionary values in the ProcessInBackground() method? Is the processing actually more expensive than serializing and deserializing a dictionary to and from a string for no reason? If so, why is there a background worker being used at all? Why is the string passed in inside an object array instead of simply the string itself? Further on that point, why is the dictionary being serialized at all? Is there any good reason it can't be passed in as the state argument directly?
You are likely initializing the property on page load. The reference to the property is tied to the instance of the page which existed on page load. After the server delivered the initial page to you, the class was eligible for garbage collection.
I believe you are seeing a race condition between how long it takes the user to do the postback on the page and how long it takes the server to collect the first instance of the class.
If the property were non-static, the values would not be there on postback. However, since it is a static property, it will exist in memory until the Garbage collector cleans it up.
Here you create a local variable with the dictionary:
if (parms != null && parms.Length > 0)
{
    var MyNewDictionary = DictionaryIntIntFromCsv(parms[0] as string);
}
The above does not affect the code below in any way; nowhere else in your code do you ever populate the static field MyDictionary.
The local variable above is completely separate from the static field you have here, so the assignment above does not affect it in any way:
static readonly Dictionary<int, int> MyDictionary = new Dictionary<int, int>();
I have a WinRT application that fires notifications every time it receives data from a device. I also have a UI control that is data-bound to an observable collection, which I wish to add the new data to.
While I have made it capable of updating the observable collection, it causes the UI to become very laggy, as the data is generated quickly. It would therefore be better to batch the updates, maybe every few hundred milliseconds.
Below is a snippet of the code. First I create the periodic timer:
TimerElapsedHandler f = new TimerElapsedHandler(batchUpdate);
CreatePeriodicTimer(f, new TimeSpan(0, 0, 3));
Below is my event handler for when new data comes in, along with the temporary list that stores the information
List<FinancialStuff> lst = new List<FinancialStuff>();

async void myData_ValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
{
    var data = new byte[args.CharacteristicValue.Length];
    DataReader.FromBuffer(args.CharacteristicValue).ReadBytes(data);
    lst.Add(new FinancialStuff() { Time = DateTime.UtcNow.ToString("mm:ss.ffffff"), Amount = data[0] });
}
Then my batch update, which is called periodically:
private void batchUpdate(ThreadPoolTimer source)
{
    AddItem<FinancialStuff>(financialStuffList, lst);
}
Then finally, for testing I want to clear the observable collection and items.
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    lock (items)
    {
        if (Dispatcher.HasThreadAccess)
        {
            foreach (T item in items)
                oc.Add(item);
        }
        else
        {
            Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
            {
                oc.Clear();
                for (int i = 0; i < items.Count; i++)
                {
                    items.Count());
                    oc.Add(items[i]);
                }
                lst.Clear();
            });
        }
    }
}
While this seems to work, after a few updates the UI locks up and it updates very slowly, if at all. For testing, it's only getting a few hundred items in the list by the time the timer is fired.
Can anybody enlighten me as to why this is happening? I'm presuming my design is very poor.
Thanks
You're not locking your list in the event handler
// "lst" is never locked in your event handler
List<FinancialStuff> lst = new List<FinancialStuff>();

lst.Add(new FinancialStuff() { Time = DateTime.UtcNow.ToString("mm:ss.ffffff"), Amount = data[0] });
Passing "lst" above to your async method
AddItem<FinancialStuff>(financialStuffList, lst);
You're locking "items" below, which is really "lst" above. However, you're adding to the list while you're processing it. I assume the event handler has a higher priority, so your processing is slower than your adds. This can lead to "i < items.Count" being true forever.
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    // "lst" reference is locked here, but it wasn't locked in the event handler
    lock (items)
    {
        if (Dispatcher.HasThreadAccess)
        {
            foreach (T item in items)
                oc.Add(item);
        }
        else
        {
            Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
            {
                oc.Clear();
                // This may never exit the for loop
                for (int i = 0; i < items.Count; i++)
                {
                    items.Count());
                    oc.Add(items[i]);
                }
                lst.Clear();
            });
        }
    }
}
EDIT:
Do you need to view every piece of data? There is going to be some overhead when using a lock. If you're receiving data faster than you can render it, you'll eventually be backed up and/or have a very large collection to render, which might also cause problems. I suggest you do some filtering to only draw the last x number of items (say 100). Also, I'm not sure why you need the if (Dispatcher.HasThreadAccess) condition either.
Try the following:
public async void AddItem<T>(ObservableCollection<T> oc, List<T> items)
{
    // "lst" reference is locked here, but it wasn't locked in the event handler
    lock (items)
    {
        // Change this to what you want
        const int maxSize = 100;
        // Make sure it doesn't index out of bounds
        int startIndex = Math.Max(0, items.Count - maxSize);
        int length = items.Count - startIndex;
        List<T> itemsToRender = items.GetRange(startIndex, length);
        // You can clear it here in your background thread. The references to the objects
        // are now in the itemsToRender list.
        lst.Clear();
        // Dispatcher.RunAsync(CoreDispatcherPriority.Low, () =>
        // Please verify this is the correct syntax
        Dispatcher.Run(() =>
        {
            // At second look, this might need to be locked too
            // EDIT: This probably will just add overhead now that it's not running async.
            // You can probably remove this lock
            lock (oc)
            {
                oc.Clear();
                for (int i = 0; i < itemsToRender.Count; i++)
                {
                    // I didn't notice it before, but why are you checking the count again?
                    // items.Count());
                    oc.Add(itemsToRender[i]);
                }
            }
        });
    }
}
EDIT2:
Since your AddItem method is already on a background thread, I don't think you need to use Dispatcher.RunAsync. Instead, I think it might be desirable for it to block so you don't end up with multiple calls to that section of code. Try using Dispatcher.Run instead. I've updated the code example above to show the changes. You shouldn't need the lock on the oc anymore, since the lock on the items is good enough. Also, verify the syntax for Dispatcher.Run is correct.
I'm working on a simple irc chat bot (specifically for twitch.tv streams), and I'm using a List to keep a list of all the users in the channel. When someone leaves or joins, I add or remove them from the list. Then I have a thread that runs every minute that checks if the stream is online, and if it is, it hands out "currency" to all the people in my user list.
I'm sure you can already see where my problem is. If someone leaves or joins while my program is looping through the users in my list, I get a Collection Modified exception. Currently, as a workaround, I just make a temp list and copy the real list into it, then loop through the temp list instead, but I was curious whether there is a "better" way to do it.
Quick pseudocode:
private List<string> users = new List<string>();

private void IrcInitialize(){
    //connect to irc stuff
    //blah
    //blah
    //blah
    Thread workThread = new Thread(new ThreadStart(doWork));
    workThread.Start();
}

private void ircListener(){
    parseIRCMessage(StreamReader.ReadLine());
}

private void parseIRCMessage(msg){
    if (msgType == "JOIN"){
        users.Add(user);
    }
    else if (msgType == "PART"){
        users.Remove(user);
    }
}

private void doWork(){
    while (true) {
        if (streamOnline() && handOutTime()){
            handOutCurrency();
        }
        Thread.Sleep(60000);
    }
}

private void handOutCurrency(){
    List<string> temp = users; //This is what I'm currently doing
    foreach (String user in temp) {
        database.AddCurrency(user, 1);
    }
}
Any other suggestions?
I suggest using a ConcurrentBag<string> for the users.
This allows multi-threaded access to the users even while it is being enumerated.
The big plus is that you do not have to worry about locking.
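A minimal sketch of that (field and method names are illustrative, not from the question):
private readonly ConcurrentBag<string> users = new ConcurrentBag<string>();

private void OnUserJoined(string user)
{
    users.Add(user); //safe to call from the IRC thread at any time
}

private void handOutCurrency()
{
    foreach (string user in users) //enumerates a snapshot taken at this moment
    {
        database.AddCurrency(user, 1);
    }
}
One caveat: ConcurrentBag has no way to remove a specific item (TryTake hands back an arbitrary one), so handling PART messages would still need something extra, for example rebuilding the bag or tracking departures separately.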
There are two ways to solve this problem:
Using a lock to synchronize access between two threads, or
Doing all access from a single thread.
The first way is simple: add a lock(users) { ... } block around the code that reads or modifies the users list.
The second way is slightly more involved: define two concurrent queues, toAdd and toRemove in your class. Instead of adding or removing users directly from the users list, add them to the toAdd and toRemove queues. When the sleeping thread wakes up, it should first empty both queues, performing the modifications as necessary. Only then it should hand out the currency.
ConcurrentQueue<string> toAdd = new ConcurrentQueue<string>();
ConcurrentQueue<string> toRemove = new ConcurrentQueue<string>();

private void parseIRCMessage(msg){
    if (msgType == "JOIN"){
        toAdd.Enqueue(user);
    }
    else if (msgType == "PART"){
        toRemove.Enqueue(user);
    }
}

private void doWork(){
    while (true) {
        string user;
        while (toAdd.TryDequeue(out user)) {
            users.Add(user);
        }
        while (toRemove.TryDequeue(out user)) {
            users.Remove(user);
        }
        if (streamOnline() && handOutTime()){
            handOutCurrency();
        }
        Thread.Sleep(60000);
    }
}
The suggestions from dasblinkenlight's answer are good. Another option is to do something similar to what you suggested: work with an immutable copy of the list. Except with a normal List, you would need to make sure that it's not changed while you're copying it (and you would actually need to copy the list, not just a reference to it, as your code currently does).
A better version of this approach would be to use ImmutableList from the immutable collections library. With that, each modification creates a new collection (but sharing most parts with the previous version to improve efficiency). This way, you could have one thread that modifies the list (actually, creates new lists based on the old one) and you could also read the list from another thread at the same time. This will work, because new changes won't be reflected in an old copy of the list.
With that, your code would look something like this:
private ImmutableList<string> users = ImmutableList<string>.Empty;

private void ParseIRCMessage(string msg)
{
    if (msgType == "JOIN")
    {
        users = users.Add(user);
    }
    else if (msgType == "PART")
    {
        users = users.Remove(user);
    }
}

private void HandOutCurrency()
{
    foreach (String user in users)
    {
        database.AddCurrency(user, 1);
    }
}
You need to lock on the list during all reads, writes, and iterations of the list.
private void parseIRCMessage(msg){
    lock(users)
    {
        if (msgType == "JOIN"){
            users.Add(user);
        }
        else if (msgType == "PART"){
            users.Remove(user);
        }
    }
}

private void doWork(){
    while (true) {
        if (streamOnline() && handOutTime()){
            handOutCurrency();
        }
        Thread.Sleep(60000);
    }
}

private void handOutCurrency(){
    lock(users)
    {
        foreach (String user in users) {
            database.AddCurrency(user, 1);
        }
    }
}
etc...
I have the following piece of code:
private Dictionary<object, object> items = new Dictionary<object, object>();
public IEnumerable<object> Keys
{
    get
    {
        foreach (object key in items.Keys)
        {
            yield return key;
        }
    }
}
Is this thread-safe? If not do I have to put a lock around the loop or the yield return?
Here is what I mean:
Thread1 accesses the Keys property while Thread2 adds an item to the underlying dictionary. Is Thread1 affected by the add of Thread2?
What exactly do you mean by thread-safe?
You certainly shouldn't change the dictionary while you're iterating over it, whether in the same thread or not.
If the dictionary is being accessed in multiple threads in general, the caller should take out a lock (the same one covering all accesses) so that they can lock for the duration of iterating over the result.
EDIT: To respond to your edit: no, it in no way corresponds to the lock code. There is no lock automatically taken out by an iterator block - and how would it know about syncRoot anyway?
Moreover, just locking the return of the IEnumerable<TKey> doesn't make it thread-safe either - because the lock only affects the period of time when it's returning the sequence, not the period during which it's being iterated over.
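A sketch of what that caller-side locking looks like; syncRoot and Process are placeholders here, and the same syncRoot has to guard every write to the dictionary elsewhere in the code:
lock (syncRoot)
{
    foreach (object key in instance.Keys)
    {
        Process(key); //the lock is held for the entire iteration
    }
}
If holding the lock for the whole loop is too expensive, the alternative is to copy the keys to a list inside the lock and iterate the copy outside it, which is essentially the snapshot approach mentioned earlier.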
Check out this post on what happens behind the scenes with the yield keyword:
Behind the scenes of the C# yield keyword
In short - the compiler takes your yield keyword and generates an entire class in the IL to support the functionality. You can check out the page after the jump and look at the code that gets generated... and that code looks like it tracks the thread id to keep things safe.
OK, I did some testing and got an interesting result.
It seems that this is more an issue with the enumerator of the underlying collection than with the yield keyword. The enumerator (actually its MoveNext method) throws (if implemented correctly) an InvalidOperationException because the underlying collection has changed during enumeration. According to the MSDN documentation of the MoveNext method, this is the expected behavior.
Because enumerating through a collection is usually not thread-safe, a yield return over it is not either.
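A tiny repro of that MoveNext behavior (deliberately single-threaded, because the exception comes from the enumerator's version check, not from threading):
var numbers = new List<int> { 1, 2, 3 };
try
{
    foreach (int n in numbers)
    {
        numbers.Add(n); //modifying the collection invalidates the enumerator
    }
}
catch (InvalidOperationException ex)
{
    Console.WriteLine(ex.Message); //"Collection was modified; enumeration operation may not execute."
}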
I believe it is, but I cannot find a reference that confirms it. Each time any thread calls foreach on an iterator, a new thread-local* instance of the underlying IEnumerator should get created, so there should not be any "shared" memory state that two threads can conflict over...
*Thread-local in the sense that its reference variable is scoped to a method stack frame on that thread.
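A quick way to see that independence (a self-contained toy example, not taken from the answer above): two enumerators obtained from the same iterator each keep their own position.
class IteratorIndependenceDemo
{
    static IEnumerable<int> Numbers()
    {
        for (int i = 0; i < 3; i++)
        {
            yield return i;
        }
    }

    static void Main()
    {
        IEnumerable<int> sequence = Numbers();
        using (var first = sequence.GetEnumerator())
        using (var second = sequence.GetEnumerator())
        {
            first.MoveNext();
            first.MoveNext();
            second.MoveNext();
            Console.WriteLine(first.Current);  //1
            Console.WriteLine(second.Current); //0
        }
    }
}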
I believe the yield implementation is thread-safe. Indeed, you can run this simple program at home and you will notice that the state of the listInt() method is correctly saved and restored for each thread, without side effects from other threads.
public class Test
{
    public void Display(int index)
    {
        foreach (int i in listInt())
        {
            Console.WriteLine("Thread {0} says: {1}", index, i);
            Thread.Sleep(1);
        }
    }

    public IEnumerable<int> listInt()
    {
        for (int i = 0; i < 5; i++)
        {
            yield return i;
        }
    }
}

class MainApp
{
    static void Main()
    {
        Test test = new Test();
        for (int i = 0; i < 4; i++)
        {
            int x = i;
            Thread t = new Thread(p => { test.Display(x); });
            t.Start();
        }

        // Wait for user
        Console.ReadKey();
    }
}
class Program
{
    static SomeCollection _sc = new SomeCollection();

    static void Main(string[] args)
    {
        // Create one thread that adds entries and
        // one thread that reads them
        Thread t1 = new Thread(AddEntries);
        Thread t2 = new Thread(EnumEntries);
        t2.Start(_sc);
        t1.Start(_sc);
    }

    static void AddEntries(object state)
    {
        SomeCollection sc = (SomeCollection)state;
        for (int x = 0; x < 20; x++)
        {
            Trace.WriteLine("adding");
            sc.Add(x);
            Trace.WriteLine("added");
            Thread.Sleep(x * 3);
        }
    }

    static void EnumEntries(object state)
    {
        SomeCollection sc = (SomeCollection)state;
        for (int x = 0; x < 10; x++)
        {
            Trace.WriteLine("Loop" + x);
            foreach (int item in sc.AllValues)
            {
                Trace.Write(item + " ");
            }
            Thread.Sleep(30);
            Trace.WriteLine("");
        }
    }
}

class SomeCollection
{
    private List<int> _collection = new List<int>();
    private object _sync = new object();

    public void Add(int i)
    {
        lock (_sync)
        {
            _collection.Add(i);
        }
    }

    public IEnumerable<int> AllValues
    {
        get
        {
            // Note: because the lock sits inside the iterator block, it is taken on the
            // first MoveNext() and held until the caller finishes (or disposes) the
            // enumeration, which blocks Add() for that whole time.
            lock (_sync)
            {
                foreach (int i in _collection)
                {
                    yield return i;
                }
            }
        }
    }
}