I have a method that I intend to run on its own thread, but can't figure out how to pass the reference when setting up the thread.
private void ManageConnections(ref List<string> instanceAddresses)
{
    int connected = Instances.Count();
    if (instanceAddresses.Count() > connected)
    {
        int instancesToAdd = instanceAddresses.Count() - connected;
        while (instancesToAdd != 0)
        {
            Channel channel = new Channel(instanceAddresses[instanceAddresses.Count - instancesToAdd], ChannelCredentials.Insecure);
            var client = new ConfigurationDirectoryService.ConfigurationDirectoryServiceClient(channel);
            Instances.Add(client);
            instancesToAdd--;
        }
    }
}
The desired behaviour is that when the original list (instanceAddresses) is changed, this method gets to work, sets up a new client, and adds it to another list.
This is the method that would call the thread start:
public CDS_Service(ref List<string> instanceAddresses)
{
    Thread manageAvaliable = new Thread(CheckAvaliability);
    manageAvaliable.Start();
    if (instanceAddresses.Count() > 0)
    {
        foreach (string instanceAddr in instanceAddresses)
        {
            Channel channel = new Channel(instanceAddr, ChannelCredentials.Insecure);
            var client = new ConfigurationDirectoryService.ConfigurationDirectoryServiceClient(channel);
            Instances.Add(client);
        }
        foreach (CM commandManager in Instances[0].SyncDirectory(new Empty { }).CommandManagers)
        {
            List<string> commands = new List<string>();
            foreach (string command in commandManager.Commands)
            {
                commands.Add(command);
            }
            Directory.Add(new CommandManager(commandManager.Address, commands, commandManager.IsActive));
        }
    }
    // Thread would be set up here
}
And where this is constructed:
Server server = new Server
{
    Services = { ConfigurationDirectoryService.BindService(new CDS_Service(ref clientDiscovery.OtherInstances)) },
    Ports = { new ServerPort(addr, PORT, ServerCredentials.Insecure) }
};
I'm also not sure whether it's bad practice to pass a ref around through different classes like this.
Is this possible/safe to do?
Instead of using ref here, you should use an observable collection, like ObservableCollection<T>. List<string> is a reference type, so it's effectively passed by reference already :)
First, change the type of clientDiscovery.OtherInstances to ObservableCollection<string>, then change the parameter type of the constructor to ObservableCollection<string> as well. Remove all the refs, you don't need those.
Now, rewrite ManageConnections to this signature (You'll need using System.Collections.Specialized):
private void ManageConnections(object sender, NotifyCollectionChangedEventArgs e) {
}
Here, you will check e.NewItems to see which items have been added to the instanceAddresses list, and add each of them to another list:
foreach (string item in e.NewItems)
{
    Channel channel = new Channel(item, ChannelCredentials.Insecure);
    var client = new ConfigurationDirectoryService.ConfigurationDirectoryServiceClient(channel);
    Instances.Add(client);
}
You might want to do something if there are removed items as well. If you want to handle that, use e.OldItems; those are the removed items.
Now, instead of calling ManageConnections, you do:
instanceAddresses.CollectionChanged += ManageConnections;
Note that this won't handle the initial items in the list (only subsequent changes will be added), so you might want to handle the initial items straight after the line above.
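For illustration, the constructor wiring could look roughly like this (a sketch that reuses Instances and the gRPC client types from the question; note that CollectionChanged fires on whatever thread modifies the collection, so it adds no synchronization by itself):
public CDS_Service(ObservableCollection<string> instanceAddresses)
{
    // Subscribe first so that later additions are picked up by ManageConnections.
    instanceAddresses.CollectionChanged += ManageConnections;

    // Then handle the items that are already present, which the event will never report.
    foreach (string instanceAddr in instanceAddresses)
    {
        Channel channel = new Channel(instanceAddr, ChannelCredentials.Insecure);
        var client = new ConfigurationDirectoryService.ConfigurationDirectoryServiceClient(channel);
        Instances.Add(client);
    }
}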
You do not need the ref keyword. List<T> is a class which is a reference type, so it's already passed by reference.
Well, to be precise you're passing the reference around, and THAT is passed by value. You would only need the ref keyword if you assigned the reference to a new / different list and wanted that passed back to the caller.
If you do not intend to alter the list, then you're probably better off passing IEnumerable<T> instead, since that is read-only. List<T> already implements IEnumerable<T>, so you don't even need to cast.
If you're accessing the list from different threads, then be aware it could change AT ANY TIME (like half way through iterating it). In this case you may want one of the thread-safe collections from System.Collections.Concurrent (there is no ConcurrentList<T> in the framework, but ConcurrentBag<T> or ConcurrentQueue<T> are at least thread safe for adds/removes). Alternatively, if you're only reading, you may be better off creating a read-only "snapshot" of the list at a certain point by calling ToArray() on it or something and passing that around instead.
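To illustrate the distinction with two hypothetical methods (not from the question):
void AddItem(List<string> list)          // no ref needed: the caller sees the added item,
{                                        // because both variables point at the same list
    list.Add("new entry");
}

void ReplaceList(ref List<string> list)  // ref needed: the caller's variable is re-pointed
{                                        // at the brand new list
    list = new List<string> { "fresh" };
}
And a read-only snapshot for other threads is as simple as var snapshot = list.ToArray();.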
I can't get to the bottom of this error, because when the debugger is attached, it does not seem to occur.
Collection was modified; enumeration operation may not execute
Below is the code.
This is a WCF server in a Windows service. The method NotifySubscribers() is called by the service whenever there is a data event (at random intervals, but not very often - about 800 times per day).
When a Windows Forms client subscribes, the subscriber ID is added to the subscribers dictionary, and when the client unsubscribes, it is deleted from the dictionary. The error happens when (or after) a client unsubscribes. It appears that the next time the NotifySubscribers() method is called, the foreach() loop fails with the error in the subject line. The method writes the error into the application log as shown in the code below. When a debugger is attached and a client unsubscribes, the code executes fine.
Do you see a problem with this code? Do I need to make the dictionary thread-safe?
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class SubscriptionServer : ISubscriptionServer
{
    private static IDictionary<Guid, Subscriber> subscribers;

    public SubscriptionServer()
    {
        subscribers = new Dictionary<Guid, Subscriber>();
    }

    public void NotifySubscribers(DataRecord sr)
    {
        foreach (Subscriber s in subscribers.Values)
        {
            try
            {
                s.Callback.SignalData(sr);
            }
            catch (Exception e)
            {
                DCS.WriteToApplicationLog(e.Message,
                    System.Diagnostics.EventLogEntryType.Error);
                UnsubscribeEvent(s.ClientId);
            }
        }
    }

    public Guid SubscribeEvent(string clientDescription)
    {
        Subscriber subscriber = new Subscriber();
        subscriber.Callback = OperationContext.Current.
            GetCallbackChannel<IDCSCallback>();
        subscribers.Add(subscriber.ClientId, subscriber);
        return subscriber.ClientId;
    }

    public void UnsubscribeEvent(Guid clientId)
    {
        try
        {
            subscribers.Remove(clientId);
        }
        catch (Exception e)
        {
            System.Diagnostics.Debug.WriteLine("Unsubscribe Error " +
                e.Message);
        }
    }
}
What's likely happening is that SignalData is indirectly changing the subscribers dictionary under the hood during the loop and leading to that message. You can verify this by changing
foreach(Subscriber s in subscribers.Values)
To
foreach(Subscriber s in subscribers.Values.ToList())
If I'm right, the problem will disappear.
Calling subscribers.Values.ToList() copies the values of subscribers.Values to a separate list at the start of the foreach. Nothing else has access to this list (it doesn't even have a variable name!), so nothing can modify it inside the loop.
When a subscriber unsubscribes you are changing contents of the collection of Subscribers during enumeration.
There are several ways to fix this, one being changing the foreach loop to use an explicit .ToList():
public void NotifySubscribers(DataRecord sr)
{
    foreach (Subscriber s in subscribers.Values.ToList())  // <-- note the added .ToList()
    {
    ...
A more efficient way, in my opinion, is to have another list that you declare that you put anything that is "to be removed" into. Then after you finish your main loop (without the .ToList()), you do another loop over the "to be removed" list, removing each entry as it happens. So in your class you add:
private List<Guid> toBeRemoved = new List<Guid>();
Then you change it to:
public void NotifySubscribers(DataRecord sr)
{
    toBeRemoved.Clear();
    ...your unchanged code skipped...
    foreach (Guid clientId in toBeRemoved)
    {
        try
        {
            subscribers.Remove(clientId);
        }
        catch (Exception e)
        {
            System.Diagnostics.Debug.WriteLine("Unsubscribe Error " +
                e.Message);
        }
    }
}
...your unchanged code skipped...
public void UnsubscribeEvent(Guid clientId)
{
    toBeRemoved.Add(clientId);
}
This will not only solve your problem, it will prevent you from having to keep creating a list from your dictionary, which is expensive if there are a lot of subscribers in there. Assuming the list of subscribers to be removed on any given iteration is lower than the total number in the list, this should be faster. But of course feel free to profile it to be sure that's the case if there's any doubt in your specific usage situation.
Why this error?
In general, .NET collections do not support being enumerated and modified at the same time. If you try to modify a collection while enumerating it, an exception is raised. So the issue behind this error is that we cannot modify a list/dictionary while we are looping over it.
One of the solutions
If we iterate over a copy of the dictionary's keys, we can modify the dictionary inside the loop, because we are enumerating the key list, not the dictionary itself.
Example
// Get the key collection from the dictionary into a list to loop through
List<int> keys = new List<int>(Dictionary.Keys);

// Iterate the key collection using a simple for-each loop
foreach (int key in keys)
{
    // Now we can perform any modification on the values of the dictionary.
    Dictionary[key] = Dictionary[key] - 1;
}
Okay so what helped me was iterating backwards. I was trying to remove an entry from a list but iterating upwards and it screwed up the loop because the entry didn't exist anymore:
for (int x = myList.Count - 1; x > -1; x--)
{
    myList.RemoveAt(x);
}
The accepted answer is imprecise and, in the worst case, incorrect. If changes are made while ToList() is executing, you can still end up with an error. Besides lock (whose performance and thread-safety implications need to be considered if you have a public member), a proper solution can be to use immutable types.
In general, an immutable type means that you can't change the state of it once created.
So your code should look like:
using System.Collections.Immutable;

public class SubscriptionServer : ISubscriptionServer
{
    private static ImmutableDictionary<Guid, Subscriber> subscribers = ImmutableDictionary<Guid, Subscriber>.Empty;

    public void SubscribeEvent(string id)
    {
        subscribers = subscribers.Add(Guid.NewGuid(), new Subscriber());
    }

    public void NotifyEvent()
    {
        foreach (var sub in subscribers.Values)
        {
            //.....This is always safe
        }
    }
    //.........
}
This can be especially useful if you have a public member. Other classes can always foreach on the immutable types without worrying about the collection being modified.
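For completeness, unsubscribing in the same style is just another reassignment (a sketch; Remove returns a new dictionary, so any foreach already in progress keeps enumerating the old snapshot):
public void UnsubscribeEvent(Guid clientId)
{
    // The running NotifyEvent loop still sees the dictionary it started with.
    subscribers = subscribers.Remove(clientId);
}
If several threads can mutate subscribers at the same time, consider wrapping the reassignment in ImmutableInterlocked.Update so that concurrent updates are not lost.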
I want to point out another case not reflected in any of the answers. I have a Dictionary<TKey, TValue> shared in a multi-threaded app, which uses a ReaderWriterLockSlim to protect the read and write operations. This is a reading method that throws the exception:
public IEnumerable<Data> GetInfo()
{
    IEnumerable<Data> info = null;
    _cacheLock.EnterReadLock();
    try
    {
        info = _cache.Values.SelectMany(ce => ce.Data); // Add .ToList() here to avoid the exception.
    }
    finally
    {
        _cacheLock.ExitReadLock();
    }
    return info;
}
In general, it works fine, but from time to time I get the exception. The problem is a subtlety of LINQ: this code returns an IEnumerable<Data> that has still not been enumerated when the section protected by the lock is left. So it can be changed by other threads before being enumerated, leading to the exception. The solution is to force the enumeration, for example with .ToList() as shown in the comment. That way the enumerable is already enumerated before leaving the protected section.
So, if you use LINQ in a multi-threaded application, be sure to always materialize your queries before leaving the protected regions.
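In other words, the only change needed in the method above is to materialize the query while the lock is still held, for example:
info = _cache.Values.SelectMany(ce => ce.Data).ToList(); // enumeration happens here, inside the read lock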
InvalidOperationException:
An InvalidOperationException has occurred, reporting "collection was modified" in a foreach loop.
Use a break statement once the object is removed.
Example:
ArrayList list = new ArrayList();
foreach (var item in list)
{
    if (condition)
    {
        list.Remove(item);
        break;
    }
}
Actually, the problem seems to be that you are removing elements from the list and expecting to continue reading the list as if nothing had happened.
What you really need to do is start from the end and work back to the beginning. Even if you remove elements from the list, you will be able to continue reading it.
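A sketch of that pattern, with a hypothetical ShouldRemove standing in for whatever your removal condition is:
for (int i = myList.Count - 1; i >= 0; i--)
{
    if (ShouldRemove(myList[i]))  // ShouldRemove is a placeholder, not from the original post
    {
        myList.RemoveAt(i);       // safe: only elements we have already visited shift position
    }
}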
I had the same issue, and it was solved when I used a for loop instead of foreach.
// foreach (var item in itemsToBeLast)
for (int i = 0; i < itemsToBeLast.Count; i++)
{
    var matchingItem = itemsToBeLast.FirstOrDefault(item => item.Detach);
    if (matchingItem != null)
    {
        itemsToBeLast.Remove(matchingItem);
        continue;
    }
    allItems.Add(itemsToBeLast[i]); // (attachDetachItem);
}
I've seen many options for this but to me this one was the best.
ListItemCollection collection = new ListItemCollection();
foreach (ListItem item in ListBox1.Items)
{
    if (item.Selected)
        collection.Add(item);
}
Then simply loop through the collection.
Be aware that a ListItemCollection can contain duplicates. By default there is nothing preventing duplicates being added to the collection. To avoid duplicates you can do this:
ListItemCollection collection = new ListItemCollection();
foreach (ListItem item in ListBox1.Items)
{
    if (item.Selected && !collection.Contains(item))
        collection.Add(item);
}
This approach covers a concurrency situation where the function is called again while it is still executing (and items need to be used only once):
while (list.Count > 0)
{
    string item = list[0];
    list.RemoveAt(0);
    // Do here whatever you need to do with the item
}
If the function gets called while it is still executing, items will not be iterated again from the start, because they are deleted as soon as they are used.
This should not affect performance much for small lists.
There is one link where this is elaborated very well and a solution is also given.
Try it; if you find a proper solution, please post it here so others can understand it.
For your reference, the original link:
https://bensonxion.wordpress.com/2012/05/07/serializing-an-ienumerable-produces-collection-was-modified-enumeration-operation-may-not-execute/
When we use the .NET serialization classes to serialize an object whose definition contains an enumerable type (i.e. a collection), you will easily get an InvalidOperationException saying "Collection was modified; enumeration operation may not execute" when your code runs in multi-threaded scenarios. The root cause is that the serialization classes iterate through the collection via an enumerator, so the problem comes down to iterating through a collection while modifying it.
The first solution is simply to use a lock as a synchronization mechanism to ensure that operations on the List object can only be executed from one thread at a time. Obviously, this has a performance penalty: if you want to serialize a collection of such objects, the lock is applied for each of them.
.NET 4.0 makes dealing with multi-threading scenarios handier. For this serialized-collection-field problem, I found we can take advantage of the ConcurrentQueue class (check MSDN), which is a thread-safe, FIFO collection that makes the code lock-free.
Using this class is simple: replace your collection type with it, use Enqueue to add an element to the end of the ConcurrentQueue, and remove the lock code. Or, if your scenario does require list-like functionality, you will need a bit more code to adapt ConcurrentQueue to your fields.
By the way, ConcurrentQueue doesn't have a Clear method, because the underlying algorithm doesn't permit atomically clearing the collection, so you have to do it yourself; the fastest way is to re-create a new empty ConcurrentQueue as a replacement.
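A minimal sketch of that swap (the field and method names here are only illustrative):
using System.Collections.Concurrent;
using System.Collections.Generic;

private ConcurrentQueue<string> _items = new ConcurrentQueue<string>();

public void AddItem(string item)
{
    _items.Enqueue(item);                   // thread-safe append, no explicit lock needed
}

public void ClearItems()
{
    _items = new ConcurrentQueue<string>(); // "clear" by re-creating the queue, as described above
}

public IEnumerable<string> Snapshot()
{
    return _items.ToArray();                // ToArray gives serializers a stable snapshot to iterate
}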
Here is a specific scenario that warrants a specialized approach:
The Dictionary is enumerated frequently.
The Dictionary is modified infrequently.
In this scenario creating a copy of the Dictionary (or the Dictionary.Values) before every enumeration can be quite costly. My idea about solving this problem is to reuse the same cached copy in multiple enumerations, and watch an IEnumerator of the original Dictionary for exceptions. The enumerator will be cached along with the copied data, and interrogated before starting a new enumeration. In case of an exception the cached copy will be discarded, and a new one will be created. Here is my implementation of this idea:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Linq;

public class EnumerableSnapshot<T> : IEnumerable<T>, IDisposable
{
    private IEnumerable<T> _source;
    private IEnumerator<T> _enumerator;
    private ReadOnlyCollection<T> _cached;

    public EnumerableSnapshot(IEnumerable<T> source)
    {
        _source = source ?? throw new ArgumentNullException(nameof(source));
    }

    public IEnumerator<T> GetEnumerator()
    {
        if (_source == null) throw new ObjectDisposedException(this.GetType().Name);
        if (_enumerator == null)
        {
            _enumerator = _source.GetEnumerator();
            _cached = new ReadOnlyCollection<T>(_source.ToArray());
        }
        else
        {
            var modified = false;
            if (_source is ICollection collection) // C# 7 syntax
            {
                modified = _cached.Count != collection.Count;
            }
            if (!modified)
            {
                try
                {
                    _enumerator.MoveNext();
                }
                catch (InvalidOperationException)
                {
                    modified = true;
                }
            }
            if (modified)
            {
                _enumerator.Dispose();
                _enumerator = _source.GetEnumerator();
                _cached = new ReadOnlyCollection<T>(_source.ToArray());
            }
        }
        return _cached.GetEnumerator();
    }

    public void Dispose()
    {
        _enumerator?.Dispose();
        _enumerator = null;
        _cached = null;
        _source = null;
    }

    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}

public static class EnumerableSnapshotExtensions
{
    public static EnumerableSnapshot<T> ToEnumerableSnapshot<T>(
        this IEnumerable<T> source) => new EnumerableSnapshot<T>(source);
}
Usage example:
private static IDictionary<Guid, Subscriber> _subscribers;
private static EnumerableSnapshot<Subscriber> _subscribersSnapshot;

// ...(in the constructor)
_subscribers = new Dictionary<Guid, Subscriber>();
_subscribersSnapshot = _subscribers.Values.ToEnumerableSnapshot();

// ...(elsewhere)
foreach (var subscriber in _subscribersSnapshot)
{
    //...
}
Unfortunately this idea cannot be used currently with the class Dictionary in .NET Core 3.0, because this class does not throw a Collection was modified exception when enumerated and the methods Remove and Clear are invoked. All other containers I checked are behaving consistently. I checked systematically these classes:
List<T>, Collection<T>, ObservableCollection<T>, HashSet<T>, SortedSet<T>, Dictionary<T,V> and SortedDictionary<T,V>. Only the two aforementioned methods of the Dictionary class in .NET Core are not invalidating the enumeration.
Update: I fixed the above problem by also comparing the lengths of the cached and the original collection. This fix assumes that the dictionary will be passed directly as an argument to the EnumerableSnapshot's constructor, and that its identity will not be hidden by (for example) a projection like: dictionary.Select(e => e).ToEnumerableSnapshot().
Important: The above class is not thread safe. It is intended to be used from code running exclusively in a single thread.
You can copy the subscribers dictionary to a temporary dictionary of the same type and then iterate the temporary dictionary with a foreach loop.
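For example (reusing the names from the WCF question above):
// Copy the current entries, then enumerate the copy; removals from the
// original dictionary no longer invalidate this loop.
var snapshot = new Dictionary<Guid, Subscriber>(subscribers);
foreach (Subscriber s in snapshot.Values)
{
    s.Callback.SignalData(sr);
}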
A different way to solve this problem: instead of removing elements, create a new dictionary, add only the elements you didn't want to remove, and then replace the original dictionary with the new one. I don't think this is much of an efficiency problem, because it does not increase the number of times you iterate over the structure.
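A sketch of that idea, again reusing the names from the question, with a hypothetical ShouldKeep predicate standing in for "not marked for removal":
var kept = new Dictionary<Guid, Subscriber>();
foreach (var pair in subscribers)
{
    if (ShouldKeep(pair.Value))   // ShouldKeep is illustrative, not part of the original code
    {
        kept.Add(pair.Key, pair.Value);
    }
}
subscribers = kept;               // swap in the rebuilt dictionary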
Assuming the following case:
public Hashtable map = new Hashtable();

public void Cache(String fileName)
{
    if (!map.ContainsKey(fileName))
    {
        map.Add(fileName, new Object());
        _Cache(fileName);
    }
}

private void _Cache(String fileName)
{
    lock (map[fileName])
    {
        if (File Already Cached)
            return;
        else
        {
            cache file
        }
    }
}
When having the following consumers:
Task.Run(() => {
    Cache("A");
});
Task.Run(() => {
    Cache("A");
});
Would it be possible in any way for the Cache method to throw a duplicate-key exception, meaning that both tasks hit map.Add and try to add the same key?
Edit:
Would using the following data structure solve this concurrency problem?
public class HashMap<Key, Value>
{
    private HashSet<Key> Keys = new HashSet<Key>();
    private List<Value> Values = new List<Value>();

    public int Count => Keys.Count;

    public Boolean Add(Key key, Value value)
    {
        int oldCount = Keys.Count;
        Keys.Add(key);
        if (oldCount != Keys.Count)
        {
            Values.Add(value);
            return true;
        }
        return false;
    }
}
Yes, of course it would be possible. Consider the following fragment:
if (!map.ContainsKey(fileName))
{
    map.Add(fileName, new Object());
Thread 1 may execute if (!map.ContainsKey(fileName)) and find that the map does not contain the key, so it will proceed to add it, but before it gets the chance to add it, Thread 2 may also execute if (!map.ContainsKey(fileName)), at which point it will also find that the map does not contain the key, so it will also proceed to add it. Of course, that will fail.
EDIT (after clarifications)
So, the problem seems to be how to keep the main map locked for as little as possible, and how to prevent cached objects from being initialized twice.
This is a complex problem, so I cannot give you a ready-to-run answer that will work (especially since I do not currently even have a C# development environment handy), but generally speaking, I think you should proceed as follows (a rough sketch of the idea appears after these steps):
Fully guard your map with lock().
Keep your map locked as little as possible; when an object is not found to be in the map, add an empty object to the map and exit the lock immediately. This will ensure that this map will not become a point of contention for all requests coming in to the web server.
After the check-if-present-and-add-if-not fragment, you are holding an object which is guaranteed to be in the map. However, this object may and may not be initialized at this point. That's fine. We will take care of that next.
Repeat the lock-and-check idiom, this time with the cached object: every single incoming request interested in that specific object will need to lock it, check whether it is initialized, and if not, initialize it. Of course, only the first request will suffer the penalty of initialization. Also, any requests that arrive before the object has been fully initialized will have to wait on their lock until the object is initialized. But that's all very fine, that's exactly what you want.
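Under those assumptions, the steps above might look roughly like this (FileCache, CacheEntry and LoadFile are illustrative names, not from the question):
using System.Collections.Generic;

public class FileCache
{
    private readonly object _mapLock = new object();
    private readonly Dictionary<string, CacheEntry> _map = new Dictionary<string, CacheEntry>();

    private class CacheEntry
    {
        public bool Initialized;   // guarded by locking the entry itself
        public object Data;
    }

    public void Cache(string fileName)
    {
        CacheEntry entry;
        lock (_mapLock)                       // keep the map locked as briefly as possible
        {
            if (!_map.TryGetValue(fileName, out entry))
            {
                entry = new CacheEntry();     // add an empty entry, then release the map lock
                _map.Add(fileName, entry);
            }
        }

        lock (entry)                          // lock-and-check on the entry, not on the map
        {
            if (!entry.Initialized)
            {
                entry.Data = LoadFile(fileName);  // only the first caller pays this cost
                entry.Initialized = true;
            }
        }
    }

    private object LoadFile(string fileName)
    {
        // Placeholder for the real "cache file" work from the question.
        return new object();
    }
}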
I have the following Linq query:
IEnumerable<Network> net = from file in Directory.GetFiles(folder + @"\network")
                           from lines in File.ReadLines(file).Skip(1)
                           let row = lines.Split(',')
                           select new Network
                           {
                               networkname = getnetwork(row),
                               ...
                               networkdate = getnetworkdate(row)
                           };
When I use ToList() on it, the list is empty, but when I use a foreach loop where I add each item to an empty List, it is not. Did I make a mistake in this query, or could there be a different source for this strange behavior?
UPDATE:
I am using it in a extension method like this:
This does not work:
public static void FillFromCsv(this List<Network> network)
{
    [QUERY HERE]
    network = net.ToList();
}
This works:
public static void FillFromCsv(this List<Network> network)
{
    [QUERY HERE]
    network.Clear();
    foreach (Network n in net)
    {
        network.Add(n);
    }
}
Your problem has nothing to do with foreach vs ToList. The problem is that in the first method, you are not changing the list that's passed in. You're overwriting the local reference with a new reference:
public static void FillFromCsv(this List<Network> network)
{
    [QUERY HERE]
    // This only affects the _local_ "network" reference, not the reference passed in
    network = net.ToList();
}
You could change the parameter to a ref parameter, but why use an extension method at all? Why have the caller pass in a list if you're just going to blow it away? I would just do
public static IEnumerable<Network> FillFromCsv()
{
    [QUERY HERE]
    return net.ToList();
}
In the second method, you're clearing the list instance that's passed in and adding the results to it. Which is fine, but it's different from what you're doing in the first method.
You could avoid the foreach by just doing:
public static void FillFromCsv(this List<Network> network)
{
    [QUERY HERE]
    network.Clear();
    network.AddRange(net);
}
But it seems odd to have the caller pass you a list that you then clear out and refill. A better method would be to just return a list (like the first suggestion).
I am getting users and their data from an external web service. I cache those items because I don't want to hit the web service every time. Now, if a user updates any of their information, I save it through the web service. But I don't want to get the latest data from the web service, as that takes a lot of time. Instead, I want to update my cache. Can I do that? If so, what would be the best way? Here is my code:
List<User> users = appSecurity.SelectUsers();
var CacheKey = string.Format("GetUserList_{0}", currentUser);
CacheFactory.AddCacheItem(CacheKey, users, 300);
CacheFactory is a class where I handle adding, clearing, and removing cache entries. Below is the code:
public static void RemoveCacheItem(string key)
{
    Cache.Remove(key);
}

public static void ClearCache()
{
    System.Collections.IDictionaryEnumerator enumerator = Cache.GetEnumerator();
    while (enumerator.MoveNext())
    {
        RemoveCacheItem(enumerator.Key.ToString());
    }
}

public static void AddCacheItem<T>(string key, T value, double timeOutInSeconds)
{
    var Item = GetCacheItem<T>(key);
    if (Item != null)
    {
        RemoveCacheItem(key);
        Item = value;
    }
    Cache.Insert(key, value, null, DateTime.Now.AddSeconds(timeOutInSeconds), System.Web.Caching.Cache.NoSlidingExpiration);
}
The answer is yes, it can be done. It can also be done in many different ways, depending on what you want to solve. At the basic level you can create a cache by using a List<T> or Dictionary<TKey, TValue> to store your data.
When you get information from the external web-service, you push the data into your List or Dictionary. You can then use that data throughout your application. When you need to update that cache, you update the value in the List/Dictionary.
You can update your dictionary like so:
Dictionary<string, int> list = new Dictionary<string, int>();
then you can set the value for the key "test" as follows
list["test"] = list["test"] + 1;
When you are ready to push the updated data to the external source, all you need to do is parse that data into the format the source expects and send it off.
Like I said, there are many different ways to do this, but this is a basic sample way of accomplishing it. You can use this example to build on and go from there.
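As a concrete illustration tied to the code from the question, updating the cached user list in place might look something like this (a sketch: it assumes CacheFactory exposes the GetCacheItem<T> already used inside AddCacheItem, that User has Id and Name properties, and that updatedUser is whatever the client sent back; FirstOrDefault needs using System.Linq):
var cacheKey = string.Format("GetUserList_{0}", currentUser);
var cachedUsers = CacheFactory.GetCacheItem<List<User>>(cacheKey);
if (cachedUsers != null)
{
    // List<User> is a reference type, so updating the object in place updates
    // what the cache holds; no round trip to the web service is needed.
    var user = cachedUsers.FirstOrDefault(u => u.Id == updatedUser.Id);
    if (user != null)
    {
        user.Name = updatedUser.Name; // copy over whichever fields changed
    }
}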