I'm trying to make my application thread safe. I hold my hands up and admit I'm new to threading, so I'm not sure how to proceed.
To give a simplified version, my application contains a list.
Most of the application accesses this list and doesn't change it but
may enumerate through it. All this happens on the UI thread.
Thread
one will periodically look for items to be Added and Removed from the
list.
Thread two will enumerate the list and update the items with
extra information. This has to run at the same time as thread one, as
it can take anything from seconds to hours.
The first question is: does anyone have a recommended strategy for this?
Secondly, I was trying to make separate copies of the list that the main application will use, periodically getting a new copy when something is updated, added or removed, but this doesn't seem to be working.
I have my list and a copy...
public class MDGlobalObjects
{
    public List<T> mainList = new List<T>();

    public List<T> copyList
    {
        get
        {
            return new List<T>(mainList);
        }
    }
}
If I get copyList, modify it, save mainList, restart my application, load mainList and look again at copyList, then the changes are present. I presume I've done something wrong, as copyList seems to still refer to mainList.
I'm not sure if it makes a difference but everything is accessed through a static instance of the class.
public static MDGlobalObjects CacheObjects = new MDGlobalObjects();
This is the gist using a ConcurrentDictionary:
public class Element
{
    public string Key { get; set; }
    public string Property { get; set; }

    public Element CreateCopy()
    {
        return new Element
        {
            Key = this.Key,
            Property = this.Property,
        };
    }
}
var d = new ConcurrentDictionary<string, Element>();

// thread 1
// prune
foreach ( var kv in d )
{
    if ( kv.Value.Property == "ToBeRemoved" )
    {
        Element dummy = null;
        d.TryRemove( kv.Key, out dummy );
    }
}

// thread 1
// add
Element toBeAdded = new Element();
// set basic properties here
d.TryAdd( toBeAdded.Key, toBeAdded );

// thread 2
// populate element
Element unPopulated = null;
if ( d.TryGetValue( "ToBePopulated", out unPopulated ) )
{
    Element nowPopulated = unPopulated.CreateCopy();
    nowPopulated.Property = "Populated";
    // either
    d.TryUpdate( unPopulated.Key, nowPopulated, unPopulated );
    // or
    d.AddOrUpdate( unPopulated.Key, nowPopulated, ( key, value ) => nowPopulated );
}

// read threads
// enumerate
foreach ( Element element in d.Values )
{
    // do something with each element
}

// read threads
// try to get specific element
Element specific = null;
if ( d.TryGetValue( "SpecificKey", out specific ) )
{
    // do something with specific element
}
In thread 2, if you can set properties so that the whole object is consistent after each atomic write, then you can skip making a copy and just populate the properties with the object in place in the collection.
There are a few race conditions in this code, but they should be benign in that readers always have a consistent view of the collection.
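To illustrate that in-place option: if all of the extra information thread 2 produces is gathered into one object and published with a single reference assignment, readers always see either the old details or the complete new details, never a half-written mix. A minimal sketch, continuing with the d dictionary above (the ExtraInfo type and the Details property are invented for this example):
// assume Element gains one extra property (invented for this example):
//     public ExtraInfo Details { get; set; }
public class ExtraInfo
{
    public string Description { get; set; }
    public DateTime RetrievedAt { get; set; }
}

// thread 2, populating in place instead of copying
Element toPopulate = null;
if ( d.TryGetValue( "ToBePopulated", out toPopulate ) )
{
    var details = new ExtraInfo
    {
        Description = "downloaded info",
        RetrievedAt = DateTime.UtcNow,
    };

    toPopulate.Details = details;   // one atomic reference write publishes everything
}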
Actually, copyList is just a shallow copy of mainList: the list is new, but the references to the objects contained in the list are still the same. To achieve what you are trying to do, you have to make a deep copy of the list.
Something like this:
public static IEnumerable<T> Clone<T>(this IEnumerable<T> collection) where T : ICloneable
{
return collection.Select(item => (T)item.Clone());
}
and use it like this (the ToList call is needed because copyList returns a List<T>):
return mainList.Clone().ToList();
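Two things to watch with the extension above: it has to live in a static class, and it only compiles when the element type implements ICloneable. A minimal, self-contained sketch, using a Widget class invented purely for illustration:
using System;
using System.Collections.Generic;
using System.Linq;

public static class ListCloneExtensions
{
    // the Clone<T> extension from above, placed in a static class so it compiles
    public static IEnumerable<T> Clone<T>(this IEnumerable<T> collection) where T : ICloneable
    {
        return collection.Select(item => (T)item.Clone());
    }
}

// Widget stands in for whatever your list elements really are
public class Widget : ICloneable
{
    public string Name { get; set; }
    public int Quantity { get; set; }

    public object Clone()
    {
        // MemberwiseClone is enough while the members are value-like;
        // nested reference-type members would need to be copied explicitly too
        return MemberwiseClone();
    }
}

public class Program
{
    public static void Main()
    {
        var mainList = new List<Widget> { new Widget { Name = "a", Quantity = 1 } };

        // a deep copy: editing the copied items no longer touches the originals
        List<Widget> copyList = mainList.Clone().ToList();
        copyList[0].Name = "b";

        Console.WriteLine(mainList[0].Name);   // still prints "a"
    }
}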
Looking at your question again, I would like to suggest an overall change of approach.
You should use a ConcurrentDictionary, since you are on .NET 4.0. With it you won't have to use locks, because a concurrent collection always maintains a valid state.
So your code will look something like this.
Thread 1's code:
var obj = download_the_object();
dic.TryAdd("SomeUniqueKeyOfTheObject", obj);
// TryAdd will return false if the key is already present, so implement some sort of retry mechanism
Thread 2's code:
foreach (var item in dictionary)
{
    var obj = item.Value;
    var extraInfo = downloadExtraInfoforObject(obj);

    // build a new object that has the extra info added, then swap it in with TryUpdate
    dictionary.TryUpdate(obj.uniqueKey, someNewObjectWithExtraInfoAdded, obj);
}
Related
I have a list of integers (Levels). I want to initialize a nested object of type Filter, called myFilter, as below (Filter is a class with two properties: Value and NextFilter):
var myFilter = new Filter
{
    Value = Levels[0],
    NextFilter = new Filter
    {
        Value = Levels[1],
        NextFilter = new Filter
        {
            Value = Levels[2],
            // ... and so on, down to ...
            NextFilter = new Filter
            {
                Value = Levels[n],
                NextFilter = null
            }
        }
    }
};
Levels' count is not static and depends on the input (I have a multi-select list that generates Levels).
How can I do that?
This is a classic case for using recursion - the technique of a method that calls itself:
public static Filter CreateFilter(List<int> values) => values.Any() ? new Filter //If the list contains elements, create the filter
{
Value = values.First(), //assign the first item of the values to the value property
NextFilter = CreateFilter(values.Skip(1).ToList()) //Create the rest of the nested object with the rest of the values
} : null; //If there aren't any items left in the list, return null and stop the recursion
You could of course do it in the constructor as well:
public Filter(List<int> values)
{
if (!values.Any()) return;
Value = values.First();
NextFilter = values.Count > 1 ? new Filter(values.Skip(1).ToList()) : null;
}
For more information about recursion, take a look at https://www.dotnetperls.com/recursion; for more information on nested classes, read through https://www.dotnetperls.com/nested-class.
A few more notes on recursion:
You can actually achieve everything through recursion; you don't even need loops. That's why loops don't exist in languages like Haskell.
The simplest recursive function is:
public static void EndlessLoop()
{
//Loop body
EndlessLoop();
}
However, even ReSharper suggests converting it to a loop:
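For reference, the loop form it suggests is simply:
public static void EndlessLoop()
{
    while (true)
    {
        // Loop body
    }
}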
Another example, if you want to get the sum of a list you could do:
public static int Sum(List<int> summands) => summands.Count > 0
? summands.First() + Sum(summands.Skip(1).ToList())
: 0;
But those examples aren't useful in C#: C# isn't a functional programming language, so recursion tends to be slower than loops. Furthermore, recursion often causes a StackOverflowException (fitting for this site). If you run the endless-loop recursion, it doesn't even take a second until your stack is full.
The reason is that C# pushes the return address of every method call onto the stack. If a method calls itself very often (and a lot of recursive calls happen within a second), a lot of addresses pile up on the stack until it overflows.
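For comparison, the same sum written with a loop uses a single stack frame no matter how long the list is:
public static int Sum(List<int> summands)
{
    int sum = 0;

    foreach (int summand in summands)
    {
        sum += summand;   // no additional stack frame per element
    }

    return sum;
}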
However, even though those examples aren't useful in C#, I still think it's quite useful to be able to handle recursion. Recursion is, for example, the natural way to explore a directory structure, e.g. to get all files:
public static List<FileInfo> GetAllFiles(DirectoryInfo directory) => directory.GetFiles()
.Concat(directory.GetDirectories().SelectMany(GetAllFiles))
.ToList();
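For example (the path is just a placeholder):
// collects every file in the tree below the given directory
List<FileInfo> allFiles = GetAllFiles(new DirectoryInfo(@"C:\SomeFolder"));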
And, as you experienced, it's the most natural way to fill a nested class like this from a list.
Just make a constructor for Filter that takes the Levels array as a parameter, sets its Value to levels[0], and initializes NextFilter = new Filter(levels.Skip(1)). Something like that, and it will recursively initialize your object.
I have a simple static inventory class which holds a list of the custom class Item. I am working on a crafting system, and when I craft something I need to remove the required Items from my inventory list.
I tried to create a method that takes an array of the items to remove as a parameter, but it's not working.
I think it's because the foreach loop doesn't know which items to remove? I am not getting any error messages; it just doesn't work. How can I accomplish this?
public class PlayerInventory : MonoBehaviour
{
    public Texture2D tempIcon;

    private static List<Item> _inventory = new List<Item>();

    public static List<Item> Inventory
    {
        get { return _inventory; }
    }

    public static void RemoveCraftedMaterialsFromInventory(Item[] items)
    {
        foreach (Item item in items)
        {
            PlayerInventory._inventory.Remove(item);
        }
    }
}
Here is the function that shows what items will be removed:
public static Item[] BowAndArrowReqs()
{
    Item requiredItem1 = ObjectGenerator.CreateItem(CraftingMatType.BasicWood);
    Item requiredItem2 = ObjectGenerator.CreateItem(CraftingMatType.BasicWood);
    Item requiredItem3 = ObjectGenerator.CreateItem(CraftingMatType.String);

    Item[] arrowRequiredItems = new Item[] { requiredItem1, requiredItem2, requiredItem3 };
    return arrowRequiredItems;
}
And here is where that is called:
This is within the RecipeCheck static class:
PlayerInventory.RemoveCraftedMaterialsFromInventory(RecipeCheck.BowAndArrowReqs());
While I like James's answer (and it sufficiently covers the contracts), I will talk about how one might implement this equality and make several observations.
For starters, the list returned may contain multiple objects of the same type, e.g. BasicWood, String. So there needs to be a discriminator for each new object.
It would be bad if RemoveCraftedMaterialsFromInventory(new [] { aWoodBlock }) removed a Wood piece using the same notion of equality that is used to check two wood pieces against each other, because being "compatible for crafting" isn't necessarily the same as "being equal".
One simple approach is to assign a unique ID (see Guid.NewGuid) for each specific object. This field would be used (and it could be used exclusively) in the Equals method - however, now we're back at the initial problem, where each new object is different from any other!
So, what's the solution? Make sure to use equivalent (or identical objects) when removing them!
List<Item> items = new List<Item> {
    new Wood { Condition = Wood.Rotten },
    new Wood { Condition = Wood.Epic },
};

// We find the EXISTING objects that we already have ..
// (ToList materializes the query, so we are not enumerating "items" while removing from it)
var woodToBurn = items.OfType<Wood>()
    .Where(w => w.Condition == Wood.Rotten)
    .ToList();

// .. so we can remove them
foreach (var wood in woodToBurn) {
    items.Remove(wood);
}
Well, okay, that's out of the way, but then we say: "How can we do this with a Recipe such that Equals isn't butchered and yet it will remove any items of the given type?"
Well, we can either do this by using LINQ or a List method that supports predicates (e.g. List.FindIndex), or we can implement a special Equatable to be used only in this case.
An implementation that uses a predicate might look like:
foreach (var recipeItem in recipeItems) {
// List sort of sucks; this implementation also has bad bounds
var index = items.FindIndex((item) => {
return recipeItem.MaterialType == item.MaterialType;
});
if (index >= 0) {
items.RemoveAt(index);
} else {
// Missing material :(
}
}
If class Item doesn't implement IEquatable<Item> and the bool Equals(Item other) method, then by default it will use Object.Equals, which checks whether they are the same object (not two objects with the same value, but literally the same object).
Since you don't say how Item is implemented, I can't suggest how to write its Equals(); however, you should also override GetHashCode() so that two Items that are Equal return the same hash code.
UPDATE (based on comments):
Essentially, List.Remove works like this:
foreach (var t in theList)
{
    if (t.Equals(itemToBeRemove))
        PerformSomeMagicToRemove(t);
}
So, you don't have to do anything to the code you've given in your question. Just add the Equals() method to Item.
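For completeness, here is only a sketch of what those equality members could look like, assuming equality should be based on the CraftingMatType the item was created from (the real Item class isn't shown in the question, so adjust this to whatever actually defines "the same material"):
public class Item : IEquatable<Item>
{
    public CraftingMatType MaterialType { get; set; }

    public bool Equals(Item other)
    {
        if (other == null) return false;
        return this.MaterialType == other.MaterialType;
    }

    public override bool Equals(object obj)
    {
        return Equals(obj as Item);
    }

    public override int GetHashCode()
    {
        // two equal Items must return the same hash code
        return MaterialType.GetHashCode();
    }
}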
I need to store a collection of nodes:
class Node
{
int Value;
//other info
}
I have three requirements:
Need to be able to efficiently retrieve the node with the lowest Value in the collection
Need to be able to efficiently insert a node into the collection
Two nodes can have the same Value
I thought the best collection to use for this would be some sort of sorted list. That way requirement #1 is satisfied efficiently by just taking the first element from the sorted list. Requirement #2 is satisfied efficiently by inserting a new node in the right place in the list.
But the SortedList collection in .Net is like SortedDictionary and requires the key being sorted on to be unique, which violates requirement #3.
There appears to be no collection in .Net that satisfies these requirements, mainly because the self-sorting collections that do exist require keys being sorted on to be unique. What is the reason for this? I assume it cannot be an oversight. What am I not grasping here? I can find similar questions about this but they usually involve someone suggesting SortList, followed by realizing this doesn't work, and then the conversation fades out without a standard solution. At least if someone would say "There is no collection in C# for this task, you need to hack something together" that would be an answer.
Is it acceptable to use a regular List<Node> and re-sort the list whenever a new node is added? Seems like that wouldn't be as efficient as inserting the node in the right place to begin with. Perhaps that is what I should do? Manually iterate over the list until I find the place to insert a new node myself?
If all you need is to efficiently insert, and quickly retrieve the item with the lowest value, then you don't need a sorted list. You need a heap. Check out A Generic Binary Heap Class.
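For example, on .NET 6 or later the built-in PriorityQueue<TElement, TPriority> is a ready-made binary heap that meets all three requirements, duplicates included; on older frameworks a heap class like the one linked above does the same job. A minimal sketch (it assumes Node.Value is made accessible):
var heap = new PriorityQueue<Node, int>();

// requirement #2: insertion is O(log n)
heap.Enqueue(new Node { Value = 5 }, 5);
heap.Enqueue(new Node { Value = 1 }, 1);
heap.Enqueue(new Node { Value = 1 }, 1);   // requirement #3: duplicate values are fine

// requirement #1: the node with the lowest Value is always at the front
Node lowest = heap.Peek();       // O(1)
Node removed = heap.Dequeue();   // O(log n)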
Make your list key unique by appending the object id or another unique identifier: IDs 4 and 5, both having value "1", will become "1_4" and "1_5", which can be added to the sorted list without trouble and will be sorted as expected.
You could use a SortedList<int, List<NodeInfo>>, where you'll put the Value in the key and all the other properties in the value:
public class NodeList : SortedList<int, List<NodeInfo>>
{
    public void Add(int key, NodeInfo info)
    {
        if (this.Keys.Contains(key))
        {
            this[key].Add(info);
        }
        else
        {
            this.Add(key, new List<NodeInfo>() { info });
        }
    }

    public NodeInfo FirstNode()
    {
        if (this.Count == 0)
            return null;

        return this.First().Value.First();
    }
}

public class NodeInfo
{
    public string Info { get; set; }
    // TODO: add other members
}
Here's some sample usage:
var list = new NodeList();

// adding
list.Add(3, new NodeInfo() { Info = "some info 3" });

// inserting
for (int i = 0; i < 100000; i++)
{
    list.Add(1, new NodeInfo() { Info = "some info 1" });
    list.Add(2, new NodeInfo() { Info = "some info 2" });
    list.Add(1, new NodeInfo() { Info = "some info 1.1" });
}

// retrieving the first item
var firstNodeInfo = list.FirstNode();

// retrieving an item
var someNodeInfo = list[2].First();
In my opinion, it is acceptable to use a normal list and re-sort it after every insert. Sorting is pretty efficient in .NET. See this thread: String sorting performance degradation in VS2010 vs. VS2008
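If you do stick with a plain List<Node>, you don't even need a full re-sort after every insert: List<T>.BinarySearch can find the insertion point so the list stays sorted. A sketch, assuming Node.Value is accessible and is what you sort by:
var nodes = new List<Node>();
var byValue = Comparer<Node>.Create((a, b) => a.Value.CompareTo(b.Value));

void Insert(Node node)
{
    int index = nodes.BinarySearch(node, byValue);
    if (index < 0)
        index = ~index;            // BinarySearch returns the complement of the insertion point when there is no exact match

    nodes.Insert(index, node);     // O(n) because of the shift, but no O(n log n) re-sort
}

Node lowest = nodes[0];            // the smallest Value is always at the front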
You can use OrderedMultiDictionary in Wintellect's Power Collections for .NET. That's exactly what you are looking for.
As far as thread safety goes, is this OK to do, or do I need to be using a different collection?
List<FileMemberEntity> fileInfo = getList();

Parallel.ForEach(fileInfo, fileMember =>
{
    // Modify each fileMember
});
As long as you are only modifying the contents of the item that is passed to the method, there is no locking needed.
(Provided, of course, that there are no duplicate references in the list, i.e. two references to the same FileMemberEntity instance.)
If you need to modify the list itself, create a copy that you can iterate, and use a lock when you modify the list:
List<FileMemberEntity> fileInfo = getList();
List<FileMemberEntity> copy = new List<FileMemberEntity>(fileInfo);

object sync = new Object();

Parallel.ForEach(copy, fileMember => {
    // do something
    lock (sync) {
        // here you can add or remove items from the fileInfo list
    }
    // do something
});
You're safe since you are just reading. Just don't modify the list while you are iterating over its items.
We should take the lock less often to make it faster: only lock when merging the thread-local results of Parallel.ForEach:
List<FileMemberEntity> copy = new List<FileMemberEntity>(fileInfo);
object sync = new Object();

Parallel.ForEach<FileMemberEntity, List<FileMemberEntity>>(
    copy,                                             // iterate over the copy
    () => { return new List<FileMemberEntity>(); },   // each thread gets its own local list
    (itemInCopy, state, localList) =>
    {
        // collect results in the thread-local list; no lock is needed here
        localList.Add(itemInCopy);
        return localList;
    },
    // merge each thread's local list into the shared fileInfo list under the lock
    // (don't add to "copy" itself, because it is the collection being enumerated)
    (finalResult) => { lock (sync) fileInfo.AddRange(finalResult); }
);
// do something
Reference: http://msdn.microsoft.com/en-gb/library/ff963547.aspx
If it does not matter what order the FileMemberEntity objects are acted on, you can use List<T> because you are not modifying the list.
If you must ensure some sort of ordering, you can use OrderablePartitioner<T> as a base class and implement an appropriate partitioning scheme. For example, if the FileMemberEntity has some sort of categorization and you must process each of the categories in some specific order, you would want to go this route.
Hypothetically if you have
Object 1 Category A
Object 2 Category A
Object 3 Category B
there is no guarantee that Object 2 Category A will be processed before Object 3 Category B is processed when iterating a List<T> using Parallel.ForEach.
The MSDN documentation you link to provides an example of how to do that.
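If the ordering requirement is only that one category must finish before the next one starts, a simpler alternative to writing a custom OrderablePartitioner is to group the items by category and run one Parallel.ForEach per group, in category order. A sketch, assuming FileMemberEntity exposes a Category property (it isn't shown in the question):
var groups = fileInfo
    .GroupBy(f => f.Category)      // Category is an assumed property
    .OrderBy(g => g.Key);

foreach (var group in groups)      // categories are processed strictly one after another
{
    Parallel.ForEach(group, fileMember =>
    {
        // modify each fileMember; items within the same category still run in parallel
    });
}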
I have a queue of users (email strings) in C#, and I want to send each user his position in this queue.
Something like:
Queue q = new Queue(32);
q.Enqueue(Session["email"].ToString());
q.IndexOf(email); // this is what I'd like to do, but Queue has no IndexOf method
Any ideas?
thanks
Maybe a List or an Array would be better for such actions but you could try this:
queue.ToArray().ToList().IndexOf(email);
You can use an extension method, something like:
public static int IndexOf<T>(this IEnumerable<T> collection, T searchItem)
{
    int index = 0;

    foreach (var item in collection)
    {
        if (EqualityComparer<T>.Default.Equals(item, searchItem))
        {
            return index;
        }

        index++;
    }

    return -1;
}
Queue is not the proper type if you need IndexOf; look at List instead.
Unfortunately, you cannot use the plain old .NET Queue object directly for this. Queue is made for "blind" first-in-first-out logic, so you cannot do much else with it.
If you really need a queue in which you can find elements and retrieve their position (a very useful thing), try wrapping everything in a class that exposes the following methods:
public class CustomQueue<T> {
    private LinkedList<T> fifoList = new LinkedList<T>();

    public void Enqueue(T newItem) {
        // add newItem at the head of fifoList
        fifoList.AddFirst(newItem);
    }

    public T Dequeue() {
        // return and remove the item that is located at the tail of the queue
        T item = fifoList.Last.Value;
        fifoList.RemoveLast();
        return item;
    }

    public int IndexOf(T searchFor) {
        int ret = 0;
        foreach (T item in fifoList) {
            if (item.Equals(searchFor)) return ret;
            ret++;
        }
        return -1; // not found
    }
}
For better performance (enqueue and dequeue in O(1), IndexOf in O(n)) you should use a doubly linked list, which is what LinkedList<T> above already is.
If you want to let the user know his position in the queue, simply return the queue's .Count property right after inserting his element (at that moment he is the Count-th element). Whenever you enqueue an element the count increases, and whenever an element is dequeued the count decreases.
Use the Queue's ToArray() method to get an array in the order of the queue, then find the object you are searching for. There's a good chance you don't need to use a traditional queue for whatever task you are performing though.
Something like:
Queue q = new Queue();
q.Enqueue("apple");
q.Enqueue("banana");
q.Enqueue("orange");
// get banana index:
return Array.IndexOf(q.ToArray(), "banana");
Spanish Inquisition
Inspired by the Monty Python sketch, you could sweep through the entire queue of suspects, dequeuing and enqueuing each item once. And while you are at it, you can use a lambda function to keep the dequeuing within the enqueuing operation.
//The queue
var inquisition = new Queue<string>();
//Suspects
inquisition.Enqueue("SUSPECT_A");
inquisition.Enqueue("SUSPECT_B");
inquisition.Enqueue("SUSPECT_C");
//Interrogation lambda function noting a particular suspect before it is returned
Func<string, string> interrogate = (suspect) => {
Console.WriteLine(suspect + (suspect.Equals("SUSPECT_B") ? " <---" : ""));
return suspect;
};
//Processing each suspect in the list
for(var i=0;i<inquisition.Count;i++){
inquisition.Enqueue(interrogate(inquisition.Dequeue()));
}
The result is a list of suspects where dubious ones are marked with an arrow
SUSPECT_A
SUSPECT_B <---
SUSPECT_C
I know this is an older thread, but was relevant to me. In my case the Queue object was perfect for the job it was intended to do, but I had a separate process that reports a status of the objects in queue and I wanted to report the queue position for each queued item. Here was my solution and this illustrates how converting a queue to a list can be helpful. Once converted to a list, you can perform Linq queries on it as well. In this case, I needed to find an export object by its status property:
Dim ExpQueueList As List(Of Export) = ExportQueue.ToList()

For Idx As Integer = 0 To ExportStatuses.Count - 1
    Dim Status As ExportStatus = ExportStatuses(Idx)

    'Find the export that this status belongs to in the queued export list
    Dim ExpObj As Object = (From E As Export In ExpQueueList Where E.Status Is Status Select E).FirstOrDefault()

    'If the export was found in the queue list, set the status text to indicate the position in the queue
    If ExpObj IsNot Nothing Then
        Status.StatusText = "In Queue to run - Queue Position: " & ExpQueueList.IndexOf(ExpObj)
    End If

    ExpObj = Nothing
    Status = Nothing
Next
Since you're enqueuing the user, he will always be the last person in the queue, which means his position is equivalent to queue.Count.
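A sketch using the generic queue:
var queue = new Queue<string>();

// ... other users get enqueued over time ...

queue.Enqueue(Session["email"].ToString());
int position = queue.Count;   // the user who was just added is always at position Count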