Fixed-size concurrent list - C#

I have implemented a simple task to create a fixed-size list that allows concurrent writes and can dump the latest snapshot of its items at any time.
Here is my implementation. The offset increases atomically for each thread and resets when it reaches the size of the list, so different threads should have isolated access to separate sections of the array.
My question is: when I call Dump(), the first few items are not stored in the list. Also, is there an Interlocked function that can do both the atomic increment and the reset, so I don't have to create a locker object and a lock block? Thanks.
public static void Main(string[] args)
{
ConcurrentCircularFixedList<int> list = new ConcurrentCircularFixedList<int>(20);
Enumerable.Range(1, 30).AsParallel().Select(nu => list.Enqueue(nu)).ToList();
}
public class ConcurrentCircularFixedList<T>
{
private int _size;
private int _offset;
private readonly object _locker = new Object();
private T[] _list;
public ConcurrentCircularFixedList(int size)
{
_size = size;
_offset = 0;
_list = new T[_size];
}
public int Enqueue(T item)
{
_list[_offset] = item;
lock(_locker)
{
Debug.Write("B " + _offset);
_offset += 1;
if(_offset == _size)
_offset = 0;
Debug.Write("A " + _offset + "\n");
}
return _offset;
}
public T[] Dump()
{
return _list.ToArray();
}
}
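On the second question: there is no single Interlocked method that both increments and wraps, but the usual lock-free idiom is a CompareExchange retry loop. A minimal sketch (the helper name is my own):

```csharp
using System;
using System.Threading;

static class AtomicWrap
{
    // Atomically claim the current slot and advance the offset,
    // wrapping back to 0 when it reaches size. Returns the claimed slot.
    public static int IncrementAndWrap(ref int offset, int size)
    {
        while (true)
        {
            int current = Volatile.Read(ref offset);
            int next = (current + 1 == size) ? 0 : current + 1;
            // Publish the new offset only if no other thread raced us.
            if (Interlocked.CompareExchange(ref offset, next, current) == current)
                return current;
        }
    }
}
```

In Enqueue this would replace the whole lock block: int slot = AtomicWrap.IncrementAndWrap(ref _offset, _size); _list[slot] = item;. Claiming the slot before writing also addresses the missing items: the posted code writes _list[_offset] = item before taking the lock, so several threads can write the same slot and earlier items get overwritten.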

Here's a small version of a lock-free list that copies on write. Its performance characteristics should be clearly understood before using it: writes are expensive when you have many writers or the list is large, while reads are synchronization-free since the published list is effectively immutable. This could be improved in various ways, of course, but you get the idea. In effect it trades extra memory pressure and slower writes for zero-cost reads.
public class CopyWriteList<T>
{
private volatile List<T> list;
public CopyWriteList()
{
list = new List<T>();
}
public CopyWriteList(int capacity)
{
list = new List<T>(capacity);
}
public T this[int index]
{
get { return list[index]; }
set { Replace(x => x[index] = value); }
}
public void Clear()
{
Replace(x => x.Clear());
}
public void Add(T item)
{
Replace(x => x.Add(item));
}
//Etc....
private void Replace(Action<List<T>> action)
{
List<T> current;
List<T> updated;
do
{
current = list;
updated = new List<T>(current);
action(updated);
} while (Interlocked.CompareExchange(ref list, updated, current) != current);
}
public List<T> GetSnapshot()
{
return list;
}
}
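To see the compare-and-swap publication step in isolation, here is a compressed, self-contained variant trimmed down to Add plus snapshot (renamed CopyOnWriteList to avoid confusion; the pragma silences the expected CS0420 warning about passing a volatile field by ref):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

#pragma warning disable 420 // CompareExchange on a volatile field is intentional here

public class CopyOnWriteList<T>
{
    private volatile List<T> list = new List<T>();

    public void Add(T item)
    {
        List<T> current, updated;
        do
        {
            current = list;                          // read the current version
            updated = new List<T>(current) { item }; // clone, then mutate the clone
        }
        while (Interlocked.CompareExchange(ref list, updated, current) != current);
    }

    // The published list is never mutated again, so readers need no lock.
    public List<T> GetSnapshot() { return list; }
}
```

Running Parallel.For(0, 1000, i => l.Add(i)) and then checking l.GetSnapshot().Count reliably gives 1000: a writer that loses the race simply retries against the newly published list, so no update is lost.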
Alternatively, here's a fixed version of your code. Note that there is now added contention between readers and writers, and performance could suffer because of it (including the ever-expensive context switching).
public class ConcurrentCircularFixedList<T>
{
private readonly int _size;
private int _offset;
private readonly object _locker = new Object();
private readonly T[] _list;
public ConcurrentCircularFixedList(int size)
{
_size = size;
_offset = 0;
_list = new T[_size];
}
public int Enqueue(T item)
{
lock (_locker)
{
_list[_offset] = item;
Debug.Write("B " + _offset);
_offset += 1;
if (_offset == _size)
_offset = 0;
Debug.Write("A " + _offset + "\n");
return _offset;
}
}
public T[] Dump()
{
lock (_locker)
return _list.ToArray();
}
}

Related

Reusable list which maintains memory

I'm looking for a List<T>-like class in .NET that behaves like List<T> but doesn't de-allocate its memory when Clear() is called - it should only reset the Size property.
My aim is to use this class in a memory pool: I want the memory to be maintained so the caller can use the class as though it were a standard list, while avoiding lots of memory re-allocations.
If this already exists please let me know as it will save time optimizing, testing and debugging this code.
Here is a mockup of what I'm hoping to find in the .NET library:
public class ReusableList<T>
{
#region Static Properties
private static long InitialCapacity = 1000000;
private static int CapacityIncreaseRate = 10000;
#endregion
#region Properties
public long Size
{
get
{
return this._size;
}
private set
{
this._size = value;
}
}
private long _size = 0;
private long RealSize
{
get
{
return this._realSize;
}
set
{
this._realSize = value;
}
}
private long _realSize = 0;
private T[] Data
{
set
{
this._data = value;
}
get
{
return this._data;
}
}
private T[] _data = null;
#endregion
#region Operators
public T this[long index]
{
get
{
return this.Data[index];
}
set
{
this.Data[index] = value;
}
}
#endregion
#region Public Methods
public ReusableList()
{
this.Rebuild();
}
public void Add(T item)
{
this.Data[this.Size] = item;
this._size++;
if (this.Size >= this.RealSize)
{
this.IncreaseSizeOfList();
}
}
public void Clear()
{
this.Size = 0;
}
#endregion
#region Private Methods
private void Rebuild()
{
this.Data = null;
this.Data = new T[ReusableList<T>.InitialCapacity];
this.Size = 0;
this.RealSize = ReusableList<T>.InitialCapacity;
}
private void IncreaseSizeOfList()
{
if (this.Size < this.RealSize)
return;
var newData = new T[this.RealSize + ReusableList<T>.CapacityIncreaseRate];
Array.Copy(this.Data, newData, this.RealSize);
this.Data = newData;
this.RealSize += ReusableList<T>.CapacityIncreaseRate;
}
#endregion
}
As far as I understand, this is the default behavior of List<T>.
When you add items to a list, it allocates new memory if needed. When you remove items (or even clear the list entirely), it neither "frees" memory nor decreases the size of the internal array.
The only way to shrink the internal array is to decrease Capacity (or call TrimExcess).
You can look into the source code yourself. E.g., here's the Clear method:
public void Clear()
{
if (_size > 0)
{
Array.Clear(_items, 0, _size);
_size = 0;
}
_version++;
}
As you can see, it only resets the array items to their defaults and sets the size to 0 - the backing array keeps its length.
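This is easy to verify directly: Clear keeps the backing array, and only setting Capacity (or calling TrimExcess) actually releases it. A small demo:

```csharp
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var list = new List<int>();
        for (int i = 0; i < 1000; i++)
            list.Add(i);
        int before = list.Capacity;                 // grew by doubling, >= 1000

        list.Clear();                               // items gone, memory kept
        Console.WriteLine(list.Count);              // 0
        Console.WriteLine(list.Capacity == before); // True

        list.TrimExcess();                          // now the backing array shrinks
        Console.WriteLine(list.Capacity < before);  // True
    }
}
```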
You can also store a backup of the internal list:
public class ReusableList<T> : List<T>
{
private List<T> backup;
public void Clear(bool purge)
{
if (purge)
backup?.Clear();
else
backup = this.ToList();
base.Clear();
}
public new void Clear()
{
this.Clear(false);
}
}

Deadlock in object pool class

I'm experimenting with threading in C#, and I've created the following class as a result. I've tried to avoid any race conditions, yet a deadlock occurs on use.
The class uses two different locks: a spinlock for straightforward operations, plus a Monitor lock to wait in case no object is ready. I originally used EventWaitHandle, but discovered that race conditions were inevitable due to WaitOne/Set ordering.
As far as I can tell, Monitor.Pulse cannot precede Monitor.Wait here, so what else could cause a deadlock? When 5 threads use a TestPool with a capacity of 4, the deadlock always occurs at SpinLock, at an irregular moment.
internal class TestPool<T> where T : class
{
private int capacity;
private int unitPos;
private int waitUnitPos;
private int waitCount;
private int lockState;
private object lockObj;
private T[] units;
private Func<T> unitFactory;
public TestPool(int capacity, Func<T> unitFactory)
{
this.lockObj = new object();
this.unitFactory = unitFactory;
Init(capacity);
}
public T Fetch()
{
T unit;
Lock();
unit = (unitPos != capacity) ? units[unitPos++] : Wait();
Unlock();
return unit;
}
public void Store(T unit)
{
Lock();
if (waitCount == 0)
{
units[--unitPos] = unit;
}
else
{
Pulse(unit);
}
Unlock();
}
private T Wait()
{
waitCount++;
lock (lockObj)
{
Unlock();
Monitor.Wait(lockObj);
Lock();
return units[--waitUnitPos];
}
}
private void Pulse(T unit)
{
waitCount--;
units[waitUnitPos++] = unit;
lock (lockObj)
{
Monitor.Pulse(lockObj);
}
}
private void Lock()
{
if (Interlocked.CompareExchange(ref lockState, 1, 0) != 0)
{
SpinLock();
}
}
private void SpinLock()
{
SpinWait spinWait = new SpinWait();
do
{
spinWait.SpinOnce();
}
while (Interlocked.CompareExchange(ref lockState, 1, 0) != 0);
}
private void Unlock()
{
Interlocked.Exchange(ref lockState, 0);
}
private void Init(int capacity)
{
T[] tx = new T[capacity];
for (int i = 0; i < capacity; i++)
{
tx[i] = unitFactory.Invoke();
}
units = tx;
this.capacity = capacity;
}
}
Fixed it. I had to move the following code outside the Monitor lock. (Otherwise the woken waiter spins on the spinlock while still holding lockObj, while Store holds the spinlock and blocks trying to enter lockObj in Pulse - a classic lock-ordering deadlock.)
Lock();
return units[--waitUnitPos];
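For comparison, the same pool semantics can be built without hand-rolled locking by counting free units with a SemaphoreSlim. This is not the poster's code, just a sketch of the alternative:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

internal class SemaphorePool<T> where T : class
{
    private readonly ConcurrentBag<T> units = new ConcurrentBag<T>();
    private readonly SemaphoreSlim available;

    public SemaphorePool(int capacity, Func<T> unitFactory)
    {
        for (int i = 0; i < capacity; i++)
            units.Add(unitFactory());
        available = new SemaphoreSlim(capacity, capacity);
    }

    public T Fetch()
    {
        available.Wait();        // blocks while the pool is empty
        T unit;
        units.TryTake(out unit); // cannot fail: Wait reserved a unit
        return unit;
    }

    public void Store(T unit)
    {
        units.Add(unit);
        available.Release();     // wakes one blocked Fetch, if any
    }
}
```

Wait blocks a fifth thread until a Store runs, and because Store adds the unit to the bag before releasing the semaphore, TryTake after a successful Wait can never come up empty.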

Set the maximum value of the progress bar to match the length of the queue.

I would like to visualize how a queue grows and shrinks when accessed by two reader threads and a writer thread, with the help of a progress bar on the main form. I use a delegate to invoke the progress bar on the main form and set its value to the queue length (the Toys count). The progress bar does not behave as expected: it does not grow all the way, and the lenght variable does not track the queue either.
public class Buffer
{
private delegate void Display(int v, ProgressBar f );
private Queue<Toy> Toys = new Queue<Toy>();
private object MyLock = new object();
private int max;
private int lenght;
private ProgressBar progressbar1;
public Buffer(ProgressBar r)
{
this.progressbar1 = r;
this.max = 10;
this.lenght = Toys.Count;
}
public void writeMethod(Toy toy)
{
lock (MyLock)
{
if (Toys.Count == max)
{
Monitor.Wait(MyLock);
}
Toys.Enqueue(toy);
Monitor.PulseAll(MyLock);
progressbar1.Invoke(new Display(Disp), new object[] {Toys.Count, progressbar1});
MessageBox.Show("Que contains these items" + lenght);
}
}
public void readMethod()
{
lock (MyLock)
{
if(Toys.Count == 0)
{
Monitor.Wait(MyLock);
}
Toys.Dequeue();
Monitor.PulseAll(MyLock);
}
}
public void Disp(int I, ProgressBar l)
{
progressbar1.Value = I;
}
}
Double-check the variable lenght - use Toys.Count instead. Change the progress bar's default Maximum from 100 to 10 to match the size of the queue Toys. In the Disp method, change = I to += I:
{
public class Buffer
{
private delegate void Display(int v, ProgressBar f );
private Queue<Toy> Toys = new Queue<Toy>();
private object MyLock = new object();
private int max;
private int lenght;
private ProgressBar progressbar1;
public Buffer(ProgressBar r)
{
this.progressbar1 = r;
this.max = 10;
this.lenght = Toys.Count;
}
public void writeMethod(Toy toy)
{
lock (MyLock)
{
if (Toys.Count >= max)
{
Monitor.Wait(MyLock);
}
if(Toys.Count <= max)
{
Toys.Enqueue(toy);
progressbar1.Invoke(new Display(Disp), new object[] {Toys.Count, progressbar1});
}
Monitor.PulseAll(MyLock);
MessageBox.Show("Que contains these items" + Toys.Count);
}
}
public void readMethod()
{
lock (MyLock)
{
if(Toys.Count == 0)
{
Monitor.Wait(MyLock);
}
Toys.Dequeue();
Monitor.PulseAll(MyLock);
}
}
public void Disp(int I, ProgressBar l)
{
progressbar1.Value += I;
}
}
}
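Independent of the progress bar issue, both versions guard Monitor.Wait with an if. With two readers and one writer, a woken thread must re-check its condition (another reader may have emptied the queue first), so the guard should be a while loop. A generic sketch of the pattern:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public class BoundedBuffer<T>
{
    private readonly Queue<T> queue = new Queue<T>();
    private readonly object gate = new object();
    private readonly int max;

    public BoundedBuffer(int max) { this.max = max; }

    public void Write(T item)
    {
        lock (gate)
        {
            while (queue.Count >= max)   // re-check after every wakeup
                Monitor.Wait(gate);
            queue.Enqueue(item);
            Monitor.PulseAll(gate);
        }
    }

    public T Read()
    {
        lock (gate)
        {
            while (queue.Count == 0)     // another reader may have won the race
                Monitor.Wait(gate);
            T item = queue.Dequeue();
            Monitor.PulseAll(gate);
            return item;
        }
    }
}
```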

Multithreading sharing counter and List collection variables

Scenario: I have multiple threads sharing a static global counter variable.
After incrementing it, each thread adds the value to a List of integers, and that list is used by another thread to check some details.
I realized that even after using lock on the global counter, I still get duplicate numbers.
Please pardon my explanation; the code speaks better.
The problem is that different threads may generate the same counter value, which I don't want.
I want a running number without duplicates.
class Test
{
private Object _thisLock = new Object();
List<int> listing = new List<int>(); //shared LIST
public void Main()
{
//array of threads
for (int i = 0; i < 8; i++)
{
Thread th = new Thread(Work);
th.Name = "Thread" + i;
th.Start();
}
Thread.Sleep(5000);
//Start checking for duplicates
Thread checker = new Thread(Checker);
checker.Start();
}
private void Work()
{
Object _thisLock = new Object();
while (true)
{
int a = Singleton.Instance.Counter++;
Console.WriteLine(Thread.CurrentThread.Name);
Console.WriteLine("WOrk : " + a);
lock (_thisLock)
{
listing.Add(a);
}
Thread.Sleep(1000);
}
}
private void Checker()
{
Object _thisLock = new Object();
while (true)
{
lock (_thisLock)
{
List<int> selflist = new List<int>();
selflist.AddRange(listing); ;
foreach (int p in selflist)
{
if (selflist.FindAll(item => item.Equals(p)).Count() > 1)
{
Console.WriteLine("Check!!!!!!!!!!!!!!!!!! : " + p);
}
}
}
Thread.Sleep(5000);
}
}
}
static void Main()
{
Test t = new Test();
t.Main();
}
public sealed class Singleton
{
private static volatile Singleton instance;
private static object syncRoot = new Object();
private readonly Object _thisLock = new Object();
private Singleton() { }
public static Singleton Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
instance = new Singleton();
}
}
return instance;
}
}
private volatile static int _counter;
public int Counter
{
get
{
lock (_thisLock)
{
return _counter;
}
}
set
{
lock (_thisLock)
{
_counter = value;
}
}
}
}
In your Work method every thread has its own lock object _thisLock.
Remove this statement from your Work method and let it use the private lock object of the class:
Object _thisLock = new Object();
Why not just move the counter increment into the lock, and share the lock by moving it to class level?
private object _thisLock = new object();
...
lock (_thisLock)
{
int a = Singleton.Instance.Counter++;
listing.Add(a);
}
Also, use a thread-safe collection type, like ConcurrentBag<T>.
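The underlying cause of the duplicates is that Counter++ is a read-modify-write: the property's get and set are each locked, but the increment between them is not atomic, so two threads can read the same value. Interlocked.Increment performs the whole step atomically with no lock; a sketch combining it with a concurrent collection:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class UniqueCounterDemo
{
    static int _counter;
    static readonly ConcurrentBag<int> listing = new ConcurrentBag<int>();

    static void Main()
    {
        // 8 workers each claim 100 numbers; Increment returns the new
        // value atomically, so no two threads can see the same number.
        Parallel.For(0, 8, _ =>
        {
            for (int i = 0; i < 100; i++)
                listing.Add(Interlocked.Increment(ref _counter));
        });
        Console.WriteLine(listing.Count);              // 800
        Console.WriteLine(listing.Distinct().Count()); // 800: no duplicates
    }
}
```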

How to remove items from a generic list after N minutes?

I have a list as below:
private List<DateTime> _result = new List<DateTime>();
and I add values to it like
_result.Add(DateTime.Now);
The requirement is that each item added should be removed from the list within a 5-minute deadline.
I was thinking I could create a Timer which checks my list every minute or so, finds old items, and removes them, but I hoped there could be an easier way.
How should I implement it?
Thanks
Here's my take on this:
public class DateWrapper
{
// A plain List with a lock: ConcurrentBag<T> has no Remove method.
private List<DateWrapper> list;
private DateTime time;
public DateTime Time
{
get { return time; }
}
private Timer timer;
public DateWrapper(List<DateWrapper> _list, DateTime _time)
{
list = _list;
time = _time;
lock (list)
list.Add(this);
timer = new Timer();
timer.Interval = 300000; // 5 Minutes
timer.Tick += new EventHandler(Tick);
timer.Start();
}
private void Tick(object sender, EventArgs e)
{
timer.Stop(); // fire once only
lock (list)
list.Remove(this);
}
}
The above works for a small number of items. With a big list you end up with too many timers, and performance would suffer.
So, if you have to handle a lot of items, here's a generic way to do it:
public class ExpirableList<T> : IList<T>
{
private volatile List<Tuple<DateTime, T>> collection = new List<Tuple<DateTime,T>>();
private Timer timer;
public int Interval
{
get { return timer.Interval; }
set { timer.Interval = value; }
}
private TimeSpan expiration;
public TimeSpan Expiration
{
get { return expiration; }
set { expiration = value; }
}
/// <summary>
/// Defines a list that automatically removes expired objects.
/// </summary>
/// <param name="_interval">The interval at which the list tests for old objects.</param>
/// <param name="_expiration">The TimeSpan an object stays valid inside the list.</param>
public ExpirableList(int _interval, TimeSpan _expiration)
{
timer = new Timer();
timer.Interval = _interval;
timer.Tick += new EventHandler(Tick);
timer.Start();
expiration = _expiration;
}
private void Tick(object sender, EventArgs e)
{
for (int i = collection.Count - 1; i >= 0; i--)
{
if ((DateTime.Now - collection[i].Item1) >= expiration)
{
collection.RemoveAt(i);
}
}
}
#region IList Implementation
public T this[int index]
{
get { return collection[index].Item2; }
set { collection[index] = new Tuple<DateTime, T>(DateTime.Now, value); }
}
public IEnumerator<T> GetEnumerator()
{
return collection.Select(x => x.Item2).GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return collection.Select(x => x.Item2).GetEnumerator();
}
public void Add(T item)
{
collection.Add(new Tuple<DateTime, T>(DateTime.Now, item));
}
public int Count
{
get { return collection.Count; }
}
public bool IsSynchronized
{
get { return false; }
}
public bool IsReadOnly
{
get { return false; }
}
public void CopyTo(T[] array, int index)
{
for (int i = 0; i < collection.Count; i++)
array[i + index] = collection[i].Item2;
}
public bool Remove(T item)
{
bool contained = Contains(item);
for (int i = collection.Count - 1; i >= 0; i--)
{
if ((object)collection[i].Item2 == (object)item)
collection.RemoveAt(i);
}
return contained;
}
public void RemoveAt(int i)
{
collection.RemoveAt(i);
}
public bool Contains(T item)
{
for (int i = 0; i < collection.Count; i++)
{
if ((object)collection[i].Item2 == (object)item)
return true;
}
return false;
}
public void Insert(int index, T item)
{
collection.Insert(index, new Tuple<DateTime, T>(DateTime.Now, item));
}
public int IndexOf(T item)
{
for (int i = 0; i < collection.Count; i++)
{
if ((object)collection[i].Item2 == (object)item)
return i;
}
return -1;
}
public void Clear()
{
collection.Clear();
}
#endregion
}
You can use a background thread that iterates through the list and removes expired elements.
public void RemoveDates()
{
var checkDatesTask = new Task(
() =>
{
while (!_cancelationTokenSource.IsCancellationRequested)
{
//TODO: check and delete elements here
_cancelationTokenSource.Token.WaitHandle.WaitOne(TimeSpan.FromSeconds(5));
}
},
_cancelationTokenSource.Token,
TaskCreationOptions.LongRunning);
checkDatesTask.Start();
}
P.S. I suggest you read more about asynchronous operations.
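Filling in that skeleton as a complete sketch (class and member names are illustrative; the sweep interval is a parameter, and the 5-minute requirement maps to an expiration of TimeSpan.FromMinutes(5)):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public class ExpiringDateList
{
    private readonly List<DateTime> _result = new List<DateTime>();
    private readonly object _gate = new object();
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();
    private readonly TimeSpan _expiration;
    private readonly TimeSpan _sweepInterval;

    public ExpiringDateList(TimeSpan expiration, TimeSpan sweepInterval)
    {
        _expiration = expiration;
        _sweepInterval = sweepInterval;
        Task.Factory.StartNew(Sweep, _cts.Token,
            TaskCreationOptions.LongRunning, TaskScheduler.Default);
    }

    public void Add(DateTime item) { lock (_gate) _result.Add(item); }

    public int Count { get { lock (_gate) return _result.Count; } }

    private void Sweep()
    {
        while (!_cts.IsCancellationRequested)
        {
            lock (_gate)
                _result.RemoveAll(d => DateTime.Now - d >= _expiration);
            // WaitOne doubles as the sleep and the cancellation check.
            _cts.Token.WaitHandle.WaitOne(_sweepInterval);
        }
    }

    public void Stop() { _cts.Cancel(); }
}
```

Compared with one Timer per item, a single sweeper handles any number of items; the trade-off is that removal can lag by up to one sweep interval.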
