Reusable list which maintains memory - C#

I'm looking for a List<T>-style class in .NET which behaves similarly to List<T> but doesn't de-allocate its memory when Clear() is called - it should only reset the Size property.
My aim is to use this class in a memory pool: the memory should be retained so the caller can use the class as though it were a standard list, while avoiding lots of memory re-allocations.
If this already exists please let me know, as it will save time optimizing, testing and debugging this code.
Here is a mockup of what I'm hoping to find in the .NET library:
public class ReusableList<T>
{
#region Static Properties
private static long InitialCapacity = 1000000;
private static int CapacityIncreaseRate = 10000;
#endregion
#region Properties
public long Size
{
get
{
return this._size;
}
private set
{
this._size = value;
}
}
private long _size = 0;
private long RealSize
{
get
{
return this._realSize;
}
set
{
this._realSize = value;
}
}
private long _realSize = 0;
private T[] Data
{
set
{
this._data = value;
}
get
{
return this._data;
}
}
private T[] _data = null;
#endregion
#region Operators
public T this[long index]
{
get
{
return this.Data[index];
}
set
{
this.Data[index] = value;
}
}
#endregion
#region Public Methods
public ReusableList()
{
this.Rebuild();
}
public void Add(T item)
{
this.Data[this.Size] = item;
this._size++;
if (this.Size >= this.RealSize)
{
this.IncreaseSizeOfList();
}
}
public void Clear()
{
this.Size = 0;
}
#endregion
#region Private Methods
private void Rebuild()
{
this.Data = null;
this.Data = new T[ReusableList<T>.InitialCapacity];
this.Size = 0;
this.RealSize = ReusableList<T>.InitialCapacity;
}
private void IncreaseSizeOfList()
{
if (this.Size < this.RealSize)
return;
var newData = new T[this.RealSize + ReusableList<T>.CapacityIncreaseRate];
Array.Copy(this.Data, newData, this.RealSize);
this.Data = newData;
this.RealSize += ReusableList<T>.CapacityIncreaseRate;
}
#endregion
}

As far as I understand, this is the default behavior of List<T>.
When you add items to a list, it allocates new memory if needed. When you remove items (or even clear the whole list), it neither "frees" memory nor shrinks the internal array.
The only way to shrink the internal array is to decrease Capacity (TrimExcess() is a shortcut for this).
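A quick way to see this for yourself; a minimal sketch:
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var list = new List<int>(1000000);     // backing array allocated up front
        for (int i = 0; i < 1000000; i++) list.Add(i);

        list.Clear();                          // Count drops to 0...
        Console.WriteLine(list.Capacity);      // ...but Capacity is still 1000000

        list.TrimExcess();                     // or: list.Capacity = 0;
        Console.WriteLine(list.Capacity);      // now the backing array is released
    }
}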
You can look at the source code yourself. E.g., here's the Clear method:
public void Clear()
{
if (_size > 0)
{
Array.Clear(_items, 0, _size);
_size = 0;
}
_version++;
}
As you can see, it only resets the array items to their defaults and sets the size to 0; the backing array keeps its length.

You can keep a backup of the internal list:
public class ReusableList<T> : List<T>
{
private List<T> backup;
public void Clear(bool purge)
{
if (purge)
backup?.Clear();
else
backup = this.ToList();
base.Clear();
}
public new void Clear()
{
this.Clear(false);
}
}
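A brief usage sketch of the class above (note it needs using System.Linq for ToList()), showing the difference between the two Clear overloads:
var list = new ReusableList<int>();
list.Add(1);
list.Add(2);

list.Clear();        // copies the current contents into 'backup', then empties the list
list.Add(3);

list.Clear(true);    // empties the list and discards the backup as well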

Related

Two identical multithreading scripts are causing a memory leak

I am working on my own multithreading for my algorithm-independent pathfinding for Unity. However, when I execute two instances of the same class I get a memory leak, while with only one instance I have no issues. I really want to use at least two threads if at all possible.
Below is the class I have issues with. Keep in mind that two independent threads will have to execute parts of this script. AddJob can be called from the main Unity thread, but will most likely be called from another update thread for the agents.
namespace Plugins.PathFinding.Threading
{
internal class PathFindingThread
{
private Thread m_Worker;
private volatile Queue<CompletedProcessingCallback> m_CallbackQueue;
private volatile Queue<IAlgorithm> m_QueuedTasks;
internal int GetTaskCount
{
get
{
return m_QueuedTasks.Count;
}
}
internal PathFindingThread()
{
m_Worker = new Thread(Run);
m_CallbackQueue = new Queue<CompletedProcessingCallback>();
m_QueuedTasks = new Queue<IAlgorithm>();
}
private void Run()
{
Debug.Log("<b><color=green> [ThreadInfo]:</color></b> PathFinding Thread Started ");
try
{
while(true)
{
if (m_QueuedTasks.Count > 0)
{
IAlgorithm RunningTask = m_QueuedTasks.Dequeue();
RunningTask.FindPath(new IAlgorithmCompleted(AddCallback));
}
else
break;
}
Debug.Log("<b><color=red> [ThreadInfo]:</color></b> PathFinding Worker is idle and has been Stopped");
}
catch(Exception)
{
Debug.Log("<b><color=red> [ThreadInfo]:</color></b> PathFinding thread encountred an error and has been aborted");
}
}
internal void AddJob(IAlgorithm AlgorithmToRun)
{
m_QueuedTasks.Enqueue(AlgorithmToRun);
//Debug.Log("Added Job To Queue");
}
private void AddCallback(CompletedProcessingCallback callback)
{
m_CallbackQueue.Enqueue(callback);
}
private void Update()
{
if (m_CallbackQueue.Count > 0)
{
if (m_CallbackQueue.Peek().m_Callback != null)
{
m_CallbackQueue.Peek().m_Callback.Invoke(m_CallbackQueue.Peek().m_Path);
}
m_CallbackQueue.Dequeue();
}
if (m_Worker.ThreadState != ThreadState.Running && m_QueuedTasks.Count != 0)
{
m_Worker = new Thread(Run);
m_Worker.Start();
}
}
}
internal delegate void IAlgorithmCompleted(CompletedProcessingCallback callback);
internal struct CompletedProcessingCallback
{
internal volatile FindPathCompleteCallback m_Callback;
internal volatile List<GridNode> m_Path;
}
}
namespace Plugins.PathFinding
{
internal enum TypeOfNode
{
Ground,
Air
}
//used to store location information since array can only take rounded numbers
internal struct Position
{
internal int x;
internal int y;
internal int z;
}
internal class GridNode
{
internal Position M_PostitionInGrid { get; private set; }
internal Vector3 M_PostitionInWorld { get; private set; }
internal TypeOfNode M_type { get; private set; }
internal bool m_IsWalkable = true;
internal GridNode m_ParrentNode;
internal int Hcost;
internal int Gcost;
internal int Fcost { get { return Hcost + Gcost; } }
internal GridNode(Position postion , Vector3 WorldPosition)
{
M_PostitionInGrid = postion;
m_IsWalkable = true;
M_PostitionInWorld = WorldPosition;
}
}
}
internal delegate void FindPathCompleteCallback(List<GridNode> Path);
internal abstract class IAlgorithm
{
protected GridNode m_SavedStart;
protected GridNode m_SavedTarget;
protected List<GridNode> m_LocatedPath;
protected FindPathCompleteCallback m_Callback;
internal FindPathCompleteCallback GetCallback
{
get
{
return m_Callback;
}
}
protected PathFindingGrid m_grid;
internal abstract void FindPath(IAlgorithmCompleted callback);
protected abstract List<GridNode> CreatePath(PathFindingGrid Grid, GridNode Start, GridNode Target);
protected abstract List<GridNode> RetracePath(GridNode start, GridNode target);
}
namespace Plugins.PathFinding.Astar
{
internal class AstarFinder : IAlgorithm
{
//construction of the Algorithm
internal AstarFinder(GridNode start, GridNode target, FindPathCompleteCallback Callback)
{
m_SavedStart = start;
m_SavedTarget = target;
m_Callback = Callback;
m_LocatedPath = new List<GridNode>();
m_grid = PathFindingGrid.GetInstance;
}
//function to start finding a path
internal override void FindPath(IAlgorithmCompleted callback)
{
//running Algorithm and getting the path
m_LocatedPath = CreatePath(PathFindingGrid.GetInstance, m_SavedStart, m_SavedTarget);
callback.Invoke(
new CompletedProcessingCallback()
{
m_Callback = m_Callback,
m_Path = m_LocatedPath
});
}
//Algorithm
protected override List<GridNode> CreatePath(PathFindingGrid Grid, GridNode Start, GridNode Target)
{
if(Grid == null ||
Start == null ||
Target == null)
{
UnityEngine.Debug.Log("Missing Parameter, might be outside of grid");
return new List<GridNode>();
}
List<GridNode> Path = new List<GridNode>();
List<GridNode> OpenSet = new List<GridNode>();
List<GridNode> ClosedSet = new List<GridNode>();
OpenSet.Add(Start);
int Retry = 0;
while (OpenSet.Count > 0)
{
if(Retry > 3000 || Grid == null)
{
UnityEngine.Debug.Log("Path Inpossible Exiting");
break;
}
GridNode CurrentNode = OpenSet[0];
for (int i = 0; i < OpenSet.Count; i++)
{
if(OpenSet[i].Fcost < CurrentNode.Fcost || OpenSet[i].Fcost == CurrentNode.Fcost && OpenSet[i].Hcost < CurrentNode.Hcost)
{
CurrentNode = OpenSet[i];
}
}
OpenSet.Remove(CurrentNode);
ClosedSet.Add(CurrentNode);
if(CurrentNode == Target)
{
Path = RetracePath(CurrentNode,Start);
break;
}
GridNode[] neighbour = Grid.GetNeighbouringNodes(CurrentNode);
for (int i = 0; i < neighbour.Length; i++)
{
if (!neighbour[i].m_IsWalkable || ClosedSet.Contains(neighbour[i]))
continue;
int CostToNeighbour = CurrentNode.Gcost + Grid.GetDistance(CurrentNode, neighbour[i]);
if(CostToNeighbour < neighbour[i].Gcost || !OpenSet.Contains(neighbour[i]))
{
neighbour[i].Gcost = CostToNeighbour;
neighbour[i].Hcost = Grid.GetDistance(neighbour[i], Target);
neighbour[i].m_ParrentNode = CurrentNode;
if (!OpenSet.Contains(neighbour[i]))
OpenSet.Add(neighbour[i]);
}
}
Retry++;
}
return Path;
}
//retracing the path out of a node map
protected override List<GridNode> RetracePath(GridNode start, GridNode target)
{
List<GridNode> Output = new List<GridNode>();
GridNode current = start;
while(current != target)
{
Output.Add(current);
current = current.m_ParrentNode;
}
Output.Reverse();
return Output;
}
}
}
This shows the core of your code made thread safe.
internal class PathFindingThread
{
Task m_Worker;
ConcurrentQueue<CompletedProcessingCallback> m_CallbackQueue;
ConcurrentQueue<IAlgorithm> m_QueuedTasks;
internal int GetTaskCount
{
get
{
return m_QueuedTasks.Count;
}
}
internal PathFindingThread()
{
m_CallbackQueue = new ConcurrentQueue<CompletedProcessingCallback>();
m_QueuedTasks = new ConcurrentQueue<IAlgorithm>();
m_Worker = Task.Factory.StartNew(() =>
{
while (true)
{
IAlgorithm head = null;
if (m_QueuedTasks.TryDequeue(out head))
{
head.FindPath(new IAlgorithmCompleted(AddCallback));
}
else
{
Task.Delay(1).Wait(); // brief pause so the idle loop does not spin at full CPU
}
}
});
}
internal void AddJob(IAlgorithm AlgorithmToRun)
{
m_QueuedTasks.Enqueue(AlgorithmToRun);
}
private void AddCallback(CompletedProcessingCallback callback)
{
m_CallbackQueue.Enqueue(callback);
}
private void Update()
{
CompletedProcessingCallback cb;
if (m_CallbackQueue.TryDequeue(out cb))
{
cb.m_Callback.Invoke(cb.m_Path);
}
}
}
Volatile is only good for reading and writing the field itself (the reference) - it does nothing for calls made on the collection that the field references.
You probably do not need volatile in CompletedProcessingCallback at all, but that depends on where else it is used; having volatile fields on a struct is certainly a bad smell.
Resolve these threading issues first, then see if you still have the problem.
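To make that distinction concrete, here is a small illustrative sketch (the names are not from the code above): volatile only makes the latest value of the reference visible across threads, while a concurrent collection makes the operations on the collection itself safe.
using System.Collections.Concurrent;
using System.Collections.Generic;

class VolatileVsConcurrent
{
    // 'volatile' only affects reads/writes of the reference itself;
    // it does not make Enqueue/Dequeue safe to call from several threads.
    private volatile Queue<int> unsafeQueue = new Queue<int>();

    // A concurrent collection makes the operations themselves thread-safe.
    private readonly ConcurrentQueue<int> safeQueue = new ConcurrentQueue<int>();

    public void Produce(int value)
    {
        unsafeQueue.Enqueue(value);   // two threads here can corrupt the queue
        safeQueue.Enqueue(value);     // safe from any number of threads
    }

    public bool TryConsume(out int value)
    {
        return safeQueue.TryDequeue(out value);
    }
}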

Adding list to listbox one by one

I need to add my list to my ListBox. I searched through the questions on this site, but none of the answers work; I always end up with things like listbox1.spelers in my ListBox.
Here is the code I have now.
private void btnAdd_Click(object sender, EventArgs e)
{
Speler speler1 = new Speler(tbNaam.Text, tbAge.Text);
List<Speler> spelers = new List<Speler>();
spelers.Add(speler1);
listBox1.DataSource = spelers;
}
I also tried ToArray() but it still didn't work.
SOLVED
You're re-binding the control to a list of exactly one element every time. So the control will only ever have one element.
Keep the list in a higher scope. For example, if this class is persistent in memory (that is, not a web application) then make it a class-level member:
private List<Speler> spelers = new List<Speler>();
private void btnAdd_Click(object sender, EventArgs e)
{
Speler speler1 = new Speler(tbNaam.Text, tbAge.Text);
spelers.Add(speler1);
listBox1.DataSource = spelers;
// maybe call listBox1.DataBind() here? it's been a while since I've had to use forms
}
That way you're always adding another element to the same list, instead of creating a new list every time.
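A related caveat: re-assigning the same List<T> instance to DataSource may not refresh the control, because a plain List<T> raises no change notifications, and text like listbox1.spelers usually means the item's default ToString() (its type name) is being displayed. A BindingList<T> raises ListChanged, so added items show up automatically; a minimal sketch, where the Naam property and the Form1_Load handler are assumptions:
// requires: using System.ComponentModel;
private BindingList<Speler> spelers = new BindingList<Speler>();

private void Form1_Load(object sender, EventArgs e)
{
    listBox1.DataSource = spelers;       // bind once
    listBox1.DisplayMember = "Naam";     // assumed property on Speler; or override ToString()
}

private void btnAdd_Click(object sender, EventArgs e)
{
    // BindingList raises ListChanged, so the ListBox refreshes on its own
    spelers.Add(new Speler(tbNaam.Text, tbAge.Text));
}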
If you are using a Windows Forms application, you can use a BindingSource:
Speler speler1 = new Speler(tbNaam.Text, tbAge.Text);
List<Speler> spelers = new List<Speler>();
spelers.Add(speler1);
var bs = new BindingSource();
bs.DataSource = spelers;
listBox1.DataSource = bs;
A good example of how the enumeration machinery works under the hood is this console app:
using System;
namespace Enumeration
{
using System;
using System.Collections;
// implements IEnumerable
class ListBoxTest : IEnumerable
{
private string[] strings;
private int ctr = 0;
// private nested implementation of ListBoxEnumerator
private class ListBoxEnumerator : IEnumerator
{
// member fields of the nested ListBoxEnumerator class
private ListBoxTest currentListBox;
private int index;
// public within the private implementation
// thus, private within ListBoxTest
public ListBoxEnumerator(ListBoxTest currentListBox)
{
// a particular ListBoxTest instance is
// passed in, hold a reference to it
// in the member variable currentListBox.
this.currentListBox = currentListBox;
index = -1;
}
// Increment the index and make sure the
// value is valid
public bool MoveNext()
{
index++;
if (index >= currentListBox.strings.Length)
return false;
else
return true;
}
public void Reset()
{
index = -1;
}
// Current property defined as the
// last string added to the listbox
public object Current
{
get
{
return(currentListBox[index]);
}
}
} // end nested class
// Enumerable classes can return an enumerator
public IEnumerator GetEnumerator()
{
return (IEnumerator) new ListBoxEnumerator(this);
}
// initialize the listbox with strings
public ListBoxTest(params string[] initialStrings)
{
// allocate space for the strings
strings = new String[8];
// copy the strings passed in to the constructor
foreach (string s in initialStrings)
{
strings[ctr++] = s;
}
}
// add a single string to the end of the listbox (no bounds check; this demo holds at most 8 strings)
public void Add(string theString)
{
strings[ctr] = theString;
ctr++;
}
// allow array-like access
public string this[int index]
{
get
{
if (index < 0 || index >= strings.Length)
{
// handle bad index
}
return strings[index];
}
set
{
strings[index] = value;
}
}
// publish how many strings you hold
public int GetNumEntries()
{
return ctr;
}
}
public class EnumerationTester
{
public void Run()
{
// create a new listbox and initialize
ListBoxTest currentListBox =
new ListBoxTest("Hello", "World");
// add a few strings
currentListBox.Add("Who");
currentListBox.Add("Is");
currentListBox.Add("John");
currentListBox.Add("Galt");
// test the access
string subst = "Universe";
currentListBox[1] = subst;
// access all the strings
foreach (string s in currentListBox)
{
Console.WriteLine("Value: {0}", s);
}
}
[STAThread]
static void Main()
{
EnumerationTester t = new EnumerationTester();
t.Run();
}
}
}

Fixed-size concurrent list

I have implemented a simple task: a fixed-size list that allows concurrent writes and can dump the latest snapshot of its items at any time.
Here is my implementation. The offset increases atomically for each thread and resets when it reaches the size of the list. Different threads should have isolated access to their own slots of the array.
My question is: when I call Dump(), the first few items are not stored in the list. Also, is there an Interlocked function that can do both the atomic increase and the reset, so I don't have to create a locker object and a lock block? Thanks.
public static void Main(string[] args)
{
ConcurrentCircularFixedList<int> list = new ConcurrentCircularFixedList<int>(20);
Enumerable.Range(1, 30).AsParallel().Select(nu => list.Enqueu(nu)).ToList();
}
public class ConcurrentCircularFixedList<T>
{
private int _size;
private int _offset;
private readonly object _locker = new Object();
private T[] _list;
public ConcurrentCircularFixedList(int size)
{
_size = size;
_offset = 0;
_list = new T[_size];
}
public int Enqueu(T item)
{
_list[_offset] = item;
lock(_locker)
{
Debug.Write("B " + _offset);
_offset += 1;
if(_offset == _size)
_offset = 0;
Debug.Write("A " + _offset + "\n");
}
return _offset;
}
public T[] Dump()
{
return _list.ToArray();
}
}
Here's a small lock-free list that copies on write. Its performance characteristics should be clearly understood before using it: writes are expensive when there are many writers or the list is large, while reads are synchronization-free because each published list is effectively immutable. This could be improved in various ways, of course, but you get the idea. In effect it trades extra memory pressure and slower writes for zero-cost reads.
public class CopyWriteList<T>
{
private volatile List<T> list;
public CopyWriteList()
{
list = new List<T>();
}
public CopyWriteList(int capacity)
{
list = new List<T>(capacity);
}
public T this[int index]
{
get { return list[index]; }
set { Replace(x => x[index] = value); }
}
public void Clear()
{
Replace(x => x.Clear());
}
public void Add(T item)
{
Replace(x => x.Add(item));
}
//Etc....
private void Replace(Action<List<T>> action)
{
List<T> current;
List<T> updated;
do
{
current = list;
updated = new List<T>(current);
action(updated);
} while (Interlocked.CompareExchange(ref list, updated, current) != current);
}
public List<T> GetSnapshot()
{
return list;
}
}
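A short usage sketch for the class above:
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class CopyWriteDemo
{
    static void Main()
    {
        var list = new CopyWriteList<int>();

        // many concurrent writers; each Add copies the current list and swaps it in
        Parallel.For(0, 1000, i => list.Add(i));

        // readers just grab the current immutable version, no locking involved
        List<int> snapshot = list.GetSnapshot();
        Console.WriteLine(snapshot.Count);   // 1000
    }
}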
Alternatively, here's a fixed version of your code. Note that there is now contention between readers and writers, so performance can suffer (context switches are expensive).
public class ConcurrentCircularFixedList<T>
{
private readonly int _size;
private int _offset;
private readonly object _locker = new Object();
private readonly T[] _list;
public ConcurrentCircularFixedList(int size)
{
_size = size;
_offset = 0;
_list = new T[_size];
}
public int Enqueue(T item)
{
lock (_locker)
{
_list[_offset] = item;
Debug.Write("B " + _offset);
_offset += 1;
if (_offset == _size)
_offset = 0;
Debug.Write("A " + _offset + "\n");
return _offset;
}
}
public T[] Dump()
{
lock (_locker)
return _list.ToArray();
}
}
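As for the side question about a single Interlocked call that both increments and resets: there isn't one, but a CompareExchange loop gives the same effect without a lock. A minimal sketch, assuming fields named _offset and _size as in the code above:
// Atomically advance _offset, wrapping to 0 when it reaches _size,
// and return the slot index this caller may write to.
private int NextOffset()
{
    while (true)
    {
        int current = _offset;                                  // snapshot
        int next = (current + 1 == _size) ? 0 : current + 1;
        if (Interlocked.CompareExchange(ref _offset, next, current) == current)
            return current;                                     // we won this slot
    }
}
Even with this, Dump() still needs some synchronization (or the copy-on-write approach above) to produce a consistent snapshot.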

How to remove items from a generic list after N minutes?

I have a list as below:
private List<DateTime> _result = new List<DateTime>();
and I add values to it like
_result.Add(DateTime.Now);
The requirement is that each item should be removed from the list within 5 minutes of being added.
I was thinking I could create a Timer which checks my list every minute or so, finds the old items and removes them, but I hoped there could be an easier way?
How to implement it?
Thanks
Here's my take on this:
public class DateWrapper
{
// The shared list. Every wrapper must receive the same list, and all access to it
// is guarded by a lock so items can be removed again (a plain IList<T> is used here
// because ConcurrentBag<T> has no Remove method).
private IList<DateWrapper> list;
private DateTime time;
public DateTime Time
{
get { return time; }
}
private Timer timer; // System.Windows.Forms.Timer (Tick event, Interval in ms)
public DateWrapper(IList<DateWrapper> _list, DateTime _time)
{
list = _list;
time = _time;
lock (list)
{
list.Add(this);
}
timer = new Timer();
timer.Interval = 300000; // 5 minutes
timer.Tick += new EventHandler(Tick);
timer.Start();
}
private void Tick(object sender, EventArgs e)
{
timer.Stop(); // one-shot: remove this entry exactly once
lock (list)
{
list.Remove(this);
}
}
}
The above works for a small number of items. With a big list you end up with too many timers, and performance suffers.
So, if you have to handle a lot of items, here's a generic way to do it:
public class ExpirableList<T> : IList<T>
{
private volatile List<Tuple<DateTime, T>> collection = new List<Tuple<DateTime,T>>();
private Timer timer;
public int Interval
{
get { return timer.Interval; }
set { timer.Interval = value; }
}
private TimeSpan expiration;
public TimeSpan Expiration
{
get { return expiration; }
set { expiration = value; }
}
/// <summary>
/// Defines a list that automatically removes expired objects.
/// </summary>
/// <param name="_interval">The interval, in milliseconds, at which the list tests for old objects.</param>
/// <param name="_expiration">The TimeSpan an object stays valid inside the list.</param>
public ExpirableList(int _interval, TimeSpan _expiration)
{
timer = new Timer();
timer.Interval = _interval;
timer.Tick += new EventHandler(Tick);
timer.Start();
expiration = _expiration;
}
private void Tick(object sender, EventArgs e)
{
for (int i = collection.Count - 1; i >= 0; i--)
{
if ((DateTime.Now - collection[i].Item1) >= expiration)
{
collection.RemoveAt(i);
}
}
}
#region IList Implementation
public T this[int index]
{
get { return collection[index].Item2; }
set { collection[index] = new Tuple<DateTime, T>(DateTime.Now, value); }
}
public IEnumerator<T> GetEnumerator()
{
return collection.Select(x => x.Item2).GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return collection.Select(x => x.Item2).GetEnumerator();
}
public void Add(T item)
{
collection.Add(new Tuple<DateTime, T>(DateTime.Now, item));
}
public int Count
{
get { return collection.Count; }
}
public bool IsSynchronized
{
get { return false; }
}
public bool IsReadOnly
{
get { return false; }
}
public void CopyTo(T[] array, int index)
{
for (int i = 0; i < collection.Count; i++)
array[i + index] = collection[i].Item2;
}
public bool Remove(T item)
{
bool contained = Contains(item);
for (int i = collection.Count - 1; i >= 0; i--)
{
if ((object)collection[i].Item2 == (object)item)
collection.RemoveAt(i);
}
return contained;
}
public void RemoveAt(int i)
{
collection.RemoveAt(i);
}
public bool Contains(T item)
{
for (int i = 0; i < collection.Count; i++)
{
if ((object)collection[i].Item2 == (object)item)
return true;
}
return false;
}
public void Insert(int index, T item)
{
collection.Insert(index, new Tuple<DateTime, T>(DateTime.Now, item));
}
public int IndexOf(T item)
{
for (int i = 0; i < collection.Count; i++)
{
if ((object)collection[i].Item2 == (object)item)
return i;
}
return -1;
}
public void Clear()
{
collection.Clear();
}
#endregion
}
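A short usage sketch for the list above, assuming the Timer is System.Windows.Forms.Timer (it exposes Tick and an int Interval):
// check every 60 seconds; entries expire 5 minutes after they were added
var recent = new ExpirableList<string>(60000, TimeSpan.FromMinutes(5));
recent.Add("ping");
// after 5-6 minutes the Tick handler will have removed "ping" again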
You can use a background thread which iterates through the list and removes expired elements.
public void RemoveDates()
{
var checkDatesTask = new Task(
() =>
{
while (!_cancelationTokenSource.IsCancellationRequested)
{
//TODO: check and delete elements here
_cancelationTokenSource.Token.WaitHandle.WaitOne(
TimeSpan.FromSeconds(
5));
}
},
_cancelationTokenSource.Token,
TaskCreationOptions.LongRunning);
checkDatesTask.Start();
}
P.S. I suggest you read more about asynchronous operations.
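To flesh out the TODO above, here is a minimal sketch of the check-and-delete step; the _sync and _result fields are illustrative names, assuming all access to the list takes the same lock:
private readonly object _sync = new object();           // guards _result
private List<DateTime> _result = new List<DateTime>();

private void RemoveExpired()
{
    lock (_sync)
    {
        DateTime cutoff = DateTime.Now - TimeSpan.FromMinutes(5);
        _result.RemoveAll(t => t < cutoff);              // drop everything older than 5 minutes
    }
}
The Add side has to take the same lock, of course.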

How to use a mutex

I have one thread that sends data stored in a buffer of type List<string> via TCP. Another thread writes into the buffer. As I am not very familiar with C#, I'd like to know how I should use lock or Mutex correctly.
This is the code I'd like to use eventually:
while(buffer.isLocked())
{
buffer.wait();
}
buffer.lockBuffer();
buffer.add(tcpPacket);
buffer.unlockBuffer();
buffer.notify();
This is my current code. I hope someone can help me complete it.
public class Buffer
{
private Mutex mutex;
private List<string> buffer;
private bool locked = false;
public Buffer()
{
mutex = new Mutex(false);
buffer = new List<string>();
}
public bool isLocked()
{
return locked;
}
public void lockBuffer()
{
if (!locked)
{
//...
locked = true;
}
}
public void unlockBuffer()
{
if(locked)
{
mutex.ReleaseMutex();
locked = false;
}
}
public void wait()
{
mutex.WaitOne();
}
public void notify()
{
//...
}
}
It would be better to use System.Collections.Concurrent.BlockingCollection. It doesn't require any external synchronization.
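A minimal sketch of that approach (illustrative types, not the asker's exact code):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class BlockingCollectionDemo
{
    static void Main()
    {
        var buffer = new BlockingCollection<string>();

        // producer: the thread that writes packets into the buffer
        var producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 10; i++)
                buffer.Add("packet " + i);
            buffer.CompleteAdding();                     // signal that no more items will arrive
        });

        // consumer: the thread that sends the data (over TCP in the question)
        foreach (string packet in buffer.GetConsumingEnumerable())
            Console.WriteLine("sending " + packet);      // blocks until an item is available

        producer.Wait();
    }
}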
For those who don't use .NET 4.0:
using System;
using System.Collections.Generic;
using System.Threading;
namespace MyCollections
{
public class BlockingQueue<T> : IDisposable
{
Queue<T> _Queue = new Queue<T>();
SemaphoreSlim _ItemsInQueue = null;
SemaphoreSlim _FreeSlots = null;
int _MaxItems = -1;
public BlockingQueue(int maxItems=Int32.MaxValue)
{
_MaxItems = maxItems;
_ItemsInQueue = new SemaphoreSlim(0, maxItems);
_FreeSlots = new SemaphoreSlim(maxItems, maxItems);
}
public void Dispose()
{
if (_ItemsInQueue != null) _ItemsInQueue.Dispose();
if (_FreeSlots != null) _FreeSlots.Dispose();
}
public int Count
{
get { return _ItemsInQueue.CurrentCount; }
}
public void Add(T item)
{
if(_MaxItems != Int32.MaxValue) _FreeSlots.Wait();
lock (this)
{
_Queue.Enqueue(item);
_ItemsInQueue.Release();
}
}
public T Take()
{
T item = default(T);
_ItemsInQueue.Wait();
lock (this)
{
item = _Queue.Dequeue();
if (_MaxItems != Int32.MaxValue) _FreeSlots.Release();
}
return item;
}
}
}
The following code is not thread-safe. If two threads enter this method at the same time, both might pass the if condition successfully.
public void lockBuffer()
{
if (!locked)
{
//...
locked = true;
}
}
You simply might want to do something like this:
lock (_sycnObject)
{
buffer.lockBuffer();
buffer.add(tcpPacket);
buffer.unlockBuffer();
buffer.notify();
}
I don't think you're doing anything sophisticated that requires more than the simple-to-use lock statement.
I wouldn't use Mutexes, since I suppose you aren't dealing with cross-process synchronization. Locks are fine and simpler to use:
class Buffer
{
private readonly object syncObject = new object();
private readonly List<string> buffer = new List<string>();
public void AddPacket(string packet)
{
lock (syncObject)
{
buffer.Add(packet);
}
}
public void Notify()
{
// Do something, if needed lock again here
// lock (syncObject)
// {
// Notify Implementation
// }
}
}
The usage is straightforward (as you requested):
var myBuffer = new Buffer();
myBuffer.AddPacket("Hello, World!");
myBuffer.Notify();
