I'm supposed to write a LIFO (last in, first out) class for chars that can be modified with Pop and Add methods and inspected with a Peek method or with foreach. The class is backed by an array to keep it efficient, but for some reason foreach is not working. I first tried writing GetEnumerator around the value returned by _arr.GetEnumerator(), but that didn't work: when I printed the items, the console just showed TestApp.LIFO. So I wrote the version below, but now foreach doesn't print a single item, and while debugging I can see that _i, which Reset should set, stays at 0. Can someone explain why this is happening and suggest a solution?
using System;
using System.Collections;
using System.Collections.Generic;
namespace TestApp {
internal class LIFO : IEnumerator, IEnumerable {
public LIFO(int size) {
_arr = new char[size];
_index = 0;
}
public LIFO(char[] arr) {
_arr = arr.Clone() as char[];
_index = 0;
}
public char Peek() => _index == 0 ? '\0' : _arr[_index - 1];
public bool Add(char c) {
if (_index == _arr.Length)
return false;
try {
_arr[_index] = c;
} catch (Exception) {
return false;
}
++_index;
return true;
}
public void Pop() {
if (_index == 0)
return;
_arr[--_index] = '\0';
}
private int _i;
public IEnumerator GetEnumerator() => this;
public bool MoveNext() => --_i > -1;
public void Reset() => _i = _index - 1;
public object Current => _arr[_i];
private int _index;
private readonly char[] _arr;
}
}
In Program.cs:
using System;
namespace TestApp {
internal static class Program {
private static void Main() {
LIFO l = new(17);
l.Add('k');
l.Add('h');
l.Add('c');
foreach (var item in l)
Console.WriteLine(l);
}
}
}
The issue is that Reset is not called. It is no longer needed, but the interface has not changed for backwards-compatibility reasons. Since new iterators implemented using yield return are actually required to throw an exception if Reset is called, no code is expected to call this method anymore.
As such, your iterator index variable, _i, is never initialized and stays at 0. The first call to MoveNext steps it below 0 and then returns false, ending the foreach loop before it even starts.
You should always decouple the iterators from your actual collection, because it should be safe to enumerate the same collection twice in a nested manner; storing the index variable as an instance field of your collection prevents this.
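For illustration, a minimal sketch (mine, not from the question) of why the shared index breaks nested enumeration:
// Hypothetical demonstration: because the enumerator state (_i) lives on the LIFO itself,
// the inner foreach resets the same _i that the outer foreach is still using,
// so the outer loop loses its position.
var stack = new LIFO(3);
stack.Add('a');
stack.Add('b');
stack.Add('c');

foreach (var outer in stack)      // outer enumeration shares stack's _i
    foreach (var inner in stack)  // inner enumeration overwrites that _i
        Console.WriteLine($"{outer} {inner}");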
You can, however, simplify the enumerator implementation vastly by using yield return like this:
public class LIFO : IEnumerable
{
...
public IEnumerator GetEnumerator()
{
for (int i = _index - 1; i >= 0; i--)
yield return _arr[i];
}
}
You can then remove the _i variable, the MoveNext and Reset methods, as well as the Current property.
If you first want to make your existing code work, keeping in mind the note above about nested enumerators, you can change your GetEnumerator and Reset methods as follows:
public IEnumerator GetEnumerator()
{
Reset();
return this;
}
public void Reset() => _i = _index;
Note that you have to reset the _i variable to one step past the last (first) value as you're decrementing it inside MoveNext. If you don't, you'll ignore the last item added to the LIFO stack.
The index variable used to read values from the _arr array (_i) and the index variable you increment and decrement (_index) are different. Because of that, your loop never iterates over the collection. I fixed the code with some additions here; I hope it's helpful.
LIFO.cs
using System;
using System.Collections;
using System.Collections.Generic;
namespace TestApp
{
internal class LIFO : IEnumerator, IEnumerable
{
public LIFO(int size)
{
_arr = new char[size];
_index = 0;
}
public LIFO(char[] arr)
{
_arr = arr.Clone() as char[];
_index = 0;
}
public int Count() => _index;
public char Peek() => _index == 0 ? '\0' : _arr[_index - 1];
public bool Add(char c)
{
if (_index == _arr.Length)
return false;
try
{
_arr[_index] = c;
}
catch (Exception)
{
return false;
}
++_index;
_i = _index;
return true;
}
public void Pop()
{
if (_index == 0)
return;
_arr[--_index] = '\0';
_i = _index;
}
public IEnumerator GetEnumerator() => (IEnumerator)this;
public bool MoveNext() => --_index > -1;
public void Reset() => _index = _i;
public object Current
{
get => _arr[_index];
}
private int _index;
private int _i;
private readonly char[] _arr;
}
}
Program.cs
using System;
namespace TestApp
{
internal static class Program
{
private static void Main()
{
LIFO l = new LIFO(17);
l.Add('k');
l.Add('h');
l.Add('c');
Console.WriteLine("Count: " + l.Count());
foreach (var i in l)
Console.WriteLine(i);
l.Reset();
foreach (var i in l)
Console.WriteLine(i);
}
}
}
I want to ensure that my List can contain only two-element arrays of int. Currently I have to declare a struct, but I really don't want to declare many types if it is not strictly necessary. So I am wondering if I can declare a list of fixed-size arrays.
static void Main(string[] args)
{
//I want to declare something like this
List<int[2]> list = new List<int[2]>();
list.Add(new int[2] { 1, 2 });
list.Add(new int[2] { 3, 4 });
//What I having to do(not really want because I have to declare struct)
List<TwoElement> list2 = new List<TwoElement>();
list2.Add(new TwoElement() { Element1 = 1, Element2 = 2 });
list2.Add(new TwoElement() { Element1 = 1, Element2 = 2 });
}
private struct TwoElement
{
public int Element1;
public int Element2;
}
It's not possible; as referenced in this post, int[2] is not a type, so there is no way to enforce the limit without a struct, model, or tuple.
I would prefer a tuple in this case, since it's pretty simple:
List<(int, int)> list = new List<(int, int)>();
list.Add((1, 1));
list.Add((2, 3));
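If named fields make the intent clearer, a small variant (the names X and Y are just illustrative):
// Named tuple fields instead of Item1/Item2
var pairs = new List<(int X, int Y)>();
pairs.Add((1, 2));
pairs.Add((3, 4));
Console.WriteLine(pairs[0].X); // access by name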
Here's an idea of a fixed-length array. If the length were more than 2, then I'd probably have a private int[] theArray member. But at two, it probably makes sense to have a _first and a _second member the way I show. The System.Array class has a lot of members; you get to implement the ones you care about (I put all the properties in, either as a property or as a const).
In any case, it gives you an idea of a possible solution:
public class ArrayOf2Int : IEnumerable<int>
{
private readonly int _first;
private readonly int _second;
//I had the urge to make this a Lazy<object> - but it's an object, who cares
//If you implement this with an array of int (int[]) instead of two ints, delegate this to the SyncRoot of the array
public object SyncRoot { get; } = new object();
public const int Length = 2;
public const long LongLength = 2;
public const int Rank = 1;
public const bool IsFixedSize = true;
public const bool IsSynchronized = false;
public ArrayOf2Int(int first, int second)
{
_first = first;
_second = second;
}
public int this[int i]
{
get
{
if (i < 0 || i > 1)
{
throw new ArgumentOutOfRangeException(nameof(i), "Index out of range");
}
return i == 0 ? _first : _second;
}
}
public IEnumerator<int> GetEnumerator()
{
yield return _first;
yield return _second;
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
I'm using ConcurrentQueue for a shared data structure whose purpose is to hold the last N objects passed to it (a kind of history).
Assume we have a browser and we want to keep the last 100 browsed URLs. I want a queue which automatically drops (dequeues) the oldest (first) entry upon insertion of a new entry (enqueue) when the capacity gets full (100 addresses in history).
How can I accomplish that using System.Collections?
I would write a wrapper class that on Enqueue would check the Count and then Dequeue when the count exceeds the limit.
public class FixedSizedQueue<T>
{
readonly ConcurrentQueue<T> q = new ConcurrentQueue<T>();
private object lockObject = new object();
public int Limit { get; set; }
public void Enqueue(T obj)
{
q.Enqueue(obj);
lock (lockObject)
{
T overflow;
while (q.Count > Limit && q.TryDequeue(out overflow)) ;
}
}
}
I'd go for a slight variant... extend ConcurrentQueue so as to be able to use LINQ extensions on FixedSizedQueue:
public class FixedSizedQueue<T> : ConcurrentQueue<T>
{
private readonly object syncObject = new object();
public int Size { get; private set; }
public FixedSizedQueue(int size)
{
Size = size;
}
public new void Enqueue(T obj)
{
base.Enqueue(obj);
lock (syncObject)
{
while (base.Count > Size)
{
T outObj;
base.TryDequeue(out outObj);
}
}
}
}
For anyone who finds it useful, here is some working code based on Richard Schneider's answer above:
public class FixedSizedQueue<T>
{
readonly ConcurrentQueue<T> queue = new ConcurrentQueue<T>();
public int Size { get; private set; }
public FixedSizedQueue(int size)
{
Size = size;
}
public void Enqueue(T obj)
{
queue.Enqueue(obj);
while (queue.Count > Size)
{
T outObj;
queue.TryDequeue(out outObj);
}
}
}
For what it's worth, here's a lightweight circular buffer with some methods marked for safe and unsafe use.
public class CircularBuffer<T> : IEnumerable<T>
{
readonly int size;
readonly object locker;
int count;
int head;
int rear;
T[] values;
public CircularBuffer(int max)
{
this.size = max;
locker = new object();
count = 0;
head = 0;
rear = 0;
values = new T[size];
}
static int Incr(int index, int size)
{
return (index + 1) % size;
}
private void UnsafeEnsureQueueNotEmpty()
{
if (count == 0)
throw new Exception("Empty queue");
}
public int Size { get { return size; } }
public object SyncRoot { get { return locker; } }
#region Count
public int Count { get { return UnsafeCount; } }
public int SafeCount { get { lock (locker) { return UnsafeCount; } } }
public int UnsafeCount { get { return count; } }
#endregion
#region Enqueue
public void Enqueue(T obj)
{
UnsafeEnqueue(obj);
}
public void SafeEnqueue(T obj)
{
lock (locker) { UnsafeEnqueue(obj); }
}
public void UnsafeEnqueue(T obj)
{
values[rear] = obj;
if (Count == Size)
head = Incr(head, Size);
rear = Incr(rear, Size);
count = Math.Min(count + 1, Size);
}
#endregion
#region Dequeue
public T Dequeue()
{
return UnsafeDequeue();
}
public T SafeDequeue()
{
lock (locker) { return UnsafeDequeue(); }
}
public T UnsafeDequeue()
{
UnsafeEnsureQueueNotEmpty();
T res = values[head];
values[head] = default(T);
head = Incr(head, Size);
count--;
return res;
}
#endregion
#region Peek
public T Peek()
{
return UnsafePeek();
}
public T SafePeek()
{
lock (locker) { return UnsafePeek(); }
}
public T UnsafePeek()
{
UnsafeEnsureQueueNotEmpty();
return values[head];
}
#endregion
#region GetEnumerator
public IEnumerator<T> GetEnumerator()
{
return UnsafeGetEnumerator();
}
public IEnumerator<T> SafeGetEnumerator()
{
lock (locker)
{
List<T> res = new List<T>(count);
var enumerator = UnsafeGetEnumerator();
while (enumerator.MoveNext())
res.Add(enumerator.Current);
return res.GetEnumerator();
}
}
public IEnumerator<T> UnsafeGetEnumerator()
{
int index = head;
for (int i = 0; i < count; i++)
{
yield return values[index];
index = Incr(index, size);
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return this.GetEnumerator();
}
#endregion
}
I like to use the Foo()/SafeFoo()/UnsafeFoo() convention:
Foo methods call UnsafeFoo as a default.
UnsafeFoo methods modify state freely without a lock; they should only call other unsafe methods.
SafeFoo methods call UnsafeFoo methods inside a lock.
It's a little verbose, but it makes obvious errors, like calling an unsafe method outside a lock in a method that is supposed to be thread-safe, more apparent.
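To make the convention concrete, here is a small usage sketch of the buffer above (mine, not part of the original answer; it assumes System.Threading.Tasks). Only the Safe* members are used because two threads touch the buffer:
var buffer = new CircularBuffer<int>(128);

// Producer thread: SafeEnqueue takes the lock internally.
var producer = Task.Run(() =>
{
    for (int i = 0; i < 100; i++)
        buffer.SafeEnqueue(i);
});

// Consumer thread: drain until the producer is done and the buffer is empty.
var consumer = Task.Run(() =>
{
    while (!producer.IsCompleted || buffer.SafeCount > 0)
        if (buffer.SafeCount > 0)
            Console.WriteLine(buffer.SafeDequeue());
});

Task.WaitAll(producer, consumer);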
My version is just a subclass of the normal Queue... nothing special, but since everyone is participating and it still fits the topic title, I might as well put it here. It also returns the dequeued item, just in case.
public sealed class SizedQueue<T> : Queue<T>
{
public int FixedCapacity { get; }
public SizedQueue(int fixedCapacity)
{
this.FixedCapacity = fixedCapacity;
}
/// <summary>
/// If the total number of items exceeds the capacity, the oldest one is automatically dequeued.
/// </summary>
/// <returns>The dequeued value, if any.</returns>
public new T Enqueue(T item)
{
base.Enqueue(item);
if (base.Count > FixedCapacity)
{
return base.Dequeue();
}
return default;
}
}
Just because no one's said it yet: you can use a LinkedList<T> and add the thread safety yourself:
public class Buffer<T> : LinkedList<T>
{
private int capacity;
public Buffer(int capacity)
{
this.capacity = capacity;
}
public void Enqueue(T item)
{
// todo: add synchronization mechanism
if (Count == capacity) RemoveLast();
AddFirst(item);
}
public T Dequeue()
{
// todo: add synchronization mechanism
var last = Last.Value;
RemoveLast();
return last;
}
}
One thing to note is that the default enumeration order will be LIFO in this example, but that can be overridden if necessary.
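As one possible sketch of changing the order (mine, not from the answer): LinkedList<T>.GetEnumerator is not virtual, so the simplest route is to expose a separate oldest-first (FIFO) view, assuming the Buffer<T> class above:
public class FifoBuffer<T> : Buffer<T>
{
    public FifoBuffer(int capacity) : base(capacity) { }

    // Oldest-first (FIFO) view: Enqueue adds at the head, so walk from the tail backwards.
    public IEnumerable<T> OldestFirst()
    {
        for (var node = Last; node != null; node = node.Previous)
            yield return node.Value;
    }
}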
Here's my take on the fixed-size queue.
It uses a regular Queue to avoid the synchronization overhead of the Count property on ConcurrentQueue. It also implements IReadOnlyCollection so that LINQ methods can be used. The rest is very similar to the other answers here.
[Serializable]
[DebuggerDisplay("Count = {" + nameof(Count) + "}, Limit = {" + nameof(Limit) + "}")]
public class FixedSizedQueue<T> : IReadOnlyCollection<T>
{
private readonly Queue<T> _queue = new Queue<T>();
private readonly object _lock = new object();
public int Count { get { lock (_lock) { return _queue.Count; } } }
public int Limit { get; }
public FixedSizedQueue(int limit)
{
if (limit < 1)
throw new ArgumentOutOfRangeException(nameof(limit));
Limit = limit;
}
public FixedSizedQueue(IEnumerable<T> collection)
{
if (collection is null || !collection.Any())
throw new ArgumentException("Can not initialize the Queue with a null or empty collection", nameof(collection));
_queue = new Queue<T>(collection);
Limit = _queue.Count;
}
public void Enqueue(T obj)
{
lock (_lock)
{
_queue.Enqueue(obj);
while (_queue.Count > Limit)
_queue.Dequeue();
}
}
public void Clear()
{
lock (_lock)
_queue.Clear();
}
public IEnumerator<T> GetEnumerator()
{
lock (_lock)
return new List<T>(_queue).GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
Let's add one more answer. Why this over others?
1) Simplicity. Trying to guarantee size is well and good but leads to unneeded complexity that can exhibit its own problems.
2) Implements IReadOnlyCollection, meaning you can use Linq on it and pass it into a variety of things that expect IEnumerable.
3) No locking. Many of the solutions above use locks, which is incorrect on a lockless collection.
4) Implements the same set of methods, properties, and interfaces ConcurrentQueue does, including IProducerConsumerCollection, which is important if you want to use the collection with BlockingCollection.
This implementation could potentially end up with more entries than expected if TryDequeue fails, but the frequency of that occurring doesn't seem worth specialized code that will inevitably hamper performance and cause its own unexpected problems.
If you absolutely want to guarantee a size, implementing a Prune() or similar method seems like the best idea. You could use a ReaderWriterLockSlim read lock in the other methods (including TryDequeue) and take a write lock only when pruning.
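Before the answer's class below, here is a minimal hedged sketch of that Prune() idea (all names are mine, not part of the answer): normal operations take the shared read lock, and only pruning takes the exclusive write lock.
using System.Collections.Concurrent;
using System.Threading;

// Hypothetical sketch only, separate from the ConcurrentFixedSizeQueue below.
class PrunableFixedSizeQueue<T>
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
    private readonly int _maxSize;

    public PrunableFixedSizeQueue(int maxSize) { _maxSize = maxSize; }

    public void Enqueue(T item)
    {
        _lock.EnterReadLock();               // shared lock: normal operations run concurrently
        try { _queue.Enqueue(item); }
        finally { _lock.ExitReadLock(); }

        if (_queue.Count > _maxSize)
            Prune();
    }

    public bool TryDequeue(out T item)
    {
        _lock.EnterReadLock();
        try { return _queue.TryDequeue(out item); }
        finally { _lock.ExitReadLock(); }
    }

    public void Prune()
    {
        _lock.EnterWriteLock();              // exclusive lock: only trimming blocks everything else
        try
        {
            while (_queue.Count > _maxSize)
                _queue.TryDequeue(out _);
        }
        finally { _lock.ExitWriteLock(); }
    }
}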
class ConcurrentFixedSizeQueue<T> : IProducerConsumerCollection<T>, IReadOnlyCollection<T>, ICollection {
readonly ConcurrentQueue<T> m_concurrentQueue;
readonly int m_maxSize;
public int Count => m_concurrentQueue.Count;
public bool IsEmpty => m_concurrentQueue.IsEmpty;
public ConcurrentFixedSizeQueue (int maxSize) : this(Array.Empty<T>(), maxSize) { }
public ConcurrentFixedSizeQueue (IEnumerable<T> initialCollection, int maxSize) {
if (initialCollection == null) {
throw new ArgumentNullException(nameof(initialCollection));
}
m_concurrentQueue = new ConcurrentQueue<T>(initialCollection);
m_maxSize = maxSize;
}
public void Enqueue (T item) {
m_concurrentQueue.Enqueue(item);
if (m_concurrentQueue.Count > m_maxSize) {
T result;
m_concurrentQueue.TryDequeue(out result);
}
}
public void TryPeek (out T result) => m_concurrentQueue.TryPeek(out result);
public bool TryDequeue (out T result) => m_concurrentQueue.TryDequeue(out result);
public void CopyTo (T[] array, int index) => m_concurrentQueue.CopyTo(array, index);
public T[] ToArray () => m_concurrentQueue.ToArray();
public IEnumerator<T> GetEnumerator () => m_concurrentQueue.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator () => GetEnumerator();
// Explicit ICollection implementations.
void ICollection.CopyTo (Array array, int index) => ((ICollection)m_concurrentQueue).CopyTo(array, index);
object ICollection.SyncRoot => ((ICollection) m_concurrentQueue).SyncRoot;
bool ICollection.IsSynchronized => ((ICollection) m_concurrentQueue).IsSynchronized;
// Explicit IProducerConsumerCollection<T> implementations.
bool IProducerConsumerCollection<T>.TryAdd (T item) => ((IProducerConsumerCollection<T>) m_concurrentQueue).TryAdd(item);
bool IProducerConsumerCollection<T>.TryTake (out T item) => ((IProducerConsumerCollection<T>) m_concurrentQueue).TryTake(out item);
public override int GetHashCode () => m_concurrentQueue.GetHashCode();
public override bool Equals (object obj) => m_concurrentQueue.Equals(obj);
public override string ToString () => m_concurrentQueue.ToString();
}
Just for fun, here is another implementation that I believe addresses most of the commenters' concerns. In particular, thread-safety is achieved without locking and the implementation is hidden by the wrapping class.
public class FixedSizeQueue<T> : IReadOnlyCollection<T>
{
private ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
private int _count;
public int Limit { get; private set; }
public FixedSizeQueue(int limit)
{
this.Limit = limit;
}
public void Enqueue(T obj)
{
_queue.Enqueue(obj);
Interlocked.Increment(ref _count);
// Calculate the number of items to be removed by this thread in a thread safe manner
int currentCount;
int finalCount;
do
{
currentCount = _count;
finalCount = Math.Min(currentCount, this.Limit);
} while (currentCount !=
Interlocked.CompareExchange(ref _count, finalCount, currentCount));
T overflow;
while (currentCount > finalCount && _queue.TryDequeue(out overflow))
currentCount--;
}
public int Count
{
get { return _count; }
}
public IEnumerator<T> GetEnumerator()
{
return _queue.GetEnumerator();
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return _queue.GetEnumerator();
}
}
Well, it depends on the use. I have noticed that some of the above solutions may exceed the size when used in a multi-threaded environment. Anyway, my use case was to display the last 5 events, with multiple threads writing events into the queue and one other thread reading from it and displaying it in a WinForms control. So this was my solution.
EDIT: Since we are already using a lock within our implementation, we don't really need ConcurrentQueue; dropping it may improve performance.
class FixedSizedConcurrentQueue<T>
{
readonly Queue<T> queue = new Queue<T>();
readonly object syncObject = new object();
public int MaxSize { get; private set; }
public FixedSizedConcurrentQueue(int maxSize)
{
MaxSize = maxSize;
}
public void Enqueue(T obj)
{
lock (syncObject)
{
queue.Enqueue(obj);
while (queue.Count > MaxSize)
{
queue.Dequeue();
}
}
}
public T[] ToArray()
{
T[] result = null;
lock (syncObject)
{
result = queue.ToArray();
}
return result;
}
public void Clear()
{
lock (syncObject)
{
queue.Clear();
}
}
}
EDIT: We don't really need syncObject in the above example; we could use the queue object itself instead, since we are not re-initializing queue in any function and it is marked as readonly anyway.
The accepted answer is going to have avoidable side-effects.
The links below are references that I used when I wrote my example below.
The documentation from Microsoft is a bit misleading: they do use a lock, but they lock the segment classes. The segment classes themselves use Interlocked.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
namespace Lib.Core
{
// Sources:
// https://learn.microsoft.com/en-us/dotnet/standard/collections/thread-safe/
// https://learn.microsoft.com/en-us/dotnet/api/system.threading.interlocked?view=netcore-3.1
// https://github.com/dotnet/runtime/blob/master/src/libraries/System.Private.CoreLib/src/System/Collections/Concurrent/ConcurrentQueue.cs
// https://github.com/dotnet/runtime/blob/master/src/libraries/System.Private.CoreLib/src/System/Collections/Concurrent/ConcurrentQueueSegment.cs
/// <summary>
/// Concurrent-safe circular buffer that uses the fixed capacity specified and reuses slots as it goes.
/// </summary>
/// <typeparam name="TObject">The object that you want to go into the slots.</typeparam>
public class ConcurrentCircularBuffer<TObject>
{
private readonly ConcurrentQueue<TObject> _queue;
public int Capacity { get; private set; }
public ConcurrentCircularBuffer(int capacity)
{
if(capacity <= 0)
{
throw new ArgumentException($"The capacity specified '{capacity}' is not valid.", nameof(capacity));
}
// Setup the queue to the initial capacity using List's underlying implementation.
_queue = new ConcurrentQueue<TObject>(new List<TObject>(capacity));
Capacity = capacity;
}
public void Enqueue(TObject @object)
{
// Enforce the capacity first so the head can be used instead of the entire segment (slow).
while (_queue.Count + 1 > Capacity)
{
if (!_queue.TryDequeue(out _))
{
// Handle error condition however you want to ie throw, return validation object, etc.
var ex = new Exception("Concurrent Dequeue operation failed.");
ex.Data.Add("EnqueueObject", #object);
throw ex;
}
}
// Place the item into the queue
_queue.Enqueue(@object);
}
public TObject Dequeue()
{
if(_queue.TryDequeue(out var result))
{
return result;
}
return default;
}
}
}
For your coding pleasure I submit to you the 'ConcurrentDeck'
public class ConcurrentDeck<T>
{
private readonly int _size;
private readonly T[] _buffer;
private int _position = 0;
public ConcurrentDeck(int size)
{
_size = size;
_buffer = new T[size];
}
public void Push(T item)
{
lock (this)
{
_buffer[_position] = item;
_position++;
if (_position == _size) _position = 0;
}
}
public T[] ReadDeck()
{
lock (this)
{
return _buffer.Skip(_position).Union(_buffer.Take(_position)).ToArray();
}
}
}
Example Usage:
void Main()
{
var deck = new ConcurrentDeck<Tuple<string,DateTime>>(25);
var handle = new ManualResetEventSlim();
var task1 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task1",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(1).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
var task2 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task2",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(.5).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
var task3 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task3",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(.25).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
System.Threading.Thread.Sleep(TimeSpan.FromSeconds(10));
handle.Set();
var outputtime = DateTime.Now;
deck.ReadDeck().Select(d => new {Message = d.Item1, MilliDiff = (outputtime - d.Item2).TotalMilliseconds}).Dump(true);
}
Here is yet another implementation that uses the underlying ConcurrentQueue as much as possible while providing the same interfaces made available via ConcurrentQueue.
/// <summary>
/// This is a FIFO concurrent queue that will remove the oldest added items when a given limit is reached.
/// </summary>
/// <typeparam name="TValue"></typeparam>
public class FixedSizedConcurrentQueue<TValue> : IProducerConsumerCollection<TValue>, IReadOnlyCollection<TValue>
{
private readonly ConcurrentQueue<TValue> _queue;
private readonly object _syncObject = new object();
public int LimitSize { get; }
public FixedSizedConcurrentQueue(int limit)
{
_queue = new ConcurrentQueue<TValue>();
LimitSize = limit;
}
public FixedSizedConcurrentQueue(int limit, System.Collections.Generic.IEnumerable<TValue> collection)
{
_queue = new ConcurrentQueue<TValue>(collection);
LimitSize = limit;
}
public int Count => _queue.Count;
bool ICollection.IsSynchronized => ((ICollection) _queue).IsSynchronized;
object ICollection.SyncRoot => ((ICollection)_queue).SyncRoot;
public bool IsEmpty => _queue.IsEmpty;
// Not supported until .NET Standard 2.1
//public void Clear() => _queue.Clear();
public void CopyTo(TValue[] array, int index) => _queue.CopyTo(array, index);
void ICollection.CopyTo(Array array, int index) => ((ICollection)_queue).CopyTo(array, index);
public void Enqueue(TValue obj)
{
_queue.Enqueue(obj);
lock( _syncObject )
{
while( _queue.Count > LimitSize ) {
_queue.TryDequeue(out _);
}
}
}
public IEnumerator<TValue> GetEnumerator() => _queue.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable<TValue>)this).GetEnumerator();
public TValue[] ToArray() => _queue.ToArray();
public bool TryAdd(TValue item)
{
Enqueue(item);
return true;
}
bool IProducerConsumerCollection<TValue>.TryTake(out TValue item) => TryDequeue(out item);
public bool TryDequeue(out TValue result) => _queue.TryDequeue(out result);
public bool TryPeek(out TValue result) => _queue.TryPeek(out result);
}
using System.Collections.Concurrent;
public class FixedSizeQueue<T>
{
ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
private void Enque(T obj)
{
T temp;
if (_queue.Count > 99)
{
// Remove one of the oldest added items.
_queue.TryDequeue(out temp);
}
_queue.Enqueue(obj);
}
private bool Dequeue(out T obj)
{
return _queue.TryDequeue(out obj);
}
private void Clear()
{
T obj;
// It does not fall into an infinite loop; it clears the contents present at this point in time.
int cnt = _queue.Count;
for (; cnt > 0; cnt--)
{
_queue.TryDequeue(out obj);
}
}
}
This is my version of the queue:
public class FixedSizedQueue<T> {
private object LOCK = new object();
ConcurrentQueue<T> queue;
public int MaxSize { get; set; }
public FixedSizedQueue(int maxSize, IEnumerable<T> items = null) {
this.MaxSize = maxSize;
if (items == null) {
queue = new ConcurrentQueue<T>();
}
else {
queue = new ConcurrentQueue<T>(items);
EnsureLimitConstraint();
}
}
public void Enqueue(T obj) {
queue.Enqueue(obj);
EnsureLimitConstraint();
}
private void EnsureLimitConstraint() {
if (queue.Count > MaxSize) {
lock (LOCK) {
T overflow;
while (queue.Count > MaxSize) {
queue.TryDequeue(out overflow);
}
}
}
}
/// <summary>
/// returns the current snapshot of the queue
/// </summary>
/// <returns></returns>
public T[] GetSnapshot() {
return queue.ToArray();
}
}
I find it useful to have a constructor built on an IEnumerable, and to have a GetSnapshot that returns a thread-safe list (an array in this case) of the items at the moment of the call, which doesn't raise errors if the underlying collection changes.
The double Count check is there to avoid taking the lock in some circumstances.
I want to do the same as in this question, that is:
enum DaysOfTheWeek {Sunday=0, Monday, Tuesday...};
string[] message_array = new string[number_of_items_at_enum];
...
Console.Write(custom_array[(int)DaysOfTheWeek.Sunday]);
However, I would rather have something built in than write this error-prone code. Is there a built-in facility in C# that does just this?
If the values of your enum items are contiguous, the array method works pretty well. However, in any case, you could use a Dictionary<DayOfTheWeek, string> (which is less performant, by the way).
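For example, a minimal sketch of the Dictionary approach using the question's enum (the message strings are just placeholders):
// One entry per enum member; lookups are by enum value, not by raw index.
var messages = new Dictionary<DaysOfTheWeek, string>
{
    { DaysOfTheWeek.Sunday, "Closed" },
    { DaysOfTheWeek.Monday, "Open 9-5" },
    // ... remaining days
};
Console.Write(messages[DaysOfTheWeek.Sunday]);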
Since C# 7.3 it has been possible to use System.Enum as a constraint on type parameters, so the nasty hacks in some of the other answers are no longer required.
Here's a very simple ArrayByEnum class that does exactly what the question asked.
Note that it will waste space if the enum values are non-contiguous, and it won't cope with enum values that are too large for an int. I did say this example was very simple.
/// <summary>An array indexed by an Enum</summary>
/// <typeparam name="T">Type stored in array</typeparam>
/// <typeparam name="U">Indexer Enum type</typeparam>
public class ArrayByEnum<T,U> : IEnumerable where U : Enum // requires C# 7.3 or later
{
private readonly T[] _array;
private readonly int _lower;
public ArrayByEnum()
{
_lower = Convert.ToInt32(Enum.GetValues(typeof(U)).Cast<U>().Min());
int upper = Convert.ToInt32(Enum.GetValues(typeof(U)).Cast<U>().Max());
_array = new T[1 + upper - _lower];
}
public T this[U key]
{
get { return _array[Convert.ToInt32(key) - _lower]; }
set { _array[Convert.ToInt32(key) - _lower] = value; }
}
public IEnumerator GetEnumerator()
{
return Enum.GetValues(typeof(U)).Cast<U>().Select(i => this[i]).GetEnumerator();
}
}
Usage:
ArrayByEnum<string,MyEnum> myArray = new ArrayByEnum<string,MyEnum>();
myArray[MyEnum.First] = "Hello";
myArray[YourEnum.Other] = "World"; // compiler error
You could make a class or struct that could do the work for you
public class Caster
{
public enum DayOfWeek
{
Sunday = 0,
Monday,
Tuesday,
Wednesday,
Thursday,
Friday,
Saturday
}
public Caster() {}
public Caster(string[] data) { this.Data = data; }
public string this[DayOfWeek dow]{
get { return this.Data[(int)dow]; }
}
public string[] Data { get; set; }
public static implicit operator string[](Caster caster) { return caster.Data; }
public static implicit operator Caster(string[] data) { return new Caster(data); }
}
class Program
{
static void Main(string[] args)
{
Caster message_array = new string[7];
Console.Write(message_array[Caster.DayOfWeek.Sunday]);
}
}
EDIT
For lack of a better place to put this, I am posting a generic version of the Caster class below. Unfortunately, it relies on runtime checks to enforce TKey as an enum.
public enum DayOfWeek
{
Weekend,
Sunday = 0,
Monday,
Tuesday,
Wednesday,
Thursday,
Friday,
Saturday
}
public class TypeNotSupportedException : ApplicationException
{
public TypeNotSupportedException(Type type)
: base(string.Format("The type \"{0}\" is not supported in this context.", type.Name))
{
}
}
public class CannotBeIndexerException : ApplicationException
{
public CannotBeIndexerException(Type enumUnderlyingType, Type indexerType)
: base(
string.Format("The base type of the enum (\"{0}\") cannot be safely cast to \"{1}\".",
enumUnderlyingType.Name, indexerType)
)
{
}
}
public class Caster<TKey, TValue>
{
private readonly Type baseEnumType;
public Caster()
{
baseEnumType = typeof(TKey);
if (!baseEnumType.IsEnum)
throw new TypeNotSupportedException(baseEnumType);
}
public Caster(TValue[] data)
: this()
{
Data = data;
}
public TValue this[TKey key]
{
get
{
var enumUnderlyingType = Enum.GetUnderlyingType(baseEnumType);
var intType = typeof(int);
if (!enumUnderlyingType.IsAssignableFrom(intType))
throw new CannotBeIndexerException(enumUnderlyingType, intType);
var index = (int) Enum.Parse(baseEnumType, key.ToString());
return Data[index];
}
}
public TValue[] Data { get; set; }
public static implicit operator TValue[](Caster<TKey, TValue> caster)
{
return caster.Data;
}
public static implicit operator Caster<TKey, TValue>(TValue[] data)
{
return new Caster<TKey, TValue>(data);
}
}
// declaring and using it.
Caster<DayOfWeek, string> messageArray =
new[]
{
"Sunday",
"Monday",
"Tuesday",
"Wednesday",
"Thursday",
"Friday",
"Saturday"
};
Console.WriteLine(messageArray[DayOfWeek.Sunday]);
Console.WriteLine(messageArray[DayOfWeek.Monday]);
Console.WriteLine(messageArray[DayOfWeek.Tuesday]);
Console.WriteLine(messageArray[DayOfWeek.Wednesday]);
Console.WriteLine(messageArray[DayOfWeek.Thursday]);
Console.WriteLine(messageArray[DayOfWeek.Friday]);
Console.WriteLine(messageArray[DayOfWeek.Saturday]);
Here you go:
string[] message_array = Enum.GetNames(typeof(DaysOfTheWeek));
If you really need the length, then just take the .Length on the result :)
You can get the values with (Enum.GetValues returns a System.Array, so cast it):
DaysOfTheWeek[] value_array = (DaysOfTheWeek[])Enum.GetValues(typeof(DaysOfTheWeek));
A compact form: the enum is used as the key of a Dictionary, assigning whatever value type you like while staying strongly typed. In this case float values are returned, but the values could be complex class instances with properties, methods, and more:
enum opacityLevel { Min, Default, Max }
private static readonly Dictionary<opacityLevel, float> _oLevels = new Dictionary<opacityLevel, float>
{
{ opacityLevel.Max, 40.0f },
{ opacityLevel.Default, 50.0f },
{ opacityLevel.Min, 100.0f }
};
//Access the float value like this
var x = _oLevels[opacityLevel.Default];
If all you need is essentially a map, but don't want to incur performance overhead associated with dictionary lookups, this might work:
public class EnumIndexedArray<TKey, T> : IEnumerable<KeyValuePair<TKey, T>> where TKey : struct
{
public EnumIndexedArray()
{
if (!typeof (TKey).IsEnum) throw new InvalidOperationException("Generic type argument is not an Enum");
var size = Convert.ToInt32(Keys.Max()) + 1;
Values = new T[size];
}
protected T[] Values;
public static IEnumerable<TKey> Keys
{
get { return Enum.GetValues(typeof (TKey)).OfType<TKey>(); }
}
public T this[TKey index]
{
get { return Values[Convert.ToInt32(index)]; }
set { Values[Convert.ToInt32(index)] = value; }
}
private IEnumerable<KeyValuePair<TKey, T>> CreateEnumerable()
{
return Keys.Select(key => new KeyValuePair<TKey, T>(key, Values[Convert.ToInt32(key)]));
}
public IEnumerator<KeyValuePair<TKey, T>> GetEnumerator()
{
return CreateEnumerable().GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
So in your case you could derive:
class DaysOfWeekToStringsMap:EnumIndexedArray<DayOfWeek,string>{};
Usage:
var map = new DaysOfWeekToStringsMap();
//using the Keys static property
foreach(var day in DaysOfWeekToStringsMap.Keys){
map[day] = day.ToString();
}
foreach(var day in DaysOfWeekToStringsMap.Keys){
Console.WriteLine("map[{0}]={1}",day, map[day]);
}
// using iterator
foreach(var value in map){
Console.WriteLine("map[{0}]={1}",value.Key, value.Value);
}
Obviously this implementation is backed by an array, so non-contiguous enums like this:
enum
{
Ok = 1,
NotOk = 1000000
}
would result in excessive memory usage.
If you require maximum possible performance you might want to make it less generic and lose all the generic enum-handling code I had to use to get it to compile and work. I didn't benchmark this, though, so maybe it's no big deal.
Caching the Keys static property might also help.
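As a sketch of that caching suggestion (a hypothetical drop-in replacement for the Keys property in the class above, assuming System.Linq is imported):
// Evaluate Enum.GetValues once per closed generic type instead of on every access.
private static readonly TKey[] CachedKeys =
    Enum.GetValues(typeof(TKey)).OfType<TKey>().ToArray();

public static IEnumerable<TKey> Keys
{
    get { return CachedKeys; }
}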
I realize this is an old question, but there have been a number of comments about the fact that all solutions so far use run-time checks to ensure the type parameter is an enum. Here is a complete solution with compile-time checks (with some examples, as well as some comments and discussion from my fellow developers).
//There is no good way to constrain a generic class parameter to an Enum. The hack below does work at compile time,
// though it is convoluted. For examples of how to use the two classes EnumIndexedArray and ObjEnumIndexedArray,
// see AssetClassArray below. Or, e.g.
// EConstraint.EnumIndexedArray<int, YourEnum> x = new EConstraint.EnumIndexedArray<int, YourEnum>();
// See this post
// http://stackoverflow.com/questions/79126/create-generic-method-constraining-t-to-an-enum/29581813#29581813
// and the answer/comments by Julien Lebosquain
public class EConstraint : HackForCompileTimeConstraintOfTEnumToAnEnum<System.Enum> { }//THIS MUST BE THE ONLY IMPLEMENTATION OF THE ABSTRACT HackForCompileTimeConstraintOfTEnumToAnEnum
public abstract class HackForCompileTimeConstraintOfTEnumToAnEnum<SystemEnum> where SystemEnum : class
{
//For object types T, users should use EnumIndexedObjectArray below.
public class EnumIndexedArray<T, TEnum>
where TEnum : struct, SystemEnum
{
//Needs to be public so that we can easily do things like intIndexedArray.data.sum()
// - just not worth writing up all the equivalent methods, and we can't inherit from T[] and guarantee proper initialization.
//Also, note that we cannot use Length here for initialization, even if Length were defined the same as GetNumEnums up to
// static qualification, because we cannot use a non-static for initialization here.
// Since we want Length to be non-static, in keeping with other definitions of the Length property, we define the separate static
// GetNumEnums, and then define the non-static Length in terms of the actual size of the data array, just for clarity,
// safety and certainty (in case someone does something stupid like resizing data).
public T[] data = new T[GetNumEnums()];
//First, a couple of statics allowing easy use of the enums themselves.
public static TEnum[] GetEnums()
{
return (TEnum[])Enum.GetValues(typeof(TEnum));
}
public TEnum[] getEnums()
{
return GetEnums();
}
//Provide a static method of getting the number of enums. The Length property also returns this, but it is not static and cannot be used in many circumstances.
public static int GetNumEnums()
{
return GetEnums().Length;
}
//This should always return the same as GetNumEnums, but is not static and does it in a way that guarantees consistency with the member array.
public int Length { get { return data.Length; } }
//public int Count { get { return data.Length; } }
public EnumIndexedArray() { }
// [WDS 2015-04-17] Remove. This can be dangerous. Just force people to use EnumIndexedArray(T[] inputArray).
// [DIM 2015-04-18] Actually, if you think about it, EnumIndexedArray(T[] inputArray) is just as dangerous:
// For value types, both are fine. For object types, the latter causes each object in the input array to be referenced twice,
// while the former causes the single object t to be multiply referenced. Two references to each of many is no less dangerous
// than 3 or more references to one. So all of these are dangerous for object types.
// We could remove all these ctors from this base class, and create a separate
// EnumIndexedValueArray<T, TEnum> : EnumIndexedArray<T, TEnum> where T: struct ...
// but then specializing to TEnum = AssetClass would have to be done twice below, once for value types and once
// for object types, with a repetition of all the property definitions. Violating the DRY principle that much
// just to protect against stupid usage, clearly documented as dangerous, is not worth it IMHO.
public EnumIndexedArray(T t)
{
int i = Length;
while (--i >= 0)
{
this[i] = t;
}
}
public EnumIndexedArray(T[] inputArray)
{
if (inputArray.Length > Length)
{
throw new Exception(string.Format("Length of enum-indexed array ({0}) to big. Can't be more than {1}.", inputArray.Length, Length));
}
Array.Copy(inputArray, data, inputArray.Length);
}
public EnumIndexedArray(EnumIndexedArray<T, TEnum> inputArray)
{
Array.Copy(inputArray.data, data, data.Length);
}
//Clean data access
public T this[int ac] { get { return data[ac]; } set { data[ac] = value; } }
public T this[TEnum ac] { get { return data[Convert.ToInt32(ac)]; } set { data[Convert.ToInt32(ac)] = value; } }
}
public class EnumIndexedObjectArray<T, TEnum> : EnumIndexedArray<T, TEnum>
where TEnum : struct, SystemEnum
where T : new()
{
public EnumIndexedObjectArray(bool doInitializeWithNewObjects = true)
{
if (doInitializeWithNewObjects)
{
for (int i = Length; i > 0; this[--i] = new T()) ;
}
}
// The other ctor's are dangerous for object arrays
}
public class EnumIndexedArrayComparator<T, TEnum> : EqualityComparer<EnumIndexedArray<T, TEnum>>
where TEnum : struct, SystemEnum
{
private readonly EqualityComparer<T> elementComparer = EqualityComparer<T>.Default;
public override bool Equals(EnumIndexedArray<T, TEnum> lhs, EnumIndexedArray<T, TEnum> rhs)
{
if (lhs == rhs)
return true;
if (lhs == null || rhs == null)
return false;
//These cases should not be possible because of the way these classes are constructed.
// HOWEVER, the data member is public, so somebody _could_ do something stupid and make
// data=null, or make lhs.data == rhs.data, even though lhs!=rhs (above check)
//On the other hand, these are just optimizations, so it won't be an issue if we reomve them anyway,
// Unless someone does something really dumb like setting .data to null or resizing to an incorrect size,
// in which case things will crash, but any developer who does this deserves to have it crash painfully...
//if (lhs.data == rhs.data)
// return true;
//if (lhs.data == null || rhs.data == null)
// return false;
int i = lhs.Length;
//if (rhs.Length != i)
// return false;
while (--i >= 0)
{
if (!elementComparer.Equals(lhs[i], rhs[i]))
return false;
}
return true;
}
public override int GetHashCode(EnumIndexedArray<T, TEnum> enumIndexedArray)
{
//This doesn't work: for two arrays ar1 and ar2, ar1.GetHashCode() != ar2.GetHashCode() even when ar1[i]==ar2[i] for all i (unless of course they are the exact same array object)
//return engineArray.GetHashCode();
//Code taken from comment by Jon Skeet - of course - in http://stackoverflow.com/questions/7244699/gethashcode-on-byte-array
//31 and 17 are used commonly elsewhere, but maybe because everyone is using Skeet's post.
//On the other hand, this is really not very critical.
unchecked
{
int hash = 17;
int i = enumIndexedArray.Length;
while (--i >= 0)
{
hash = hash * 31 + elementComparer.GetHashCode(enumIndexedArray[i]);
}
return hash;
}
}
}
}
//Because of the above hack, this fails at compile time - as it should. It would, otherwise, only fail at run time.
//public class ThisShouldNotCompile : EConstraint.EnumIndexedArray<int, bool>
//{
//}
//An example
public enum AssetClass { Ir, FxFwd, Cm, Eq, FxOpt, Cr };
public class AssetClassArrayComparator<T> : EConstraint.EnumIndexedArrayComparator<T, AssetClass> { }
public class AssetClassIndexedArray<T> : EConstraint.EnumIndexedArray<T, AssetClass>
{
public AssetClassIndexedArray()
{
}
public AssetClassIndexedArray(T t) : base(t)
{
}
public AssetClassIndexedArray(T[] inputArray) : base(inputArray)
{
}
public AssetClassIndexedArray(EConstraint.EnumIndexedArray<T, AssetClass> inputArray) : base(inputArray)
{
}
public T Cm { get { return this[AssetClass.Cm ]; } set { this[AssetClass.Cm ] = value; } }
public T FxFwd { get { return this[AssetClass.FxFwd]; } set { this[AssetClass.FxFwd] = value; } }
public T Ir { get { return this[AssetClass.Ir ]; } set { this[AssetClass.Ir ] = value; } }
public T Eq { get { return this[AssetClass.Eq ]; } set { this[AssetClass.Eq ] = value; } }
public T FxOpt { get { return this[AssetClass.FxOpt]; } set { this[AssetClass.FxOpt] = value; } }
public T Cr { get { return this[AssetClass.Cr ]; } set { this[AssetClass.Cr ] = value; } }
}
//Inherit from AssetClassArray<T>, not EnumIndexedObjectArray<T, AssetClass>, so we get the benefit of the public access getters and setters above
public class AssetClassIndexedObjectArray<T> : AssetClassIndexedArray<T> where T : new()
{
public AssetClassIndexedObjectArray(bool bInitializeWithNewObjects = true)
{
if (bInitializeWithNewObjects)
{
for (int i = Length; i > 0; this[--i] = new T()) ;
}
}
}
EDIT:
If you are using C# 7.3 or later, PLEASE don't use this ugly solution. See Ian Goldby's answer from 2018.
You can always do some extra mapping to get an array index of an enum value in a consistent and defined way:
int ArrayIndexFromDaysOfTheWeekEnum(DaysOfWeek day)
{
switch (day)
{
case DaysOfWeek.Sunday: return 0;
case DaysOfWeek.Monday: return 1;
...
default: throw ...;
}
}
Be as specific as you can. One day someone will modify your enum and the code will fail because the enum's value was (mis)used as an array index.
For future reference the above problem can be summarized as follows:
I come from Delphi where you can define an array as follows:
type
{$SCOPEDENUMS ON}
TDaysOfTheWeek = (Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday);
TDaysOfTheWeekStrings = array[TDaysOfTheWeek] of string;
Then you can iterate through the array using Min and Max:
for Dow := Min(TDaysOfTheWeek) to Max(TDaysOfTheWeek) do
DaysOfTheWeekStrings[Dow] := '';
Though this is quite a contrived example, when dealing with array positions later in the code I can just type DaysOfTheWeekStrings[TDaysOfTheWeek.Monday]. This has the advantage that, should TDaysOfTheWeek grow, I do not have to remember the new size of the array, etc. However, back to the C# world: I have found this C# Enum Array Example.
It was a very good answer by @ian-goldby, but it didn't address the issue raised by @zar-shardan, which is an issue I hit myself. Below is my take on a solution, with an extension class for converting an IEnumerable, and a test class below that:
/// <summary>
/// An array indexed by an enumerated type instead of an integer
/// </summary>
public class ArrayIndexedByEnum<TKey, TElement> : IEnumerable<TElement> where TKey : Enum
{
private readonly Array _array;
private readonly Dictionary<TKey, TElement> _dictionary;
/// <summary>
/// Creates the initial array, populated with the defaults for TElement
/// </summary>
public ArrayIndexedByEnum()
{
var min = Convert.ToInt64(Enum.GetValues(typeof(TKey)).Cast<TKey>().Min());
var max = Convert.ToInt64(Enum.GetValues(typeof(TKey)).Cast<TKey>().Max());
var size = max - min + 1;
// Check that we aren't creating a ridiculously big array, if we are,
// then use a dictionary instead
if (min >= Int32.MinValue &&
max <= Int32.MaxValue &&
size < Enum.GetValues(typeof(TKey)).Length * 3L)
{
var lowerBound = Convert.ToInt32(min);
var upperBound = Convert.ToInt32(max);
_array = Array.CreateInstance(typeof(TElement), new int[] {(int)size }, new int[] { lowerBound });
}
else
{
_dictionary = new Dictionary<TKey, TElement>();
foreach (var value in Enum.GetValues(typeof(TKey)).Cast<TKey>())
{
_dictionary[value] = default(TElement);
}
}
}
/// <summary>
/// Gets the element by enumerated type
/// </summary>
public TElement this[TKey key]
{
get => (TElement)(_array?.GetValue(Convert.ToInt32(key)) ?? _dictionary[key]);
set
{
if (_array != null)
{
_array.SetValue(value, Convert.ToInt32(key));
}
else
{
_dictionary[key] = value;
}
}
}
/// <summary>
/// Gets a generic enumerator
/// </summary>
public IEnumerator<TElement> GetEnumerator()
{
return Enum.GetValues(typeof(TKey)).Cast<TKey>().Select(k => this[k]).GetEnumerator();
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
Here's the extension class:
/// <summary>
/// Extensions for converting IEnumerable<TElement> to ArrayIndexedByEnum
/// </summary>
public static class ArrayIndexedByEnumExtensions
{
/// <summary>
/// Creates a ArrayIndexedByEnumExtensions from an System.Collections.Generic.IEnumerable
/// according to specified key selector and element selector functions.
/// </summary>
public static ArrayIndexedByEnum<TKey, TElement> ToArrayIndexedByEnum<TSource, TKey, TElement>(this IEnumerable<TSource> source, Func<TSource, TKey> keySelector, Func<TSource, TElement> elementSelector) where TKey : Enum
{
var array = new ArrayIndexedByEnum<TKey, TElement>();
foreach(var item in source)
{
array[keySelector(item)] = elementSelector(item);
}
return array;
}
/// <summary>
/// Creates a ArrayIndexedByEnum from an System.Collections.Generic.IEnumerable
/// according to a specified key selector function.
/// </summary>
public static ArrayIndexedByEnum<TKey, TSource> ToArrayIndexedByEnum<TSource, TKey>(this IEnumerable<TSource> source, Func<TSource, TKey> keySelector) where TKey : Enum
{
return source.ToArrayIndexedByEnum(keySelector, i => i);
}
}
And here are my tests:
[TestClass]
public class ArrayIndexedByEnumUnitTest
{
private enum OddNumbersEnum : UInt16
{
One = 1,
Three = 3,
Five = 5,
Seven = 7,
Nine = 9
}
private enum PowersOf2 : Int64
{
TwoP0 = 1,
TwoP1 = 2,
TwoP2 = 4,
TwoP3 = 8,
TwoP4 = 16,
TwoP5 = 32,
TwoP6 = 64,
TwoP7 = 128,
TwoP8 = 256,
TwoP9 = 512,
TwoP10 = 1_024,
TwoP11 = 2_048,
TwoP12 = 4_096,
TwoP13 = 8_192,
TwoP14 = 16_384,
TwoP15 = 32_768,
TwoP16 = 65_536,
TwoP17 = 131_072,
TwoP18 = 262_144,
TwoP19 = 524_288,
TwoP20 = 1_048_576,
TwoP21 = 2_097_152,
TwoP22 = 4_194_304,
TwoP23 = 8_388_608,
TwoP24 = 16_777_216,
TwoP25 = 33_554_432,
TwoP26 = 67_108_864,
TwoP27 = 134_217_728,
TwoP28 = 268_435_456,
TwoP29 = 536_870_912,
TwoP30 = 1_073_741_824,
TwoP31 = 2_147_483_648,
TwoP32 = 4_294_967_296,
TwoP33 = 8_589_934_592,
TwoP34 = 17_179_869_184,
TwoP35 = 34_359_738_368,
TwoP36 = 68_719_476_736,
TwoP37 = 137_438_953_472,
TwoP38 = 274_877_906_944,
TwoP39 = 549_755_813_888,
TwoP40 = 1_099_511_627_776,
TwoP41 = 2_199_023_255_552,
TwoP42 = 4_398_046_511_104,
TwoP43 = 8_796_093_022_208,
TwoP44 = 17_592_186_044_416,
TwoP45 = 35_184_372_088_832,
TwoP46 = 70_368_744_177_664,
TwoP47 = 140_737_488_355_328,
TwoP48 = 281_474_976_710_656,
TwoP49 = 562_949_953_421_312,
TwoP50 = 1_125_899_906_842_620,
TwoP51 = 2_251_799_813_685_250,
TwoP52 = 4_503_599_627_370_500,
TwoP53 = 9_007_199_254_740_990,
TwoP54 = 18_014_398_509_482_000,
TwoP55 = 36_028_797_018_964_000,
TwoP56 = 72_057_594_037_927_900,
TwoP57 = 144_115_188_075_856_000,
TwoP58 = 288_230_376_151_712_000,
TwoP59 = 576_460_752_303_423_000,
TwoP60 = 1_152_921_504_606_850_000,
}
[TestMethod]
public void TestSimpleArray()
{
var array = new ArrayIndexedByEnum<OddNumbersEnum, string>();
var odds = Enum.GetValues(typeof(OddNumbersEnum)).Cast<OddNumbersEnum>().ToList();
// Store all the values
foreach (var odd in odds)
{
array[odd] = odd.ToString();
}
// Check the retrieved values are the same as what was stored
foreach (var odd in odds)
{
Assert.AreEqual(odd.ToString(), array[odd]);
}
}
[TestMethod]
public void TestPossiblyHugeArray()
{
var array = new ArrayIndexedByEnum<PowersOf2, string>();
var powersOf2s = Enum.GetValues(typeof(PowersOf2)).Cast<PowersOf2>().ToList();
// Store all the values
foreach (var powerOf2 in powersOf2s)
{
array[powerOf2] = powerOf2.ToString();
}
// Check the retrieved values are the same as what was stored
foreach (var powerOf2 in powersOf2s)
{
Assert.AreEqual(powerOf2.ToString(), array[powerOf2]);
}
}
}
I am looking for a .NET implementation of a priority queue or heap data structure
Priority queues are data structures that provide more flexibility than simple sorting, because they allow new elements to enter a system at arbitrary intervals. It is much more cost-effective to insert a new job into a priority queue than to re-sort everything on each such arrival.
The basic priority queue supports three primary operations:
Insert(Q, x). Given an item x with key k, insert it into the priority queue Q.
Find-Minimum(Q). Return a pointer to the item whose key value is smaller than any other key in the priority queue Q.
Delete-Minimum(Q). Remove the item from the priority queue Q whose key is minimum.
Unless I am looking in the wrong place, there isn't one in the framework. Is anyone aware of a good one, or should I roll my own?
You might like IntervalHeap from the C5 Generic Collection Library. To quote the user guide:
Class IntervalHeap<T> implements interface IPriorityQueue<T> using an interval heap stored as an array of pairs. The FindMin and FindMax operations, and the indexer's get-accessor, take time O(1). The DeleteMin, DeleteMax, Add and Update operations, and the indexer's set-accessor, take time O(log n). In contrast to an ordinary priority queue, an interval heap offers both minimum and maximum operations with the same efficiency.
The API is simple enough
> var heap = new C5.IntervalHeap<int>();
> heap.Add(10);
> heap.Add(5);
> heap.FindMin();
5
Install from Nuget https://www.nuget.org/packages/C5 or GitHub https://github.com/sestoft/C5/
Here's my attempt at a .NET heap
public abstract class Heap<T> : IEnumerable<T>
{
private const int InitialCapacity = 0;
private const int GrowFactor = 2;
private const int MinGrow = 1;
private int _capacity = InitialCapacity;
private T[] _heap = new T[InitialCapacity];
private int _tail = 0;
public int Count { get { return _tail; } }
public int Capacity { get { return _capacity; } }
protected Comparer<T> Comparer { get; private set; }
protected abstract bool Dominates(T x, T y);
protected Heap() : this(Comparer<T>.Default)
{
}
protected Heap(Comparer<T> comparer) : this(Enumerable.Empty<T>(), comparer)
{
}
protected Heap(IEnumerable<T> collection)
: this(collection, Comparer<T>.Default)
{
}
protected Heap(IEnumerable<T> collection, Comparer<T> comparer)
{
if (collection == null) throw new ArgumentNullException("collection");
if (comparer == null) throw new ArgumentNullException("comparer");
Comparer = comparer;
foreach (var item in collection)
{
if (Count == Capacity)
Grow();
_heap[_tail++] = item;
}
for (int i = Parent(_tail - 1); i >= 0; i--)
BubbleDown(i);
}
public void Add(T item)
{
if (Count == Capacity)
Grow();
_heap[_tail++] = item;
BubbleUp(_tail - 1);
}
private void BubbleUp(int i)
{
if (i == 0 || Dominates(_heap[Parent(i)], _heap[i]))
return; //correct domination (or root)
Swap(i, Parent(i));
BubbleUp(Parent(i));
}
public T GetMin()
{
if (Count == 0) throw new InvalidOperationException("Heap is empty");
return _heap[0];
}
public T ExtractDominating()
{
if (Count == 0) throw new InvalidOperationException("Heap is empty");
T ret = _heap[0];
_tail--;
Swap(_tail, 0);
BubbleDown(0);
return ret;
}
private void BubbleDown(int i)
{
int dominatingNode = Dominating(i);
if (dominatingNode == i) return;
Swap(i, dominatingNode);
BubbleDown(dominatingNode);
}
private int Dominating(int i)
{
int dominatingNode = i;
dominatingNode = GetDominating(YoungChild(i), dominatingNode);
dominatingNode = GetDominating(OldChild(i), dominatingNode);
return dominatingNode;
}
private int GetDominating(int newNode, int dominatingNode)
{
if (newNode < _tail && !Dominates(_heap[dominatingNode], _heap[newNode]))
return newNode;
else
return dominatingNode;
}
private void Swap(int i, int j)
{
T tmp = _heap[i];
_heap[i] = _heap[j];
_heap[j] = tmp;
}
private static int Parent(int i)
{
return (i + 1)/2 - 1;
}
private static int YoungChild(int i)
{
return (i + 1)*2 - 1;
}
private static int OldChild(int i)
{
return YoungChild(i) + 1;
}
private void Grow()
{
int newCapacity = _capacity*GrowFactor + MinGrow;
var newHeap = new T[newCapacity];
Array.Copy(_heap, newHeap, _capacity);
_heap = newHeap;
_capacity = newCapacity;
}
public IEnumerator<T> GetEnumerator()
{
return _heap.Take(Count).GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
public class MaxHeap<T> : Heap<T>
{
public MaxHeap()
: this(Comparer<T>.Default)
{
}
public MaxHeap(Comparer<T> comparer)
: base(comparer)
{
}
public MaxHeap(IEnumerable<T> collection, Comparer<T> comparer)
: base(collection, comparer)
{
}
public MaxHeap(IEnumerable<T> collection) : base(collection)
{
}
protected override bool Dominates(T x, T y)
{
return Comparer.Compare(x, y) >= 0;
}
}
public class MinHeap<T> : Heap<T>
{
public MinHeap()
: this(Comparer<T>.Default)
{
}
public MinHeap(Comparer<T> comparer)
: base(comparer)
{
}
public MinHeap(IEnumerable<T> collection) : base(collection)
{
}
public MinHeap(IEnumerable<T> collection, Comparer<T> comparer)
: base(collection, comparer)
{
}
protected override bool Dominates(T x, T y)
{
return Comparer.Compare(x, y) <= 0;
}
}
Some tests:
[TestClass]
public class HeapTests
{
[TestMethod]
public void TestHeapBySorting()
{
var minHeap = new MinHeap<int>(new[] {9, 8, 4, 1, 6, 2, 7, 4, 1, 2});
AssertHeapSort(minHeap, minHeap.OrderBy(i => i).ToArray());
minHeap = new MinHeap<int> { 7, 5, 1, 6, 3, 2, 4, 1, 2, 1, 3, 4, 7 };
AssertHeapSort(minHeap, minHeap.OrderBy(i => i).ToArray());
var maxHeap = new MaxHeap<int>(new[] {1, 5, 3, 2, 7, 56, 3, 1, 23, 5, 2, 1});
AssertHeapSort(maxHeap, maxHeap.OrderBy(d => -d).ToArray());
maxHeap = new MaxHeap<int> {2, 6, 1, 3, 56, 1, 4, 7, 8, 23, 4, 5, 7, 34, 1, 4};
AssertHeapSort(maxHeap, maxHeap.OrderBy(d => -d).ToArray());
}
private static void AssertHeapSort(Heap<int> heap, IEnumerable<int> expected)
{
var sorted = new List<int>();
while (heap.Count > 0)
sorted.Add(heap.ExtractDominating());
Assert.IsTrue(sorted.SequenceEqual(expected));
}
}
I like using the OrderedBag and OrderedSet classes in PowerCollections as priority queues.
Here's one I just wrote. Maybe it's not as optimized (it just uses a sorted dictionary), but it's simple to understand.
You can insert objects of different kinds, so no generic queues.
using System;
using System.Diagnostics;
using System.Collections;
using System.Collections.Generic;
namespace PrioQueue
{
public class PrioQueue
{
int total_size;
SortedDictionary<int, Queue> storage;
public PrioQueue ()
{
this.storage = new SortedDictionary<int, Queue> ();
this.total_size = 0;
}
public bool IsEmpty ()
{
return (total_size == 0);
}
public object Dequeue ()
{
if (IsEmpty ()) {
throw new Exception ("Please check that priorityQueue is not empty before dequeing");
} else
foreach (Queue q in storage.Values) {
// we use a sorted dictionary
if (q.Count > 0) {
total_size--;
return q.Dequeue ();
}
}
Debug.Assert(false,"not supposed to reach here. problem with changing total_size");
return null; // not supposed to reach here.
}
// same as above, except for peek.
public object Peek ()
{
if (IsEmpty ())
throw new Exception ("Please check that priorityQueue is not empty before peeking");
else
foreach (Queue q in storage.Values) {
if (q.Count > 0)
return q.Peek ();
}
Debug.Assert(false,"not supposed to reach here. problem with changing total_size");
return null; // not supposed to reach here.
}
public object Dequeue (int prio)
{
// dequeue first, so total_size only changes if an item actually existed at this priority
object item = storage[prio].Dequeue ();
total_size--;
return item;
}
public void Enqueue (object item, int prio)
{
if (!storage.ContainsKey (prio)) {
storage.Add (prio, new Queue ());
}
storage[prio].Enqueue (item);
total_size++;
}
}
}
.NET 6+: As #rustyx commented, .NET 6 adds a System.Collections.Generic.PriorityQueue<TElement,TPriority> class. And FWIW it is open source and implemented in C#.
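A minimal usage example of the built-in class (elements with the lowest priority value are dequeued first by default):
using System;
using System.Collections.Generic;

var queue = new PriorityQueue<string, int>();
queue.Enqueue("write report", 2);
queue.Enqueue("fix production bug", 1);
queue.Enqueue("water plants", 3);

// Dequeues in priority order: 1, 2, 3
while (queue.TryDequeue(out string task, out int priority))
    Console.WriteLine($"{priority}: {task}");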
Earlier .NET Core versions and .NET Framework: Microsoft has written (and shared online) 2 internal PriorityQueue classes within the .NET Framework. However, as #mathusum-mut commented, there is a bug in one of them (the SO community has, of course, provided fixes for it): Bug in Microsoft's internal PriorityQueue<T>?
I found one by Julian Bucknall on his blog here - http://www.boyet.com/Articles/PriorityQueueCSharp3.html
We modified it slightly so that low-priority items on the queue would eventually 'bubble-up' to the top over time, so they wouldn't suffer starvation.
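The modified code isn't shown in that post, but the aging idea can be sketched roughly like this (the AgedEntry type and the agingRatePerSecond parameter are hypothetical, not part of Bucknall's queue):
using System;

// Hypothetical sketch of priority aging: the effective priority improves the longer
// an entry has been waiting, so low-priority items eventually reach the front.
class AgedEntry<T>
{
    public T Value;
    public double BasePriority;
    public DateTime EnqueuedAt = DateTime.UtcNow;

    // Lower value = higher priority; waiting time gradually lowers the effective value.
    public double EffectivePriority(double agingRatePerSecond) =>
        BasePriority - agingRatePerSecond * (DateTime.UtcNow - EnqueuedAt).TotalSeconds;
}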
You may find this implementation useful:
http://www.codeproject.com/Articles/126751/Priority-queue-in-Csharp-with-help-of-heap-data-st.aspx
It is generic and based on the heap data structure.
class PriorityQueue<T>
{
IComparer<T> comparer;
T[] heap;
public int Count { get; private set; }
public PriorityQueue() : this(null) { }
public PriorityQueue(int capacity) : this(capacity, null) { }
public PriorityQueue(IComparer<T> comparer) : this(16, comparer) { }
public PriorityQueue(int capacity, IComparer<T> comparer)
{
this.comparer = (comparer == null) ? Comparer<T>.Default : comparer;
this.heap = new T[capacity];
}
public void push(T v)
{
if (Count >= heap.Length) Array.Resize(ref heap, Count * 2);
heap[Count] = v;
SiftUp(Count++);
}
public T pop()
{
var v = top();
heap[0] = heap[--Count];
if (Count > 0) SiftDown(0);
return v;
}
public T top()
{
if (Count > 0) return heap[0];
throw new InvalidOperationException("优先队列为空");
}
void SiftUp(int n)
{
var v = heap[n];
for (var n2 = n / 2; n > 0 && comparer.Compare(v, heap[n2]) > 0; n = n2, n2 /= 2) heap[n] = heap[n2];
heap[n] = v;
}
void SiftDown(int n)
{
var v = heap[n];
for (var n2 = n * 2; n2 < Count; n = n2, n2 *= 2)
{
if (n2 + 1 < Count && comparer.Compare(heap[n2 + 1], heap[n2]) > 0) n2++;
if (comparer.Compare(v, heap[n2]) >= 0) break;
heap[n] = heap[n2];
}
heap[n] = v;
}
}
Easy.
AlgoKit
I wrote an open source library called AlgoKit, available via NuGet. It contains:
Implicit d-ary heaps (ArrayHeap),
Binomial heaps,
Pairing heaps.
The code has been extensively tested. I definitely recommend giving it a try.
Example
var comparer = Comparer<int>.Default;
var heap = new PairingHeap<int, string>(comparer);
heap.Add(3, "your");
heap.Add(5, "of");
heap.Add(7, "disturbing.");
heap.Add(2, "find");
heap.Add(1, "I");
heap.Add(6, "faith");
heap.Add(4, "lack");
while (!heap.IsEmpty)
Console.WriteLine(heap.Pop().Value);
Why those three heaps?
The optimal choice of implementation is strongly input-dependent — as Larkin, Sen, and Tarjan show in A back-to-basics empirical study of priority queues, arXiv:1403.0252v1 [cs.DS]. They tested implicit d-ary heaps, pairing heaps, Fibonacci heaps, binomial heaps, explicit d-ary heaps, rank-pairing heaps, quake heaps, violation heaps, rank-relaxed weak heaps, and strict Fibonacci heaps.
AlgoKit features three types of heaps that appeared to be most efficient among those tested.
Hint on choice
For a relatively small number of elements, you will likely do best with implicit heaps, especially quaternary (implicit 4-ary) heaps. When operating on larger heaps, amortized structures such as binomial heaps and pairing heaps should perform better.
A Simple Max Heap Implementation.
https://github.com/bharathkumarms/AlgorithmsMadeEasy/blob/master/AlgorithmsMadeEasy/MaxHeap.cs
using System;
using System.Collections.Generic;
using System.Linq;
namespace AlgorithmsMadeEasy
{
class MaxHeap
{
private static int capacity = 10;
private int size = 0;
int[] items = new int[capacity];
private int getLeftChildIndex(int parentIndex) { return 2 * parentIndex + 1; }
private int getRightChildIndex(int parentIndex) { return 2 * parentIndex + 2; }
private int getParentIndex(int childIndex) { return (childIndex - 1) / 2; }
private int getLeftChild(int parentIndex) { return this.items[getLeftChildIndex(parentIndex)]; }
private int getRightChild(int parentIndex) { return this.items[getRightChildIndex(parentIndex)]; }
private int getParent(int childIndex) { return this.items[getParentIndex(childIndex)]; }
private bool hasLeftChild(int parentIndex) { return getLeftChildIndex(parentIndex) < size; }
private bool hasRightChild(int parentIndex) { return getRightChildIndex(parentIndex) < size; }
private bool hasParent(int childIndex) { return childIndex > 0; } // the root (index 0) has no parent
private void swap(int indexOne, int indexTwo)
{
int temp = this.items[indexOne];
this.items[indexOne] = this.items[indexTwo];
this.items[indexTwo] = temp;
}
private void hasEnoughCapacity()
{
if (this.size == capacity)
{
Array.Resize(ref this.items,capacity*2);
capacity *= 2;
}
}
public void Add(int item)
{
this.hasEnoughCapacity();
this.items[size] = item;
this.size++;
heapifyUp();
}
public int Remove()
{
int item = this.items[0];
this.items[0] = this.items[size-1];
this.items[this.size - 1] = 0;
size--;
heapifyDown();
return item;
}
private void heapifyUp()
{
int index = this.size - 1;
while (hasParent(index) && this.items[index] > getParent(index))
{
swap(index, getParentIndex(index));
index = getParentIndex(index);
}
}
private void heapifyDown()
{
int index = 0;
while (hasLeftChild(index))
{
int bigChildIndex = getLeftChildIndex(index);
if (hasRightChild(index) && getLeftChild(index) < getRightChild(index))
{
bigChildIndex = getRightChildIndex(index);
}
if (this.items[bigChildIndex] < this.items[index])
{
break;
}
else
{
swap(bigChildIndex,index);
index = bigChildIndex;
}
}
}
}
}
/*
Calling Code:
MaxHeap mh = new MaxHeap();
mh.Add(10);
mh.Add(5);
mh.Add(2);
mh.Add(1);
mh.Add(50);
int maxVal = mh.Remove();
int newMaxVal = mh.Remove();
*/
Use a Java-to-C# translator on the Java implementation (java.util.PriorityQueue) from the Java Collections framework, or, more intelligently, take the algorithm and core code and plug it into a C# class of your own that adheres to the .NET collection API for queues, or at least to the general collection interfaces.
Here is another implementation, from the NGenerics team:
NGenerics PriorityQueue
I had the same issue recently and ended up creating a NuGet package for this.
This implements a standard heap-based priority queue. It also has all the usual niceties of the BCL collections: ICollection<T> and IReadOnlyCollection<T> implementation, custom IComparer<T> support, ability to specify an initial capacity, and a DebuggerTypeProxy to make the collection easier to work with in the debugger.
There is also an Inline version of the package which just installs a single .cs file into your project (useful if you want to avoid taking externally-visible dependencies).
More information is available on the github page.
The following implementation of a PriorityQueue uses SortedSet from the System library.
using System;
using System.Collections.Generic;
namespace CDiggins
{
interface IPriorityQueue<T, K> where K : IComparable<K>
{
bool Empty { get; }
void Enqueue(T x, K key);
void Dequeue();
T Top { get; }
}
class PriorityQueue<T, K> : IPriorityQueue<T, K> where K : IComparable<K>
{
SortedSet<Tuple<T, K>> set;
class Comparer : IComparer<Tuple<T, K>> {
public int Compare(Tuple<T, K> x, Tuple<T, K> y) {
return x.Item2.CompareTo(y.Item2);
}
}
public PriorityQueue() { set = new SortedSet<Tuple<T, K>>(new Comparer()); }
public bool Empty { get { return set.Count == 0; } }
public void Enqueue(T x, K key) { set.Add(Tuple.Create(x, key)); }
public void Dequeue() { set.Remove(set.Max); }
public T Top { get { return set.Max.Item1; } }
}
}
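One caveat with the SortedSet-based approach: SortedSet<T> discards items that the comparer reports as equal, so two entries enqueued with the same key silently collapse into one. A common workaround (a sketch, not part of the original answer; the names are mine) is to add a monotonically increasing sequence number as a tie-breaker:
using System;
using System.Collections.Generic;

// Sketch: tie-break equal keys with an insertion counter so SortedSet keeps every entry.
class PriorityQueueWithTies<T, K> where K : IComparable<K>
{
    private long _seq;
    private readonly SortedSet<(K Key, long Seq, T Value)> _set =
        new SortedSet<(K Key, long Seq, T Value)>(
            Comparer<(K Key, long Seq, T Value)>.Create((a, b) =>
            {
                int c = a.Key.CompareTo(b.Key);
                return c != 0 ? c : a.Seq.CompareTo(b.Seq);
            }));

    public bool Empty => _set.Count == 0;
    public void Enqueue(T x, K key) => _set.Add((key, _seq++, x));
    public T Top => _set.Max.Value;            // highest key first, matching the answer above
    public void Dequeue() => _set.Remove(_set.Max);
}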