Take this pseudo-code example:
static System.Runtime.InteropServices.ComTypes.IEnumString GetUnmanagedObject() => null;
static IEnumerable<string> ProduceStrings()
{
System.Runtime.InteropServices.ComTypes.IEnumString obj = GetUnmanagedObject();
var result = new string[1];
var pFetched = Marshal.AllocHGlobal(sizeof(int));
while(obj.Next(1, result, pFetched) == 0)
{
yield return result[0];
}
Marshal.ReleaseComObject(obj);
}
static void Consumer()
{
foreach (var item in ProduceStrings())
{
if (item.StartsWith("foo"))
return;
}
}
The question is: if I decide not to enumerate all values, how can I inform the producer to do its cleanup?
Even if you are after a solution using yield return, it might be useful to see how this can be accomplished with an explicit IEnumerator<string> implementation.
IEnumerator<T> derives from IDisposable, and foreach calls the Dispose() method when the loop is exited, even via break or return (guaranteed at least since .NET 1.2).
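To see why that matters, here is roughly what the compiler turns a foreach over ProduceStrings() into (a sketch, not the exact generated code):
using (IEnumerator<string> e = ProduceStrings().GetEnumerator())
{
    while (e.MoveNext())
    {
        string item = e.Current;
        // loop body; leaving it via break or return still runs
        // e.Dispose() through the enclosing using block
    }
}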
static IEnumerable<string> ProduceStrings()
{
return new ProduceStringsImpl();
}
This is the class implementing IEnumerable<string>
class ProduceStringsImpl : IEnumerable<string>
{
public IEnumerator<string> GetEnumerator()
{
return new EnumProduceStrings();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
And here we have the core of the solution, the IEnumerator<string> implementation:
class EnumProduceStrings : IEnumerator<string>
{
private System.Runtime.InteropServices.ComTypes.IEnumString _obj;
private string[] _result;
private IntPtr _pFetched;
public EnumProduceStrings()
{
_obj = GetUnmanagedObject();
_result = new string[1];
_pFetched = Marshal.AllocHGlobal(sizeof(int));
}
public bool MoveNext()
{
return _obj.Next(1, _result, _pFetched) == 0;
}
public string Current => _result[0];
void IEnumerator.Reset() => throw new NotImplementedException();
object IEnumerator.Current => Current;
public void Dispose()
{
Marshal.ReleaseComObject(_obj);
Marshal.FreeHGlobal(_pFetched);
}
}
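If you would rather keep the yield return version, the same cleanup-on-Dispose behavior is available with try/finally inside the iterator; yield return is legal in a try block as long as it has no catch clause. A sketch based on the question's pseudo-code:
static IEnumerable<string> ProduceStrings()
{
    System.Runtime.InteropServices.ComTypes.IEnumString obj = GetUnmanagedObject();
    var result = new string[1];
    var pFetched = Marshal.AllocHGlobal(sizeof(int));
    try
    {
        while (obj.Next(1, result, pFetched) == 0)
            yield return result[0];
    }
    finally
    {
        // Runs both when iteration completes and when the consumer
        // abandons the loop, because the compiler moves this block
        // into the generated enumerator's Dispose().
        Marshal.ReleaseComObject(obj);
        Marshal.FreeHGlobal(pFetched);
    }
}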
I knew I could! With the guard in place, Cancel runs its cleanup only one time in all circumstances.
You can instead encapsulate the logic with a type like IterationResult<T> and provide a Cleanup method on it, but it's essentially the same idea.
public class IterationCanceller
{
Action m_OnCancel;
public bool Cancelled { get; private set; }
public IterationCanceller(Action onCancel)
{
m_OnCancel = onCancel;
}
public void Cancel()
{
if (!Cancelled)
{
Cancelled = true;
m_OnCancel();
}
}
}
static IEnumerable<(string Result, IterationCanceller Canceller)> ProduceStrings()
{
var pUnmanaged = Marshal.AllocHGlobal(sizeof(int));
IterationCanceller canceller = new IterationCanceller(() =>
{
Marshal.FreeHGlobal(pUnmanaged);
});
for (int i = 0; i < 2; i++) // also try i < 0, 1
{
yield return (i.ToString(), canceller);
}
canceller.Cancel();
}
static void Consumer()
{
foreach (var (item, canceller) in ProduceStrings())
{
if(item.StartsWith("1")) // also try consuming all values
{
canceller.Cancel();
break;
}
}
}
I have a file with 500,000,000 lines.
Each line is a string of at most 10 characters.
How can I process this file using multithreading, in batches of 100?
This uses MoreLinq's Batch method to create a sequence of IEnumerable<string> batches of 100 lines each, spinning up a new task for every batch.
This is a basic implementation. It might be wise to use a semaphore so that only a certain number of tasks run at any given time (a sketch of that variant follows the Batch extension below), and to check what overhead File.ReadAllLines has with 500,000,000 lines.
public class FileProcessor
{
public async Task ProcessFile()
{
List<Task> tasks = new List<Task>();
var lines = File.ReadAllLines("File.txt").Batch(100);
foreach (IEnumerable<string> linesBatch in lines)
{
IEnumerable<string> localLinesBatch = linesBatch;
Task task = Task.Factory.StartNew(() =>
{
// Perform operation on localLinesBatch
});
tasks.Add(task);
}
await Task.WhenAll(tasks);
}
}
public static class LinqExtensions
{
public static IEnumerable<IEnumerable<TSource>> Batch<TSource>(
this IEnumerable<TSource> source, int size)
{
TSource[] bucket = null;
var count = 0;
foreach (var item in source)
{
if (bucket == null)
bucket = new TSource[size];
bucket[count++] = item;
if (count != size)
continue;
yield return bucket;
bucket = null;
count = 0;
}
if (bucket != null && count > 0)
yield return bucket.Take(count);
}
}
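Here is a sketch of the semaphore variant mentioned above; the limit of 8 concurrent batches is an assumption to tune, and File.ReadLines is used instead of File.ReadAllLines so the 500,000,000 lines are streamed rather than loaded up front:
public class ThrottledFileProcessor
{
    public async Task ProcessFile()
    {
        var tasks = new List<Task>();
        using (var throttle = new SemaphoreSlim(8)) // at most 8 batches in flight
        {
            foreach (IEnumerable<string> linesBatch in File.ReadLines("File.txt").Batch(100))
            {
                await throttle.WaitAsync();
                IEnumerable<string> localLinesBatch = linesBatch;
                tasks.Add(Task.Run(() =>
                {
                    try
                    {
                        // Perform operation on localLinesBatch
                    }
                    finally
                    {
                        throttle.Release();
                    }
                }));
            }
            await Task.WhenAll(tasks);
        }
    }
}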
You don't need additional libraries if you use Parallel.ForEach from the built-in TPL and write a couple of enumerators (listed below). Your code can look like this:
using (var input = new StreamReader(File.OpenRead(@"c:\path\to\my\file.txt")))
{
Parallel.ForEach(
input.ReadLines().TakeChunks(100),
new ParallelOptions() { MaxDegreeOfParallelism = 8 /* ideally the number of CPU cores */ },
batchOfLines => {
DoMyProcessing(batchOfLines);
});
}
For this to work, you need a couple of extension methods on IEnumerable<T> and a couple of enumerators, defined as follows:
public static class EnumerableExtensions
{
public static IEnumerable<string> ReadLines(this StreamReader input)
{
return new LineReadingEnumerable(input);
}
public static IEnumerable<IReadOnlyList<T>> TakeChunks<T>(this IEnumerable<T> source, int length)
{
return new ChunkingEnumerable<T>(source, length);
}
public class LineReadingEnumerable : IEnumerable<string>
{
private readonly StreamReader _input;
public LineReadingEnumerable(StreamReader input)
{
_input = input;
}
public IEnumerator<string> GetEnumerator()
{
return new LineReadingEnumerator(_input);
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
public class LineReadingEnumerator : IEnumerator<string>
{
private readonly StreamReader _input;
private string _current;
public LineReadingEnumerator(StreamReader input)
{
_input = input;
}
public void Dispose()
{
_input.Dispose();
}
public bool MoveNext()
{
_current = _input.ReadLine();
return (_current != null);
}
public void Reset()
{
throw new NotSupportedException();
}
public string Current
{
get { return _current; }
}
object IEnumerator.Current
{
get { return _current; }
}
}
public class ChunkingEnumerable<T> : IEnumerable<IReadOnlyList<T>>
{
private readonly IEnumerable<T> _inner;
private readonly int _length;
public ChunkingEnumerable(IEnumerable<T> inner, int length)
{
_inner = inner;
_length = length;
}
public IEnumerator<IReadOnlyList<T>> GetEnumerator()
{
return new ChunkingEnumerator<T>(_inner.GetEnumerator(), _length);
}
IEnumerator IEnumerable.GetEnumerator()
{
return this.GetEnumerator();
}
}
public class ChunkingEnumerator<T> : IEnumerator<IReadOnlyList<T>>
{
private readonly IEnumerator<T> _inner;
private readonly int _length;
private IReadOnlyList<T> _current;
private bool _endOfInner;
public ChunkingEnumerator(IEnumerator<T> inner, int length)
{
_inner = inner;
_length = length;
}
public void Dispose()
{
_inner.Dispose();
_current = null;
}
public bool MoveNext()
{
var currentBuffer = new List<T>();
while (currentBuffer.Count < _length && !_endOfInner)
{
if (!_inner.MoveNext())
{
_endOfInner = true;
break;
}
currentBuffer.Add(_inner.Current);
}
if (currentBuffer.Count > 0)
{
_current = currentBuffer;
return true;
}
_current = null;
return false;
}
public void Reset()
{
_inner.Reset();
_current = null;
_endOfInner = false;
}
public IReadOnlyList<T> Current
{
get
{
if (_current != null)
{
return _current;
}
throw new InvalidOperationException();
}
}
object IEnumerator.Current
{
get
{
return this.Current;
}
}
}
}
I'm using a ConcurrentQueue for a shared data structure whose purpose is to hold the last N objects passed to it (a kind of history).
Assume we have a browser and we want to hold the last 100 browsed URLs. I want a queue that automatically drops (dequeues) the oldest (first) entry when a new entry is inserted (enqueued) and the capacity is full (100 addresses in history).
How can I accomplish that using System.Collections?
I would write a wrapper class that on Enqueue would check the Count and then Dequeue when the count exceeds the limit.
public class FixedSizedQueue<T>
{
readonly ConcurrentQueue<T> q = new ConcurrentQueue<T>();
private object lockObject = new object();
public int Limit { get; set; }
public void Enqueue(T obj)
{
q.Enqueue(obj);
lock (lockObject)
{
T overflow;
while (q.Count > Limit && q.TryDequeue(out overflow)) ;
}
}
}
I'd go for a slight variant: extend ConcurrentQueue so as to be able to use LINQ extensions on FixedSizedQueue.
public class FixedSizedQueue<T> : ConcurrentQueue<T>
{
private readonly object syncObject = new object();
public int Size { get; private set; }
public FixedSizedQueue(int size)
{
Size = size;
}
public new void Enqueue(T obj)
{
base.Enqueue(obj);
lock (syncObject)
{
while (base.Count > Size)
{
T outObj;
base.TryDequeue(out outObj);
}
}
}
}
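A usage sketch showing the LINQ motivation. Note the caveat of the new modifier: hiding is not overriding, so Enqueue must be called through a FixedSizedQueue<T>-typed reference; a ConcurrentQueue<T>-typed reference would call the base Enqueue and never trim:
var q = new FixedSizedQueue<int>(3);
for (int i = 0; i < 10; i++)
    q.Enqueue(i);
// LINQ works because ConcurrentQueue<T> implements IEnumerable<T>.
Console.WriteLine(q.Sum());  // 24: only 7, 8 and 9 remain
Console.WriteLine(q.Last()); // 9, the most recently enqueued item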
For anyone who finds it useful, here is some working code based on Richard Schneider's answer above:
public class FixedSizedQueue<T>
{
readonly ConcurrentQueue<T> queue = new ConcurrentQueue<T>();
public int Size { get; private set; }
public FixedSizedQueue(int size)
{
Size = size;
}
public void Enqueue(T obj)
{
queue.Enqueue(obj);
while (queue.Count > Size)
{
T outObj;
queue.TryDequeue(out outObj);
}
}
}
For what it's worth, here's a lightweight circular buffer with some methods marked for safe and unsafe use.
public class CircularBuffer<T> : IEnumerable<T>
{
readonly int size;
readonly object locker;
int count;
int head;
int rear;
T[] values;
public CircularBuffer(int max)
{
this.size = max;
locker = new object();
count = 0;
head = 0;
rear = 0;
values = new T[size];
}
static int Incr(int index, int size)
{
return (index + 1) % size;
}
private void UnsafeEnsureQueueNotEmpty()
{
if (count == 0)
throw new Exception("Empty queue");
}
public int Size { get { return size; } }
public object SyncRoot { get { return locker; } }
#region Count
public int Count { get { return UnsafeCount; } }
public int SafeCount { get { lock (locker) { return UnsafeCount; } } }
public int UnsafeCount { get { return count; } }
#endregion
#region Enqueue
public void Enqueue(T obj)
{
UnsafeEnqueue(obj);
}
public void SafeEnqueue(T obj)
{
lock (locker) { UnsafeEnqueue(obj); }
}
public void UnsafeEnqueue(T obj)
{
values[rear] = obj;
if (Count == Size)
head = Incr(head, Size);
rear = Incr(rear, Size);
count = Math.Min(count + 1, Size);
}
#endregion
#region Dequeue
public T Dequeue()
{
return UnsafeDequeue();
}
public T SafeDequeue()
{
lock (locker) { return UnsafeDequeue(); }
}
public T UnsafeDequeue()
{
UnsafeEnsureQueueNotEmpty();
T res = values[head];
values[head] = default(T);
head = Incr(head, Size);
count--;
return res;
}
#endregion
#region Peek
public T Peek()
{
return UnsafePeek();
}
public T SafePeek()
{
lock (locker) { return UnsafePeek(); }
}
public T UnsafePeek()
{
UnsafeEnsureQueueNotEmpty();
return values[head];
}
#endregion
#region GetEnumerator
public IEnumerator<T> GetEnumerator()
{
return UnsafeGetEnumerator();
}
public IEnumerator<T> SafeGetEnumerator()
{
lock (locker)
{
List<T> res = new List<T>(count);
var enumerator = UnsafeGetEnumerator();
while (enumerator.MoveNext())
res.Add(enumerator.Current);
return res.GetEnumerator();
}
}
public IEnumerator<T> UnsafeGetEnumerator()
{
int index = head;
for (int i = 0; i < count; i++)
{
yield return values[index];
index = Incr(index, size);
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return this.GetEnumerator();
}
#endregion
}
I like to use the Foo()/SafeFoo()/UnsafeFoo() convention:
Foo methods call UnsafeFoo as a default.
UnsafeFoo methods modify state freely without a lock, they should only call other unsafe methods.
SafeFoo methods call UnsafeFoo methods inside a lock.
It's a little verbose, but it makes errors, such as calling an unsafe method outside a lock in a method that is supposed to be thread-safe, more apparent.
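For example, a compound operation that cannot be made atomic with the Safe methods alone can take SyncRoot once and use the Unsafe variants inside the lock (a usage sketch against the CircularBuffer<T> above):
var buffer = new CircularBuffer<int>(8);
// "Dequeue if not empty" must be one atomic step: a SafeCount check
// followed by a SafeDequeue could be interleaved by another thread.
lock (buffer.SyncRoot)
{
    if (buffer.UnsafeCount > 0)
    {
        int oldest = buffer.UnsafeDequeue();
    }
}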
My version is just a subclass of the normal Queue<T>... nothing special, but seeing everyone participating, and since it still fits the topic title, I might as well put it here. It also returns the dequeued item, just in case.
public sealed class SizedQueue<T> : Queue<T>
{
public int FixedCapacity { get; }
public SizedQueue(int fixedCapacity)
{
this.FixedCapacity = fixedCapacity;
}
/// <summary>
/// If the total number of items exceeds the capacity, the oldest one is automatically dequeued.
/// </summary>
/// <returns>The dequeued value, if any.</returns>
public new T Enqueue(T item)
{
base.Enqueue(item);
if (base.Count > FixedCapacity)
{
return base.Dequeue();
}
return default;
}
}
Just because no one's said it yet: you can use a LinkedList<T> and add the thread safety yourself:
public class Buffer<T> : LinkedList<T>
{
private int capacity;
public Buffer(int capacity)
{
this.capacity = capacity;
}
public void Enqueue(T item)
{
// todo: add synchronization mechanism
if (Count == capacity) RemoveLast();
AddFirst(item);
}
public T Dequeue()
{
// todo: add synchronization mechanism
var last = Last.Value;
RemoveLast();
return last;
}
}
One thing to note: the default enumeration order will be LIFO in this example. But that can be overridden if necessary, as sketched below.
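For instance, a hypothetical OldestFirst method added to the Buffer<T> above: AddFirst puts the newest item at the head, so walking from Last back to First yields chronological (FIFO) order:
public IEnumerable<T> OldestFirst()
{
    // todo: add synchronization mechanism, as in Enqueue/Dequeue
    for (var node = Last; node != null; node = node.Previous)
        yield return node.Value;
}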
Here's my take on a fixed-size queue.
It uses a regular Queue to avoid the synchronization overhead that ConcurrentQueue incurs when its Count property is used. It also implements IReadOnlyCollection so that LINQ methods can be used. The rest is very similar to the other answers here.
[Serializable]
[DebuggerDisplay("Count = {" + nameof(Count) + "}, Limit = {" + nameof(Limit) + "}")]
public class FixedSizedQueue<T> : IReadOnlyCollection<T>
{
private readonly Queue<T> _queue = new Queue<T>();
private readonly object _lock = new object();
public int Count { get { lock (_lock) { return _queue.Count; } } }
public int Limit { get; }
public FixedSizedQueue(int limit)
{
if (limit < 1)
throw new ArgumentOutOfRangeException(nameof(limit));
Limit = limit;
}
public FixedSizedQueue(IEnumerable<T> collection)
{
if (collection is null || !collection.Any())
throw new ArgumentException("Can not initialize the Queue with a null or empty collection", nameof(collection));
_queue = new Queue<T>(collection);
Limit = _queue.Count;
}
public void Enqueue(T obj)
{
lock (_lock)
{
_queue.Enqueue(obj);
while (_queue.Count > Limit)
_queue.Dequeue();
}
}
public void Clear()
{
lock (_lock)
_queue.Clear();
}
public IEnumerator<T> GetEnumerator()
{
lock (_lock)
return new List<T>(_queue).GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
Let's add one more answer. Why this over others?
1) Simplicity. Trying to guarantee size is well and good but leads to unneeded complexity that can exhibit its own problems.
2) Implements IReadOnlyCollection, meaning you can use Linq on it and pass it into a variety of things that expect IEnumerable.
3) No locking. Many of the solutions above use locks, which is incorrect on a lockless collection.
4) Implements the same set of methods, properties, and interfaces ConcurrentQueue does, including IProducerConsumerCollection, which is important if you want to use the collection with BlockingCollection.
This implementation could potentially end up with more entries than expected if TryDequeue fails, but the frequency of that occurring doesn't seem worth specialized code that will inevitably hamper performance and cause its own unexpected problems.
If you absolutely want to guarantee a size, implementing a Prune() or similar method seems like the best idea (a sketch follows the code below). You could use a ReaderWriterLockSlim read lock in the other methods (including TryDequeue) and take a write lock only when pruning.
class ConcurrentFixedSizeQueue<T> : IProducerConsumerCollection<T>, IReadOnlyCollection<T>, ICollection {
readonly ConcurrentQueue<T> m_concurrentQueue;
readonly int m_maxSize;
public int Count => m_concurrentQueue.Count;
public bool IsEmpty => m_concurrentQueue.IsEmpty;
public ConcurrentFixedSizeQueue (int maxSize) : this(Array.Empty<T>(), maxSize) { }
public ConcurrentFixedSizeQueue (IEnumerable<T> initialCollection, int maxSize) {
if (initialCollection == null) {
throw new ArgumentNullException(nameof(initialCollection));
}
m_concurrentQueue = new ConcurrentQueue<T>(initialCollection);
m_maxSize = maxSize;
}
public void Enqueue (T item) {
m_concurrentQueue.Enqueue(item);
if (m_concurrentQueue.Count > m_maxSize) {
T result;
m_concurrentQueue.TryDequeue(out result);
}
}
public bool TryPeek (out T result) => m_concurrentQueue.TryPeek(out result);
public bool TryDequeue (out T result) => m_concurrentQueue.TryDequeue(out result);
public void CopyTo (T[] array, int index) => m_concurrentQueue.CopyTo(array, index);
public T[] ToArray () => m_concurrentQueue.ToArray();
public IEnumerator<T> GetEnumerator () => m_concurrentQueue.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator () => GetEnumerator();
// Explicit ICollection implementations.
void ICollection.CopyTo (Array array, int index) => ((ICollection)m_concurrentQueue).CopyTo(array, index);
object ICollection.SyncRoot => ((ICollection) m_concurrentQueue).SyncRoot;
bool ICollection.IsSynchronized => ((ICollection) m_concurrentQueue).IsSynchronized;
// Explicit IProducerConsumerCollection<T> implementations.
bool IProducerConsumerCollection<T>.TryAdd (T item) => ((IProducerConsumerCollection<T>) m_concurrentQueue).TryAdd(item);
bool IProducerConsumerCollection<T>.TryTake (out T item) => ((IProducerConsumerCollection<T>) m_concurrentQueue).TryTake(out item);
public override int GetHashCode () => m_concurrentQueue.GetHashCode();
public override bool Equals (object obj) => m_concurrentQueue.Equals(obj);
public override string ToString () => m_concurrentQueue.ToString();
}
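And a sketch of the Prune() idea mentioned above; the m_sizeLock field and the method itself are hypothetical additions, not part of the class as posted:
// Hypothetical additions to ConcurrentFixedSizeQueue<T>:
readonly ReaderWriterLockSlim m_sizeLock = new ReaderWriterLockSlim();

public void Prune () {
    // The write lock excludes all readers while the size is corrected.
    m_sizeLock.EnterWriteLock();
    try {
        while (m_concurrentQueue.Count > m_maxSize) {
            T ignored;
            m_concurrentQueue.TryDequeue(out ignored);
        }
    } finally {
        m_sizeLock.ExitWriteLock();
    }
}

// Enqueue, TryDequeue and the other mutating members would wrap their
// bodies in m_sizeLock.EnterReadLock() / ExitReadLock().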
Just for fun, here is another implementation that I believe addresses most of the commenters' concerns. In particular, thread-safety is achieved without locking and the implementation is hidden by the wrapping class.
public class FixedSizeQueue<T> : IReadOnlyCollection<T>
{
private ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
private int _count;
public int Limit { get; private set; }
public FixedSizeQueue(int limit)
{
this.Limit = limit;
}
public void Enqueue(T obj)
{
_queue.Enqueue(obj);
Interlocked.Increment(ref _count);
// Calculate the number of items to be removed by this thread in a thread safe manner
int currentCount;
int finalCount;
do
{
currentCount = _count;
finalCount = Math.Min(currentCount, this.Limit);
} while (currentCount !=
Interlocked.CompareExchange(ref _count, finalCount, currentCount));
T overflow;
while (currentCount > finalCount && _queue.TryDequeue(out overflow))
currentCount--;
}
public int Count
{
get { return _count; }
}
public IEnumerator<T> GetEnumerator()
{
return _queue.GetEnumerator();
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return _queue.GetEnumerator();
}
}
Well, it depends upon the use. I have noticed that some of the above solutions may exceed the size when used in a multi-threaded environment. Anyway, my use case was to display the last 5 events, with multiple threads writing events into the queue and one other thread reading from it and displaying them in a WinForms control. So this was my solution.
EDIT: Since we are already using a lock within our implementation, we don't really need ConcurrentQueue; a plain Queue may improve performance.
class FixedSizedConcurrentQueue<T>
{
readonly Queue<T> queue = new Queue<T>();
readonly object syncObject = new object();
public int MaxSize { get; private set; }
public FixedSizedConcurrentQueue(int maxSize)
{
MaxSize = maxSize;
}
public void Enqueue(T obj)
{
lock (syncObject)
{
queue.Enqueue(obj);
while (queue.Count > MaxSize)
{
queue.Dequeue();
}
}
}
public T[] ToArray()
{
T[] result = null;
lock (syncObject)
{
result = queue.ToArray();
}
return result;
}
public void Clear()
{
lock (syncObject)
{
queue.Clear();
}
}
}
EDIT: We don't really need syncObject in the example above; we can use the queue object itself, since we are not re-initializing queue in any method and it's marked readonly anyway.
The accepted answer is going to have avoidable side-effects.
The links below are references I used when writing my example.
The documentation from Microsoft is a bit misleading: they do use a lock, but they lock the segment classes, and the segment classes themselves use Interlocked.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
namespace Lib.Core
{
// Sources:
// https://learn.microsoft.com/en-us/dotnet/standard/collections/thread-safe/
// https://learn.microsoft.com/en-us/dotnet/api/system.threading.interlocked?view=netcore-3.1
// https://github.com/dotnet/runtime/blob/master/src/libraries/System.Private.CoreLib/src/System/Collections/Concurrent/ConcurrentQueue.cs
// https://github.com/dotnet/runtime/blob/master/src/libraries/System.Private.CoreLib/src/System/Collections/Concurrent/ConcurrentQueueSegment.cs
/// <summary>
/// Concurrency-safe circular buffer that uses the specified fixed capacity and reuses slots as it goes.
/// </summary>
/// <typeparam name="TObject">The object that you want to go into the slots.</typeparam>
public class ConcurrentCircularBuffer<TObject>
{
private readonly ConcurrentQueue<TObject> _queue;
public int Capacity { get; private set; }
public ConcurrentCircularBuffer(int capacity)
{
if(capacity <= 0)
{
throw new ArgumentException($"The capacity specified '{capacity}' is not valid.", nameof(capacity));
}
// Setup the queue to the initial capacity using List's underlying implementation.
_queue = new ConcurrentQueue<TObject>(new List<TObject>(capacity));
Capacity = capacity;
}
public void Enqueue(TObject @object)
{
// Enforce the capacity first so the head can be used instead of the entire segment (slow).
while (_queue.Count + 1 > Capacity)
{
if (!_queue.TryDequeue(out _))
{
// Handle error condition however you want to ie throw, return validation object, etc.
var ex = new Exception("Concurrent Dequeue operation failed.");
ex.Data.Add("EnqueueObject", @object);
throw ex;
}
}
// Place the item into the queue
_queue.Enqueue(@object);
}
public TObject Dequeue()
{
if(_queue.TryDequeue(out var result))
{
return result;
}
return default;
}
}
}
For your coding pleasure, I submit to you the 'ConcurrentDeck':
public class ConcurrentDeck<T>
{
private readonly int _size;
private readonly T[] _buffer;
private int _position = 0;
public ConcurrentDeck(int size)
{
_size = size;
_buffer = new T[size];
}
public void Push(T item)
{
lock (this)
{
_buffer[_position] = item;
_position++;
if (_position == _size) _position = 0;
}
}
public T[] ReadDeck()
{
lock (this)
{
return _buffer.Skip(_position).Concat(_buffer.Take(_position)).ToArray(); // Concat rather than Union: Union would drop duplicate entries
}
}
}
Example Usage:
void Main()
{
var deck = new ConcurrentDeck<Tuple<string,DateTime>>(25);
var handle = new ManualResetEventSlim();
var task1 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task1",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(1).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
var task2 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task2",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(.5).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
var task3 = Task.Factory.StartNew(()=>{
var timer = new System.Timers.Timer();
timer.Elapsed += (s,a) => {deck.Push(new Tuple<string,DateTime>("task3",DateTime.Now));};
timer.Interval = System.TimeSpan.FromSeconds(.25).TotalMilliseconds;
timer.Enabled = true;
handle.Wait();
});
System.Threading.Thread.Sleep(TimeSpan.FromSeconds(10));
handle.Set();
var outputtime = DateTime.Now;
deck.ReadDeck().Select(d => new {Message = d.Item1, MilliDiff = (outputtime - d.Item2).TotalMilliseconds}).Dump(true);
}
Here is yet another implementation that uses the underlying ConcurrentQueue as much as possible while providing the same interfaces made available via ConcurrentQueue.
/// <summary>
/// This is a FIFO concurrent queue that will remove the oldest added items when a given limit is reached.
/// </summary>
/// <typeparam name="TValue"></typeparam>
public class FixedSizedConcurrentQueue<TValue> : IProducerConsumerCollection<TValue>, IReadOnlyCollection<TValue>
{
private readonly ConcurrentQueue<TValue> _queue;
private readonly object _syncObject = new object();
public int LimitSize { get; }
public FixedSizedConcurrentQueue(int limit)
{
_queue = new ConcurrentQueue<TValue>();
LimitSize = limit;
}
public FixedSizedConcurrentQueue(int limit, System.Collections.Generic.IEnumerable<TValue> collection)
{
_queue = new ConcurrentQueue<TValue>(collection);
LimitSize = limit;
}
public int Count => _queue.Count;
bool ICollection.IsSynchronized => ((ICollection) _queue).IsSynchronized;
object ICollection.SyncRoot => ((ICollection)_queue).SyncRoot;
public bool IsEmpty => _queue.IsEmpty;
// Not supported until .NET Standard 2.1
//public void Clear() => _queue.Clear();
public void CopyTo(TValue[] array, int index) => _queue.CopyTo(array, index);
void ICollection.CopyTo(Array array, int index) => ((ICollection)_queue).CopyTo(array, index);
public void Enqueue(TValue obj)
{
_queue.Enqueue(obj);
lock( _syncObject )
{
while( _queue.Count > LimitSize ) {
_queue.TryDequeue(out _);
}
}
}
public IEnumerator<TValue> GetEnumerator() => _queue.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable<TValue>)this).GetEnumerator();
public TValue[] ToArray() => _queue.ToArray();
public bool TryAdd(TValue item)
{
Enqueue(item);
return true;
}
bool IProducerConsumerCollection<TValue>.TryTake(out TValue item) => TryDequeue(out item);
public bool TryDequeue(out TValue result) => _queue.TryDequeue(out result);
public bool TryPeek(out TValue result) => _queue.TryPeek(out result);
}
using System.Collections.Concurrent;
public class FixedSizeQueue<T>
{
ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
public void Enqueue(T obj)
{
T temp;
if (_queue.Count > 99)
{
// Remove one of the oldest added items.
_queue.TryDequeue(out temp);
}
_queue.Enqueue(obj);
}
public bool Dequeue(out T obj)
{
return _queue.TryDequeue(out obj);
}
public void Clear()
{
T obj;
// Does not fall into an infinite loop; clears only the items present at the time of the call.
int cnt = _queue.Count;
for (; cnt > 0; cnt--)
{
_queue.TryDequeue(out obj);
}
}
}
This is my version of the queue:
public class FixedSizedQueue<T> {
private object LOCK = new object();
ConcurrentQueue<T> queue;
public int MaxSize { get; set; }
public FixedSizedQueue(int maxSize, IEnumerable<T> items = null) {
this.MaxSize = maxSize;
if (items == null) {
queue = new ConcurrentQueue<T>();
}
else {
queue = new ConcurrentQueue<T>(items);
EnsureLimitConstraint();
}
}
public void Enqueue(T obj) {
queue.Enqueue(obj);
EnsureLimitConstraint();
}
private void EnsureLimitConstraint() {
if (queue.Count > MaxSize) {
lock (LOCK) {
T overflow;
while (queue.Count > MaxSize) {
queue.TryDequeue(out overflow);
}
}
}
}
/// <summary>
/// returns the current snapshot of the queue
/// </summary>
/// <returns></returns>
public T[] GetSnapshot() {
return queue.ToArray();
}
}
I find it useful to have a constructor built upon an IEnumerable, and to have a GetSnapshot that returns a thread-safe list (an array in this case) of the items at the moment of the call, which doesn't raise errors if the underlying collection changes.
The double Count check is to avoid taking the lock in some circumstances.
I am making a prototype application and for that I designed a class that behaves like an infinite looping list. That is, if my internal list contains 100 values, when I ask for the 101st value, I get the first, the 102nd yields the second, and so on, repeating.
So I would like to write the following code:
var slice = loopingListInstance.Skip(123).Take(5);
And for that I need a suitable IEnumerable implementation, as I understand it.
Here is my current code:
public class InfiniteLoopingList : IEnumerable<double>
{
double[] _values = File.ReadLines(@"c:\file.txt")
.Select(s => double.Parse(s, CultureInfo.InvariantCulture))
.ToArray();
int _size;
public InfiniteLoopingList()
{
_size = _values.Length;
}
public double this[int i]
{
get { return _values[i % _size]; }
set { _values[i % _size] = value; }
}
public IEnumerator<double> GetEnumerator()
{
return this.GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
// ???? now what ?? :(
}
}
Since you implemented the indexer property, you can do it the simplest way, as follows:
public IEnumerator<double> GetEnumerator()
{
int i = 0;
while (true)
yield return this[i++];
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
EDIT
Please note that this is not a truly infinite loop: the approach only works until i reaches int.MaxValue and overflows. Thanks to @oleksii.
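If the overflow matters, a variant that wraps the index instead of letting it grow (assuming the _size field from the question's class):
public IEnumerator<double> GetEnumerator()
{
    int i = 0;
    while (true)
    {
        yield return this[i];
        i = (i + 1) % _size; // wrap instead of overflowing at int.MaxValue
    }
}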
You don't need a class for this...
An extension method will do the trick:
public static class InfEx
{
public static IEnumerable<T> LoopForever<T>(this IEnumerable<T> src)
{
var data = new List<T>();
foreach(var item in src)
{
data.Add(item);
yield return item;
}
for(;;)
{
foreach(var item in data)
{
yield return item;
}
}
}
}
Now you can take a sequence and make it a looping, infinite sequence:
IEnumerable<Foo> mySeq = ...;
IEnumerable<Foo> infMySeq = mySeq.LoopForever();
IEnumerable<Foo> aSelectionOfInfMySeq = infMySeq.Skip(101).Take(5);
You can implement the IEnumerator interface:
class InfiniteEnumerator<T> : IEnumerator<T> {
private int index = -1;
private IList<T> innerList;
private int repeatPos;
public InfiniteEnumerator(IList<T> innerList, int repeatPos) {
this.innerList = innerList;
this.repeatPos = repeatPos;
}
public T Current {
get {
if (index == -1) {
throw new InvalidOperationException();
}
return this.innerList[index];
}
}
object IEnumerator.Current {
get {
return this.Current;
}
}
public void Dispose() {
}
public bool MoveNext() {
this.index++;
if (this.index == repeatPos) {
this.index = 0;
}
return true;
}
public void Reset() {
this.index = -1;
}
}
and then return an instance of it in the GetEnumerator methods:
IEnumerator IEnumerable.GetEnumerator() {
return this.GetEnumerator();
}
public IEnumerator<T> GetEnumerator() {
return new InfiniteEnumerator<T>(this, 100);
}
I'm trying to write a custom LinkedList class in C# using MonoDevelop on Linux, just for the sake of testing and learning. The following code never compiles, and I have no idea why! It doesn't even tell me what's wrong. All it says is: Error: The compiler appears to have crashed. Check the build output pad for details. When I go to check the output pad, it's not helpful either:
Unhandled Exception: System.ArgumentException: The specified field must be declared on a generic type definition.
Parameter name: field
What can I do?
using System;
using System.Text;
using System.Collections.Generic;
namespace LinkedList
{
public class myLinkedList<T> : IEnumerable<T>
{
//List Node class
//===============
private class ListNode<T>
{
public T data;
public ListNode<T> next;
public ListNode(T d)
{
this.data = d;
this.next = null;
}
public ListNode(T d, ListNode<T> n)
{
this.data = d;
this.next = n;
}
}
//private fields
//===============
private ListNode<T> front;
private int size;
//Constructor
//===========
public myLinkedList ()
{
front = null;
size = 0;
}
//public methods
//===============
public bool isEmpty()
{
return (size == 0);
}
public bool addFront(T element)
{
front = new ListNode<T>(element, front);
size++;
return true;
}
public bool addBack(T element)
{
ListNode<T> current = front;
while (current.next != null)
{
current = current.next;
}
current.next = new ListNode<T>(element);
size++;
return true;
}
public override string ToString()
{
ListNode<T> current = front;
if(current == null)
{
return "**** Empty ****";
}
else
{
StringBuilder sb = new StringBuilder();
while (current.next != null)
{
sb.Append(current.data + ", ");
current = current.next;
}
sb.Append(current.data);
return sb.ToString();
}
}
// These make myLinkedList<T> implement IEnumerable<T> allowing
// a LinkedList to be used in a foreach statement.
public IEnumerator<T> GetEnumerator()
{
return new myLinkedListIterator<T>(front);
}
private class myLinkedListIterator<T> : IEnumerator<T>
{
private ListNode<T> current;
public virtual T Current
{
get
{
return current.data;
}
}
private ListNode<T> front;
public myLinkedListIterator(ListNode<T> f)
{
front = f;
current = front;
}
public bool MoveNext()
{
if(current.next != null)
{
current = current.next;
return true;
}
else
{
return false;
}
}
public void Reset()
{
current = front;
}
public void Dispose()
{
throw new Exception("Unsupported Operation");
}
}
}
}
You need to add the non-generic APIs; so add to the iterator:
object System.Collections.IEnumerator.Current { get { return Current; } }
and to the enumerable:
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
HOWEVER! If you are implementing this by hand, you are missing a trick. An "iterator block" would be much easier.
The following is a complete implementation; you don't need to write an enumerator class at all (you can remove myLinkedListIterator<T> completely):
public IEnumerator<T> GetEnumerator()
{
var node = front;
while(node != null)
{
yield return node.data;
node = node.next;
}
}
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
When I tried the code that you have pasted, I got 2 errors when trying to build.
'myLinkedList' does not implement interface member
'System.Collections.IEnumerable.GetEnumerator()'.
'.myLinkedList.GetEnumerator()' cannot implement
'System.Collections.IEnumerable.GetEnumerator()' because it does not
have the matching return type of 'System.Collections.IEnumerator'.
The solution is to implement the following in the first class:
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
And the second error is:
myLinkedList.myLinkedListIterator' does not implement interface
member 'System.Collections.IEnumerator.Current'.
'JonasApplication.myLinkedList.myLinkedListIterator.Current'
cannot implement 'System.Collections.IEnumerator.Current' because it
does not have the matching return type of 'object'.
The solution to the second is to implement something like the following in the second class:
object IEnumerator.Current
{
get { return Current; }
}