Lock over multiple methods - C#

I have a singleton class (MySingletonClass) that has three main methods:
BeginTransaction(clientId)
AddItem(item, clientId)
CommitTransaction(clientId)
The ClientClass is:
public class ClientClass
{
private string id;
private MySingletonClass s = MySingletonClass.Instance;
public ClientClass()
{
id = Guid.NewGuid().ToString(); // Guid.NewGuid(), not new Guid(), which would always produce the empty GUID
}
public void BeginTransaction()
{
//start a lock here
s.BeginTransaction(id);
}
public void CommitTransaction()
{
s.CommitTransaction(id);
//end lock here
}
public void AddItem(string item)
{
//no access until lock is released
s.AddItem(item, id);
}
}
There are many tasks, each with its own ClientClass instance.
I need a way to serialize access to the singleton class by transaction:
if a transaction has not been committed, then no other thread should start a new transaction or call any other method on the singleton instance.
For example, every task can have code like this:
Task.Factory.StartNew(() =>
{
ClientClass c = new ClientClass();
c.BeginTransaction();
c.AddItem("www");
c.AddItem("qqq");
c.CommitTransaction();
});
Task.Factory.StartNew(() =>
{
ClientClass c1 = new ClientClass();
c1.BeginTransaction();
c1.AddItem("aaa");
c1.CommitTransaction();
});
Any idea how I can achieve this with some sort of locking that starts on BeginTransaction and is released on CommitTransaction?
All the examples I've seen of Monitor, Mutex and lock acquire and release in the same method.
Is there a lock that can span multiple methods?

Your MySingletonClass could look something like this:
public class MySingletonClass {
private readonly Object _lockObject = new object();
private List<string> _list = new List<string>();
public void BeginTransaction()
{
Monitor.Enter(_lockObject);
_list.Clear();
}
public void AddItem(string item)
{
if( Monitor.IsEntered(_lockObject) == false ) throw new ThreadStateException("Not owner of transaction");
_list.Add(item);
}
public void CommitTransaction()
{
if( Monitor.IsEntered(_lockObject) == false ) throw new ThreadStateException("Not owner of transaction");
Monitor.Exit(_lockObject);
}
public void RollbackTransaction()
{
if( Monitor.IsEntered(_lockObject) == false ) throw new ThreadStateException("Not owner of transaction");
Monitor.Exit(_lockObject);
}
}
But if a ClientClass fails (an exception is thrown) between BeginTransaction and CommitTransaction, it is very important to call RollbackTransaction, otherwise all other threads will deadlock because the monitor is never released.
So it is not a recommendable solution.
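Another drawback is that Monitor is thread-affine: Exit must be called on the thread that called Enter, which becomes fragile if a transaction ever spans an await or hops threads. A sketch of an alternative (not part of the answer above) is to let BeginTransaction return an IDisposable scope guarded by a SemaphoreSlim(1, 1), so the lock is always released by a using/finally block even when an exception is thrown mid-transaction. All type and member names below are illustrative:
using System;
using System.Collections.Generic;
using System.Threading;

public sealed class TransactionalSingleton
{
    public static TransactionalSingleton Instance { get; } = new TransactionalSingleton();

    private readonly SemaphoreSlim _gate = new SemaphoreSlim(1, 1);   // one transaction at a time
    private readonly List<string> _items = new List<string>();
    private string _currentClientId;

    private TransactionalSingleton() { }

    // Blocks until no other transaction is active, then returns the scope that ends it.
    public IDisposable BeginTransaction(string clientId)
    {
        _gate.Wait();
        _currentClientId = clientId;
        _items.Clear();
        return new Scope(this);
    }

    public void AddItem(string item, string clientId)
    {
        EnsureOwner(clientId);
        _items.Add(item);
    }

    public void CommitTransaction(string clientId)
    {
        EnsureOwner(clientId);
        // ... persist _items here ...
    }

    // Best-effort ownership check; the real protection comes from the semaphore.
    private void EnsureOwner(string clientId)
    {
        if (_currentClientId != clientId)
            throw new InvalidOperationException("Not the owner of the current transaction.");
    }

    private sealed class Scope : IDisposable
    {
        private TransactionalSingleton _owner;
        public Scope(TransactionalSingleton owner) { _owner = owner; }

        public void Dispose()
        {
            if (_owner == null) return;       // make double-dispose harmless
            _owner._currentClientId = null;
            _owner._gate.Release();           // always released, committed or not
            _owner = null;
        }
    }
}
ClientClass would keep the scope returned by BeginTransaction and dispose it in CommitTransaction (or in a finally block), or each task could simply wrap the whole transaction in a using block.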

Related

Item enqueue in concurrent queue is not adding

I am working on an application in which I am getting orders from a third-party app. The application is written in Windows Forms, so I am using ServiceStack to add routes to my application.
I have three classes. One contains the endpoint:
public class OrderService : ServiceStack.Service
{
Repository _repository;
public OrderService()
{
_repository = Repository.GetInstance();
}
[Authenticate]
public void Post(Order order)
{
if (order != null)
{
_repository.AddItem(order);
}
}
}
The second class processes the orders; it is a singleton class.
public sealed class Repository
{
private static object _myLock = new object();
private static Repository _mySingleton = null;
private ConcurrentQueue<Order> _queue;
public static bool orderCheck = true;
private Repository() {
_queue = new ConcurrentQueue<Order>();
}
public void AddItem(Order order)
{
_queue.Enqueue(order);
}
public static Repository GetInstance()
{
if (_mySingleton == null)
{
lock (_myLock)
{
if (_mySingleton == null)
{
_mySingleton = new Repository();
}
}
}
return _mySingleton;
}
public void CreateOrder()
{
while (orderCheck)
{
Order order = null;
_queue.TryDequeue(out order);
if (order != null)
{
try
{
// performing business logic with order
}
catch (Exception ex)
{
throw new Exception(ex.Message);
}
}
else
{
Thread.Sleep(10000);
}
}
}
}
The third class creates a new thread when the application is started:
new Thread(delegate ()
{
var repo = Repository.GetInstance();
repo.CreateOrder();
}).Start();
The problem is that the endpoint adds the order information to the queue, but when I try to dequeue it in the Repository class, it is not available to the TryDequeue call.
I printed the GetHashCode of the ConcurrentQueue and found that the hash code is different inside the while loop and inside the AddItem method.
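Two different hash codes usually mean the while loop and AddItem are not looking at the same ConcurrentQueue, i.e. more than one Repository instance exists (or the code is running in more than one process). As a hardening sketch rather than a confirmed diagnosis, the hand-rolled double-checked locking can be replaced with Lazy<T>, which removes any initialization race; TryGetItem is an illustrative helper, not part of the original code:
using System;
using System.Collections.Concurrent;

public sealed class Repository
{
    // Lazy<T> guarantees exactly one fully constructed instance, with thread-safe
    // publication, without a hand-written lock or a volatile field.
    private static readonly Lazy<Repository> _instance =
        new Lazy<Repository>(() => new Repository());

    private readonly ConcurrentQueue<Order> _queue = new ConcurrentQueue<Order>();

    private Repository() { }

    public static Repository GetInstance() => _instance.Value;

    public void AddItem(Order order) => _queue.Enqueue(order);

    public bool TryGetItem(out Order order) => _queue.TryDequeue(out order);
}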

Issue in C# singleton with multithreading: a variable not initialized

This is a simplified version of production code that runs multithreaded with a singleton. Compared to a traditional singleton, the additional thing is that I initialize client inside the lock section.
When I try to get the client with Client client = Singleton.Instance.GetClient();, there is a chance that client can be null (but the chance is very small).
public class Client
{
public int Value { get; set; } = 10;
}
public class Singleton
{
private static Singleton instance = null;
private static readonly object padlock = new object();
private Client client = null;
public static Singleton Instance
{
get
{
if (instance == null)
{
lock (padlock)
{
if (instance == null)
{
instance = new Singleton();
// Here is the interesting part!
instance.InitClient();
}
}
}
return instance;
}
}
private void InitClient()
{
this.client = new Client();
}
public Client GetClient()
{
return this.client;
}
}
This is how I test it:
static void Main(string[] args)
{
Console.WriteLine("Input thread count: ");
int threadCount = Int32.Parse(Console.ReadLine().Trim());
List<Task> tasks = new List<Task>(threadCount);
for (int i = 0; i < threadCount; ++i)
{
tasks.Add(Task.Factory.StartNew(() => DoStuff()));
}
Task.WaitAll(tasks.ToArray());
Console.WriteLine("All threads complete");
}
private static void DoStuff()
{
Client client = Singleton.Instance.GetClient();
if (client.Value != 10)
{
Console.WriteLine($"Thread: {Thread.CurrentThread.ManagedThreadId}.");
}
}
And client can occasionally be null.
But when I move InitClient() into the private constructor of Singleton, I never hit the situation where client is null:
private Singleton()
{
this.InitClient();
}
I don't have any clue what the difference is or what is wrong. Thanks for the help!
As soon as you assign instance = new Singleton() inside the lock, instance is no longer null, so a separate thread calling Singleton.Instance can pass the outer null check and return immediately, and its call to GetClient on that instance races with the InitClient from the first call.
Initializing inside the constructor ensures the instance is fully initialized by the time it is created, so subsequent calls from separate threads don't race against anything.
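A minimal sketch of that fix, assuming the only goal is that GetClient never returns null: do all initialization in the constructor and let Lazy<T> handle thread-safe publication.
using System;
using System.Threading;

public class Singleton
{
    // Lazy<T> with ExecutionAndPublication (the default) guarantees other threads only
    // ever see a fully constructed Singleton.
    private static readonly Lazy<Singleton> instance =
        new Lazy<Singleton>(() => new Singleton(), LazyThreadSafetyMode.ExecutionAndPublication);

    private readonly Client client;

    private Singleton()
    {
        client = new Client();   // completed before Instance can ever hand out this object
    }

    public static Singleton Instance => instance.Value;

    public Client GetClient() => client;
}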

C# AsyncSocket server - need locking?

I have written a TCP socket server. In general I want the following behaviour:
A listening socket can accept N connections (there are multiple listeners on different ports (1337, 733)).
A connection can authenticate itself as a "Client"; multiple connections can be grouped into one "Client".
Multiple connections can be accepted / receive data at the same time (concurrency).
Here is my Server class:
class Server
{
internal ClientManager ClientManager;
internal ConnectionManager ConnectionManager;
internal List<ConnectionListener> ConnectionListeners;
internal bool IsListening = false;
internal Server()
{
this.ClientManager = new ClientManager();
this.ConnectionManager = new ConnectionManager();
this.ConnectionListeners = new List<ConnectionListener>();
}
internal void AddConnectionListener(int Port)
{
ConnectionListener c = new ConnectionListener(Port);
c.AcceptedConnection += new ConnectionListener.AcceptedConnectionEventHandler(ConnectionProcessor.AcceptConnection);
ConnectionListeners.Add(c);
}
internal void RemoveConnectionListener(ConnectionListener ConnectionListener)
{
ConnectionListeners.Remove(ConnectionListener);
}
public delegate void OnStartListeningEventHandler();
public event OnStartListeningEventHandler OnStartListening;
internal void StartListening()
{
IsListening = true;
foreach (ConnectionListener cl in this.ConnectionListeners)
{
cl.StartListening();
}
OnStartListening?.Invoke();
}
public delegate void OnStopListeningEventHandler();
public event OnStopListeningEventHandler OnStopListening;
internal void StopListening()
{
ConnectionManager.DisconnectConnections();
foreach (ConnectionListener cl in this.ConnectionListeners)
{
cl.StopListening();
}
IsListening = false;
OnStopListening?.Invoke();
}
}
The method of ConnectionProcessor where I handle a newly accepted connection (ConnectionProcessor.AcceptConnection):
internal void AcceptConnection(Socket Socket)
{
Connection Connection = new Connection(Socket);
Connection.Sent += new Connection.SentEventHandler(onSend);
Connection.Received += new Connection.ReceivedEventHandler(onRecive);
Connection.Disconnected += new Connection.DisconnectedEventHandler(OnDisconnect);
Connection.Recive();
Logger.Instance.AddLog(new LogMessage(LogMessage.LogLevel.Normal, "Connection ("+Connection.ConnectionId+") Accepted"));
ConnectionManager.AddConnection(Connection);
}
ConnectionManager:
class ConnectionManager
{
internal List<Connection> Connections;
internal ConnectionManager()
{
this.Connections = new List<Connection>();
}
internal void AddConnection(Connection Connection)
{
Connections.Add(Connection);
OnAddConnection(Connection);
}
internal void RemoveConnection(Connection Connection)
{
Connections.Remove(Connection);
OnRemoveConnection(Connection);
}
internal void DisconnectConnections()
{
foreach (Connection c in Connections)
{
c.Disconnect();
}
}
}
Everything seems to work, but I am unsure about concurrency.
As you can see in the ConnectionManager, I store each Connection in a List (Connections.Add(Connection)). Is it enough to do this? Do I have to care that a normal List is not thread safe?
Is my "design" in general the right way to solve my requirements?
Since all you do is add / remove / enumerate connections in your list, you can use a thread-safe collection, without any locks. Unfortunately there is no ConcurrentList or ConcurrentHashSet, but you can use a ConcurrentDictionary with dummy values, either directly or wrapped in a separate class:
class BasicConcurrentSet<T> : IEnumerable<T> {
private readonly ConcurrentDictionary<T, byte> _items = new ConcurrentDictionary<T, byte>();
public void Add(T item) {
_items.TryAdd(item, 0);
}
public void Remove(T item) {
byte tmp;
_items.TryRemove(item, out tmp);
}
IEnumerator IEnumerable.GetEnumerator() {
return GetEnumerator();
}
public IEnumerator<T> GetEnumerator() {
foreach (var kv in _items) {
yield return kv.Key;
}
}
}
Adding, removing and enumerating items from such a concurrent collection is thread safe.
class ConnectionManager {
private readonly BasicConcurrentSet<Connection> _connections = new BasicConcurrentSet<Connection>();
internal ConnectionManager() {
}
internal void AddConnection(Connection connection) {
_connections.Add(connection);
OnAddConnection(connection);
}
internal void RemoveConnection(Connection connection) {
_connections.Remove(connection);
OnRemoveConnection(connection);
}
internal void DisconnectConnections() {
foreach (var connection in _connections) {
connection.Disconnect();
}
}
}
Even if you encapsulate Connections (it should be private) inside the ConnectionManager class, and even if a situation where RemoveConnection or DisconnectConnections runs before AddConnection is impossible - no, this is not thread safe. So my suggestion is to make all three methods thread safe, like this:
private Object lck;
internal ConnectionManager()
{
lck = new Object();
this.Connections = new List<Connection>();
}
internal void AddConnection(Connection Connection)
{
lock (lck)
{
Connections.Add(Connection);
OnAddConnection(Connection);
}
}
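For completeness, the other two methods under the same lock could look like this (my sketch, following the same pattern). Note that calling Disconnect while holding the lock is risky if its event handlers call back into ConnectionManager, so the list is copied first:
internal void RemoveConnection(Connection Connection)
{
    lock (lck)
    {
        Connections.Remove(Connection);
        OnRemoveConnection(Connection);
    }
}

internal void DisconnectConnections()
{
    // Snapshot under the lock, disconnect outside it, so Disconnect's event handlers
    // cannot deadlock on lck or mutate the list while we iterate.
    List<Connection> snapshot;
    lock (lck)
    {
        snapshot = new List<Connection>(Connections);
    }
    foreach (Connection c in snapshot)
    {
        c.Disconnect();
    }
}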

Multithreading BlockingCollection Alternatives to GetConsumingEnumerable() Producer-Consumer

I have a situation with multiple producers and multiple consumers. The producers enter jobs into a queue. I chose BlockingCollection and it works great, since I need the consumers to wait for a job to be found. However, if I use the GetConsumingEnumerable() feature, the order of the items in the collection changes... this is not what I need.
MSDN (http://msdn.microsoft.com/en-us/library/dd287186.aspx) even says that it does not preserve the order of the items.
Does anyone know an alternative for this situation?
I see that the Take method is available, but does it also provide a 'wait' condition for the consumer threads?
The documentation (http://msdn.microsoft.com/en-us/library/dd287085.aspx) says 'A call to Take may block until an item is available to be removed.' Is it better to use TryTake? I really need the thread to wait and keep checking for a job.
Take blocks the thread until something becomes available.
TryTake, as the name implies, tries to do the same but returns a bool indicating whether it succeeded.
That allows for more flexibility when using it:
while (goingOn)
{
    if (q.TryTake(out var item))
    {
        Process(item);
    }
    else
    {
        DoSomething_Usefull_OrNotUseFull_OrEvenSleep();
    }
}
instead of
while (goingOn)
{
    // we'll wait here until an item ever arrives, and then we:
    var item = q.Take();
    Process(item);
}
My vote is for TryTake :-)
EXAMPLE:
public class ProducerConsumer<T> {
public struct Message {
public T Data;
}
private readonly ThreadRunner _producer;
private readonly ThreadRunner _consumer;
public ProducerConsumer(Func<T> produce, Action<T> consume) {
var q = new BlockingCollection<Message>();
_producer = new Producer(produce,q);
_consumer = new Consumer(consume,q);
}
public void Start() {
_producer.Run();
_consumer.Run();
}
public void Stop() {
_producer.Stop();
_consumer.Stop();
}
private class Producer : ThreadRunner {
public Producer(Func<T> produce, BlockingCollection<Message> q) : base(q) {
_produce = produce;
}
private readonly Func<T> _produce;
public override void Worker() {
try {
while (KeepRunning) {
var item = _produce();
MessageQ.TryAdd(new Message{Data = item});
}
}
catch (ThreadInterruptedException) {
WasInterrupted = true;
}
}
}
public abstract class ThreadRunner {
protected readonly BlockingCollection<Message> MessageQ;
protected ThreadRunner(BlockingCollection<Message> q) {
MessageQ = q;
}
protected Thread Runner;
protected bool KeepRunning = true;
public bool WasInterrupted;
public abstract void Worker();
public void Run() {
Runner = new Thread(Worker);
Runner.Start();
}
public void Stop() {
KeepRunning = false;
Runner.Interrupt();
Runner.Join();
}
}
class Consumer : ThreadRunner {
private readonly Action<T> _consume;
public Consumer(Action<T> consume,BlockingCollection<Message> q) : base(q) {
_consume = consume;
}
public override void Worker() {
try {
while (KeepRunning) {
Message message;
if (MessageQ.TryTake(out message, TimeSpan.FromMilliseconds(100))) {
_consume(message.Data);
}
else {
//There's nothing in the Q so I have some spare time...
//Excellent moment to update my statisics or update some history to logfiles
//for now we sleep:
Thread.Sleep(TimeSpan.FromMilliseconds(100));
}
}
}
catch (ThreadInterruptedException) {
WasInterrupted = true;
}
}
}
}
USAGE:
[Fact]
public void ConsumerShouldConsume() {
var produced = 0;
var consumed = 0;
Func<int> produce = () => {
Thread.Sleep(TimeSpan.FromMilliseconds(100));
produced++;
return new Random(2).Next(1000);
};
Action<int> consume = c => { consumed++; };
var t = new ProducerConsumer<int>(produce, consume);
t.Start();
Thread.Sleep(TimeSpan.FromSeconds(5));
t.Stop();
Assert.InRange(produced,40,60);
Assert.InRange(consumed, 40, 60);
}
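If you prefer the blocking behaviour of Take, it can also be paired with a CancellationToken so the consumer waits without polling but can still be shut down cleanly. A sketch (my addition, not part of the example above):
using System;
using System.Collections.Concurrent;
using System.Threading;

static class BlockingConsumer
{
    // Take(token) blocks until an item arrives or the token is cancelled; cancellation
    // surfaces as OperationCanceledException, which ends the loop.
    public static void ConsumeUntilCancelled<T>(
        BlockingCollection<T> queue, Action<T> consume, CancellationToken token)
    {
        try
        {
            while (!token.IsCancellationRequested)
            {
                T item = queue.Take(token);   // waits here, no Thread.Sleep needed
                consume(item);
            }
        }
        catch (OperationCanceledException)
        {
            // shutdown requested
        }
    }
}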

What is the correct way to dispose elements held inside a ThreadLocal<IDisposable>?

When you use a ThreadLocal<T> and T implements IDisposable, how are you supposed to dispose of the members being held inside of the ThreadLocal?
According to ILSpy, the Dispose() and Dispose(bool) methods of ThreadLocal<T> are:
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
int currentInstanceIndex = this.m_currentInstanceIndex;
if (currentInstanceIndex > -1 && Interlocked.CompareExchange(ref this.m_currentInstanceIndex, -1, currentInstanceIndex) == currentInstanceIndex)
{
ThreadLocal<T>.s_availableIndices.Push(currentInstanceIndex);
}
this.m_holder = null;
}
It does not appear that ThreadLocal attempts to call Dispose on the values it holds, and I can't see how to reach the value it has allocated for each thread so I can take care of it myself.
I ran a test with the following code; the class is never disposed:
static class Sandbox
{
static void Main()
{
ThreadLocal<TestClass> test = new ThreadLocal<TestClass>();
test.Value = new TestClass();
test.Dispose();
Console.Read();
}
}
class TestClass : IDisposable
{
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected void Dispose(bool Disposing)
{
Console.Write("I was disposed!");
}
}
I had a look at the code in ThreadLocal<T> to see what the current Dispose is doing and it appears to be a lot of voodoo. Obviously disposing of thread-related stuff.
But it doesn't dispose of the values if T itself is disposable.
Now, I have a solution - a ThreadLocalDisposables<T> class, but before I give the full definition it's worth thinking about what should happen if you wrote this code:
var tl = new ThreadLocalDisposables<IExpensiveDisposableResource>();
tl.Value = myEdr1;
tl.Value = myEdr2;
tl.Dispose();
Should both myEdr1 and myEdr2 be disposed? Or just myEdr2? Or should myEdr1 be disposed when myEdr2 is assigned?
It's not clear to me what the semantics should be.
It is clear to me, however, that if I wrote this code:
var tl = new ThreadLocalDisposables<IExpensiveDisposableResource>(
() => new ExpensiveDisposableResource());
tl.Value.DoSomething();
tl.Dispose();
Then I would expect that the resource created by the factory for each thread should be disposed of.
So I'm not going to allow the direct assignment of the disposable value for ThreadLocalDisposables and only allow the factory constructor.
Here's ThreadLocalDisposables:
public class ThreadLocalDisposables<T> : IDisposable
where T : IDisposable
{
private ThreadLocal<T> _threadLocal = null;
private ConcurrentBag<T> _values = new ConcurrentBag<T>();
public ThreadLocalDisposables(Func<T> valueFactory)
{
_threadLocal = new ThreadLocal<T>(() =>
{
var value = valueFactory();
_values.Add(value);
return value;
});
}
public void Dispose()
{
_threadLocal.Dispose();
Array.ForEach(_values.ToArray(), t => t.Dispose());
}
public override string ToString()
{
return _threadLocal.ToString();
}
public bool IsValueCreated
{
get { return _threadLocal.IsValueCreated; }
}
public T Value
{
get { return _threadLocal.Value; }
}
}
Does this help?
In .NET 4.5, the Values property was added to ThreadLocal<T> to deal with the problem of manually managing the lifetime of ThreadLocal values. When the ThreadLocal is constructed with trackAllValues: true, it returns a list of all instances that have been created for that ThreadLocal variable.
An example using a Parallel.For loop accessing a ThreadLocal database connection pool was presented in this MSDN article. The relevant code snippet is below.
var threadDbConn = new ThreadLocal<MyDbConnection>(() => MyDbConnection.Open(), true);
try
{
Parallel.For(0, 10000, i =>
{
var inputData = threadDbConn.Value.GetData(i);
...
});
}
finally
{
foreach(var dbConn in threadDbConn.Values)
{
dbConn.Close();
}
}
Normally when you don't explicitly dispose of a class that holds an unmanaged resource, the garbage collector will eventually run and dispose of it. For this to happen, the class has to have a finalizer that disposes of its resource. Your sample class doesn't have a finalizer.
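For illustration, here is a sketch (not from the question) of the same TestClass with a finalizer, which is what would let the GC eventually run the cleanup if Dispose is never called:
using System;

class TestClass : IDisposable
{
    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);   // cleanup already done, skip the finalizer
    }

    ~TestClass()                     // runs on the finalizer thread if Dispose was never called
    {
        Dispose(false);
    }

    protected virtual void Dispose(bool disposing)
    {
        // disposing == true  : called from Dispose(), managed and unmanaged state may be cleaned
        // disposing == false : called from the finalizer, touch only unmanaged resources
        Console.Write("I was disposed!");
    }
}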
Now, to dispose of a class that's held inside a ThreadLocal<T> where T is IDisposable, you also have to do it yourself. ThreadLocal<T> is just a wrapper; it won't attempt to guess the correct behavior for its wrapped reference when it is itself disposed. The wrapped value could, e.g., be meant to survive its thread-local storage.
This is related to ThreadLocal<> and memory leak
My guess is that because there is no IDisposable constraint on T, it is assumed that the user of ThreadLocal<T> will dispose of the local object when appropriate.
How is the ThreadLocal.Dispose method itself getting called? I would expect that it would most likely be within something like a "using" block. I would suggest that one wrap the "using" block for the ThreadLocal with a "using" block for the resource that's going to be stored there.
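As I read that suggestion, for a single shared resource it would look roughly like this (a sketch only; per-thread resources still need something like the Values approach above):
// The outer using owns the resource that will live in thread-local storage for this scope,
// the inner using owns the ThreadLocal<T> wrapper itself; both are disposed on the way out.
using (var resource = new ExpensiveDisposableResource())
using (var threadLocal = new ThreadLocal<ExpensiveDisposableResource>(() => resource))
{
    // ... work that reads threadLocal.Value on the current thread ...
}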
The MSDN reference states that ThreadLocal values should be disposed by the thread using them once it is done with them. However, in some instances, such as event handling on a thread pool, a thread may use the value, go off to do something else, and then come back to the value any number of times.
A specific example is where I want an Entity Framework DBContext to persist across the lifespan of a series of service bus worker threads.
I've written up the following class which I use in these instances.
Either DisposeThreadCompletedValues can be called manually every so often by another thread, or the internal monitor thread can be activated.
Hopefully this helps?
using System;
using System.Collections.Generic;
using System.Threading;
public class DisposableThreadLocal<T> : IDisposable
where T : IDisposable
{
public DisposableThreadLocal(Func<T> _ValueFactory)
{
Initialize(_ValueFactory, false, 1);
}
public DisposableThreadLocal(Func<T> _ValueFactory, bool CreateLocalWatcherThread, int _CheckEverySeconds)
{
Initialize(_ValueFactory, CreateLocalWatcherThread, _CheckEverySeconds);
}
private void Initialize(Func<T> _ValueFactory, bool CreateLocalWatcherThread, int _CheckEverySeconds)
{
m_ValueFactory = _ValueFactory;
m_CheckEverySeconds = _CheckEverySeconds * 1000;
if (CreateLocalWatcherThread)
{
System.Threading.ThreadStart WatcherThreadStart;
WatcherThreadStart = new ThreadStart(InternalMonitor);
WatcherThread = new Thread(WatcherThreadStart);
WatcherThread.Start();
}
}
private object SyncRoot = new object();
private Func<T> m_ValueFactory;
public Func<T> ValueFactory
{
get
{
return m_ValueFactory;
}
}
private Dictionary<Thread, T> m_InternalDict = new Dictionary<Thread, T>();
private Dictionary<Thread, T> InternalDict
{
get
{
return m_InternalDict;
}
}
public T Value
{
get
{
T Result;
lock(SyncRoot)
{
if (!InternalDict.TryGetValue(Thread.CurrentThread,out Result))
{
Result = ValueFactory.Invoke();
InternalDict.Add(Thread.CurrentThread, Result);
}
}
return Result;
}
set
{
lock (SyncRoot)
{
if (InternalDict.ContainsKey(Thread.CurrentThread))
{
InternalDict[Thread.CurrentThread] = value;
}
else
{
InternalDict.Add(Thread.CurrentThread, value);
}
}
}
}
public bool IsValueCreated
{
get
{
lock (SyncRoot)
{
return InternalDict.ContainsKey(Thread.CurrentThread);
}
}
}
public void DisposeThreadCompletedValues()
{
lock (SyncRoot)
{
List<Thread> CompletedThreads;
CompletedThreads = new List<Thread>();
foreach (Thread ThreadInstance in InternalDict.Keys)
{
if (!ThreadInstance.IsAlive)
{
CompletedThreads.Add(ThreadInstance);
}
}
foreach (Thread ThreadInstance in CompletedThreads)
{
InternalDict[ThreadInstance].Dispose();
InternalDict.Remove(ThreadInstance);
}
}
}
private int m_CheckEverySeconds;
private int CheckEverySeconds
{
get
{
return m_CheckEverySeconds;
}
}
private Thread WatcherThread;
private void InternalMonitor()
{
while (!IsDisposed)
{
System.Threading.Thread.Sleep(CheckEverySeconds);
DisposeThreadCompletedValues();
}
}
private bool IsDisposed = false;
public void Dispose()
{
if (!IsDisposed)
{
IsDisposed = true;
DoDispose();
}
}
private void DoDispose()
{
if (WatcherThread != null)
{
WatcherThread.Abort();
}
//InternalDict.Values.ToList().ForEach(Value => Value.Dispose());
foreach (T Value in InternalDict.Values)
{
Value.Dispose();
}
InternalDict.Clear();
m_InternalDict = null;
m_ValueFactory = null;
GC.SuppressFinalize(this);
}
}
