I have a helper function which deserializes an XML file and generates C# objects from it.
The objects are then added into the server's memory, and this function is the only way anything gets added there.
public class DeserializeXmlHelper
{
public void DeserializeXml(Guid xml_Id, decimal version)
{
// heavy process here which takes about 3 seconds
}
}
This function is called by different clients through an API method (built with ASP.NET MVC Web API).
When the API is called, can I prevent the function from executing if somebody else has already called it with the same parameters?
Something like this, but I don't know if it is a good way of doing it:
public class DeserializeXmlHelper
{
private static readonly ConcurrentDictionary<string, object> _processes = new ConcurrentDictionary<string, object>();
public void DeserializeXml(Guid xml_Id, decimal version)
{
string processKey = string.Format("{0}_v{1}", xml_Id, version.ToString("#0.0"));
object processLocker = null;
if (_processes.TryGetValue(processKey, out processLocker) == false)
{
processLocker = new object();
_processes.TryAdd(processKey, processLocker);
}
lock (processLocker)
{
// heavy process here which takes about 3 seconds
_processes.TryRemove(processKey, out processLocker);
}
}
}
EDITED - New version
Tim Roger's answer is working successfully.
However, if I want to return only when the initial call has finished, can I do something like this? (I am using a ConcurrentDictionary because I don't know how to add the locks, but the idea should be the same.)
public class DeserializeXmlHelper
{
private static readonly ConcurrentDictionary<string, string> _processes = new ConcurrentDictionary<string, string>();
public void DeserializeXml(Guid xml_Id, decimal version)
{
string _processKey = string.Format("{0}_v{1}", xml_Id, version.ToString("#0.0"));
string _processValue = null;
if (_processes.TryGetValue(_processKey, out _processValue) == true)
{
// function already called with the same parameters
do
{
System.Threading.Thread.Sleep(100);
}
while (_processes.TryGetValue(_processKey, out _processValue) == true);
return;
}
try
{
_processes.TryAdd(_processKey, _processValue);
var begin = "begin process";
System.Threading.Thread.Sleep(10000);
var end = "ending process";
}
finally
{
_processes.TryRemove(_processKey, out _processValue);
}
}
}
Your original solution was not far off, but you were always running the process regardless of the current state.
Even though you used a concurrent dictionary, you also need a lock around the entire dictionary while you update its state, since a thread could be interrupted between TryGetValue and TryAdd.
private static readonly ConcurrentDictionary<string, object> _processes = new ConcurrentDictionary<string, object>();
private static readonly object _dictionaryLock = new object();
public void DeserializeXml(Guid xml_Id, decimal version)
{
string processKey = string.Format("{0}_v{1}", xml_Id, version.ToString("#0.0"));
object processLocker = null;
bool needsExecuting = false;
lock (_dictionaryLock)
{
if (_processes.TryGetValue(processKey, out processLocker) == false)
{
needsExecuting = true;
processLocker = new object();
_processes.TryAdd(processKey, processLocker);
}
}
lock (processLocker)
{
if (needsExecuting)
{
// heavy process here which takes about 3 seconds
object removed;
_processes.TryRemove(processKey, out removed);
}
}
}
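If you also want later callers to block until the first call has finished (as in your edit) and then return without repeating the work, a slightly simpler variant is to let GetOrAdd hand every caller the same per-key lock object, which removes the need for the separate dictionary lock. This is only a sketch of the idea, not production-hardened code:
public class DeserializeXmlHelper
{
    private static readonly ConcurrentDictionary<string, object> _locks =
        new ConcurrentDictionary<string, object>();
    public void DeserializeXml(Guid xml_Id, decimal version)
    {
        string processKey = string.Format("{0}_v{1}", xml_Id, version.ToString("#0.0"));
        // GetOrAdd is atomic, so every concurrent caller for the same key gets the same lock object.
        object processLocker = _locks.GetOrAdd(processKey, _ => new object());
        lock (processLocker)
        {
            // If the key no longer maps to the lock object we took, the call we were
            // waiting on has already finished (and removed the key), so there is nothing to do.
            object current;
            if (!_locks.TryGetValue(processKey, out current) || !ReferenceEquals(current, processLocker))
            {
                return;
            }
            // heavy process here which takes about 3 seconds
            object removed;
            _locks.TryRemove(processKey, out removed);
        }
    }
}
Later callers block on lock (processLocker) until the first call completes and then return immediately, which is the behaviour your edit asks for.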
I am doing a small project to map a network (routers only) using SNMP. In order to speed things up, I'm trying to have a pool of threads responsible for doing the jobs I need, apart from the first job which is done by the main thread.
At this time I have two jobs, one takes a parameter and the other doesn't:
UpdateDeviceInfo(NetworkDevice nd)
UpdateLinks() *not defined yet
What I'm trying to achieve is to have those worker threads waiting for a job to
appear on a Queue<Action> and wait while it is empty. The main thread will add the first job and then wait for all workers, which might add more jobs, to finish before adding the second job and waking up the sleeping threads.
My problem/questions are:
How to define the Queue<Action> so that I can insert the methods and the parameters, if any. If that is not possible, I could make all functions accept the same parameter.
How to launch the worker threads indefinitely. I am not sure where I should create the for(;;).
This is my code so far:
public enum DatabaseState
{
Empty = 0,
Learning = 1,
Updating = 2,
Stable = 3,
Exiting = 4
};
public class NetworkDB
{
public Dictionary<string, NetworkDevice> database;
private Queue<Action<NetworkDevice>> jobs;
private string _community;
private string _ipaddress;
private Object _statelock = new Object();
private DatabaseState _state = DatabaseState.Empty;
private readonly int workers = 4;
private Object _threadswaitinglock = new Object();
private int _threadswaiting = 0;
public Dictionary<string, NetworkDevice> Database { get => database; set => database = value; }
public NetworkDB(string community, string ipaddress)
{
_community = community;
_ipaddress = ipaddress;
database = new Dictionary<string, NetworkDevice>();
jobs = new Queue<Action<NetworkDevice>>();
}
public void Start()
{
NetworkDevice nd = SNMP.GetDeviceInfo(new IpAddress(_ipaddress), _community);
if (nd.Status > NetworkDeviceStatus.Unknown)
{
database.Add(nd.Id, nd);
_state = DatabaseState.Learning;
nd.Update(this); // The first job is done by the main thread
for (int i = 0; i < workers; i++)
{
Thread t = new Thread(JobRemove);
t.Start();
}
lock (_statelock)
{
if (_state == DatabaseState.Learning)
{
Monitor.Wait(_statelock);
}
}
lock (_statelock)
{
if (_state == DatabaseState.Updating)
{
Monitor.Wait(_statelock);
}
}
foreach (KeyValuePair<string, NetworkDevice> n in database)
{
using (System.IO.StreamWriter file = new System.IO.StreamWriter(n.Value.Name + ".txt"))
{
file.WriteLine(n);
}
}
}
}
public void JobInsert(Action<NetworkDevice> func, NetworkDevice nd)
{
lock (jobs)
{
jobs.Enqueue(item);
if (jobs.Count == 1)
{
// wake up any blocked dequeue
Monitor.Pulse(jobs);
}
}
}
public void JobRemove()
{
Action<NetworkDevice> item;
lock (jobs)
{
while (jobs.Count == 0)
{
lock (_threadswaitinglock)
{
_threadswaiting += 1;
if (_threadswaiting == workers)
Monitor.Pulse(_statelock);
}
Monitor.Wait(jobs);
}
lock (_threadswaitinglock)
{
_threadswaiting -= 1;
}
item = jobs.Dequeue();
item.Invoke();
}
}
public bool NetworkDeviceExists(NetworkDevice nd)
{
try
{
Monitor.Enter(database);
if (database.ContainsKey(nd.Id))
{
return true;
}
else
{
database.Add(nd.Id, nd);
Action<NetworkDevice> action = new Action<NetworkDevice>(UpdateDeviceInfo);
jobs.Enqueue(action);
return false;
}
}
finally
{
Monitor.Exit(database);
}
}
//Job1 - Learning -> Update device info
public void UpdateDeviceInfo(NetworkDevice nd)
{
nd.Update(this);
try
{
Monitor.Enter(database);
nd.Status = NetworkDeviceStatus.Self;
}
finally
{
Monitor.Exit(database);
}
}
//Job2 - Updating -> After Learning, create links between neighbours
private void UpdateLinks()
{
}
}
Your best bet seems to be using a BlockingCollection instead of the Queue class. They behave effectively the same in terms of FIFO, but a BlockingCollection lets each of your threads block until an item can be taken, by calling GetConsumingEnumerable or Take. Here is a complete example.
http://mikehadlow.blogspot.com/2012/11/using-blockingcollection-to-communicate.html?m=1
As for including the parameters, it seems like you could use a closure to capture the NetworkDevice itself and then just enqueue Action instead of Action<NetworkDevice>.
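A rough sketch of both ideas together (class and method names are illustrative, not from your project; it assumes your NetworkDevice type): the worker threads drain a BlockingCollection<Action> via GetConsumingEnumerable, and the parameter is captured by the closure when the job is enqueued.
using System;
using System.Collections.Concurrent;
using System.Threading;
public class JobPool
{
    private readonly BlockingCollection<Action> jobs = new BlockingCollection<Action>();
    public void StartWorkers(int workerCount)
    {
        for (int i = 0; i < workerCount; i++)
        {
            var t = new Thread(() =>
            {
                // Blocks while the collection is empty; the loop ends once
                // CompleteAdding() has been called and the queue is drained.
                foreach (Action job in jobs.GetConsumingEnumerable())
                {
                    job();
                }
            });
            t.IsBackground = true;
            t.Start();
        }
    }
    // The NetworkDevice is captured by the closure, so the queue only needs plain Action.
    public void JobInsert(Action<NetworkDevice> work, NetworkDevice nd)
    {
        jobs.Add(() => work(nd));
    }
    public void Shutdown()
    {
        jobs.CompleteAdding();
    }
}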
I am working on a project that uses threads, and in some cases I run into the following problem.
Here is a piece of my code:
List<EmailAddress> lstEmailAddress = new List<EmailAddress>();
private void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
lock (TicketLock)
{
if (UtilityManager.CheckForInternetConnection())
{
if (ApplicationRunStatus == Enum_ApplicationRunStatus.UnknownDisconnect || ApplicationRunStatus == Enum_ApplicationRunStatus.IsReady)
{
// Connect
ThreadPool.QueueUserWorkItem((o) =>
{
for (int i = 0; i < lstEmailAddress.Count; i++)
{
lstEmailAddress[i].IsActive = lstEmailAddress[i].Login();
}
this.BeginInvoke(new Action(() =>
{
// some code
}));
});
}
}
}
}
And this is the EmailAddress class:
class EmailAddress
{
private Imap4Client imap = new Imap4Client();
private object objectLock = new object();
public bool IsActive;
public string Address;
public string Password;
public string RecieveServerAddress;
public int RecieveServerPort;
public bool Login()
{
lock (objectLock)
{
try
{
imap.ConnectSsl(RecieveServerAddress, RecieveServerPort);
}
catch (Exception)
{
}
try
{
imap.Login(Address, Password);
return true;
}
catch (Exception)
{
return false;
}
}
}
}
And my problem is this:
When I want to use the Login procedure that belongs to the EmailAddress class, there is some conflict. As you can see, I used a lock, but nothing changed.
For more details:
If I have 3 items in lstEmailAddress, the Login procedure has to be called 3 times by this code, but every time the login procedure works on the same username and password, so not all of my emails can log in correctly.
If I remove the ThreadPool, it works fine.
Your code is very confusing:
If you add the lock in your code, it will run synchronously, only one thread at a time, which will lead to a performance loss.
If you queue work via QueueUserWorkItem, it will run on another thread, and not inside the TicketLock lock.
You should encapsulate locks inside your class, and should not lock the entire logic of your program.
You start work with the loop variable i, which is captured by the closure and ends up holding its last value; this leads to the problem you describe in your last sentence.
The lock object in the EmailAddress class isn't static, so it is created for each instance and doesn't actually lock anything across instances.
As you are using the BeginInvoke method, your code is started from the UI, and you need to get back onto the UI synchronization context. I suggest you use TPL code for this and do not work with the ThreadPool directly (a sketch of that follows after the code below).
So I suggest this solution:
List<EmailAddress> lstEmailAddress = new List<EmailAddress>();
private void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
// remove this lock as we have another in Email class
//lock (TicketLock)
if (UtilityManager.CheckForInternetConnection())
{
if (ApplicationRunStatus == Enum_ApplicationRunStatus.UnknownDisconnect
|| ApplicationRunStatus == Enum_ApplicationRunStatus.IsReady)
{
for (int i = 0; i < lstEmailAddress.Count; i++)
{
// use local variable to store index
int localIndex = i;
// Connect
ThreadPool.QueueUserWorkItem((o) =>
{
// if you add a lock here, this will run synchronously,
// and you don't really need the ThreadPool
//lock (TicketLock)
lstEmailAddress[localIndex].IsActive = lstEmailAddress[localIndex].Login();
this.BeginInvoke(new Action(() =>
{
// some code
}));
});
}
}
}
}
class EmailAddress
{
// if you have to log in only one user at a time,
// use static fields here; otherwise simply remove the lock as it is useless
private static Imap4Client imap;
private static object objectLock;
// static constructor for only one initialization for a static fields
static EmailAddress()
{
objectLock = new object();
imap = new Imap4Client();
}
public bool IsActive;
public string Address;
public string Password;
public string RecieveServerAddress;
public int RecieveServerPort;
public bool Login()
{
// aquire a static lock
lock (objectLock)
{
try
{
imap.ConnectSsl(RecieveServerAddress, RecieveServerPort);
}
catch (Exception)
{
// STORE THE EXCEPTION!!!
// return as you haven't connected
return false;
}
try
{
imap.Login(Address, Password);
return true;
}
catch (Exception)
{
// STORE THE EXCEPTION!!!
return false;
}
}
}
}
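Since one of the points above recommends the TPL rather than working with the ThreadPool directly, here is a minimal sketch of that direction. It assumes the same form, fields and enums as in your code, and C# 5 or later for async/await; it is an illustration, not a drop-in replacement.
// Requires: using System.Collections.Generic; using System.Threading.Tasks;
private async void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
    if (!UtilityManager.CheckForInternetConnection())
        return;
    if (ApplicationRunStatus != Enum_ApplicationRunStatus.UnknownDisconnect
        && ApplicationRunStatus != Enum_ApplicationRunStatus.IsReady)
        return;
    var loginTasks = new List<Task>();
    foreach (var email in lstEmailAddress)
    {
        // 'email' is a fresh variable on each iteration, so the closure is safe.
        loginTasks.Add(Task.Run(() => email.IsActive = email.Login()));
    }
    await Task.WhenAll(loginTasks);
    // Execution resumes on the UI thread here because await captured the UI
    // synchronization context, so no BeginInvoke is needed.
    // some code
}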
Change your code as below and try it. Your code queues work that indexes lstEmailAddress with the loop variable, so every queued item ends up hitting the last item in the list. Change your code to capture each item before queueing it to the ThreadPool; that should fix it.
for (int i = 0; i < lstEmailAddress.Count; i++)
{
    var email = lstEmailAddress[i]; // capture the item, not the loop variable
    ThreadPool.QueueUserWorkItem((o) =>
    {
        email.IsActive = email.Login();
    });
}
I have a static class which is used to access a static ConcurrentDictionary:
public static class LinkProvider
{
private static ConcurrentDictionary<String, APNLink.Link> deviceLinks;
static LinkProvider()
{
int numProcs = Environment.ProcessorCount;
int concurrencyLevel = numProcs * 2;
deviceLinks = new ConcurrentDictionary<string, APNLink.Link>(concurrencyLevel, 64);
}
public APNLink.Link getDeviceLink(string deviceId, string userId)
{
var result = deviceLinks.FirstOrDefault(x => x.Key == deviceId).Value; // FirstOrDefault, so the null check below works when the key is missing
if (result == null)
{
var link = new APNLink.Link(username, accountCode, new APNLink.DeviceType());
deviceLinks.TryAdd(deviceId, link);
return link;
}
else
{
return result;
}
}
public bool RemoveLink(string deviceId)
{
//not implmented
return false;
}
}
How can I make use of this class in my controller in ASP.NET?
I.e. I want to do:
LinkProvider provider;
APNLink.Link tmpLink = provider.getDeviceLink(id, User.Identity.Name);
//use my link
Background: the dictionary is used to keep a link object between states/requests in an ASP.NET Web API program. So when the service needs to use the link, it asks the LinkProvider to find a link for it, and if there isn't one it must create one. So I need the dictionary object to be the same instance in all my HTTP requests.
So I need the dictionary object to be the same instance in all my HTTP
requests
Then use a static class, and make every method static too, so you could call it using the following syntax:
APNLink.Link tmpLink = LinkProvider.getDeviceLink(id, User.Identity.Name);
That being said, you should be aware that in-memory static variables in an ASP.NET application are not always safe to rely on: they make your application stateful, and if the application pool is recycled, your dictionary will be re-instantiated and its contents lost.
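A sketch of what the static version could look like, using GetOrAdd so the lookup and the creation happen as one atomic call (the Link construction is copied from your snippet; adjust it to whatever arguments you actually need):
using System;
using System.Collections.Concurrent;
public static class LinkProvider
{
    private static readonly ConcurrentDictionary<string, APNLink.Link> deviceLinks =
        new ConcurrentDictionary<string, APNLink.Link>(Environment.ProcessorCount * 2, 64);
    public static APNLink.Link GetDeviceLink(string deviceId, string userId)
    {
        // GetOrAdd makes "look it up, or create and add it" a single atomic call,
        // so two concurrent requests for the same deviceId cannot both add a link.
        return deviceLinks.GetOrAdd(deviceId,
            id => new APNLink.Link(username, accountCode, new APNLink.DeviceType())); // as in your snippet
    }
    public static bool RemoveLink(string deviceId)
    {
        APNLink.Link removed;
        return deviceLinks.TryRemove(deviceId, out removed);
    }
}
Note that the GetOrAdd value factory can run more than once if two requests race (only one result is kept), so this is fine as long as constructing a Link is cheap and side-effect free; otherwise store Lazy<APNLink.Link> values instead.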
I am interfacing with a back-end system where I must never ever have more than one open connection to a given object (identified by its numeric ID), but different consumers may be opening and closing them independently of one another.
Roughly, I have a factory class fragment like this:
private Dictionary<ulong, IFoo> _openItems = new Dictionary<ulong, IFoo>();
private object _locker = new object();
public IFoo Open(ulong id)
{
lock (_locker)
{
if (!_openItems.ContainsKey(id))
{
_openItems[id] = _nativeResource.Open(id);
}
_openItems[id].RefCount++;
return _openItems[id];
}
}
public void Close(ulong id)
{
lock (_locker)
{
if (_openItems.ContainsKey(id))
{
_openItems[id].RefCount--;
if (_openItems[id].RefCount == 0)
{
_nativeResource.Close(id);
_openItems.Remove(id);
}
}
}
}
Now, here is the problem. In my case, _nativeResource.Open is very slow. The locking here is rather naive and can be very slow when there are a lot of different concurrent .Open calls, even though they are (most likely) referring to different ids and don't overlap, especially if they are not yet in the _openItems cache.
How do I structure the locking so that I am only preventing concurrent access to a specific ID and not to all callers?
What you may want to look into is a striped locking strategy. The idea is that you share N locks across M items (the possible IDs in your case) and choose a lock such that, for any given ID, the lock chosen is always the same one. The classic way of choosing a lock for this technique is modulo division: divide the ID by N, take the remainder, and use the lock at that index:
// Assuming the allLocks class member is defined as follows:
private static AutoResetEvent[] allLocks = new AutoResetEvent[10];
// And initialized thus (in a static constructor):
for (int i = 0; i < 10; i++) {
allLocks[i] = new AutoResetEvent(true);
}
// Your method becomes
var lockIndex = (int)(id % (ulong)allLocks.Length); // id is ulong, so cast for the modulo and the array index
var lockToUse = allLocks[lockIndex];
// Wait for the lock to become free
lockToUse.WaitOne();
try {
// At this point we have taken the lock
// Do the work
} finally {
lockToUse.Set();
}
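Applied to your factory fragment, a sketch could look like the following. Since two different stripes can touch the shared map at the same time, _openItems is switched to a ConcurrentDictionary here (an assumption on my part); the stripe still guarantees that only one caller at a time works on a given ID.
// Uses the allLocks array defined above.
private readonly ConcurrentDictionary<ulong, IFoo> _openItems =
    new ConcurrentDictionary<ulong, IFoo>();
public IFoo Open(ulong id)
{
    var lockToUse = allLocks[(int)(id % (ulong)allLocks.Length)];
    lockToUse.WaitOne(); // take the stripe for this ID
    try
    {
        IFoo item;
        if (!_openItems.TryGetValue(id, out item))
        {
            // The slow native Open now only blocks callers whose IDs share this stripe.
            item = _nativeResource.Open(id);
            _openItems[id] = item;
        }
        item.RefCount++;
        return item;
    }
    finally
    {
        lockToUse.Set(); // release the stripe
    }
}
Close(id) would take the same stripe before decrementing RefCount and removing the entry, so no two callers ever operate on the same ID at once, while unrelated IDs (on different stripes) no longer wait for each other.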
If you are on .NET 4, you could try ConcurrentDictionary with something along these lines:
private ConcurrentDictionary<ulong, IFoo> openItems = new ConcurrentDictionary<ulong, IFoo>();
private object locker = new object();
public IFoo Open(ulong id)
{
var foo = this.openItems.GetOrAdd(id, x => nativeResource.Open(x));
lock (this.locker)
{
foo.RefCount++;
}
return foo;
}
public void Close(ulong id)
{
IFoo foo = null;
if (this.openItems.TryGetValue(id, out foo))
{
lock (this.locker)
{
foo.RefCount--;
if (foo.RefCount == 0)
{
if (this.openItems.TryRemove(id, out foo))
{
this.nativeResource.Close(id);
}
}
}
}
}
If anyone can see any glaring issues with that, please let me know!
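One caveat worth flagging, since the question requires that an object is never opened twice: GetOrAdd does not guarantee that the value factory runs only once per key, so two concurrent Open calls could both invoke nativeResource.Open for the same id (only one result would be kept). A common workaround, sketched below, is to make openItems a ConcurrentDictionary<ulong, Lazy<IFoo>> so that only the stored Lazy ever runs the slow open:
private ConcurrentDictionary<ulong, Lazy<IFoo>> openItems = new ConcurrentDictionary<ulong, Lazy<IFoo>>();
public IFoo Open(ulong id)
{
    // Creating a Lazy is cheap; even if two threads race inside GetOrAdd,
    // only the Lazy that ends up stored ever calls nativeResource.Open.
    var lazy = this.openItems.GetOrAdd(id,
        x => new Lazy<IFoo>(() => nativeResource.Open(x),
                            LazyThreadSafetyMode.ExecutionAndPublication));
    var foo = lazy.Value;
    lock (this.locker)
    {
        foo.RefCount++;
    }
    return foo;
}
Close would then TryRemove the Lazy and only call nativeResource.Close once the reference count drops to zero, mirroring the code above.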
I have a Dictionary of items that a thread is updating. I want to have a method get the updated list of items using another thread.
Like so:
internal List<string> GetListOfEntities()
{
List<string> listOfEntities = new List<string>();
foreach (string entityName in ModelFacade._totalListOfStkObjects.Keys)
{
listOfEntities.Add(entityName);
}
return listOfEntities;
}
ModelFacade._totalListOfStkObjects is the collection being updated by the other thread. I keep getting the exception "Collection was modified; enumeration operation may not execute." I have tried copying _totalListOfStkObjects to a local collection and iterating over that in GetListOfEntities(), but I get the same error.
Any help?
WulfgarPro
There isn't going to be a guaranteed thread-safe way to access the dictionary. Your best bet is either to change your code so that you're not sharing the collection, or to lock the dictionary when accessing it:
object dictLock = new object();
internal List<string> GetListOfEntities()
{
lock (dictLock)
{
return ModelFacade._totalListOfStkObjects.Keys.ToList();
}
}
Make sure you also lock the dictionary when modifying it in another thread.
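For example, the updating thread would take the same dictLock object shown above, which therefore has to be visible to both threads (for instance as a static field); entityName and stkObject below are placeholders for whatever you actually store:
// On the thread that updates the dictionary, using the same dictLock as above:
lock (dictLock)
{
    ModelFacade._totalListOfStkObjects[entityName] = stkObject;
}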
Change your Dictionary to a ConcurrentDictionary if you are using .NET 4. Here is a small example that simulates your scenario and resolves it.
class DataItem
{
public int Data { get; set; }
public bool IsDirty { get; set; }
}
var data = new ConcurrentDictionary<string, DataItem>();
Thread addingItems = new Thread(() =>
{
for (int i = 0; i < 10000; i++)
{
data.TryAdd("data " + i, new DataItem { Data = i, IsDirty = true });
Thread.Sleep(100);
}
});
Thread fetchingItems = new Thread(() =>
{
int count = 0;
while (count < 100)
{
foreach (var item in data)
{
if (item.Value.IsDirty)
{
Console.WriteLine(item.Key + " " + item.Value);
item.Value.IsDirty = false;
count++;
}
}
}
});
addingItems.Start();
fetchingItems.Start();
You can wrap the dictionary up in a thread-safe singleton class so that every thread shares the same instance. Note that the singleton only makes creation of the shared instance thread-safe; a plain Dictionary is still not safe for concurrent reads and writes, so you still need to lock around its operations (or store a ConcurrentDictionary inside the singleton). Referencing the dictionary should only require one additional layer of indirection.
Reference:
Singleton.Instance.myDictionary.Add(1, "Hello World");
Declaration:
public sealed class Singleton
{
private static volatile Singleton instance;
private static object syncRoot = new Object();
public Dictionary<int, string> myDictionary = new Dictionary<int, string>();
private Singleton() {}
public static Singleton Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
instance = new Singleton();
}
}
return instance;
}
}
}
Look here for more information on the Singleton Pattern in C#. Note that there is only one difference between the pattern on this link and my example code.
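If you keep a plain Dictionary inside the singleton, access from multiple threads still needs to be synchronized. A minimal sketch (the DictionaryLock field is an addition to the class above, and the reader needs using System.Collections.Generic and System.Linq):
// Added to the Singleton class: a lock object shared by all callers.
public readonly object DictionaryLock = new object();
// Writer thread:
lock (Singleton.Instance.DictionaryLock)
{
    Singleton.Instance.myDictionary.Add(1, "Hello World");
}
// Reader thread:
List<int> keys;
lock (Singleton.Instance.DictionaryLock)
{
    keys = Singleton.Instance.myDictionary.Keys.ToList();
}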