I'm a bit confused on the proper usage of MemoryCache.
Should/can it be used to load static information to save on repeated calls?
Should/can it be used to persist data on a view across several action methods?
I have an instance where I don't want to use the data store to populate and persist the data across my view. I started using the MemoryCache which works fine, however I'm starting to question if that was the correct approach.
My concern is: what happens if I have several users on the same page using the same MemoryCache?
First of all, MemoryCache is part of the System.Runtime.Caching namespace. It can be used by MVC applications, but it is not limited to MVC applications.
NOTE: There is also a System.Web.Caching namespace (much older) that can only be used with the ASP.NET framework (including MVC).
Should/can it be used to load static information to save on repeated calls?
Yes.
Should/can it be used to persist data on a view across several action methods?
Yes. If your views use the same data it can. Or if you have data that is on your _Layout.cshtml page that needs caching, it could be done in a global filter.
what happens if I have several users on the same page using the same MemoryCache?
Caching is shared between all users by default. It is specifically meant for keeping data in memory so it doesn't have to be fetched from the database on every request (for example, a list of state names on a checkout page for populating a dropdown for all users).
It is also a good idea to cache data that changes frequently for a second or two to prevent a flood of concurrent requests from being a denial-of-service attack on your database.
Caching depends on a unique key. It is possible to store individual user information in the cache by making the user's name or ID part of the key.
var key = "MyFavoriteItems-" + this.User.Identity.Name;
Warning: This method works only if you have a single web server. It won't scale to multiple web servers. Session state (which is meant for individual user memory storage) is a more scalable approach. However, session state is not always worth the tradeoffs.
Typical Caching Pattern
Note that although MemoryCache itself is thread-safe, combining a cache lookup with a database call is not an atomic operation. Without locking, it is possible that several threads will each query the database to reload the cache when it expires.
So, you should use a double-checked locking pattern to ensure only one thread makes it through to reload the cache from the database.
Let's say you have a list that is wasteful to get on every request because every user will need the list when they get to a specific page.
public IEnumerable<StateOrProvinceDto> GetStateOrProvinceList(string country)
{
// Query the database to get the data...
}
To cache the result of this query, you can add another method with a double-checked locking pattern and use it to call your original method.
NOTE: A common approach is to use the decorator pattern to make the caching seamless to your API.
private ObjectCache _cache = MemoryCache.Default;
private object _lock = new object();
// NOTE: The country parameter would typically be a database key type,
// (normally int or Guid) but you could still use it to build a unique key using `.ToString()`.
public IEnumerable<StateOrProvinceDto> GetCachedStateOrProvinceList(string country)
{
// Key can be any string, but it must be both
// unique across the cache and deterministic
// for this function.
var key = "GetCachedStateList" + country;
// Try to get the object from the cache
var stateOrProvinceList = _cache[key] as IEnumerable<StateOrProvinceDto>;
// Check whether the value exists
if (stateOrProvinceList == null)
{
lock (_lock)
{
// Try to get the object from the cache again
stateOrProvinceList = _cache[key] as IEnumerable<StateOrProvinceDto>;
// Double-check that another thread did
// not call the DB already and load the cache
if (stateOrProvinceList == null)
{
// Get the list from the DB
stateOrProvinceList = GetStateOrProvinceList(country);
// Add the list to the cache
_cache.Set(key, stateOrProvinceList, DateTimeOffset.Now.AddMinutes(5));
}
}
}
// Return the cached list
return stateOrProvinceList;
}
So, you call GetCachedStateOrProvinceList and it will automatically return the list from the cache; if the list is not cached, it will automatically load it from the database into the cache. Only one thread is allowed to call the database; the rest wait until the cache is populated and then return the list from the cache once it is available.
Note also that the list of states or provinces for each country will be cached individually.
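The decorator pattern mentioned in the note above could be sketched roughly like this; the IStateOrProvinceService interface name and the wiring are illustrative assumptions, not part of the original code:

```csharp
using System.Collections.Generic;
using System.Runtime.Caching;

// Sketch: a caching decorator that wraps the real data service,
// so callers never know whether the list came from cache or the DB.
public interface IStateOrProvinceService
{
    IEnumerable<StateOrProvinceDto> GetStateOrProvinceList(string country);
}

public class CachedStateOrProvinceService : IStateOrProvinceService
{
    private readonly IStateOrProvinceService _inner;
    private readonly ObjectCache _cache = MemoryCache.Default;
    private readonly object _lock = new object();

    public CachedStateOrProvinceService(IStateOrProvinceService inner)
    {
        _inner = inner;
    }

    public IEnumerable<StateOrProvinceDto> GetStateOrProvinceList(string country)
    {
        var key = "StateOrProvinceList." + country;
        var list = _cache[key] as IEnumerable<StateOrProvinceDto>;
        if (list == null)
        {
            lock (_lock)
            {
                // Double-check after acquiring the lock
                list = _cache[key] as IEnumerable<StateOrProvinceDto>;
                if (list == null)
                {
                    list = _inner.GetStateOrProvinceList(country);
                    _cache.Set(key, list, DateTimeOffset.Now.AddMinutes(5));
                }
            }
        }
        return list;
    }
}
```

Registering the decorator in place of the real service (e.g. in an IoC container) makes the caching invisible to the rest of the API.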
Related
I am currently having problems getting my Update function to work. The function first loads an entity using session.Load() and then uses session.SaveOrUpdate().
My problem is that if I do not load the entity first, NHibernate will not know about the relationships and will therefore try to insert data that is already there, and when I do load the entity first, the updated entity is overwritten by the data already in the database.
public void Update(T Entity, bool load)
{
using(ISession session = this.helper.GetSession())
{
using(ITransaction transaction = session.BeginTransaction())
{
if(load)
{
session.Load(Entity, Entity.ID);
}
session.SaveOrUpdate(Entity);
transaction.Commit();
session.Flush();
}
}
}
In a nutshell:
load the object and then bind it with new values (the changes will be persisted on session.Flush() without any explicit Update() call), or
create a new C# instance with bound values, including the ID, and call session.Update(myInstance)
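As a rough sketch of the second option (the Cat entity, catId, and the sessionFactory variable are illustrative assumptions):

```csharp
// A brand-new instance with only the ID and the new values set.
var cat = new Cat { Id = catId, Name = "New name" };

using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    // NHibernate takes the entity's ID and issues an UPDATE statement.
    session.Update(cat);
    tx.Commit();
}
```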
The more complete answer can be found in one of the doc chapters:
9.4.2. Updating detached objects
Many applications need to retrieve an object in one transaction, send it to the UI layer for manipulation, then save the changes in a new transaction. (Applications that use this kind of approach in a high-concurrency environment usually use versioned data to ensure transaction isolation.) This approach requires a slightly different programming model to the one described in the last section. NHibernate supports this model by providing the method ISession.Update().
// in the first session
Cat cat = firstSession.Load<Cat>(catId);
Cat potentialMate = new Cat();
firstSession.Save(potentialMate);
// in a higher tier of the application
cat.Mate = potentialMate;
// later, in a new session
secondSession.Update(cat); // update cat
secondSession.Update(potentialMate); // update mate
The usage and semantics of SaveOrUpdate() seem to be confusing for new users. Firstly, so long as you are not trying to use instances from one session in another new session, you should not need to use Update() or SaveOrUpdate(). Some whole applications will never use either of these methods.
Usually Update() or SaveOrUpdate() are used in the following scenario:
the application loads an object in the first session
the object is passed up to the UI tier
some modifications are made to the object
the object is passed back down to the business logic tier
the application persists these modifications by calling Update() in a second session
So, we can get an instance of some entity in one session... and close that session. Such an object could even be a totally brand-new C# instance, with all its properties bound by some upper layer (e.g. an MVC binder, or a Web API formatter).
Later, we can use that instance and call session.Update(myInstance). NHibernate will take the ID of that entity and issue the proper UPDATE statement.
Another way could be to call Merge:
The last case can be avoided by using Merge(Object o). This method copies the state of the given object onto the persistent object with the same identifier. If there is no persistent instance currently associated with the session, it will be loaded. The method returns the persistent instance. If the given instance is unsaved or does not exist in the database, NHibernate will save it and return it as a newly persistent instance. Otherwise, the given instance does not become associated with the session. In most applications with detached objects, you need both methods, SaveOrUpdate() and Merge().
read more in the doc
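A minimal sketch of the Merge variant quoted above (the Cat entity, detachedCat, and sessionFactory are illustrative assumptions):

```csharp
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    // Merge copies the detached state onto the persistent instance with
    // the same identifier (loading it if necessary) and returns that
    // persistent instance; detachedCat itself stays detached.
    Cat persistent = (Cat)session.Merge(detachedCat);
    tx.Commit();
}
```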
I develop a server with persistent client connections (non request based). As I keep track of each connected client state in memory it would be strange if I load entities each time when I need to access such client data.
So I have my detached entities, and when I need to perform any changes I don't apply them directly; instead I pass these changes and the detached entity as a request to a GameDb class. It performs the changes on this entity and then loads the same entity from the db to perform the same changes again on the session-owned entity, so NH can track them.
I could use Merge, but it's much slower because NH has to load all the entity data (including lazy collections, which may be unmodified) to check each property for changes. In my case performance is critical.
An example:
// in the GameDb class
public void UpdateTradeOperation(UserOperation operation, int incomeQuantity, decimal price)
{
if (operation == null) throw new ArgumentNullException("operation");
if (operation.Id == 0) throw new ArgumentException("operation is not persisted");
_operationLogic.UpdateTradeOperation(operation, incomeQuantity, price);
try
{
_factory.Execute(
s =>
{
var op = s.Get<UserOperation>(operation.Id);
_operationLogic.UpdateTradeOperation(op, incomeQuantity, price);
if (op.User.BalanceFrozen != operation.User.BalanceFrozen)
throw new Exception("Inconsistent balance");
}); // commits transaction if no exceptions thrown
}
catch (Exception e)
{
throw new UserStateCorruptedException(operation.User, null, e);
}
}
This approach brings some overcomplexity, as I need to apply each change twice and check that the resulting states are equal. It would be easier if I could use an NH session to monitor entity changes. But it's not recommended to keep an NH session open for a long time, and I could have thousands of such long-lived open sessions.
It also forces me to split my entities and common logic. The problem is that the GameDb class doesn't know from which context it's called and can't request any additional data for its operation (e.g. current prices, a client socket inactivity timer, or many other things), or it may need to conditionally (by its own decision) send some data to the client. Of course I could pass a bunch of delegates to the GameDb method, but that doesn't seem like a good solution.
Can I use Session.Lock to attach my unchanged detached entities so I don't need to perform the changes twice? What LockMode should I use?
Is there a better approach I can use here? If I keep one open session per client but commit or roll back transactions quickly, will it still hold a lot of connections open? Will the session keep the entities' state after the transaction is completed?
What kind of concurrency issues can I experience with long-lived per-client sessions:
if I operate on each user's entities only from its own thread fiber (or lock)?
if I request another user's profile read-only from the "wrong" session (from that session's thread)?
I think what you need to do is use the second-level cache and store the Ids of entities per connected client instead of storing the entities in memory.
When a client connects, you can fetch the entities using the Ids you store; subsequent requests will not even hit the database, as the entities will come from the second-level cache, and you do not need to worry about change tracking.
http://ayende.com/blog/3976/nhibernate-2nd-level-cache
I tried using Session.Lock(LockMode.None) to reattach detached entities to the new session, and it works. It adds the object to the session as clean and unchanged. I can modify it and it will be stored to the database with the next transaction commit.
This is better than Merge because NHibernate does not need to look at all the properties to find out what has changed.
If I change at least one property, it updates the whole object (that is, all the properties, but not collections or entity links if they are untouched). I set DynamicUpdate = true in the entity mapping, and now it updates only the changed properties.
If I change any property of the detached entity outside of its current session, the next call to Session.Lock throws an exception (in particular, if I change collection content, the exception states "reassociated object has dirty collection"). I make those changes outside of the session because I don't need to save them (some stuff with references).
Very strange, but it works perfectly when I call Lock twice:
try
{
s.Lock(DbEntity, LockMode.None); // throws
}
catch
{
s.Lock(DbEntity, LockMode.None); // everything ok
}
Also, for collections: before I came to the solution above, I cast them to IPersistentCollection and used ClearDirty().
What about concurrency? My code ensures that each thread fiber updates only its own user and that nobody except this fiber has write access to the entity.
So the pattern is:
I open a session, get an entity, and store it somewhere in memory.
When I need to read its properties, I can do that at any time, very quickly.
When I want to modify it, I open a new session and call Lock() on the entity. After applying the changes, I commit the transaction and close the session.
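Put together, the pattern might look like this sketch (the entity, its Balance property, and the income variable are assumptions):

```csharp
// Reattach the long-lived detached entity, modify it, and commit.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    session.Lock(userEntity, LockMode.None); // reattached as clean and unchanged
    userEntity.Balance += income;            // make the changes to persist
    tx.Commit();                             // with DynamicUpdate, only changed columns go to the DB
}
```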
I have a somewhat complex permission system that uses six database tables in total, and in order to speed it up I would like to cache these tables in memory instead of hitting the database on every page load.
However, I'll need to update this cache when a new user is added or a permission is changed. I'm not sure how to implement this in-memory cache, or how to update it safely without causing problems if it's accessed at the same time as it is being updated.
Does anyone have an example of how to do something like this or can point me in the right direction for research?
Without knowing more about the structure of the application, there are lots of possible options. One such option might be to abstract the data access behind a repository interface and handle in-memory caching within that repository. Something as simple as a private IEnumerable<T> on the repository object.
So, for example, say you have a User object which contains information about the user (name, permissions, etc.). You'd have a UserRepository with some basic fetch/save methods on it. Inside that repository, you could maintain a private static HashSet<User> which holds User objects which have already been retrieved from the database.
When you fetch a User from the repository, it first checks the HashSet for an object to return; if it doesn't find one, it gets it from the database, adds it to the HashSet, and then returns it. When you save a User, it updates both the HashSet and the database.
Again, without knowing the specifics of the codebase and overall design, it's hard to give a more specific answer. This should be a generic enough solution to work in any application, though.
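As a rough sketch of that repository idea (the User type and the data-access calls are placeholders; a Dictionary keyed by Id is used here instead of a HashSet so lookups are direct):

```csharp
using System.Collections.Generic;

public class UserRepository
{
    // Shared across requests, so access must be synchronized.
    private static readonly Dictionary<int, User> _cache = new Dictionary<int, User>();
    private static readonly object _lock = new object();

    public User GetById(int id)
    {
        lock (_lock)
        {
            User user;
            if (_cache.TryGetValue(id, out user))
                return user;                 // cache hit

            user = LoadFromDatabase(id);     // placeholder DB call
            _cache[id] = user;
            return user;
        }
    }

    public void Save(User user)
    {
        lock (_lock)
        {
            SaveToDatabase(user);            // placeholder DB call
            _cache[user.Id] = user;          // keep cache and DB in sync
        }
    }
}
```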
I would cache items as you use them: in your data layer, when you are getting your data back, you first check whether it is available in the cache; otherwise you go to the database and cache the result afterwards.
public AccessModel GetAccess(string accessCode)
{
    // Return the cached copy if we have one
    var cached = cache.Get<AccessModel>(accessCode);
    if (cached != null)
        return cached;

    // Otherwise load from the database and cache the result
    var access = GetFromDatabase(accessCode);
    cache.Add(accessCode, access);
    return access;
}
Then I would think about my cache invalidation strategy. You can go two ways:
One is to set the expiration to one hour, so you only hit the database once an hour.
The other is to invalidate the cache whenever you update the data. That is certainly the best option, but it is a bit more complex.
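The second strategy could be sketched like this; cache.Remove, SaveToDatabase, and the AccessCode property are assumptions on top of the hypothetical cache wrapper used in the example above:

```csharp
public void UpdateAccess(AccessModel access)
{
    SaveToDatabase(access);          // placeholder persistence call
    cache.Remove(access.AccessCode); // the next GetAccess() reloads fresh data
}
```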
Hope it helps.
Note: you can use either the ASP.NET Cache or another solution like memcached, depending on your infrastructure.
Is it hitting the database every page load that's the problem or is it joining six tables that's the problem?
If it's just that the join is slow, why not create a database table that summarizes the data in a way that is much easier and faster to query?
This way, you just have to update your summary table each time you add a user or update a permission. If you group all of this into a single transaction, you shouldn't have issues with out-of-sync data.
You can take advantage of ASP.NET Caching and the SqlCacheDependency class. There is an article on MSDN.
You can use the Cache object built into ASP.NET. Here is an article that explains how.
I suggest caching such data in the Application state object. For thread-safe usage, consider using the lock statement. Your code would look something like this:
// A shared static lock object: locking on HttpContext.Current would not
// synchronize across requests, since each request has its own context.
private static readonly object _tableCacheLock = new object();

public void ClearTableCache(string tableName)
{
    lock (_tableCacheLock)
    {
        System.Web.HttpContext.Current.Application[tableName] = null;
    }
}

public SomeDataType GetTableData(string tableName)
{
    lock (_tableCacheLock)
    {
        if (System.Web.HttpContext.Current.Application[tableName] == null)
        {
            // Get data from the DB, then put it into application state
            System.Web.HttpContext.Current.Application[tableName] = dataFromDb;
            return dataFromDb;
        }
        return (SomeDataType)System.Web.HttpContext.Current.Application[tableName];
    }
}
I have a list of products that I have stored in the ASP.NET cache, but I have a problem with refreshing the cache. Per our requirements I want to refresh the cache every 15 minutes, but I want to know: if a user asks for the list of products while the cache is being refreshed, will he get an error, the old list, or will he have to wait until the cache is refreshed?
the sample code is below
public class Product
{
public int Id{get;set;}
public string Name{get;set;}
}
We have a function in the BLL which gives us the list of Product:
public List<Product> Products()
{
//////some code
}
Cache.Insert("Products", Products(), null, DateTime.Now.AddMinutes(15), TimeSpan.Zero);
I want to add one more situation here: let's say I use a static object instead of the cache object. Then what will happen, and which approach is best if we are on a standalone server and not on a cluster?
Sorry - this might be naive/obvious, but just have a facade-type class which does:
if(Cache["Products"] == null)
{
Cache.Insert("Products", Products(), null, DateTime.Now.AddMinutes(15), TimeSpan.Zero);
}
return Cache["Products"];
As an alternative, there is also a CacheItemRemovedCallback delegate which you could use to repopulate an expired cache.
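A minimal sketch of the removal-callback approach, assuming the Products() method from the question:

```csharp
// Repopulate the cache as soon as the old entry is removed,
// re-registering this callback for the next cycle.
private void OnProductsRemoved(string key, object value, CacheItemRemovedReason reason)
{
    HttpRuntime.Cache.Insert("Products", Products(), null,
        DateTime.Now.AddMinutes(15), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, OnProductsRemoved);
}
```

The initial population would use the same Insert overload, with the callback passed as the last argument.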
ALSO
Use the cache object rather than static objects. It is apparently more efficient (Asp.net - Caching vs Static Variable for storing a Dictionary) and you get all the cache management methods (sliding expiration and so on).
EDIT
If there is a concern about update times then consider two cache objects plus a controller e.g.
Active Cache
Backup Cache - this is the one that will be updated
Cache controller (another cache object?) this will indicate which object is active
So the process to update will be
Update backup cache
The update completes; check it is valid
Backup becomes active, and vice versa; the controller now flags the backup cache as the active one
There needs to be a method which will fire when the products cache object expires. I would probably use the CacheItemRemovedCallback delegate to initiate the cache repopulation, or do an async call in the facade-type class - you wouldn't want it blocking the current thread.
I'm sure there are many other variants of this
EDIT 2
Actually thinking about this I would make the controller class something like this
public class CacheController
{
public StateEnum Cache1State {get;set;}
public StateEnum Cache2State {get;set;}
public bool IsUpdating {get;set;}
}
The states would be active, backup, updating, and perhaps inactive and error. You would set the IsUpdating flag while the update is occurring and then back to false once complete, to stop multiple threads trying to update at once - i.e. a race condition. The class is just a general principle and could/should be amended as required.
I have a situation where I need to retrieve data from a query which takes almost half a minute to execute and bring it to a web page. (There is no way to reduce this time, because the maximum amount of optimization has already been performed.)
I use a four layer architecture along with Entity Framework ( EF, Data Access Layer, Biz Logic Layer, UI) for my application.
I'm trying to use the singleton pattern when an instance of the DAL is created (the DAL in turn retrieves data from the database), so that I will be able to re-use this instance and additional instances won't be created within the same session.
How do I go about setting the session state and checking the availability of the instance in the State Server?
public static Singleton getInstance() {
    if (instance == null)
        instance = new Singleton();
    return instance;
}
What should reside within the if block? What condition should I check for in the if block? I'm really unsure as to what I must do.
PS: This session must have a timeout of 5 minutes. I hear this can be specified in the Web.config file. Is that true?
To be honest, you should rather use an Entity Framework context and create it every time you need access to the database, i.e. in each method. It is optimized to be used that way, and connection pooling will make sure there is no penalty in recreating the EF context each time. This is the best practice.
But your DAL might be more than just simple DB access. If you want to have it as a singleton separate for each session, you must create the instance on the first request, store it in the Session, and check whether it's there before using it. With thread safety, the code could look like this:
class DALClass
{
    private static object instanceLock = new object();

    public static DALClass Instance
    {
        get
        {
            // Session state is not directly available in a static member,
            // so go through HttpContext.Current
            var session = HttpContext.Current.Session;
            if (session["DALInstance"] == null)
            {
                lock (instanceLock)
                {
                    if (session["DALInstance"] == null)
                    {
                        session["DALInstance"] = new DALClass();
                    }
                }
            }
            return (DALClass)session["DALInstance"];
        }
    }
}
It sounds to me like you have a well-defined architecture which would suit dependency injection. Using DI you could just get your IoC container to return you a singleton object or a transient one. However, be very careful using singletons in a web environment, as they often cause more trouble than they are worth.
If the query you are running contains user-specific data, then I would place the results of that query into session state within the code which composes the UI part of your application; if you are using a pattern like MVC, that would be in the controller, or in the presenter for MVP.
If these patterns aren't in use, then you could consider placing the information into session state inside the business layer, but only if you wrap up the session and inject that dependency into your business object, e.g. something like "IUserSession". The business project should not contain a reference to "System.Web" or anything like that.