I'm trying to use the cache in an ASP.NET MVC web application to store some list data that rarely updates. I insert this data into the cache in an UpdateCache() method like this:
HttpContext.Current.Cache.Insert("MyApp-Products", products, null, DateTime.Now.AddYears(99), Cache.NoSlidingExpiration);
Then in my model I retrieve it:
public static List<string> GetProducts()
{
    var cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    if (cachedProducts == null)
    {
        UpdateCache();
        cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    }
    return (List<string>)cachedProducts;
}
The first time I visit the page, UpdateCache() is called as expected. If I refresh, the data comes from the cache and I do not need to call UpdateCache(). However, after maybe 15 mins, I come back to the app and the cached values are gone. My understanding was that this cache was per application, not session, so I would have expected it to still be there for myself or another user.
Is there something wrong with the way I'm storing the cache? Or is there something I'm not understanding about how the Cache in ASP.NET works in a web app?
My understanding was that this cache was per application, not session, so I would have expected it to still be there for myself or another user.
While the cache is per application, ASP.NET doesn't give you any guarantee that something you stored in the cache will still be there when you come back for it. The cache can evict items under various circumstances, for example when your server starts running low on memory. You can subscribe to an event to get notified when an item is evicted from the cache, and you can assign a priority when caching an item: the higher the priority, the lower the chance of the item being evicted.
Also, since the cache is stored in the memory of the web server (by default), you should not forget that your application domain can be recycled at any point by IIS, for example after a certain amount of inactivity, when memory runs low, or when a certain CPU usage threshold is reached. When that happens, everything stored in memory, including the cache, simply disappears into the void, and the next request to your application starts a new AppDomain.
But in any case, make sure you check whether the item is present in the cache before using it. Never rely on finding something there just because you stored it.
All this blabla to come to the really essential point, which is something very worrying in your code. You are storing a List<string> instance in the cache. But since the cache is per application, this single List<string> instance can be shared between multiple users of the application, and of course that can happen concurrently. As you know, List<T> is not a thread-safe structure, so with this code you will at best get an exception and at worst corrupt data. Be very careful about what you cache and how you synchronize access to the structure, especially if you are caching a non-thread-safe class.
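One way to sidestep the shared-mutable-list problem is to cache an immutable snapshot and swap the whole reference on update, so readers never observe a half-mutated list. This is only a sketch of that idea; the ProductSnapshot class and its members are invented names, not part of the original code:

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Threading;

public static class ProductSnapshot
{
    // Readers always see a complete, immutable list; the writer swaps the
    // whole reference atomically instead of mutating a shared List<string>.
    private static IReadOnlyList<string> _products = Array.Empty<string>();

    public static IReadOnlyList<string> Current => Volatile.Read(ref _products);

    public static void Replace(IEnumerable<string> fresh)
    {
        // Copy into a new read-only collection, then publish it in one write.
        var snapshot = new ReadOnlyCollection<string>(new List<string>(fresh));
        Volatile.Write(ref _products, snapshot);
    }
}
```

The same snapshot object can then be stored in HttpRuntime.Cache instead of a raw List<string>; even if two requests read it concurrently, neither can mutate it.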
If this is on full IIS and it happens roughly every 15 minutes, remember to check the application pool's Idle Time-out value.
That being said, if this list "never" changes, why not store it in a static array instead?
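A thread-safe variant of that static-storage suggestion is a Lazy<T> field, which guarantees the load runs exactly once even under concurrent first access. A minimal sketch; LoadFromDatabase is a hypothetical placeholder for the real query:

```csharp
using System;
using System.Collections.Generic;

public static class Products
{
    // Lazy<T> with the default (thread-safe) mode guarantees the factory
    // runs exactly once, even if many threads hit the first access together.
    private static readonly Lazy<IReadOnlyList<string>> _all =
        new Lazy<IReadOnlyList<string>>(LoadFromDatabase);

    public static IReadOnlyList<string> All => _all.Value;

    private static IReadOnlyList<string> LoadFromDatabase()
    {
        // Hypothetical stand-in for the real database query.
        return new[] { "Widget", "Gadget" };
    }
}
```

Unlike the ASP.NET cache, a static field is never evicted; it only goes away when the AppDomain itself is recycled.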
I can see how storing the data in a database is generally better than in a static object. For one, if the data does change, it is easier to update the database than to update the application.
Try explicitly setting absolute expiration when caching your object:
HttpRuntime.Cache.Insert("MyApp-Products", cachedProducts, null, DateTime.Now.AddDays(20), Cache.NoSlidingExpiration);
Note HttpRuntime is used instead of HttpContext for performance reasons even though the difference is minor.
Related
I have a web service which is called by some web service clients. This web service returns the current inventory list of an inventory. This list can be big, 10K+ product IDs, and it takes quite some time (~4 minutes) to refresh the list by reading data from the database. I don't want to refresh the list every time this web service is called, as it may consume too much resource on my database server, and the performance is never very good.
What I intend to do is give the inventory list a time-to-live value: when a client asks for the inventory list, if the data is not out of date I return it right away; if the data is obsolete I read it from the database, update the list and its time-to-live value, and then return the refreshed data to the client. As there may be several clients calling this web service, it looks like I need multi-thread synchronization (multiple-reader single-writer, the ReaderWriterLockSlim class?) to protect this inventory list. But I haven't found a good design that gives this web service good performance: only one client should refresh the data, the other clients shouldn't redo the work if the data is still within the time-to-live window, and the web service should return the result as soon as possible after the write thread completes the update.
I'm also considering another solution (also using the ReaderWriterLockSlim class): creating a separate thread to refresh the inventory list periodically (a write thread refreshes the data every 15 minutes), and letting all the web service clients read the data. This may work, but I don't really like it, as this solution still wastes resources on the web server. For example, if there are no client requests, the system still refreshes the inventory list every 15 minutes.
Please suggest some solution. Thanks.
I would suggest using a MemoryCache.
https://stackoverflow.com/a/22935978/34092 can be used to detect when the item expires. https://msdn.microsoft.com/en-us/library/7kxdx246.aspx is also worth a read.
At this point, the first line of the code you write (in CacheRemovedCallback) should write the value back to the MemoryCache - this will allow readers to keep reading it.
Then it should get the latest data, and then write that new data to the MemoryCache (again passing in a CacheItemPolicy to allow the callback to be called when the latest version is removed). And so on and so on...
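The write-it-back-then-refresh pattern described above might look roughly like this with System.Runtime.Caching. This is a sketch, not a drop-in implementation: InventoryCache is an invented name, and the loadFresh delegate stands in for the real (slow) database read:

```csharp
using System;
using System.Runtime.Caching;

public static class InventoryCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private static readonly TimeSpan Ttl = TimeSpan.FromMinutes(15);

    public static void Store(string key, object value, Func<object> loadFresh)
    {
        Cache.Set(key, value, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow + Ttl,
            // Fired when the entry expires or is evicted.
            RemovedCallback = args =>
            {
                // 1. Put the stale value straight back so readers never block.
                Store(args.CacheItem.Key, args.CacheItem.Value, loadFresh);
                // 2. Fetch fresh data and overwrite the stale entry
                //    (in real code this is the slow database read and would
                //    run on a background thread).
                Store(args.CacheItem.Key, loadFresh(), loadFresh);
            }
        });
    }

    public static object Get(string key) => Cache.Get(key);
}
```

Between steps 1 and 2 readers keep getting the stale list, which matches the "don't make other clients redo the work" requirement from the question.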
Do you only ever run one instance of your service? Then in-memory caching is enough for you. Use a ConcurrentDictionary if you don't want to implement the locking yourself.
If you run multiple instances of the service, it might be advisable to use an out-of-process cache like Redis.
Also, you could eventually maintain the cached list so that it is always in sync with what you have in the database.
There are many cache vendors; .NET, ASP.NET, and ASP.NET Core each have different solutions, and for distributed caching there are also many options. Just pick whatever fits best with the framework you use.
You can also use libraries like CacheManager to help you implement what you need more easily.
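For the single-instance case, the ConcurrentDictionary suggestion above is often combined with Lazy<T> so the expensive loader runs at most once per key even when many threads miss the cache at the same time. A minimal sketch with an invented SimpleCache class:

```csharp
using System;
using System.Collections.Concurrent;

public static class SimpleCache
{
    // GetOrAdd may invoke the value factory more than once under a race,
    // but only one Lazy<T> wins, and Lazy.Value runs the loader once.
    private static readonly ConcurrentDictionary<string, Lazy<object>> Entries =
        new ConcurrentDictionary<string, Lazy<object>>();

    public static object Get(string key, Func<object> load)
    {
        return Entries.GetOrAdd(key, _ => new Lazy<object>(load)).Value;
    }
}
```

This has no expiration at all; a real implementation would store a timestamp alongside each entry or use one of the cache libraries mentioned above.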
I want to build an MVC application that replaces the session object with the caching mechanism of .net framework 4.0.
The application has to store some values in some kind of cache to keep the database traffic and the loading times low (e.g. settings, dropdown values, etc.). The users authenticate themselves with a username and a password. The default FormsAuthentication is used here. My idea was to create a unique MemoryCache for each user and a general MemoryCache for the application.
My getter for the User MemoryCache looks like this
private MemoryCache _memCache
{
    get
    {
        if (HttpContext.Application[User.Identity.Name] == null)
        {
            HttpContext.Application[User.Identity.Name] = new MemoryCache(User.Identity.Name);
        }
        return HttpContext.Application[User.Identity.Name] as MemoryCache;
    }
}
If a user logs out the cache will be disposed.
This is the global MemoryCache for the application
private MemoryCache _appMemCache
{
    get
    {
        if (HttpContext.Application["ApplicationMemoryCache"] == null)
        {
            HttpContext.Application["ApplicationMemoryCache"] = new MemoryCache("ApplicationMemoryCache");
        }
        return HttpContext.Application["ApplicationMemoryCache"] as MemoryCache;
    }
}
The question is whether the Application object is suitable for storing the MemoryCache, and whether CacheItemPolicy works without problems in this setup.
My reasons for not using the session are the possible timeout and the blocking of parallel requests.
This is a bad idea for a number of reasons. First and foremost, MemoryCache is both volatile and process-bound. Some attempt will be made to keep session data around, even when it's stored in proc, but cache is considered completely disposable: if memory runs short, it will be discarded almost immediately. This also effectively kills any ability to scale your app out to multiple workers (processes), as each would have its own version of the cache, meaning whether the user's "session" was restored or not would depend on the luck of the draw of which process their next request landed on.
Your reasons for not just using session don't make any sense. Yes, sessions have timeouts, but these are somewhat fixed, assuming you're not using in proc. Even with in proc, the session will time out at a fixed interval, as long as something like an App Pool recycle does not occur. MemoryCache, on the other hand, could go away at any time. There are no guarantees about how long the data will persist, or whether it will persist at all. Just because you set it to stick around for a year doesn't mean it won't be discarded in five minutes, depending on what else is going on in the system.
As far as "blocking parallel requests" goes, I'm not sure what you're talking about. The session is thread-safe and theoretically should allow parallel requests. There are always going to be concurrency issues with any asynchronous process like writing to a session store, but those same issues would exist with MemoryCache as well.
All that said, the things you're talking about saving don't belong in a session anyway. While your reasons for using something like MemoryCache are invalid, you should indeed be using a cache to store this type of stuff. In other words, just use the cache; don't try to somehow replace the session with it.
Also, as I mentioned previously, there are serious issues with MemoryCache in multi-process scenarios. If you're building an app that will ever serve more than a handful of users, you should avoid MemoryCache entirely and use something like Redis instead. Redis is used by Microsoft Azure, so it has good support from Microsoft and integration into ASP.NET, but you could choose any NoSQL solution.
I'm a .NET Web API developer and I want to know if I'm doing this correctly.
I'm saving changeable objects into the cache.
Other developers on my team said only static data should be stored in the cache.
So I wanted to know whether only static data should be stored in the cache, or whether there is another right way to do it.
Thanks.
I use caching for changeable objects because they take a reasonable amount of time to generate, although how often they change varies.
There are a couple of things which I do to try and make sure the data is always valid.
On the cached item I put a policy which keeps the item in cache for, say, 15 minutes, with a sliding expiration time. This keeps frequently used items in the cache while less used items drop out.
I also have cache-eviction endpoints on the API; the process which updates the data in the database calls an endpoint once it has completed. The updated items are then removed from the cache and rebuilt the next time they are requested.
In the end I think it all boils down to how long it takes to get the object you are trying to return, and whether the delay to generate it is acceptable.
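The two mechanisms described above, sliding expiration plus an eviction hook, might look roughly like this with System.Runtime.Caching. The ReportCache name and the 15-minute window are illustrative only:

```csharp
using System;
using System.Runtime.Caching;

public static class ReportCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Sliding expiration: each read renews the 15-minute window, so hot
    // items stay cached while idle ones fall out on their own.
    public static object GetOrLoad(string key, Func<object> build)
    {
        var value = Cache.Get(key);
        if (value == null)
        {
            value = build();
            Cache.Set(key, value, new CacheItemPolicy
            {
                SlidingExpiration = TimeSpan.FromMinutes(15)
            });
        }
        return value;
    }

    // Called by the cache-eviction endpoint after the database is updated.
    public static void Evict(string key) => Cache.Remove(key);
}
```

The eviction endpoint itself is just a controller action that calls Evict with the affected keys, so stale data lives at most until the updater notifies the API.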
I've been tasked with implementing server-side cache on a web application to improve performance of the front end. The objective is to cache a large dataset resulting from an MDX query on an SSAS cube. The cache is to expire at midnight, which is when the cube is processed each day. I've decided to use the .NET Cache API, and so I have this:
Cache.Insert("MDXResult", myMDXDataSetThatExpiresAtMidnight, null,
DateTime.Now.AddMinutes(getMinutesUntilMidnight()), TimeSpan.Zero);
What troubles me is something I read on the MSDN page on ASP.NET Caching: Techniques and Best Practices:
The simplest way to store data in the Cache is simply to assign it, using a key, just like a HashTable or Dictionary object:
Cache["key"] = "value";
This will store the item in the cache without any dependencies, so it will not expire unless the cache engine removes it in order to make room for additional cached data.
The last bit -- the fact that the cache engine removes cached data in order to make room for additional cached data -- does it apply to only the case above where the item is stored in cache without any dependencies? How can I be sure that the cache of myMDXDataSetThatExpiresAtMidnight will not be cleared by the cache engine before its expiry time?
Alternatively, is there any way to control the amount of space allocated for server-side cache, such as a setting in web.config or similar, to ensure that cache isn't inadvertently cleared?
All entries, including those with dependencies, can be removed at any time. The dependencies just help the cache engine calculate when items have expired.
You cannot force your item to stay in the cache. The cache may remove it for known, unknown, and other causes, usually expiration (time- or dependency-based) or memory pressure. You can, however, use the Cache.Add overload that accepts a CacheItemRemovedCallback onRemoveCallback, which may be a function that calculates a new item (or knows about the old one) and adds it again. I'm not sure about the exact timings, but I'd guess there is a brief window, while your callback executes and before it has added the new item, during which the item is not in the cache.
You can configure the caching using the CacheSection in web.config.
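For reference, the relevant knobs live under system.web/caching in web.config; the values below are purely illustrative:

```xml
<system.web>
  <caching>
    <cache privateBytesLimit="209715200"
           percentagePhysicalMemoryUsedLimit="60"
           privateBytesPollTime="00:02:00" />
  </caching>
</system.web>
```

Raising these limits reduces the chance of memory-pressure eviction, but it never turns the cache into guaranteed storage.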
I'm having some trouble getting my cache to work the way I want.
The problem:
The process of retrieving the requested data is very time consuming. If using standard ASP.NET caching some users will take the "hit" of retrieving the data. This is not acceptable.
The solution?:
It is not super important that the data is 100% current. I would like to serve old invalidated data while updating the cached data in another thread making the new data available for future requests. I reckon that the data needs to be persisted in some way in order to be able to serve the first user after application restart without that user taking the "hit".
I've made a solution which does somewhat of the above, but I'm wondering if there is a "best practice" way, or if there is a caching framework out there that already supports this behaviour.
There are tools that do this, for example Microsoft's ISA Server (which may be a bit expensive / overkill).
You can cache it in memory using Enterprise Library Caching. Let your users read from the cache, and have other pages that update the cache; these other pages should be called as regularly as you need to keep the data up to date.
You could listen for when the cached item is removed and process it then:
public void RemovedCallback(string key, object value, CacheItemRemovedReason reason)
{
    // Put the item back in the cache (so others can use it until
    // you have finished grabbing the new data).
    // Spawn a thread to go get up-to-date data.
    // Overwrite the old data with the new result.
}
In Global.asax:
protected void Application_Start(object sender, EventArgs e)
{
    // Spawn a worker thread to pre-load critical data.
}
Ohh... I have no idea if this is best practice, I just thought it would be slick~
Good Luck~
I created my own solution with a Dictionary/Hashtable in memory as a duplicate of the actual cache. When a method call came in requesting the object from the cache and it wasn't there but was present in memory, the memory-stored object was returned, and a new thread was fired to update the object in both memory and the cache using a delegate method.
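That serve-stale-while-refreshing idea can be sketched without the ASP.NET cache at all: exactly one caller triggers a background refresh while everyone, including that caller, keeps reading the stale value. The class name and members below are invented for illustration:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public sealed class StaleWhileRevalidate<T> where T : class
{
    private readonly Func<T> _load;
    private readonly TimeSpan _ttl;
    private T _value;
    private DateTime _freshUntilUtc;
    private int _updating; // 0 = idle, 1 = a refresh is in flight

    public StaleWhileRevalidate(Func<T> load, TimeSpan ttl)
    {
        _load = load;
        _ttl = ttl;
        _value = load(); // only the very first caller pays the full cost
        _freshUntilUtc = DateTime.UtcNow + ttl;
    }

    public T Get()
    {
        if (DateTime.UtcNow >= _freshUntilUtc &&
            Interlocked.CompareExchange(ref _updating, 1, 0) == 0)
        {
            // Exactly one caller wins the flag and schedules the refresh;
            // everyone keeps reading the stale value in the meantime.
            Task.Run(() =>
            {
                try
                {
                    Volatile.Write(ref _value, _load());
                    _freshUntilUtc = DateTime.UtcNow + _ttl;
                }
                finally
                {
                    Interlocked.Exchange(ref _updating, 0);
                }
            });
        }
        return Volatile.Read(ref _value);
    }
}
```

To also survive application restarts without a first-user hit, as the question asks, the loaded value would additionally need to be persisted (database table, disk file), which this in-memory sketch does not cover.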
You can do this pretty easily with the Cache and Timer classes built into .NET. The Timer runs on a separate thread.
And I actually wrote a very small wrapper library called WebCacheHelper which exposes this functionality in an overloaded constructor. The library also serves as a strongly typed wrapper around the Cache object.
Here's an example of how you could do this...
public readonly static WebCacheHelper.Cache<int> RegisteredUsersCount =
new WebCacheHelper.Cache<int>(new TimeSpan(0, 5, 0), () => GetRegisteredUsersCount());
This has a lazy loading aspect to it where GetRegisteredUsersCount() will be executed on the calling thread the instant that RegisteredUsersCount is first accessed. However, after that it's executed every 5 minutes on a background thread. This means that the only user who will be penalized with a slow wait time will be the very first user.
Then getting the value is as simple as referencing RegisteredUsersCount.Value.
Yeah you could just cache the most frequently accessed data when your app starts but that still means the first user to trigger that would "take the hit" as you say (assuming inproc cache of course).
What I do in this situation is use a CacheTable in the database to cache the latest data, and run a background job (via a Windows service; in a shared environment you can also use threads) that refreshes the data in the table.
There is very little possibility of showing the user a blank screen. I eliminate it by also caching via the ASP.NET cache for 1 minute.
I don't know if it's a bad design, but it's been working great without a problem on a heavily used web site.