When is using Cache too much? - c#

I am really struggling to find the best solution. What is really confusing me is this: as long as the .NET Framework purges low-priority cached items, why should I worry about memory at all (I know it is a dumb question)?
Can I really constrain ASP.NET caching to use a certain amount of RAM? Not through IIS, because my site is on shared hosting. And how does that affect caching?
Would static classes be a better fit for some situations?

Sadly, the memoryLimit attribute on the processModel configuration element is only allowed to be set at the machine level (in machine.config) - and the way config files work, there's no way to say "this is the maximum memory any one app can have, but they can ask for less if they want".
If you're finding that things are constantly spooling out of your cache, should they really be in there? You're either not reading them often enough to bother caching them in the first place, or you're trying to put too much stuff into the cache at once.
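For what it's worth, the cache will tell you which limits are actually in effect; a minimal sketch (assuming .NET 3.5 SP1 or later, where these properties were added) that you could surface on a diagnostics page even on shared hosting:

    using System;
    using System.Web;

    public static class CacheLimitInfo
    {
        // Reports the cache limits the runtime has actually applied to this app.
        public static string Describe()
        {
            var cache = HttpRuntime.Cache;
            return string.Format(
                "Effective private bytes limit: {0:N0} bytes; effective physical memory limit: {1}%",
                cache.EffectivePrivateBytesLimit,
                cache.EffectivePercentagePhysicalMemoryLimit);
        }
    }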
Cache spooling can be observed with the built-in performance counters. Under ASP.NET Applications and ASP.NET Apps v2.0.50727 you can find things like:
Cache API Trims
Cache API Turnover Rate
Cache Total Trims
Cache Total Turnover Rate
More details can be found on TechNet.
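If you'd rather watch those counters from code than from perfmon, something like the sketch below works; the category and counter names are the ones listed above, and enumerating the instance names saves you guessing the app-specific instance string:

    using System;
    using System.Diagnostics;

    class CacheCounterDump
    {
        static void Main()
        {
            // Same counters perfmon shows under "ASP.NET Applications".
            var category = new PerformanceCounterCategory("ASP.NET Applications");
            foreach (string instance in category.GetInstanceNames())
            {
                using (var trims = new PerformanceCounter(
                    "ASP.NET Applications", "Cache API Trims", instance, readOnly: true))
                {
                    Console.WriteLine("{0}: {1} Cache API Trims", instance, trims.NextValue());
                }
            }
        }
    }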

Related

Clear MemoryCache across worker processes

I have an ASP.NET MVC application that runs on IIS 7. It is set up as a web garden and the number of worker processes matches the number of my processors. I tend to experience some heavy load at times, and this setup has worked best.
I have implemented some caching using System.Web.Cache. I will occasionally need to invalidate some of the items in my cache, but I cannot clear the cache across all worker processes.
Do the .NET 4 System.Runtime.Caching features make this any easier? Here is a similar question, but I am hoping there is better advice with .NET 4.
Flush HttpRuntime.Cache objects across all worker processes on IIS webserver
System.Web.Cache and System.Runtime.Caching provide almost the same features: a simple in-memory cache where items can have an expiration time, dependencies and so on...
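To illustrate how similar the two APIs are, here is a rough side-by-side sketch; the "products" key and the value are placeholders, and the ten-minute expiration is arbitrary:

    using System;
    using System.Runtime.Caching;
    using System.Web;
    using System.Web.Caching;

    public static class CacheComparison
    {
        public static void Store(object products)
        {
            // System.Web.Caching: the classic HttpRuntime.Cache
            HttpRuntime.Cache.Insert("products", products, null,
                DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);

            // System.Runtime.Caching: MemoryCache, usable outside ASP.NET too
            MemoryCache.Default.Set("products", products,
                new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(10) });
        }
    }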
If you want to run your site on multiple physical machines, or in your case as a web garden, caching data in any in-process cache doesn't make a lot of sense, because each process would cache the data again. That lets memory consumption grow pretty quickly, I'd guess...
In those scenarios a distributed cache system is the best choice, because all processes can leverage the already cached data...
I have worked with two pretty popular distributed in-memory cache systems; one is memcached, which was also mentioned in your link.
The other one is the AppFabric cache; here is a good example of how to use it.
Memcached is a "simple" cache; it doesn't concern itself with security and all that in the first place. But it is very easy to set up, and there are good .NET clients which are really simple to use, almost exactly like the built-in .NET cache.
If you want to encrypt the data transfers of your cache and have all of this properly secured, you might want to go with AppFabric...
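As a rough sketch of how such a .NET client looks (assuming the Enyim.Caching memcached client and a memcached instance on localhost; the Article type and the key are placeholders):

    using System;
    using Enyim.Caching;
    using Enyim.Caching.Configuration;
    using Enyim.Caching.Memcached;

    [Serializable]
    public class Article { public int Id; public string Name; }

    class MemcachedExample
    {
        static void Main()
        {
            var config = new MemcachedClientConfiguration();
            config.AddServer("127.0.0.1", 11211); // default memcached port

            using (var client = new MemcachedClient(config))
            {
                // Store with a 10-minute lifetime, then read it back.
                client.Store(StoreMode.Set, "article:1001",
                    new Article { Id = 1001, Name = "Widget" }, TimeSpan.FromMinutes(10));
                var article = client.Get<Article>("article:1001");
                Console.WriteLine(article != null ? article.Name : "(miss)");
            }
        }
    }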

App Domain Refresh Solution: AppFabric, how is the performance

We are currently facing problems due to a high number of cached objects. We cache data from an ERP system (for an online shop), and IIS recycles the application as it reaches the maximum amount of memory, so we lose all cached objects. As this makes the whole idea of caching a bit problematic, we are searching for a different way to cache the objects.
I have found AppFabric from Microsoft, which is already included in our Windows Server licenses, to be a pretty neat solution.
However, I still fear that we will have enormous performance problems when using AppFabric (Velocity) instead of the MemoryCache class (our current caching solution).
So my question is: is this a solution to our problem, or am I over-thinking here and is the performance of AppFabric fast enough?
Grid Dynamics did a great report on using AppFabric here. While I don't know the numbers for your specific cache operations, the report showed great performance numbers for AppFabric. In one test they looked at how the size of the cache affected cache operation performance: when just reading data there was little to no impact, and when updating there was some impact, but not a ridiculous amount. When testing object size, larger objects unsurprisingly lowered throughput. Overall, the report has solid tests and statistics showing that the performance of AppFabric Cache is excellent.
No, Grid Dynamics does not compare the results to other products, but they do show you what the performance of AppFabric Cache is like in different tests. They have a particularly useful Appendix section that can provide details to help people in different usage scenarios.
As always, using a solution that is not on the same machine as the IIS instance will add a little bit of time to the fetching of session data from the cache, but we are talking a small amount of time.
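For reference, talking to an AppFabric cache from code is not much more involved than using MemoryCache; a minimal sketch, assuming the Microsoft.ApplicationServer.Caching client assemblies are referenced, a dataCacheClient section is configured, and a cache named "erpCache" exists on the cluster (all of these are assumptions to adapt to your setup):

    using System;
    using Microsoft.ApplicationServer.Caching;

    [Serializable]
    public class ErpArticle { public int Id; public string Name; public decimal Price; }

    class AppFabricExample
    {
        static void Main()
        {
            // Picks up the cache hosts from the dataCacheClient configuration section.
            var factory = new DataCacheFactory();
            DataCache cache = factory.GetCache("erpCache");

            cache.Put("article:1001", new ErpArticle { Id = 1001, Name = "Widget", Price = 9.99m });
            var article = (ErpArticle)cache.Get("article:1001");
            Console.WriteLine("{0}: {1:C}", article.Name, article.Price);
        }
    }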
If I am understanding your situation correctly, there are object caching solutions available that let you cache objects in memory and expire them according to your application logic or when the cache starts filling up.
AppFabric is not a very mature product in this regard, especially when talking about an "in-proc" cache. You'd need a client cache, which really is a subset of the distributed cache (that is, of all the cached objects) that resides in-process and is kept synchronized with the distributed cache.
One solution I'd recommend is to use NCache as a distributed cache and use its client caching feature for your ERP objects.

50GB HttpRuntime.Cache Persistence Possible?

We have an ASP.NET 4.0 application that draws from a database a complex data structure that takes over 12 hours to push into an in-memory data structure (which is then stored in HttpRuntime.Cache). The size of the data structure is increasing quickly, and we can't keep waiting 12+ hours to get it into memory whenever the application restarts. This is a major issue if you want to change the web.config or any code in the web application that causes a restart: it means a long wait before the application can be used, and it hinders development and deployment updates.
The data structure MUST be in memory to work at a speed that makes the website usable. In-memory databases such as memcached or Redis are slow in comparison to HttpRuntime.Cache and would not work in our situation (in-memory databases have to serialize on put/get; their objects can't reference each other, since keys are lookups, which degrades performance; and with a large number of keys performance drops quickly). Performance is a must here.
What we would like to do is quickly dump the HttpRuntime.Cache to disk before the application ends (on a restart), and be able to load it back immediately when the application starts again (hopefully within minutes instead of 12+ hours or days).
The in-memory structure is around 50GB.
Is there a solution to this?
In-memory databases such as memcached or Redis are slow in comparison to HttpRuntime.Cache
Yes, but they are very fast compared to a 12+ hour spin-up. Personally, I think you're taking the wrong approach here in forcing the load of a 50 GB structure. Just a suggestion, but we use HttpRuntime.Cache as part of a multi-tier caching strategy:
the local cache is checked first
otherwise redis is used as the next tier of cache (which is faster than the underlying data, persistent, and supports a number of app servers), and then the local cache is updated
otherwise the underlying database is hit (and then both redis and the local cache are updated)
The point being, at load we don't require anything in memory - it is filled as it is needed, and from then on it is fast. We also use pub/sub (again courtesy of redis) to ensure cache invalidation is prompt. The net result: it is fast enough when cold, and very fast when warm.
Basically, I would look at anything that avoids needing the 50GB data before you can do anything.
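A rough sketch of that tiered lookup, assuming the StackExchange.Redis client and MemoryCache for the local tier; the key, the expirations and the LoadFromDatabase helper are placeholders:

    using System;
    using System.Runtime.Caching;
    using StackExchange.Redis;

    public class TieredCache
    {
        static readonly MemoryCache Local = MemoryCache.Default;
        static readonly ConnectionMultiplexer Redis = ConnectionMultiplexer.Connect("localhost");

        public static string GetValue(string key)
        {
            // Tier 1: in-process cache (fastest, per app domain).
            var hit = Local.Get(key) as string;
            if (hit != null) return hit;

            // Tier 2: redis (shared across servers, survives app restarts).
            IDatabase db = Redis.GetDatabase();
            string value = db.StringGet(key);
            if (value == null)
            {
                // Tier 3: the underlying database (placeholder).
                value = LoadFromDatabase(key);
                db.StringSet(key, value, TimeSpan.FromMinutes(30));
            }

            Local.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(5));
            return value;
        }

        static string LoadFromDatabase(string key) { return "value-for-" + key; }
    }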
If this data isn't really cache, but is your data, I would look at serialization on a proper object model. I would suggest protobuf-net (I'm biased as the author) as a strong candidate here - very fast and very small output.
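A minimal sketch of dumping and reloading an object graph with protobuf-net; the Product type, the member numbering and the snapshot path are all placeholders, but the Serialize/Deserialize calls are the core of it:

    using System.Collections.Generic;
    using System.IO;
    using ProtoBuf;

    [ProtoContract]
    public class Product
    {
        [ProtoMember(1)] public int Id { get; set; }
        [ProtoMember(2)] public string Name { get; set; }
    }

    public static class ProductSnapshot
    {
        // Write the whole list to disk so a restart can reload it in minutes, not hours.
        public static void Save(List<Product> products, string path)
        {
            using (var file = File.Create(path))
                Serializer.Serialize(file, products);
        }

        public static List<Product> Load(string path)
        {
            using (var file = File.OpenRead(path))
                return Serializer.Deserialize<List<Product>>(file);
        }
    }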

How can I exceed the 60% memory limit of IIS7 in an ASP.NET caching application?

Pardon if this is more serverfault vs. stackoverflow. It seems to be on the border.
We have an application that caches a large amount of product data for an e-commerce application using ASP.NET caching. This is a dictionary object with 65K elements, and our calculations put the object's size at ~10GB.
Problem:
The amount of memory the object consumes seems to be far in excess of our 10GB calculation.
BIGGEST CONCERN: We can't seem to use over 60% of the 32GB in the server.
What we've tried so far:
In machine.config, under system.web:
    <processModel autoConfig="true" memoryLimit="80" />
In web.config, under system.web/caching:
    <cache privateBytesLimit="20000000000" percentagePhysicalMemoryUsedLimit="90" />
(we also tried the default, privateBytesLimit="0")
Environment:
Windows 2008R2 x64
32GB RAM
IIS7
Nothing seems to allow us to exceed the 60% value.
See screenshot of taskman.
http://www.freeimagehosting.net/image.php?7a42144e03.jpg
A little late, but I'm having nearly the same issue. The problem with the memoryLimit setting on processModel is that it just seems to have no effect, despite being documented.
percentagePhysicalMemoryUsedLimit similarly looks like it should do something, but has no effect.
privateBytesLimit="20000000000" does work, though. I debugged the process, found the CacheMemorySizePressure object, and saw that it successfully picked up the value and stored it in _memoryLimit. I would double-check that.
Another option is setting a Private Memory Usage recycle threshold on the IIS app pool. That should also get picked up and override the default 60% limit.
A third option is using the newer MemoryCache class and setting its physical memory limit (the physicalMemoryLimitPercentage setting).
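A sketch of that third option; the limit values are just illustrations, and the settings are passed through the NameValueCollection constructor overload (they can also live in the memoryCache config section):

    using System.Collections.Specialized;
    using System.Runtime.Caching;

    class ProductCacheFactory
    {
        public static MemoryCache Create()
        {
            var settings = new NameValueCollection
            {
                { "cacheMemoryLimitMegabytes", "20480" },   // hard cap for this cache, in MB
                { "physicalMemoryLimitPercentage", "90" },  // % of total RAM the cache may use
                { "pollingInterval", "00:02:00" }           // how often the limits are checked
            };
            return new MemoryCache("productCache", settings);
        }
    }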
Have you considered a different caching strategy? The built-in caching is not all that feature-rich and you will struggle to get it to do much more (unless some IIS guru has a clever workaround).
We spent a lot of time working on this and gave up. We actually use slimmer objects to store in the cache and get the fuller objects as needed.
When we did need to contemplate this we investigated Memcached and Velocity, but retreated from deploying them just yet. They are more feature-rich, though.
Also, how are you storing the items in the cache in code? Are you putting them in there at application start, or after the first request for each? The reason I ask is to check whether your cache keys are effective, and that you aren't populating the cache over and over without ever retrieving anything (this may be the case for just one object type). We managed to do exactly that once, by appending the time to a date-specific cache key, for example, as illustrated below.
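A hypothetical illustration of that pitfall; the key names and the LoadProducts helper are made up:

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCacheLookup
    {
        public static List<string> GetProducts()
        {
            // Bad: embeds the current time, so the key is never found again and
            // every request repopulates the cache without ever getting a hit.
            // string key = "products_" + DateTime.Now.ToString("yyyyMMddHHmmss");

            // Good: a stable key that later requests can actually hit.
            const string key = "products_v1";

            var products = HttpRuntime.Cache[key] as List<string>;
            if (products == null)
            {
                products = LoadProducts(); // placeholder for the real data load
                HttpRuntime.Cache.Insert(key, products, null,
                    DateTime.UtcNow.AddMinutes(30), Cache.NoSlidingExpiration);
            }
            return products;
        }

        static List<string> LoadProducts() { return new List<string> { "widget", "gadget" }; }
    }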

What are all the disadvantages of using the Cache in ASP.NET?

As I said above, I want to know what all the disadvantages of using the cache are. Is it good to use caching in a website?
I don't see any disadvantages to using the cache.
The only disadvantages, if you can call them that, come from incorrect usage.
There are several potential problems when using the cache, though:
You will experience increased memory usage if you store objects in memory instead of a database
You may end up storing objects in the cache that you don't want there (stale objects or dynamic data, for instance)
You may cache too much, causing your application's performance to degrade because the cache eats all the server's resources
You may cache too little, ending up with increased complexity and no performance gain
You may cache the wrong data
And so on. Caching is hard, but used correctly it is a Good Thing.
Caching is a good thing. It will help your site run faster and avoid downloading the same content over and over again. Of course, you should avoid caching dynamically generated pages.
Another issue is the caching of images and similar resources. If you do cache them, it can be tricky to get updates out to users when the need arises. You should always choose the caching times carefully, as a compromise between faster loading and how quickly updates propagate.
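As a sketch of that compromise for static-ish resources, here is a hypothetical image handler that lets browsers and proxies cache the file for one hour (the path and the lifetime are placeholders):

    using System;
    using System.Web;

    public class BannerImageHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "image/png";
            // One hour of client/proxy caching: repeat loads are fast, but an
            // updated image still reaches users within an hour at worst.
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
            context.Response.Cache.SetMaxAge(TimeSpan.FromHours(1));
            context.Response.WriteFile(context.Server.MapPath("~/images/banner.png"));
        }
    }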
