MemoryCache absoluteExpiration and memory limit - c#

I am coding an MVC 5 internet application and am using the MemoryCache object for caching objects. I see that with the MemoryCache.Set method, an absoluteExpiration can be specified.
If I use the following way to add and retrieve an object from the MemoryCache, what is the absoluteExpiration set to?
cache["cacheItem"] = testObject;
TestObject testObject = cache["cacheItem"] as TestObject;
Also, when using the MemoryCache in an MVC internet application, should I set the amount of memory that can be used for the MemoryCache, or is the default implementation safe enough for an Azure website?
Thanks in advance.

Your code is equivalent to calling Add, like below:
cache.Add("cacheItem", testObject, null);
The added entry would have the default expiration time, which is infinite (i.e., it doesn't expire). See the MSDN on CacheItemPolicy.AbsoluteExpiration for details.
To answer the question about memory usage (from the CacheMemoryLimitMegabytes property documentation):
The default is zero, which indicates that MemoryCache instances manage their own memory based on the amount of memory that is installed on the computer.
I would say that it's safe to let the MemoryCache defaults decide how much memory to use, unless you're doing something really fancy.
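If you do want entries to expire, an absolute expiration can be passed explicitly via CacheItemPolicy. A minimal sketch (the key names and the five-minute TTL are illustrative):

```csharp
using System;
using System.Runtime.Caching;

class Program
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;

        // Explicit absolute expiration: the entry is evicted about 5 minutes from now.
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(5)
        };
        cache.Set("cacheItem", "some value", policy);

        // Indexer-style assignment, by contrast, uses the default policy:
        // InfiniteAbsoluteExpiration, so the entry only leaves the cache under
        // memory pressure or explicit removal.
        cache["otherItem"] = "another value";

        Console.WriteLine(cache.Get("cacheItem")); // prints "some value"
    }
}
```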


Is using a global cache a bad idea for passing complex objects on navigation? If so, any alternative?

I have been using parameter passing with the page Navigate method in a UWP app without a problem (with object serialization and deserialization). As my objects (passed as parameters) grew in size, I started hitting a problem when the app is suspended and the SuspensionManager attempts to serialize the navigation parameter and save it to local storage. I get an exception indicating a size limit of 8K, I think (and I assume I have no control over this size).
So I am considering passing the parameter via a memory cache rather than a navigation parameter (say, saving my complex data object to an in-memory dictionary with nameof(PageNavigatedToType) as the key and retrieving the cached data in OnNavigatedTo in the destination page). My concern is a possible memory usage increase; I am not sure, for instance, whether setting a particular dictionary value to null (when it is no longer needed) makes much of a difference to global memory (I mean app scope).
Any thoughts and suggestions appreciated.
You can't use LocalSettings to store a value larger than 8K, and each composite setting can't be larger than 64K bytes in size.
ApplicationData.LocalSettings | localSettings property
If you need too much data to restore when the SuspensionManager handles app restoration, a better idea may be to save just a key as the value and then restore the full object from another place or procedure, as you have suggested:
"I am considering passing the parameter via memory cache rather than navigation parameter (say, save my complex data object to a dictionary in memory with nameof(PageNavigatedToType) as key and retrieve the cached data on NavigatedTo in the destination page."
Your concern:
"My concern is a possible memory usage increase and not sure for instance if setting a particular dictionary value to null (when no more needed) makes that much of a difference to global memory (I mean app scope)."
is a legitimate one, but you can keep an object's scope short: once the GC detects memory pressure, it will free objects according to their scope in your code.
Freeing a dictionary value by setting it to null can be enough for the object to be recycled, if and when there are no more references to it.
A short object scope can be achieved in many ways, because this is a concept rather than a single feature.
Just declaring local method variables instead of class variables/properties is one of them.
When you're using IDisposable objects, the using statement ensures the correct use of the IDisposable object and allows it to be garbage collected:
using (Font font1 = new Font("Arial", 10.0f))
{
    byte charset = font1.GdiCharSet;
}
C# also lets you limit a variable's scope inside a method with braces. This is a syntactic helper to control in which scope your variable can be used, and because of that you can opt to release those references right after the closing brace. You can use this even when the variable isn't an IDisposable.
public void MyMethod()
{
    // ...
    {
        var o = new MyObject();
        var otherReferenceToSameObject = o;
        var s = "my string";
        // ...
        otherReferenceToSameObject = o = null;
        s = null;
    }
    // ...
}
Anyway, remember that the GC runs according to its own algorithms; we don't have much control over when it collects, but keeping control over our references helps the GC do a better job.
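The dictionary-based hand-off the question describes could be sketched like this (the class and method names are illustrative, not a real UWP API):

```csharp
using System.Collections.Generic;

// A minimal app-scoped hand-off cache: the navigating page deposits the
// object, the destination page retrieves it and removes the entry so the
// reference does not outlive the navigation.
static class NavigationCache
{
    private static readonly Dictionary<string, object> store =
        new Dictionary<string, object>();

    public static void Put(string key, object value) => store[key] = value;

    public static bool TryTake<T>(string key, out T value)
    {
        if (store.TryGetValue(key, out var raw) && raw is T typed)
        {
            store.Remove(key); // drop the reference so the GC can reclaim it
            value = typed;
            return true;
        }
        value = default;
        return false;
    }
}
```

Removing the entry in TryTake keeps the cache from pinning large objects for the lifetime of the app, which addresses the memory concern above.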

C# Cache expire from config

I have a few applications that use the same code.
I want some of the applications to save the cache with different expiration times.
Is there a way that I can call insert on the cache and have the expiration time come from config?
I would call this:
HttpContext.Current.Cache.Insert
and the value would come automatically from config.
I know I can use appSettings, but is there a better way?
<add key="timeTolive" value="5"/>
If I am understanding your question correctly, then in Application_Start within Global.asax you could use this:
Context.Cache.Insert("tome", ConfigurationManager.AppSettings["tome"], null, Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, CacheItemPriority.NotRemovable, null);
It, at least partially, depends on the cache you are using. For example, the MemoryCache object will allow you to set the defaults on how long an item has to live. This is not automagic, but you can set it up rather easily.
If this is not an option for you, you can explicitly set the cache timeout yourself when you put an object in the cache. Encapsulate this in its own class for reuse and you are rolling.
And, you can mix the two concepts, if you want a default time, but occasionally an object that spends more or less time in cache.
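A sketch of that encapsulation idea, reading the TTL from the appSettings key shown in the question (the ConfigCache class name and the 5-minute fallback are illustrative assumptions):

```csharp
using System;
using System.Configuration;
using System.Web;
using System.Web.Caching;

// Wraps Cache.Insert so every caller picks up the expiration from config.
static class ConfigCache
{
    // Reads <add key="timeTolive" value="5"/> (minutes); falls back to 5
    // when the key is missing or unparsable.
    public static TimeSpan TimeToLive
    {
        get
        {
            var raw = ConfigurationManager.AppSettings["timeTolive"];
            return TimeSpan.FromMinutes(int.TryParse(raw, out var minutes) ? minutes : 5);
        }
    }

    public static void Insert(string key, object value)
    {
        HttpContext.Current.Cache.Insert(
            key, value, null,
            DateTime.UtcNow.Add(TimeToLive),  // absolute expiration from config
            Cache.NoSlidingExpiration);
    }
}
```

Each application then gets its own expiration simply by shipping a different value in its own config file.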

In-process caching library in c# other then MS ones?

Does anyone know a library for in-process caching, other than the MS ASP.NET cache and EntLib, with at least two features:
- expiration time;
- object dependency.
I implemented a thread-safe pseudo LRU for in-memory caching. It's simpler and faster than using MemoryCache; performance is very close to ConcurrentDictionary (about 10x faster than MemoryCache, with zero memory allocations for hits).
Usage of the LRU looks like this (just like dictionary but you need to give capacity - it's a bounded cache):
int capacity = 666;
var timeToLive = TimeSpan.FromMinutes(5);
var lru = new ConcurrentTLru<int, SomeItem>(capacity, timeToLive);
var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
GitHub: https://github.com/bitfaster/BitFaster.Caching
Install-Package BitFaster.Caching
My advice would be to cache object graphs (e.g. classes with properties that reference other classes) in an LRU, so that you can evict things at well-defined nodes in the dependency tree by simply updating the object. You will naturally end up with something that is easier to understand and doesn't have dependency cycles.
There are a few cache providers over at Codeplex, SharedCache seems promising: http://sharedcache.codeplex.com/.

C#: How to implement a smart cache

I have some places where implementing some sort of cache might be useful. For example in cases of doing resource lookups based on custom strings, finding names of properties using reflection, or to have only one PropertyChangedEventArgs per property name.
A simple example of the last one:
public static class Cache
{
    private static Dictionary<string, PropertyChangedEventArgs> cache;

    static Cache()
    {
        cache = new Dictionary<string, PropertyChangedEventArgs>();
    }

    public static PropertyChangedEventArgs GetPropertyChangedEventArgs(
        string propertyName)
    {
        if (cache.ContainsKey(propertyName))
            return cache[propertyName];
        return cache[propertyName] = new PropertyChangedEventArgs(propertyName);
    }
}
But will this work well? For example, if we had a whole load of different propertyNames, we would end up with a huge cache sitting there, never being garbage collected. I imagine that if the cached values are larger and the application is a long-running one, this might end up being a problem... or what do you think? How should a good cache be implemented? Is this one good enough for most purposes? Any examples of nice cache implementations that are not too hard to understand or way too complex to implement?
This is a large problem, you need to determine the domain of the problem and apply the correct techniques. For instance, how would you describe the expiration of the objects? Do they become stale over a fixed interval of time? Do they become stale from an external event? How frequently does this happen? Additionally, how many objects do you have? Finally, how much does it cost to generate the object?
The simplest strategy would be to do straight memoization, as you have above. This assumes that objects never expire, and that there are not so many as to run your memory dry and that you think the cost to create these objects warrants the use of a cache to begin with.
The next layer might be to limit the number of objects and use an implicit expiration policy, such as LRU (least recently used). To do this you'd typically use a doubly linked list in addition to your dictionary, and every time an object is accessed it is moved to the front of the list. Then, if you need to add a new object but you are over your limit of total objects, you'd remove from the back of the list.
Next, you might need to enforce explicit expiration, either based on time, or some external stimulus. This would require you to have some sort of expiration event that could be called.
As you can see, there is a lot of design in caching, so you need to understand your domain and engineer appropriately. I felt you did not provide enough detail for me to discuss specifics.
P.S. Please consider using Generics when defining your class so that many types of objects can be stored, thus allowing your caching code to be reused.
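The LRU layer described above (dictionary plus doubly linked list) could be sketched like this; a minimal, non-thread-safe illustration with illustrative names:

```csharp
using System.Collections.Generic;

// A minimal LRU cache: a dictionary for O(1) lookup plus a linked list
// ordering entries from most- to least-recently used.
class LruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> order =
        new LinkedList<KeyValuePair<TKey, TValue>>();

    public LruCache(int capacity) => this.capacity = capacity;

    public bool TryGet(TKey key, out TValue value)
    {
        if (map.TryGetValue(key, out var node))
        {
            order.Remove(node);      // move to front: most recently used
            order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default;
        return false;
    }

    public void Add(TKey key, TValue value)
    {
        if (map.TryGetValue(key, out var existing))
        {
            order.Remove(existing);
            map.Remove(key);
        }
        else if (map.Count >= capacity)
        {
            var last = order.Last;   // evict the least recently used entry
            order.RemoveLast();
            map.Remove(last.Value.Key);
        }
        map[key] = order.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
    }
}
```

Accessing an entry in TryGet is what refreshes its position, so eviction always falls on the entry that has gone unused the longest.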
You could wrap each of your cached items in a WeakReference. This would allow the GC to reclaim items if-and-when required, however it doesn't give you any granular control of when items will disappear from the cache, or allow you to implement explicit expiration policies etc.
(Ha! I just noticed that the example given on the MSDN page is a simple caching class.)
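A minimal sketch of the WeakReference wrapping idea (class and method names are illustrative):

```csharp
using System;
using System.Collections.Generic;

// Entries are held through WeakReference<T>, so the GC may reclaim them
// at any time; callers must be prepared to rebuild missing values.
class WeakCache<TKey, TValue> where TValue : class
{
    private readonly Dictionary<TKey, WeakReference<TValue>> store =
        new Dictionary<TKey, WeakReference<TValue>>();

    public TValue GetOrCreate(TKey key, Func<TKey, TValue> factory)
    {
        if (store.TryGetValue(key, out var weak) && weak.TryGetTarget(out var value))
            return value;             // still alive, reuse it

        var created = factory(key);   // rebuild after (possible) collection
        store[key] = new WeakReference<TValue>(created);
        return created;
    }
}
```

Note this exhibits exactly the caveat above: you cannot predict when entries vanish, only that the factory runs again when they do.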
Looks like .NET 4.0 now supports System.Runtime.Caching for caching many types of things. You should look into that first, instead of re-inventing the wheel. More details:
http://msdn.microsoft.com/en-us/library/system.runtime.caching%28VS.100%29.aspx
This is a nice debate to have, but depending on your application, here are some tips:
You should define the maximum size of the cache and what to do with old items when the cache is full, have a scavenging strategy, determine a time to live for objects in the cache, and decide whether the cache can/must be persisted somewhere other than memory in case of abnormal application termination, and so on.
This is a common problem that has many solutions depending on your application's needs.
It is so common that Microsoft released a whole library to address it.
You should check out Microsoft Velocity before rolling your own cache.
http://msdn.microsoft.com/en-us/data/cc655792.aspx
Hope this helps.
You could use a WeakReference, but if your object is not that large, don't: the WeakReference itself consumes memory, potentially more than the object it wraps, which is not a good trade. Also, if the object is short-lived and will never make it from generation 0 to generation 1 in the GC, there is not much need for a WeakReference; implementing IDisposable on the object (with GC.SuppressFinalize on release) would serve you better.
If you want to control the lifetime, you need a timer that compares the current date/time against the desired expiration time of each object in your cache.
The important thing is: if the object is large, opt for a WeakReference; otherwise use a strong reference. You can also set a capacity on the Dictionary and queue requests that don't fit, serializing the overflow objects to a temp bin and loading one back when there is room in the Dictionary, then clearing it from the temp directory.

C# Collection whose items expire

I am writing a Console Application in C# in which I want to cache certain items for a predefined time (let's say 1 hour). I want items that have been added into this cache to be automatically removed after they expire. Is there a built-in data structure that I can use? Remember this is a Console App not a web app.
Do you actually need them removed from the cache at that time? Or just that future requests to the cache for that item should return null after a given time?
To do the former, you would need some sort of background thread that was periodically purging the cache. This would only be needed if you were worried about memory consumption or something. If you just want the data to expire, that would be easy to do.
It is trivial to create such a class:
class CachedObject<TValue>
{
    public DateTime Date { get; set; }
    public TimeSpan Duration { get; set; }
    public TValue Cached { get; set; }
}

class Cache<TKey, TValue> : Dictionary<TKey, CachedObject<TValue>>
{
    public new TValue this[TKey key]
    {
        get
        {
            if (ContainsKey(key))
            {
                var val = base[key];
                // compare dates
                // if expired, remove from cache, return null
                // else return the cached item
            }
        }
        set { /* create new CachedObject, set date and timespan, set value, add to dictionary */ }
    }
}
It's already in the BCL; it's just not where you'd expect to find it: you can use System.Web.Caching from other kinds of applications too, not only ASP.NET.
This search on google links to several resources about this.
I don't know of any objects in the BCL which do this, but I have written similar things before.
You can do this fairly easily by just including a System.Threading.Timer inside of your caching class (no web/winforms dependencies), and storing an expiration (or last used) time on your objects. Just have the timer check every few minutes, and remove the objects you want to expire.
However, be watchful of events on your objects. I had a system like this and was not being careful to unsubscribe from events on the objects in the cache, which caused a subtle but nasty memory leak over time. This can be very tricky to debug.
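The timer-based approach described above might look like this: a minimal sketch (names are illustrative, and it is not tuned for heavy concurrency):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// A cache that purges expired entries on a background timer; lookups also
// treat expired entries as absent so stale data is never returned.
class ExpiringCache<TKey, TValue> : IDisposable
{
    private readonly ConcurrentDictionary<TKey, (TValue Value, DateTime Expires)> store =
        new ConcurrentDictionary<TKey, (TValue, DateTime)>();
    private readonly TimeSpan ttl;
    private readonly Timer timer;

    public ExpiringCache(TimeSpan ttl, TimeSpan purgeInterval)
    {
        this.ttl = ttl;
        timer = new Timer(_ => Purge(), null, purgeInterval, purgeInterval);
    }

    public void Add(TKey key, TValue value) =>
        store[key] = (value, DateTime.UtcNow + ttl);

    public bool TryGet(TKey key, out TValue value)
    {
        if (store.TryGetValue(key, out var entry) && entry.Expires > DateTime.UtcNow)
        {
            value = entry.Value;
            return true;
        }
        value = default;
        return false; // expired entries read as missing even before the purge runs
    }

    private void Purge()
    {
        foreach (var pair in store)
            if (pair.Value.Expires <= DateTime.UtcNow)
                store.TryRemove(pair.Key, out _);
    }

    public void Dispose() => timer.Dispose();
}
```

Because TryGet checks the expiration itself, the timer only controls when memory is reclaimed, not when data stops being served; that sidesteps the "removed at exactly that time" question above.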
Include an ExpirationDate property in the object that you will be caching (probably a wrapper around your real object) and set it to expire in an hour in its constructor. Instead of removing items from the collection, access the collection through a method that filters out the expired items. Or create a custom collection that does this automatically. If you need to actually remove items from the cache, your custom collection could instead purge expired items on every call to one of its members.
