I'm a .NET Web API developer and I want to know if I'm working correctly.
I'm saving changeable (mutable) objects into the cache.
Other developers on my team said only static data should be stored in the cache.
So I wanted to know whether only static data should be stored in the cache, or whether there's another right way to do it.
Thanks.
I use caching for changeable objects because they take a noticeable amount of time to generate, although how often they change varies.
There are a couple of things which I do to try and make sure the data is always valid.
On the cached item I put a policy which will keep the item in cache for say 15 minutes, and make the expiration time sliding. This keeps the used items in cache but drops less used items.
I also have cache eviction end points on the API, and the process which updates the data in the database calls the endpoint once the process has been complete. The items which have been updated are then removed from the cache and hence rebuilt the next time they are requested.
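A rough sketch of that setup, assuming an ASP.NET Core Web API with IMemoryCache (the route, cache keys, Product, and IProductRepository below are illustrative placeholders, not the actual code):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

// Product and IProductRepository are hypothetical placeholders for your own types.
public record Product(int Id, string Name);

public interface IProductRepository
{
    Task<Product?> LoadAsync(int id);
}

[ApiController]
[Route("api/products")]
public class ProductsController : ControllerBase
{
    private readonly IMemoryCache _cache;
    private readonly IProductRepository _repository;

    public ProductsController(IMemoryCache cache, IProductRepository repository)
    {
        _cache = cache;
        _repository = repository;
    }

    [HttpGet("{id}")]
    public async Task<ActionResult<Product>> Get(int id)
    {
        // Sliding expiration keeps frequently used items in cache and drops less used ones.
        var product = await _cache.GetOrCreateAsync($"product:{id}", entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(15);
            return _repository.LoadAsync(id);   // the expensive-to-generate object
        });

        if (product is null)
            return NotFound();
        return product;
    }

    // Cache eviction endpoint, called by the process that updates the database.
    [HttpDelete("cache/{id}")]
    public IActionResult EvictFromCache(int id)
    {
        _cache.Remove($"product:{id}");
        return NoContent();
    }
}

The delete endpoint is the "cache eviction endpoint" idea: whatever process updates the database calls it when it finishes, so the stale entry is dropped and rebuilt on the next request.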
In the end I think it all boils down to how long it takes to get the object you are trying to return, and whether the delay to generate it is acceptable.
This is easier to explain if I first explain my order of events:
User requests a webpage that requires x database objects
The server checks the cache to see if the data is cached, and if it is not, it queries the database again
The page is sent to the user
My issue is that when the user requests a webpage and the cache has expired, it takes a very long time for the cache to update. The reason is that the data being cached includes data fetched from other locations, so web requests have to be made to update the cache. Because it is making web requests, it can take quite a while for the cache to update, causing the user's webpage to sit there and load for upwards of ten seconds or more.
My question is, how would I go about reducing or completely removing these edge cases where, when the cache is updating, the user's webpage takes forever to load?
The first solution I came up with was to see if I could persist the MemoryCache past its expiration time, or at the very least check its expiration time, so that I could fetch the old object and return that to the user, and then initiate a cache update on another thread for the next user. However, I found that MemoryCache removes items entirely upon expiration (which makes sense), and that there is no way to avoid this. I looked into using CacheItemPriority.NeverRemove, but there is no way to check the expiration time (for some weird reason).
So the second solution I came up with was to create my own cache, but I don't know how I would go about doing that. The object I am storing is a list of objects, so I would prefer to avoid a wrapper object around them (but, if that's what I have to do, I'll be willing to do that). I would like this cache to be abstract, of course, so it can handle any type of item, and using a wrapper object for lists would not allow me to do that.
So what I'm looking for in a custom cache is:
Ability to check expiration date
Items are not removed upon expiration so that I can manually update them
Yet through the past couple of hours searching online, I have found nothing that describes a cache that's even remotely close to being able to do something like this (at least, one that's provided with .NET Core or available in the NuGet packages). I have also not found a guide or any examples that would help me understand how to create a custom cache like this.
How would I go about making this custom cache? Or is a cache even what I'm looking for here?
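For what it's worth, a hand-rolled wrapper along these lines could cover both requirements (checking expiration yourself, and keeping items after they expire so stale data can be served while a background refresh runs). This is only a sketch; StaleCache and everything inside it are invented names, not an existing .NET Core or NuGet API:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Entries are never dropped automatically, so an expired value can still be
// returned immediately while a single background refresh rebuilds it.
public class StaleCache<TValue> where TValue : class
{
    private class Entry
    {
        public TValue Value;
        public DateTimeOffset ExpiresAt;
        public int Refreshing;   // 0 = idle, 1 = refresh in progress
    }

    private readonly ConcurrentDictionary<string, Entry> _entries =
        new ConcurrentDictionary<string, Entry>();

    public async Task<TValue> GetAsync(string key, TimeSpan ttl, Func<Task<TValue>> factory)
    {
        if (!_entries.TryGetValue(key, out var entry))
        {
            // Nothing cached yet: the very first caller has to wait for the real load.
            // (A race here just means two initial loads, which is acceptable for a sketch.)
            var value = await factory();
            _entries[key] = new Entry { Value = value, ExpiresAt = DateTimeOffset.UtcNow.Add(ttl) };
            return value;
        }

        if (entry.ExpiresAt <= DateTimeOffset.UtcNow &&
            Interlocked.CompareExchange(ref entry.Refreshing, 1, 0) == 0)
        {
            // Expired: start exactly one background refresh, but keep serving the old value.
            _ = Task.Run(async () =>
            {
                try
                {
                    entry.Value = await factory();
                    entry.ExpiresAt = DateTimeOffset.UtcNow.Add(ttl);
                }
                finally
                {
                    entry.Refreshing = 0;
                }
            });
        }

        return entry.Value;
    }
}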
I have a web-service which is called by some web-service clients. This web-service returns the current inventory list of an inventory. This list can be big, 10K+ product IDs, and it takes quite some time (~4 minutes) to refresh the list by reading data from the database. I don't want to refresh the list every time this web-service is called, as it may consume too much resource on my database server, and the performance is never very good.
What I intend to do is give the inventory list a time-to-live value, which means that when a client asks for the inventory list, if the data is not out of date I just return the data right away; if the data is obsolete I read it from the database, update the list data and its time-to-live value, and then return the refreshed data to the web-service client. As several clients may call this web-service, it looks like I need multi-thread synchronization (multiple-read single-write, the ReaderWriterLockSlim class?) to protect this inventory list, but I haven't found a good design that gives this web-service good performance: only one client should refresh the data, the other clients shouldn't have to redo the work if the data is still within the time-to-live window, and the web-service should return the result as soon as possible after the write thread completes the update.
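A rough sketch of that time-to-live idea (InventoryCache, the field names, and LoadInventoryFromDatabase are placeholders invented to illustrate the description above, not tested production code):

using System;
using System.Collections.Generic;
using System.Threading;

public class InventoryCache
{
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
    private static List<string> _inventory;                 // cached product IDs
    private static DateTime _expiresAtUtc = DateTime.MinValue;
    private static readonly TimeSpan _ttl = TimeSpan.FromMinutes(15);

    public static List<string> GetInventory()
    {
        _lock.EnterReadLock();
        try
        {
            if (_inventory != null && DateTime.UtcNow < _expiresAtUtc)
                return _inventory;                           // still fresh: return right away
        }
        finally
        {
            _lock.ExitReadLock();
        }

        _lock.EnterWriteLock();
        try
        {
            // Re-check: another client may have refreshed while we waited for the write lock.
            if (_inventory == null || DateTime.UtcNow >= _expiresAtUtc)
            {
                _inventory = LoadInventoryFromDatabase();    // the ~4 minute refresh
                _expiresAtUtc = DateTime.UtcNow.Add(_ttl);
            }
            return _inventory;
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    private static List<string> LoadInventoryFromDatabase()
    {
        // Placeholder for the real database read.
        return new List<string>();
    }
}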
I have also thought about another solution (also using the ReaderWriterLockSlim class): creating a separate thread to refresh the inventory list periodically (a write thread refreshes the data every 15 minutes), and letting all the web-service clients use read threads to read the data. This may work, but I don't really like it, as this solution still wastes resources on the web server. For example, if there are no client requests, the system still has to refresh the inventory list data every 15 minutes.
Please suggest some solution. Thanks.
I would suggest using a MemoryCache.
https://stackoverflow.com/a/22935978/34092 can be used to detect when the item expires. https://msdn.microsoft.com/en-us/library/7kxdx246.aspx is also worth a read.
At this point, the first line of the code you write (in CacheRemovedCallback) should write the value back to the MemoryCache - this will allow readers to keep reading it.
Then it should get the latest data, and then write that new data to the MemoryCache (again passing in a CacheItemPolicy to allow the callback to be called when the latest version is removed). And so on and so on...
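Sketched out with System.Runtime.Caching, that flow could look roughly like this (InventoryCache, LoadInventoryFromDatabase, and the 15-minute policy are placeholders and example values, not taken from the question):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class InventoryCache
{
    private const string Key = "inventory";

    public static void Store(List<string> inventory)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(15),
            // Called when the entry is removed (expired or evicted).
            RemovedCallback = args =>
            {
                // 1. Put the old value straight back so readers never see a gap.
                var stale = (List<string>)args.CacheItem.Value;
                MemoryCache.Default.Set(Key, stale, new CacheItemPolicy());

                // 2. Rebuild the data (readers keep getting the stale value meanwhile),
                //    then store it with the same policy so the callback fires again
                //    on the next expiration, and so on.
                var fresh = LoadInventoryFromDatabase();     // placeholder for the slow refresh
                Store(fresh);
            }
        };

        MemoryCache.Default.Set(Key, inventory, policy);
    }

    public static List<string> Get() => (List<string>)MemoryCache.Default.Get(Key);

    private static List<string> LoadInventoryFromDatabase() => new List<string>();
}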
Do you only ever run one instance of your service? Then in-memory caching is enough for you. Or use a ConcurrentDictionary if you don't want to implement the locking yourself.
If you run multiple instances of the service, it might be advisable to use an out-of-process cache like Redis.
Also, you could possibly maintain the cached list so that it is always in sync with what you have in the database!?
There are many different cache vendors; .NET, ASP.NET and ASP.NET Core have different solutions. For distributed caching there are also many different options. Just pick whatever fits best for the framework you use.
You can also use libraries like CacheManager to help you implement what you need more easily.
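If you go the ConcurrentDictionary route, one common way to also get the "only one caller refreshes" behaviour is to wrap the value in a Lazy<T>. Below is a sketch only; TtlCache is an invented name and not part of any of the libraries mentioned above:

using System;
using System.Collections.Concurrent;

public class TtlCache<TValue>
{
    private sealed class Slot
    {
        public Lazy<TValue> Value;
        public DateTime ExpiresAtUtc;
    }

    private readonly ConcurrentDictionary<string, Slot> _slots =
        new ConcurrentDictionary<string, Slot>();
    private readonly TimeSpan _ttl;

    public TtlCache(TimeSpan ttl) => _ttl = ttl;

    public TValue GetOrAdd(string key, Func<TValue> factory)
    {
        var slot = _slots.GetOrAdd(key, _ => NewSlot(factory));

        // Expired: swap in a new Lazy so exactly one caller runs the factory again.
        if (DateTime.UtcNow >= slot.ExpiresAtUtc)
        {
            var fresh = NewSlot(factory);
            if (_slots.TryUpdate(key, fresh, slot))
                slot = fresh;
            else
                slot = _slots[key];    // someone else replaced it first
        }

        return slot.Value.Value;       // Lazy guarantees the factory runs only once per slot
    }

    private Slot NewSlot(Func<TValue> factory) => new Slot
    {
        Value = new Lazy<TValue>(factory),
        ExpiresAtUtc = DateTime.UtcNow.Add(_ttl)
    };
}

Usage would then be something like new TtlCache<List<string>>(TimeSpan.FromMinutes(15)) and cache.GetOrAdd("inventory", LoadInventoryFromDatabase), so nothing gets refreshed while nobody is asking for the data.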
I've been tasked with implementing server-side caching in a web application to improve front-end performance. The objective is to cache a large dataset resulting from an MDX query on an SSAS cube. The cache is to expire at midnight, which is when the cube is processed each day. I've decided to use the .NET Cache API, and so I have this:
Cache.Insert("MDXResult", myMDXDataSetThatExpiresAtMidnight, null,
DateTime.Now.AddMinutes(getMinutesUntilMidnight()), TimeSpan.Zero);
What troubles me is something I read on the MSDN page on ASP.NET Caching: Techniques and Best Practices:
The simplest way to store data in the Cache is simply to assign it, using a key, just like a HashTable or Dictionary object:
Cache["key"] = "value";
This will store the item in the cache without any dependencies, so it will not expire unless the cache engine removes it in order to make room for additional cached data.
The last bit -- the fact that the cache engine removes cached data in order to make room for additional cached data -- does it apply only to the case above, where the item is stored in the cache without any dependencies? How can I be sure that the cached myMDXDataSetThatExpiresAtMidnight will not be cleared by the cache engine before its expiry time?
Alternatively, is there any way to control the amount of space allocated for server-side cache, such as a setting in web.config or similar, to ensure that cache isn't inadvertently cleared?
All entries, including those with dependencies, can be removed at any time. The dependencies just help the cache engine calculate when items have expired.
You cannot force your item to stay in the cache in any way. The cache may remove it for known, unknown and other causes, usually due to expiration (time or dependency based) or memory pressure. You can, however, use the Cache.Add overload which accepts a CacheItemRemovedCallback onRemoveCallback, which may be a function that calculates a new item (or knows of the old one) and adds it again. I'm not sure about the actual timings, but I guess there is a brief time where the item is not in the cache, while your callback is executing and has not yet added the new item.
You can configure the caching using the CacheSection in web.config.
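For example, a sketch that combines the NotRemovable priority with the onRemoveCallback overload (MdxResultCache and RefreshMdxDataSet are placeholder names, not code from the question):

using System;
using System.Web;
using System.Web.Caching;

public static class MdxResultCache
{
    private const string Key = "MDXResult";

    public static void Insert(object dataSet, DateTime absoluteExpiration)
    {
        HttpRuntime.Cache.Add(
            Key,
            dataSet,
            null,                               // no dependencies
            absoluteExpiration,                 // e.g. next midnight
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,     // ask the engine not to evict under memory pressure
            OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Runs whether the item expired or was evicted; rebuild and re-insert.
        // Note: there is a brief window where the item is missing from the cache.
        var fresh = RefreshMdxDataSet();        // placeholder for re-running the MDX query
        Insert(fresh, DateTime.Today.AddDays(1));
    }

    private static object RefreshMdxDataSet() => new object();
}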
I'm trying to use the cache in an ASP.NET MVC web application to store some list data that rarely updates. I insert this data into the cache in an UpdateCache() method like this:
HttpContext.Current.Cache.Insert("MyApp-Products", products, null, DateTime.Now.AddYears(99), Cache.NoSlidingExpiration);
Then in my model I retrieve it:
public static List<string> GetProducts()
{
    var cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    if (cachedProducts == null)
    {
        UpdateCache();
        cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    }
    return (List<string>)cachedProducts;
}
The first time I visit the page, UpdateCache() is called as expected. If I refresh, the data comes from the cache and I do not need to call UpdateCache(). However, after maybe 15 mins, I come back to the app and the cached values are gone. My understanding was that this cache was per application, not session, so I would have expected it to still be there for myself or another user.
Is there something wrong with the way I'm storing the cache? Or is there something I'm not understanding about how the Cache in ASP.NET works in a web app?
My understanding was that this cache was per application, not session, so I would have expected it to still be there for myself or another user.
While the cache is per application, ASP.NET doesn't give you any guarantee that if you stored something in the cache you will find it there later. The cache could be evicted under different circumstances, such as when your server starts running low on memory. You can subscribe to an event and get notified when an item is evicted from the cache. You can also define a priority when caching an item; the higher the priority, the lower the chance of the item being evicted.
Also, since the cache is stored in the memory of the web server (by default), you should not forget that your application domain could be recycled at any point by IIS. For example, after a certain amount of inactivity, or if it starts running low on memory, or even if a certain CPU usage threshold is reached, ... and everything that is stored in memory, including the cache, will simply disappear into the void, and the next request to your application will start a new AppDomain.
But in any case, make sure that you check whether the item is present in the cache before using it. Never rely on the fact that if you stored something into it, you will find it there.
All this blabla brings me to the really essential point, which is something very worrying in your code. It looks like you are storing a List<string> instance in the cache. But since the cache is per application, this single instance of List<string> could be shared between multiple users of the application, and of course this could happen concurrently. And as you know, List<T> is not a thread-safe structure. So with this code you will, at best, get an exception and, at worst, corrupt data. So be very careful about what you are caching and how you synchronize access to the structure, especially if you are caching a non-thread-safe class.
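One way to reduce that risk is to cache an immutable snapshot instead of the live List<string>. A sketch, reusing the poster's cache key and a placeholder LoadProductsFromDatabase:

using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Linq;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    private const string Key = "MyApp-Products";
    private static readonly object _syncRoot = new object();

    public static IReadOnlyList<string> GetProducts()
    {
        if (HttpRuntime.Cache[Key] is ReadOnlyCollection<string> cached)
            return cached;

        lock (_syncRoot)   // only one request rebuilds the cache at a time
        {
            if (HttpRuntime.Cache[Key] is ReadOnlyCollection<string> rebuilt)
                return rebuilt;

            // Snapshot the data into a read-only wrapper so no caller can mutate
            // the shared instance from multiple threads.
            var snapshot = LoadProductsFromDatabase().ToList().AsReadOnly();
            HttpRuntime.Cache.Insert(
                Key, snapshot, null,
                DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
            return snapshot;
        }
    }

    private static IEnumerable<string> LoadProductsFromDatabase() => new List<string>();  // placeholder
}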
If this is on full IIS, and it happens around every 15 minutes, remember to check the idle timeout value on the application pool.
That being said, if this list "never" changes, why not store it in a static array instead?
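Something like this would do, if the data really never changes during the application's lifetime (a sketch; LoadProducts is a placeholder for however the list actually gets built):

using System;
using System.Collections.Generic;

public static class ProductList
{
    // Built once per application domain, on first access, in a thread-safe way.
    private static readonly Lazy<string[]> _products =
        new Lazy<string[]>(LoadProducts);

    public static IReadOnlyList<string> Products => _products.Value;

    private static string[] LoadProducts()
    {
        // Placeholder for the real one-time load.
        return new[] { "Product A", "Product B" };
    }
}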
I can see how storing data in a database is generally better than in a static object.
For one, if the data does change, it is easier to update the DB than the application.
Try explicitly setting absolute expiration when caching your object:
HttpRuntime.Cache.Insert("MyApp-Products", cachedProducts, null, DateTime.Now.AddDays(20), TimeSpan.Zero);
Note HttpRuntime is used instead of HttpContext for performance reasons even though the difference is minor.
Hello guys, I'm really confused about which solution to use for my project.
Well, I have a big List retrieved from my database (more than 1,000 results from a query with large clauses, searching more than 3 tables with more than 3,000,000 items) and I don't want to run this query twice without changes, because more than 300 users can make this big query at the same time. So I decided to use the session to hold each user's query results, but I really don't know if it's good practice.
My teammate told me it's better to run the big query on every user post, because it's not good practice to put big Lists inside Sessions: a lot of users using Sessions with large Lists will take more out of our server than running this query many times.
So, is it good practice to put big Lists in ASP.NET MVC sessions?
[EDIT]
Every user can have different results; they're not the same for all users.
[EDIT 2]
I need to show all the results of the query at the same time, so I can't paginate it.
firstly- Bryan Crosby's remark is a very good one, plus- is the user going to need to view 1000 items at a time?
have you considered paging your data?
if, however, you decide that you must have that huge result set, then how about this-
if I understand you correctly, this query is identical for all 300 users.
if that's the case, the proper place for this result set is not the Session but the application's Cache.
this is because Session is per-user, and Cache is per-application (meaning- shared between all users).
so if you store your items in cache, once the first user has retrieved those items from storage, they'll be available to all subsequent users.
a few things to keep in mind, though:
1. since cache is common to all users, you must synchronize your access to it.
2. you need to set an expiry period (cache item has this option natively), so that those 1000s of items won't live in the memory of your application forever.
3. if those items can change, you need to invalidate the cache when they do, so the user doesn't view stale data.
good luck
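Putting those three points together, a minimal sketch with MemoryCache.Default might look like this (the cache key, MyItem, LoadItems, and the 30-minute expiry are invented placeholders):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class MyItem { }   // placeholder for whatever the query returns

public static class SharedResultCache
{
    private const string Key = "big-query-results";
    private static readonly object _syncRoot = new object();

    public static IList<MyItem> GetItems()
    {
        if (MemoryCache.Default.Get(Key) is IList<MyItem> cached)
            return cached;                         // point 1: readers just read

        lock (_syncRoot)                           // point 1: only one thread rebuilds
        {
            if (MemoryCache.Default.Get(Key) is IList<MyItem> rebuilt)
                return rebuilt;

            var items = LoadItems();               // the one expensive shared query
            MemoryCache.Default.Set(Key, items, new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30)   // point 2: expiry
            });
            return items;
        }
    }

    // Point 3: call this when the underlying data changes, so users don't see stale data.
    public static void Invalidate() => MemoryCache.Default.Remove(Key);

    private static IList<MyItem> LoadItems() => new List<MyItem>();      // placeholder
}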
Generally, no, it is not good practice to store large sets of data in the session...
The problem with storing "large" sets of information in the session state is that every time you do a form post-back, that session state is re-transmitted to the client, slowing down their user experience. (It's stored in a hidden field on the form, and can balloon in size due to poor compression of data into web-safe encrypted text - generally, you should avoid putting large amounts of data in the session state if you can.)
In cases where the user has to view "large" sets of information, it's possible to create session-like stores or caches to keep the info in server memory, and then just expose a session key in the session state; tie the server-cached item for that session to the session key and you're set to re-transmit it if needed.
Something like (pseudocode)
// Application-level store, keyed by a per-user session GUID
var serverCache = new Dictionary<Guid, DataSet>();
HttpContext.Current.Application["DataCache"] = serverCache;
// Add the user's session key and locally cached data here
serverCache.Add(GetUserSessionGuid(), LoadData());
Also +1 to the post about paging this data - now that you have it in the server cache - you can handle paging easily.
All that said, keeping this data in a cache for some fixed time, might eat up your server memory pretty quick (usually "cheaply" solved with more memory... but still)
A good database and front-end pairing should be optimized to handle the traffic load for the data. As suggested, do some metrics to find out if it's even an issue. I would suggest designing the database queries to allow for paging of data, so each view on the form by each user is further limited...
10 queries, one page at a time, for 1000 users, returning 100 rows at a time (100 thousand rows at a time, with 1 query per user per second) is much more tolerable for a performant DB than 1 query, all at once, returning all 1,000 rows, for 1000 users (1 million rows at a time).
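As a sketch of what "design the queries for paging" could mean in code (a generic helper with invented names; an EF IQueryable such as a hypothetical db.Products would be passed in as source):

using System;
using System.Linq;
using System.Linq.Expressions;

public static class Paging
{
    // Apply a stable order, skip the earlier pages, take one page.
    public static IQueryable<T> GetPage<T, TKey>(
        IQueryable<T> source,
        Expression<Func<T, TKey>> orderBy,
        int pageIndex,
        int pageSize)
    {
        return source
            .OrderBy(orderBy)              // a stable order is required before Skip/Take
            .Skip(pageIndex * pageSize)
            .Take(pageSize);
    }
}

With Entity Framework, Skip/Take on an IQueryable is typically translated into a paged SQL query, so only one page of rows comes back per request.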
I wouldn't put anything that big into session if there's any way you can avoid it. If I did have to store it in session, I wouldn't store the List object. Convert the List contents to an array, and store the array if you must store it in session.
Session["mylist"] = list.ToArray();
Reality check: you have toy data.
1000 results are nothing, and tables with 3 million items are nothing. They weren't even significant 10 years ago - today my mobile phone handles that without breaking a sweat.
Simple as that.
THAT SAID: it also goes the other way. 1000 items are a joke memory-wise (unless they are images), so they MAY be stored in session. Unless you run a ton of users, it may be worth just storing it in memory - there is a trade-off, but for most intranet-type applications, for example, this is doable.
My main problem with that is that session state is shared across multiple browser windows (tabs), and the number of times I have been bitten by a programmer storing something in the session that broke the site for me when I was using 2-3 tabs at the same time is higher than zero. I would be careful with that - like someone using two tabs for different searches to compare the lists.