Caching DAL return objects? - c#

I was wondering if I should be caching the objects returned from my DAL in some way? I may have multiple UI controls calling for the same data in a single load of the page.
What would you guys recommend? Am I being a little too cautious? It's not a terrible amount of data. But if I should be caching in some way, what would be the recommended approach?

You could cache, and if you really do have multiple controls on the same page using the same data, you can fetch the data once in the parent page and pass a reference to it to each control via a setter (rather than have each control pull the same data from the DAL itself), e.g.:
myControl.AllUsers = _allUsers;
....
myOtherControl.AllUsers = _allUsers;
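For illustration, a fuller sketch of that load-once pattern in the parent page (UserRepository, the two controls and the AllUsers property are hypothetical names):
using System;
using System.Collections.Generic;

public partial class UsersPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // One DAL call for the whole page...
        List<User> allUsers = new UserRepository().GetAllUsers();

        // ...shared by reference with every control that needs it.
        userGridControl.AllUsers = allUsers;
        userSummaryControl.AllUsers = allUsers;
    }
}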
I also agree with @DanielHilgarth. Caching adds complexity (when to refresh the cache, for example). If the page loads quickly anyway, I wouldn't bother.
If the page is slow, database calls in loops are often the culprit in my experience.

It depends on whether it's safe and necessary to do so. If the data you are working with does not require realtime accuracy (e.g. your blog), then by all means cache if you feel it's necessary to do so (meaning your site is running slow).
A problem with caching that people often forget to account for is being able to clear the cache on demand when something requires an immediate response (for example, you ban a user, or update payment gateway information).
There are two main types of caching: sliding cache and fixed-time cache.
Sliding cache (cache whose lifetime gets extended each time a valid retrieval is performed) is great for resources that have relatively easy-to-compute values but may suffer from database/network overhead. Cache for one hour (or whatever) on a sliding window, and then manually invalidate (remove) the cache whenever an INSERT/UPDATE/DELETE occurs for the DAO. This way the user will see realtime results, yet they will in fact be served from cache whenever possible.
Fixed-time cache is great for resources that are expensive to produce (e.g. a very complex stored procedure) and do not require realtime accuracy. Cache for one hour (or whatever) the first time the resource is requested, and do not clear the cache until that hour is up. INSERT/UPDATE/DELETE is ignored by your cache mechanism (unless it's absolutely necessary).
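As a concrete sketch of both policies, assuming System.Runtime.Caching.MemoryCache (the keys, the one-hour windows and the loader delegates are all illustrative):
using System;
using System.Runtime.Caching;

public static class CacheExamples
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Sliding cache: the one-hour window restarts on every read.
    public static object GetUsersSliding(Func<object> getAllUsers)
    {
        var item = Cache.Get("AllUsers");
        if (item == null)
        {
            item = getAllUsers();
            Cache.Set("AllUsers", item,
                new CacheItemPolicy { SlidingExpiration = TimeSpan.FromHours(1) });
        }
        return item;
    }

    // Call this from your INSERT/UPDATE/DELETE paths so readers see fresh data.
    public static void InvalidateUsers()
    {
        Cache.Remove("AllUsers");
    }

    // Fixed-time cache: lives exactly one hour from the first request,
    // regardless of any writes in the meantime.
    public static object GetReportFixed(Func<object> runComplexReport)
    {
        var item = Cache.Get("ComplexReport");
        if (item == null)
        {
            item = runComplexReport();
            Cache.Set("ComplexReport", item,
                new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddHours(1) });
        }
        return item;
    }
}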

To do so you can look at this library:
http://www.reactiveui.net/
It provides a neat and clean way to cache your objects.

Related

How to cache an object without user interruption?

This is easier to explain if I first lay out my order of events:
User requests a webpage that requires x database objects
The server checks the cache to see if the objects are cached; if they are not, it queries the database again
The page is sent to the user
My issue is, when the user requests a webpage and the cache has expired, it takes a very long time for the cache to update. The reason is that the cached data includes data fetched from other locations, so web requests are made to update the cache. Because of those web requests, the cache can take quite a while to update, causing the user's webpage to sit there and load for upwards of ten seconds.
My question is, how would I go about reducing or completely removing these edge cases where, when the cache is updating, the user's webpage takes forever to load?
The first solution I came up with was to see if I could persist the MemoryCache past its expiration time, or at the very least check its expiration time, so that I can fetch the old object and return it to the user, and then initiate a cache update on another thread for the next user. However, I found that MemoryCache removes items entirely upon expiration (which makes sense), and that there is no way to avoid this. I looked into using CacheItemPriority.NeverRemove, but there is no way to check an entry's expiration time (for some weird reason).
So the second solution I came up with was to create my own cache, but I don't know how I would go about doing that. The objects I am storing are lists of objects, so I would prefer to avoid a wrapper object around them (but, if that's what I have to do, I'm willing to do it). I would like this cache to be generic, of course, so it can handle any type of item, and using a wrapper object for lists would not allow me to do that.
So what I'm looking for in a custom cache is:
Ability to check expiration date
Items are not removed upon expiration so that I can manually update them
Yet after the past couple of hours of searching online, I have found nothing that describes a cache even remotely close to this (at least, not one provided with .NET Core or available as a NuGet package). I have also not found a guide or any examples that would help me understand how to create a custom cache like this.
How would I go about making this custom cache? Or is a cache even what I'm looking for here?
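One possible shape for such a cache, sketched under assumptions (every name here is invented; the Entry wrapper stays internal to the cache, so callers still store plain lists of objects):
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class StaleCache<TKey, TValue>
{
    private class Entry
    {
        public TValue Value;
        public DateTimeOffset ExpiresAt;
        public int Refreshing; // 0 = idle, 1 = a background refresh is in flight
    }

    private readonly ConcurrentDictionary<TKey, Entry> _entries =
        new ConcurrentDictionary<TKey, Entry>();
    private readonly TimeSpan _ttl;
    private readonly Func<TKey, Task<TValue>> _fetch;

    public StaleCache(TimeSpan ttl, Func<TKey, Task<TValue>> fetch)
    {
        _ttl = ttl;
        _fetch = fetch;
    }

    public async Task<TValue> GetAsync(TKey key)
    {
        if (_entries.TryGetValue(key, out var entry))
        {
            // Expired? Serve the stale value immediately and kick off a
            // single background refresh, so no request blocks on the slow
            // web calls to the other locations.
            if (entry.ExpiresAt <= DateTimeOffset.UtcNow &&
                Interlocked.CompareExchange(ref entry.Refreshing, 1, 0) == 0)
            {
                _ = Task.Run(async () =>
                {
                    try
                    {
                        entry.Value = await _fetch(key);
                        entry.ExpiresAt = DateTimeOffset.UtcNow + _ttl;
                    }
                    finally
                    {
                        entry.Refreshing = 0;
                    }
                });
            }
            return entry.Value;
        }

        // Only the very first request for a key pays the full fetch cost.
        var value = await _fetch(key);
        _entries[key] = new Entry { Value = value, ExpiresAt = DateTimeOffset.UtcNow + _ttl };
        return value;
    }
}
Usage would look something like new StaleCache<string, List<MyDto>>(TimeSpan.FromMinutes(10), FetchFromRemoteSitesAsync), where MyDto and FetchFromRemoteSitesAsync are placeholders for your own types.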

Web API cache architecture

I'm a .NET Web API developer and I want to know if I'm working correctly.
I'm saving changeable objects in the cache.
Other developers on my team said only static data should be stored in the cache.
So I wanted to know whether only static data should be stored in the cache, or whether there's another right way to do it.
Thanks.
I use caching for changeable objects because they take a reasonable amount of time to generate, although how often they change varies.
There are a couple of things which I do to try and make sure the data is always valid.
On the cached item I put a policy which will keep the item in cache for say 15 minutes, and make the expiration time sliding. This keeps the used items in cache but drops less used items.
I also have cache-eviction endpoints on the API; the process which updates the data in the database calls the relevant endpoint once it completes. The updated items are then removed from the cache and rebuilt the next time they are requested.
In the end I think it all boils down to how long it takes to get the object you are trying to return, and whether the delay to generate it is acceptable.
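To make that concrete, here is a rough sketch of the pattern using Web API 2 and System.Runtime.Caching (the routes, the "report:" key scheme and BuildReport are made up; attribute routing assumes config.MapHttpAttributeRoutes() is called at startup):
using System;
using System.Runtime.Caching;
using System.Web.Http;

public class ReportsController : ApiController
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    [HttpGet, Route("api/reports/{id}")]
    public IHttpActionResult Get(int id)
    {
        string key = "report:" + id;
        var report = Cache.Get(key);
        if (report == null)
        {
            report = BuildReport(id); // the slow generation step
            Cache.Set(key, report, new CacheItemPolicy
            {
                // A sliding window keeps hot items cached and drops cold ones.
                SlidingExpiration = TimeSpan.FromMinutes(15)
            });
        }
        return Ok(report);
    }

    // Eviction endpoint: the database-update process calls this when it
    // finishes, so the next request rebuilds the item from fresh data.
    [HttpDelete, Route("api/reports/{id}/cache")]
    public IHttpActionResult Evict(int id)
    {
        Cache.Remove("report:" + id);
        return Ok();
    }

    private object BuildReport(int id)
    {
        return new { Id = id, GeneratedAt = DateTime.UtcNow }; // placeholder
    }
}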

(ASP.NET Cache API) Is it possible that an item may be removed from cache before its set expiry?

I've been tasked with implementing server-side caching on a web application to improve front-end performance. The objective is to cache a large dataset resulting from an MDX query on an SSAS cube. The cache is to expire at midnight, which is when the cube is processed each day. I've decided to use the ASP.NET Cache API, and so I have this:
Cache.Insert("MDXResult", myMDXDataSetThatExpiresAtMidnight, null,
DateTime.Now.AddMinutes(getMinutesUntilMidnight()), TimeSpan.Zero);
What troubles me is something I read on the MSDN page on ASP.NET Caching: Techniques and Best Practices:
The simplest way to store data in the Cache is simply to assign it,
using a key, just like a HashTable or Dictionary object:
Cache["key"] = "value";
This will store the item in the cache without any dependencies, so it
will not expire unless the cache engine removes it in order to make
room for additional cached data.
The last bit -- the fact that the cache engine removes cached data in order to make room for additional cached data -- does it apply to only the case above where the item is stored in cache without any dependencies? How can I be sure that the cache of myMDXDataSetThatExpiresAtMidnight will not be cleared by the cache engine before its expiry time?
Alternatively, is there any way to control the amount of space allocated for server-side cache, such as a setting in web.config or similar, to ensure that cache isn't inadvertently cleared?
All entries, including those with dependencies, can be removed at any time. The dependencies just help the cache engine calculate when items have expired.
You cannot force your item to stay in the cache in any way. The cache may remove it for known, unknown and other causes, usually expiration (time- or dependency-based) or memory pressure. You can, however, use the Cache.Add overload which accepts a CacheItemRemovedCallback onRemoveCallback; that can be a function which calculates a new item (or knows of the old one) and adds it again. I'm not sure about the actual timings, but I'd guess there is a brief window where the item is not in the cache, while your callback is executing and has not yet added the new item.
You can configure caching using the CacheSection in web.config.
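For illustration, a hedged sketch of that re-add idea applied to the question's MDX scenario (RunMdxQuery and MinutesUntilMidnight are placeholders for the real calls, and there is still a brief window where the key is absent while the callback runs):
using System;
using System.Web;
using System.Web.Caching;

public static class MdxCache
{
    public static void InsertMdxResult(object dataSet)
    {
        HttpRuntime.Cache.Add(
            "MDXResult",
            dataSet,
            null,                                        // no dependencies
            DateTime.Now.AddMinutes(MinutesUntilMidnight()),
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,              // resist memory-pressure eviction
            OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Rebuild only on scheduled expiry; if the app domain is shutting
        // down, re-adding the item would be pointless.
        if (reason == CacheItemRemovedReason.Expired)
            InsertMdxResult(RunMdxQuery());
    }

    private static int MinutesUntilMidnight()
    {
        return (int)(DateTime.Today.AddDays(1) - DateTime.Now).TotalMinutes;
    }

    private static object RunMdxQuery()
    {
        return new object(); // placeholder for the real SSAS call
    }
}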

Does storing data in ViewState slow down the page load?

I have stored my DataSet in the ViewState (because I need to filter the data on different client clicks and show the data), but I feel like the page load is taking a lot of time; even a checkbox checked event (with AutoPostBack) which does not have any code to execute takes almost 2-3 seconds.
Is this just because of the ViewState data? If so, are there any alternatives with which I can achieve my tasks? I need the data to be shown quickly on client events, which is why I have been using the ViewState. Any workaround would help.
As @Tushar mentioned above, ViewState is not the place you want to be storing large amounts of data. It's really only meant to preserve the state of controls between round trips, and misusing it can lead to poor app performance.
Instead you should look into the following server managed options:
Application State - Used for storing data that is shared between all users. Uses server memory.
Session State - Used for storing data specific to a user's session. Also uses server memory. Data can be persisted through app restarts, as well as across a web garden or server farm. More info from MSDN here: http://msdn.microsoft.com/en-us/library/z1hkazw7.aspx
The biggest con of these methods is memory management: both options consume server memory and keep data until either a restart of some sort or the session being dropped. Thus, these methods don't always scale well.
Also, here is an MSDN article discussing the various .NET methods of state management, with pros and cons for each method:
A third option is to implement a caching strategy by either using the .NET caching libraries, building your own and/or using 3rd party caching servers/libraries. The benefit to using cache is that you have the data automatically expire after any given specified amount of time. However, complexities are introduced when working in a web-garden or server-farm environment.
The biggest thing to remember, is that any of the strategies mentioned above will require some planning and consideration in regards to managing/sharing the data.
If you're storing a large amount of data in ViewState, you'll notice performance issues. Although ViewState is meant for "this page only" data and Session for "this session" data, there is a ViewState size beyond which Session is ultimately much better for performance.
It's worth noting that you might be having some other type of issue, not just an issue with the ViewState (i.e. your database query may be taking a long time and could possibly be cached).
The ViewState makes the page slightly larger due to the extra data embedded in the page's HTML to hold the serialized ViewState. Whether that extra size will cause load problems depends on the connection speed, and on the size of the view state relative to the rest of the page.
The ViewState is sent back to the server with each HTTP request (so including your AutoPostback). Again, whether that causes a noticeable performance issue depends on the view state size and the connection speed.
On a broadband(ish) connection with the amount of ViewState data one would find in a typical page, you would not see 2-3 seconds additional processing time.
Diagnosing
Use the developer tools in your browser (in IE, press F12). You can monitor web requests including the exact header and body sent and received. You can also see the timing for each HTTP request. If the ViewState is not huge (not more than 1-2K perhaps) and your connection speed is not excessively slow, that is not your culprit.
Alternatives
You can hold state entirely server-side, or put any state items that are large entirely on the server. You can use Ajax requests to process page events that depend on that state.
Instead of loading data from a data source multiple times, only do it once. The other answers talk about where to store the data; I have run into instances where the data was reloaded on every post-back.
public string MyString
{
    get
    {
        // Only load the data if ViewState does not already hold it,
        // so post-backs reuse the stored copy instead of hitting the DAL.
        if (this.ViewState["myData"] == null)
        {
            // Load data one time (e.g. the result of a DAL call)
            this.ViewState["myData"] = "Hello";
        }
        return this.ViewState["myData"] as string;
    }
}
How much ViewState slows down your page depends upon how much ViewState you have. I've inherited pages that generated over a megabyte of ViewState and seen the web server spend 10 seconds just processing it. If you don't want to rewrite your application and you genuinely need that much ViewState, investigate alternate strategies for saving and restoring it. Saving ViewState to a database or even a plain file is much faster: you don't have to stream it to and from the client on each request.
The best strategy is to avoid ViewState in the first place, though.
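If you do go that server-side route, here is a minimal sketch using the built-in SessionPageStatePersister, which keeps the serialized state in session and sends the client only a small token (a custom PageStatePersister subclass could target a database or a plain file instead, as described above):
using System.Web.UI;

public class ServerStatePage : Page
{
    // Pages that inherit from this class keep their ViewState on the server.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}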
Just thought I should add: some controls are simply ViewState pigs; some grids are terrible for ViewState consumption.
You can view the source of your page, grab the ViewState value, and use the online ViewState decoder at the URL below to check how large the values stored in your pages' ViewState field are:
http://ignatu.co.uk/ViewStateDecoder.aspx
If you find your ViewState holds large values, you should find alternatives for storing your DataSet.
Either way, you should avoid putting the DataSet into your ViewState.

Is it good practice to put big Lists<T> in ASP.NET MVC sessions?

Hello guys, I'm really confused about which solution to use for my project.
Well, I have a big List retrieved from my database (more than 1,000 results from a query with large WHERE clauses, searching across more than 3 tables holding more than 3,000,000 rows), and I don't want to run this query twice when nothing has changed, because more than 300 users can run this big query at the same time. So I decided to use session state to hold each user's query results, but I really don't know if it's good practice.
My teammate told me it's better to run the big query on every user post, because it's not good practice to put big Lists inside sessions: with a lot of users keeping large Lists in session, we would waste more server resources than by running the query many times.
So, is it good practice to put big Lists in ASP.NET MVC sessions?
[EDIT]
Every user can have different results; they're not the same for all users.
[EDIT 2]
I need to show all the results of the query at the same time, so I can't paginate them.
Firstly, Bryan Crosby's remark is a very good one. Plus, is the user going to need to view 1,000 items at a time? Have you considered paging your data?
If, however, you decide that you must have that huge result set, then how about this:
If I understand you correctly, this query is identical for all 300 users. If that's the case, the proper place for this result set is not Session but the application's Cache, because Session is per-user while Cache is per-application (meaning shared between all users). So if you store your items in the cache, once the first user has retrieved them from storage, they'll be available to all subsequent users.
A few things to keep in mind, though (see the sketch after this list):
1. Since the cache is common to all users, you must synchronize your access to it.
2. You need to set an expiry period (a cache item has this option natively), so those thousands of items won't live in the memory of your application forever.
3. If those items can change, you need to invalidate the cache when they do, so users don't view stale data.
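A rough sketch of points 1 and 2, using double-checked locking around HttpRuntime.Cache (the cache key, the 30-minute expiry, MyItem and GetBigResultSet are all illustrative):
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class SharedQueryCache
{
    private static readonly object SyncRoot = new object();

    public static List<MyItem> GetResults()
    {
        var results = (List<MyItem>)HttpRuntime.Cache["BigQueryResults"];
        if (results == null)
        {
            lock (SyncRoot) // point 1: only one request runs the expensive query
            {
                results = (List<MyItem>)HttpRuntime.Cache["BigQueryResults"];
                if (results == null)
                {
                    results = GetBigResultSet(); // the slow three-table query
                    HttpRuntime.Cache.Insert(
                        "BigQueryResults", results, null,
                        DateTime.UtcNow.AddMinutes(30), // point 2: absolute expiry
                        Cache.NoSlidingExpiration);
                }
            }
        }
        return results;
    }

    // Point 3: call this from whatever code path modifies the underlying tables.
    public static void Invalidate()
    {
        HttpRuntime.Cache.Remove("BigQueryResults");
    }

    private static List<MyItem> GetBigResultSet()
    {
        return new List<MyItem>(); // placeholder for the real query
    }

    public class MyItem { }
}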
Good luck.
Generally, no, it is not a good practice to store large sets of data in the session.
The problem with storing large sets of information in session state is resource cost: with the default in-process provider the data sits in server memory for the whole session lifetime, and with out-of-process providers (state server or SQL Server) it is serialized and transferred on every request that touches it. (Note this is distinct from ViewState, which really is round-tripped to the client in a hidden form field.) Generally, you should avoid putting large amounts of data in session state if you can.
In cases where the user has to view large sets of information, it's possible to create session-like stores or caches that keep the info in server memory and just expose a small key via the session; tie the server-cached item to that session key and you're set to look it up when needed.
Something like (pseudocode made slightly more concrete):
// Shared server-side store, keyed by each user's session GUID
Dictionary<Guid, DataSet> serverCache = new Dictionary<Guid, DataSet>();
this.Application.Add("DataCache", serverCache);
// Add this user's session key and their cached data here
serverCache.Add(this.GetUserSessionGuid(), this.LoadData());
Also, +1 to the post about paging this data; now that you have it in a server cache, you can handle paging easily.
All that said, keeping this data in a cache for some fixed time might eat up your server memory pretty quickly (usually "cheaply" solved with more memory... but still).
A good database and front-end pairing should be optimized to handle the traffic load for the data. As suggested, do some metrics to find out if it's even an issue. I would suggest designing the database queries to allow for paging, so each view on the form by each user is further limited.
Ten queries, one page at a time, for 1,000 users, returning 100 rows at a time (100 thousand rows at a time, with one query per user per second) is much more tolerable for a performance DB than one query, all at once, returning all 1,000 rows for 1,000 users (1 million rows at a time).
I wouldn't put anything that big into session if there's any way you can avoid it. If I did have to store it in session, I wouldn't store the List object itself; I'd convert the List contents to an array and store the array:
Session["mylist"] = list.ToArray();
Reality check: you have toy data.
1,000 results are nothing; tables with 3 million rows are nothing. They weren't significant even 10 years ago; today my mobile phone handles that without breaking a sweat.
Simple as that.
THAT SAID: it also goes the other way. 1,000 items are a joke memory-wise (unless they are images), so they MAY be stored in session. Unless you run a ton of users, it may be worth just keeping them in memory; there is a trade-off, but for most intranet-type applications, for example, this is doable.
My main problem with that is that session state is shared across multiple browser windows (tabs), and the number of times I have been burned by a programmer storing something in the session that broke the site for me when I was using 2-3 tabs at the same time is greater than zero. I would be careful with that: think of someone using two tabs for different searches to compare the lists.
