Should dynamic business objects for a site be stored in the user's session or in the ASP.NET cache (objects such as orders, profile information, etc.)?
I have worked with sites that used sessions to store business objects, but I was wondering: what are the advantages and disadvantages of caching?
If the objects are shareable between user sessions, use the cache. If the objects are unique to each session -- perhaps because they are governed by permissions -- then store them in the session. The in-process session itself is stored in the cache, so the deciding factor really should be the scope of the data.
Caching is just that -- caching. You can never rely on entries still being there, so make no assumptions in that respect: be prepared to go straight to the DB (or wherever else) to refetch the data.
Session, on the other hand, is better suited to storing objects, though personally I try to avoid the session store in favour of a DB. I usually do that by abstracting the store away behind an opaque ISessionStore interface:
interface ISessionStore
{
    T GetEntry<T>(string key);
    void SaveEntry<T>(string key, T entry);
}
and then "dependency-injecting" appropriate implementation, be it InmemorySessionStore, DbSessionStore or whatever.
The ASP.NET system cache is global to the application, whereas the session is unique to the current user. If you choose to use the global cache to store objects, you need an object-identification strategy so you can retrieve the correct objects on a per-user basis.
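For example, one simple (hypothetical) identification strategy is to build the cache key from the current user's identity, as in this minimal sketch:

using System.Web;

public static class PerUserCache
{
    // Prefix the cache key with the user name so per-user objects stored in the
    // application-wide cache do not collide between users.
    private static string KeyFor(string name)
    {
        return HttpContext.Current.User.Identity.Name + "_" + name;
    }

    public static void Save(string name, object value)
    {
        HttpRuntime.Cache[KeyFor(name)] = value;
    }

    public static object Get(string name)
    {
        return HttpRuntime.Cache[KeyFor(name)];   // may be null if the entry was evicted
    }
}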
If you are looking to enhance performance, you would be better off replacing the ASP.NET session state with a distributed memory cache such as Microsoft's Velocity. Microsoft has posted articles on how to replace session usage with Velocity. You could also use memcached or other similar products in a related fashion.
Session objects are suitable for user-only data; Cache objects, on the other hand, are more suitable for data shared across the application.
The key to choosing one or the other is to determine whether what you are trying to store is user-only data or data that needs to be shared across the whole application.
Session => A web page with a step-by-step interface (an online test, for example).
Cache => The info displayed in some kind of weather widget (like the one Google has on its igoogle.com page).
Hope this helps.
Although you can store your business objects in Cache, Cache is designed for performance improvement, not state management. Imagine you have a process that fetches 1000 records from the database (taking about 3 seconds) and you will need the results for a few minutes. You can store the objects in Cache and set an expiration date, a priority and a dependency (like a SqlCacheDependency or a file-based CacheDependency), so that subsequent requests can use the cached data instead of retrieving it from the database.
You can store your objects in Session too, but you cannot set a dependency on Session by default. Cache also has a unique behaviour: when the system needs memory, it releases objects from the cache depending on their priority. Finally, Cache objects are global to the application and shared between all users, whereas Session is not shared and is per-user.
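As a minimal sketch of that idea (the "Products" key, the DataTable shape and the GetProductsFromDb stub are hypothetical; a real implementation might attach a SqlCacheDependency instead of passing null):

using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class ProductCacheHelper
{
    public static DataTable GetProducts()
    {
        // Try the cache first; entries can be evicted at any time,
        // so always be prepared to fall back to the database.
        DataTable products = HttpRuntime.Cache["Products"] as DataTable;
        if (products == null)
        {
            products = GetProductsFromDb();   // the hypothetical ~3 second query

            HttpRuntime.Cache.Insert(
                "Products",
                products,
                null,                              // optionally a SqlCacheDependency or file CacheDependency
                DateTime.UtcNow.AddMinutes(5),     // absolute expiration
                Cache.NoSlidingExpiration,
                CacheItemPriority.High,            // evicted later than low-priority items under memory pressure
                null);                             // no removal callback
        }
        return products;
    }

    private static DataTable GetProductsFromDb()
    {
        // Placeholder for the real database call.
        return new DataTable("Products");
    }
}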
My Problem:
My app has a number of pages/resources that should be accessible to guest users, but only after providing a correct [Resource Code - Unique Token] pair (each page has one Page Code and multiple unique "Tokens" issued for each user). Tokens are generated beforehand and stored in the DB. After accessing the page, the user will be able to interact with multiple other resources belonging to that particular page.
How I organized this so far:
The page asks the user to provide a token and checks it against the records in the DB. If it is a correct token for the requested resource, it writes a cookie; then, every time the user interacts with the resource or its content, the controller reads the cookie and checks the [PageCode-Token] pair against the database before continuing the action.
Question:
Is there any other, more elegant and efficient approach? Should I use Session instead? I feel a bit bad about querying the DB every time.
This depends on how many users access your service. If the volume is very large, it is recommended to create a cache where all the tokens are stored, thus avoiding overloading the database. However, if the service is not widely used, this is not necessary, as a database can handle a lot of requests.
You could create a cache in two ways: using ready-made software, or creating a small cache within the project itself.
If you choose to use software, I would recommend Redis. It is a cache database that stores values with or without a timeout, i.e. after a while the tokens are deleted.
Keep in mind that this does not free you from making requests to the database entirely: you always query the cache first (Redis) and, if the value does not exist, look it up in the database.
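A minimal sketch of that cache-aside lookup with the StackExchange.Redis client (the connection string, the key format and the IsTokenValidInDb stub are all hypothetical):

using System;
using StackExchange.Redis;

public class TokenValidator
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("localhost");   // hypothetical connection string

    public bool IsValid(string pageCode, string token)
    {
        IDatabase cache = Redis.GetDatabase();
        string key = "token:" + pageCode + ":" + token;

        // 1. Check the cache first.
        if (cache.KeyExists(key))
            return true;

        // 2. On a cache miss, fall back to the database.
        bool valid = IsTokenValidInDb(pageCode, token);
        if (valid)
        {
            // Cache the positive result with a timeout so it eventually expires.
            cache.StringSet(key, "1", TimeSpan.FromMinutes(30));
        }
        return valid;
    }

    private bool IsTokenValidInDb(string pageCode, string token)
    {
        // Placeholder for the real database lookup.
        return false;
    }
}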
But if you choose to create your own, you will need to do most things manually and always know how many resources can be allocated. It may be more advantageous to use existing software than to reinvent the wheel.
I'm working with ASP.NET and I want to load a big object (specific to each user) once in my controller and then use it in my view.
I thought about a static property but I found some problems with it.
See: Is it a bad idea to use a large static variable?
I'm not familiar with this language and I have no idea how to properly share the same object between the different methods for each user. Could you tell me how to do that? Do you know if a singleton could be a solution?
A singleton won't help you here if this big object is going to be different for every user.
Without knowing all the details, I'd say perhaps your best option is to store it in the Session object,
HttpContext.Current.Session["bigObject"] = bigObject;
Depending on the size and the traffic, this can cause performance problems, so I'd suggest you read up on the pros and cons of using Session.
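As a minimal sketch, reading it back with a lazy rebuild on a session miss (the controller, the BigObject type and the BuildBigObjectForUser helper are all hypothetical):

using System.Web.Mvc;

public class ProfileController : Controller
{
    public ActionResult Details()
    {
        // Try the per-user session first; rebuild the object only when it is
        // missing (first visit, session timeout, or application recycle).
        BigObject bigObject = Session["bigObject"] as BigObject;
        if (bigObject == null)
        {
            bigObject = BuildBigObjectForUser(User.Identity.Name);  // expensive, hypothetical
            Session["bigObject"] = bigObject;
        }
        return View(bigObject);
    }

    private BigObject BuildBigObjectForUser(string userName)
    {
        // Placeholder for the real, expensive construction.
        return new BigObject();
    }
}

// Hypothetical stand-in for the "big object" mentioned in the question.
public class BigObject { }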
If you want something to be available only for the next request, then use TempData -- a bucket where you can hold data that is needed only for the following request. That is, anything you put into TempData is discarded after the next request completes.
If you want to persist information specific to a user, then go for Session. With Session you have a timeout, so after a certain amount of time the data stored in the session will be lost; you can configure this value to be much bigger. Apart from that, when you move to a web farm of servers, maintaining session state becomes a problem, and you will probably need to manage session in SQL Server or some other store.
Alternatively you can use the runtime Cache object in ASP.NET MVC to keep common data. Cached data can be accessed quickly, and you get other benefits such as cache expiration. You can share this Cache object across users, or you can maintain different cache entries for different users; that is purely dependent on your logic. In the case of web farms, you have distributed cache servers like Redis, Azure Cache, etc., which you can use.
The project I am working on is facing a design dilemma on how to get objects and collections of objects from a database. Sometimes it is useful to buffer *all* objects from the database with their properties into memory, sometimes it is useful to just set an object id and query its properties on demand (1 db call per object to get all properties). And in many cases, collections need to support both buffering objects into memory and being initialized with minimum information for on-demand access. After all, not everything can be buffered into memory and not everything can be read on demand. It is the ubiquitous memory vs IO problem.
Did anyone have to face the same problem? How did it affect your design? What are the tough lessons learned? Any other thoughts and recommendations?
EDIT: my project is a classic example of a business-layer DLL, consumed by a web application, web services and a desktop application. When a list of products is requested for the desktop application and displayed only by product name, it is OK to have this sequence of steps to display all products (let's say there are a million products in the database):
1. One db call to get all product names
2. One db call to get all product information if the user clicks on the product to see details (on-demand access)
However, if this same API is going to be consumed by a web service to display all products with details, the network traffic will become chatty. The better sequence in this case would be:
1. What the heck, buffer all products and product fields from just one db call (in this case buffering 1 million products also looks scary)
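One way to support both access patterns behind the same API is to keep a cheap header plus lazily loaded details, as in this minimal sketch (the Product shape and the repository methods are hypothetical):

using System;
using System.Collections.Generic;

public class Product
{
    private readonly Lazy<ProductDetails> _details;

    public Product(int id, string name, Func<int, ProductDetails> loadDetails)
    {
        Id = id;
        Name = name;
        _details = new Lazy<ProductDetails>(() => loadDetails(id));
    }

    public int Id { get; private set; }
    public string Name { get; private set; }

    // First access triggers one DB call; later accesses reuse the buffered result.
    public ProductDetails Details { get { return _details.Value; } }
}

public class ProductDetails
{
    public string Description { get; set; }
    public decimal Price { get; set; }
}

public interface IProductRepository
{
    // Desktop scenario: one cheap call returning names only; details load on demand.
    IList<Product> GetProductHeaders();

    // Web-service scenario: one big call that buffers products with details already populated.
    IList<Product> GetProductsWithDetails();
}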
It depends how often the data changes. It is common to cache static and near static data (usually with a cache expiry window).
Databases are already designed to cache data, so provided network I/O is not a bottleneck, let the database do what it is good at.
Have you looked at some of the caching technologies available?
.NET Framework 4 ObjectCache Class
Cache Class: Using the ASP.NET Cache outside of ASP.NET
Velocity: Build Better Data-Driven Apps With Distributed Caching
Object cache for C#
This is not a popular position, but avoid all caching unless absolutely necessary, or unless you know for sure from the outset that you are going to need Internet scale. Have you tried to scale out a layered cache on top of the database? Are you going to write through the cache, or only read from it and wait for an LRU eviction to write changes? What happens when another app or web-services tier sits on top of the DB and gets inconsistent reads?
Most modern databases already have a cache and can likely implement it better than you; just decide whether you want to hit the DB wire every time you need something. In the large majority of cases the DB will perform just fine and you will keep your consistency. BASE and CAP theory are nice and fun to talk about and imagine, but you sometimes just can't beat the cost-to-market of hitting the good old database. Stress test and determine your hotspots, and implement your cache conservatively if needed.
We have a web application that is storing all the site data in HttpRuntime.Cache.
We now need to deploy the application across 2 load balanced web servers.
This being the case, each web server will have its own cache, which is not ideal: if a user requests data from webserver1 it will be cached there, but their next request might go to webserver2, and the data that their previous request cached won't be available.
Is it possible to use a shared-cache provider to share the HttpRuntime.Cache between the two web servers or to replicate the cache between them, so that the same cache will be available on both web servers? If so, what can I do to solve this problem?
No, you can't share the built-in ASP.NET cache, but you could use something like memcached or AppFabric instead.
Nope, it's not possible. You have to use a so-called distributed cache, such as Microsoft AppFabric Caching or the very popular open-source product memcached.
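For illustration, a minimal sketch of the cache-aside pattern against memcached using the Enyim client (the "site-data" key, the SiteData type and the LoadSiteDataFromDb stub are hypothetical, and the memcached server list is assumed to be configured for the client):

using System;
using Enyim.Caching;
using Enyim.Caching.Memcached;

public class SharedSiteCache
{
    // Reads the memcached server list from the application's configuration.
    private static readonly MemcachedClient Client = new MemcachedClient();

    public SiteData GetSiteData()
    {
        // Both load-balanced web servers talk to the same memcached cluster,
        // so a value cached by one is visible to the other.
        SiteData data = Client.Get<SiteData>("site-data");
        if (data == null)
        {
            data = LoadSiteDataFromDb();                      // hypothetical DB fallback
            Client.Store(StoreMode.Set, "site-data", data);
        }
        return data;
    }

    private SiteData LoadSiteDataFromDb()
    {
        // Placeholder for the real database call.
        return new SiteData();
    }
}

// Hypothetical stand-in for whatever the application currently keeps in HttpRuntime.Cache.
[Serializable]
public class SiteData { }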
It sounds from your question that you've got user data in the cache? In that case I'd be with Aliostad and say don't go there!
HttpRuntime cache should be used for static but regularly used items that come from the database; the main purpose should be preventing database hits that would otherwise occur on every request regardless of the user, so things like options in a combobox or certain configuration settings.
If you do genuinely need caching for user data, then as above: memcached, AppFabric or nvelocity.
There are layers of caching suitable for different needs; having only 2 web servers suggests that you don't yet require the distributed caching frameworks above.
What is the server load, and what is the limiting factor: CPU, RAM, network bandwidth? On your DB or your web servers? Each of these indicates a different caching strategy.
Don't go there. Normally the cache, being a static object, only lives in the AppDomain. Manually keeping these in sync across servers is a world of pain, and I strongly advise against it.
You can use a number of caching solutions that sit in front of your server for that kind of purpose.
I'm definitely not a fan of WebForms; in the .NET world I prefer ASP.NET MVC.
Regardless, I'm working on a small part of a very large legacy WebForms application.
I'm integrating Korzh.com's EasyQuery.NET to allow end users to create their own SQL queries based on pre-defined models, made user friendly with aliases.
This is relevant because Korzh's demo uses Global.asax for its model and query class, along with Session.
Because the legacy WebForms application is very large, Global.asax is not used for page-specific items.
My solution was to use private static instead. Statics work well in desktop applications, but seem at the very least likely to cause some grief in WebForms applications.
I've discovered that !IsPostBack is not too reliable, and it seems to me that in WebForms the best practice may be to use Session. The problem with Session is that it seems to be passed to the client with the HTML and can grow very large in kilobytes.
QUESTIONS:
Since static variables reside on the IIS server when used with WebForms, does every user of a WebForms application share the same static variable address space? (I think the answer is yes).
What are the best practices/guidelines for using/not using static variables with ASP.NET WebForms applications?
Thank you.
Regards,
Gerry (Lowry)
P.S.: I could not find answers via Google or searching SO.
In ASP.NET, static instances will live for the lifetime of the application, that being the web application itself, until it is recycled or shut down, e.g.:
using System.Web;

public class Global : HttpApplication
{
    // One instance shared by every request until the application recycles.
    public static string MyString;
}
Because of this, the static field is accessible to all requests made to the application, so it is not the place to be storing page-specific items. There are quite a few storage mechanisms available:
HttpRuntime.Cache and HttpContext.Cache both point to the same cache instance, and items exist for the lifetime of the application (so it has the same issues as static instances).
HttpContext.Items, a request-specific collection of items. Each request made to the application has its own collection of items.
HttpSessionState session, persisted for the length of the user's visit, or until it times out. This can be configured in four ways (a minimal web.config sketch follows this list):
3.a. InProc, session objects are stored in memory by the worker process itself. Fast access, no serialisation required, but if the application recycles, the session data is lost.
3.b. SqlServer, session objects are serialised and stored in a Sql Server database. Requires all session-stored items to be serialisable. Session objects persist even when an application recycles.
3.c. StateServer, session objects are stored in a separate process, and data persists through application recycles.
3.d. Custom session provider, that's up to you....
ViewState, this is where data is persisted to the client-side and is posted back to the server to rebuild control states between page views.
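As referenced above, a minimal web.config sketch for switching between these session modes (only one sessionState element may appear in a real config; the connection strings and timeout are placeholder values):

<!-- 3.a. In-process (the default): fast, but session data is lost on an application recycle. -->
<sessionState mode="InProc" timeout="20" />

<!-- 3.b. SQL Server: survives recycles and works across a web farm; stored items must be serialisable. -->
<sessionState mode="SQLServer"
              sqlConnectionString="Data Source=.;Integrated Security=True"
              timeout="20" />

<!-- 3.c. State server: a separate out-of-process state service. -->
<sessionState mode="StateServer"
              stateConnectionString="tcpip=127.0.0.1:42424"
              timeout="20" />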
I would avoid using static instances and the HttpRuntime cache for anything user related. Use these mechanisms for shared, common information, such as configuration, caching, etc. Session is likely the place you want to store things on a per-user basis. If you are looking for a per-page solution, it's a lot simpler, because you simply make the variables part of the page structure itself, as properties or fields; you just have to manage the initialisation of those fields (see the sketch below).
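For instance, a minimal sketch of a per-page value kept as a property backed by ViewState (the page class and the key are hypothetical):

using System.Web.UI;

public partial class ProductPage : Page
{
    // Survives postbacks for this page only, because it travels in the page's ViewState.
    public int SelectedProductId
    {
        get { return ViewState["SelectedProductId"] == null ? 0 : (int)ViewState["SelectedProductId"]; }
        set { ViewState["SelectedProductId"] = value; }
    }
}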
Hope that helps.