I want to build an MVC application that replaces the session object with the caching mechanism of .net framework 4.0.
The application has to store some values in some kind of cache to keep the database traffic and the loading times low (e.g. settings, dropdown values, etc.). The users authenticate themselves with a username and a password; the default FormsAuthentication is used here. My idea was to create a unique MemoryCache for each user and a general MemoryCache for the application.
My getter for the User MemoryCache looks like this
private MemoryCache _memCache
{
    get
    {
        if (HttpContext.Application[User.Identity.Name] == null)
        {
            HttpContext.Application[User.Identity.Name] = new MemoryCache(User.Identity.Name);
        }
        return HttpContext.Application[User.Identity.Name] as MemoryCache;
    }
}
If a user logs out the cache will be disposed.
This is the global MemoryCache for the application
private MemoryCache _appMemCache
{
    get
    {
        if (HttpContext.Application["ApplicationMemoryCache"] == null)
        {
            HttpContext.Application["ApplicationMemoryCache"] = new MemoryCache("ApplicationMemoryCache");
        }
        return HttpContext.Application["ApplicationMemoryCache"] as MemoryCache;
    }
}
The question is whether the Application object is suitable for storing the MemoryCache, and whether CacheItemPolicy works without problems in this setup.
The reason for me not to use the session is the possible timeout and the blocking of parallel requests.
This is a bad idea for a number of reasons. First and foremost, MemoryCache is both volatile and process-bound. Some attempt will be made to keep Session data around, even if it's being stored in proc, but cache is considered completely disposable: if memory runs short, it will be discarded almost immediately. Also, this will effectively kill any ability to scale your app out to multiple workers (processes), as each would have its own version of the cache, meaning whether the user's "session" was restored or not would depend on the luck of the draw of which process they landed on with their next request.
Your reasons for not just using session don't make any sense. Yes, sessions have timeouts, but these are somewhat fixed, assuming you're not using in proc. Even with in proc, the session will time out at a fixed interval, as long as something like an App Pool recycle or similar does not occur. On the contrary, MemoryCache could go away at any time. There are no guarantees about how long the data will persist, or whether it will persist at all. Just because you set it to stick around for a year doesn't mean it won't be discarded in five minutes, depending on what else is going on in the system.
As far as "blocking parallel requests" go, I'm not sure what you're talking about with that. The session is thread-safe and theoretically should allow parallel requests. There's always going to be concurrency issues with any asynchronous process like writing to a session store, but those same issues would exist with MemoryCache as well.
All that said, the things you're talking about saving actually don't belong in a session, anyways, though. While you're reasons for using something like MemoryCache are completely invalid, you should actually being using a cache to store this type of stuff. In other words, just use the cache and don't try to somehow replace the session with cache.
Also, as I mentioned previously, there are serious issues with MemoryCache in multiprocess scenarios. If you're building an app that will ever serve more than a handful of users, you should avoid MemoryCache entirely and use something like Redis instead. Redis is used by Microsoft Azure, so it has good support from Microsoft and integration into ASP.NET, but you could choose any NoSQL solution.
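To make the "just use the cache" point concrete, here is a minimal sketch of caching shared lookup data application-wide with MemoryCache.Default instead of a per-user cache parked in Application (the method names and the expiration window are made-up placeholders):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class AppCache
{
    // Cache application-wide lookup data (settings, dropdown values) in the
    // default MemoryCache instead of one MemoryCache per user.
    public static List<string> GetDropdownValues()
    {
        var cached = MemoryCache.Default.Get("DropdownValues") as List<string>;
        if (cached == null)
        {
            cached = LoadDropdownValuesFromDb(); // hypothetical DB call
            var policy = new CacheItemPolicy
            {
                // Treat the cached copy as disposable; it may be evicted earlier.
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30)
            };
            MemoryCache.Default.Set("DropdownValues", cached, policy);
        }
        return cached;
    }

    private static List<string> LoadDropdownValuesFromDb()
    {
        // Placeholder for the real database query.
        return new List<string> { "Value A", "Value B" };
    }
}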
Related
In my ASP.NET MVC application I store information in static classes with static vars. But ASP.NET recycles all data and threads after a while, and my "App_Start" procedure is called again after the cleanup.
I solved the problem with the background tasks using Hangfire.
But generating the static classes takes a long time, and the first request after a recycle has to wait while they are set up.
Why the delay? I am using Entity Framework, and for correct handling I need all records from the database with their relations.
So I hold all records in static classes and use the database as a second, fallback strategy.
I have no idea what I can do to improve performance.
My first idea was to serialize the complete data, but what is the performance of deserializing an ArrayList with 2K or more records?
Is there a way to prevent the recycle for my static ArrayList?
I'd recommend you use the application cache mechanism for ASP.NET instead. However, by default, the cache is still in-memory and maintained within the process, so app pool recycles would still wipe it out. The solution is to change the storage location of the application cache so it's in a different process. See this answer for some recommendations for how to store your application cache.
In short, I wouldn't recommend trying to avoid app pool recycling, since recycling can really save you a lot of trouble.
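As a rough sketch of what the out-of-process direction could look like, here is one possible shape using StackExchange.Redis and Json.NET (both assumptions on my part; MyRecord and LoadRecordsFromDatabase are made-up names). The cached copy survives app pool recycles, so only a truly cold cache pays the Entity Framework cost:

using System.Collections.Generic;
using Newtonsoft.Json;
using StackExchange.Redis;

public static class SharedCache
{
    // One multiplexer for the whole application (thread-safe, reusable).
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("localhost:6379");

    public static List<MyRecord> GetRecords()
    {
        IDatabase db = Redis.GetDatabase();
        string json = db.StringGet("MyApp:Records");
        if (json == null)
        {
            // Only the first request against a cold cache pays the EF cost;
            // every worker process afterwards sees the same cached copy.
            List<MyRecord> records = LoadRecordsFromDatabase(); // hypothetical EF query
            db.StringSet("MyApp:Records", JsonConvert.SerializeObject(records));
            return records;
        }
        return JsonConvert.DeserializeObject<List<MyRecord>>(json);
    }

    private static List<MyRecord> LoadRecordsFromDatabase()
    {
        return new List<MyRecord>(); // placeholder
    }
}

public class MyRecord
{
    public int Id { get; set; }
    public string Name { get; set; }
}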
So, I'm working on a huge .NET MVC 3 system where many users could be logged in at the same time. I was just writing a way of saying "hey, there's still someone logged in with this key" with HttpContext. But is this the best practice? Is it better to query the DB?
What I wrote:
MvcApplication.SessionsLock();
if (!force && MvcApplication.Sessions.Values.Any(p => p.ID.Equals(acesso.id_usuario.ToString(CultureInfo.InvariantCulture)) && p.Valid))
    throw new BusinessException("There's another user logged with this key. Continue ?");
MvcApplication.SessionsUnlock();
Or I can query my DB... maybe cookies? Any ideas would be appreciated.
Storage
The database provides a central, durable location for this information. You might use a custom data structure, or ASP.Net SQL session might meet your requirements (more below on this).
There is not a deterministic way of always knowing exactly when a user's session ended. For example, you can listen to the Session End event, but it will only fire for in-process sessions and is not guaranteed to fire at all (e.g. the OS could crash).
Regardless, if you are building a "huge system" as you state, you shouldn't design around in-proc session, as it won't scale out. Start thinking about SQL-based session state, which is more scalable (and may give you enough information to determine roughly how many users are active).
Session Pro/Con
I want to know if session is a good practice. That piece of code works. But I have been reading a lot of articles deprecating the usage of sessions in ASP.NET MVC applications.
As far as Session being a good or bad thing--as always--it depends on how it is used. Properly designed MVC apps can present fairly complex views without needing to preserve state. Part of this is due to strong support for AJAX (no need to reload the page) and elegant model binding (which can take a complex Request.Form and turn it into a complete model).
Conversely, there is nothing inherently wrong with putting small snippets of repeatedly-used information into session state, using it to avoid sending sensitive data to the client, using it to make a smoother user flow, etc.
Do beware of session fixation attacks in high-security scenarios. Session may not be appropriate and/or may need to be manually secured further.
One thing to be aware of is that ASP.Net places a lock on session. This can lead to very real performance issues when multiple requests are made at once. Normally, this isn't an issue, but consider a page with a dozen AJAX widgets which all requested data from a controller or endpoint that used session. These will contend with each other (firsthand experience).
A non-locking in-process ASP.NET session state store
https://stackoverflow.com/a/2327051/453277
MVC provides an easy way to mark a controller as needing only readonly access to Session, which eliminates the issue. However, any read/write activity to Session will still be serialized, so plan accordingly.
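For reference, a minimal sketch of that attribute on an MVC controller (the controller and action names are made up): requests to this controller read session without taking the exclusive session lock, so parallel AJAX calls are not serialized against each other.

using System.Web.Mvc;
using System.Web.SessionState;

[SessionState(SessionStateBehavior.ReadOnly)]
public class WidgetDataController : Controller
{
    public ActionResult Latest()
    {
        // Read-only access to session is fine; writes would need the default behavior.
        var userName = Session["UserName"] as string;
        return Json(new { user = userName }, JsonRequestBehavior.AllowGet);
    }
}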
Business Considerations
From a business perspective it's not always important to know that the session has expired so much as that work has ceased (do you care that they stopped using the site, or that their session timed out?). This can be reliably addressed by checking last-modified timestamps on entities and warning the users. Warn, don't lock. In my opinion, you should rarely (if ever) lock records based on login/logout in a web application; it's too easy to get stuck in a locked status.
I'm trying to use the cache in an ASP.NET MVC web application to store some list data that rarely updates. I insert this data into the cache in an UpdateCache() method like this:
HttpContext.Current.Cache.Insert("MyApp-Products", products, null, DateTime.Now.AddYears(99), Cache.NoSlidingExpiration);
Then in my model I retrieve it:
public static List<string> GetProducts()
{
    var cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    if (cachedProducts == null)
    {
        UpdateCache();
        cachedProducts = HttpContext.Current.Cache["MyApp-Products"];
    }
    return (List<string>)cachedProducts;
}
The first time I visit the page, UpdateCache() is called as expected. If I refresh, the data comes from the cache and I do not need to call UpdateCache(). However, after maybe 15 mins, I come back to the app and the cached values are gone. My understanding was that this cache was per application, not session, so I would have expected it to still be there for myself or another user.
Is there something wrong with the way I'm storing the cache? Or is there something I'm not understanding about how the Cache in ASP.NET works in a web app?
My understanding was that this cache was per application, not
session, so I would have expected it to still be there for myself or
another user.
While the cache is per application, ASP.NET doesn't provide any guarantee that if you stored something in the cache you will find it back there. The cache can be evicted under different circumstances, for example when your server starts running low on memory. You can subscribe to an event and get notified when an item is evicted from the cache. You can also define a priority when caching an item: the higher the priority, the lower the chance of the item getting evicted.
Also, since the cache is stored in the memory of the web server (by default), you should not forget that your application domain could be recycled at any point by IIS, for example after a certain amount of inactivity, if memory runs low, or if a certain CPU usage threshold is reached. Everything stored in memory, including the cache, will simply disappear into the void, and the next request to your application will start a new AppDomain.
But in any case, make sure you check whether the item is present in the cache before using it. Never rely on the fact that because you stored something in it, you will find it there.
All this to come to the really essential point, which is something very worrying about your code: it looks like you are storing a List<string> instance in the cache. But since the cache is per application, this single List<string> instance could be shared between multiple users of the application, and of course this could happen concurrently. And as you know, List<T> is not a thread-safe structure. So with this code you will, at best, get an exception and, at worst, corrupt data. Be very careful about what you cache and how you synchronize access to it, especially if you are caching a non-thread-safe class.
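To make that concrete, one possible shape (just a sketch; LoadProductsFromDb is a made-up placeholder) is to cache a read-only wrapper and rebuild the entry under a lock, so no caller ever mutates the shared instance and only one thread repopulates it at a time:

using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    private static readonly object SyncRoot = new object();

    public static ReadOnlyCollection<string> GetProducts()
    {
        var cached = HttpContext.Current.Cache["MyApp-Products"] as ReadOnlyCollection<string>;
        if (cached == null)
        {
            lock (SyncRoot) // only one thread rebuilds the cache entry
            {
                cached = HttpContext.Current.Cache["MyApp-Products"] as ReadOnlyCollection<string>;
                if (cached == null)
                {
                    // Wrap the list so callers cannot modify the shared instance.
                    cached = LoadProductsFromDb().AsReadOnly();
                    HttpContext.Current.Cache.Insert(
                        "MyApp-Products", cached, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
                }
            }
        }
        return cached;
    }

    private static List<string> LoadProductsFromDb()
    {
        return new List<string> { "Product A", "Product B" }; // placeholder
    }
}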
If this is on full IIS and it happens around every 15 minutes, remember to check the idle timeout value.
That being said, if this list "never" changes, why not store it in a static array instead?
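For example, a lazily initialized static field would do, assuming the data truly never changes while the process is alive (LoadProductsFromDb is a made-up placeholder). An app pool recycle still rebuilds it, but no cache expiration ever applies:

using System;

public static class ProductList
{
    // Built once per app domain, on first access, in a thread-safe way.
    private static readonly Lazy<string[]> Products =
        new Lazy<string[]>(LoadProductsFromDb);

    public static string[] All
    {
        get { return Products.Value; }
    }

    private static string[] LoadProductsFromDb()
    {
        return new[] { "Product A", "Product B" }; // placeholder
    }
}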
I can see how storing data in a database is generally better than in a static object.
For one, if the data does change, it is easier to update the DB than the application.
Try explicitly setting absolute expiration when caching your object:
HttpRuntime.Cache.Insert("MyApp-Products", cachedProducts, null, DateTime.Now.AddDays(20), TimeSpan.Zero);
Note HttpRuntime is used instead of HttpContext for performance reasons even though the difference is minor.
At the new place where I am working, I've been tasked with developing a web-application framework. I am new (6 months ish) to the ASP.NET framework and things seem pretty straightforward, but I have a few questions that I'd like to ask you ASP professionals. I'll note that I am no stranger to C#.
Long life objects/Caching
What is the preferred method to deal with objects that you don't want to re-initialize every time a page is hit? I noticed that there is a cache manager that can be used, but are there any caveats to using it? For example, I might want to cache various things, and I was thinking about writing a wrapper around the cache that prefixed cache names so that I could implement different caches using the same underlying .NET cache manager.
1) Are there any design considerations I need to think about for the objects that I want to cache?
2) If I want to implement a manager of some kind that is around during the lifetime of the web application (thread-safe, obviously), is it enough to initialize it during app_start and kill it in app_end? Or is this practice frowned upon, with such managers instead created in the constructor/init method of the page being served?
3) If I have a long-term object initialized at app start, is this likely to get affected when the app pool is recycled? If it is destroyed at app end, is it a case of it simply getting destroyed and then recreated again? I am fine with this restriction, I just want to get a little clearer :)
Long Life Threads
I've done a bit of research on this, and this question is probably redundant. It seems it is not safe to start a worker thread in the ASP.NET environment and that you should instead use a Windows service for long-running tasks. The latter isn't exactly a problem; the target environments will have the facility to install services, but I just wanted to double-check that this is absolutely necessary. I understand threads can throw exceptions and die, but I do not understand the reasoning behind prohibiting them. If .NET provided a thread framework that encompassed System.Thread but also provided notifications for when the application server was going to recycle the app pool, we could actually do something about it rather than just keel over and die at the point we were stopped.
Are there any solutions to threading in ASP.NET, or is it basically "use a service"?
I am sure I'll have more queries, but this is it for now.
EDIT: Thank you for all the responses!
So here's the main thing that you're going to want to keep in mind: IIS may get reset or may reset itself (based on criteria) while you're working. You can never know when that will happen unless it stops rendering your page while you're waiting on the response (in which case you'll eventually get a browser notice that the page stopped responding).
Threads
This is why you shouldn't use threads in ASP.NET apps. However, that's not to say you can't. Once again, you'll need to configure the IIS engine properly (I've had it hang when spawning a lot of threads, but that may have been machine dependent). If you can trust that nobody will cause ASP.NET to recompile your code/restart your application (by saving the web.config, for instance), then you will have fewer issues than you might otherwise.
Instead of running a Windows service, you could use an ASMX or WCF service, which also run on IIS/.NET. That's up to you. But with multiple service pools it allows you to keep everything "in the same environment" as far as installations and builds are concerned. They obviously don't share the same process pool/memory space.
"You're Wrong!"
I'm sure someone will read this far and go "but you can't thread in ASP.NET!!!", so here's the link from that venerable MSDN that shows you how to do it: http://msdn.microsoft.com/en-us/magazine/cc164128.aspx
Now onto Long life objects/Caching
Caching
So it depends on what you mean by caching. Is this per user, per system, per application, per database, or per page? Each is possible, but takes some contrivance and complexity, depending on needs.
The simplest way to do it per page is with static variables. This is also highly dangerous if you're using it for per-user stuff, because there's no indication to the end user that the variable is going to change if more than one user uses the page. Instead, if you need something to live with the user while they work with a particular page, you could either stuff it into session (server-side caching, stays with the user, usable across multiple pages) or stick it into ViewState.
The cache manager you reference above would be good for application-style caching, where everyone using the web app can use the same data store. That might be good for intensive queries where you want to get the values back as quickly as possible, so long as they're not stale. That's up to you to decide. Also, things like application settings could be stored there, if you use a database layer for storage.
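As a rough illustration of the prefix-wrapper idea mentioned in the question (all names made up), something like this keeps logically separate caches from colliding on keys while sharing the one ASP.NET cache underneath:

using System;
using System.Web;
using System.Web.Caching;

public class PrefixedCache
{
    private readonly string _prefix;

    public PrefixedCache(string prefix)
    {
        _prefix = prefix + ":";
    }

    public void Set(string key, object value, TimeSpan lifetime)
    {
        // Each logical cache gets its own key namespace in the shared store.
        HttpRuntime.Cache.Insert(
            _prefix + key, value, null,
            DateTime.Now.Add(lifetime), Cache.NoSlidingExpiration);
    }

    public T Get<T>(string key) where T : class
    {
        return HttpRuntime.Cache[_prefix + key] as T;
    }
}

// Usage: separate "settings" and "lookups" caches over the same underlying cache.
// var settingsCache = new PrefixedCache("settings");
// settingsCache.Set("SiteTitle", "My App", TimeSpan.FromHours(1));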
Long term cache objects
You could initialize it in the app_start with no problem, and the same goes for destroying it at the end if you felt the need, but yes, you do need to watch out for what I described at first about the system throwing all your code out and restarting.
Keel over and die
But you don't get notified when you're (the app pool here) going to be restarted (as far as I know) so you can pretty much keel over and die on anything. Always assume the app is going to go down on you before your request, and that every request is the first one.
Really tho, that just leads back into web-design in the first place. You don't know that this is the first visitor or the fifty millionth (unless you're storing that information in memory of course) so just like the app is stateless, you also need to plan your architecture to be stateless as much as possible. That's where web-apps are great.
If you need state on a regular basis, consider sticking with desktop apps. If you can live with stateless-ness, welcome to ASP.NET and web development.
1) The main thing about caching is understanding the lifetime of the cache, and the effects of caching (particularly large) objects in cache. Consider caching a 1MB object in memory that is generated each time your default.aspx page is hit; and after a year of production you're getting 10,000 hits an hour, and object lifetime is 2 hours. You can easily chew up TONS of memory, which can affect performance, and also may cause things to be prematurely expired from the cache, which in turn can cause other issues. As long as you understand the effects of all of this, you're fine.
2) Starting it up in Application_Start and shutting it down in Application_End is fine. You can also implement a custom HttpApplication with an http module.
3) Yes, when your app pool is recycled it calls Application_End and everything is shut down and destroyed.
4) (Threads) The issue with threads comes up in relation to scaling. If you hit that default.aspx page and it fires up a thread, and that page gets hit 10,000 times in 2 minutes, you could potentially have a ton of threads running in your application pool. Again, as long as you understand the ramifications of firing up a thread, you can do it. The ThreadPool is another story: the ASP.NET runtime uses the ThreadPool to process requests, so if you tie up all the ThreadPool threads, your application can potentially hang because there isn't a thread available to process a request.
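To tie points 2 and 4 together, here is a minimal sketch of kicking off one dedicated background thread from Application_Start so the work stays off the request-serving ThreadPool threads (WarmUpCaches is a made-up name, and the thread can still be killed by an app pool recycle at any time):

using System.Threading;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // A single dedicated thread keeps long-running work off the
        // ThreadPool threads that ASP.NET needs to serve requests.
        var worker = new Thread(WarmUpCaches) { IsBackground = true };
        worker.Start();
    }

    private static void WarmUpCaches()
    {
        // Hypothetical long-running initialization (pre-loading caches, etc.).
        // Assume it can be interrupted whenever the app domain goes down.
    }
}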
1) Are there any design considerations I need to think about for the objects that I want to cache?
2) If I want to implement a manager of some kind that is around during the lifetime of the web application (thread-safe, obviously), is it enough to initialize it during app_start and kill it in app_end? Or is this practice frowned upon, with such managers instead created in the constructor/init method of the page being served?
There's a difference between data caching and output caching. I think you're looking for data caching, which means caching some object for use in the application. This can be done via HttpContext.Current.Cache. You can also cache page output and vary it on conditions so the page logic doesn't have to run at all; this functionality is also built into ASP.NET. Something to keep in mind when doing data caching is that you need to be careful about the scope of the things you cache. For example, when using Entity Framework, you might be tempted to cache some object that's been retrieved from the DB. However, if your DB context is scoped per request (a new one for every user visiting your site, probably the correct way), then your cached object will rely on this DB context for lazy loading, but the DB context will be disposed of after the first request ends.
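To illustrate that scoping point, here is a hedged sketch of caching fully materialized DTOs so nothing in the cache depends on a disposed DB context (MyDbContext, its Products set, and ProductDto are all made-up names):

using System.Collections.Generic;
using System.Linq;
using System.Web;

public static class ProductLookup
{
    public static List<ProductDto> GetProducts()
    {
        var cached = HttpContext.Current.Cache["ProductLookup"] as List<ProductDto>;
        if (cached == null)
        {
            using (var context = new MyDbContext()) // hypothetical EF context
            {
                // Project to plain DTOs and call ToList() so the cached objects
                // never need the (soon to be disposed) context for lazy loading.
                cached = context.Products
                    .Select(p => new ProductDto { Id = p.Id, Name = p.Name })
                    .ToList();
            }
            HttpContext.Current.Cache["ProductLookup"] = cached;
        }
        return cached;
    }
}

public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}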
3) If I have a long-term object initialized at app start, is this likely to get affected when the app pool is recycled? If it is destroyed at app end, is it a case of it simply getting destroyed and then recreated again? I am fine with this restriction, I just want to get a little clearer :)
Perhaps the biggest issue with threading in ASP.NET is that it runs in the same process as all your requests. Even if this weren't an issue in and of itself, IIS can be configured (and if you don't own the servers, almost certainly will be configured) to shut down the app if it's inactive (which you mentioned), which can cause issues for these threads. I have seen solutions to that ranging from making sure IIS never recycles the app pool to spawning a thread that hits the site to keep it alive, even on hosted servers.
I'm having some trouble getting my cache to work the way I want.
The problem:
The process of retrieving the requested data is very time consuming. If using standard ASP.NET caching some users will take the "hit" of retrieving the data. This is not acceptable.
The solution?:
It is not super important that the data is 100% current. I would like to serve old, invalidated data while updating the cached data in another thread, making the new data available for future requests. I reckon that the data needs to be persisted in some way so that the first user after an application restart can be served without taking the "hit".
I've made a solution which does somewhat of the above, but I'm wondering if there is a "best practice" way, or if there is a caching framework out there that already supports this behaviour?
There are tools that do this, for example Microsoft's ISA Server (which may be a bit expensive/overkill).
You can cache it in memory using Enterprise Library Caching. Let your users read from the cache, and have other pages that update the cache; these other pages should be called as regularly as you need to keep the data up to date.
You could also listen for when the cached item is removed and process it then:
public void RemovedCallback(String k, Object v, CacheItemRemovedReason r)
{
    // Put the item back in the cache (so others can use it until you have finished grabbing the new data)
    // Spawn a thread to go get up-to-date data
    // Overwrite the old data with the new result...
}
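For what it's worth, that callback only runs if it is wired up when the item is inserted; here is a rough sketch using the Cache.Insert overload that takes a CacheItemRemovedCallback (the key name and the refresh idea are just illustrative):

using System;
using System.Web;
using System.Web.Caching;

public static class ReportCache
{
    public static void Store(object reportData)
    {
        HttpContext.Current.Cache.Insert(
            "ReportData",
            reportData,
            null,                              // no dependencies
            DateTime.Now.AddMinutes(10),       // absolute expiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.Default,
            RemovedCallback);                  // fires when the item is evicted
    }

    private static void RemovedCallback(string key, object value, CacheItemRemovedReason reason)
    {
        // Put the stale item back so readers keep getting data,
        // then refresh it on a background thread as described above.
    }
}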
In Global.asax:
protected void Application_Start(object sender, EventArgs e)
{
    // Spawn worker thread to pre-load critical data
}
Ohh... I have no idea if this is best practice, I just thought it would be slick~
Good Luck~
I created my own solution with a Dictionary/Hashtable in memory as a duplicate of the actual cache. When a method call came in requesting the object from the cache and it wasn't there but was present in memory, the memory-stored object was returned and a new thread was fired to update the object in both memory and the cache using a delegate method.
You can do this pretty easily with the Cache and Timer classes built into .NET. The Timer runs on a separate thread.
And I actually wrote a very small wrapper library called WebCacheHelper which exposes this functionality in an overloaded constructor. The library also serves as a strongly typed wrapper around the Cache object.
Here's an example of how you could do this...
public readonly static WebCacheHelper.Cache<int> RegisteredUsersCount =
new WebCacheHelper.Cache<int>(new TimeSpan(0, 5, 0), () => GetRegisteredUsersCount());
This has a lazy loading aspect to it where GetRegisteredUsersCount() will be executed on the calling thread the instant that RegisteredUsersCount is first accessed. However, after that it's executed every 5 minutes on a background thread. This means that the only user who will be penalized with a slow wait time will be the very first user.
Then getting the value is as simple as referencing RegisteredUsersCount.Value.
Yeah, you could just cache the most frequently accessed data when your app starts, but that still means the first user to trigger that would "take the hit", as you say (assuming an in-proc cache, of course).
What I do in this situation is use a cache table in the DB to hold the latest data, and run a background job (with a Windows service; in a shared environment you can also use threads) that refreshes the data in the table.
There is a very small possibility of showing the user a blank screen. I eliminate this by also caching via the ASP.NET cache for 1 minute.
I don't know if it's a bad design, but it's working great without a problem on a heavily used web site.