Performance issues to consider when adding a cookie for 100K users? - c#

I'm looking to track clicks on a site, mostly using jQuery to fire click events on anchors and buttons. Those events add/update a cookie that I'll log to internal databases when a user hits an error page.
This means adding some jQuery logic to the master page (which is on every page) that gets and updates the cookie, along with hooking up jQuery click events to most objects on the site.
The site sees roughly 100K unique visitors a day and is already heavy on database calls.
This is the first time I've incorporated anything like this into a site of this size. I'd like to know whether any experts have concerns about doing something like this, what kind of performance overhead I can expect this solution to cause, any implementation ideas that would make it as unnoticeable as possible to the user and to server load, etc.
I appreciate any guidance anyone can give.

Logging to a cookie is not a great idea, as cookies have a size limit (4KB per cookie, if I remember correctly). If you want to log user activity, consider using AJAX to send a request to the server and implementing a high-performance logging system on the server side.
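For illustration, here is a minimal C# sketch of such a server-side logging endpoint. The handler name, form fields, and the queue-then-flush design are my assumptions, not something from the question; the point is just that each click costs one tiny request and no immediate DB write:

using System.Collections.Concurrent;
using System.Web;

// Hypothetical endpoint the jQuery click handlers POST to. Clicks are
// buffered in memory; a background job would drain the queue to the DB.
public class ClickLogHandler : IHttpHandler
{
    public static readonly ConcurrentQueue<string> Pending =
        new ConcurrentQueue<string>();

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string page = context.Request.Form["page"];     // page the click happened on
        string target = context.Request.Form["target"]; // element that was clicked
        Pending.Enqueue(page + "|" + target);
        context.Response.StatusCode = 204;              // empty "no content" reply
    }
}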

The page load performance hit of adding a cookie to the request is going to depend on the current request size and the number of requests made to the domain with the new cookie for each page load. The cookie will get added to each request (including images, css, js, etc.) so depending on your scenario, it could be a large impact or a small one.
One way to combat the page-load size issue is to serve static resources from a different, cookieless domain than the domain that serves the pages. Requests to the cookieless domain will not include the new cookie and hence will not be affected by it. (Stack Overflow does exactly this.)
As others have mentioned, however, using a cookie may not be the best way to track this, since it has such a large impact on load time. Instead, I would suggest tracking this server-side by storing the relevant information in the user's session. In addition to not increasing client load time in any significant way, you also have the advantage of being able to deal with the branching traffic pattern often seen when users open multiple tabs off a single page.
This does have a slight disadvantage in that you are taking on a slightly higher server-side load, but I would be very careful when it comes to increasing the client-side load time, since that is a critical metric for user happiness and engagement. A server-side load increase can be engineered around, especially for a 100k user site. A less-happy user due to a slower page load is much harder to fix.
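As a rough sketch of that server-side approach (the class, the key name, and the 50-entry cap are my assumptions, not the poster's design):

using System.Collections.Generic;
using System.Web.SessionState;

// Keep the click trail in session so nothing is added to any HTTP request.
public static class ClickTrail
{
    private const string Key = "ClickTrail";
    private const int MaxEntries = 50;

    public static void Record(HttpSessionState session, string page)
    {
        var trail = session[Key] as List<string> ?? new List<string>();
        trail.Add(page);
        if (trail.Count > MaxEntries)
            trail.RemoveAt(0); // keep only the most recent clicks
        session[Key] = trail;
    }

    // Called from the error page to pull the trail for DB logging.
    public static IList<string> Flush(HttpSessionState session)
    {
        var trail = session[Key] as List<string> ?? new List<string>();
        session.Remove(Key);
        return trail;
    }
}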

You could do some load testing; Visual Studio has a nice implementation.
No one else can tell you how well your hardware is going to handle this sort of thing, so load testing in a staging environment is generally your best option. If your staging environment is similar enough, you should be able to load test it with a significantly smaller number of virtual users and see what sort of effects it will have on your servers.

Related

If data caching is used like a session, would it have better performance?

I am working on maintenance of an ASP.NET application where I found previous developers implemented data caching like a session, meaning they stored data in the cache per user session, like this:
Public Function GetDataCache(ByVal dataCacheKey As String) As Object
    dataCacheKey = dataCacheKey & Convert.ToString(LoginSessionDO.UserID)
    Return Cache(dataCacheKey)
End Function
In this application there are many screens where the user can add multiple rows (data) to a grid temporarily; the rows are stored in the cache for that particular user only, until the user presses a save button that saves the data to the database.
My question is: if caching is used like a session, will it give any performance improvement?
I could change it in my dev environment to check performance, but we cannot generate production-like load in our environment, and without some assurance I cannot change and deploy code to production.
Please advise:
Is caching good the way it is implemented?
Used like a session this way, does it have better performance than session?
The cache will need to be cleared out, otherwise all items will remain until the app domain recycles. Session has a much shorter expiry and can be explicitly abandoned on log out, for example.
This might be a scaling issue if your site grows. However, the shorter expiry time of the session might cause you issues with saving if it is no longer there when expected. A staging table in the db might be a better approach.
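If the cache approach is kept, giving each entry an explicit expiry avoids the live-until-recycle problem. A C# sketch of the same per-user pattern (the 20-minute sliding window is an assumption):

using System;
using System.Web;
using System.Web.Caching;

public static class PerUserCache
{
    // Like the question's GetDataCache, but entries expire as a session would.
    public static void Set(string dataCacheKey, int userId, object value)
    {
        HttpRuntime.Cache.Insert(
            dataCacheKey + userId,
            value,
            null,                          // no cache dependency
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20));     // sliding expiration
    }

    public static object Get(string dataCacheKey, int userId)
    {
        return HttpRuntime.Cache[dataCacheKey + userId];
    }
}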
An edit from several years after the initial answer.
In reality, it would be much preferable to store the added rows on the client side and then submit them all in one go. Either of the server-side options above runs into issues if the app domain recycles in the middle of a session, and both will cause scaling issues on the server with enough users/data.

Does storing data in ViewState slow down the page load?

I have stored my DataSet in the ViewState (because I need to filter the data on different client clicks and show it), but the page load is taking a lot of time; even a checkbox checked event (with AutoPostBack) that has no code to execute takes almost 2-3 seconds.
Is this just because of the ViewState data? If so, are there alternatives with which I can achieve my task? I need the data to be shown quickly on client events, which is why I have been using the ViewState. Any workaround would help.
As @Tushar mentioned above, ViewState is not the place to store large amounts of data. It's really only meant to preserve the state of controls between round trips, and overusing it can really lead to poor app performance.
Instead you should look into the following server managed options:
Application State - Used for storing data that is shared between all users. Uses server memory.
Session State - Used for storing data specific to a user's session. Also uses server memory. Data can be persisted through app restarts, as well throughout a web-garden or server-farm. More info from MSDN here: http://msdn.microsoft.com/en-us/library/z1hkazw7.aspx
The biggest cons of those methods are memory management, as both options consume server memory, and keep data until there is either a restart of some sorts, or until the session is dropped. Thus, these methods don't always scale well.
Also, MSDN has an article discussing the various .NET methods of state management, with pros and cons for each method.
A third option is to implement a caching strategy by either using the .NET caching libraries, building your own and/or using 3rd party caching servers/libraries. The benefit to using cache is that you have the data automatically expire after any given specified amount of time. However, complexities are introduced when working in a web-garden or server-farm environment.
The biggest thing to remember, is that any of the strategies mentioned above will require some planning and consideration in regards to managing/sharing the data.
If you're storing a large amount of data in ViewState, you'll notice performance issues. Although ViewState is meant for "this page only" and Session for "this session", you'll reach a ViewState size at which Session is ultimately much better for performance.
It's worth noting that you might be having some other type of issue, not just an issue with the ViewState (i.e. your database query may be taking a long time and could possibly be cached).
The ViewState makes the page slightly larger due to the extra data embedded in the page's HTML to hold the serialized ViewState. Whether that extra size will cause load problems depends on the connection speed, and on the size of the view state relative to the rest of the page.
The ViewState is sent back to the server with each HTTP request (so including your AutoPostback). Again, whether that causes a noticeable performance issue depends on the view state size and the connection speed.
On a broadband(ish) connection with the amount of ViewState data one would find in a typical page, you would not see 2-3 seconds additional processing time.
Diagnosing
Use the developer tools in your browser (in IE, press F12). You can monitor web requests including the exact header and body sent and received. You can also see the timing for each HTTP request. If the ViewState is not huge (not more than 1-2K perhaps) and your connection speed is not excessively slow, that is not your culprit.
Alternatives
You can hold state entirely server-side, or put any state items that are large entirely on the server. You can use Ajax requests to process page events that depend on that state.
Instead of loading data from a data source multiple times, only do it once. The other answers talk about accessing the data; I have run into instances where the data was loaded on every post-back.
public string MyString
{
    get
    {
        // Load the data only once; later reads come straight from ViewState.
        if (this.ViewState["myData"] == null)
        {
            // The expensive load (database call, etc.) goes here.
            this.ViewState["myData"] = "Hello";
        }
        return this.ViewState["myData"] as string;
    }
}
How much ViewState slows down your page depends upon how much view state you have. I've inherited pages that generated over a megabyte of ViewState and seen the web server spend 10 seconds just processing it. If you don't want to rewrite your application and you really need that much view state, investigate alternate strategies for saving/restoring it: saving ViewState to a database or even a plain file is much faster, since you don't have to stream it to and from the client on each request.
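ASP.NET ships with a SessionPageStatePersister that does exactly this; a minimal sketch (the page class name is hypothetical):

using System.Web.UI;

// Keep view state in session; the page carries only a small identifier
// instead of the full serialized state on every round trip.
public class ServerViewStatePage : Page
{
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}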
Best strategy is to avoid viewstate in the first place though.
Just thought I should add: some controls are simply ViewState pigs; some grids are just terrible for ViewState consumption.
You can view the source of your page, copy the ViewState value, and use the online ViewState decoder at the URL below to check how large the values stored in your ViewState field are for your pages:
http://ignatu.co.uk/ViewStateDecoder.aspx
If you find your viewstate is having large stored values then you should find alternatives for storing your Dataset.
Anyways, you should avoid putting the Dataset into your ViewState.

"User already logged" HttpContext or Request DB

So, I'm working on a huge .NET MVC 3 system where many users can be logged in at the same time. I was writing a way of saying "hey, there's still someone logged in with this key" using HttpContext. But is this the best practice? Or is it better to query the DB?
What I wrote:
MvcApplication.SessionsLock();
try {
    if (!force && MvcApplication.Sessions.Values.Any(p => p.ID.Equals(acesso.id_usuario.ToString(CultureInfo.InvariantCulture)) && p.Valid))
        throw new BusinessException("There's another user logged in with this key. Continue?");
} finally {
    MvcApplication.SessionsUnlock(); // release the lock even when the exception is thrown
}
Or I can query my DB... maybe cookies? Any ideas would be appreciated.
Storage
The database provides a central, durable location for this information. You might use a custom data structure, or ASP.Net SQL session might meet your requirements (more below on this).
There is not a deterministic way of always knowing exactly when a user's session ended. For example, you can listen to the Session End event, but it will only fire for in-process sessions and is not guaranteed to fire at all (e.g. the OS could crash).
Regardless, if you are building a "huge system" as you state, you shouldn't design around in-process session state, as it won't scale upward. Start thinking about SQL-based session state, which is more scalable (and may give you enough information to determine roughly how many users are active).
Session Pro/Con
I want to know if session is a good practice. That piece of code works, but I have been reading a lot of articles deprecating the usage of sessions in ASP.NET MVC applications.
As far as Session being a good or bad thing--as always--it depends on how it is used. Properly designed MVC apps can present fairly complex views without needing to preserve state. Part of this is due to strong support for AJAX (no need to reload the page) and elegant model binding (which can take a complex Request.Form and turn it into a complete model).
Conversely, there is nothing inherently wrong with putting small snippets of repeatedly-used information into session state, using it to avoid sending sensitive data to the client, using it to make a smoother user flow, etc.
Do beware of session fixation attacks in high-security scenarios. Session may not be appropriate and/or may need to be manually secured further.
One thing to be aware of is that ASP.Net places a lock on session. This can lead to very real performance issues when multiple requests are made at once. Normally, this isn't an issue, but consider a page with a dozen AJAX widgets which all requested data from a controller or endpoint that used session. These will contend with each other (firsthand experience).
A non-locking in-process ASP.NET session state store
https://stackoverflow.com/a/2327051/453277
MVC provides an easy way to mark a controller as needing only readonly access to Session, which eliminates the issue. However, any read/write activity to Session will still be serialized, so plan accordingly.
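A sketch of that in use (controller and action names are hypothetical):

using System.Web.Mvc;
using System.Web.SessionState;

// Requests to this controller take only a read lock on session, so
// concurrent AJAX widget calls are not serialized against each other.
[SessionState(SessionStateBehavior.ReadOnly)]
public class WidgetDataController : Controller
{
    public ActionResult LatestStats()
    {
        var userName = Session["UserName"] as string; // reading is fine here
        return Json(new { user = userName }, JsonRequestBehavior.AllowGet);
    }
}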
Business Considerations
From a business perspective, it's not always important to know that the session has expired so much as that work has ceased (do you care that they stopped using the site, or that their session timed out?). This can be reliably addressed by checking last-modified timestamps on entities and warning the users. Warn, don't lock. In my opinion, you should rarely, if ever, lock records based on login/logout in a web application (it's too easy to get stuck in a locked state).

Increasing the speed of an ASP.NET web application

In my project there is a page with about 10-15 controls and 3 tree views. It takes a lot of time to load, as a lot of logical checking is required. How can I increase its speed (any method, code, or database)? Please suggest some measures.
For example: every time the page loads, it checks whether the user is allowed to access that page and which controls he can access. The tree views are also bound based on the user's access.
On the loading of each control, it fetches data again and again. How can that be minimized?
Start with front-end optimization.
Disable ViewState of controls wherever it is not required.
Set Expire headers for your static content. (Browser caching)
Enable GZIP compression.
Install YSlow, PageSpeed, etc., for more recommendations.
YSLOW
PageSpeed
Then comes backend optimization:
Cache frequently accessed data
Do some code refactoring.
Try a profiler (the JetBrains one is good) on your application; it'll show you where your pinch points are. If you think it's database-related, run explain plan/showplan on your queries to show how long they take and where they spend their time. Really, your bottleneck could be anywhere, and it'll take some legwork to track down (it could even be network-related).
Once a user is authenticated, cache his logged-in status and use that in your code; see if this makes a difference. Also, cache the data per user. You could use Session state to do this.
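A minimal sketch of that idea (the class name, key scheme, and loader delegate are hypothetical stand-ins for your own permission query):

using System;
using System.Web.SessionState;

public static class AccessCache
{
    // Check page access once per session instead of on every page load.
    public static bool CanAccess(HttpSessionState session, string pageKey,
                                 Func<string, bool> loadFromDb)
    {
        string key = "perm:" + pageKey;
        if (session[key] == null)
            session[key] = loadFromDb(pageKey); // DB hit only on the first check
        return (bool)session[key];
    }
}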
Also, set the trace attribute to true in your page directive tag - just to be sure this is what is causing the performance hit.
OutputCache can speed up shared partial views/user controls.
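For the MVC case, a sketch of output-caching a shared partial (controller, action, and view names are hypothetical):

using System.Web.Mvc;

public class SharedWidgetController : Controller
{
    // Cache the rendered partial for 60 seconds regardless of parameters.
    [OutputCache(Duration = 60, VaryByParam = "none")]
    public ActionResult Sidebar()
    {
        return PartialView("_Sidebar");
    }
}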

Resources for learning how to handle a heavy-traffic ASP.NET MVC site?

How much traffic is heavy traffic? What are the best resources for learning about heavy-traffic web site development? What are the approaches?
There are a lot of principles that apply to any web site, irrespective of the underlying stack:
use HTTP caching facilities. For one, there is the user agent cache. Second, the entire web backbone is full of proxies that can cache your requests, so use this to full advantage. A request that doesn't even land on your server adds 0 to your load; you can't optimize better than that :)
corollary to the point above: use CDNs (Content Delivery Networks, like CloudFront) for your static content. CSS, JPG, JS, static HTML and many more pages can be served from a CDN, thus saving the web server from an HTTP request.
second corollary to the first point: add expiration caching hints to your dynamic content. Even a short cache lifetime like 10 seconds will save a lot of hits that will instead be served from the proxies sitting between the client and the server (see the sketch after this list).
minimize the number of HTTP requests. Seems basic, but it is probably the most overlooked optimization available. In fact, Yahoo's best practices put this as the topmost optimization; see Best Practices for Speeding Up Your Web Site. Here is their best practices list:
Minimize HTTP Requests
Use a Content Delivery Network
Add an Expires or a Cache-Control Header
Gzip Components
... (the list is quite long actually, just read the link above)
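As promised above, a small C# sketch of adding expiration hints from ASP.NET (the 10-second lifetime mirrors the example earlier; tune it to your content):

using System;
using System.Web;

public static class CacheHints
{
    // Let browsers and proxies serve this response without touching the server.
    public static void Apply(HttpResponse response)
    {
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.Cache.SetExpires(DateTime.UtcNow.AddSeconds(10));
        response.Cache.SetMaxAge(TimeSpan.FromSeconds(10));
    }
}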
Now, after you have eliminated as many superfluous hits as possible, you are still left with optimizing whatever requests actually hit your server. Once your ASP code starts to run, everything will pale in comparison with the database requests:
reduce the number of DB calls per page. The best optimization possible is, obviously, not to make the request to the DB at all. Some say 4 reads and 1 write per page are the most a high-load server should handle, others say one DB call per page, still others say 10 calls per page is OK. The point is that fewer is always better than more, and writes are significantly more costly than reads. Review your UI design; perhaps that hit count in the corner of the page that nobody sees doesn't need to be that accurate...
make sure every single DB request you send to the SQL Server is optimized. Look at each and every query plan, make sure you have proper covering indexes in place, make sure you don't do any table scans, review your clustered index design strategy, review all your IO load, storage design, etc. Really, there is no shortcut you can take here; you have to analyze and optimize the heck out of your database, as it will be your choking point.
eliminate contention. Don't have readers wait for writers. For your stack, SNAPSHOT ISOLATION is a must.
cache results. And usually this is where the cookie crumbles. Designing a good cache is actually quite hard to pull off. I would recommend you watch the Facebook SOCC keynote: Building Facebook: Performance at Massive Scale. Somewhere around slide 47 they show how a typical internal Facebook API looks:
cache_get(
    $ids,
    'cache_function',
    $cache_params,
    'db_function',
    $db_params);
Everything is requested from a cache and, if not found, requested from their MySQL back end. You probably won't start with 60,000 servers though :)
On the SQL Server stack the best caching strategy is one based on Query Notifications. You can almost mix it with LINQ...
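A sketch of that Query Notification approach (table, column, and cache key names are assumptions; SqlDependency.Start(connStr) must be called once at application startup, and Service Broker must be enabled on the database):

using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    // The cached entry is evicted automatically when SQL Server detects
    // that the underlying query result has changed.
    public static DataTable GetProducts(string connStr)
    {
        var table = HttpRuntime.Cache["products"] as DataTable;
        if (table != null)
            return table; // served from cache, no DB round trip

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT ProductId, Name FROM dbo.Products", conn)) // notification-safe query
        {
            var dependency = new SqlCacheDependency(cmd); // create before executing
            conn.Open();
            table = new DataTable();
            table.Load(cmd.ExecuteReader());
            HttpRuntime.Cache.Insert("products", table, dependency);
        }
        return table;
    }
}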
I would define heavy traffic as traffic which triggers resource-intensive work. Meaning, if one web request triggers multiple SQL calls, or they all calculate pi to a lot of decimal places, then it is heavy.
If you are returning static html, then your bandwidth is more of an issue than what a good server today can handle (more or less).
The principles are the same no matter if you use MVC or not when it comes to optimize for speed.
Having a decoupled architecture makes it easier to scale by adding more servers, etc.
Use a repository pattern for data retrieval (makes adding a cache easier; see the sketch after this list).
Cache data which is expensive to query.
Data to be written could be written through a cache, so that the client doesn't have to wait for the actual database commit.
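A sketch of the repository-plus-cache idea (all type and member names are hypothetical):

using System;
using System.Runtime.Caching;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IProductRepository
{
    Product GetById(int id);
}

// Decorates the real DB-backed repository; callers never know whether
// data came from the cache or the database.
public class CachedProductRepository : IProductRepository
{
    private readonly IProductRepository _inner;
    private readonly MemoryCache _cache = MemoryCache.Default;

    public CachedProductRepository(IProductRepository inner)
    {
        _inner = inner;
    }

    public Product GetById(int id)
    {
        string key = "product:" + id;
        var hit = _cache.Get(key) as Product;
        if (hit != null)
            return hit;

        var product = _inner.GetById(id);
        if (product != null) // MemoryCache cannot store null values
            _cache.Set(key, product, DateTimeOffset.Now.AddMinutes(5));
        return product;
    }
}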
There are probably more ground rules as well. Maybe you can say something about the architecture of your application, and how much load you need to plan for?
MSDN has some resources on this. This particular article is out of date, but is a start.
I would suggest also not limiting yourself to reading about the MVC stack: many principles are cross-platform.
