In my project, there is a page with about 10-15 controls and 3 tree views. It takes a long time to load because a lot of logical checking is required. How can I improve its speed (any method, code, or database change)? Please suggest some measures.
For example: every time the page loads, it checks whether the user is allowed to access the page and which controls he can access. The tree views are also bound based on the user's access.
As each control loads, it goes back to fetch data again and again. How can this be minimized?
Start with front-end optimization.
Disable ViewState on controls wherever it is not required.
Set Expires headers for your static content (browser caching).
Enable GZIP compression.
Install YSlow, PageSpeed, etc., for more recommendations.
Then comes backend optimization:
Cache frequently accessed data
Do some code refactoring.
Try a profiler (the JetBrains one is good) on your application; it will show you where your pinch points are. If you think it's database related, run explain plan/showplan on your queries to see how long they take and where they spend their time. Really, your bottleneck could be anywhere, and it will take some legwork to track it down (it could even be network related).
Once a user is authenticated, cache his logged-in status and use that in your code; see if this makes a difference. Also, cache the data per user. You could use Session state to do this.
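As a rough sketch of that idea in a page (or base page) class, where UserAccess and GetUserAccess are hypothetical stand-ins for your own permission logic:

    // UserAccess and GetUserAccess are hypothetical; substitute your own permission logic.
    private UserAccess GetUserAccessCached()
    {
        var access = Session["UserAccess"] as UserAccess;
        if (access == null)
        {
            access = GetUserAccess(User.Identity.Name);   // the expensive check runs once per session
            Session["UserAccess"] = access;
        }
        return access;
    }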
Also, set the trace attribute to true in your page directive, just to confirm where the performance hit is coming from.
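On Web Forms that looks like this (it can also be enabled site-wide with <trace enabled="true" /> under system.web in web.config):

    <%@ Page Language="C#" Trace="true" %>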
OutputCache can speed up shared partial views/user controls.
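For example, a user control can declare (the 60-second duration is just an illustration; Shared="true" lets all pages using the control share one cached copy):

    <%@ OutputCache Duration="60" VaryByParam="None" Shared="true" %>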
I have several user controls loading in a template on an Umbraco website, and a certain template loads very slowly initially (after that it loads fast). The initial page load can take up to 5 minutes, and the CPU on localhost goes very high during this. It happens only on this one specific template. I have tried a stack trace and still am not able to output anything useful. Is there a way to set breakpoints in the code itself to see where it is spending most of its time on the server before the page gets rendered to the client side?
I need to understand why, when the page is not cached, it takes up to 5 minutes to render. How can I find this out in code? Preferably with breakpoints or some ASP.NET plugin that will help me understand why this is happening.
I have determined that it is not related to IIS 7.5, and it is NOT a System Hang! It is something in the code that is causing this.
Glimpse should provide...a glimpse into what is causing the slow initial load without requiring you to write profiling code just to troubleshoot the issue. It is certainly worth checking out before doing the latter at least.
Also, StackExchange's MiniProfiler looks promising and may be worth checking out.
The fallback option of instrumenting your code to time its execution will be more work and require code changes of course; but if that proves to be your best option to understand the slow load times, so be it.
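If you do end up hand-instrumenting, a minimal sketch using System.Diagnostics.Stopwatch together with page tracing (the Bind... method names are placeholders for whatever your template actually does):

    var sw = System.Diagnostics.Stopwatch.StartNew();
    BindFirstMacro();              // placeholder for one of the slow user controls
    Trace.Write("Timing", "BindFirstMacro: " + sw.ElapsedMilliseconds + " ms");

    sw.Restart();
    BindSecondMacro();             // another placeholder
    Trace.Write("Timing", "BindSecondMacro: " + sw.ElapsedMilliseconds + " ms");

With Trace="true" on the page (or tracing enabled in web.config), the timings show up in the trace output.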
I have stored my dataset in ViewState (because I need to filter the data on different client clicks and show the results), but the page load is taking a lot of time; even a checkbox checked event (with AutoPostBack) that has no code to execute takes almost 2-3 seconds.
Is this just because of the ViewState data? If so, are there any alternatives with which I can achieve my task? I need the data to show quickly on client events, which is why I have been using ViewState. Any workaround would help.
As @Tushar mentioned above, ViewState is not the place you want to be storing large amounts of data. It's really only meant to preserve the state of controls between round trips, and it can really lead to poor app performance.
Instead you should look into the following server managed options:
Application State - Used for storing data that is shared between all users. Uses server memory.
Session State - Used for storing data specific to a user's session. Also uses server memory. Data can be persisted through app restarts, as well as across a web garden or server farm. More info from MSDN here: http://msdn.microsoft.com/en-us/library/z1hkazw7.aspx
The biggest con of these methods is memory management: both options consume server memory and keep data until there is either a restart of some sort or the session is dropped. As a result, these methods don't always scale well.
Also, here is an MSDN article discussing the various .NET methods of state management, with pros and cons for each method:
A third option is to implement a caching strategy, either using the .NET caching libraries, building your own, and/or using third-party caching servers/libraries. The benefit of using a cache is that the data automatically expires after a specified amount of time. However, complexities are introduced when working in a web-garden or server-farm environment.
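As a rough sketch using the built-in ASP.NET cache (the key name, the 10-minute expiry, and the LoadReportData helper are all just placeholders):

    DataSet data = HttpRuntime.Cache["ReportData"] as DataSet;
    if (data == null)
    {
        data = LoadReportData();                        // hypothetical expensive call
        HttpRuntime.Cache.Insert(
            "ReportData",
            data,
            null,                                       // no cache dependency
            DateTime.UtcNow.AddMinutes(10),             // absolute expiration
            System.Web.Caching.Cache.NoSlidingExpiration);
    }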
The biggest thing to remember is that any of the strategies mentioned above will require some planning and consideration with regard to managing/sharing the data.
If you're storing a large amount of data in ViewState, you'll notice performance issues. Although ViewState is really meant for a "this page only" and Session is meant for "this session", you'll reach a limit with ViewState size where the Session is ultimately much better for performance.
It's worth noting that you might be having some other type of issue, not just an issue with the ViewState (i.e. your database query may be taking a long time and could possibly be cached).
The ViewState makes the page slightly larger due to the extra data embedded in the page's HTML to hold the serialized ViewState. Whether that extra size will cause load problems depends on the connection speed, and on the size of the view state relative to the rest of the page.
The ViewState is sent back to the server with each HTTP request (so including your AutoPostback). Again, whether that causes a noticeable performance issue depends on the view state size and the connection speed.
On a broadband(ish) connection with the amount of ViewState data one would find in a typical page, you would not see 2-3 seconds additional processing time.
Diagnosing
Use the developer tools in your browser (in IE, press F12). You can monitor web requests including the exact header and body sent and received. You can also see the timing for each HTTP request. If the ViewState is not huge (not more than 1-2K perhaps) and your connection speed is not excessively slow, that is not your culprit.
Alternatives
You can hold state entirely server-side, or put any state items that are large entirely on the server. You can use Ajax requests to process page events that depend on that state.
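A minimal sketch of that server-side approach for the scenario above, assuming the usual System.Data/System.Web usings; the control names, filter expression, and LoadClientData helper are hypothetical:

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Keep the large DataSet on the server instead of serializing it into ViewState.
            Session["ClientData"] = LoadClientData();   // hypothetical DAL call
        }
    }

    protected void FilterCheckBox_CheckedChanged(object sender, EventArgs e)
    {
        var data = (DataSet)Session["ClientData"];
        // Filter in memory; only the filtered rows are rendered back to the client.
        GridView1.DataSource = new DataView(data.Tables[0], "Status = 'Active'", "",
                                            DataViewRowState.CurrentRows);
        GridView1.DataBind();
    }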
Instead of loading data from a data source multiple times, do it only once. The other answers talk about where to cache the data; I have run into instances where I was loading the data on every postback.
    public string MyString
    {
        get
        {
            // If the data is already in ViewState, do not load it again.
            if (this.ViewState["myData"] == null)
            {
                // Load the data one time and keep it for later postbacks.
                this.ViewState["myData"] = "Hello";
            }
            return this.ViewState["myData"] as string;
        }
    }
How much ViewState slows down your page depends on how much view state you have. I've inherited pages that generated over a megabyte of ViewState and seen the web server spend 10 seconds just processing it. If you don't want to rewrite your application and you really need that much view state, investigate alternative strategies for saving/restoring it: saving ViewState to a database or even a plain file is much faster, because you don't have to stream it to and from the client on each request.
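A related low-effort option is the built-in SessionPageStatePersister, which keeps the page state in Session (rather than a database or file) so it never travels to the browser; a minimal sketch in a page or base page class:

    // Keep ViewState in Session instead of sending it to the browser on every request.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }

The trade-off, as the other answers note, is that this consumes server memory per user.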
The best strategy is to avoid ViewState in the first place, though.
Just thought I should add: some controls are simply ViewState pigs; some grids are terrible for ViewState consumption.
You can view the source of your page, grab the ViewState value, and use the online ViewState decoder at the URL below to check how large the values stored in your page's ViewState field are:
http://ignatu.co.uk/ViewStateDecoder.aspx
If you find that your ViewState holds large values, you should find an alternative place to store your DataSet.
In any case, you should avoid putting the DataSet into ViewState.
I'm looking to track clicks on a site, mostly using jQuery to fire click events on anchors and buttons that add/update a cookie, which I'll log into internal databases when a user hits an error page.
This would mean adding some jQuery logic to the master page (which is on every page) to get and update the cookie, along with hooking up jQuery click events to most objects on the site.
The site sees roughly 100K unique visitors a day and is already heavy on database calls.
This is the first time I've incorporated anything like this into a site of this size. I'd like to know whether anyone has concerns about doing something like this, what kind of performance overhead I can expect this solution to cause, any implementation ideas that would make it as unnoticeable as possible to users and server load, etc.
I appreciate any guidance anyone can give.
Logging in cookies is not a great idea, as cookies have a length limit (4 KB, if I remember correctly). If you want to log user activity, try using Ajax to send a request to the server and implement a high-performance logging system on the server side.
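A minimal sketch of the server side of that idea, as a generic handler the client script could POST to; ClickLogger and the form field names are placeholders for your own logging system:

    // Minimal generic handler (.ashx) the jQuery click handlers could POST to.
    public class ClickLogHandler : System.Web.IHttpHandler
    {
        public void ProcessRequest(System.Web.HttpContext context)
        {
            string page = context.Request.Form["page"];
            string target = context.Request.Form["target"];

            // ClickLogger is a placeholder for whatever fast logging you put behind this.
            ClickLogger.Log(context.Request.UserHostAddress, page, target);

            context.Response.StatusCode = 204;          // "no content" keeps the response tiny
        }

        public bool IsReusable { get { return true; } }
    }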
The page load performance hit of adding a cookie to the request is going to depend on the current request size and the number of requests made to the domain with the new cookie for each page load. The cookie will get added to each request (including images, css, js, etc.) so depending on your scenario, it could be a large impact or a small one.
One way to combat the page load size issue is to have the static resources on a different cookieless domain than the domain that serves the pages. Requests to the cookieless domain will not include the new cookie and hence will not be affected by it. (Stack Overflow does exactly this.)
As others have mentioned, however, using a cookie may not be the best way to track this, since it does have such a large impact on load time. Instead, I would suggest that you track this server-side by including the relevant information into the user's session. In addition to not increasing the client load time in any large way, you also have the advantage of being able to deal with a branching traffic pattern which is often seen when users open up multiple tabs off of a single page.
This does have a slight disadvantage in that you are taking on a slightly higher server-side load, but I would be very careful when it comes to increasing the client-side load time, since that is a critical metric for user happiness and engagement. A server-side load increase can be engineered around, especially for a 100k user site. A less-happy user due to a slower page load is much harder to fix.
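A rough sketch of that session-based tracking, hooked up in Global.asax (the event, key name, and unbounded list are just one option; you would probably cap or batch this in practice):

    // In Global.asax.cs; needs System.Collections.Generic and System.Web.
    protected void Application_PostAcquireRequestState(object sender, EventArgs e)
    {
        var context = HttpContext.Current;
        if (context == null || context.Session == null)
            return;                                   // static resources etc. have no session

        var trail = context.Session["PageTrail"] as List<string>;
        if (trail == null)
        {
            trail = new List<string>();
            context.Session["PageTrail"] = trail;
        }
        trail.Add(context.Request.RawUrl);            // written to the database from the error page
    }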
You could do some load testing; Visual Studio has a nice implementation.
No one else can tell you how well your hardware is going to handle this sort of thing, so load testing in a staging environment is generally your best option. If your staging environment is similar enough, you should be able to load test it with a significantly smaller number of virtual users and see what sort of effects it will have on your servers.
I was wondering if I should be caching the objects returned from my DAL in some way. I may have multiple UI controls calling for the same data in a single page load.
What would you guys recommend? Am I being a little too cautious? It's not a terrible amount of data, but if I should be caching in some way, what would be the recommended approach?
You could cache, and if you really have multiple controls on the same page using the same data, you can fetch the data once in the parent page and pass a reference to it to each control via a setter (rather than have each control pull the same data from the DAL itself), e.g.:
myControl.AllUsers = _allUsers;
....
myOtherControl.AllUsers = _allUsers;
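Fleshing that out slightly, with hypothetical control, repository, and entity names:

    // In the parent page: load once, hand the same reference to each control.
    protected void Page_Load(object sender, EventArgs e)
    {
        List<User> allUsers = _userRepository.GetAllUsers();   // hypothetical DAL call, executed once
        usersListControl.AllUsers = allUsers;
        usersStatsControl.AllUsers = allUsers;
    }

    // In each user control: expose a setter and bind from it instead of hitting the DAL.
    public List<User> AllUsers { get; set; }

    protected void Page_PreRender(object sender, EventArgs e)
    {
        usersGrid.DataSource = AllUsers;
        usersGrid.DataBind();
    }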
I also agree with #DanielHilgarth. Caching adds complexity (when to refresh the cache, for example). If the page loads quickly anyway, I wouldn't bother.
If the page is slow, database calls in loops are often the culprit in my experience.
It depends whether it's safe and necessary to do so. If the data you are working with does not require real-time freshness (e.g. a blog), then by all means cache if you feel it's necessary (meaning your site is running slow).
A problem with caching that people often forget to account for is being able to clear the cache on demand when something requires an immediate response (for example, you ban a user or update payment gateway information).
There are two main types of caching: sliding cache and fixed-time cache.
A sliding cache (one whose expiry is extended each time the item is successfully read) is great for resources whose values are relatively easy to compute but may suffer from database/network overhead. Cache for 1 hour (or whatever) on a sliding basis, and manually invalidate (remove) the cache entry whenever an INSERT / UPDATE / DELETE occurs for the DAO. This way the user sees real-time results, but the data is in fact served from cache whenever possible.
A fixed-time cache is great for resources that are expensive to produce (e.g. a very complex stored procedure) and do not require real-time accuracy. Cache for 1 hour (or whatever) the first time the resource is requested, and do not clear the cache until that hour is up. INSERT / UPDATE / DELETE is ignored by your cache mechanism (unless it's absolutely necessary).
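A minimal sketch of both flavours using System.Runtime.Caching (available from .NET 4); LoadUsers and RunComplexReport are placeholders for your own calls:

    var cache = MemoryCache.Default;

    // Sliding cache: expires one hour after the last read; remove it manually on INSERT/UPDATE/DELETE.
    cache.Set("users", LoadUsers(), new CacheItemPolicy
    {
        SlidingExpiration = TimeSpan.FromHours(1)
    });

    // Fixed-time cache: expires one hour after it was added, no matter how often it is read.
    cache.Set("report", RunComplexReport(), new CacheItemPolicy
    {
        AbsoluteExpiration = DateTimeOffset.UtcNow.AddHours(1)
    });

    // Manual invalidation for the sliding case, e.g. after an UPDATE in the DAO:
    cache.Remove("users");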
For caching your objects you can also look at this library:
http://www.reactiveui.net/
It provides a neat and clean way to cache your objects.
I just posted the question how-to-determine-why-the-browser-keeps-trying-to-load-a-page and discovered that my problem is with Gravatar.
I also noticed that StackOverflow is suffering from the same outage.
Does anyone know of a graceful way to determine if Gravatar, or any third party website for that matter, is up or not, before trying to retrieve avatar icons from them?
This would eliminate the long page load and the never-ending busy cursor... I shouldn't say never-ending; it just takes a long time to go away, and it is very confusing to users as they sit there and wait... for nothing.
You can have a separate process that periodically checks the status of the site. Set a rule for what counts as down for you; for instance: "ping time > 1500 ms = down". Have this process leave a note in a database table or config file. Then you check this value on each page render at almost no cost.
Depending on how critical this external site is, you can run the check more or less often.
This process could be a program outside the web stack, or a page only accessible through localhost that gets executed via Scheduled Tasks or an ASP.NET facility like the ones mentioned in the comments.
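A rough sketch of such a check in C#, using the 1500 ms rule from above; how you persist the result (database table, config file, cache) is up to you:

    // Returns false if Gravatar does not answer within 1.5 seconds.
    private static bool IsGravatarUp()
    {
        var sw = System.Diagnostics.Stopwatch.StartNew();
        try
        {
            var request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(
                "https://www.gravatar.com/avatar/");
            request.Method = "HEAD";
            request.Timeout = 1500;                     // milliseconds
            using (request.GetResponse()) { }
            return sw.ElapsedMilliseconds <= 1500;
        }
        catch (System.Net.WebException)
        {
            return false;                               // timeout or error counts as "down"
        }
    }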
For Gravatar, you can cache all these images instead of fetching them from their server every time. Of course, if a user changes their icon, it might not refresh as fast as it would with direct access to the main server, but at least you do not have to request the Gravatar server every time.