I am developing a (relatively small) website in ASP.NET 2.0. I am also using NAnt to perform some easy tweaking on my project before delivering executables. In its current state, the website is "precompiled" using
aspnet_compiler.exe -nologo -v ${Appname} -u ${target}
I have noticed that after the IIS pool is restarted (after an idle shutdown or a recycle), the application takes up to 20 seconds before it is back online (and Application_Start is reached).
I don't have the same issue when I am debugging directly within Visual Studio (it takes 2 seconds to start), so I am wondering whether aspnet_compiler is really such a good idea.
I couldn't find much on MSDN. How do you compile your websites for production?
Make sure that:
You are using a Web Application project rather than a Web Site project; this will result in a precompiled binary for your code-behind
You have turned off debug code generation in the web.config file - I suspect that if this setting differs from what it was when you ran aspnet_compiler, the code may be recompiled
If you've tried those, you could try running ngen over your assembly, thus saving the JIT time (a rough sketch follows below).
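For illustration, the ngen step might look roughly like this - the framework version, path, and assembly name are placeholders, not taken from your project:

C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "C:\inetpub\wwwroot\MyApp\bin\MyApp.dll"

ngen install generates native images for the assembly (and its dependencies) ahead of time, so the JIT doesn't have to do that work when the application pool spins up.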
For ultimate responsiveness, don't allow your app to be shut down.
The first method is to make sure that it's incredibly popular so that there's always someone using it.
Alternatively, fetching a tiny keep-alive page from somewhere else as a scheduled activity can be used to keep your site 'hot'.
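As a rough illustration (the URL and page name below are placeholders), a tiny .NET 2.0 console program like this could be run from Windows Task Scheduler every few minutes:

// Minimal keep-alive sketch: fetch one cheap page so the worker process stays warm.
using System;
using System.Net;

class KeepAlive
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Any lightweight page on the site will do; this URL is just an example.
            string body = client.DownloadString("http://www.example.com/keepalive.aspx");
            Console.WriteLine("Fetched {0} characters at {1}", body.Length, DateTime.Now);
        }
    }
}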
If your website is compiled as updatable, you'll see a bunch of .aspx files in your virtual directory. These must be compiled on startup; that's so you can come in and alter the web UI itself. This is the default for both web sites and web applications.
Make sure this is set in web.config: <compilation debug="false" />. In my case, I also have a batch file which issues GET requests for all the main pages before handing the site over to users (page-load simulation).
The key is to make sure the IIS Application Pool never shuts down. This is where the code is actually hosted. Set the "Idle Timeout" (Under Advanced Settings) to something really high, like 1440 minutes (24 hours) to make sure it's not shut down as long as somebody hits your site once a day.
You are still going to pay the JIT time whenever you deploy new code, or if this idle timeout period is exceeded without any traffic.
Configuring IIS 7.x Idle Timeout
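If you prefer to script that setting rather than use the IIS Manager UI, something along these lines should work on IIS 7.x - the pool name "MyAppPool" is a placeholder, and you should double-check the exact syntax against your IIS version:

rem Set the application pool's idle timeout to 24 hours (timespan format is d.hh:mm:ss).
%windir%\system32\inetsrv\appcmd.exe set config -section:system.applicationHost/applicationPools /[name='MyAppPool'].processModel.idleTimeout:"1.00:00:00" /commit:apphost

A value of "00:00:00" disables the idle timeout entirely.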
#Simon:
The project is a Web Application. Are Web Sites then slower to start up? (I had no idea it made a difference, besides the different code organization.)
I checked, and while I do edit the web.config after aspnet_compiler is called, I don't touch the debug value. (I will, however, check whether the website starts faster if I don't touch the web.config at all, just to make sure.)
(And I will definitely have a look at ngen, I was not aware of that tool.)
Related
I'm working on a legacy application built on ASP.NET Webforms and .NET Framework 4.6.2.
The application's initial page is very slow to start on my local machine and I'm trying to troubleshoot it. I put a breakpoint on the Global.asax Application_Start method, and it takes quite some time even before the breakpoint is hit. I'd like to know how I can track exactly what happens before the Application_Start event (what calls are made, which external libraries are used, etc.), from the moment I make the first request to the server, so that I have a starting point for improving the performance. Thanks in advance.
Well, while in VS, create (or open) an .aspx page with JUST "hello world" on it (a minimal example is sketched below). Does that page take a long time to load?
And to be fair, create a brand new blank asp.net webforms project, and find out if that project also is slow to load.
In other words, this just might mean your computer is slow.
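Something as bare as this will do for the test - the file name and markup are just an example:

<%@ Page Language="C#" %>
<html>
<body>
    <!-- Nothing but static text, so any delay you see is startup cost, not page logic. -->
    Hello world
</body>
</html>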
When you hit f5, then:
The project code is compiled. Then a bunch of information is copied, set up, and configured for IIS to launch. Then a WHOLE web server is launched (IIS Express), then the web page is loaded, IIS processes that page, and then it is rendered.
In other words, a BOATLOAD of startup stuff occurs. Once that WHOLE big web server is spooled up, then and only then does your application start.
If you can, navigate to another page (say from the menu bar, assuming your application has one), and then BACK to the page that loads slowly. Does that page load slowly the second time you hit it?
If that is the case, then your computer might be slow, starved for memory, or even starved for the processing power needed to start up the WHOLE system of stuff required to run that web server.
If, once the page is displayed, everything runs normally, then this is not a slow-loading web page, but in fact a slow-loading WHOLE web site - and in actual production use of the system, it's not really slow anymore, is it?
So, say your computer only has 8 GB of RAM and a slow laptop processor; then startup of all the "enormous" moving parts required to run a whole web server system might be the issue, and NOT that the page load itself is slow.
And since the production system will not be re-loading the whole web server system from scratch each time, you may well not have a slow system, but only a slow "first time" startup.
So, do other web site projects based on .net framework, and asp.net pages start out slow?
And is this an ASP.NET web site, or an ASP.NET web application project?
So, quite a few "handstands" will occur on the first run and load of the web site. This could be your computer, and not really that the web site is slow.
Having stated the above?
I have found some web projects RATHER slow to load, and this is especially the case with an ASP.NET web application (as opposed to web sites, which tend to load a WHOLE lot faster - about 1 second, while the application was taking about 18-23 seconds on my old laptop, which was starved for processing power).
On an older laptop, I was seeing about an 18-23 second delay after hitting F5. On my newer laptop, I have about a 2, sometimes 3 second delay from the time I hit F5 in code or markup to seeing the results and the web site running.
So, often IIS has to be loaded and started, maybe SQL Server started, and even more bits and parts. All of these things can take a LONG time on your dev box, but on a production server, all of those things are running all the time.
So, it's not clear if you have a very slow web site, a slow-loading "specific" web page, or whether you're just talking about the overall slow loading time for "any" first page to display, after which everything is fine. In other words, is hitting F5 to start things going the slow point in time?
And you didn't even note how long the delay is here.
Are you talking about 5 seconds, or 50 seconds?
Right click on your task bar, task manager.
Here is me hitting F5 on my laptop. Note that the 6 cores go to 100% max speed and power during that startup.
I do know that on my laptop, when I hit F5, all 6 cores (12 threads) jump from about 2 GHz to 4 GHz, and all 6 cores are pinned for those 2-3 seconds while that whole web server is created, configured, and THEN started, and then the page in question has to load.
So, here you can see that after hitting F5, the web page "starts" almost instantly, and then takes about 4 more seconds.
(The processor usage is somewhat high - part of that is due to running the screen capture for this screenshot.)
So, as "soon" as I hit F5, you can see the processor jump above 20%. I count about < 1 second for the browser to appear, then about 4 more seconds. And the above is a slower-to-load page. As soon as the page displays, you see the CPU drop back down.
As noted, try creating a new ASP.NET web site "application". Web sites tend to load in < 1 second. In fact, the above is "by far" the slowest startup I have. Most take about 1-2 seconds from hitting F5.
An ASP.NET web application? They take longer - the whole site has to be compiled each time, and the .dlls have to be created.
Sometimes it's simple things, like some virtual folder mapped to a location that does not exist (on startup, a non-existent folder or path name can take some time to "time out").
So, how fast does a new test project start? (Create a new ASP.NET web application - an application, not a web site.) I find that web sites load fast - even on computers with limited hardware.
However, ASP.NET web "applications" tend to load a WHOLE lot slower (and thus more RAM, more CPU, and more CPU cores make a HUGE difference in startup times).
Performance is a real art form in our industry. But we need a few more details, such as: do all "application" sites start slowly on your dev box, or JUST this one, and by how much compared to other sites?
Often, a delay can be the network connection to a database, and have little or nothing to do with startup - some network delay in, say, connecting to SQL Server over whatever kind of network connection you have.
So, all first pages slow, or just one page?
Does a new ASP.NET web application (application is important here - not just web site) also start up slowly? (If yes, then this is not specific to your site, is it?)
And using Task Manager - do you see high CPU usage during that startup (F5), or is it low? (Low CPU suggests network and connection delays - not CPU, memory, or performance issues.)
So, a lot more troubleshooting is required here, but then again, we have little idea what testing you have done so far, such as testing another web application on your computer - and whether it has things like authentication and SQL Server logons involved.
I mean, we can have some issues with a car that will not start. We don't just swap out the engine and re-do the fuel injection system, ONLY to find out we forgot to put in a new battery, and that was the issue all along.
So, do a few tests - get some baselines with, say, a test site - use one of the templates that has the "default" Bootstrap menu, master pages, etc. How fast does that start? If that site ALSO starts slowly, then attempts at troubleshooting the existing site will be in vain, since all sites load slowly - not just the "one" site you're working on.
I have an ASP.NET app running in IIS. The first time a call to the application is made, it can sometimes take extremely long (e.g. 80 seconds), whereas the second time it's very quick.
I know this has to do with the app first starting and possibly needing to gather resources etc. However, the problem is that I can run the same identical app on another machine and the load time for the first call is significantly less.
So I'm wondering what factors on the machine would affect this load time?
Thanks for any assistance
I agree with Steve's comment. FYI, this slow response on the initial request will also happen every time the app pool has been idle for a while. You can combat this by stopping the app pool from shutting down when idle. I think the default is 20 minutes; this is a setting in IIS.
Then you will only suffer the problem when the app pool recycles. You can stop that from happening too, but I don't advise it. There's an interesting article on this here: http://weblogs.asp.net/owscott/archive/2013/04/06/why-is-the-iis-default-app-pool-recycle-set-to-1740-minutes.aspx Recycling the app pool every now and again protects you from memory leaks. However, you can proactively spin up the app pool by setting up a scheduled task that runs a batch file to make a request to the website when an app pool recycle is detected.
This ensures that your site is always spinning and good to go for every request.
ASP.NET only compiles when the page is requested for the first time. This means that on first load the page is being compiled and then displayed. This can be solved by following the precompile instructions from Microsoft.
http://msdn.microsoft.com/en-us/library/ms227972%28v=vs.90%29.aspx
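For reference, a fully precompiled (non-updatable) build looks roughly like this - the virtual path and directories below are placeholders:

aspnet_compiler -v /MyApp -p C:\source\MyApp C:\build\MyApp_Precompiled

Omitting the -u switch produces non-updatable output, so nothing is left to compile on the first request; with -u the .aspx markup is deployed as-is and still has to be compiled when pages are first requested.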
EDIT: I realized that I didn't answer the question that you were really asking.
There are a few things that could affect the first load:
1) The browser you are using may not be as efficient at displaying the type of content on the page (assuming different browsers).
2) If the machines aren't running on the same internet connection (and even if they are, speeds vary between Wi-Fi and Ethernet), this could be affecting the overall speed.
3) The specs of the machines themselves can make a difference; browsers take resources to run, so a faster computer will render a little quicker (although it won't make a giant difference).
4) You said the app was running on IIS, but you didn't specify whether it was a local (test) server or a deployed server. If it's local, the specs of the machine again come into play, and in a big way. Booting an IIS server, deploying the app, and then displaying pages (what happens when you click Run in VS or similar) can take very different amounts of time depending on the machine.
I am having a rather time-consuming issue with the Sitecore CMS.
Is it possible to shorten the time it takes for the application pool to recycle every time a change is made in the bin folder or web.config?
It takes the server 2 to 5 minutes to respond right after a change.
Any ideas?
There is an article from Alex Shyba of Sitecore here on reducing Sitecore startup time.
A summary of the article:
In machine.config, disable the check for publisher evidence on signed assemblies:
<runtime>
<generatePublisherEvidence enabled="false"/>
</runtime>
Disable performance counters in the Sitecore web.config:
<setting name="Counters.Enabled" value="false" />
There are plenty of articles on improving Sitecore performance; I have listed a few links below:
Sitecore SDN - Optimizing Performance in Sitecore
Sitecore Slides on performance improvements - download pdf
Analyzing and identifying the slow causing culprit - Sitecore Startup basics
He needs to restart the application pool in order to load the new DLLs.
You could try to minimize the things you do at startup (but I'm guessing it's mostly Sitecore stuff which you can't influence), so the best suggestion I can give is to have two web servers behind a content switch.
You would run your application on both servers and the content switch decides which server handles which request (be careful with session state and statics, because each server will know nothing of the other).
If at some point you need to release a new version, you just instruct your content switch to direct all traffic to web server A. You then deploy to web server B, open the website via a direct URL that doesn't pass through the content switch, and make sure it's working properly and warmed up.
Then you tell the content switch to point all traffic to B, and you have all the time in the world to update web server A before switching the content switch back to normal mode.
I had this issue as well.
For whatever reason, my local copy would take around 5 minutes to fully spin up after any change or after starting the website fresh. It was absolutely dreadful for testing anything. It was only an issue locally; my development server environment and production didn't seem to be affected.
I found a profiling tool that allowed me to find an outlier control that was taking an obscene amount of time to render. Try this...
Fire up your website after it compiles.
Log into this admin page: http://local.example.com/sitecore/admin/stats.aspx
You'll see a statistics screen. Look for anything with an obscenely high Avg or Max time; your problem will likely be in the logic there.
As a bonus, if you can keep hitting pages after your long recompile without much trouble, you should notice the Avg time roughly halving itself every page hit for your problem control if your problem is anything like mine was.
In my case, I found the largest issue came from one control that was doing an expensive search for one item.
It looked like this:
Sitecore.Context.Database.SelectItems("" + Sitecore.Context.Item.Paths.Path + "/ancestor-or-self::*[@@templateid='" + templateId + "']");
This was mostly a local issue because the SQL server is remote to my machine, while the other servers were in the same building. Hence dev and production were relatively unaffected.
Good luck!
You can follow Alex Shyba's post here
But 2-5 minutes sounds extreme. This isn't normal. Is your hardware up to date?
Also, I would look into your prefetch cache. In your development environment you probably don't want it to fetch too much on startup, as this isn't needed in development.
I would also look into the initialize pipeline and global.asax to see if you are doing any custom start up jobs.
You can reduce the prefetch cache size on the core, master, and web databases, but this is not recommended for production; do this only on development machines, never on production servers.
I want to add automated encryption of some sections of web.config with following requirements:
It should be done automatically every time the application is started (I don't want to do it manually using aspnet_regiis after each config edit)
It should happen automatically only in the production environment (I need to see my values on my machine)
It's not a problem to find the code for it; my question is where to put it and how to make sure it's not executed on my localhost.
I tried to put it in Global.asax, but I get a "Request is not available in this context" exception there.
Thanks.
I would recommend setting this up as part of your deployment process.
Calling the encryption every time the application starts isn't a good idea, because the ASP.NET app pool will recycle itself from time to time, and that could cause problems.
Depending on your deployment process, you can do it through a batch file or PowerShell script, but it will need to be run on the production machine in question.
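As a rough sketch of what that deployment step could look like (the section name and virtual path below are placeholders for your own values, and you would run the tool from the Framework directory that matches your application):

rem Encrypt the connectionStrings section of the app deployed at virtual path /MyApp.
aspnet_regiis -pe "connectionStrings" -app "/MyApp"

rem To decrypt and inspect the values again where you have access:
aspnet_regiis -pd "connectionStrings" -app "/MyApp"

Because this only ever runs as part of the production deployment, nothing gets encrypted on your localhost and you keep readable values there.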
Over the last few weeks I've been subject to a sudden and significant performance deterioration when browsing locally hosted ASP.NET 3.5 MVC web applications (C#). Load times for a given page are on average 20 seconds (regardless of content); start up is usually over a minute. These applications run fast on production and even test systems (Test system is comparable to my development environment).
I am running IIS 6.0, VS2008, Vista Ultimate, SQL2005, .NET 3.5, MVC 1.0, and we use VisualSVN 1.7.
My SQL DB is local and IPv6 does not seem to be the cause. I browse in Firefox and IE8 outside of Debug mode using loopback, machine name, and 'localhost' and get the exact same results every time (hence DNS doesn't seem to be the issue either).
Below are screen shots of my dotTrace output.
http://www.glowfoto.com/static_image/28-100108L/3123/jpg/06/2010/img4/glowfoto
This issue has made it near impossible to debug/test any web app. Any suggestions very much appreciated!
SOLUTION: Complete re-installation of Windows, IIS, Visual Studio, etc. It wasn't the preferred solution, but it worked.
Surely the big red flag on that profiler output is the fact that AddDirectory is called 408 times and AddExistingFile is called 66,914 times?
Can you just confirm that there's not just a shed load of directories and files underneath your MVC app's root folder? Because it looks like the framework is busying itself trying to work out what files it needs to build (or add watches to) on startup.
[I am not au fait with MVC and so maybe this is not what is happening but 67k calls to a function with a name like "AddExistingFile" does smell wrong].
I've learnt that it's usually a "smell" when things fail near a power of two ...
Given
Over the last few weeks I've been subject to a sudden and significant performance deterioration
and
AddExistingFile is called 66,914 times
I'm wondering if the poor performance hit at about the time the number of files exceeded 65,535 ...
Other possibilities to consider ...
Are all 66,914 files in the same directory? If so, that's a lot of directory blocks to access ... try a hard drive defrag. In fact, it's even more directory blocks if they're distributed across a bunch of directories.
Are you storing all the files in the same list? Are you presetting the capacity of that list, or allowing it to "grow" naturally and slowly?
Are you scanning for files depth first or breadth first? Caching by the OS will favor the performance of depth first.
Update 14/7
Clarification of "Are you storing all the files in the same list?"
Naive code like this first example doesn't perform well, because it needs to reallocate storage space as the list grows.
var myList = new List<int>();
for (int i=0; i<10000; i++)
{
myList.Add(i);
}
It's more efficient, if you know the size in advance, to initialize the list with a specific capacity and avoid the reallocation overhead:
var myList = new List<int>(10000); // Capacity is 10000
for (int i=0; i<10000; i++)
{
myList.Add(i);
}
Update 15/7
Comment by OP:
These web apps are not programmatically probing files on my hard disk, at least not by my hand. If there is any recursive file scanning, it's by VS 2008.
It's not Visual Studio that's doing the file scanning - it is your web application. This can clearly be seen in the first profiler trace you posted - the call to System.Web.Hosting.HostingEnvironment.Initialize() is taking 49 seconds, largely because of 66,914 calls to AddExistingFile(). In particular, the read of the property CreationTimeUTC is taking almost all the time.
This scanning won't be random - it's either the result of your configuration of the application, or the files are in your web applications file tree. Find those files and you'll know the reason for your performance problems.
Try creating a new, default MVC2 application in a new web folder. Build and browse it. If your load times are okay with the new app, then there's something up with your application. If not, it's outside of the context of the app and you should start looking at IIS config, extensions, hardware, network, etc.
In your app, back up your web.config and start with a new, default web.config. That should disable any extensions or handlers you've installed. If that fixes your load times, start adding stuff from the old web.config into the new one in small blocks until the problem reappears, and in that way isolate the offending item.
I call this "binary search" debugging. It's tedious, but actually works pretty quickly and will most likely identify the problem when we get stuck in one of those "BUT IT SHOULD WORK!!!" modes.
Update Just a thought: to rule out IIS config, try running the site under Cassini/built-in dev server.
The solution was to format and do a clean install of Vista, SQL Server 2005, Visual Studio 2008, IIS6 and the whole lot. I am now able to debug, without consequence, the very same webapp(s) I was experiencing the problems with initially. This leads me to believe the problem lay within one of the installations above and must have been aggravated by a software update or by the addition of software.
You could download Fiddler to measure how long each call takes and get some measurements.
Link
This video might help...