How to disable CAS cache with app.config? - c#

I ran into an issue where my .NET application takes too long to "cold run" (that is, when it is started for the first time after a PC boot).
With Procmon I saw a lot of reading and writing of files with the .cch extension, which I understand to be the CAS cache.
I suspect this is causing the performance problem, and I want to disable the use of the CAS cache, letting the .NET Framework process the security checks over and over on every startup of my application without reading and writing the CAS cache.
How can I do that, preferably with app.config?
Thanks.

Related

What strategy for encrypting web.config sections should I use?

I want to add automated encryption of some sections of web.config, with the following requirements:
It should be done automatically every time the application is started (I don't want to do it manually using aspnet_regiis after each config edit)
It should automatically happen only on the production environment (I need to see my values on my machine)
It's not a problem to find the code for it; my question is where to put it and how to make sure it's not executed on my localhost.
I tried to put it in Global.asax, but I get a "Request is not available in this context" exception there.
Thanks.
I would recommend setting this up as part of your deployment process.
Calling the encryption every time the application starts isn't a good idea because the ASP.NET app pool will recycle itself from time to time, and that could cause problems.
Depending on your deployment process, you can do it through a batch file or PowerShell script, but it will need to be run on the production machine in question.
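For example, the deployment step can be as small as a one-line batch file run on the production box (the section name and site path below are placeholders for illustration):

aspnet_regiis.exe -pef "connectionStrings" "D:\Sites\MyApp"

The -pef switch encrypts the named section of the web.config found in the given physical directory; -pdf decrypts it again when you need to edit the values.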

What are the most time consuming checks performed by .NET when executing a managed application?

I've developed a .NET-based Windows service that uses both managed (C#) and unmanaged (C/C++) code.
In some domain environments (e.g. a Win 2k3 32-bit server inside the domain abc.com) the service sometimes takes more than 30 seconds to start (especially on OS restart), and therefore fails to start. I suspect that it has something to do with enterprise-level security, but I do not know for sure.
http://msdn.microsoft.com/en-us/library/aa720255%28VS.71%29.aspx
I've tried the following without success:
- delay loading references by moving the using directives as far as possible from the ServiceBase implementation (especially the XML namespaces, known to cause delays in loading)
- delay loading and configuring log4net
- precompiling the code by using ngen
- delaying the start of the worker thread
- add/remove the manifest (with the dependencies set inside it)
- sign/unsign the binaries
- use the configuration settings (there are a lot of settings, and the scope level for all of them is set to application) as late as possible
- add all dependencies to GAC
I haven't yet tried adding security demands for the class that implements the Main method.
I didn't try to implement my own configuration loader because, after inspecting the autogenerated code, I noticed that the settings class is a singleton and gets its instance on call.
By completely removing the log4net dependency it worked, but this is not an option.
When the network card is disabled the service starts immediately.
You'd normally use SysInternals' Process Monitor to diagnose this problem. The fact that this is a service complicates matters. Check this blog post for a similar troubleshooting session.
It quacks like a CRL (Certificate Revocation List) problem, by the way. To disable it: Control Panel, Internet Options, Advanced tab, Security, untick "Check for publisher's certificate revocation".
We discovered that using a log4net UDP appender with a name resolution (even to 127.0.0.1) was causing a massive slowdown in startup.
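For reference, a minimal sketch of what such an appender configuration looks like with a literal IP instead of a host name (the address, port and layout are placeholder values, not the ones from our setup):

<appender name="UdpAppender" type="log4net.Appender.UdpAppender">
  <remoteAddress value="127.0.0.1" />
  <remotePort value="8080" />
  <layout type="log4net.Layout.PatternLayout" />
</appender>

Checking whether remoteAddress is configured as a host name that has to be resolved at startup is a cheap first thing to rule out.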

Doing an inplace update on software

I would like to be able to do an "in-place" update of my program. Basically, I want to be able to log in remotely where the software is deployed, install it while other users are still using it (in a thin-client way), and have it update their program.
Is this possible without too much of a hassle? I've looked into ClickOnce technology, but I don't think that's really what I'm looking for.
What about the way Firefox does its updates? It just waits for you to restart the program, and notifies you when it's been updated.
UPDATE: I'm not remoting into the users' PCs. This program is run on a server; I remote in and update it, and the users run it directly off the server through remote access.
ClickOnce won't work because it requires a webserver.
I had some example code that I can't find right now but you can do something similar to Firefox with the System.Deployment.Application namespace.
If you use the ApplicationDeployment class, you should be able to do what you want.
From MSDN, this class...
Supports updates of the current deployment programmatically, and handles on-demand downloading of files.
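A rough sketch of what that can look like, assuming a ClickOnce-deployed WinForms app (the message text and class name are just illustrative):

using System.Deployment.Application; // requires a reference to System.Deployment
using System.Windows.Forms;

public static class UpdateChecker
{
    public static void CheckAndPrompt()
    {
        // Only meaningful when the app was actually launched from a ClickOnce deployment.
        if (!ApplicationDeployment.IsNetworkDeployed)
            return;

        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())
        {
            deployment.Update(); // downloads and installs the new version
            if (MessageBox.Show("An update has been installed. Restart now?",
                                "Update", MessageBoxButtons.YesNo) == DialogResult.Yes)
            {
                Application.Restart();
            }
        }
    }
}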
Consider the MS APIs with BITS, just using bitsadmin.exe in a script or the Windows Update Services.
Some questions:
1. Are the users running the software locally, but the files are located on a networked share on your server?
2. Are they remoting into the same server you want to remote into, and executing it there?
3. If 2, are they executing the files where they are placed on the server, or are they copying them down to a "private folder"?
If you cannot change the location of the files, and everyone is remoting in, and everyone is executing the files in-place, then you have a problem. As long as even 1 user is running the program, the files will be locked. You can only update the files once everyone is out.
If, on the other hand, the users are able to run their own private copy of the files, then I would set up a system where you have a central folder with the latest version of the files, and when a user starts his program, it checks if the central folder has newer versions than the user is about to execute. If it does, copy the new version down first.
Or, if that will take too long, and the user will get impatient (what, huh, users getting impatient?), then having the program check the versions after startup, and remind the user to exit would work instead. In this case, the program would set a flag that upon next startup would do the copying, only now the user is aware of it happening.
The copying part would easily be handled by either having a separate executable that does the actual copying, and executing that instead, or the program could copy itself temporarily to another location and run that copy with parameters that say "update the original files". A sketch of the first variant follows.
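A minimal sketch of that launcher idea, assuming a hypothetical central share, local cache folder, and exe name (all three are made up for illustration):

using System.Diagnostics;
using System.IO;

static class Launcher
{
    // Hypothetical locations - replace with the real share and a per-user folder.
    const string CentralFolder = @"\\server\AppShare\latest";
    const string LocalFolder = @"C:\AppCache";
    const string ExeName = "MyApp.exe";

    static void Main()
    {
        Directory.CreateDirectory(LocalFolder);

        // Copy every file that is newer in the central folder than in the local copy.
        foreach (string source in Directory.GetFiles(CentralFolder))
        {
            string target = Path.Combine(LocalFolder, Path.GetFileName(source));
            if (!File.Exists(target) ||
                File.GetLastWriteTimeUtc(source) > File.GetLastWriteTimeUtc(target))
            {
                File.Copy(source, target, true);
            }
        }

        // Run the private copy, so the central files stay unlocked for future updates.
        Process.Start(Path.Combine(LocalFolder, ExeName));
    }
}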
While you can design your code to modify itself (maybe not in C#?), this is generally a bad idea. It means that you must restart something to get the update. (On Linux you are able to replace files that are in use; however, the update does not take effect until the new data is loaded into memory, i.e. on application restart.)
The strategy used by Firefox (I never actually looked into it) is storing the updated executable in a different file, which is checked for when the program starts to load. This allows the program to overwrite itself with the update before the resource is locked by the OS. You can also design your program to be more modular, so that portions of it can be "restarted" without requiring a restart of the entire program.
How you actually do this is probably provided by the links given by others.
Edit: In light of a response given to Lasse V. Karlsen:
You can have your main program look for the latest version of the program to load (this launcher program wouldn't be able to get updates without everyone out). You can then remove older versions once people are no longer using them. Depending on how frequently people restart their program, you may end up with a number of older program versions.
ClickOnce and Silverlight (out of browser) both support your scenario, if we're talking about upgrades. Remote login to your users' machines? Nope. And no, Firefox doesn't do that either, as far as I can tell.
Please double-check both methods and add them to your question, explaining why they might not do what you need. Otherwise it's hard to move on and suggest better alternatives.
Edit: This "I just updated, please restart" thing you seem to like is one method call for Silverlight applications running outside of the browser. At this point I'm fairly certain that this might be the way to go for you.
ClickOnce doesn't require a webserver; it will let you publish updates while users are running the software. You can code your app to check for a new update every few minutes and prompt the user to restart the app if a new version is found, which will then take them through the upgrade process.
Another option is a Silverlight OOB application, but this would be more work if your app is already built as WinForms/WPF client app.
Various deployment/update scenarios (for .NET applications) are discussed, with their pros and cons, in Microsoft's Smart Client Architecture and Design Guide. Though a little bit old, I find that most of it still holds today, as it describes basic architectural principles rather than technical details. There is a PDF version, but you can find it online as well:
Deploying and Updating Smart Client Applications
Is this possible without too much of a hassle?
Considering the concurrency issues with thin clients and the complexity of Windows installations: yes, hot updates will be a hassle unless you do it the way the system demands.

.NET Applications performance problem on Windows 2003

We have a 2 x Quad Core Xeon server with 8GB of RAM and Windows Server 2003 Enterprise installed on it. We installed our application server which is based on .NET Framework 3.5 on it. The server uses SQL Server 2005 as its database server.
When we installed the application server, it had ultra-fast performance and everything was fine. Once we joined it to our domain, its performance decreased dramatically. For example, a task that took 1 second to complete now takes about 30 seconds. This is very strange, since only .NET-based applications took this performance hit; the other applications still run at their normal speed.
Does anyone have any idea about why is this happening? Any help or suggestion is much appreciated.
Unfortunately, more is probably needed to answer your question. There are a host of possible reasons why this is occurring, and most of them involve your code.
Based on the symptom that you joined the domain and then things started causing trouble, I'd say you're doing a lot of networking that previously could be done locally on your machine, and the latency is now actually causing trouble.
But that's a wild guess based on not nearly enough information.
I'd suggest you profile your code. Find out where the majority of your time is spent during execution and then post the code or a sanitized version of it here so we can help you optimize it.
I did find the answer to my question, so I thought it might be good to share it here. The CLR wants to generate publisher evidence for assemblies with an Authenticode signature when it tries to load them. In our case the CLR was trying to connect to crl.microsoft.com, but our server's internet access was blocked, so this caused a huge delay whenever the application server tried to load a new assembly.
The following post describes how you can disable this feature:
Bypassing the Authenticode Signature Check on Startup
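For reference, the fix described in that post boils down (as far as I understand it) to this runtime element in app.config / web.config, which stops the CLR from generating publisher evidence and therefore from downloading the certificate revocation list at startup:

<configuration>
  <runtime>
    <generatePublisherEvidence enabled="false"/>
  </runtime>
</configuration>

If I remember correctly, this element requires .NET 2.0 SP1 or later.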
I'm going to make a guess here and think that you're talking about a web application. If this is correct, you might want to take a look at the application pools you have setup on the webserver. Your application might be getting confused about which pool to set itself in when it starts running.
Another thing to check might be your data connections and make sure that you're closing everything that's been opened.
The last thing, like Randolpho said, you're just really going to have to follow your code execution with some kind of profiler and see where things are getting tied up.
Good luck!

Precompilation and startup times on ASP.Net

I am developing a (relatively small) website in ASP.NET 2.0. I am also using NAnt to perform some easy tweaking on my project before delivering executables. In its current state, the website is "precompiled" using
aspnet_compiler.exe -nologo -v ${Appname} -u ${target}
I have noticed that after the IIS pool is restarted (after an idle shutdown or a recycle), the application takes up to 20 seconds before it is back online (and Application_Start is reached).
I don't have the same issue when I am debugging directly within Visual Studio (it takes 2 seconds to start), so I am wondering if the aspnet_compiler is really such a good idea.
I couldn't find much on MSDN. How do you compile your websites for production?
Make sure that:
You are using a Web Application project rather than a Web Site project; this will result in a precompiled binary for your code-behind
You have turned off debug code generation in the web.config file - I guess if this differs from when you ran aspnet_compiler, the code may be recompiled
If you've tried those, you could maybe try running ngen over your assembly, thus saving the JIT time?
For ultimate responsiveness, don't allow your app to be shut down.
The first method is to make sure that it's incredibly popular so that there's always someone using it.
Alternatively, fetching a tiny keep-alive page from somewhere else as a scheduled activity can be used to keep your site 'hot'.
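As a sketch, the keep-alive can be as small as a console exe run by Task Scheduler every few minutes (the URL is a placeholder; point it at any cheap page of your site):

using System.Net;

class KeepAlive
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Any request that reaches the app pool keeps it warm.
            client.DownloadString("http://www.example.com/keepalive.aspx");
        }
    }
}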
If your website is compiled as updatable, you'll see a bunch of .aspx files in your virtual directory. These must be compiled on startup; that's so you can come in and alter the web UI itself. This is the default for both websites and web applications.
Make sure this is set in web.config: <compilation debug="false">. In my case, I also have a batch file which issues GET requests for all the main pages before handing the site over to users (page-load simulation).
The key is to make sure the IIS Application Pool never shuts down. This is where the code is actually hosted. Set the "Idle Timeout" (Under Advanced Settings) to something really high, like 1440 minutes (24 hours) to make sure it's not shut down as long as somebody hits your site once a day.
You are still going to pay the JIT cost whenever you deploy new code, or if this idle timeout period is exceeded without any traffic.
Configuring IIS 7.x Idle Timeout
@Simon:
The project is a Web Application. Are Web Site projects then slower to start up? (I had no idea it made a difference, besides the different code organization.)
I checked, and while I do edit the web.config after aspnet_compiler is called, I don't touch the debug value (I will, however, check whether the website starts faster if I don't touch the web.config at all, just to make sure).
(And I will definitely have a look at ngen, I was not aware of that tool.)
