I want to add automated encryption of some sections of web.config, with the following requirements:
It should happen automatically every time the application starts (I don't want to do it manually with aspnet_regiis after each config edit).
It should happen automatically only in the production environment (I need to see my values in plain text on my machine).
Finding the code for it is not a problem; my question is where to put it and how to make sure it isn't executed on my localhost.
I tried putting it in Global.asax, but there I get a "Request is not available in this context" exception.
Thanks.
I would recommend setting this up as part of your deployment process.
Calling the encryption every time the application starts isn't a good idea, because the ASP.NET app pool recycles itself from time to time, and that could cause problems.
Depending on your deployment process, you can do it through a batch file or a PowerShell script, but it will need to run on the production machine in question.
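If you end up doing the encryption in code from a deployment step rather than calling aspnet_regiis, a rough sketch using the configuration API might look like this (the virtual path "/MyApp" and the choice of section are placeholders; it has to run on the production box where the config lives):
using System.Configuration;
using System.Web.Configuration;

// Open the deployed application's web.config ("/MyApp" is a placeholder virtual path).
Configuration config = WebConfigurationManager.OpenWebConfiguration("/MyApp");
ConfigurationSection section = config.GetSection("connectionStrings");

if (section != null && !section.SectionInformation.IsProtected)
{
    // DPAPI provider: the encryption is tied to the machine doing the encrypting.
    section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
    config.Save(ConfigurationSaveMode.Modified);
}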
Problems have been reported to me regarding the performance of a live site. I can't seem to replicate any of these issues on any dev or staging environment, and the profilers I have run against dev have revealed nothing unusual.
This has led me to turn to a diagnostics trace for a simple timing trace, so I can at least try to isolate the cause and narrow it down.
I'm quite happy to add
System.Diagnostics.Trace.WriteLine("....");
wherever necessary and add a listener (via a web.config entry) to write out to a log file, but could this massively impact the performance of the live environment itself?
Is there anything else I need to consider when, potentially, leaving this to run over the weekend? E.g., is it best to specify how large the log file can get before closing it and opening a new one?
It depends on how much data you are going to log, so turn on the logger and check whether your application still behaves normally. If logging to a file slows your application down, consider a faster TraceListener such as EventLogTraceListener (you can create a dedicated event log for this purpose with a maximum size and log rolling). If logging to a file is not a problem, get the Essential.Diagnostics RollingFileTraceListener; it has many options, including the maximum file size and the number of rolled files.
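As a rough sketch of the event-log route (the source and log names below are placeholders, and creating the source requires administrative rights, so it is usually done once at install time):
using System.Diagnostics;

// One-time, elevated: create a dedicated event log and source for the trace output.
if (!EventLog.SourceExists("MyAppTrace"))
{
    EventLog.CreateEventSource("MyAppTrace", "MyAppDiagnostics");
}

// Attach the listener so existing Trace.WriteLine(...) calls go to that event log.
Trace.Listeners.Add(new EventLogTraceListener("MyAppTrace"));
Trace.AutoFlush = true;

Trace.WriteLine("Timing trace started");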
Use a logging framework like log4net and log like this:
LogManager.GetCurrentClassLogger().Debug("...");
When you later disable logging in the configuration, these calls are effectively skipped by the framework.
If you need to do string formatting for your messages, use DebugFormat(), which does not perform the formatting when the configured logging level doesn't require it.
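A rough illustration of that pattern with log4net (the class and messages are placeholders):
using log4net;

public class OrderService
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(OrderService));

    public void Process(int orderId)
    {
        // DebugFormat skips the formatting when the Debug level is disabled.
        Log.DebugFormat("Processing order {0}", orderId);

        // For genuinely expensive messages, guard explicitly.
        if (Log.IsDebugEnabled)
        {
            Log.Debug("Diagnostic dump: " + BuildDiagnosticDump());
        }
    }

    private string BuildDiagnosticDump() { return "..."; }
}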
I have a console application that writes information retrieved from a database to a txt file. Until now I have executed the executable generated by the console application manually.
Now I need to automate the invocation of the .exe from my web application, so that each time a specific condition occurs in my code-behind I can run the .exe with "fire and forget" logic.
My goals are:
1) Users must not be affected in any way by the console application execution (the SQL queries and txt file generation might take around 3 to 5 minutes), hence the "fire and forget" logic delegated to a separate process.
2) Since the executable will still be run manually in some cases, I would prefer having all the logic in one place, in order to avoid the risk of different behaviour.
Can I safely use System.Diagnostics.Process to achieve this?
System.Diagnostics.Process cmd = new System.Diagnostics.Process();
cmd.StartInfo.FileName = "Logger.exe";
cmd.Start();
Does the process end automatically, or do I have to set a timeout and explicitly close it? Is it "safe" in a web application environment, with different users accessing the web application, to let them call the executable without the risk of concurrent accesses?
Thanks.
EDIT:
Changed to use the built-in class for more clarity; thanks for the hint.
As far as the mechanics go, I assume CommandLineProcess wraps Process? If so, I don't see anything necessarily wrong with it at first glance. I just have some issue with running this as an executable from a web application, as you are more likely to reduce security to get it working than to rearchitect (if you follow the normal path I see in development).
If you encapsulate the actual business code in a class library, you can run the code in the web application. The main rule is that the folder it saves to should be under the webroot (physically or logically) so you don't have to reduce security. But if the logic is encapsulated, you can run the "file creator" in the web process without spinning up a Process.
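A minimal in-process, fire-and-forget sketch, assuming the console app's logic has been moved into a hypothetical ReportExporter class in a shared class library and that this runs in code-behind:
using System;
using System.Threading;

// Resolve the output path while the request context is still available.
string outputPath = Server.MapPath("~/App_Data/export.txt");

// Fire and forget: queue the work on the thread pool so the request returns immediately.
ThreadPool.QueueUserWorkItem(delegate
{
    try
    {
        // ReportExporter is a placeholder for the shared class-library code.
        new ReportExporter().ExportToTextFile(outputPath);
    }
    catch (Exception ex)
    {
        // Log failures; exceptions on a background thread would otherwise be invisible.
        System.Diagnostics.Trace.WriteLine("Export failed: " + ex);
    }
});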
Your other option is to wrap the process in a service (I like a non-HTTP WCF service, but you can go with a Windows service if you want). I would only go this direction if it makes sense to follow a SOA path with a service endpoint. As this is likely to be isolated to a single application, in-process makes more sense (unless you are saving to a directory outside of the webroot).
Hope this makes sense.
Yes, it will die on its own, provided that the .exe terminates on its own. It will run with the same credentials as the web server.
Keep in mind this is considered unsafe, since you are executing code based on whatever your webapp is doing. However, the problem is with .exe files being executed this way in general and not with the actual users accessing the app.
Similar question here How do I run a command line process from a web application?
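If you do go the Process route, a rough sketch of a fire-and-forget launch (the path is a placeholder):
using System.Diagnostics;

ProcessStartInfo startInfo = new ProcessStartInfo();
// Use a full path; the web process's current directory is not your application folder.
startInfo.FileName = @"C:\Tools\Logger.exe";
startInfo.UseShellExecute = false;
startInfo.CreateNoWindow = true;

// Fire and forget: no WaitForExit, so the request is not blocked.
// The child process terminates on its own when Logger.exe finishes.
Process.Start(startInfo);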
I ran into an issue where my .NET application takes too long to "cold start" (i.e., when starting it for the first time after a PC boot).
With Procmon I saw a lot of reading and writing of files with a .cch extension, which I understand to be the CAS cache.
I suspect this is causing the performance problem, and I want to disable the use of the CAS cache, letting the .NET Framework process the security checks over and over on every startup of my application without reading and writing the CAS cache.
How can I do that, preferably via app.config?
Thanks.
We have an IIS-hosted web method which is randomly dying on us about 10% of the time. In trying to debug this we've added Log.Debug() messages in front of every real line of code, and it appears to be dying on random lines.
Has anyone seen this or have an idea on how to debug this?
[Additional Details]
We've spent a lot of time looking at it and have discovered the following...
We have a separate self-hosted WCF service that accesses the same database and lives on the same machine. When it is under heavy load, the web method croaks every time. If it's not under load then things usually work fine (but not 100%).
High CPU doesn't seem to be part of the problem. We ran a small app that created a high CPU load and the web service did not die.
The web service dies when we either new up an XmlSerializer (without doing the sgen precompilation) OR have NHibernate create a SessionFactory. The only two things these have in common are that they 1) seem like things people commonly do and 2) seem like they would be fairly intensive.
We've added a Global.asax to try to capture Application_End and Application_Error, but neither event gets fired. To me this implies that we're not dealing with a normal application pool recycle?
Sounds like it might be a threading issue. You are using informative debug messages -- you should try to reproduce the issue while running the debugger and breaking on all exceptions. Make sure you check all the Windows event logs for information on why the app pool crashed.
Per comment: It's hard to say, but many things can cause a thread to appear to "just die." Memory issues: are you doing any interop? Improper marshaling: are you touching data on another thread? But I will play the probabilities and ask whether you're sure you're handling any exception that might be happening and logging it. Are you sure you are not gobbling up an exception somewhere down low and not reporting it? Is this a permissions issue? Are you running under partial trust or on a low-privilege user account?
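Since a Global.asax was added for exactly this, a minimal sketch of an Application_Error handler that records the last exception (the log path is a placeholder; write somewhere the app pool identity can write to, and not under bin):
using System;
using System.IO;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        // Grab the unhandled exception before ASP.NET clears it.
        Exception ex = Server.GetLastError();
        if (ex != null)
        {
            // Placeholder path: keep it outside the application's bin folder.
            File.AppendAllText(@"C:\Logs\webmethod-errors.log",
                DateTime.Now + " " + ex + Environment.NewLine);
        }
    }
}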
Figured it out.. two problems really..
We added Global.asax but it didn't get copied over which explains why we weren't seeing any messages. We fixed this and found out that...
Our WCF log was being written out to the bin directory of the IIS web service; ASP.NET watches the bin folder and tears the application down when files in it change, which explains the random deaths. In retrospect this is kind of silly, since the WS is an old-school web service. The WCF stuff is in the same directory only for some reason that is unknown to us, since the initial person who set things up is gone..
Lesson learned.. Somewhere there is a message that explains everything.. you just have to find it.
I am developing a (relatively small) website in ASP.NET 2.0. I am also using NAnt to perform some easy tweaking on my project before delivering executables. In its current state, the website is "precompiled" using
aspnet_compiler.exe -nologo -v ${Appname} -u ${target}
I have noticed that after the IIS application pool is restarted (after an idle shutdown or a recycle), the application takes up to 20 seconds before it is back online (and Application_Start is reached).
I don't have the same issue when I am debugging directly within Visual Studio (it takes 2 seconds to start) so I am wondering if the aspnet_compiler is really such a good idea.
I couldn't find much on MSDN. How do you compile your websites for production?
Make sure that:
You are using a Web Application project rather than a Web Site project; this will result in a precompiled binary for your code-behind
You have turned off debug code generation in the web.config file - I guess if this is different from when you used aspnet_compiler, the code may be recompiled
If you've tried those, you could maybe try running ngen over your assembly, thus saving the JIT time?
For ultimate responsiveness, don't allow your app to be shut down.
The first method is to make sure that it's incredibly popular so that there's always someone using it.
Alternatively, fetching a tiny keep-alive page from somewhere else as a scheduled activity can be used to keep your site 'hot'.
If your website is compiled as updatable, you'll see a bunch of .aspx files in your virtual directory. These must be compiled on startup; that's so you can come in and alter the web UI itself. This is the default for both websites and web applications.
Make sure this is set in web.config: <compilation debug="false">. In my case, I also have a batch file which issues GET requests for all the main pages before handing the site over to users (page-load simulation).
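A rough C# equivalent of that warm-up step (the URLs are placeholders); run on a schedule, the same loop also works as the keep-alive mentioned above:
using System;
using System.Net;

class WarmUp
{
    static void Main()
    {
        // Placeholder URLs: list the main pages you want pre-compiled and JITted.
        string[] pages =
        {
            "http://www.example.com/Default.aspx",
            "http://www.example.com/Reports.aspx"
        };

        using (WebClient client = new WebClient())
        {
            foreach (string url in pages)
            {
                try
                {
                    client.DownloadString(url); // forces compilation/JIT of that page
                    Console.WriteLine("Warmed: " + url);
                }
                catch (WebException ex)
                {
                    Console.WriteLine("Failed: " + url + " - " + ex.Message);
                }
            }
        }
    }
}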
The key is to make sure the IIS application pool never shuts down, since this is where the code is actually hosted. Set the "Idle Timeout" (under Advanced Settings) to something really high, like 1440 minutes (24 hours), to make sure it's not shut down as long as somebody hits your site once a day.
You are still going to pay the JIT time whenever you deploy new code, or if this idle timeout period is exceeded without any traffic.
Configuring IIS 7.x Idle Timeout
#Simon:
The project is a Web Application. Are websites then slower to start up (I had no idea it made a difference, besides the different code organization)?
I checked, and while I do edit the web.config after aspnet_compiler is called, I don't touch the debug value (I will, however, check whether the website starts up faster if I don't touch the web.config at all, just to make sure).
(And I will definitely have a look at ngen; I was not aware of that tool.)