Preload all assemblies (JIT) - C#

We are taking a hit the first time some heavy UI screens are loaded. Our project is divided into one main executable and several DLL files. The DLL files can also contain UI screens which are slow the first time they are loaded.
Is there a way (in code) we can preload all the referenced assemblies so as to avoid the JIT compilation hit?
I know there is a tool called NGen. Is it possible to operate NGen in a development environment so we can see its effects instantly? Ideally though, we would like to preload the referenced assemblies from code.
Using C# .NET 3.5 along with DevExpress for our UI components.

I personally found that putting the pre-jitter in the static constructor of 'Program' helped in one particular application, but that is situation-specific.
More importantly, the code in wal's answer will crash when it encounters an abstract method, so a check has been added to skip abstract methods and open generic methods (neither of which PrepareMethod can compile as-is).
static Program()
{
    foreach (var type in Assembly.GetExecutingAssembly().GetTypes())
    {
        foreach (var method in type.GetMethods(BindingFlags.DeclaredOnly |
                                               BindingFlags.NonPublic |
                                               BindingFlags.Public |
                                               BindingFlags.Instance |
                                               BindingFlags.Static))
        {
            // PrepareMethod cannot compile abstract methods or methods with
            // unbound generic parameters, so skip them to avoid a crash.
            if ((method.Attributes & MethodAttributes.Abstract) == MethodAttributes.Abstract ||
                method.ContainsGenericParameters)
            {
                continue;
            }
            System.Runtime.CompilerServices.RuntimeHelpers.PrepareMethod(method.MethodHandle);
        }
    }
    Console.WriteLine("jitted!");
}
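As a side note, the skipped generic methods can still be pre-jitted for specific instantiations via the PrepareMethod overload that takes type arguments; a minimal sketch, where SomeType and Lookup are hypothetical stand-ins for one of your own generic methods:

// Pre-JIT one concrete instantiation of a generic method.
// SomeType and Lookup are hypothetical, e.g. void Lookup<T>(T key).
MethodInfo open = typeof(SomeType).GetMethod("Lookup");
System.Runtime.CompilerServices.RuntimeHelpers.PrepareMethod(
    open.MethodHandle,
    new[] { typeof(int).TypeHandle }); // compiles Lookup<int>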

Did you try/look at this?
That is, do the following for each assembly you want to pre-JIT.
static void PreJIT()
{
    foreach (var type in Assembly.GetExecutingAssembly().GetTypes())
    {
        foreach (var method in type.GetMethods(BindingFlags.DeclaredOnly |
                                               BindingFlags.NonPublic |
                                               BindingFlags.Public |
                                               BindingFlags.Instance |
                                               BindingFlags.Static))
        {
            System.Runtime.CompilerServices.RuntimeHelpers.PrepareMethod(method.MethodHandle);
        }
    }
}

Take a look at NGen. Another option is to use ILMerge to merge all your assemblies into one. Whenever I use ILMerge, I add a post-build command so it happens automatically. The other alternative (untested) is, if you don't mind a longer start-up time, to manually call Assembly.Load() at the beginning for each assembly.
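For that last alternative, a minimal sketch (equally untested here): force-load every directly referenced assembly at startup. Note that Assembly.Load only maps the assembly into the process; methods are still jitted on first call, so this mostly front-loads the disk cost rather than the JIT cost.

using System.Reflection;

static void PreloadReferencedAssemblies()
{
    // Loads each directly referenced assembly; this does not JIT any code.
    foreach (AssemblyName name in Assembly.GetExecutingAssembly().GetReferencedAssemblies())
    {
        Assembly.Load(name);
    }
}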

You can just create instances of the classes located in the external assemblies. Just call the constructor in a limited scope (that is, declare the variable inside a function; it should not be a global variable, because that would keep the instance alive and delay its collection by the GC). This will load the assembly and JIT-compile the code paths you touch, which then stay compiled for the lifetime of the process. You can even do this on a background thread so the main thread stays responsive.
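A minimal sketch of that warm-up, where HeavyScreen is a hypothetical stand-in for one of the slow UI types:

using System.Threading;

// HeavyScreen is a hypothetical placeholder for one of your slow forms.
var warmup = new Thread(() =>
{
    using (var screen = new HeavyScreen()) { } // loads + JITs the ctor path, then frees it
});
warmup.SetApartmentState(ApartmentState.STA); // UI types generally expect STA
warmup.IsBackground = true;
warmup.Start();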

You can use NGen on any machine - there is no notion of a "development environment" in the CLR. When you use it, make sure the NGen'ed images are actually used (see Native Image Generator (Ngen.exe) for instructions and look for the FusLogVw note in the document).
You can also pre-JIT by invoking all the code you expect to run (as Davita suggested), but you'd need to invoke each and every method of every class, which is not exactly practical.
You should profile your application to see where the time is actually spent - it could be reading the assemblies from disk, not the JITting itself. You can get a rough idea by starting the application, looking at the forms, closing the application, and repeating these steps. If the second run is much faster, the application spends most of its time reading from disk, not JITting.
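A crude way to put numbers on that comparison is to time the first load of a suspect screen and compare across two runs of the application (HeavyScreen is again a hypothetical placeholder):

using System.Diagnostics;

// JIT work repeats in every process, but the OS file cache stays warm between
// runs, so a much faster second run points at disk I/O rather than JIT.
var sw = Stopwatch.StartNew();
using (var screen = new HeavyScreen()) { }
sw.Stop();
Console.WriteLine("First load took {0} ms", sw.ElapsedMilliseconds);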


How to update a DLL file without restarting the program?
I want to create my own "Updater" class. Its main idea is to
compare the local DLL files (next to the executable) with the ones on a server,
and when a newer version is available, copy the server file over the local one.
Code:
Updater updater = new Updater(LOCAL_PATH, SERVER_PATH);
if (updater.IsAvailableNewerVersion)
    updater.Update();
The Updater's constructor takes two paths - to the server DLL and to the local DLL - and determines their versions.
Then I check the IsAvailableNewerVersion property; if it is true (the available version is newer than the local one), I call the Update method.
The main idea of the Update method is to copy the server DLL over the local one, overwriting it, and to tell the user to restart the program.
Code:
// Requires: System.IO, System.Reflection, System.Windows.Forms
public class Updater
{
    private readonly string _localPath;
    private readonly string _serverPath;
    private readonly Version _currentVersion;
    private readonly Version _availableVersion;

    public Updater(string localPath, string serverPath)
    {
        _localPath = localPath;
        _serverPath = serverPath;
        _currentVersion = AssemblyName.GetAssemblyName(_localPath).Version;
        _availableVersion = AssemblyName.GetAssemblyName(_serverPath).Version;
    }

    // Version is IComparable, so compare whole versions. Comparing component
    // by component with || would wrongly report e.g. 1.9 as newer than 2.0.
    public bool IsAvailableNewerVersion => _availableVersion > _currentVersion;

    public void Update()
    {
        try
        {
            File.Copy(_serverPath, _localPath, true);
        }
        catch (Exception e)
        {
            MessageBox.Show("Unable to copy file - " + e);
            return;
        }
        MessageBox.Show("File was successfully updated. Please restart the program.");
    }
}
Is there a way to check DLL files before they are used?
How can I update a DLL file without restarting the program?
P.S.
I have considered using the server DLL files directly, but then my program would depend on the server being available, which is not good.
You can't really unload / change the assemblies in a running app-domain; so basically you have two options here:
use a launcher exe that does this work and invokes an inner exe to do the real work, shutting it down when complete (launching a new one either before or after the shutdown)
use multiple app-domains in a single exe, isolating where each loads from
The second approach is more useful for server applications, since you can handle the network IO in the outer exe and funnel requests through to the inner app-domain, changing just a single reference to swap between the two systems. Be aware, though, that app-domains don't exist in .NET Core, which limits the flexibility of this approach.
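A minimal sketch of the first (launcher exe) option, where every path and file name is a placeholder:

using System.Diagnostics;
using System.IO;

// Hypothetical launcher: update DLLs while the worker exe is down, then run it.
static void Main()
{
    // The worker is not running yet, so its DLLs are not locked.
    foreach (string source in Directory.GetFiles(@"\\server\share\app", "*.dll"))
    {
        File.Copy(source, Path.Combine("bin", Path.GetFileName(source)), true);
    }

    using (Process worker = Process.Start(Path.Combine("bin", "Worker.exe")))
    {
        worker.WaitForExit();
    }
}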
1. Is there a way to check DLL files before they are used?
If you are using a single EXE, it depends on how you use the DLLs. If they are hard-coupled to your app, I do not know of any way to do so.
But if the DLLs merely implement interfaces from your core DLLs, you could load them using:
Assembly assembly = Assembly.LoadFrom(@"path\orbital.dll");
Type type = assembly.GetType("FullClassName");
object instanceOfMyType = Activator.CreateInstance(type);
This requires you to follow some design patterns, which would be good for your app anyway. But if you already have several strongly coupled DLLs (that is, you added them as references in your Visual Studio project and create objects as var obj = new YourOtherDLLNamespace.YourObjectInTheOtherDLL()), then moving to this approach will probably take a lot of time.
The good thing about this approach is that you can check the DLL version before loading it, copy a new version from the server (or obtain it by any other method you find appropriate), and only then load it.
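A sketch of that interface approach, where IScreen is a hypothetical contract defined in a core DLL that both the host and the plug-in reference (only the core DLL stays hard-coupled):

using System;
using System.Reflection;

// IScreen and Orbital.MainScreen are hypothetical names for illustration.
Assembly assembly = Assembly.LoadFrom(@"plugins\orbital.dll");
Type type = assembly.GetType("Orbital.MainScreen");       // full class name
IScreen screen = (IScreen)Activator.CreateInstance(type); // cast to the shared contract
screen.Show();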
2. How to update a DLL file without restarting the program?
OK, let's suppose you did just as I suggested in item 1. Once you have loaded the DLL, you won't be able to unload it. You could destroy all the objects and let the garbage collector reclaim them, but that won't change the fact that Windows keeps the file loaded.
So for a single EXE the basic answer is no.
Other solutions
You could wrap your solution in another EXE that keeps checking whether the DLL has been updated, raises a custom event (so that your app EXE knows what is going on and can take appropriate action before shutting down), and notifies the user; if the user agrees, you close the app, and the wrapper updates the DLL and launches the EXE again.

Block file system and internet access in a C# application [duplicate]

Over the months, I've developed a personal tool that I'm using to compile C# 3.5 Xaml projects online. Basically, I'm compiling with the CodeDom compiler. I'm thinking about making it public, but the problem is that it is -very-very- easy to do anything on the server with this tool.
The reason I want to protect my server is that there's a 'Run' button to test and debug the app (in screenshot mode).
Is it possible to run an app in a sandbox - in other words, limiting memory, hard drive, and BIOS access - without having to run it in a VM? Or should I just analyze every piece of submitted code, or disable the Run mode?
Spin up an AppDomain, load the assemblies into it, look for an interface you control, Activate up the implementing type, and call your method. Just don't let any instances you don't 100% control cross that AppDomain barrier (including exceptions!).
Controlling the security policies for your external-code AppDomain is a bit much for a single answer, but you can check this link on MSDN or just search for "code access security msdn" to get details about how to secure this domain.
Edit: There are exceptions you cannot stop, so it is important to watch for them and record in some manner which assemblies caused them, so you will not load those again.
Also, it is always better to inject into this second AppDomain a type that you then use to do all the loading and execution. That way you can be sure that no type you don't control (one that could bring down your entire application) crosses any AppDomain boundary. I've found it useful to define a type that extends MarshalByRefObject and to call methods on it to execute the insecure code in the second AppDomain. It should never pass an unsealed type that isn't marked Serializable across the boundary, either as a method parameter or as a return type. As long as you can accomplish this, you are 90% of the way there.
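A minimal sketch of creating such a locked-down domain, assuming .NET 4's simplified sandboxing (this CreateDomain overload doesn't exist on 3.5, where you would configure CAS policy instead); the sandbox directory is a placeholder:

using System;
using System.Security;
using System.Security.Permissions;

// Execution-only grant set: no file system, network, or UI permissions.
var permissions = new PermissionSet(PermissionState.None);
permissions.AddPermission(new SecurityPermission(SecurityPermissionFlag.Execution));

var setup = new AppDomainSetup { ApplicationBase = @"C:\sandbox" }; // untrusted code dir
AppDomain sandbox = AppDomain.CreateDomain("Sandbox", null, setup, permissions);

// ...load the compiled assembly via a MarshalByRefObject proxy and run it...

AppDomain.Unload(sandbox);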

Dynamically Compile C# Code Without Piling up Assemblies in Memory

The problem (and some unnecessary information): I am creating a chat bot in C# (not chatterbot), and want users to be able to run custom code on the bot. Basically, you send a string message over the network, and the bot runs the code contained in it.
I have looked into and actually implemented/used CSharpCodeProvider; however, this has the problem that every time custom code is compiled, another assembly is added to the AppDomain (and assemblies cannot be removed individually). When you take into account that tens or hundreds of separate custom-code invocations may occur in a single lifetime, this becomes a problem.
My idea is that there might be an interpreted language or some such thing that can be invoked from C#.
You can remove an assembly only by unloading the entire AppDomain it lives in. So you can create a fresh AppDomain, load the assembly there (or compile it from there), and dispose of the domain after use.
You could recycle the AppDomain every 100 statements or so to amortize the (small) time it takes to cycle one.
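A rough sketch of that pattern, cycling the domain per call (a counter could recycle it every N calls instead); ScriptRunner is a hypothetical name, and the actual CSharpCodeProvider work is elided:

using System;

// The runner executes inside the child domain; anything it compiles is
// loaded there and dies with the domain.
public class ScriptRunner : MarshalByRefObject
{
    public void CompileAndRun(string source)
    {
        // CSharpCodeProvider compilation goes here; the emitted assembly
        // lands in the current (child) AppDomain, not the main one.
    }
}

static void RunIsolated(string source)
{
    AppDomain domain = AppDomain.CreateDomain("scripts");
    try
    {
        var runner = (ScriptRunner)domain.CreateInstanceAndUnwrap(
            typeof(ScriptRunner).Assembly.FullName,
            typeof(ScriptRunner).FullName);
        runner.CompileAndRun(source);
    }
    finally
    {
        AppDomain.Unload(domain); // every assembly compiled inside goes with it
    }
}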

How to run the Execute method in another AppDomain?

I have an advanced question related to remoting DLLs in memory. I have a WCF service that supplies DLLs (Starter.DLL, entryType). There is also a Process DLL that is referenced by a WPF application. The steps are:
1) The WCF service supplies the DLLs (added to a memory stream)
2) The Process DLL reads them from the WCF service
3) The Process DLL adds each DLL to the current domain:
foreach (byte[] binary in deCompressBinaries)
{
    AppDomain.CurrentDomain.Load(binary);
}
4) The WPF app calls the Process DLL's CreateApplication method:
this._btnStartApp.Click += (s, args) =>
{
    Process appMngr = new Process();
    appMngr.CreateApplication();
};
The CreateApplication method:
object obj = appLoader.CreateInstance(appLoader.EntryType);
MethodInfo minfo = obj.GetType().GetMethod("Execute", BindingFlags.Instance | BindingFlags.Public | BindingFlags.CreateInstance);
But I cannot see the Execute method in the GetMethod result when inspecting it with QuickWatch. How do I run the Execute method in the other current domain? The DLLs are being added to the Process DLL's current domain, not the WPF app's current domain.
AppDomains are tricky to work with when dynamically loading DLLs.
First: if you cannot see the method, verify that you can get the type of the object. If you can see the type, the object should be loaded into the current domain. Then you can iterate over its methods and inspect them to see whether something is mismatched in the GetMethod call, which could make it fail to show up. The CreateInstance flag might be the issue; I believe that one is used for constructors.
Second: if you want things to run in a separate AppDomain (very important when you want to unload the DLL later), then you have to build three things:
create a set of classes that can handle all the loading and execution of the dynamic DLL;
create an interface that allows your main code to talk to this set of classes; this interface cannot return any types that came from the dynamic DLL - if it does, you either load that DLL into the main domain or you crash (I recall MarshalByRefObject may be needed, but I'm uncertain; it has been a while);
attach an event handler to your main AppDomain that notifies you when a DLL is loaded into it; this is how you can tell whether you are inadvertently loading the DLL into your current AppDomain (a sketch follows below).
http://msdn.microsoft.com/library/system.appdomain.assemblyload.aspx
Now you can create a new AppDomain, call your loading code in there, and get the results through the interface. If you can complete the call with no load events triggered, you've got it working. You can dispose of the AppDomain when you are done, and the loaded DLL goes along with it.
Good luck - this was tricky to figure out and get working correctly.
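The sketch mentioned for the third point is just an event subscription on the main domain:

using System;

// Log every assembly that lands in the main AppDomain; if one of the dynamic
// DLLs shows up here, it escaped the secondary domain.
AppDomain.CurrentDomain.AssemblyLoad += (sender, args) =>
    Console.WriteLine("Loaded into main domain: " + args.LoadedAssembly.FullName);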

Very High Memory Usage in .NET 4.0

I have a C# Windows Service that I recently moved from .NET 3.5 to .NET 4.0. No other code changes were made.
When running on 3.5, memory utilization for a given workload was roughly 1.5 GB and throughput was 20 X per second. (The X doesn't matter in the context of this question.)
The exact same service running on 4.0 uses between 3 GB and 5 GB+ of memory, and gets less than 4 X per second. In fact, the service will typically end up stalling out as memory usage continues to climb until my system is sitting at 99% utilization and page-file swapping goes nuts.
I'm not sure if this has to do with garbage collection or what, but I'm having trouble figuring it out. My Windows service uses the "Server" GC via the config-file switch seen below:
<runtime>
    <gcServer enabled="true"/>
</runtime>
Changing this option to false didn't seem to make a difference. Furthermore, from the reading I've done on the new GC in 4.0, the big changes only affect the workstation GC mode, not the server GC mode. So perhaps GC has nothing to do with the issue.
Ideas?
Well this was an interesting one.
The root cause turns out to be a change in the behavior of SQL Server Reporting Services' LocalReport class (v2010) when running on top of .NET 4.0.
Basically, Microsoft altered the behavior of RDLC processing so that each report is processed in a separate application domain. This was actually done specifically to address a memory leak caused by the inability to unload assemblies from app domains: when the LocalReport class processes an RDLC file, it creates an assembly on the fly and loads it into the app domain.
In my case, due to the large volume of reports I was processing, this resulted in very large numbers of System.Runtime.Remoting.ServerIdentity objects being created. This was my tip-off to the cause, as I was confused as to why processing an RDLC required remoting.
Of course, to call a method on a class in another app domain, remoting is exactly what you use. In .NET 3.5 this wasn't necessary because, by default, the RDLC assembly was loaded into the same app domain. In .NET 4.0, however, a new app domain is created by default.
The fix was fairly easy. First, I needed to enable the legacy security policy using the following config:
<runtime>
    <NetFx40_LegacySecurityPolicy enabled="true"/>
</runtime>
Next, I needed to force the RDLCs to be processed in the same app domain as my service by calling the following:
myLocalReport.ExecuteReportInCurrentAppDomain(AppDomain.CurrentDomain.Evidence);
This resolved the issue.
I ran into this exact issue, and it is true that app domains are created and not cleaned up. However, I wouldn't recommend reverting to the legacy security policy; the domains can be cleaned up by calling ReleaseSandboxAppDomain().
LocalReport report = new LocalReport();
...
report.ReleaseSandboxAppDomain();
Some other things I also do to clean up:
Unsubscribe from any SubreportProcessing events,
Clear the data sources,
Dispose the report.
Our Windows service processes several reports a second and there are no leaks.
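Putting those steps together, the per-report cleanup looks roughly like this (OnSubreportProcessing is a placeholder for whatever handler was attached earlier):

// Roughly the cleanup sequence described above, applied to the report
// from the previous snippet.
report.SubreportProcessing -= OnSubreportProcessing;
report.DataSources.Clear();
report.ReleaseSandboxAppDomain();
report.Dispose();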
I'm pretty late to this, but I have a real solution, and I can explain why it works!
It turns out that LocalReport uses .NET Remoting to dynamically create a sub-AppDomain and run the report, in order to avoid a leak somewhere internally. We then noticed that the report would eventually release all the memory, but only after 10 to 20 minutes. For people generating a lot of PDFs, this isn't going to work. The key, however, is that they are using .NET Remoting, and one of the key parts of Remoting is "leasing": the runtime keeps a marshaled object around for a while, since setting up Remoting is usually expensive and the object will probably be used more than once. LocalReport's RDLC processing abuses this.
By default, the lease time is... 10 minutes! Also, each call into the object adds another 2 minutes to the wait, so the delay can land anywhere between 10 and 20 minutes depending on how the calls line up. Luckily, you can change this timeout. Unluckily, you can only set it once per AppDomain, so if you need Remoting for anything other than PDF generation, you will probably need to run that in a separate service so you can change the defaults. To do this, all you need to do is run these four lines of code at startup:
// LifetimeServices lives in System.Runtime.Remoting.Lifetime.
LifetimeServices.LeaseTime = TimeSpan.FromSeconds(5);
LifetimeServices.LeaseManagerPollTime = TimeSpan.FromSeconds(5);
LifetimeServices.RenewOnCallTime = TimeSpan.FromSeconds(1);
LifetimeServices.SponsorshipTimeout = TimeSpan.FromSeconds(5);
You'll see the memory use start to rise, and then within a few seconds it should start coming back down. It took me days with a memory profiler to really track this down and realize what was happening.
You can't wrap ReportViewer in a using statement (its Dispose crashes), but you should be able to if you use LocalReport directly. After it is disposed, you can call GC.Collect() if you want to be doubly sure you are doing everything you can to free up that memory.
Hope this helps!
Edit
Apparently, you should call GC.Collect(0) after generating a PDF report, or else memory use can still stay high for some reason.
You might want to:
profile the heap
use WinDbg + SOS.dll to establish which resource is being leaked and where the reference is held
Perhaps some API has changed semantics, or there might even be a bug in the 4.0 version of the framework.
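For the WinDbg + SOS route, a typical session for this kind of investigation looks something like the following (commands only, no output shown; $$ marks WinDbg comments):

.loadby sos clr                  $$ load SOS (use "sos mscorwks" on CLR 2.0/3.5)
!dumpheap -stat                  $$ which types dominate the heap?
!dumpheap -type ServerIdentity   $$ list instances of a suspect type
!gcroot <address>                $$ who is keeping a given instance alive?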
Just for completeness: if anyone is looking for the equivalent ASP.NET web.config setting, it is:
<system.web>
    <trust legacyCasModel="true" level="Full"/>
</system.web>
ExecuteReportInCurrentAppDomain works the same.
Thanks to this Social MSDN reference.
It seems as though Microsoft tried putting the report into its own separate memory space to work around all of the memory leaks, rather than fixing them. In doing so, they introduced some hard crashes and ended up with more memory leaks anyway. They seem to cache the report definition but never use it and never clean it up, and every new report creates a new report definition, taking up more and more memory.
I played around with doing the same thing: use a separate app domain and marshal the report over to it. I think that is a terrible solution that makes a mess very quickly.
What I did instead is similar: I split the reporting part of the program out into its own separate reports program. This turns out to be a good way to organize your code anyway.
The tricky part is passing information to the separate program. Use the Process class to start a new instance of the reports program and pass any parameters it needs on the command line. The first parameter should be an enum or similar value indicating the report that should be printed. My code for this in the main program looks something like:
const string sReportsProgram = "SomethingReports.exe";

public static void RunReport1(DateTime pDate, int pSomeID, int pSomeOtherID) {
    RunWithArgs(ReportType.Report1, pDate, pSomeID, pSomeOtherID);
}

public static void RunReport2(int pSomeID) {
    RunWithArgs(ReportType.Report2, pSomeID);
}

// TODO: currently no support for quoted args
static void RunWithArgs(params object[] pArgs) {
    // .Join here is my own extension method which calls string.Join
    RunWithArgs(pArgs.Select(arg => arg.ToString()).Join(" "));
}

static void RunWithArgs(string pArgs) {
    Console.WriteLine("Running Report Program: {0} {1}", sReportsProgram, pArgs);
    var process = new Process();
    process.StartInfo.FileName = sReportsProgram;
    process.StartInfo.Arguments = pArgs;
    process.Start();
}
And the reports program looks something like:
[STAThread]
static void Main(string[] pArgs) {
    Application.EnableVisualStyles();
    Application.SetCompatibleTextRenderingDefault(false);
    var reportType = (ReportType)Enum.Parse(typeof(ReportType), pArgs[0]);
    using (var reportForm = GetReportForm(reportType, pArgs))
        Application.Run(reportForm);
}

static Form GetReportForm(ReportType pReportType, string[] pArgs) {
    switch (pReportType) {
        case ReportType.Report1: return GetReport1Form(pArgs);
        case ReportType.Report2: return GetReport2Form(pArgs);
        default: throw new ArgumentOutOfRangeException("pReportType", pReportType, null);
    }
}
Your GetReportForm methods should pull the report definition, make use of relevant arguments to obtain the dataset, pass the data and any other arguments to the report, and then place the report in a report viewer on a form and return a reference to the form. Note that it is possible to extract much of this process so that you can basically say 'give me a form for this report from this assembly using this data and these arguments'.
Also note that both programs must be able to see your data types that are relevant to this project, so hopefully you have extracted your data classes into their own library, which both of these programs can share a reference to. It would not work to have all of the data classes in the main program, because you would have a circular dependency between the main program and the report program.
Don't overdo it with the arguments, either. Do any database querying you need in the reports program; don't pass a huge list of objects (which probably wouldn't work anyway). You should just be passing simple things like database ID fields, date ranges, etc. If you have particularly complex parameters, you might need to push that part of the UI to the reports program too, rather than passing them on the command line.
You can also put a reference to the reports program in your main program, and the resulting .exe and any related .dlls will be copied to the same output folder. You can then run it without specifying a path, using the executable file name by itself (i.e. "SomethingReports.exe"). You can also remove the reporting DLLs from the main program.
One issue with this is that you will get a manifest error if you've never actually published the reports program. Just dummy-publish it once to generate a manifest, and then it will work.
Once you have this working, it's very nice to see your regular program's memory stay constant when printing a report. The reports program appears, taking up more memory than your main program, and then disappears, cleaning it up completely with your main program taking up no more memory than it already had.
Another issue might be that each report instance will now take up more memory than before, since each is an entire separate process. If the user prints a lot of reports and never closes them, memory use will climb quickly. But I think this is still much better, since that memory can easily be reclaimed simply by closing the reports.
This also makes your reports independent of your main program. They can stay open even after closing the main program, and you can generate them from the command line manually, or from other sources as well.
