Startup of Winforms program 10x slower under x64 relative to x86 - c#

I've created a popular Winforms program in C# which has a lot of GUI widgets, and found that when the platform target is x64, startup is around 5-10x slower than x86. Under an x64 target, it takes around 5 seconds to start up, and this negatively impacts the user experience. I'd like to make it quicker.
I also tried with another program of mine, and also found the startup time under x64 was double or triple that of x86.
So I began to wonder what was causing it. My programs use quite a lot of widgets, so to test the theory I decided to create a test project with 800 buttons! I set all of them to Visible=False so that redrawing/refresh speed doesn't muddy the waters.
To my amazement, the x64 started up a LOT slower than the x86 equivalent. I then proceeded to time the InitializeComponent(); section and sure enough, the x64 version ran in about 3.5 seconds. The x86 on the other hand took only around 0.275 seconds. That's almost 13x quicker!
.NET Framework 2.0, 3.0, and 3.5 are all equally bad. Targeting .NET 4 & 4.5 under x64 is much better at around 0.8 seconds, but that's still about 3x slower (x86 is roughly the same at 0.28 seconds), and I want to use .NET 3.5 for the larger userbase.
So my question is: What's causing the x64 version to start up so slow, and how can I make it quicker so that it's comparable to the x86 version?
If anyone wants to test immediately, I have created a zip of the VS 2010 project which can be downloaded here: http://www.skytopia.com/stuff/64_vs_32bit_Startup_Speed.zip

It's the JIT cost of your 10000 line InitializeComponent function.
If you measure the time between the call to InitializeComponent and the execution of its first line, that's the majority of the cost. (Just insert a line at the top of InitializeComponent for the measurement.)
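A minimal sketch of that measurement, using Stopwatch (the logging call and its placement are up to you):

    // In Form1.cs - time the whole call. To isolate the JIT delay, also
    // log a timestamp on a line you add at the top of InitializeComponent
    // in Form1.Designer.cs and compare the two stamps.
    public Form1()
    {
        Stopwatch sw = Stopwatch.StartNew(); // System.Diagnostics
        InitializeComponent();
        sw.Stop();
        Debug.WriteLine("InitializeComponent: " + sw.ElapsedMilliseconds + " ms");
    }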
If you use the VS Performance Analyzer, it'll show most time spent in ThePreStub, which is related to JIT.
The 64 bit JITter takes longer to compile code than the 32 bit JITter, but in exchange it produces better code.
Microsoft is working on a new version of the JITter, called RyuJIT. It's derived from the 32 bit JITter and has similar characteristics (fast compilation at the cost of output quality). It will become the standard JIT in future versions of .NET.
.NET 4.5 reduces cost from 2.0 seconds to 1.3 on my machine. This is probably due to JIT improvements in the 4.0 runtime.
An equivalent loop is much faster than your InitializeComponent function.
That won't help you if you want to create all components in the designer, but if you have a mix of repetitive controls and components that you want to edit in the designer, you can use a loop. Just put it in Form1.cs not in Form1.designer.cs so that it won't get overwritten by the designer.
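As a rough sketch of such a loop, recreating the question's 800 hidden buttons (the name pattern and count are assumptions based on the question):

    // In Form1.cs, after InitializeComponent(), so the designer never
    // touches or regenerates this code.
    for (int i = 0; i < 800; i++)
    {
        Button b = new Button();
        b.Name = "button" + i;
        b.Visible = false; // matches the question's hidden buttons
        this.Controls.Add(b);
    }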
Using NGen on your assembly should eliminate the JIT cost. But it comes with the downside of having to deal with the GAC.
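For reference, generating a native image looks roughly like this (run from an elevated prompt; the framework version and assembly name are placeholders for your own):

    "%WINDIR%\Microsoft.NET\Framework64\v4.0.30319\ngen.exe" install MyApp.exe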

Related

Understanding 32-bit vs 64-bit in .Net

I've always skirted around the issue of building 64-bit desktop apps as I somehow got it in my head that it wasn't a trivial thing to do, plus I guess I didn't really understand the solution "configuration manager" dialog, platforms and configurations, either.
Anyway, I've just tried converting an existing application by simply changing all of the solution projects' platforms to x64, and it worked. The only issue I had was with a C++ DLL referenced by one of the projects, which needed rebuilding as x64.
So my question (finally) is: why did I only have an issue with the C++ DLL, but the many .Net assembly DLLs referenced by the projects in my solution were fine?
And what actually determines whether my application is built as 64-bit? I changed all of the solution's projects to "x64", but would it still have worked if they were left as "Any CPU", and I only changed the WPF (startup) project to "x64"?
Edit
So in the end it seems that I was simply able to set all project platforms to "Any CPU" rather than "x64" (the TFS server wasn't able to run unit tests with the latter). Even the projects referencing the unmanaged 64-bit DLL seem happy with this (not sure why).
The trick was to uncheck the Prefer 32-bit option in the WPF (startup) project's properties. The built application now runs as a 64-bit app, and TFS/unit tests run happily.
TL;DR: .Net assemblies are never assembled into machine-specific instructions. Instead they are compiled into MSIL, which runs under the .Net virtual machine; at runtime the VM uses the JITer to produce machine instructions that can be executed by the CPU.
The differences between 32bit and 64bit that "break" object files are: pointer size, memory model and, most importantly, the instruction set architecture (ISA). Let's break down each point and see why .Net assemblies are mostly unaffected.
Pointer Size (or Length)
In a 32bit execution environment pointers are 32 bits long, and as you'd expect, in 64 bit they are 64 bits long. This fact can break your assembly in one of two ways:
Structures containing pointers will have their internal memory layout change, as they need more (or fewer) bytes to store each pointer.
Pointer arithmetic - some clever developers like to play around with pointers: masking them, adding them and so on. In some cases the size of the pointer is integral to these computations yielding correct results.
Why .Net assemblies are not affected
Since .Net languages are safe, you don't get to play with pointers (generally, unless you are in an unsafe context). Not having the option to play with pointers solves the 2nd point. As for the first point: this is the reason you cannot get the size (using sizeof) of a class, nor of a struct containing class references.
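A tiny illustration of that restriction, using nothing beyond standard C#:

    int a = sizeof(int);       // fine: primitives have a fixed, known size
    // int b = sizeof(string); // compile error: no sizeof for reference
    //                         // types, so pointer width never leaks out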
Memory Model
The old 32bit processors had a feature called memory segmentation, which allowed the OS or a supervisor program to declare regions of memory that are accessed using special CPU registers called segment registers. In 64bit this feature is mostly disabled, so programs compiled to work under memory segmentation may have problems working in a non-segmented environment.
Why .Net assemblies are not affected
Generally we don't deal with memory at this low a level. This is possible because of the .Net virtual machine, as explained in the next point.
Instruction Set Architecture
32bit and 64bit (x86 and AMD64) are related but ultimately different ISAs, which means that code assembled to run under one will not run under the other. So unlike the other two points, this one will break your assembly regardless of what you've written in it.
Why .Net assemblies are not affected
So how could .Net assemblies possibly be compiled as Any CPU? The trick is that when you compile a .Net language you never assemble it. That means when you press compile, you only compile your code into an intermediate object (known as a .Net assembly) that is not assembled (i.e. never run through an assembler).
Usually when you compile code in C/C++, you first run it through a compiler that generates some assembly instructions; these instructions are then passed down to an assembler, which generates machine instructions. The CPU is only capable of executing these machine instructions.
.Net languages are different. As mentioned above, when you compile a .Net language you run it through a compiler and stop there; no assembler is involved. The compiler here, too, generates assembler-like instructions, but unlike the instructions produced by a C/C++ compiler these instructions are machine agnostic: they are written in a language called MSIL - Microsoft Intermediate Language. No CPU knows how to execute these instructions; just like ordinary assembler instructions, they must first be assembled into machine instructions.
The trick is that this all happens at run time. When a user opens your program, they launch an instance of the .Net runtime in which your program runs; your program never runs directly against the native machine itself. When your program needs to call a method, a special component in the .Net virtual machine called the JITer (just-in-time compiler) is tasked with assembling this method from MSIL into machine instructions specific to the machine the .Net framework was installed on.
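To make that concrete, here's a trivial method and, approximately, the machine-agnostic MSIL the C# compiler emits for it (viewable with ildasm.exe); the same IL is produced whether you target x86 or x64:

    static int Add(int a, int b)
    {
        return a + b;
    }

    // Emitted MSIL (approximate, release build):
    //   ldarg.0
    //   ldarg.1
    //   add
    //   ret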
From a correctness standpoint 32/64 only matters when you are interoperating with native code, using low-level unsafe features, or allocating a lot (> 4GB) of memory. Absent those considerations you might as well just use AnyCpu. AnyCpu code will default to your host OS bitness (though that default can be altered), so for most people these days it will end up running in a 64 bit process.
There are sometimes performance reasons to explicitly prefer 32 or 64 bits. 32 bit apps will generally be more cache friendly. Compute-bound apps should perform better when run as 64 bit processes (though not always).
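If you want to see which way an AnyCpu build actually landed, a quick check like this works (Environment.Is64BitProcess exists from .NET 4.0 onwards):

    // Reports the bitness the loader chose for this process.
    Console.WriteLine(Environment.Is64BitProcess ? "64-bit" : "32-bit");
    Console.WriteLine("Pointer size: " + IntPtr.Size + " bytes");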

Is it good practice to design c# application under windows 64 bit?

I am new to C#. I am developing a C# application under Windows 7 64-bit using Visual Studio 2015 Professional. Later, when the application is ready, I expect to install it on 64-bit or 32-bit machines.
If I design the application under 64 bit, I think it will not work under 32 bit, right? My question is: if I design it and then create the setup file on a 64-bit computer, what should I do later in order for my application to work on 32-bit Windows machines?
Please help me. Thank you
if I design the application under 64 bit I think it will not work under 32 bit, right?
No. You can build an "Any CPU" assembly that will work in both. (You can also build a 32-bit assembly that will work in both, but on 64-bit it runs as a 32-bit app, which is rarely what you want).
my question is: if I design it and then create setup file under 64 bit computer, what should I do later in order for my application to work under 32 bit windows machines?
Test it in one. Selecting "Any CPU" is simple, but forgetting to do so is also simple, especially if you add another project for a library (an Any-CPU application with a 64-bit-only library is essentially 64-bit-only). Testing makes this very obvious, very soon.
If it's a managed application, generally it won't matter. The run-time will auto-magically sort out most problems simply by using Any CPU as the target.
I've mostly met 2 problems:
developers who hard-code sizes to 4 bytes (note that sizeof(int) is always 4 in .NET; it's IntPtr.Size that differs between 32- and 64-bit processes)
managed applications with un-managed dependencies (like OpenSSL libraries) - in this case you will need 2 different builds (one for 32 bit and one for 64 bit)
Your QAs should test your application on all target systems (even if by using virtual machines to do so).
Most .NET developers (I can't speak for the C++ guys) use 64bit systems simply because we want performance and to use the newest stuff. All new CPUs have supported 64 bit for quite some time now. It's no longer something exotic.
With .NET it doesn't matter: just stick to "Any CPU" compilations and you should not have any problem.
The only exception is if you're using external libraries built for a specific platform (unmanaged libraries, or managed libraries specifically compiled for x64).
In the case of managed libraries you will need to compile twice, once for x86 and once for x64, with the correct libraries in each. In the unmanaged case you just need to declare the DllImports twice, one for each platform, and pick the correct one at runtime, as sketched below.
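A sketch of that dual-DllImport pattern; the library names and entry point here are hypothetical stand-ins for your own native builds:

    using System;
    using System.Runtime.InteropServices;

    static class NativeMethods
    {
        // One declaration per native build of the same library.
        [DllImport("mylib32.dll", EntryPoint = "do_work")]
        static extern int DoWork32(int value);

        [DllImport("mylib64.dll", EntryPoint = "do_work")]
        static extern int DoWork64(int value);

        // Pick the right import at runtime based on process bitness.
        public static int DoWork(int value)
        {
            return Environment.Is64BitProcess ? DoWork64(value) : DoWork32(value);
        }
    }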
It's a bit confusing to be honest...
... later when the application is ready I expect to install it in 64 bit or in 32 bit machines.
All 32-bit applications also work on 64-bit, because Intel/AMD made their AMD64 implementation backward compatible. In other words: regardless of whether you're using unmanaged or managed code, you can always run it on a 64-bit platform.
The other way around that's not the case, so:
if I design the application under 64 bit I think it will not work under 32 bit, right?
Yes, correct.
Personally I don't bother anymore with x86 because everyone nowadays has a 64-bit capable processor.
It's usually best to compile to 'Any CPU'. The 'x86' and 'x64' targets are only there to enforce a certain platform. In practice everything is always compiled to .NET IL, which is JIT-compiled by the runtime to native code.
The JIT uses the 'target flag' or the 'prefer 32-bit' (csproj properties) hint to determine which runtime should run the code. If
you have the 'x86' target or
'prefer 32-bit' (default) or
you don't have a 64-bit system
it will use the x86 JIT compiler (and runtime). Otherwise it will use the x64 JIT compiler (and runtime).
There are some differences and subtleties that you change when you do this (that all have to do with native interop), but when you stick to normal C# code you shouldn't notice them.
Summary: Personally I seem to have made a habit of not being able to fit my applications in 2 GB of RAM (I work in what they now call 'big data'), so in those cases the best practice is to use the 'Any CPU' target and uncheck 'prefer 32-bit'. That said, if you don't need the memory, it's probably better to leave it checked as a safeguard. The relevant project settings are sketched below.
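For reference, the relevant csproj entries look roughly like this for the "Any CPU without prefer 32-bit" case (standard MSBuild property names):

    <PropertyGroup>
      <PlatformTarget>AnyCPU</PlatformTarget>
      <Prefer32Bit>false</Prefer32Bit>
    </PropertyGroup>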
my question is: if I design it and then create setup file under 64 bit computer, what should I do later in order for my application to work under 32 bit windows machines?
You might be surprised, but that's actually a totally different question. Setup systems (e.g. MSI) are highly dependent on the operating system. Most MSIs therefore use hacks to make things work on both platforms. WiX has quite good documentation about this.

When should I compile my c# code with /platform:x64

Reading Microsoft's explanation of the different options for /platform, I found that with /platform:anycpu the application runs as a 32-bit process on 32-bit systems and as a 64-bit process on 64-bit systems.
So why should I force it to use for example x64?
Is there any reason that people may use other options?
The most important and useful difference is in memory management: x86 applications cannot allocate more than 4GB of RAM, but x64 apps can (a quick demonstration is sketched at the end of this answer). And also:
It is related to how the CLR executes your code. Remember that the CLR compiles your code on the fly: when code is first requested, the IL is assembled into machine code and then cached (that's why the first access is slower in your applications). Now, if you select x86, you are flagging your code to indicate that your application will run on an x86 processor, therefore the CLR can use assembler functions that are exclusive to that type of processor (taking performance advantages). The same happens if you select x64, where the registers are bigger and there are new x64 functions that can speed up your application.
If you select "Any", the CLR will not use any CPU-specific functions and will stick to the standard set of instructions.
Reference: Project setting CPU: x86, x64, Any CPU
Hope this helps.
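As promised above, a crude probe that shows the address-space difference; numbers vary by machine, and this is a demonstration rather than production code:

    // Keep allocating 64 MB blocks until the address space runs out.
    // A 32-bit process typically fails somewhere under ~2-4 GB; a 64-bit
    // process keeps going until RAM and pagefile give out.
    var blocks = new System.Collections.Generic.List<byte[]>();
    try
    {
        while (true)
            blocks.Add(new byte[64 * 1024 * 1024]);
    }
    catch (OutOfMemoryException)
    {
        Console.WriteLine("Allocated ~" + (blocks.Count * 64) + " MB before failing");
    }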
It's important when you try to load DLLs made with C++, for example. I have a particular case where I can't execute my program when it's compiled as x64 or Any CPU, but it works fine compiled as 32 bit.
Another case is when you make a DLL for an ASP.NET page: if you compile it as 32 bit, you have to configure IIS to load it correctly.
When you are writing an app that should only execute on a 64 bit OS, i.e. when it gets JITted you need 64-bit native code for some reason.
One reason would be that you want to load a 64-bit-only native DLL, which can't be loaded into a 32 bit process, e.g. because it's not a .NET DLL.

Compiling XNA to 64-bit

I'm working on a simulator that models very complex interactions between many objects, and I mean millions.
I've used XNA because of the useful things that it lets me do easily, especially with the rendering.
My problem is that I'm running into OutOfMemoryExceptions after only a few seconds of simulation. I quickly realized that my program is only running in 32-bit mode, so I only get a few GBs of my somewhat larger amount of RAM.
How can I run an XNA game in 64-bit mode?
XNA used to be 32-bit only. You can try to compile to 64-bit, but then you are going to have issues because there won't be any 64-bit XNA libraries to load up. Everything I have found trying to back this up has been comments reminding viewers to compile as x86.
To compile as x64: Right-click on your solution and go to Configuration Manager. From there select x64 as your Active Solution platform. You may have to create it and then select x64 for all your projects. You could also try building as Any CPU; since you are on 64-bit Windows, it should automatically start up in 64-bit mode.
EDIT: I found one of my old VM's that had XNA 4 installed on it. It appears that there is not a way to force it to compile to 64-bit. Even when I try to create a new platform, it will only allow me to select x86.
This does not directly answer the question. I've already upvoted TyCobb's answer.
I've run into the same issue in the past.
Have you considered switching to SlimDX? It's a wrapper around DirectX and allows you to develop .Net applications using DirectX. According to the documentation:
SlimDX sports complete support for both 32 and 64 bit targets.
It might not be as user-friendly as XNA, but I think it could be a good option for you.

Does setting the platform when compiling a c# application make any difference?

In VS2012 (and previous versions...), you can specify the target platform when building a project. My understanding, though, is that C# gets "compiled" to CIL and is then JIT compiled when running on the host system.
Does this mean that the only reasons for specifying a target platform are to deliberately restrict users from running the software on certain architectures, or to force the application to run as 32 bit on a 64 bit machine? I can't see that it would be anything to do with optimization, as I guess that happens at the CIL-->Native stage, which happens just-in-time on the host architecture?
This MS Link does not seem to offer any alternative explanation, and I can find no suggestion that you should, for example, release separate 32/64 bit versions of the same application; it would seem logical that something compiled for "anycpu" should run just as well and, again, optimizations will be applied at the JIT stage.
Does this mean that the only reasons for specifying a target platform are to deliberately restrict users from running the software on certain architectures or to force the application to run as 32 bit on a 64 bit machine?
Yes, and this is critical if you're using native code mixed with your managed code. It does not change what gets optimized at runtime, however.
If your code is 100% managed, then AnyCPU (or the new AnyCPU Prefer 32-Bit) is likely fine. The compiler optimizations will be the same, and the JIT will optimize at runtime based on the current executing platform.
I can find no suggestion of the fact that you should, for example, release separate 32 / 64 bit versions of the same application
There is no reason to do this unless you're performing interop with non-managed code, in which case, that will require separate 32 and 64 bit DLLs.
Reed has a good answer here. However, I think it's also important to point out that this setting is just a flag in the DLL; it pretty much has no effect whatsoever in most situations. It is the runtime loader's responsibility (the bit of native code that starts the .NET runtime) to look at this flag and direct the appropriate version of the .NET runtime to be started up.
Because of this - the flag will mostly only matter when it is set on an EXE file - and have no effect when set on a DLL.
For example - if you have a '32-bit-flagged .NET DLL' which is used by either a 64-bit-flagged .NET EXE, or an any-cpu-flagged .NET EXE, and you run the EXE on a 64-bit machine - then the loader will start the 64-bit runtime. When it comes time to load the 32-bit DLL, it's too late - the 64-bit runtime has already been chosen, so your program will fail (I believe it's a BadImageFormatException that you will receive).
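You can observe this failure mode directly. Assuming a hypothetical x86-flagged assembly named My32BitOnly.dll, loading it from a process already running as 64-bit fails at load time:

    try
    {
        // Placeholder name: any managed DLL flagged 32-bit-only will do.
        var asm = System.Reflection.Assembly.LoadFrom("My32BitOnly.dll");
    }
    catch (BadImageFormatException ex)
    {
        Console.WriteLine("Bitness mismatch: " + ex.Message);
    }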
