Dotfuscator doesn't support Unity3d? - c#

I tested Dotfuscator with some of my DLLs written in C#.
It seemed to work really well in a plain .NET environment, so this time I tried the DLLs with Unity3D. Those simple DLLs were written with nothing but standard C# classes and methods: no reflection, no LINQ, no custom or third-party APIs, not even Unity's. But I failed to make them work in my Unity3D project, while the original, unobfuscated DLLs worked fine.
After struggling for a few hours, I decided to test with a new, minimal project. Below is all the C# code I have; there is no other code and there are no other references at all.
[System.Reflection.Obfuscation(Feature = "renaming", Exclude = true)]
public class Class1
{
    [System.Reflection.Obfuscation(Feature = "renaming", Exclude = true)]
    public static int Get()
    {
        return 123;
    }
}
After compiling this to a DLL, I put it in an EMPTY Unity3D Editor project and pressed Play, and it worked fine. But when I tried again after obfuscating the DLL with Dotfuscator, it failed with "InvalidProgramException: Invalid IL code in Class1:Get (): IL_000c: pop".
As you can see above, I added attributes that prevent the code from being renamed, and indeed, in Reflector I could see that neither the class nor the method was actually renamed.
Yesterday I registered and downloaded the evaluation version of the latest Dotfuscator Professional Edition, 4.21.0, from PreEmptive's site. I compiled with both Visual Studio 2008 and 2015 targeting .NET 3.5, and also tried all the different Solution Platforms. The Unity version is 5.3.4f1.
My guess now is that Dotfuscator only works with Microsoft's .NET runtime, not with the Mono 2.x runtime that Unity3D uses internally.
Am I right? Is anyone using Dotfuscator successfully with Unity3D or Mono?
[EDIT]
I checked the IL code using ILDASM. I don't understand what's going on, but it's interesting to see how much of the code was modified.
Before applying Dotfuscator
.method public hidebysig static int32 Get() cil managed
{
.custom instance void [mscorlib]System.Reflection.ObfuscationAttribute::.ctor() = ( 01 00 02 00 54 0E 07 46 65 61 74 75 72 65 08 72 // ....T..Feature.r
65 6E 61 6D 69 6E 67 54 02 07 45 78 63 6C 75 64 // enamingT..Exclud
65 01 ) // e.
// Code Size 6 (0x6)
.maxstack 8
IL_0000: ldc.i4 0x75bcd15
IL_0005: ret
} // end of method Class1::Get
After applying Dotfuscator
.method public hidebysig static int32 Get() cil managed
{
// Code Size 127 (0x7f)
.maxstack 2
.locals init (int32 V_0)
IL_0000: ldc.i4 0x993
IL_0005: stloc V_0
IL_0009: ldloca V_0
IL_000d: ldind.i4
IL_000e: ldc.i4 0x993
IL_0013: stloc V_0
IL_0017: ldloca V_0
IL_001b: ldind.i4
IL_001c: ceq
IL_001e: switch (
IL_002f,
IL_004c,
IL_002f)
IL_002f: br.s IL_003e
IL_0031: ldc.i4.0
IL_0032: stloc V_0
IL_0036: ldloca V_0
IL_003a: ldind.i4
IL_003b: pop
IL_003c: br.s IL_004a
IL_003e: ldc.i4.0
IL_003f: stloc V_0
IL_0043: ldloca V_0
IL_0047: ldind.i4
IL_0048: br.s IL_003b
IL_004a: nop
IL_004b: nop
IL_004c: ldc.i4 0x1
IL_0051: stloc V_0
IL_0055: ldloca V_0
IL_0059: ldind.i4
IL_005a: br.s IL_0068
IL_005c: ldc.i4.0
IL_005d: stloc V_0
IL_0061: ldloca V_0
IL_0065: ldind.i4
IL_0066: br.s IL_0068
IL_0068: brfalse.s IL_006a
IL_006a: ldc.i4.0
IL_006b: stloc V_0
IL_006f: ldloca V_0
IL_0073: ldind.i4
IL_0074: brfalse IL_0079
IL_0079: ldc.i4 0x75bcd15
IL_007e: ret
} // end of method Class1::Get

UPDATE: As of version 6.0, Dotfuscator Professional no longer supports Unity (see the changelog for 6.0.0-beta). It continues to support Mono. The original answer follows.
Yes, Dotfuscator Professional Edition supports Mono and Unity. The differences you're seeing in the IL are due to the Control Flow obfuscation that Professional Edition provides, which is distinct from the Renaming feature from which you have excluded your identifiers.
By default, Dotfuscator uses as many Control Flow transforms as it can get away with on the .NET Framework (i.e., Microsoft's implementation). However, some of these transforms are not compatible with Mono. Dotfuscator provides an option to disable such transforms.
After loading your Dotfuscator project in the GUI, this option can be found on the Settings tab, under "Advanced", named "Use only Mono-compatible transforms". Set this option to "Yes", then save and rebuild the project.
If that is still giving you problems, you can disable Control Flow entirely to see if that fixes the problem. This is also on the Settings tab, under "Features", "Disable Control Flow". Set this option to "Yes", then save and rebuild the project.
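If you would rather control this from source on a per-member basis, the same ObfuscationAttribute shown in the question can, as far as I know, also target features other than renaming. A hedged sketch, assuming your Dotfuscator version recognizes the "controlflow" feature string (please check the product documentation):
[System.Reflection.Obfuscation(Feature = "renaming", Exclude = true)]
[System.Reflection.Obfuscation(Feature = "controlflow", Exclude = true)] // feature string is an assumption
public static int Get()
{
    return 123;
}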
Full disclosure: I work for the Dotfuscator team at PreEmptive Solutions. If you have more questions, keep in mind that evaluation users like yourself also have full access to PreEmptive support.

Related

Simple Add method produces too much IL code

I'm trying to improve my C# math operations by using IL code. One current problem is that C# does not allow math operations on generics, but IL does - at least for the primitive data types (interestingly, not for decimal). For that reason I created a test method in C# to check the resulting IL code.
Here's the code of the C# method:
public static float Add(float A, float B)
{
    return A + B;
}
Here's the result with VS2015 SP2, Release configuration, optimizations turned on.
Just to make sure, here's the csc command line from the build:
C:\Program Files (x86)\MSBuild\14.0\bin\csc.exe /noconfig
/nowarn:1701,1702,2008 /nostdlib+ /platform:anycpu32bitpreferred
/errorreport:prompt /warn:4 /define:TRACE /errorendlocation
/preferreduilang:en-US /highentropyva+ /reference:"C:\Program Files
(x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\Microsoft.CSharp.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\mscorlib.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Core.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Data.DataSetExtensions.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Data.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Net.Http.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Xml.dll"
/reference:"C:\Program Files (x86)\Reference
Assemblies\Microsoft\Framework.NETFramework\v4.5.2\System.Xml.Linq.dll"
/debug- /filealign:512 /optimize+
/out:obj\Release\ConsoleApplication1.exe /ruleset:"C:\Program Files
(x86)\Microsoft Visual Studio 14.0\Team Tools\Static Analysis
Tools\Rule Sets\MinimumRecommendedRules.ruleset"
/subsystemversion:6.00 /target:exe /utf8output Program.cs
Properties\AssemblyInfo.cs
"C:\Users\Martin\AppData\Local\Temp.NETFramework,Version=v4.5.2.AssemblyAttributes.cs"
obj\Release\TemporaryGeneratedFile_E7A71F73-0F8D-4B9B-B56E-8E70B10BC5D3.cs
obj\Release\TemporaryGeneratedFile_036C0B5B-1481-4323-8D20-8F5ADCB23D92.cs
obj\Release\TemporaryGeneratedFile_5937a670-0e60-4077-877b-f7221da3dda1.cs
(TaskId:27) 1>
.method public hidebysig static
float32 Add (
float32 A,
float32 B
) cil managed
{
// Method begins at RVA 0x2054
// Code size 9 (0x9)
.maxstack 2
.locals init (
[0] float32
)
IL_0000: nop
IL_0001: ldarg.0
IL_0002: ldarg.1
IL_0003: add
IL_0004: stloc.0
IL_0005: br.s IL_0007
IL_0007: ldloc.0
IL_0008: ret
} // end of method Program::Add
Do you know of any reason why there is still a nop in it, why there's a local variable, and why there's a jump at the end that does nothing?
It's clear to me that the JIT might eventually fix this up, but seeing output like this, I don't know if I can trust the jitter.
Thanks
Martin
I'm afraid I cannot reproduce your results. Compiling with csc /debug- /optimize+ and then using ildasm, I get:
.method public hidebysig static float32 Add(float32 A,
float32 B) cil managed
{
// Code size 4 (0x4)
.maxstack 8
IL_0000: ldarg.0
IL_0001: ldarg.1
IL_0002: add
IL_0003: ret
} // end of method Program::Add
which is what I'd expect from optimized code. Indeed, if I change to /optimize-, I get the code you posted. Are you perhaps looking in the Debug rather than the Release subdirectory?
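As an aside on the question's premise (that IL will happily add generic values even though C# won't), here is a minimal sketch, not taken from the question, of emitting that same optimized ldarg/ldarg/add/ret body at runtime with DynamicMethod. The GenericAdder name is made up, and the trick only makes sense for primitive numeric types that the IL add instruction accepts:
using System;
using System.Reflection.Emit;

static class GenericAdder<T>
{
    // Builds a delegate whose body is exactly: ldarg.0, ldarg.1, add, ret.
    public static readonly Func<T, T, T> Add = CreateAdd();

    private static Func<T, T, T> CreateAdd()
    {
        var dm = new DynamicMethod("Add", typeof(T), new[] { typeof(T), typeof(T) });
        var il = dm.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0); // push A
        il.Emit(OpCodes.Ldarg_1); // push B
        il.Emit(OpCodes.Add);     // valid in IL for primitive numeric types
        il.Emit(OpCodes.Ret);
        return (Func<T, T, T>)dm.CreateDelegate(typeof(Func<T, T, T>));
    }
}

// Usage: Console.WriteLine(GenericAdder<float>.Add(1.5f, 2.5f)); // prints 4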

Why x86 JIT is smarter than x64?

I'm running a very simple program
static void Main(string[] args)
{
    Console.WriteLine(Get4S());
    Console.WriteLine(Get4());
}

private static int Get4S()
{
    return 4;
}

private static int Get4()
{
    int res = 0;
    for (int i = 0; i < 4; i++)
    {
        res++;
    }
    return res;
}
When it runs under x86, the JIT inlines the Get4S method, and the asm for Get4 is:
00000000 push ebp
00000001 mov ebp,esp
00000003 xor eax,eax
00000005 inc eax
00000006 inc eax
00000007 inc eax
00000008 inc eax
00000009 pop ebp
0000000a ret
BUT when running under x64, we get the same asm for Get4S, while the asm for Get4 is not optimized at all:
00000000 xor eax,eax
00000002 xor edx,edx
00000004 inc eax
00000006 inc edx
00000008 cmp edx,4
0000000b jl 0000000000000004
0000000d ret
I expected the x64 JIT to unroll the loop, see that the result can be computed at compile time, and then inline the function as its compile-time result. But none of that happened.
Why is x64 so stupid in this case?
I got it. It's because RyuJIT is used when the x64 build is selected, even if .NET 4.5.2 is the target framework. I fixed it by adding this section to the App.config file:
<configuration>
  <runtime>
    <useLegacyJit enabled="1" />
  </runtime>
</configuration>
This markup enables the "legacy" x64 JIT (in quotes, because I think it's much better than the "shiny" RyuJIT), and the resulting asm in the Main method is:
00000000 sub rsp,28h
00000004 mov ecx,4
00000009 call 000000005EE75870
0000000e mov ecx,4
00000013 call 000000005EE75870
00000018 nop
00000019 add rsp,28h
0000001d ret
Both methods were computed at compile time and inlined as their constant values.
Conclusion: when .NET 4.6 is installed, the old x64 jitter is replaced by RyuJIT for everything running on CLR 4.0, so the only ways to turn it off are the useLegacyJit switch or the COMPLUS_AltJit environment variable.

Compilation omits code after fixed blocks in certain methods

We have the following method in a class in one of our projects:
private unsafe void SomeMethod()
{
    // Beginning of the method omitted for brevity
    var nv = new Vector4[x];
    fixed (Vector4* vp = nv)
    {
        fixed (float* tp = /* Source float ptr */)
        {
            fixed (double* ap = /* Source double ptr */)
            {
                for (var i = atlArray.Length - 1; i >= 0; --i)
                {
                    vp[((i + 1) << 3) - 2] = new Vector4(tp[i], btt, 0.0f, 1.0f);
                    // Additional Vector4 construction omitted for brevity
                    nttp[i] = new Vector2(tp[i], this.ttvp);
                    nts[i] = string.Format(ap[i], /* etc. */);
                }
            }
        }
    }
    this.ts = nts;
    this.ttp = nttp;
    this.V = nv; // <- This is a property setter
}
I've had to obfuscate this, but hopefully it's still clear enough to get an idea of what's going on.
On one of our developers' machines, in debug builds, the C# compiler removes the three assignments that occur after the fixed block is closed. If we attempt to put a breakpoint on these lines, the breakpoint skips to the ending brace of the method when the application starts. Code that appears between a fixed block and the end of a method is removed in other methods too, but puzzlingly, not in all of them.
After some experimentation, we found that turning on optimization for the affected project caused the missing code to be included. However, this workaround fails for our unit tests: the code is still missing, and changing the optimization settings of the affected project and its test project does not help. We also found that moving the three assignments inside the innermost fixed statement worked; it becomes clear why when examining the IL.
In the debug DLL built on the affected machine (with optimization turned off), a return op appears directly after ap is popped off the stack:
IL_03a1: nop
IL_03a2: ldc.i4.0
IL_03a3: conv.u
IL_03a4: stloc.s ap
IL_03a6: ret
This explains why moving the three assignments before the stloc instruction works. In the debug DLL built on my machine, the return op occurs in the expected place, after the three assignments:
IL_03a5: nop
IL_03a6: ldc.i4.0
IL_03a7: conv.u
IL_03a8: stloc.s ap
IL_03aa: nop
IL_03ab: ldc.i4.0
IL_03ac: conv.u
IL_03ad: stloc.s tp
IL_03af: nop
IL_03b0: ldc.i4.0
IL_03b1: conv.u
IL_03b2: stloc.s vp
IL_03b4: ldarg.0
IL_03b5: ldloc.s nts
IL_03b7: stfld string[] N.B.E.B::ts
IL_03bc: ldarg.0
IL_03bd: ldloc.s nttp
IL_03bf: stfld valuetype [SharpDX]SharpDX.Vector2[] N.B.E.B::ttp
IL_03c4: ldarg.0
IL_03c5: ldloc.s nv
IL_03c7: call instance void N.B.E.B::set_V(valuetype [SharpDX]SharpDX.Vector4[])
IL_03cc: nop
IL_03cd: ret
We've so far failed to produce an SSCCE - this seems to manifest only in very specific circumstances, and only in one of our projects. We've checked that the same versions of Visual Studio, the .NET framework, the C# compiler and MSBuild are being used on both machines. We've checked other potential differences like OS version and updates. Things appear to be the same on both machines (they are the same model of laptop). We're a bit puzzled, frankly. Any help would be much appreciated.
My colleague, the developer whose machine this was affecting, found a difference in the diagnostic output from the build - it was the C# compiler. We thought that MSBuild was using the csc.exe in the expected location (c:\Program Files (x86)\MSBuild\12.0\Bin), but bizarrely, it was actually executing a csc.exe in C:\Users\[user]\AppData\Local\Microsoft\VisualStudio\12.0\Extensions, which had a completely different version. We don't know how that compiler executable came to be there, or how it was being referenced by MSBuild, but once it was removed, MSBuild reverted to using the version of csc.exe in its home Bin directory, and the code is being built correctly now.

Errors/warning after converting VS 2008 (c#) solution to VS 2010

We have a Visual Studio 2008 solution with 58 projects. One project targets the 3.5 runtime while the other 57 target 3.0. The solution builds fine in Visual Studio 2008. I open the solution in Visual Studio 2010 and proceed through the Upgrade project Wizard. When prompted, I decline targeting the 4.0 runtime and stick with the currently selected runtime. The conversion completes with no errors.
When attempting to build I get a large number of the following two warnings:
"The primary reference [AssemblyX] could not be resolved because it has an indirect dependency on the .NET Framework assembly "CrystalDecisions.Enterprise.Framework, Version=11.5.3300.0, Culture=neutral, PublicKeyToken=692fbea5521e1304" which has a higher version "11.5.3300.0" than the version "10.5.3700.0" in the current target framework.
C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets"
and
"The primary reference [AssemblyY] could not be resolved because it has an indirect dependency on the .NET Framework assembly "mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" which has a higher version "4.0.0.0" than the version "2.0.0.0" in the current target framework.
C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets"
I looked at the manifest for an assembly where this error occurs:
// Metadata version: v4.0.30319
.assembly extern System.Web
{
.publickeytoken = (B0 3F 5F 7F 11 D5 0A 3A ) // .?_....:
.ver 2:0:0:0
}
.assembly extern mscorlib
{
.publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4..
.ver 4:0:0:0 //***********Why is this targeting the 4.0?
}
.assembly extern System.Xml
{
.publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4..
.ver 2:0:0:0
}
.assembly extern [SomeAssembly1]
{
.ver 1:0:0:0
}
.assembly extern [SomeAssembly2]
{
.publickeytoken = (A7 E6 CA C5 42 3F 9E A9 ) // ....B?..
.ver 3:1:30307:0
}
.assembly extern [SomeAssembly3]
{
.publickeytoken = (A7 E6 CA C5 42 3F 9E A9 ) // ....B?..
.ver 3:1:30307:0
}
.assembly extern mscorlib as mscorlib_6
{
.publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4..
.ver 2:0:0:0
}
.assembly extern System
{
.publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4..
.ver 2:0:0:0
}
.assembly extern Relo.Profile.Client
{
.ver 1:0:0:0
}
.assembly extern PRERS.Logging
{
.publickeytoken = (A7 E6 CA C5 42 3F 9E A9 ) // ....B?..
.ver 3:1:30307:0
}
.assembly extern Microsoft.Practices.EnterpriseLibrary.Logging
{
.publickeytoken = (74 B5 57 D6 49 41 67 26 ) // t.W.IAg&
.ver 3:1:0:0
}
I looked at the references that SomeAssembly1, SomeAssembly2, SomeAssembly3 (and the entire solution) make, and I can't find anything that targets 4.0.
I've searched for solutions/discussions on the internet and none of the workarounds I have found seem to work for me.
http://social.msdn.microsoft.com/Forums/en/msbuild/thread/516647ee-dccf-49ee-959a-00b1fc098eeb
http://connect.microsoft.com/VisualStudio/feedback/details/571860/assemblies-targetting-net-3-5-will-not-load-in-applications-also-targetting-net-3-5
http://arstechnica.com/civis/viewtopic.php?f=20&t=1112439
http://connect.microsoft.com/VisualStudio/feedback/details/510467/migrated-project-cant-compile-the-commandline-for-resgen-task-is-to-long
Any help is greatly appreciated. Thanks!
See these two blog posts to fix your problem:
http://traf-o-data.blogspot.com/2010/10/mystery-of-mscorlib.html
http://customerfx.com/pages/integrationblog/2010/11/08/working-with-crystal-reports-v11-5-and-visual-studio-2010-error-adding-references-to-crystaldecisions-dlls.aspx
58 projects is a lot! One of the first things I would try is removing some of them from the solution. You know you're going to get other errors about references to the ones you took out, but if this error disappears (or stays), you'll have a better idea of which project is your problem child.
Another approach may be to create a new VS2010 solution and then add the projects back one by one. 58 projects is going to cause you pain; I feel for you!

How does .NET locate the dll of the namespace I'm `using`?

How does .NET locate the dll of the namespace I'm using?
Yes, we do mention the path in /reference:c:\program files**, but after building and deploying, when the software is installed on some user's machine, it may not be at the same path that I (the developer) specified. I mean, it could be somewhere else, right?
So, how does .NET find it?
How does .NET know at runtime which namespaces are in a DLL?
What if the user has the same DLL but renamed it? Does .NET fail to load it then?
You can see the various probing steps in action if you run fuslogvw.exe (from the .NET SDK) and enable logging of all binds to disk. You will be able to see the various paths that .NET CLR uses to resolve assembly references. The rule of thumb though is to try the Global Assembly Cache first then check the local directory along with a bunch of other alternate paths.
Technically, there aren't any namespaces in the DLL.
On CLR level, there are no namespaces at all, only full class names. A CLR class name can consist of an arbitrarily long sequence of any Unicode characters - e.g. ##$% would be a perfectly fine class name, as far as CLR is concerned.
Now, by convention (CLS, to be specific), class names are restricted to certain Unicode characters (alphanumerics and _, and a bunch of other exotic Unicode categories - see the spec for more info) and dot, and dot is used to denote namespaces. This is purely a convention between compilers (and other tools).
So, whenever an assembly references some type for any reason, it simply uses its complete CLR name, such as System.String. But there is more - in fact, it uses a fully qualified name, which includes an assembly as well. You can see those if you look at ildasm output - they look something like [mscorlib]System.String, so the runtime knows where to look.
In other words, CLR really sees assembly mscorlib.dll having class System.String, and assembly B.exe referencing that class as [mscorlib]System.String. Your using statement doesn't generate any code in the output DLL/EXE on its own; it's there just so that you don't have to write System.String all the time.
It's the compiler's job to translate your code saying String, in the scope of a using System; directive, in a project that references mscorlib.dll, into [mscorlib]System.String. It's all done at compile time. The only thing the CLR does at runtime is resolve mscorlib to locate the actual mscorlib.dll on disk (and the other answer explains how exactly that happens).
Assembly location rules are explained in http://msdn.microsoft.com/en-us/library/yx7xezcf.aspx
The names of assemblies and namespaces are unrelated concepts.
Compiled assemblies reference other assemblies by name.
When an assembly is loaded, its references are loaded by searching for the name as described here. The most common location for assemblies are the Global Assembly Cache of a machine, and the base directory of an application.
After loading the assembly and all its references (and the references of the references, etc) the runtime loads the required types from the assemblies.
For example, if you use the type String as in
using System;
...
String x = "Hello World";
the C# compiler looks up String and resolves it to the System.String type in the mscorlib assembly, and transforms the code to something similar to this:
[mscorlib]System.String x = "Hello World";
So when the statement is executed the runtime knows both the full name of the type System.String and the name of the assembly that contains the type.
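To address the "what if the user renamed the DLL" part of the question: default probing will fail in that case, but you can hook assembly resolution yourself. A minimal sketch, not from the original answers; the folder and file names are hypothetical:
using System;
using System.IO;
using System.Reflection;

static class Program
{
    static void Main()
    {
        // Called only when normal probing (GAC, application base, etc.) fails.
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            // args.Name is the full name, e.g. "MyLib, Version=1.0.0.0, Culture=neutral, ..."
            var simpleName = new AssemblyName(args.Name).Name;
            var candidate = Path.Combine(@"C:\SomeOtherFolder", simpleName + ".renamed.dll"); // hypothetical location
            return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
        };

        // ... code that uses types from the relocated/renamed assembly ...
    }
}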
The only answer that answers the question is Pavel Minaev's comment.
Take the following basic C# Program.
using System;

namespace ConsoleApp1
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello");
        }
    }
}
The compiler resolves Console by looking at all the assemblies referenced with /r: on the compilation command line. If more than one definition of Console is visible in the global namespace (as extended by the using directives) across the referenced assemblies, there will be a compiler error; to resolve it you would need to write System.Console explicitly, to indicate the System namespace rather than the global namespace.
Similarly, if you write System.Console and there is more than one definition of System.Console in the referenced assemblies, there is also a compiler error. That would mean either two classes with the same name in the System namespace, which the programmer would have to resolve by renaming one of them, or a class called System defined in the global namespace (as extended by using System) with Console as a nested class, which would be impossible to distinguish even using global::.
Once the compiler has found the referenced assembly that contains the definition, it can work with that definition and produce the correct IL.
When you compile this simple application, you get the following in ildasm:
// Metadata version: v4.0.30319
.assembly extern /*23000001*/ mscorlib
{
.publickeytoken = (B7 7A 5C 56 19 34 E0 89 ) // .z\V.4..
.ver 4:0:0:0
}
.assembly /*20000001*/ ConsoleApp1
{
.custom /*0C000001:0A000001*/ instance void [mscorlib/*23000001*/]System.Runtime.CompilerServices.CompilationRelaxationsAttribute/*01000001*/::.ctor(int32) /* 0A000001 */ = ( 01 00 08 00 00 00 00 00 )
.custom /*0C000002:0A000002*/ instance void [mscorlib/*23000001*/]System.Runtime.CompilerServices.RuntimeCompatibilityAttribute/*01000002*/::.ctor() /* 0A000002 */ = ( 01 00 01 00 54 02 16 57 72 61 70 4E 6F 6E 45 78 // ....T..WrapNonEx
63 65 70 74 69 6F 6E 54 68 72 6F 77 73 01 ) // ceptionThrows.
// --- The following custom attribute is added automatically, do not uncomment -------
// .custom /*0C000003:0A000003*/ instance void [mscorlib/*23000001*/]System.Diagnostics.DebuggableAttribute/*01000003*/::.ctor(valuetype [mscorlib/*23000001*/]System.Diagnostics.DebuggableAttribute/*01000003*//DebuggingModes/*01000004*/) /* 0A000003 */ = ( 01 00 07 01 00 00 00 00 )
.custom /*0C000004:0A000004*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyTitleAttribute/*01000005*/::.ctor(string) /* 0A000004 */ = ( 01 00 0B 43 6F 6E 73 6F 6C 65 41 70 70 31 00 00 ) // ...ConsoleApp1..
.custom /*0C000005:0A000005*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyDescriptionAttribute/*01000006*/::.ctor(string) /* 0A000005 */ = ( 01 00 00 00 00 )
.custom /*0C000006:0A000006*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyConfigurationAttribute/*01000007*/::.ctor(string) /* 0A000006 */ = ( 01 00 00 00 00 )
.custom /*0C000007:0A000007*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyCompanyAttribute/*01000008*/::.ctor(string) /* 0A000007 */ = ( 01 00 00 00 00 )
.custom /*0C000008:0A000008*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyProductAttribute/*01000009*/::.ctor(string) /* 0A000008 */ = ( 01 00 0B 43 6F 6E 73 6F 6C 65 41 70 70 31 00 00 ) // ...ConsoleApp1..
.custom /*0C000009:0A000009*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyCopyrightAttribute/*0100000A*/::.ctor(string) /* 0A000009 */ = ( 01 00 12 43 6F 70 79 72 69 67 68 74 20 C2 A9 20 // ...Copyright ..
20 32 30 32 31 00 00 ) // 2021..
.custom /*0C00000A:0A00000A*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyTrademarkAttribute/*0100000B*/::.ctor(string) /* 0A00000A */ = ( 01 00 00 00 00 )
.custom /*0C00000B:0A00000B*/ instance void [mscorlib/*23000001*/]System.Runtime.InteropServices.ComVisibleAttribute/*0100000C*/::.ctor(bool) /* 0A00000B */ = ( 01 00 00 00 00 )
.custom /*0C00000C:0A00000C*/ instance void [mscorlib/*23000001*/]System.Runtime.InteropServices.GuidAttribute/*0100000D*/::.ctor(string) /* 0A00000C */ = ( 01 00 24 64 66 65 35 65 36 32 61 2D 65 61 31 33 // ..$dfe5e62a-ea13
2D 34 66 37 64 2D 39 36 39 32 2D 37 35 39 39 64 // -4f7d-9692-7599d
31 31 66 31 63 36 61 00 00 ) // 11f1c6a..
.custom /*0C00000D:0A00000D*/ instance void [mscorlib/*23000001*/]System.Reflection.AssemblyFileVersionAttribute/*0100000E*/::.ctor(string) /* 0A00000D */ = ( 01 00 07 31 2E 30 2E 30 2E 30 00 00 ) // ...1.0.0.0..
.custom /*0C00000E:0A00000E*/ instance void [mscorlib/*23000001*/]System.Runtime.Versioning.TargetFrameworkAttribute/*0100000F*/::.ctor(string) /* 0A00000E */ = ( 01 00 1C 2E 4E 45 54 46 72 61 6D 65 77 6F 72 6B // ....NETFramework
2C 56 65 72 73 69 6F 6E 3D 76 34 2E 36 2E 31 01 // ,Version=v4.6.1.
00 54 0E 14 46 72 61 6D 65 77 6F 72 6B 44 69 73 // .T..FrameworkDis
70 6C 61 79 4E 61 6D 65 14 2E 4E 45 54 20 46 72 // playName..NET Fr
61 6D 65 77 6F 72 6B 20 34 2E 36 2E 31 ) // amework 4.6.1
.hash algorithm 0x00008004
.ver 1:0:0:0
}
.module ConsoleApp1.exe
// MVID: {5BC9CD36-3807-4339-8AAD-6E73A42CE87B}
.imagebase 0x00400000
.file alignment 0x00000200
.stackreserve 0x00100000
.subsystem 0x0003 // WINDOWS_CUI
.corflags 0x00020003 // ILONLY 32BITREQUIRED
// Image base: 0x00000000003B0000
.class /*02000002*/ private auto ansi beforefieldinit ConsoleApp1.Program
extends [mscorlib/*23000001*/]System.Object/*01000010*/
{
.method /*06000001*/ private hidebysig static
void Main(string[] args) cil managed
// SIG: 00 01 01 1D 0E
{
.entrypoint
// Method begins at RVA 0x2050
// Code size 13 (0xd)
.maxstack 8
IL_0000: /* 00 | */ nop
IL_0001: /* 72 | (70)000001 */ ldstr "Hello" /* 70000001 */
IL_0006: /* 28 | (0A)00000F */ call void [mscorlib/*23000001*/]System.Console/*01000011*/::WriteLine(string) /* 0A00000F */
IL_000b: /* 00 | */ nop
IL_000c: /* 2A | */ ret
} // end of method Program::Main
.method /*06000002*/ public hidebysig specialname rtspecialname
instance void .ctor() cil managed
// SIG: 20 00 01
{
// Method begins at RVA 0x205e
// Code size 8 (0x8)
.maxstack 8
IL_0000: /* 02 | */ ldarg.0
IL_0001: /* 28 | (0A)000010 */ call instance void [mscorlib/*23000001*/]System.Object/*01000010*/::.ctor() /* 0A000010 */
IL_0006: /* 00 | */ nop
IL_0007: /* 2A | */ ret
} // end of method Program::.ctor
} // end of class ConsoleApp1.Program
As can be seen, the assembly that contains the type being referenced in a given namespace is resolved by the compiler, so the runtime does not need to search the manifests of all of the dependent assemblies. Instead, it looks for the assembly named in the square brackets and then uses the manifest of the currently running image to acquire more details. It then searches the Global Assembly Cache if the assembly is strong-named, and after that it searches certain directories for a .config file or for the assemblies themselves. On Mono, these directories are the directory containing the image currently being executed by the virtual machine and the directories in the MONO_PATH environment variable; the GAC is searched last.
The runtime loads an assembly dynamically when a type from it is first referenced, not only when an object of that type is instantiated; the load is deferred until that first reference. When you use DllImport, the native .dll is likewise loaded dynamically when the runtime first needs it. For internal calls, you would need to call, from C#, your own native function that dynamically loads a .dll using LoadLibrary and then uses GetProcAddress to bind the internal call, before making the real call.
An extern alias can be used to access an assembly that is referenced explicitly with /r:, e.g. /r:GridV1=grid.dll, then extern alias GridV1, and then GridV1::Namespace.Class. The :: operator accesses a member of an aliased namespace, where the alias is either the global alias, an extern alias, or an alias created by a using alias directive. Unlike a regular using or using static directive, these alias directives do not extend the global namespace.
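A minimal sketch of the extern alias usage just described; the alias names, assembly file names, and the SomeVendor.Grid type are all made up. Compile with something like csc /r:GridV1=grid10.dll /r:GridV2=grid20.dll Program.cs:
extern alias GridV1;   // must appear before any using directives
extern alias GridV2;

class Program
{
    static void Main()
    {
        // The :: operator picks the type out of one specific aliased assembly,
        // even though both assemblies define a type with the same full name.
        var oldGrid = new GridV1::SomeVendor.Grid();
        var newGrid = new GridV2::SomeVendor.Grid();
    }
}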
