What is the purpose of the *.deps.json file in .NET Core?
What is the reason to store references in such a file and not in the assembly manifest (as in the standalone .NET Framework)?
Using Ildasm, I checked that the assembly manifest doesn't contain entries for these dependencies after the dotnet build command.
But it does have entries after the dotnet publish command.
The .deps.json file contains metadata about the assemblies referenced by the built assembly and the locations to search for them as well as information about the compilation options used.
This information is read by the native component (corehost) that loads and configures the runtime. When a referenced assembly needs to be loaded, the host will use the information in this file (as well as any runtimeconfig.json/runtimeconfig.dev.json) to locate the correct assembly to load.
This information is used in other places as well. For example, ASP.NET Core's Razor view compilation uses it to pass the correct references and configuration to the generated code, and unit test hosts need the information in this file when a unit test library is loaded into the test host. The managed API to read and write this file is available in the Microsoft.Extensions.DependencyModel NuGet package.
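As a minimal sketch (assuming the Microsoft.Extensions.DependencyModel package is referenced), the runtime libraries recorded in the .deps.json of a running app can be enumerated like this:

using System;
using Microsoft.Extensions.DependencyModel;

class DepsDemo
{
    static void Main()
    {
        // DependencyContext.Default is populated from the entry assembly's .deps.json.
        DependencyContext context = DependencyContext.Default;

        // Each RuntimeLibrary entry corresponds to a package or project the app depends on.
        foreach (RuntimeLibrary library in context.RuntimeLibraries)
        {
            Console.WriteLine($"{library.Name} {library.Version} ({library.Type})");
        }
    }
}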
Though there are many articles about Pipeline Component deployment, I don't understand how to do it. I have built the solution and I have the .dll file. How do I GAC the Pipeline Component assembly?
Where do I find the assembly? Is the .dll file the assembly? I am new to .NET and C# and I don't understand the terminology. Can anybody help me with the details?
This is a good way of looking at it (taken from Difference Between Assembly and DLL):
An assembly is .NET's "minimum unit of deployment". Usually an assembly corresponds to a single file, but it doesn't have to - you can have multiple files, with one of them being the master which knows where all the other bits are.
Single-file assemblies are usually DLLs or EXE files. If you've got a normal class library and you just want to send it to the other side, the DLL is what you want. I'd only worry about more complicated scenarios as and when you run into them :)
In your case, the DLL is the assembly.
To deploy your custom pipeline component, you would need to:
1) Add it to the Global Assembly Cache (use the .NET 4.0 gacutil.exe; see the example command after this list)
Check here for more information: http://msdn.microsoft.com/en-us/library/dkkx7f79(v=vs.100).aspx
2) Copy it to the Pipeline Components folder (by default C:\Program Files (x86)\Microsoft BizTalk Server 2010\Pipeline Components)
3) Restart the Host Instance (likely BizTalkServerApplication in your case) and deploy your new pipeline which makes use of the pipeline component.
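For step 1, a typical command looks like the one below. The SDK path and the DLL name are only examples, and the assembly must be strong-named (signed) before it can go into the GAC. Run it from an elevated command prompt:

"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\gacutil.exe" /i "C:\MyProject\bin\Release\MyPipelineComponent.dll"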
I've been creating various plugins for an application that requires me to produce a .tlb file. In the past, it has simply been a case of configuring my project's build properties to 'Register for COM interop', thereby producing a .tlb file along with my output dll. Previously, when using the Visual Studio 2010 installer projects template, this would always correctly register the .tlb during installation on the target machine.
I've recently attempted to make the switch to Visual Studio 2012 and use the InstallShield LE project to produce my installer, but it doesn't seem to register the type library during the install, nor does the Express edition seem to allow me to manually register via the cmd-line regasm route - or at least it's not that obvious to me.
In the InstallShield project options I had to manually add the .tlb application file (from the build's \release folder) to the list of files to be included in the installer as it doesn't seem to get included along with the files produced by the project output or content options. In the .tlb file's 'COM & .Net Settings' properties, I have it configured to Registration Type: 'Extract COM Information' and have enabled 'COM Interop'.
What am I missing?
Try this thread for a good technical description: Are *.tlb files ever used at runtime?
In most cases I believe the *.tlb file is not needed, because it is already compiled into most binaries, whether the binary is an exe file or a dll. But .NET Interop is a lot more complex, and as the other thread explains, the *.tlb file can be needed to deal with advanced communication issues between threads and processes - something that I had incidentally forgotten.
When implementing a setup I have never had the need to register a *.tlb file by itself, it has been enough to register the corresponding binary (exe/dll), but this all depends on the use cases for the product.
Be aware that the way InstallShield registers for COM Interop is not always the best option, as far as I recall. I am not sure exactly what they are doing, but I would check and compare with a normal regasm.exe registration:
http://msdn.microsoft.com/en-us/library/tzat5yw6(v=vs.110).aspx. And read the linked thread (Are *.tlb files ever used at runtime?) carefully on the issues of proxies, stubs and marshalling in case you need these features supported.
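For reference, a manual registration with the .NET 4.0 regasm.exe looks roughly like this (the assembly and .tlb names are placeholders). /tlb exports and registers the type library, and /codebase records the assembly's location so it does not have to be in the GAC:

"%WINDIR%\Microsoft.NET\Framework\v4.0.30319\regasm.exe" MyPlugin.dll /tlb:MyPlugin.tlb /codebase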
Here are some further links with concise information on COM/DCOM:
Contents of a Type Library: http://msdn.microsoft.com/en-us/library/windows/desktop/ms221355(v=vs.85).aspx
COM, DCOM, and Type Libraries: http://msdn.microsoft.com/en-us/library/windows/desktop/aa366757(v=vs.85).aspx
Files Generated for a COM Interface: http://msdn.microsoft.com/en-us/library/windows/desktop/aa366830(v=vs.85).aspx
I have created a dll that will be used by multiple applications, and have created an installer package that installs it to the program files, as well as adds it to the Global Assembly Cache.
The dll itself uses log4net, and requires a xml file for the logging definitions.
Therefore when the installer is run, the following files get copied to the install directory within program files:
- The main dll that I developed
- The Log4Net.dll
- The Log4Net.xml file
I am now experiencing a problem. I have created a test console application for experimentation. I have added my dll as a reference, and set the 'local copy' flag to false.
When I compile the test console exe, however, I notice that it has copied the log4net.dll and log4net.xml files to the bin directory. And when running the test console, it appears that it will only work if log4net.dll is in the same directory as the exe. This is despite the fact that the test console application does not use log4net; only the dll that was added as a reference does.
Is there some way to have the log4net.dll & xml files used be the ones that were installed to Program Files, rather than every application needing to copy over local copies? The applications that will be using my dll will not be using log4net; only the dll that they are referencing uses it.
Many thanks
Don't install into the Global Assembly Cache! Even if your library dll is used by multiple applications, each should have its own local copy. Otherwise you get into a whole world of pain for saving a few KB of disk space.
Always copy the required dlls locally. If you are really sure that the application won't need them, you can simply delete the unnecessary dlls later or not include them in the installer. But if your application calls ANY reference to them, it will crash at runtime. So the best option is to leave them there (after all, they WERE referenced for a reason).
No, it's not possible (at least not without much effort) to have .NET load dlls from arbitrary locations on the disk. And it should be this way (look up DLL hell if you want to know why).
I suspect your problem is the configuration. You must use fully qualified names if you want it to work from the GAC. As per the documentation at http://logging.apache.org/log4net/release/faq.html:
"When loading an assembly from the GAC the fully qualified assembly name, including the version, culture and public key must be specified. This is in the standard syntax supported by System.Type.GetType. See the next FAQ on how to get the version and public key for an assembly."
I managed to resolve this by adding Log4net.dll to the GAC as well. It will now run without needing a local copy of the dll.
It does, however, require a local copy of the XML file to log correctly.
Do you use ILMerge? Do you use ILMerge to merge multiple assemblies to ease deployment of dlls? Have you found problems with deployment/versioning in production after ILMerging assemblies together?
I'm looking for some advice on using ILMerge to reduce deployment friction, if that is even possible.
I use ILMerge for almost all of my different applications. I have it integrated right into the release build process so what I end up with is one exe per application with no extra dll's.
You can't ILMerge any C++ assemblies that have native code.
You also can't ILMerge any assemblies that contain XAML for WPF (at least I haven't had any success with that). It complains at runtime that the resources cannot be located.
I did write a wrapper executable for ILMerge where I pass in the startup exe name for the project I want to merge and an output exe name, and it then reflects over the dependent assemblies and calls ILMerge with the appropriate command-line parameters. It is much easier now: when I add new assemblies to the project, I don't have to remember to update the build script.
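For what it's worth, a stripped-down sketch of that wrapper idea (paths, names and argument handling here are hypothetical, and only direct references sitting next to the exe are picked up):

using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Reflection;

class ILMergeWrapper
{
    static void Main(string[] args)
    {
        string inputExe = Path.GetFullPath(args[0]);   // e.g. MyApp.exe
        string outputExe = args[1];                    // e.g. MyApp.merged.exe
        string dir = Path.GetDirectoryName(inputExe);

        // Reflect over the exe's referenced assemblies and keep only those that
        // actually sit in the same folder (framework assemblies are skipped).
        var localDlls = Assembly.LoadFrom(inputExe)
            .GetReferencedAssemblies()
            .Select(name => Path.Combine(dir, name.Name + ".dll"))
            .Where(File.Exists)
            .ToList();

        string arguments = "/target:exe /out:\"" + outputExe + "\" \"" + inputExe + "\" "
                           + string.Join(" ", localDlls.Select(p => "\"" + p + "\""));

        // Call ILMerge with the assembled command line (adjust the path to your copy).
        Process.Start(@"C:\Tools\ILMerge\ILMerge.exe", arguments).WaitForExit();
    }
}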
Introduction
This post shows how to replace all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
For Console Apps
Here is the basic Post Build String for Visual Studio 2010 SP1, using .NET 4.0. I am building a console .exe with all of the sub-.dll files included in it.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
Basic hints
The output is a file "AssemblyName.all.exe" which combines all sub-dlls into one .exe.
Notice the ILMerge\ directory. You need to either copy the ILMerge utility into your solution directory (so you can distribute the source without having to worry about documenting the install of ILMerge), or change this path to point to where ILMerge.exe resides.
Advanced hints
If you have problems with it not working, turn on Output, and select Show output from: Build. Check the exact command that Visual Studio actually generated, and check for errors.
Sample Build Script
This script replaces all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
To use, paste this into your Post Build step, under the Build Events tab in a C# project, and make sure you adjust the path in the first line to point to ILMerge.exe:
rem Create a single .exe that combines the root .exe and all subassemblies.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
rem Remove all subassemblies.
del *.dll
rem Remove all .pdb files (except the new, combined pdb we just created).
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).all.pdb.temp"
del *.pdb
ren "$(TargetDir)$(TargetName).all.pdb.temp" "$(TargetName).all.pdb"
rem Delete the original, non-combined .exe.
del "$(TargetDir)$(TargetName).exe"
rem Rename the combined .exe and .pdb to the original project name we started with.
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).pdb"
ren "$(TargetDir)$(TargetName).all.exe" "$(TargetName).exe"
exit 0
We use ILMerge on the Microsoft application blocks - instead of 12 separate DLL files, we have a single file that we can upload to our client areas, plus the file system structure is a lot neater.
After merging the files, I had to edit the Visual Studio project references, remove the 12 separate assemblies and add the single file as a reference, otherwise it would complain that it couldn't find the specific assembly. I'm not too sure how this would work post-deployment though; it could be worth giving it a try.
I know this is an old question, but we not only use ILMerge to reduce the number of dependencies but also to internalise the "internal" dependencies (e.g. AutoMapper, RestSharp, etc.) that are used by the utility. This means they are completely abstracted away, and the project using the merged utility doesn't need to know about them. This again reduces the required references in the project, and allows it to use / update its own version of the same external library if required.
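With ILMerge itself, that internalising is done with the /internalize switch. A rough example (the assembly names are placeholders):

ILMerge.exe /target:library /internalize /out:MyUtility.Merged.dll MyUtility.dll AutoMapper.dll RestSharp.dll

/internalize marks the types from the non-primary assemblies as internal, so the merged dependencies no longer show up in the public surface of MyUtility.Merged.dll (an optional exclusion file can keep selected types public).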
We use ILMerge on quite a few projects. The Web Service Software Factory, for example, produces something like 8 assemblies as its output. We merge all of those DLLs into a single DLL so that the service host will only have to reference one DLL.
It makes life somewhat easier, but it's not a big deal either.
We had the same problem with combining WPF dependencies... ILMerge doesn't appear to deal with these. Costura.Fody worked perfectly for us, however, and took about 5 minutes to get going... a very good experience.
Just install with NuGet (selecting the correct default project in the Package Manager Console). It introduces itself into the target project and the default settings worked immediately for us.
It merges all the DLLs marked "Copy Local" = true and produces a merged .EXE (alongside the standard output), which is nicely compressed in size (much less than the total output size).
The license is MIT, so you can modify/distribute as required.
https://github.com/Fody/Costura/
Note that for Windows GUI programs (e.g. WinForms) you'll want to use the /target:winexe switch. The /target:exe switch creates a merged console application.
I'm just starting out using ILMerge as part of my CI build to combine a lot of finely grained WCF contracts into a single library. It works very well; however, the new merged lib can't easily co-exist with its component libraries, or with other libs that depend on those component libraries.
If, in a new project, you reference both your ILMerged lib and also a legacy library that depends on one of the inputs you gave to ILMerge, you'll find that you can't pass any type from the ILMerged lib to any method in the legacy library without doing some sort of type mapping (e.g. automapper or manual mapping). This is because once everything's compiled, the types are effectively qualified with an assembly name.
The names will also collide, but you can fix that using extern alias.
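A hypothetical sketch of that alias fix: suppose both the merged assembly and the free-standing ContractsLib (both names invented here) define a type called Contracts.Order. After giving the merged reference an alias (set its Aliases property to "merged" in the reference's properties in Visual Studio), the two can be told apart in code:

extern alias merged;   // maps to the reference whose Aliases property was set to "merged"

using LegacyOrder = Contracts.Order;           // the type from the free-standing ContractsLib
using MergedOrder = merged::Contracts.Order;   // the copy compiled into the merged assembly

class OrderBridge
{
    // The CLR treats these as two unrelated types even though the source is identical,
    // which is why values have to be mapped across the boundary by hand (or with a mapper).
    static MergedOrder Convert(LegacyOrder source)
    {
        return new MergedOrder { /* copy the relevant fields/properties here */ };
    }
}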
My advice would be to avoid including in your merged assembly any publicly available lib that your merged assembly exposes (e.g. via a return type, method/constructor parameter, field, property, generic...) unless you know for sure that the user of your merged assembly does not and will never depend on the free-standing version of the same library.
We ran into problems when merging DLLs that have resources in the same namespace. In the merging process one of the resource namespaces was renamed and thus the resources couldn't be located. Maybe we're just doing something wrong there, still investigating the issue.
We just started using ILMerge in our solutions that are redistributed and used in our other projects and so far so good. Everything seems to work okay. We even obfuscated the packaged assembly directly.
We are considering doing the same with the MS Enterprise Library assemblies.
The only real issue I see with it is versioning of individual assemblies from the package.
I recently had an issue with an ILMerged assembly: it contained some classes that were called via reflection in the Umbraco open-source CMS.
The information to make the call via reflection was taken from a DB table that held the assembly name and the namespace of the class that implemented an interface. The issue was that the reflection call would fail when the DLL was ILMerged, but when the DLL was kept separate it all worked fine. I think the issue may be similar to the one longeasy is having?
It seems to me like the #1 ILMerge Best Practice is Don't Use ILMerge. Instead, use SmartAssembly. One reason for this is that the #2 ILMerge Best Practice is to always run PEVerify after you do an ILMerge, because ILMerge does not guarantee it will correctly merge assemblies into a valid executable.
Other ILMerge disadvantages:
- when merging, it strips XML Comments (if I cared about this, I would use an obfuscation tool)
- it doesn't correctly handle creating a corresponding .pdb file
Other tools worth paying attention to are Mono.Cecil and the Mono.Linker [2] tool.
[2]: http://www.mono-project.com/Linker