I'm hoping someone can help me with this. I have a solution that consists of two Web sites and two class libraries. One of the class libraries is a shared library used by many of our projects, so its output is stored in a shared location (D:\Applications\SharedLibraries\Bus_logic) so we can reference the dll from there. We have this directory structure on our local machines and on the build server.
This works perfectly fine on my local machine. Building the solution locally pushes the updated dll to the local D:\Applications\SharedLibraries\Bus_logic folder. Our old CCNet builds would do the very same on the build server.
However, with TFS the output path of a class library doesn't seem to matter. I have a CI build for the solution, and the class libraries never get copied to that path; they're just grouped together in the drop folder.
Is there any easy way to make sure the build copies those dlls to their rightful locations, or do I have to create a custom build template for every one of my solutions that compiles shared libraries so that the dlls get copied to the right directory?
I've been doing a bit with TFS builds recently, so I hope the following helps.
Before you do any of the following I recommend you create a new build definition and new build template (via Edit Build Definition -> Process -> New Template -> Copy from existing) to test this works.
TFS provides a custom path for the OutDir argument to MSBuild; the variable passed for this is called outputDirectory. The step where it is set is buried deep within the default build template. Open the template and navigate your way to Run On Agent -> Try Compile, Test and Associate -> Sequence -> Compile, Test and Associate -> Try Compile and Test -> Compile and Test -> For Each Configuration -> Compile and Test -> Initialize Variables. Once there you'll find a task called Initialize OutputDirectory, which by default is set to the 'BinariesDirectory/Platform/Configuration' folder. You could change this to your own custom logic.
An easier way might be to set the OutDir argument to nothing on the Run MSBuild task, as I assume this will then make each project use its own default output path. That task can be found here in the template: Run On Agent -> Try Compile, Test and Associate -> Sequence -> Compile, Test and Associate -> Try Compile and Test -> Compile and Test -> For Each Configuration -> Compile and Test -> If BuildSettings.HasProjectsToBuild -> For Each Project -> Try to Compile the Project -> Compile the Project.
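To see the difference in plain MSBuild terms, the default template's behaviour is roughly the first command below, while clearing OutDir gives you the second, where each project falls back to its own OutputPath (the solution name and paths here are purely illustrative):
rem Roughly what the default template does - every project's output lands in one binaries folder:
msbuild MySolution.sln /p:Configuration=Release /p:OutDir=C:\Builds\1\MyBuild\Binaries\
rem With OutDir left unset, each project builds to its own bin\Release (or custom output path):
msbuild MySolution.sln /p:Configuration=Release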
There is probably a more elegant way via tinkering with the solution or project files but I am not aware of that just yet.
There are really two good ways to make this work.
1. Check the DLLs in - create a folder in source control with the files you depend on checked in, then add a mapping in the build definition to get them to a well-known location.
2. Package the DLLs as a NuGet package and take a dependency on that (a rough sketch of the packaging step follows below).
Option 1 is cheap and cheerful, and prone to error, though not as much as your current solution. Option 2 is the right way to do things; NuGet was designed to solve these sorts of issues.
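As a rough sketch of option 2, assuming nuget.exe is on the path and \\buildserver\NuGetFeed is an internal folder feed (the project name, feed path and version number are purely illustrative):
rem Build a package from the shared library project after a Release build.
nuget pack Bus_Logic.csproj -Properties Configuration=Release
rem Publish by copying the package to the internal feed folder.
copy Bus_Logic.1.0.0.nupkg \\buildserver\NuGetFeed\
Consuming projects then reference the package from that feed instead of the raw DLL path.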
If I clean my project under Release, then build the project under Release, and then take the .dll from the bin folder, is that file going to be any different from the .dll generated by using the Publish feature with "Release" selected?
I am not an expert at reading MSBuild files, but it looks like there's no difference at all, because that appears to be exactly what MSBuild does.
You can pull up "%windir%\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets", find the Publish target, and trace through the dependencies to eventually find the _CopyFilesToPublishFolder target which does exactly what it says: copy everything covered by the OutputFiles property (among a bunch of other files) to the publish directory.
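If you would rather jump straight to it than trace the dependencies by hand, a quick search from a command prompt will locate the target (just a convenience, assuming a 64-bit .NET 4.0 install):
findstr /n "_CopyFilesToPublishFolder" "%windir%\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets"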
Well, provided that nothing in the source code changes, then yes, it should be the same. If something in the source (or a dependency) changes, then VS will likely rebuild the project before publishing.
I have some C# projects. I added a post-build event to those projects that copies the resulting assembly (dll) from the bin folder into a common folder.
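The post-build event is roughly along these lines (the destination path is illustrative and assumed to exist already):
rem Copy the freshly built assembly into the shared folder.
copy /Y "$(TargetPath)" "D:\SharedAssemblies\"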
It appears that each compile generates an assembly that is binarily different from the previous one, even when I don't modify the project files.
This is quite a problem for me, since I'm using Kiln, which monitors those files and thinks they were modified.
I read somewhere that the dll stores a timestamp of the compilation, which, if true, means I cannot fix this. If so, how do you manage your shared DLLs in such a way that your repository (Git/HG) doesn't commit all the compiled projects that weren't modified?
Thanks,
Eran.
To address the specific question of "How do you manage your shared DLL in such a way that your repository (Git/HG) doesn't commit all your compiled projects that weren't modified?", I have a very simple answer: ignore.
We exclude /bin and /obj from the directories which our source control will even attempt to commit. This does mean that you will need to recompile the code on each machine after each change, but Visual Studio would do that anyway for any project where the code has changed.
Don't commit the output folders of your projects.
If you want to have a Setup folder or something similar that always contains the latest versions of the assemblies created by your projects, the solution is to make sure that your post-build event is configured to run only when the build updates the project output. On the project's Build Events page there is a 'Run the post-build event' setting, which can be set to 'When the build updates the project output'.
Our solution has several (10+) C# projects. Each has a reference to the CAB extension library, with the reference pointing to the DLLs in the library's release folders. Each project has between four and seven such references.
We'd like to make some changes to the library; but to debug the changes, we'll need to build a debug version of the library and refer to that. I'd like to add the library's projects to our solution and change each of the DLL references to a project reference.
Is it possible to perform a 'find and replace' on the existing references, or will I have to do it by hand?
There isn't such a feature in the VS IDE.
However, as a .csproj file is just an XML document, it is possible to do such a global search and replace in a scripted fashion, e.g. by changing one file to observe the before and after states and then running sed over the remainder.
For a one-off, going to the extent of writing a script to load the XML and making the substitutions by DOM manipulation is probably overkill.
Take a look at Jared's answer to this SO thread. That approach will likely work for you.
If you download CI Factory, it just so happens that there is a NAnt function in there called FixUpThirdPartyRefs which you could use or tweak to help you do this. So you could just set up NAnt and use that function.
It is part of the power tools with CI Factory: http://www.cifactory.org/joomla/index.php?option=com_content&task=view&id=29&Itemid=41
Why don't you just temporarily replace the DLLs in the library's release folder with the debug versions? I assume that you have a local development environment.
EDIT:
You could:
1. develop all the time with the debug version of the library
2. make updating the references in *.csproj more flexible
3. make the file system location of the library files more flexible
On point 3: if the path to your library DLLs contains "release", and the debug and release folder structures are the same, then the switch away from release could be made by just renaming the folder "release" to "release.original" and "debug" to "release".
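A minimal sketch of that folder swap, assuming the library lives under D:\Libraries\CAB (an illustrative path):
rem Switch the "Release" folder over to the debug binaries.
ren "D:\Libraries\CAB\Release" "Release.original"
ren "D:\Libraries\CAB\Debug" "Release"
rem To switch back, reverse the two renames.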
I would probably choose option 1 and develop with debug assemblies all the time. The release build would be used just for final testing and deployment to the customer. Debug and release DLLs are not that different.
Do you use ILMerge? Do you use ILMerge to merge multiple assemblies to ease deployment of DLLs? Have you found problems with deployment/versioning in production after ILMerging assemblies together?
I'm looking for some advice in regards to using ILMerge to reduce deployment friction, if that is even possible.
I use ILMerge for almost all of my different applications. I have it integrated right into the release build process, so what I end up with is one .exe per application with no extra DLLs.
You can't ILMerge any C++ assemblies that have native code.
You also can't ILMerge any assemblies that contain XAML for WPF (at least I haven't had any success with that). It complains at runtime that the resources cannot be located.
I did write a wrapper executable for ILMerge where I pass in the startup exe name for the project I want to merge and an output exe name; it then reflects over the dependent assemblies and calls ILMerge with the appropriate command-line parameters. It is much easier now when I add new assemblies to the project, as I don't have to remember to update the build script.
Introduction
This post shows how to replace all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
For Console Apps
Here is the basic Post Build String for Visual Studio 2010 SP1, using .NET 4.0. I am building a console .exe with all of the sub-.dll files included in it.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
Basic hints
The output is a file "AssemblyName.all.exe" which combines all sub-dlls into one .exe.
Notice the ILMerge\ directory. You need to either copy the ILMerge utility into your solution directory (so you can distribute the source without having to worry about documenting the install of ILMerge), or change this path to point to where ILMerge.exe resides.
Advanced hints
If you have problems with it not working, turn on the Output window and select Show output from: Build. Check the exact command that Visual Studio actually generated, and check it for errors.
Sample Build Script
This script replaces all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
To use, paste this into your Post Build step, under the Build Events tab in a C# project, and make sure you adjust the path in the first line to point to ILMerge.exe:
rem Create a single .exe that combines the root .exe and all subassemblies.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
rem Remove all subassemblies.
del *.dll
rem Remove all .pdb files (except the new, combined pdb we just created).
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).all.pdb.temp"
del *.pdb
ren "$(TargetDir)$(TargetName).all.pdb.temp" "$(TargetName).all.pdb"
rem Delete the original, non-combined .exe.
del "$(TargetDir)$(TargetName).exe"
rem Rename the combined .exe and .pdb to the original project name we started with.
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).pdb"
ren "$(TargetDir)$(TargetName).all.exe" "$(TargetName).exe"
exit 0
We use ILMerge on the Microsoft application blocks - instead of 12 separate DLL files, we have a single file that we can upload to our client areas, plus the file system structure is a lot neater.
After merging the files, I had to edit the Visual Studio project references, remove the 12 separate assemblies and add the single file as a reference, otherwise it would complain that it couldn't find the specific assembly. I'm not too sure how this would work post-deployment though; could be worth giving it a try.
I know this is an old question, but we not only use ILMerge to reduce the number of dependencies but also to internalise the "internal" dependencies (e.g. AutoMapper, RestSharp, etc.) that are used by the utility. This means they are completely abstracted away, and the project using the merged utility doesn't need to know about them. This again reduces the required references in the project, and allows it to use/update its own version of the same external library if required.
We use ILMerge on quite a few projects. The Web Service Software Factory, for example produces something like 8 assemblies as its output. We merge all of those DLLs into a single DLL so that the service host will only have to reference one DLL.
It makes life somewhat easier, but it's not a big deal either.
We had the same problem with combining WPF dependencies; ILMerge doesn't appear to deal with these. Costura.Fody worked perfectly for us, however, and took about 5 minutes to get going... a very good experience.
Just install it with NuGet (selecting the correct default project in the Package Manager Console). It introduces itself into the target project, and the default settings worked immediately for us.
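For reference, that Package Manager Console step is just the standard install command:
Install-Package Costura.Fody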
It merges all the DLLs marked "Copy Local" = true and produces a merged .exe (alongside the standard output), which is nicely compressed in size (much less than the total output size).
The license is MIT, so you can modify/distribute as required.
https://github.com/Fody/Costura/
Note that for Windows GUI programs (e.g. WinForms) you'll want to use the /target:winexe switch; the /target:exe switch creates a merged console application.
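For example, the post-build command shown earlier becomes something like this for a WinForms project (same paths and assumptions as above):
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:winexe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards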
I'm just starting out using ILMerge as part of my CI build to combine a lot of finely grained WCF contracts into a single library. It works very well, however the new merged lib can't easily co-exist with its component libraries, or other libs that depend on those component libraries.
If, in a new project, you reference both your ILMerged lib and also a legacy library that depends on one of the inputs you gave to ILMerge, you'll find that you can't pass any type from the ILMerged lib to any method in the legacy library without doing some sort of type mapping (e.g. automapper or manual mapping). This is because once everything's compiled, the types are effectively qualified with an assembly name.
The names will also collide but you can fix that using extern alias.
My advice would be to avoid including in your merged assembly any publicly available lib that your merged assembly exposes (e.g. via a return type, method/constructor parameter, field, property, generic...) unless you know for sure that the user of your merged assembly does not and will never depend on the free-standing version of the same library.
We ran into problems when merging DLLs that have resources in the same namespace. In the merging process one of the resource namespaces was renamed and thus the resources couldn't be located. Maybe we're just doing something wrong there, still investigating the issue.
We just started using ILMerge in our solutions that are redistributed and used in our other projects and so far so good. Everything seems to work okay. We even obfuscated the packaged assembly directly.
We are considering doing the same with the MS Enterprise Library assemblies.
The only real issue I see with it is versioning of individual assemblies from the package.
I recently had an issue where I had an ILMerged assembly; in the assembly I had some classes that were being called via reflection in the Umbraco open-source CMS.
The information to make the call via reflection was taken from a database table that held the assembly name and the namespace of a class that implemented an interface. The issue was that the reflection call would fail when the DLL was ILMerged, but if the DLL was separate it all worked fine. I think the issue may be similar to the one longeasy is having?
It seems to me like the #1 ILMerge Best Practice is Don't Use ILMerge. Instead, use SmartAssembly. One reason for this is that the #2 ILMerge Best Practice is to always run PEVerify after you do an ILMerge, because ILMerge does not guarantee it will correctly merge assemblies into a valid executable.
Other ILMerge disadvantages:
when merging, it strips XML Comments (if I cared about this, I would use an obfuscation tool)
it doesn't correctly handle creating a corresponding .pdb file
Other tools worth paying attention to are Mono.Cecil and the Mono.Linker [2] tool.
[2]: http://www.mono-project.com/Linker