Detecting dependency collisions - C#

Short version of the question:
Is there a good way to detect at build time whether two or more projects reference different versions of the same assembly? (Really, I would like to teach our CI server to do this.)
Long Version:
So here's an interesting problem (simplified a bit for easy digestion):
We recently encountered a situation where we had two projects in a solution, A and B. Both A and B depend upon a third-party NuGet package, C.
A always loads C; B only needs C in rare circumstances.
During this sprint, a developer updated project A to use the latest version of the C package (not realizing that B also depended upon C).
Everything built, and the tests that we had passed (we have insufficient test coverage), but when we released to production we had failures occurring when B attempted to use the dependency (loader issues, because B expected a different version of the strongly named assembly).
We found the problem and corrected it, but I would really love to be able to catch this during development. It would be even cooler if our build server (TFS 2012) could detect this when it does a CI build.
How might I go about detecting this situation?

VS can't do this for you because of the dynamic loading (unless I'm missing something): it just has no way of knowing which assemblies will be loaded at runtime.
We had the same problem once (using Prism - all our assemblies are normally loaded at application startup, in an order described in a config file, though most are optional). I first thought of making a small tool that simply scans all packages.config or csproj files to see which assemblies are used at which version, and complains when the same package is found at two different versions. But I ended up dealing with it at a higher level, which is more direct and foolproof: we now have a simple class, sort of a stub of the actual application, that just loads all the application's components and modules as described in the config file. This causes every assembly that could ever get loaded to actually be loaded, so if something is going to go wrong, it will be found. This functionality is simply placed in a unit test.
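For the first idea, a minimal sketch of such a scanner might look like the following; the solution-root argument and the console reporting are assumptions for illustration, and the same logic could just as well live inside a unit test:

    // Hypothetical sketch: scan every packages.config under a root folder and
    // report any NuGet package referenced at more than one version.
    using System;
    using System.IO;
    using System.Linq;
    using System.Xml.Linq;

    class PackageVersionScanner
    {
        static int Main(string[] args)
        {
            string root = args.Length > 0 ? args[0] : ".";

            var references =
                from file in Directory.EnumerateFiles(root, "packages.config", SearchOption.AllDirectories)
                from pkg in XDocument.Load(file).Descendants("package")
                select new
                {
                    Id = (string)pkg.Attribute("id"),
                    Version = (string)pkg.Attribute("version"),
                    File = file
                };

            var conflicts = references
                .GroupBy(r => r.Id)
                .Where(g => g.Select(r => r.Version).Distinct().Count() > 1)
                .ToList();

            foreach (var group in conflicts)
            {
                Console.WriteLine("Package '{0}' is referenced at different versions:", group.Key);
                foreach (var r in group)
                    Console.WriteLine("  {0} -> {1}", r.File, r.Version);
            }

            // A non-zero exit code makes it easy to fail a TFS build step on conflicts.
            return conflicts.Count == 0 ? 0 : 1;
        }
    }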

Deploy assembly as part of .Net Framework

I have an assembly (MYASM.dll) targeting .NET Framework 4.0 (with a strong name).
I want to deploy this assembly in a way that it is part of the .NET Framework (or the whole system thinks it is) on the target machine.
By that I mean:
1. The .NET runtime sees it as it sees System.dll (no need to deploy locally or provide a reference path).
2. MSBuild sees it when I do <Reference Include="MYASM" /> without needing a HintPath.
3. Users are able to do Add Reference in Visual Studio, and that introduces <Reference Include="MYASM" /> without the strong/full name.
I have solved 1. (and apparently 2.) by adding it to the GAC. But this is apparently not sufficient.
I have partially solved 3. by putting my assembly in a special folder ([INSTALLFOLDER]\lib) and setting the registry key HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0\AssemblyFoldersEx\MyAssemblies.
Then I can do Add Reference, but I get:
<Reference Include="MYASM, Version=1.1, Culture=neutral, ..." /> in my csproj instead of just <Reference Include="MYASM" /> as I'd like.
With the second approach, if I manually edit the csproj, everything is OK, but I can't ask my users to do that.
What should I do here?
[EDIT] Apparently it's not obvious, but I have my own MSI. I don't control users' machines with a magic wand.
No, you've taken this as far as it can go. It isn't actually that obvious how VS figures out that it should put the partial assembly name into the project file. This is not public code and can't be tampered with. I'm pretty sure it does not use a white-list, and it can't pay attention to the reference assembly location.
The most likely detail is the PublicKeyToken of the assembly. The framework assemblies always have the exact same value for it: b77a5c561934e089. Its value is even prescribed in the CLI spec (ECMA-335). Next most likely, by a considerable distance, is the signing certificate, identifying the assembly as owned by Microsoft. Both, however, present the exact same problem: you can't get the private key that is required to strong-name or sign the assembly. They are locked inside a vault in Redmond; only trusted build engineers have access to them.
There is another nasty little detail you are overlooking: you are not nearly scared enough of DLL Hell. The cold hard fact is that if you ever expose the assembly in the GAC on another machine that is not in your control, then you can never change it again. You can no longer modify the public interface of the assembly: you can't add a new public method or type, can't modify the arguments or return type of a method, can't add an enum member, etc. Even harsher, and something Microsoft worries about, is that you can't really change private and internal members either. Programmers have a knack for using Reflection to poke around; it's a terrific bug-fixing tool. But at least you can tell them "don't do that!".
Making such a modification requires increasing the [AssemblyVersion]. Now you get a different kind of DLL Hell: the machine might not have been updated by your installer. Or worse, a solution uses projects that have different references. Microsoft had to solve this problem for framework assemblies, and they did so by modifying the CLR to automatically forward old versions to new ones. That is the basic reason why an assembly built for .NET 2.0 can be used in a .NET 4.x project. You can't get that kind of service for your own DLL.
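The closest substitute for your own DLL is an explicit binding redirect in each consuming application's config file. A minimal sketch, where the version numbers and PublicKeyToken are placeholders rather than real values:

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <assemblyIdentity name="MYASM" publicKeyToken="0123456789abcdef" culture="neutral" />
            <!-- Forward every old version to the one actually deployed. -->
            <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="1.1.0.0" />
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>

Unlike the CLR's built-in forwarding for framework assemblies, this has to be maintained in every application that consumes the DLL.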
"Don't do it" is the only good advice, getting in DLL Hell trouble is however a terrific learning experience I can recommend for anybody. Hell has to be experienced to be feared.
The best advice is to publish a NuGet package. NuGet packages do the exact opposite: they are never deployed in the GAC and their version numbers change very rapidly. But they are always available when a programmer needs them.
There are a few ways...
1) Create a new setup package for the framework you target and have it deployed through the domain controller. When your users log in, the domain will update the packages; this way you'll be able to deploy your software to specific users and/or user groups. Depending on your infrastructure, you may already have a software management infrastructure that you can use.
2) Create a NuGet package if you're targeting developers. If your organisation hosts its own NuGet server, you can limit the distribution. To add the package source to Visual Studio, open the Options page, type NuGet in the search field and set the URL/UNC path.
3) Use ClickOnce deployment; this allows the application to download the updated DLLs and install them on the machine. It requires a code-signing certificate, but you're probably signing your code anyway (better for anti-virus tools if you do).
Now, linking your MYASM.dll will be done by the application's binding configuration or IoC container. Basically, if it finds the DLL and no version is defined, it will take the first one it finds (I think the order is 1. app folder, 2. GAC, 3. PATH, but I'm not sure). This "take what you find" behaviour is what is generally referred to as "DLL Hell". The NuGet and ClickOnce solutions work best here, as you will always get the updated DLL that works for the application. Placing the DLL in the GAC is going to get problematic if you have more than one application using your DLL and both need the "right" version, where the "right" version differs between them...
If you have the source code available for MYASM.dll, then I would prefer adding a project reference in your consuming application. When you do so, Visual Studio creates a project GUID for each referenced project.

Does readonly vs const make any sense if I live in the NuGet world?

I know the differences between const and readonly, and the side effects when I patch/deploy assemblies in which const values have changed without recompiling the referencing assemblies.
But I'm wondering whether I should bother about them when I live in the NuGet world.
My development flow goes through NuGet: if assembly A references assembly B and I want new changes from B, I just go ahead and update it via NuGet. I mean, I'm not patching/deploying DLLs to folders.
It looks like in NuGet scenarios these side effects are no longer an issue.
What do you think?
I guess you are locked into a scenario where your development process only considers new versions of your base DLL being deployed when all the referencing projects are recompiled... and I mean a full build.
That might not be the case if a developer upgrades the DLL via NuGet but does not change the code in his own project (so the compiler does not need to update the previously compiled objects).
I suggest you run some tests with this scenario, especially if you are relying on users upgrading these DLLs without any code modification on their side.
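To make the pitfall concrete, a tiny illustration (all names here are made up):

    // Library B, version 1.0:
    public static class BuildInfo
    {
        public const int ApiLevel = 1;                  // baked into referencing IL at compile time
        public static readonly int RuntimeApiLevel = 1; // read from B at runtime
    }

    // Assembly A, compiled against B 1.0:
    Console.WriteLine(BuildInfo.ApiLevel);        // 1, forever, until A is recompiled
    Console.WriteLine(BuildInfo.RuntimeApiLevel); // whatever version of B is loaded

If B 2.0 changes both values to 2 and the DLL is swapped in without A being rebuilt, the const still reports 1 while the readonly reports 2 - which is exactly the scenario the answer above suggests testing for.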

C# Dynamic compile and replace/reload of assembly from within same assembly

I have several issues with several SDKs coming from OEM manufacturers for specific devices. An SDK is usually based on a C or C++ DLL, so I have a lot of marshaling going around (a lot === YOU CAN'T EVEN IMAGINE). Problems start with the next version of an SDK: when they extend some functions or some structures, they effectively break compatibility. In the past I made a copy of our library supporting their device and started making changes to support the new SDK. But each time, our library supported only one specific SDK, and upgrades of our systems were tough (the installation script is one heavy thing as well, a ~3 GB install).
I have 78 projects in the solution, commonly 4-5 libraries for each OEM manufacturer, and this is without any service tools. And yesterday I said NO MORE. I started researching how to recompile C# code at runtime and reload/replace the same assembly without quitting the app.
And the result is the following:
- The class file that defines the external C/C++ DLL API was moved to an external project referencing only System.dll. And, me being insane, I already had each SDK version's changes wrapped in #if / #elif / #endif, so I could recompile the last version of our library to support a previous version of the SDK (though that was maybe done only once). I used #defines along with CSharpCodeProvider to recompile this assembly at runtime. The idea was like this:
Application loading ...
Open the main SDK file and get its file version (extract the version and identify it).
Load the original external assembly in a new AppDomain (so the domain can be destroyed later).
Extract the current version from the external assembly.
Destroy the new AppDomain to release the hook on the external assembly.
If the versions mismatch, recompile the external assembly (its source code is embedded within the parent assembly) and replace the original DLL with the just-compiled one.
Continue loading the application...
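A stripped-down sketch of the recompile step (step 5 above); the method, file names and the SDK define are placeholders, and error handling is omitted:

    // Hypothetical sketch: recompile the embedded wrapper source with the
    // #define matching the SDK version found on disk.
    using System;
    using System.CodeDom.Compiler;
    using Microsoft.CSharp;

    static class WrapperRecompiler
    {
        public static void Recompile(string wrapperSource, string targetDll, string sdkDefine)
        {
            using (var provider = new CSharpCodeProvider())
            {
                var parameters = new CompilerParameters
                {
                    GenerateExecutable = false,
                    OutputAssembly = targetDll,
                    // Selects the right struct layouts / P/Invoke signatures at compile time.
                    CompilerOptions = "/define:" + sdkDefine // e.g. "SDK_V2"
                };
                parameters.ReferencedAssemblies.Add("System.dll");

                CompilerResults results =
                    provider.CompileAssemblyFromSource(parameters, wrapperSource);

                if (results.Errors.HasErrors)
                    throw new InvalidOperationException("Wrapper recompilation failed.");
            }
        }
    }

The DLL swap has to happen before anything loads the wrapper into the default AppDomain, which is why the version probe in steps 2-4 runs in a separate AppDomain that is unloaded first.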
So far this test approach works on one live demo system, and I was amazed. Switching from one SDK to another was flawless, without any hiccups.
The code also recompiles itself only when the SDK version changes. So, with a safeguard in place, I could say this is the first metamorphic code I've written: it recompiles/changes itself at runtime.
Unfortunately, this approach requires me to add one more project for each OEM manufacturer's SDK, which effectively kills the reason I said NO MORE in the first place. True, I now have only two libraries to maintain per OEM manufacturer, and there will be no more projects added after this. But...
I wonder, is there a better approach that would allow me to replace the DLL of the currently loaded assembly at runtime, truly from within the same assembly? Or to change executing code on the fly each time? This mainly concerns marshaled functions, classes, structures, constants, ...
Please note that the code should be maintained from within the same project, without any externals. Also note that this project exposes only a hard-coded interface to the "outside" world (the interface lives in a referenced, interface-only project - it is more complex than I wrote). But this "outside" world is blind to any OEM-specific stuff, which was the point of using an interface: to have exactly the same behavior across any OEM device.
Any ideas? Thoughts? Suggestions?

Test compatibility between DLLs in .NET

I'm working with Visual Studio 2010 and WinForms, .NET 4.0 (C#). I'm building an application with a lot of DLLs (150). When I provide the application to my client, it consists of:
The executable (.exe)
DLL files (.dll)
Each DLL is related to a module of the application, for example:
Ado.dll (provides access to the database)
AccesManagement.dll (this module allows managing users in the application)
Import.dll (this module allows the user to import data into the application)
etc.
When my client finds a bug in the application, I correct it and I provide him with the impacted DLLs (so he doesn't have to retest the whole application). It can be, for example, the Import DLL.
The thing is, after some deliveries, we can have compatibility problems between DLLs (a method that doesn't exist anymore in a new DLL, for example). To avoid this problem, I would like to find a tool capable of checking compatibility between different DLLs.
I would like something like:
I specify the directory of the program to analyse (executable + DLLs)
I launch the analysis
The program tells me, for example: Error between Import.dll and Ado.dll; there is a class xxx in Import.dll expecting a method named xxx in the class xxx of Ado.dll
I've found some tools able to compare two versions of a DLL and report added and removed members (Libcheck, ApiChange), but doing it that way is too complicated for me because there are too many changes.
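For what it's worth, a rough sketch of a weaker but automatable check: verify that every assembly reference made by the shipped DLLs can be satisfied by the files in the delivery folder. This catches identity/version mismatches, not removed members; error handling is omitted and the names are illustrative:

    using System;
    using System.IO;
    using System.Linq;
    using System.Reflection;

    class DeliveryChecker
    {
        static void Main(string[] args)
        {
            string folder = args.Length > 0 ? args[0] : ".";

            // Map simple name -> identity of everything actually shipped.
            var shipped = Directory.EnumerateFiles(folder, "*.dll")
                .Select(AssemblyName.GetAssemblyName)
                .ToDictionary(n => n.Name, n => n);

            foreach (var file in Directory.EnumerateFiles(folder, "*.dll"))
            {
                var assembly = Assembly.ReflectionOnlyLoadFrom(file);
                foreach (var reference in assembly.GetReferencedAssemblies())
                {
                    AssemblyName found;
                    if (shipped.TryGetValue(reference.Name, out found) &&
                        found.Version < reference.Version)
                    {
                        Console.WriteLine("{0} expects {1} {2} but {3} is shipped",
                            Path.GetFileName(file), reference.Name,
                            reference.Version, found.Version);
                    }
                }
            }
        }
    }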
I think you may have a configuration management problem here -- at least as much as you've got a "compatibility" problem.
I'd recommend you find a way to track what versions of which assemblies each of your customers is using so that (1) you know what they're using when you decide what to ship, and (2) when they report bugs, you can replicate their setup (and thus, replicate their bug). If that sounds like a lot of work, it is. This is why a lot of software development shops take steps to ensure that there's a limit to the variation in setups among customers. It's nearly certain that you'll end up with some variation from customer-to-customer, but anything you can do to manage that problem will be beneficial.
Beyond the process implications, if you really need to create a "pluggable" environment, you probably need to create some interfaces for your objects to control the points where they connect, and you should probably look at Microsoft's Managed Extensibility Framework (MEF). MEF can help you manage the way objects "demand" behaviors from other objects.
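To make the MEF suggestion concrete, a minimal sketch of an import/export pair (the interface and class names are invented; in the question's terms, the contract would live in its own assembly, the export in Ado.dll and the import in Import.dll):

    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;

    // Contract assembly - the only thing both sides reference.
    public interface IDatabaseGateway
    {
        void Save(string record);
    }

    // In Ado.dll:
    [Export(typeof(IDatabaseGateway))]
    public class AdoDatabaseGateway : IDatabaseGateway
    {
        public void Save(string record) { /* ... */ }
    }

    // In Import.dll:
    public class Importer
    {
        [Import]
        public IDatabaseGateway Database { get; set; }
    }

    class Program
    {
        static void Main()
        {
            // Compose from every DLL sitting next to the executable.
            var catalog = new DirectoryCatalog(".");
            var container = new CompositionContainer(catalog);

            var importer = new Importer();
            container.ComposeParts(importer);
            importer.Database.Save("hello");
        }
    }

As long as the contract assembly stays stable, Ado.dll can be swapped out without rebuilding Import.dll.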
I finally found a solution to my problem.
Since I'm :
Using SourceSafe and adding labels with the version of the application I'm building
Tagging each of my DLLs with the version of the application
I built a program which is capable of:
Opening each DLL in a folder to read the version of the application in it
Getting each project from SourceSafe at the version specified in the DLL (with the "Get Label" functionality)
Then I just have to build the project. If there is any compilation error, there is a compatibility problem.
This solution can avoid big compatibility problems, but you can still have compatibility problems which can't be seen at compile time...

How do you package external libraries in your .Net projects?

A lot of my projects contain the Castle/NHibernate/Rhino-Tools stack. What's confusing about this is that Castle depends on some NHibernate libraries, NHibernate depends on some Castle libraries, and Rhino-Tools depends on both.
I've built all three projects on my machine, but I feel that copying the NHibernate/Castle libraries is a bit redundant since I built Rhino-Tools using the resulting libraries from my NHibernate and Castle builds.
Right now, I include all projects in separate folders in my /thirdparty/libs folder in my project tree. Should I simply just have /thirdparty/libs/rhino-tools in my project and use the Castle/NHibernate libs from there? That would seem to make logical sense in not duplicating files, but I also like having each project in its own distinct folder.
What are your views on this?
This is one of the problems that we're trying to tackle in the Refix open source project on CodePlex.
The idea is that Refix will parse all the projects in your solution, and before your project compiles, copy the necessary binaries from a single local repository on your machine into a folder within the solution tree and point the projects at them. This way, there's no need to commit the binaries. Your local Refix repository will pull binaries from a remote one (we're setting one up at repo.refixcentral.com), and you can set up an intermediate one for your team/department/company that can hold any additional software not held centrally.
It will also try to resolve conflicting version numbers - Visual Studio can be too forgiving of mismatched component version numbers, leading to solutions that compile but fall over at run time when they fail to load a dependency because two different versions would be needed.
So to answer the question "how do you package external libraries in your .Net projects", our vision is that you don't - you just include a Refix step in your build script, and let it worry about it for you.
I use a folder for each, which seems to be the convention.
Does it really make a difference if you're copying them?
What if you want to switch one out? Let's say you go with a new O/R mapper. It will be much easier to just delete the NHibernate folder than to selectively delete DLLs in your Rhino-Tools folder.
Take this to its logical conclusion and you won't have any folder organization in your lib folder, since everything uses log4net :)
Add additional probing paths to your app.config files to locate the dependency DLLs. This way you can get away with having just one copy of everything you want, though there are some quirks to using this feature (you must create the folder structure in a certain way). See the documentation of the <probing> element for more details.
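A minimal sketch of that configuration; the folder names are examples, and note that the privatePath folders must be subdirectories of the application base directory:

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <!-- Extra subfolders the runtime will probe for dependencies. -->
          <probing privatePath="thirdparty;thirdparty\rhino-tools" />
        </assemblyBinding>
      </runtime>
    </configuration>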
I will definitely recommend having a thirdparty or vendor folder in each of your project trees. If you find it annoying to have 32 copies of the rhino-tools package, you can have a single copy of it in your code repository and use external references to it in your project tree.
Let's say you are using SVN: you can make a repository called "thirdparty-libs" and keep versioned copies of the libs in it. You then set an external property on the "thirdparty" folder in your project tree, which in turn will automatically check out your centralized thirdparty libs. This way you only have to update in one place when a security fix or a bugfix comes out, but each project is still in command of choosing which thirdparty libs, and which versions, to use.
About the dependencies internal to the thirdparty libs, I wouldn't mind those. If, the first time you compile your project, some of the libs aren't copied to your bin folder because of implicit dependencies, you can add an external property on your bin folder, which will then automatically check out the missing libs. That way you still only have to update your thirdparty libs in one place.
