I develop a library named CompanyName.SDK that must be integrated into the company solution CompanyName.SomeSolution.
CompanyName.SDK.dll must be deployed via a NuGet package.
The CompanyName.SDK package has dependencies on third-party NuGet packages. As an example, let's take Unity; the current dependency is on Unity v3.5.1405-prerelease.
CompanyName.SomeSolution.Project1 depends on Unity v2.1.505.2.
CompanyName.SomeSolution.Project2 depends on Unity v3.0.1304.1.
Integrating CompanyName.SDK into this solution adds a dependency on Unity v3.5.1405-prerelease.
Suppose CompanyName.SomeSolution has one runnable output project, CompanyName.SomeSolution.Application, which depends on the two projects above and on CompanyName.SDK.
And here the problems begin. The Unity assemblies have identical names in all packages, with no version specifier, and the target folder will contain only one version of the Unity assemblies: v3.5.1405-prerelease, via a bindingRedirect in app.config.
How can the code in Project1, Project2 and the SDK use exactly the versions of the dependent packages they were coded, compiled and tested against?
NOTE1: Unity is just an example; the real situation is ten times worse, with third-party modules depending on other third-party modules which in turn are present in 3-4 versions simultaneously.
NOTE2: I cannot upgrade all packages to their latest versions, because some packages depend on non-latest versions of other packages.
NOTE3: Assume the dependent packages have breaking changes between versions. That is the real reason I'm asking this question.
NOTE4: I know about the existing question on conflicts between different versions of the same dependent assembly, but the answers there do not address the root of the problem - they just hide it.
NOTE5: Where the hell is that promised solution to the "DLL Hell" problem? It just reappears from another angle.
NOTE6: If you think that using the GAC is somehow an option, then please write a step-by-step guide or give me a link.
The Unity package isn't a good example, because you should use it in only one place, the Composition Root, and the Composition Root should be as close as possible to the application entry point. In your example that is CompanyName.SomeSolution.Application.
Apart from that, exactly the same problem appears where I work now, and what I see is that it is often introduced by cross-cutting concerns like logging. The solution you can apply is to convert your third-party dependencies into first-party dependencies by introducing abstractions for those concepts. Doing this has other benefits as well:
more maintainable code
better testability
get rid of unwanted dependencies (does every client of CompanyName.SDK really need the Unity dependency?)
So, let's take an imaginary .NET Logging library as an example:
CompanyName.SDK.dll depends on .NET Logging 3.0
CompanyName.SomeSolution.Project1 depends on .NET Logging 2.0
CompanyName.SomeSolution.Project2 depends on .NET Logging 1.0
There are breaking changes between versions of .NET Logging.
You can create your own first-party dependency by introducing an ILogger interface:
public interface ILogger
{
    void LogWarning(string message);
    void LogError(string message);
    void LogInfo(string message);
}
CompanyName.SomeSolution.Project1 and CompanyName.SomeSolution.Project2 should use the ILogger interface; they now depend on a first-party abstraction. You keep the .NET Logging library behind one place, and updating it is easy because you only have to do it in that one place. Breaking changes between versions are also no longer a problem, because only one version of the .NET Logging library is used.
The actual implementation of the ILogger interface should live in a separate assembly, and that assembly should be the only place where you reference the .NET Logging library.
In CompanyName.SomeSolution.Application, in the place where you compose your application, you now map the ILogger abstraction to the concrete implementation, for example as sketched below.
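For illustration, a minimal sketch of that adapter and the composition-root mapping; DotNetLogging.Logger and its Info/Warn/Error methods are made-up stand-ins for the imaginary library:

// The ONLY assembly that references the imaginary ".NET Logging" 3.0 package.
public class DotNetLoggingAdapter : ILogger
{
    // Made-up type standing in for the third-party logger.
    private readonly DotNetLogging.Logger _logger = new DotNetLogging.Logger();

    public void LogWarning(string message) { _logger.Warn(message); }
    public void LogError(string message)   { _logger.Error(message); }
    public void LogInfo(string message)    { _logger.Info(message); }
}

// In CompanyName.SomeSolution.Application (the Composition Root), map the
// abstraction to the concrete implementation - here with Unity, since that
// is the container from the question:
public static class CompositionRoot
{
    public static IUnityContainer Configure()
    {
        var container = new UnityContainer();   // Microsoft.Practices.Unity
        container.RegisterType<ILogger, DotNetLoggingAdapter>();
        return container;
    }
}

Project1 and Project2 then take an ILogger (for example via constructor injection) and never reference the third-party logging library directly.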
We are using that approach, and we also use NuGet to distribute our abstractions and our implementations. Unfortunately, version issues can appear with your own packages too. To avoid them, apply Semantic Versioning to the packages you deploy via NuGet within your company. If something changes in the code base that is distributed via NuGet, you should bump the version of all of the packages that are distributed via NuGet. For example, on our local NuGet server we have:
DomainModel
Services.Implementation.SomeFancyMessagingLibrary (that references DomainModel and SomeFancyMessagingLibrary)
and more...
The versions of these packages are synchronized: if the version changes in DomainModel, Services.Implementation.SomeFancyMessagingLibrary gets the same version. If one of our applications needs an update of our internal packages, all dependencies are updated to the same version.
You can work at the post-compilation assembly level to solve this issue with...
Option 1
You could try merging the assemblies with ILMerge
ilmerge /target:winexe /out:SelfContainedProgram.exe Program.exe ClassLibrary1.dll ClassLibrary2.dll
The result will be an assembly that is the sum of your project and its required dependencies. This comes with some drawbacks, like sacrificing Mono support and losing assembly identities (name, version, culture, etc.), so it is best when all the assemblies to be merged are built by you.
So here comes...
Option 2
You can instead embed the dependencies as resources within your projects as described in this article. Here is the relevant part:
At run-time, the CLR won’t be able to find the dependent DLL assemblies, which is a problem. To fix this, when your application initializes, register a callback method with the AppDomain’s AssemblyResolve event. The code should look something like this:
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) => {
    String resourceName = "AssemblyLoadingAndReflection." +
        new AssemblyName(args.Name).Name + ".dll";
    using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName)) {
        Byte[] assemblyData = new Byte[stream.Length];
        stream.Read(assemblyData, 0, assemblyData.Length);
        return Assembly.Load(assemblyData);
    }
};
Now, the first time a thread calls a method that references a type in a dependent DLL file, the AssemblyResolve event will be raised and the callback code shown above will find the embedded DLL resource desired and load it by calling an overload of Assembly’s Load method that takes a Byte[] as an argument.
I think this is the option I would use if I were in your shoes, sacrificing some initial startup time.
Update
Have a look here. You could also try using the <probing> element in each project's app.config to define custom sub-folders for the CLR to look in when it searches for assemblies.
Related
I'm building a package for my project in C#, and I want to reuse it across different projects.
Some utility methods/classes in my package depend on another package (say Rx.NET). But I know that I will use this package in projects that don't have Rx.NET installed.
For example, I have a class EmailSender, which has callback-style methods (void Send(attrs, Action clb)) as well as Rx-based methods (IObservable SendAsObservable(attrs)).
Could I do something like
#IF PACKAGE EXISTS RX.NET
# ENDIF
so that parts of my code are ignored if the package does not exist?
What is the best way to accomplish something like this without building two separate packages with duplicate class names, etc.?
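To make the idea concrete, real C# conditional compilation would look roughly like the sketch below. The HAS_RX symbol is made up and would have to be defined manually (for example in the build configuration of projects that do reference Rx.NET); the compiler cannot detect installed packages by itself.

using System;
#if HAS_RX
using System.Reactive.Linq;   // only needed when the Rx-based members are compiled in
#endif

public class EmailSender
{
    // Callback-style method, always available.
    public void Send(string to, Action onSent)
    {
        // ... send the mail, then invoke the callback
        onSent();
    }

#if HAS_RX
    // Rx-based method, compiled only when HAS_RX is defined.
    public IObservable<string> SendAsObservable(string to)
    {
        // ... send the mail and surface the result as an observable
        return Observable.Return(to);
    }
#endif
}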
Investigate using a fake assembly in your project's references: adding a fake reference can resolve missing DLLs: Code generation, compilation, and naming conventions in Microsoft Fakes
Example of faking individual methods:
Using Microsoft Fakes with mscorlib.dll
I have an assembly (MYASM.dll) targeting .NET Framework 4.0 (with a strong name).
I want to deploy this assembly in such a way that it is part of the .NET Framework (or the whole system thinks it is) on the target machine.
By that I mean:
The .NET runtime sees it as it sees System.dll (no need to deploy it locally or provide a reference path)
MSBuild sees it when I do <Reference Include="MYASM" /> without needing a HintPath
The user can do Add Reference in Visual Studio, and that introduces <Reference Include="MYASM" /> without the strong/full name
I have solved 1. (and apparently 2.) by adding it to the GAC. But this is apparently not sufficient.
I have partially solved 3. by putting my assembly in a special folder ([INSTALLFOLDER]\lib) and setting the registry key HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0\AssemblyFoldersEx\MyAssemblies
Then I can do Add Reference, but I get:
<Reference Include="MYASM, Version=1.1, Culture=neutral, ..." /> in my csproj instead of just <Reference Include="MYASM" /> as I'd like.
With the second approach, if I manually edit the csproj, everything is OK, but I can't ask my users to do that.
What should I do here?
[EDIT] Apparently it's not obvious that I have my own MSI, but yes I do. I don't control users' machines with a magic wand.
No, you've taken this as far as it can go. It isn't actually that obvious how VS figures out to put the partial assembly name into the project file. This is not public code and can't be tampered with. Pretty sure it does not use a white-list and it can't pay attention to the reference assembly location.
The most likely detail is the PublicKeyToken of the assembly. The framework assemblies always have the exact same value for it, b77a5c561934e089. Its value is even prescribed in the CLI spec (Ecma-335). Next most likely, by a considerable distance, is the signing certificate, identifying the assembly as owned by Microsoft. Both however present the exact same problem: you can't get the private key that is required to strong-name or sign the assembly. They are locked inside a vault in Redmond; only trusted build engineers have access to them.
There is another nasty little detail you are overlooking, you are not nearly scared enough of DLL Hell. Cold hard fact is that if you ever expose the assembly in the GAC on another machine that is not in your control then you can never change it again. You can no longer modify the public interface of the assembly. Can't add a new public method or type, can't modify the arguments and return type of a method, can't add an enum member, etc. Even harsher, something Microsoft worries about, is that you can't really change private and internal members either. Programmers have a knack for using Reflection to poke around, terrific bug fixing tool. But at least you can tell them "don't do that!".
Making such a modification requires increasing the [AssemblyVersion]. Now you get a different kind of DLL Hell: the machine might not have been updated by your installer. Or worse, a solution uses projects that have different references. Microsoft had to solve this problem for framework assemblies, and they did so by modifying the CLR to automatically forward old versions to new ones. That is the basic reason why an assembly built for .NET 2.0 can be used in a .NET 4.x project. You can't get that kind of service for your own DLL.
"Don't do it" is the only good advice, getting in DLL Hell trouble is however a terrific learning experience I can recommend for anybody. Hell has to be experienced to be feared.
The best advice is to publish a NuGet package. They do the exact opposite: never deployed in the GAC, with version numbers that change very rapidly, but always available when a programmer needs it.
There are a few ways...
1) Create a new setup and package it for the framework you target. You can package this and have it deployed using the domain controller. When your users log in, the domain will update the packages; this way you'll be able to deploy your software to specific users and/or user groups. Depending on your infrastructure, you may already have a software management infrastructure that you can use (2 links included).
2) Create a NuGet package if you're targeting developers. Have your organisation host its own NuGet server to limit distribution. Add the package source to Visual Studio: open the Options page, type NuGet in the search field, and set the URL/UNC path.
3) Use ClickOnce deployment; this allows the application to download the updated DLLs and install them on the machine. It requires a code-signing certificate, but you're probably signing your code anyway (better for anti-virus tools if you do).
Now linking your MYASM.dll will be done by the application's binding configuration or IoC container. Basically, if it finds the DLL and no version is defined, it will take the first one it finds (I think the order is 1. application folder, 2. GAC, 3. path, but I'm not sure). This "take what you find" behaviour is generally referred to as "DLL Hell". The NuGet and ClickOnce solutions work best here, as you will always get the updated DLL that works for the application. Placing the DLL in the GAC is going to get problematic if you have more than one application using your DLL and they need different "right" versions....
If you have the source code available for MYASM.dll, then I would prefer adding a project reference in your consuming application. When doing so, Visual Studio will create a GUID for each referenced project.
I have a project which needs to indirectly use three different versions of a third-party library. These versions are incompatible with each other, so I can't use a binding redirect - it has to be the exact .dll file. (The libraries are Spire.Doc, Spire.XLS & Spire.PDF; the Spire.PDF DLL is referenced by all three)
I have separated the three components into individual wrapper projects, and created classes which wrap direct references to anything in the libraries. However, this doesn't solve my issue: the 'consuming' project still has to copy all of the libraries to the bin folder in order to run. The build process doesn't know which version to copy, and so just copies the latest one. This gives me runtime exceptions due to the wrong DLL being present.
What I've considered/tried:
Adding a binding redirect to a specific version (runtime exception because the exact version of the library is not found)
Using a post-build step to merge the wrapper projects (again a runtime exception complaining about the absence of the library DLL)
Creating separate console applications for each part of the application, then invoking them in a separate process - this is a complicated last resort that I'd really rather not do!
I have read that extern alias might be able to help - but as far as I can tell, you can only distinguish between assemblies with different names. The Spire.PDF library has the same name in each project (and the same public key token).
How can I use these three separate versions of the library independently in the same solution?
Edit:
This issue is slightly different from the suggested duplicate because I don't have the ability to change any code in the dependent libraries. Spire.Doc relies on a different version of Spire.PDF than Spire.XLS does.
In your consuming project (Project A), create a common interface (ISpiroPdfAlex) that encompasses all the functionality that the 3 versions of your external assembly provide (and that you use). Project A must not reference the wrapper projects in any way, otherwise you'd create a dependency, which is what you're trying to avoid.
Have all 3 wrapper projects reference Project A and implement ISpiroPdfAlex. This will give you the ability to call each of the 3 different versions through the same API.
After this, create a subfolder under Project A for each of the versions (so 3 subfolders total). Since Project A has no reference to any of the external assemblies, it cannot load them by itself - you'll have to load them manually when you need the right version. Since your external DLLs may have dependencies with the same name, they cannot all be in the same folder (as you wrote); this is why you need the subfolders.
At run-time, when you need one of these versions, you can call Assembly.LoadFile to load a specific version of your assembly from the appropriate folder, and then use either Activator.CreateInstance or dependency injection to create an instance of a class that implements your interface. Once you have the instance, you're free to call any of its functions and you'll get version-dependent behavior. A rough sketch follows.
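A minimal sketch of that loading step, assuming the shared interface from above; the subfolder names, file names and type names are illustrative only:

using System;
using System.IO;
using System.Reflection;

public static class SpireWrapperLoader
{
    // Loads a wrapper assembly from a version-specific subfolder and
    // instantiates its implementation of the shared interface.
    public static ISpiroPdfAlex Load(string subfolder, string assemblyFileName, string typeName)
    {
        string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, subfolder, assemblyFileName);
        Assembly wrapperAssembly = Assembly.LoadFile(path);

        // The named type (e.g. "Wrappers.V1.PdfWrapper") must implement ISpiroPdfAlex.
        Type wrapperType = wrapperAssembly.GetType(typeName, throwOnError: true);
        return (ISpiroPdfAlex)Activator.CreateInstance(wrapperType);
    }
}

// Usage (illustrative names):
// ISpiroPdfAlex pdfV1 = SpireWrapperLoader.Load("SpireV1", "Wrapper.SpireV1.dll", "Wrappers.V1.PdfWrapper");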
Edit:
OP mentioned in a comment that it's not his code that has the dependency on different versions of the PDF library but the other 3rd-party Spire libraries that his code depends on.
In this case, the 3rd-party code cannot be modified to support dynamic loading of assemblies, and it already has a binary dependency. It's not possible to load different versions of the "same" assembly into the same process, especially since you mentioned that these versions are not even backward-compatible with each other.
The only solution I can think of in this situation is to break out all dependent functionality into separate console applications (one for each different version) and call those separate .exes through the command line.
To pass information, you can either pass data directly on the command line or through stdin. Alternatively, you can just pass the name of a temporary file that has all the data necessary to do the processing. To get return data back from the console process, you can either read its stdout or use the same or a different file.
This way your main process never loads any of these assemblies and has no dependency on them - each console application has a dependency on just one version so there's no collision.
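A rough sketch of that pattern; the worker executable name and the convention of passing a file path on the command line and reading the result from stdout are assumptions for illustration:

using System.Diagnostics;

public static class PdfWorkerRunner
{
    // Runs one of the version-specific console apps and returns whatever it writes to stdout.
    public static string Run(string workerExePath, string inputFilePath)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = workerExePath,                 // e.g. "SpireDocWorker.exe" (illustrative)
            Arguments = "\"" + inputFilePath + "\"",
            RedirectStandardOutput = true,
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(startInfo))
        {
            string output = process.StandardOutput.ReadToEnd();
            process.WaitForExit();
            return output;
        }
    }
}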
Short version of the question:
Is there a good way to detect at build time if I have any cases where 2 or more projects reference different versions of the same assembly? (really, I would like to teach our CI server to do this)
Long Version:
So here's an interesting problem (simplified a bit for easy digestion):
Recently encountered a situation where we had 2 projects in a solution, A and B. Both A and B depend upon a 3rd party nuget package C.
A always loads C, B only needs C in rare circumstances.
So, during this sprint, a developer updated Project A to use the latest version of the C package (not realizing that B also depended upon C)
Everything built and the tests we had in place passed (our test coverage is insufficient), but when we released to production, we had failures occurring when B attempted to use the dependency (loader issues, because it wanted a different version of the strongly named assembly).
We found the problem and corrected it, but I would really love to be able to catch this during development. It would be even cooler if our build server (TFS 2012) could detect this when it does a CI build.
How might I go about detecting this situation?
VS can't do this for you because of the dynamic loading (unless I'm missing something): it just has no way of knowing which assemblies will be loaded at runtime.
We had the same problem once (using Prism - all our assemblies are normally loaded at application startup, and the order is described in a config file, though most are optional). I first thought of making a small tool that scans all packages.config or csproj files to see which assemblies are used in which version, and have it complain when two different versions of the same package are found. But I ended up dealing with it at a higher level, which is more direct and foolproof: we now have a simple class, a sort of stub of the actual application, that just loads all the application's components and modules as described in the config file. This causes every assembly that could ever get loaded to actually be loaded, so if something is wrong it will be found. This functionality is simply placed in a unit test.
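For reference, a rough sketch of that first, scanner-style idea, which could run as a unit test or a CI build step; the solution path and the decision to scan only packages.config files are assumptions:

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;

public static class PackageVersionChecker
{
    // Scans every packages.config below the solution folder and reports
    // packages that are referenced in more than one version.
    public static IEnumerable<string> FindConflicts(string solutionDir)
    {
        return Directory
            .EnumerateFiles(solutionDir, "packages.config", SearchOption.AllDirectories)
            .SelectMany(file => XDocument.Load(file).Descendants("package"))
            .GroupBy(p => (string)p.Attribute("id"),
                     p => (string)p.Attribute("version"))
            .Where(g => g.Distinct().Count() > 1)
            .Select(g => g.Key + ": " + string.Join(", ", g.Distinct()));
    }
}

// In a unit test (or CI step): assert that FindConflicts(@"C:\path\to\solution") returns nothing.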
I am using the Ninject IoC assembly - an external assembly for IoC. It supports both Mono and Windows, but a differently compiled version of the assembly is needed for it to work on Mono.
I have the following problem:
I have a Domain.Core project that uses the Windows version of Ninject
I have two additional projects - call them For.Mono and For.Windows; they both reference the Domain.Core project
The problem is that for Ninject to work on Mono, it has to be compiled with a special compilation symbol.
The Mono version will not work on Windows. How can I resolve the issue of using both versions in the same solution, so that the following holds:
When I run the For.Mono project, only the Mono version of Ninject is used, even though the Domain.Core project uses Ninject in some of its classes and references the Windows version local to its scope - I would like to override this with the Mono version somehow.
And the Windows version of Ninject is used in the For.Windows project. This part is trivial as it just works, but for the first requirement the compiler asks me to reference the Windows version when I use code from Domain.Core in my For.Mono project. I understand that the compiler is right, but how can I solve this cross-platform support issue with one code base?
Your question is not exactly clear, but from what I gather: you have shared base classes/interfaces in the Domain.Core assembly that your project uses, and you also have two sets of derived classes/interfaces (For.Windows and For.Mono) that provide platform-specific implementations of the classes/interfaces from Domain.Core. You want both For.XXX projects referenced from your solution.
I don't see a problem with that approach. As long as your code only refers to classes from Domain.Core and the instantiation code is wrapped in
if (platform == Platform1)
{
    // must be function calls in both branches to avoid JIT-ing references
    // to the unsupported DLL for the other platform
    InstantiatePlatformOneClasses();
}
else
{
    InstantiatePlatformTwoClasses();
}
there should be no problem at either compile or run-time.
Note: using a DI container will solve the same issues more easily, since you can simply configure it to pick the platform-specific implementations at run-time.
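For example, a minimal sketch of such container configuration in the composition root of each For.XXX project, assuming a made-up IWorker abstraction with MonoWorker and WindowsWorker implementations (the Mono.Runtime check is a common way to detect Mono at run-time):

using System;
using Ninject;   // each For.XXX project references the Ninject build that works on its platform

// Made-up abstraction and implementations, purely for illustration.
public interface IWorker { void Run(); }
public class WindowsWorker : IWorker { public void Run() { /* Windows-specific code */ } }
public class MonoWorker : IWorker { public void Run() { /* Mono-specific code */ } }

public static class CompositionRoot
{
    public static IKernel BuildKernel()
    {
        var kernel = new StandardKernel();

        // Bind the shared abstraction to the platform-specific implementation.
        bool runningOnMono = Type.GetType("Mono.Runtime") != null;
        if (runningOnMono)
            kernel.Bind<IWorker>().To<MonoWorker>();
        else
            kernel.Bind<IWorker>().To<WindowsWorker>();

        return kernel;
    }
}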