I'm working with Visual Studio 2010 and WinForms, .NET 4.0 (C#). I'm building an application with a lot of DLLs (about 150). When I deliver the application to my client, it consists of:
The Executable (.exe)
DLL files (.dll)
Each DLL corresponds to a module of the application, for example:
Ado.dll (provides access to the database)
AccesManagement.dll (this module allows managing users in the application)
Import.dll (this module allows the user to import data into the application)
etc.
When my client finds a bug in the application, I fix it and deliver only the impacted DLLs (so he doesn't have to retest the whole application). That can be, for example, just the Import DLL.
The thing is, after a few deliveries we can end up with compatibility problems between DLLs (for example, a method that no longer exists in a newer DLL). To avoid this problem, I would like to find a tool capable of checking compatibility between the different DLLs.
I would like something like:
I specify the directory of the program to analyse (executable + DLLs)
I launch the analysis
The tool tells me, for example: error between Import.dll and Ado.dll, there is a class xxx in Import.dll expecting a method named xxx on class xxx of Ado.dll
I've found some tools that can compare two versions of a DLL and report added and removed members (Libcheck, ApiChange), but that approach is too complicated for me because there are too many changes.
I think you may have a configuration management problem here -- at least as much as you've got a "compatibility" problem.
I'd recommend you find a way to track what versions of which assemblies each of your customers is using so that (1) you know what they're using when you decide what to ship, and (2) when they report bugs, you can replicate their setup (and thus, replicate their bug). If that sounds like a lot of work, it is. This is why a lot of software development shops take steps to ensure that there's a limit to the variation in setups among customers. It's nearly certain that you'll end up with some variation from customer-to-customer, but anything you can do to manage that problem will be beneficial.
Beyond the process implications, if you really need to create a "pluggable" environment, you probably need to create some interfaces for your objects to control the points where they connect, and you should probably look at Microsoft's Managed Extensibility Framework (MEF). MEF can help you manage the way objects "demand" behaviors from other objects.
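To make that concrete, here is a minimal, hypothetical MEF sketch (the module and interface names are invented; it needs a reference to System.ComponentModel.Composition, which ships with .NET 4.0). The host declares what it needs, and MEF finds a matching export in whichever DLLs are present in the application folder:

using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IImportModule
{
    void Run();
}

// In the real application this would live in one of the module DLLs (e.g. Import.dll).
[Export(typeof(IImportModule))]
public class CsvImportModule : IImportModule
{
    public void Run() { Console.WriteLine("Importing data..."); }
}

public class Shell
{
    // The host "demands" the behavior; MEF supplies whichever export it finds.
    [Import]
    public IImportModule Importer { get; set; }

    public static void Main()
    {
        var catalog = new DirectoryCatalog(AppDomain.CurrentDomain.BaseDirectory);
        using (var container = new CompositionContainer(catalog))
        {
            var shell = new Shell();
            container.ComposeParts(shell); // fails loudly if a required import is missing
            shell.Importer.Run();
        }
    }
}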
I finally found a solution to my problem.
Since I'm:
Using SourceSafe and adding labels with the version of the application I'm building
Tagging each of my DLLs with the version of the application
I built a program which is capable of:
Opening each DLL in a folder and reading the application version stored in it
Getting each project from SourceSafe at the version specified in the DLL (with the "Get Label" functionality)
Then I just have to build the project. If there is any compilation error, there is a compatibility problem.
This solution avoids major compatibility problems, but you can still have compatibility problems that a compilation can't detect...
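For illustration, the "open each DLL in a folder and read the application version stamped into it" step can be a very small program. A sketch along these lines, assuming the application version was written into each assembly's product/file version at build time:

using System;
using System.Diagnostics;
using System.IO;

class DllVersionScanner
{
    static void Main(string[] args)
    {
        string folder = args.Length > 0 ? args[0] : ".";
        foreach (string dll in Directory.GetFiles(folder, "*.dll"))
        {
            // ProductVersion is a common place to stamp the application version at build time.
            FileVersionInfo info = FileVersionInfo.GetVersionInfo(dll);
            Console.WriteLine("{0} -> {1}", Path.GetFileName(dll), info.ProductVersion);
        }
    }
}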
I have several issues with several SDKs coming from OEM manufacturers for specific devices. The SDKs are usually C or C++ DLLs, so I have a lot of marshaling going on (a lot - you can't even imagine). The problems start with the next version of an SDK: when they extend some functions or structures, they effectively break compatibility. In the past I made a copy of our library supporting their device and started making changes to support the new SDK. But each time, our library supported only one specific SDK, and upgrades of our systems were tough (the installation script is a heavy thing as well, a ~3 GB install).
I have 78 projects in the solution, commonly 4-5 libraries for each OEM manufacturer, and that's without any service tools. Yesterday I said NO MORE, and started researching how to recompile C# code at runtime and reload/replace the same assembly without quitting the app.
And the result is the following:
- The class file that defines the external C/C++ DLL API was moved into an external project that references only System.dll. And, me being insane, I already had every SDK version change wrapped in #if / #elif / #endif so I could recompile the latest version of our library to support a previous version of the SDK (though that was maybe done only once). I used those #defines along with CSharpCodeProvider to recompile this assembly at runtime. The idea is like this (a rough sketch follows the list):
Application loading ...
Open the main SDK file and get its file version (extract the version and identify it).
Load the original external assembly in a new AppDomain (so the domain can be destroyed later).
Extract the current version from the external assembly.
Destroy the new AppDomain to release the hook on the external assembly.
If the versions mismatch, recompile the external assembly (its source code is embedded within the parent assembly) and replace the original DLL with the freshly compiled one.
Continue loading the application...
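A rough sketch of that idea, for illustration only: the file names, the SDK_xx defines and the wrapperSource parameter are placeholders, and where the real code probes the managed wrapper through a temporary AppDomain, this sketch reads its version with AssemblyName.GetAssemblyName for brevity.

using System;
using System.CodeDom.Compiler;
using System.Diagnostics;
using System.Reflection;
using Microsoft.CSharp;

static class WrapperRebuilder
{
    // nativeSdkPath:  the vendor's C/C++ SDK DLL installed on this machine
    // wrapperDllPath: our managed wrapper assembly sitting next to the application
    // wrapperSource:  the wrapper's C# source, embedded in the parent assembly
    public static void EnsureWrapperMatchesSdk(string nativeSdkPath, string wrapperDllPath, string wrapperSource)
    {
        string sdkVersion = FileVersionInfo.GetVersionInfo(nativeSdkPath).FileVersion;
        string wrapperVersion = AssemblyName.GetAssemblyName(wrapperDllPath).Version.ToString();

        if (sdkVersion == wrapperVersion)
            return; // versions match, nothing to recompile

        // Recompile the embedded source with the #define matching the installed SDK
        // and overwrite the wrapper DLL before the application loads it.
        var parameters = new CompilerParameters(new[] { "System.dll" }, wrapperDllPath)
        {
            CompilerOptions = "/define:SDK_" + sdkVersion.Replace('.', '_')
        };
        CompilerResults results = new CSharpCodeProvider().CompileAssemblyFromSource(parameters, wrapperSource);
        if (results.Errors.HasErrors)
            throw new InvalidOperationException("Runtime recompilation of the SDK wrapper failed.");
    }
}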
So far this test approach works on one live demo system, and I was amazed. Switching from one SDK to another was flawless, without any hiccups.
The code also recompiles itself only when the SDK version changes. So, with some caution, I could say this is the first metamorphic code I've written: it recompiles/changes itself at runtime.
Unfortunately, this approach requires me to add one more project for each OEM manufacturer's SDK, which effectively kills the original reason I said NO MORE. True, I now have only two libraries to maintain per OEM manufacturer, and no more projects will be added after this. But...
I wonder, is there a better approach that would allow me to replace the DLL of a currently loaded assembly at runtime from within that same assembly? Or to change executing code on the fly each time? This mainly concerns marshaled functions, classes, structures, constants, ...
Please note that the code should be maintained from within the same project, without any external pieces. Also note that this project exposes only a hard-coded interface to the "outside" world (the interface lives in a referenced interface-only project - it's more complex than I've described). But this "outside" world is blind to any OEM-specific details, which was the whole point of using an interface: to get exactly the same behavior across any OEM device.
Any ideas? thoughts? suggestions?
The situation:
I'm working on a research project which, due to some constraints, has a C# user interface (used mostly for visualization) but does most of the processing with PInvoke and unmanaged C++ code. The unmanaged code has TONS of dependencies on various third-party libraries: Boost, PCL, OpenCV, CGAL, VTK, Eigen, FLANN, OpenMesh, etc. (if you can name it, we probably depend on it!). The C# project interacts with a C++ project (which I'll simply refer to as the "wrapper" from now on). Obviously, the wrapper is where all the third-party dependencies are consumed and where the entry points for PInvoke are defined. The wrapper is compiled into a DLL and copied into the output directory of the C# project via a post-build event.
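For context, the C#-to-wrapper boundary looks roughly like this (the DLL name and the exported function are hypothetical; the real entry points are whatever the unmanaged wrapper exports):

using System;
using System.Runtime.InteropServices;

static class NativeWrapper
{
    // Wrapper.dll is expected next to the .exe, copied there by the post-build event.
    [DllImport("Wrapper.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int ProcessPointCloud(IntPtr points, int count);
}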
I am the sole developer of the project. My primary development platform is Windows 10 with Visual Studio 2015, and Git is my version control. I mostly develop on 3 different machines, but sometimes I need to develop on other machines which only have Visual Studio 2015 installed.
What I've done so far:
Obviously, managing all those third-party dependencies is a hassle for one person, and I'd hate to have to install those libraries on every new development machine. What I've done is compile all those third-party libraries from source into static lib files (except the header-only ones, obviously). All sources are built once for the Debug configuration and once for the Release configuration. I spent some time integrating them into my wrapper project (i.e. defining extra include directories, using lots of #pragma comment(lib, "blah.lib") directives which reference different builds depending on the build configuration, etc.). I also followed some of the advice in Microsoft's linker best practices to reduce link times; specifically, I'm using the incremental linker and I've disabled /LTCG and /OPT.
Now I have this gigantic "Dependencies" folder in my VS solution which is around 8 GB and is version-controlled separately from the project (using a Git submodule). The wrapper project is statically linked against all of these, so, as mentioned above, only one DLL is produced after building the wrapper project. The upside of this approach is that on any new development machine, I clone the main repository, clone the Dependencies submodule and I'm ready to roll! But...
The worst part:
You've guessed it! Terrible link times. Even on a powerful computer, after I change a single line in the wrapper project, I would have to sit for a couple of minutes till the linker finishes. The thing I didn't see when I took the above approach was that I forgot how much I valued rapid prototyping: quick and dirty testing of some random feature in the wrapper project before exposing that feature to PInvoke. Ideally, I would like to be able to change something small in the wrapper project, quickly build, run and test that change and get on with exposing the feature to PInvoke.
My Question:
Clearly, I'm inexperienced in this! How should I have done things differently, specifically given the dependencies I mentioned above? Would building DLLs instead of static libraries have been better? But then I would have had to add the Dependencies to PATH every time the C# program started (as mentioned here). As a side question, how do you evaluate my current approach?
Based on the comment by @silverscania, I decided to just take the DLL route. It was a bit of a pain rebuilding all the dependencies, but I'm now super happy with the results.
Now, building the whole solution from scratch takes 36 seconds! It used to be about 4 minutes, so I have nothing to complain about. Also, modifying a single file in the wrapper project and building again takes 3 seconds, which is amazing! The fact that all the compiled dependencies are now about 1 GB (as opposed to ~8 GB with the static libraries) is a plus! I couldn't be happier.
A couple of notes:
On the main machine where I do most of my development, I have a SanDisk SSD. I noticed that, for some reason beyond my comprehension, building the project on that drive was way slower compared to a regular HDD. I'm looking into this issue, but haven't found a reason for it (TRIM is enabled and the drive is in AHCI mode).
I played around with the flags a bit more. I noticed that the compiler flag /GL (Whole program optimization) caused considerable slowdown during linking. I disabled that option too.
Short version of the question:
Is there a good way to detect at build time if I have any cases where 2 or more projects reference different versions of the same assembly? (really, I would like to teach our CI server to do this)
Long Version:
So here's an interesting problem (simplified a bit for easy digestion):
Recently encountered a situation where we had 2 projects in a solution, A and B. Both A and B depend upon a 3rd party nuget package C.
A always loads C, B only needs C in rare circumstances.
So, during this sprint, a developer updated Project A to use the latest version of the C package (not realizing that B also depended upon C).
Everything built and the tests that we had passed (we have insufficient test coverage), but when we released to production we had failures occurring when B attempted to use the dependency (loader issues, because it wanted a different version of the strongly named assembly).
We found the problem and corrected it, but I would really love to be able to catch this during development. It would be even cooler if our build server (TFS 2012) could detect this when it does a CI build.
How might I go about detecting this situation?
VS can't do this for you because of the dynamic loading (unless I'm missing something): it just has no way of knowing which assemblies will be loaded at runtime.
We had the same problem once (using Prism - all our assemblies are normally loaded at application startup, and the order is described in a config file, though most are optional). I first thought of making a small tool that basically scans all packages.config or csproj files to see which assemblies are used in which version, and complains when two packages of different versions are found. But I ended up dealing with it at a higher level, in a more direct and foolproof way: we now have a simple class, sort of a stub of the actual application, that just loads all the application's components and modules as described in the config file. This results in every assembly that could ever get loaded actually being loaded, so if something is wrong it will be found. This functionality is simply placed in a unit test.
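For reference, the "scan all packages.config files and complain about version mismatches" tool mentioned above could be sketched like this (assuming classic NuGet packages.config files; PackageReference-style projects would need csproj parsing instead). A CI build could simply run it and fail on a non-zero exit code:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class PackageVersionChecker
{
    static int Main(string[] args)
    {
        string solutionRoot = args.Length > 0 ? args[0] : ".";
        var versionsByPackage = new Dictionary<string, HashSet<string>>(StringComparer.OrdinalIgnoreCase);

        foreach (string file in Directory.GetFiles(solutionRoot, "packages.config", SearchOption.AllDirectories))
        {
            foreach (XElement package in XDocument.Load(file).Descendants("package"))
            {
                string id = (string)package.Attribute("id");
                string version = (string)package.Attribute("version");
                if (id == null || version == null)
                    continue;

                HashSet<string> versions;
                if (!versionsByPackage.TryGetValue(id, out versions))
                    versionsByPackage[id] = versions = new HashSet<string>();
                versions.Add(version);
            }
        }

        // Fail the build when any package appears in more than one version.
        var conflicts = versionsByPackage.Where(kv => kv.Value.Count > 1).ToList();
        foreach (var conflict in conflicts)
            Console.WriteLine("Conflict: {0} referenced as {1}", conflict.Key, string.Join(", ", conflict.Value));

        return conflicts.Count == 0 ? 0 : 1;
    }
}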
Is it necessary to register a compiled DLL (written in C# .NET) on a target machine?
The target machine will have .NET installed, is it enough to simply drop the DLL onto the target machine?
I think you're confusing things a little. Registering a dll has never been needed in order to use it.
Using a dll only requires loading it (given a known location, or if the library is in the system path) and getting the address of the function you want to use.
Registering a dll was needed when distributing COM or ActiveX objects, which have to add certain entries to the Windows registry. In order to use a COM service (for example) you need to reference a GUID (a unique identifier) which allows you to get a handle to the dll that implements the service (or provides access to it). Sometimes you can reference a fully-qualified name and get the same results.
In order for all that to work, the dll needs to be registered. This "registration" process just creates several entries in the registry, but mainly these two: one associating the GUID with the location of the dll (so that you can reference it through the GUID without knowing exactly where it is located) and a second one associating the full name with the GUID. But again, this is just for COM or ActiveX objects.
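To illustrate why that registration matters: a client asks for a COM component by ProgID or GUID, and the lookup goes through those registry entries rather than through a file path. Scripting.FileSystemObject is used below only because it is a well-known, normally registered ProgID.

using System;

class ComLookupDemo
{
    static void Main()
    {
        // Resolved via the registry entries created at registration time.
        Type comType = Type.GetTypeFromProgID("Scripting.FileSystemObject");
        if (comType == null)
        {
            Console.WriteLine("ProgID is not registered on this machine.");
            return;
        }
        Console.WriteLine("CLSID: " + comType.GUID);
        object instance = Activator.CreateInstance(comType);
        Console.WriteLine("Created: " + instance.GetType().FullName);
    }
}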
When you develop an application in .NET, the libraries referenced by your project are automatically loaded when they're needed, without you having to worry about locating or loading them. In order to do that, the framework checks two locations for the referenced libraries.
The first location is the application path.
The second location is the GAC.
The GAC (Global Assembly Cache) allows you to effectively register a dll to be used throughout the system and works as an evolution of the old registering mechanism.
So basically, you just need to put the dll in the same folder as the application.
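A tiny illustration of that: with a hypothetical MyLib.dll sitting next to the executable, a plain project reference, or an explicit load by name as below, resolves the assembly from the application's base directory, with no registration involved.

using System;
using System.Reflection;

class Program
{
    static void Main()
    {
        // "MyLib" is a placeholder; with no registration, the runtime finds MyLib.dll next to the .exe.
        Assembly lib = Assembly.Load("MyLib");
        Console.WriteLine(lib.Location);
    }
}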
You need to "drop" it into a directory where the application needing it will find it.
If there are multiple applications, or you want to "drop" the file somewhere other than the application directory, you generally need to either configure additional probing paths for the applications or register the assembly in the Global Assembly Cache (GAC).
It is usually enough to drop the dll into the folder of your app on the target machine.
If the dll must be available to other applications then you may want to consider the GAC.
You only need to register it if you wish to access the assembly via COM interop. An example would be using a type defined in a .NET assembly from a non-.NET application, such as a VB6 app.
If you plan on accessing the assembly from another .NET application, you don't have to do anything. If your assembly has a strong name, it probably is a good idea to drop it in the GAC. Otherwise, just drop it in the directory of the application that will be referencing it.
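For completeness, the COM-interop case mentioned above is the exception: the type has to be made COM-visible and the assembly registered with regasm on the target machine. A hypothetical example (the class and GUID are made up):

using System;
using System.Runtime.InteropServices;

// Only needed when a non-.NET client (e.g. VB6) must create this type via COM.
[ComVisible(true)]
[Guid("D4C1E2B3-5A6F-4C7D-9E8A-112233445566")]
public class Calculator
{
    public int Add(int a, int b)
    {
        return a + b;
    }
}

// Registered on the target machine with:  regasm MyLib.dll /codebase /tlb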
One of the great selling points of .NET for the Windows platform when it came onto the scene is that by default, .NET assembly DLLs don't have to be registered and can be consumed privately by an application by merely putting them in the same folder as the EXE file. That was a great stride forward because it enabled developers to avoid the fray of DLL/COM hell.
Shared DLL/COM modules proved to be one of the greatest design mistakes of Windows, as it led to instability of the applications that users installed. Installing a new app could well screw up an app that had been working just fine, because the new app introduced newer versions of shared DLL/COM modules. (It proved in practice to be too much of a burden for developers to properly manage fine-grained version dependencies.)
It's one thing to manage versions of modules with a build repository system like Maven. Maven works extremely well doing what it does.
It's an entirely different matter, though, to deal with that problem in an end-user runtime environment spread across a population of millions of users.
The .NET GAC is by no means a sufficient solution to this age-old Windows problem.
Privately consumed DLL assemblies continue to be infinitely preferable. It's a no-brainer way to go, as disk space is extremely cheap these days (~$100 can buy a terabyte drive at Fry's). There is nothing to be gained by sharing assemblies with other products, and yet there is company reputation to lose when things go south for the poor user.
Actually there is NO need to register a dll in .NET on the target machine.
If you reference a .dll in your application, click on the referenced .dll under references in your project, look at the properties and set Isolated to TRUE.
This will now automatically include this .dll in your project and your application will use the copy of the .dll included in your project without any need to register it on the target system.
To see a working Example of this look here:
http://code.msdn.microsoft.com/SEHE
The .dll in question will need to be registered on the system where you build your application for this to work properly. However once you build your project, there will not be any need to register the .dll in question on any system you deploy your application or program.
An additional benefit of using this method, is that even if in the future, another .dll is registered with the same name on the target system in question, your project will continue to use the .dll you deployed with. This is very handy where a .dll has many versions and you wish to maintain some stability, like using the one you tested with, yet all other applications will use the registered .dll unless they use the isolated = true method as well.
The example above is one of those cases, there are many versions of Skype4COM which is a Skype API .dll and can change often.
This method allows the above example to use the API .dll that the project was tested with; each time a user installs a new version of Skype, it is possible that a modified version of this .dll gets installed.
Also, there are some Skype clients that do not install this .dll; the business version of the Skype client, for example, is smaller and does not include it. In that case the project does not fail on the .dll being missing and unregistered, because it is included in the project with isolated = true.
An application can use a .NET dll by simply having it present in the same folder with the application.
However if you want other third-party applications to find the DLL and use it they would also have to include it in their distribution. This may not be desirable.
An alternative is to have the DLL registered in the GAC (Global Assembly Cache).
Scenario
I have two wrappers around Microsoft Office, one for 2003 and one for 2007. Since having two versions of Microsoft Office running side by side is "not officially possible" nor recommended by Microsoft, we have two boxes, one with Office 2003 and the other with Office 2007. We compile the wrappers separately. The DLLs are included in our solution; each box has the same checkout but with either the Office 2003 or the Office 2007 wrapper project "unloaded" so it doesn't attempt to compile that particular DLL. Failing to do that throws errors on compilation because the Office COM DLLs are not available.
We use .NET 2.0 and Visual Studio 2008.
Facts
Since Microsoft mysteriously changed the Office 2003 API in 2007, renaming and changing some methods (sigh) thus making them not backwards compatible, we need the two wrappers.
We have each build machine with the solution and one Office DLL activated. E.g.: the machine with Office 2003 has the "Office 2007" DLL unloaded, and therefore does not compile it. The other box is the same idea but the other way around. All this because we can't have two different Office versions on the same box for programming purposes (you could technically have two Office versions installed together, according to Microsoft, but not for programming and not without some issues).
Problem
When we change the application version (from 1.5.0.1 to 1.5.0.2, for example) we need to recompile the DLLs to match the new version of the application; this is done automatically because the Office wrappers are included in the solution. Since the wrappers are contained in the solution, they inherit the app version, but we have to do it twice and then "copy" the other DLL to the machine that creates the installer. (A pain...)
Question
Is it possible to compile a DLL that will work with any version of the application, despite being "older"? I've read something about manifests but I have never had to interact with those. Any pointers will be appreciated.
The secret reason for this is that we haven't changed our wrappers in "ages" and neither did Microsoft with their ancient APIs, yet we are recompiling the DLL to match the app version on every release we make. I'd like to automate this process instead of having to rely on two machines.
I can't remove the DLL from the project (neither of them) because there are dependencies.
I could create a third "master wrapper" but haven't thought about it yet.
Any ideas? Anyone else with the same requirement?
UPDATE
Clarifying:
I have 1 solution with N projects.
"Application" + Office11Wrapper.dll + Office12Wrapper.dll.
Both "wrappers" use dependencies for application + other libraries in the solution (datalayer, businesslayer, framework, etc.)
Each wrapper has references for the respective Office package (2003 and 2007).
If I compile and don't have office 12 installed, I get errors from Office12Wrapper.dll not finding the Office 2007 libraries.
So what I have are two building machines, one with Office 2003, one with Office 2007. After a full SVN update + compile on each machine, we simply use office12.dll in the "installer" to have the wrapper compiled against the "same code, same version".
Note: the Office 2007 build machine has the wrapper for Office 2003 "unloaded", and vice versa.
Thanks in advance.
When the .NET assembly resolver is unable to find a referenced assembly at runtime (in this case, it cannot find the particular wrapper DLL version the application was linked against), its default behavior is to fail and essentially crash the application. However, this behavior can be overridden by hooking the AppDomain.AssemblyResolve event. This event is fired whenever a referenced assembly cannot be found, and it gives you the opportunity to substitute another assembly in place of the missing one (provided that they are compatible). So, for instance, you could substitute an older version of the wrapper DLL that you load yourself.
The best way I've found to do this is to add a static constructor on the main class of the application that hooks the event, e.g.:
using System;
using System.IO;
using System.Reflection;

static class Program
{
    static Program()
    {
        AppDomain.CurrentDomain.AssemblyResolve += delegate(object sender, ResolveEventArgs e)
        {
            AssemblyName requestedName = new AssemblyName(e.Name);
            if (requestedName.Name == "Office11Wrapper")
            {
                // Put code here to load whatever version of the assembly you actually have.
                // (LoadFile wants an absolute path, hence the Path.Combine.)
                return Assembly.LoadFile(
                    Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Office11Wrapper.DLL"));
            }
            else
            {
                return null;
            }
        }; // the event subscription is a statement, so it ends with a semicolon
    }
}
By putting this in a static constructor of the main application class, it is guaranteed to run before any code attempts to access anything in the wrapper DLL, ensuring that the hook is in place ahead of time.
You can also use policy files to do version redirection, but that tends to be more complex.
Just a thought - could you use TlbExp to create two interop assemblies (with different names and assemblies), and use an interface/factory to code against the two via your own interface? Once you have the interop dll, you don't need the COM dependency (except of course for testing etc).
TlbImp has an /asmversion switch for the version, so it could be done as part of the build script; but I'm not sure you even need this: just make sure that "Specific Version" is false on the reference (Solution Explorer)?
Also - I know it doesn't help, but C# 4.0 with dynamic and/or "No PIA" might help here (in the future; maybe).
I'm not sure I am completely following everything you stated, but let me try:
It sounds like you have one solution with 2(?) projects. One is the actual application, and the other is a wrapper for the Office API. Your application then has a project reference to your Office API wrapper. I've never programmed for Office before, but it sounds like the programming APIs are a common component that you can only have one version of on a machine (i.e. 2003 or 2007, not both). And maybe this is where the problem is: because you have a project reference, the wrapper is compiled first and copied to the bin directory of your application, where your application is linked against that build of the wrapper. This causes the manifest of the application to specifically request that version of the wrapper at run time.
If you had the wrapper in a separate solution, and added a reference to the compiled library rather than the project, you would always link your application to that version of the wrapper and you could avoid the problem.
Another possible choice is Assembly Binding Redirection. This is more advanced and comes with its own set of problems, but you can read about it here.
Or similar to Marc's idea, you could extract an interface and pull some common objects into a Framework library, and code your application against the interface and common objects. Then at runtime use reflection to load the assembly and instantiate the wrapper you want.
I think the key is to remove the project dependency if you can. It sounds like the wrapper is pretty stable and isn't changing, otherwise you wouldn't be asking to link to a previous version of it.
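A minimal sketch of that last, reflection-based option (the interface and assembly file names are made up; in practice the interface would live in the shared Framework library so both wrappers can implement it):

using System;
using System.Reflection;

// Shared contract, ideally defined in a common Framework assembly.
public interface IOfficeWrapper
{
    void OpenDocument(string path);
}

public static class OfficeWrapperFactory
{
    public static IOfficeWrapper Create(bool useOffice2007)
    {
        string assemblyFile = useOffice2007 ? "Office12Wrapper.dll" : "Office11Wrapper.dll";
        Assembly wrapperAssembly = Assembly.LoadFrom(assemblyFile);

        // Pick the first concrete type implementing the shared interface.
        foreach (Type type in wrapperAssembly.GetTypes())
        {
            if (typeof(IOfficeWrapper).IsAssignableFrom(type) && !type.IsAbstract)
                return (IOfficeWrapper)Activator.CreateInstance(type);
        }
        throw new InvalidOperationException("No IOfficeWrapper implementation found in " + assemblyFile);
    }
}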
Installing Office 2003 and 2007 side-by-side on the same machine is definitely possible - we do it in our organisation even on end-user production workstations.
In that linked article, Microsoft recommend that you don't do this for actual use. But in your case it appears to be just for a single build machine, i.e. you're not going to actually use either version of Office on that machine. In this context, I would try to see if you can make the side-by-side installation work.
My assumption might be wrong, and you're attempting to do this for every developer's machine. In that case, you should ignore this answer :-)
Nice sleuthwork! I just threw together an implementation based on the concept presented above, and it works wonderfully:
using System;
using System.Reflection;

// Hooked up at startup with: AppDomain.CurrentDomain.AssemblyResolve += domain_AssemblyResolve;
static Assembly domain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    // Strip the version/culture/key parts so any available version of the assembly is accepted.
    int comma = args.Name.IndexOf(',');
    string partialName = comma < 0 ? args.Name : args.Name.Substring(0, comma);
    return Assembly.Load(new AssemblyName(partialName));
}
Of course there is room for enhancement, but this does the trick for me!