Background
I have a deployed system that consists of:
A few dozen "core" DLLs
Hundreds of "plugins" that make use of the "core" DLLs
The core (and each plugin) is independently deployable, but the plugins only execute correctly when deployed alongside the correct version of the core.
The Problem
What if the core gets modified in a way that breaks a plugin? The actual APIs/interfaces might remain the same, but a logic change within one of those methods could (in theory) cause negative downstream impacts. How can we prevent those?
This architecture is old and "GAC style" and isn't really what I'd suggest today. But it's what I have. I've started moving the "core" DLLs into individual NuGet packages to try to move the system to a "bring your own" model. In the interim, I am trying to figure out how to execute the existing (~5000) integration tests on the core + plugins, but I'm having a hard time coming up with a good way to achieve this.
What I Have Tried
You can't just compile the tests in the plugin's solution, because the plugins won't necessarily be compiled against the latest core DLL set; they'll be at whatever version they consumed when they were last modified. Similarly, you can't attach the *.csproj files for those plugins to the core's solution, because the plugin's xUnit test project will be executing against the (outdated) NuGet packages for that plugin.
The best I could come up with was:
Publish each plugin with its test DLLs
When building the core, consume all the plugins + test DLLs
This will be the "latest" core, so will mimic what will happen if the core updates
I don't like this solution. It's awfully kludgy. Will it work? Yes, at least well enough to tell you that plugin X was broken and needs attention. Do I like it? Nope.
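For concreteness, here is roughly what that kludge looks like as a script. This is only a sketch: the paths, the *.Tests.dll naming convention, and the use of vstest.console.exe are my assumptions for illustration.

# Overlay the freshly built core DLLs onto each published plugin,
# then run that plugin's integration tests against the new core.
$coreOutput = "C:\build\core\latest"                    # assumed output path of the new core build
Get-ChildItem "C:\deploy\plugins" -Directory | ForEach-Object {
    Copy-Item "$coreOutput\*.dll" $_.FullName -Force    # overwrite the plugin's pinned core DLLs
    $testDlls = Get-ChildItem $_.FullName -Filter "*.Tests.dll"
    if ($testDlls) { & vstest.console.exe $testDlls.FullName }    # failures flag plugins needing attention
}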
Does anyone have a better idea?
Related
My goal is to create a build output that acts as a portable version of my application, containing the non-framework dependencies (NuGet + projects) directly as DLLs.
This demonstrates the solution and the four projects it contains:
You can see the libraries are .NET Standard 2.0, while a console app (to run/debug some code) is using .NET Framework 4.7.2 (because we have some .NET Framework apps in use around here). The arrows in the image show the references that have been set.
Parts of the libraries (not the console app) might be imported into MS SQL Server in the future; these imports usually pick up all dependencies from the same folder if available (which is the reason for a self-contained output).
Problem A: Running the code on Linux/Mono
Using JetBrains Rider to open the solution, build + restore packages works without problems.
Running the console app works to some extent (e.g. loading data from SQL) until it fails when calling some code that makes use of BouncyCastle (a DLL-not-found exception).
Looking at the build output shows the three .dll files of my lib projects along with the .exe file of the console app, plus all the .pdb files AND, additionally, System.Data.SqlClient.dll.
That seems to be the reason my SQL code worked.
Problem B: Running the code on Windows 10, .NET Framework / Core installed
Using VS / Rider made no difference here: opened the solution, restored packages + built without problems. Running the console app fails earlier than before: this time it was unable to find System.Data.SqlClient.dll.
Checking the build folder shows my three lib DLLs and the .exe, including the .pdb files, and nothing else.
To my understanding, the files have to be either in the GAC or inside the same folder for them to be found. It seems that when a .NET Standard library dependency includes NuGet packages, something has trouble either loading them into the GAC or at least copying them to the build directory (which is outdated, I guess).
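As far as I know, the probing path can also be extended via app.config (a generic sketch of the standard mechanism, not specific to my solution):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- also probe the "libs" subfolder of the application base directory -->
      <probing privatePath="libs" />
    </assemblyBinding>
  </runtime>
</configuration>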
So even if my approach (having a portable-ish / self-contained version of my app) might be uncommon or even stupid, I would have thought that just running this code on the dev machine should work fine.
Problem C: Including dependencies in the build output
According to information I found here, the <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies> property can be used to have dependencies copied to the build directory.
And while this seems to copy too many files (I tried excluding some according to the docs), it does indeed copy the .dll files for a complete package when building the ExchangeIntegration project (.NET Standard 2.0).
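For reference, here is where that property sits (a minimal sketch of an SDK-style project file; only the ExchangeIntegration name comes from my actual solution):

<!-- ExchangeIntegration.csproj -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <!-- copy all assemblies from the NuGet restore graph into the build output -->
    <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
  </PropertyGroup>
</Project>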
Looking into accomplishing the same for .NET Framework .csproj projects yielded mixed results: I was playing around with the copy-local setting and stumbled across similar problems when creating NuGet packages and using .targets files, but had no success.
About the specific libraries used
I'm more interested in learning the right concept to handle this; it's not really about whether or not it makes sense to import Newtonsoft.Json into MS SQL Server.
I feel I'm fundamentally missing something; maybe the output type (library) is not correct for my goal, or maybe I'm mixing problematic .NET versions, although I did a quick check and it seems OK.
How can I get a .NET Framework (console app) build that contains all non-framework dependencies (e.g. NuGet) AND a separate .NET Standard (library) build that also contains all non-framework dependencies?
EDIT: Adding all NuGet packages to the top-tier project (the console app in my example) seems to work around the problem; the code executes without problems. I'm still waiting for a proper solution.
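A sketch of that workaround (assuming PackageReference-style references; the package names come from the problems above, but the versions are purely illustrative):

<!-- console-app .csproj: re-declare the libraries' packages at the top tier so they get copied -->
<ItemGroup>
  <PackageReference Include="System.Data.SqlClient" Version="4.8.6" />
  <PackageReference Include="BouncyCastle" Version="1.8.9" />
</ItemGroup>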
EDIT2: Added the Mono/MSBuild version used under Arch Linux: 16.4
I'm using .NET Standard libraries inside my desktop solution so I can share those libraries between my desktop and web solutions. However, I'm seeing incremental build problems now. Has anyone else experienced this or managed to fix it?
I can see that the .NET Standard libraries have some incremental build features: if I build one on the command line using "dotnet build" it takes, for example, 1 second; if I make a code change in that library and run again it takes 2 seconds; and it goes back to 1 second on a third run.
However, in pure .NET Framework applications like a console app or WPF app, if you make no code change then Visual Studio traditionally doesn't even attempt to build the project; it simply says "1 up-to-date". Instead, I'm seeing the .NET Standard libraries always re-copied to the output directory, and they also cause my entire WPF project to rebuild every time. After a while this starts to get annoying, when every code change takes 5 seconds to build!
I have at least a partial answer for you. The issue of the binaries being copied out every time is, per Microsoft, essentially unsupported; the advice is to update to a later target framework version.
At this point we don't do any smart trimming of the compat shims that come with the tooling so it ends up copying them all. If you (or your dependencies) don't rely on them it is safe to delete/not-deploy them but doing it blindly may break some of your dependencies.
And this is supposed to be fixed in 4.7.1:
It's worth mentioning that with .NET Framework 4.7.1 and up you'll not have to deploy any extra files for .NET Standard. Today, that's already the case if you target Xamarin or .NET Core.
As for why it always copies, instead of doing so selectively as it should, I am unsure. I have scoured build logs and I believe that the files have not been properly added to the output list of a build target but it is difficult to diagnose. Visual Studio also has smart ways to entirely avoid invoking MSBuild, so logic in the MSBuild script to skip a target doesn't even have to run. Visual Studio is mostly a black box, but I suspect this logic is broken too, based on the behavior both you and I have observed.
This is a frustrating issue because there are still good reasons to target old versions of .NET Framework, for example if a library is distributed to a customer who may build against it using Visual Studio 2013, and cannot be expected to update.
We are developing a WPF application at work which has various "common" dependencies (Unity, Prism, etc.).
It's all fine when adding new projects and then setting up the NuGet package dependency per project, but when it comes to upgrades it's really painful, as it means we have to go through each and every project, delete the old references, and then refetch the latest packages from NuGet.
Today, for instance, I was tasked with upgrading Prism from 5.0 to 6.0 (which has breaking changes anyway) and this meant, in addition to fixing all the namespace conflicts, etc., that I had to go through every project, delete the old references, add the new dependencies, and rinse and repeat.
My question is, is there a smarter way to deal with this problem or is this the standard approach?
Many thanks in advance,
Update:
I am mostly concerned with "major" upgrades which don't show up in the package manager. A version 5.0 -> 6.0 upgrade would be treated as a major upgrade and hence would not have an automatic update applied to it in the NuGet package manager.
I don't expect NuGet to be able to do this automatically for me, since such upgrades may (and often do) include breaking changes, but I would like to know if there's a way to make major upgrades less painful than deleting the references from the projects and the packages.config for every project and then re-adding them through NuGet. For a relatively large project this is very time consuming, and I was wondering if anyone had a better way of managing such dependencies.
If you use VS2013 like you say, you can manage ALL your NuGet packages by right-clicking on your Solution and selecting 'Manage NuGet Packages For Solution'. This brings up a dialog where you can view all packages installed for all projects in the solution and all packages that have updates available. When you do upgrade the packages, VS takes care of all the reference changes required. If the package has breaking changes, then you're still on the hook for fixing those.
Disclaimer: I've never worked on a WPF project/solution but for Web/Forms apps, NuGet packages are handled this way.
I can understand your pain because I had a similar problem, but there is no easy way: you need to treat your daily development and your dependency update roll-out as two different processes.
For the project I worked on, I used a common repository path shared among the solutions, and you need to delete all of the solutions' folder references to get to a clean state.
For each solution you work on, you need to modify the property group that points to the common target repository (I'm using a relative path), as sketched below.
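A hypothetical sketch of that property group and how a reference goes through it (the property name, paths, and package layout are all made up for illustration):

<PropertyGroup>
  <!-- relative path from each project to the common package repository -->
  <CommonPackagesDir>..\..\shared\packages</CommonPackagesDir>
</PropertyGroup>
<ItemGroup>
  <Reference Include="Prism">
    <HintPath>$(CommonPackagesDir)\Prism.6.0.0\lib\net45\Prism.dll</HintPath>
  </Reference>
</ItemGroup>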
Once everything is set up, you can actually perform an update with a script (I'm using a Python run-time script).
For details, you can look at "setting up common nuget-packages-folder for reference updates"; it seems like what you're looking for to automate the process.
I had a similar problem when trying to upgrade multiple packages with alpha-channel issues in Xamarin Studio, which also does not have the niceties of the VS 2015 NuGet manager. I ended up writing a very simple PowerShell script that I run multiple times a day.
# This script updates local ibGib NuGet packages for mobileGib Android app solution.
# For convenience in copy+paste in manager console:
#   ../UpdateLocalNugetPackages.ps1
Update-Package commonGib
Update-Package ibGib
Update-Package languageGib.Biz
Etc.
I believe you could tailor your NuGet commands to fit your needs.
Also, just in case you aren't aware of it, you should definitely read the NuGet command line reference. I may be mistaken, but it sounds like your scenario is doable with the Update command.
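For example, something along these lines (a sketch only; check the command line reference for the exact switches your setup needs):

nuget update YourSolution.sln -Id Prism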
I have an independent solution with multiple projects including class libraries and control libraries. This solution and all its projects are under TFS source control.
I reference the output of one or more of these libraries in all new projects I develop. References are currently binary rather than project references.
The new projects are also always under source control and now I need to add debugging support for the libraries.
If I reference the library projects from them, the project file is modified and no longer works with the original library solution, since the source control providers for the library and the referencing project may be different.
Is there an easy way to accommodate this?
You should package the shared binaries, along with indexed PDBs, into a NuGet package. NuGet was specifically designed to solve these problems.
You can index your PDBs by running an indexing tool. TF Build can automatically index your PDBs.
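A minimal .nuspec sketch of such a package (every name and path here is a placeholder):

<!-- MyCompany.Controls.nuspec: ship the library together with its indexed PDB -->
<package>
  <metadata>
    <id>MyCompany.Controls</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Shared class/control libraries with debugging support</description>
  </metadata>
  <files>
    <file src="bin\Release\MyCompany.Controls.dll" target="lib\net45" />
    <file src="bin\Release\MyCompany.Controls.pdb" target="lib\net45" />
  </files>
</package>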
Nope.
There are some strategies you can use, however. The easiest (possibly, but not in all cases) is to build the project you wish to debug, drop the binaries on top of the application that hosts them, and attach your debugger to the running application. This makes sure you have the correct version of the assembly under debug, but you might have to do unwanted things, such as making sure you're not targeting a specific version of the assembly, which may be bad news for an assembly under development. It also requires a lot of handiwork, and depending on where your application runs it may require remote debugging, dealing with issues transmitting DLLs across untrusted networks, etc.
We're building a .NET software platform for test automation for in house use in our company.
The application is composed of a GUI (WinForms) component and various "Actions" that are dynamically loaded into it for execution.
There are approximately 100 Action projects already, and that number is increasing.
Some of these projects depend on other such projects, and so on.
All loaded actions must reference our "SDK" DLL for various activities (reporting results to the main app, logging, etc.).
With this relatively simple design, we are facing some management decisions that we'd like to solve in the best way:
Should actions ("plugins") reference our SDK project's output, or some known stable version of it?
For example, when developing a large application (MS Office, just as an example), not all teams naturally work with the source code for all components.
What is the best solution for something like this? And why?
How to properly verify that all needed dependencies (3rd party libraries, for example) are indeed taken from the correct location?
What are common practices in scenarios where a multitude of interlinked projects must be managed? Are there any tips for doing so?
This is a problem that doesn't have a clear answer, but...
There are two paths you can take. A strongly coupled system or a loosely coupled system.
For a strongly coupled system I can suggest two directories for binaries: a 3rd party directory and a directory that houses the DLLs your company builds that other developers can reference. The 3rd party DLLs (from outside your company) should be kept in source control, so that all developers reference the same versions of the 3rd party DLLs from the same location; this avoids developer-machine inconsistencies and the problems of installing 3rd party software on every machine. The in-house DLLs should not be kept in source control; they should be built on each developer's machine via an automated build batch file or similar. In a post-build step you can copy them all to the same directory, and as long as developers get the latest source and build, everyone has the same DLLs from within your company.
For example: get latest, build (using a batch file to build all the projects needed), and then as a post-build step copy the output to a common directory. Now all of your other projects can reference the common company DLLs and the third-party DLLs from the same location, and everyone is consistent.
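That post-build step can be as simple as the following (a sketch using Visual Studio's build macros; the Common folder location is hypothetical):

rem post-build event: publish this project's output to the shared directory
xcopy /Y "$(TargetPath)" "$(SolutionDir)..\Common\"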
The problem is that the references are strong coupled, so changes can sometimes be problematic if not communicated properly.
A loosely coupled system uses a framework such as MEF (Managed Extensibility Framework): your components reference "contract DLLs" which define the interfaces between your components. The projects reference the interface (contract) DLLs and don't really care about the implementation, and MEF manages the plugins for you.
In this case, you reference the interface DLL but not the actual DLL that implements it.
For example, say I have an interface called ILog with a method called LogMessage.
private ILog _logger;    // only the ILog contract is known here
_logger.LogMessage();    // the concrete logger is supplied at runtime
So, in a strongly coupled case: Action.DLL references Logger.DLL directly.
In a loosely coupled case, Action.DLL references ILog.DLL (just the interface). Logger.DLL implements ILog.DLL. But Action.DLL has no reference to Logger.DLL directly.
Now I can have any number of DLLs that implement the ILog interface, but Action.DLL does not reference them directly. That's pretty cool, and one of the more exciting features of MEF and loose coupling in general: the ability not to have hard dependencies.
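To make that concrete, here's a small MEF sketch (ILog, Action.DLL, and Logger.DLL come from the example above; the FileLog class and its contents are illustrative):

using System.ComponentModel.Composition;   // MEF attributes

// Contract assembly (ILog.DLL): only the interface lives here.
public interface ILog
{
    void LogMessage();
}

// Implementation assembly (Logger.DLL): exports the contract; never referenced by Action.DLL.
[Export(typeof(ILog))]
public class FileLog : ILog
{
    public void LogMessage() { /* write to a log file */ }
}

// Consumer assembly (Action.DLL): imports by contract only.
public class LoggingAction
{
    [Import]
    private ILog _logger;                  // MEF injects whichever export gets composed

    public void Run()
    {
        _logger.LogMessage();              // no compile-time reference to Logger.DLL
    }
}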
Whichever way you choose to go is acceptable; I think the loosely coupled idea fits your scenario best, as teams would only have to know the contracts rather than the actual implementations.
I wouldn't have one massive contract DLL; I would try to break the interfaces into logical groupings. For example, logging seems like a utility type of interface, so I would create a Utility contract DLL with an ILog interface. How it is split up depends on what you are trying to do. Or each interface could be its own contract DLL, but maybe that is a little extreme.
This is a somewhat complex topic, especially in .NET land. I don't know about the "best" solution, but I'll explain how we manage it; perhaps you will find it useful for yourself.
This allows you to build large systems with lots of linked projects, but it incurs a lot of complexity. As, I think, any solution of this kind would.
First: physical structure (we use SVN).
There is a source control root for each project
Each project has its own trunk, branches and tags
The trunk folder has a versioned \src and \build folder, and an unversioned \lib folder
The \lib folder contains binaries to reference.
Those binaries could be 3rd party libraries or other projects that you need to link to (e.g., your SDK). All binaries under \lib come from an enterprise Ivy repository (see http://ant.apache.org/ivy/). There is a lot of movement these days in .NET land concerning NuGet, so you could check that out too.
Your versioned \build folder contains build scripts, for example to get binaries from Ivy, publish your project to Ivy, or compile each project. They will also come in handy when you want to point a continuous integration server at each of your projects.
Second: where dependencies come from.
Answer: they come from your Ivy repository (which could be as simple as a network shared file system).
You have created your repository, so you have control as to its contents.
Be very careful with 3rd party binaries that get installed in the GAC; Visual Studio makes these a pain in the a^^ to deal with.
Specifically:
How to properly verify that all needed dependencies (3rd party libraries, for example) are indeed taken from the correct location?
Ivy gives you great flexibility with dependencies, and it also resolves transitive dependencies; e.g., you can depend on SDK rev="1.+" status="latest.release", which would mean "the latest stable 1.x release of the SDK", or on SDK rev="2.+" status="latest.integration", which would mean the latest available binary of the 2.x SDK (probably as produced by a continuous integration build).
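In an ivy.xml file, such a dependency looks roughly like this (a fragment; the org and module names are placeholders):

<!-- the plugin declares which SDK binaries it builds against -->
<dependencies>
    <dependency org="mycompany" name="SDK" rev="1.+" />
</dependencies>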
So you will always depend on compiled binaries, never on project output. And you can control which version of the binaries to get. 3rd party dependencies will probably be brought in as transitive upon your SDK.
This also means that the amount of code in your projects will stay as small as you need to have workable Visual Studio solutions. It also means that refactoring tools like ReSharper will be a lot less useful. There will also be a certain amount of complexity concerning your build scripts and your branching strategy. That depends a lot on the logic of your components.
This is a brief overview; if you think this is the sort of thing you want, I can expand the answer. Good luck; the .NET ecosystem, and Visual Studio in particular, isn't really designed to work like this.