Let's say I have project A and project B. Project A depends on project B. So A would normally have a direct reference to B's DLL.
Then I decided to publish B as a nuget package. Now, A has a reference to B via nuget instead of a local DLL.
The downside to this arrangement is that if I update B, I have to publish a new package and wait for it to become available before I can use it from A.
I see that I can point A's reference at a local nuget package, which helps a bit. However, if I make a change to B, I still have to go through the steps of generating a new package and updating A's reference to it before A sees the change. With the pre-nuget arrangement, I simply built B and A saw the changes.
An alternative is to remove A's reference to B's nuget package and point back at the local DLL during local development. However, if A is published to GitHub, the reference has to be switched back to a nuget reference before pushing.
What's the best practice in a situation like this? Surely many folks are dealing with this sort of thing, given how widely used GitHub and nuget are.
UPDATE
This topic came up for discussion on the C# subreddit and some interesting approaches were pointed out.
Azure devops - original comment - thanks to B0dona
What we do is use azure devops (
https://azure.microsoft.com/en-us/services/devops/ ) to build and push
our nuget packages to our own repository (nuget.org is slow)
automatically after a new commit has been pushed.
You set it up once, push project B, drink some coffee, and enjoy the joy
that is updating packages.
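As a sketch, such a pipeline can be described in an azure-pipelines.yml using the built-in DotNetCoreCLI and NuGetCommand tasks; the trigger branch, project path, and feed name below are hypothetical:

```yaml
trigger:
  - main        # pack and push on every commit to this branch (hypothetical)

pool:
  vmImage: ubuntu-latest

steps:
  # Pack project B into a .nupkg (path is hypothetical)
  - task: DotNetCoreCLI@2
    inputs:
      command: pack
      packagesToPack: src/ProjectB/ProjectB.csproj
  # Push the package to a private Azure Artifacts feed (feed name is hypothetical)
  - task: NuGetCommand@2
    inputs:
      command: push
      nuGetFeedType: internal
      publishVstsFeed: MyPrivateFeed
      packagesToPush: $(Build.ArtifactStagingDirectory)/*.nupkg
```

With this in place, every push to B produces a fresh package version that A can pick up with a normal package update.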
git submodules - original comment - thanks to alkrun
You mention GitHub so I'll propose something a bit different:
In my opinion, if work on project A is likely to cause changes in
project B, A referencing B as a git submodule is much easier than
dealing with nuget. Git Submodules aren't without their headaches, but
this is what they were designed for. Some of the benefits:
1) The biggest: the "if I need to change B, I'll just make the change,
push to get a new package built, then test it out in A" model is not
very fluid to work with, and it asks developers to push untested code
into B. That 1-3 minute turnaround for a CI build/package becomes
three 1-3 minute turnarounds, and it feels horrible when I've tried it
in the past. It's a massive productivity killer.
2) The other options that involve changing csproj files can work, but
they're very error prone. Developers aren't always great about
reviewing all changes before they commit them; sooner or later someone
will forget, check in a flipped project reference, and cause build
failures.
3) Using B as a submodule of A doesn't prevent you from having
automated builds on B that produce nuget packages, and other projects
that are less likely to change B could/should still refer to those.
4) At some point in the development of A, if it matures and becomes
less likely to cause changes in B, you can switch the A->B reference
back to a nuget package.
Another option: I remember reading an article years ago where someone
had created a tool that generated an msbuild xproj file to replace
package references with project references. They had it set up so that
the xproj file was in the .gitignore, and if the file didn't exist,
the nuget package reference was used. In this way, a developer would
run a command to switch over to project references, make the changes
they needed, commit them, push the changes, then update the nuget
reference. This seemed fairly clean to me, but I can't find the
article, and it was a bit more complicated than I'm making it sound.
So I'd still go with the git submodule route. There are some quirks
with git submodules but they have gotten a lot easier to deal with in
the last couple years and it feels like the quirks are easier to
explain than the other options. Having used this for a few projects
now, it's very intuitive. Understand what a Git Submodule in a
detached head state is and how to avoid it, make sure to use the -b
option when adding a submodule, and find a good tool that can
handle submodules well. For what it's worth, VS Code is the most
intuitive interface I've found for working with submodules. Open A in
VS Code then switch to the version control tab and you'll see A and
any submodules listed up top along with the branches they're tracking.
Commit changes to the submodule, then to the parent and you're good to
go.
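In practice, the submodule setup described above comes down to a few commands; the repository URL and paths here are hypothetical:

```shell
# Inside project A's repository: add B as a submodule tracking its main branch.
# The -b option (mentioned above) ties the submodule to a branch, which helps
# avoid the detached-HEAD state.
git submodule add -b main https://github.com/example/ProjectB.git libs/ProjectB

# After a fresh clone of A, fetch the submodule content as well:
git submodule update --init --recursive

# Pull in the latest commits from B's tracked branch:
git submodule update --remote libs/ProjectB
```

The `.gitmodules` file this creates records both the URL and the tracked branch, so every clone of A gets the same layout.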
The Node community demonstrates a more robust approach. The npm link command, available out of the box, creates a symbolic link from the distributed package directory to the package source directory. In this way:
package consumers are transparently redirected to package source project
changes in package sources are automatically reflected on the consumer side
The npm link approach also has these advantages over reference switching:
no changes are made to the consumer's source code -- so nothing can be accidentally committed
works cross-platform, doesn't need a specific IDE
can be scripted and shared with the team
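The two halves of the npm link workflow look like this; the package and directory names are hypothetical:

```shell
# In the package source directory:
cd ~/src/my-lib
npm link           # registers a global symlink pointing at this folder

# In the consuming project:
cd ~/src/my-app
npm link my-lib    # node_modules/my-lib now resolves to ~/src/my-lib
```

From then on, edits in the package source are visible to the consumer immediately, with no publish step.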
An equivalent feature is clearly needed in the NuGet community, but for some reason NuGet still lacks it. There is a feature request (https://github.com/NuGet/Home/issues/1821), which indicates there are no plans to add it.
Meanwhile, I created a tool that works similarly to npm link for NuGet packages; you may want to give it a try: https://www.nuget.org/packages/NuLink
If you are using Visual Studio 2017, you can install the NuGet Reference Switcher for Visual Studio 2017 extension. Here is a guide on how to use it: https://github.com/RicoSuter/NuGetReferenceSwitcher/wiki/Guide
Related
I have a huge solution with many projects and in-house NuGet packages that has a pervasive dependency on Unity 4.0.1. We are evaluating migrating this solution to Unity 5.11.1 to improve performance and solve random DI-related crashes stemming from code that the Unity project outright deleted on the 5.0.0 release.
In searching for a way to ease the migration from the outside in, two tools have been developed:
A Roslyn-based source code converter
A bridge that implements the Unity 5 interface but in reality maps calls transparently to a wrapped Unity 4 container interface
Both tools pass their unit tests just fine, and the converter managed to convert one key "leaf" project. However, we've hit a roadblock when trying to reference the migrated leaf project from an inner project: the infamous NU1605.
I can absolutely see how the NU1605 error is warranted, as the inner project still references Unity 4.0.1 while the leaf project references Unity 5.11.1. However, this is a case of the tooling getting in our way: I need both versions to co-exist, as I am manually bridging their inconsistencies.
On paper this should be perfectly viable, as the DLLs have different versions and even the namespaces are different.
Is there any way to "force" nuget into accepting this weird setup?
You have two options to suppress that warning. One is to use the <NoWarn>NU1605</NoWarn> msbuild property (it must be defined inside a PropertyGroup). Visual Studio's project properties probably has a way to edit it in the UI.
The other option is to add the NoWarn="NU1605" metadata to your PackageReference item:
<PackageReference Include="package id" Version="1.2.3" NoWarn="NU1605" />
Finally, NuGet actually reports NU1605 as a warning, which you might notice if you read the docs page title carefully. It is the .NET Core SDK that elevates it to an error via the WarningsAsErrors property. So, if you're sufficiently proficient with MSBuild, you could either remove it from that list after the SDK adds it, or work out how to prevent it from being added in the first place. My guess as to the motivation: the BCL is distributed as packages for .NET Core 1.x and 2.x (it won't be for 3.x), and when there's a security update you don't want NuGet's nearest-wins rule to accidentally select a package with a known vulnerability.
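For reference, the project-wide variant of the first option might look like this in the csproj (an illustrative fragment; appending to $(NoWarn) preserves any suppressions already set elsewhere):

```xml
<PropertyGroup>
  <!-- append NU1605 to any warnings already being suppressed -->
  <NoWarn>$(NoWarn);NU1605</NoWarn>
</PropertyGroup>
```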
I'm working on stabilizing a rather old website so that development can resume on it by adding new features. The previous implementers however had a rather strange approach to modularity - their heart was in the right place, but the execution was... off.
There's one "Main" solution, which, as a front-end guy, I typically work with. It includes the web project that is served by IIS. A lot of (pretty much all) the back-end stuff, however, is brought in via a NuGet package from a private NuGet server (TeamCity).
Now, this seems kind of nice and modular until you have to make a backend change. Previously, if a backend change was required, the team would make the change in the backend solution, then commit and republish the entire package. The front-end solution must then update the NuGet package on its end in order to receive the changes.
This is a nightmare...
I won't even start on the version control situation. Let's just say branching hasn't been a known concept here for a number of years. But I'm here to put it on the straight and narrow.
I was wondering if anybody has experience adding an existing project to a solution where it was previously consumed as a NuGet package. I added the entire backend solution and removed the NuGet dependency, only for the build to blow up in my face declaring that the types from the backend solution no longer existed.
I'm thinking I need to add these new projects to the build order or something? Maybe that's a red herring.
Yours sincerely,
A JavaScript guy who's out of his depth with .NET...
Have you added references to the "new" backend projects in your front-end project? Right-click the project, choose Add -> Reference..., and select the backend project from the Solution list.
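If the projects are SDK-style, the same change can be made by editing the front-end csproj directly: remove the old PackageReference and add a ProjectReference. The package id and path below are hypothetical:

```xml
<ItemGroup>
  <!-- was: <PackageReference Include="MyCompany.Backend" Version="1.2.3" /> -->
  <ProjectReference Include="..\Backend\Backend.csproj" />
</ItemGroup>
```

A ProjectReference also puts the backend into the build order automatically, which addresses the build-order suspicion in the question.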
I need my C# desktop application to create TFS work items based on my data. It should not use any further functionality: only connect to TFS and create work items/tasks. I think the solution should be simple and shouldn't require a lot of code or a lot of referenced libraries.
According to https://www.visualstudio.com/en-us/integrate/get-started/client-libraries/dotnet I should use these nuget packages.
The https://msdn.microsoft.com/en-us/library/bb130322(v=vs.120).aspx article suggests using the Microsoft.TeamFoundation.Common and Microsoft.TeamFoundation.Client libraries.
I'm a bit confused about which to use, because in the first case the memory overhead is too big (all the libraries add up to 60 MB when my app is only 10 MB) and a lot of redundant packages are pulled in (web API, SOAP, some Azure features). In the second case I can't find the libraries except as parts of different packages.
I don't need help writing the code; I need advice about the lightest-weight functional package that can do this.
There is also the option to use the REST API instead of the client libraries. This removes the need to reference the Microsoft TFS libraries, though you might need other packages, such as a JSON library.
For example, a call to https://{instance}/defaultcollection/{project}/_apis/wit/workitems/${workitemtypename}?api-version={version} will create a WorkItem (source)
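The body of that request is a JSON Patch document sent with Content-Type: application/json-patch+json; a minimal example that just sets the title of the new work item (the title value is illustrative):

```json
[
  { "op": "add", "path": "/fields/System.Title", "value": "Created from my desktop app" }
]
```

Additional fields (assignee, description, etc.) are set by adding more "add" operations with the corresponding /fields/... paths.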
That NuGet package includes lots of assemblies covering all aspects of TFS (source control, builds, work items, etc.). You can add the NuGet package to your project but reference only the assemblies you actually need for work items; that should cut down the size of your compiled application package.
The official release notes say:
Improved compatibility with the EventSource nuget package
SLAB's source must be updated and rebuilt to work with the EventSource nuget package (which supports channels, but does not
support sampling). The process is now fairly painless.
Added references to the EventSource nuget package to all projects
Changed System.Diagnostics.Tracing to Microsoft.Diagnostics.Tracing in all source files
Defined the EVENT_SOURCE_PACKAGE constant in the unit test project (to disable tests that cannot possibly work with the nuget version).
This is a bit cryptic. Something seems backwards, because I can't see any references at all to Microsoft.Diagnostics.Tracing in the NuGet download.
Or are the sub-bullets things that you have to do to get it to build (so it should say Add, Change, Define instead of Added, Changed, Defined)?
Hm, well those instructions (if they are instructions) are not sufficient:
There are three places where Microsoft.Diagnostics.Tracing is already referenced, so that gives duplicate warnings
There are multiple places where ambiguities appear between Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings (which is a class) and Microsoft.Diagnostics.Tracing.EventSourceSettings (which is an enum).
A bit of detective work and common sense:
The last release date of SLAB is 25 July 2014; there have been a whole load of versions of Microsoft.Diagnostics.Tracing.EventSource since, including one which presumably innocently introduced EventSourceSettings.
If I install and reference version 1.0.26, the instructions work.
Now just have to find out what things from version 1.1.28 are missing, and whether I miss them.
So, I just made SLAB work with the NuGet EventSource packages by following the directions above with SLAB 1.1.28 and the latest NuGet EventSource from the Microsoft.Diagnostics.Tracing.EventSource namespace.
Essentially, you need to fix up the ambiguous references between Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings and Microsoft.Diagnostics.Tracing.EventSourceSettings, just as it says above.
You want the Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings reference; a using alias for the fully qualified name is one way to disambiguate it at each use site.
It works, produces flat file logs and writes to the Event Viewer, and when used in conjunction with a controller like PerfView, produces ETL files for detailed analysis.
Next, I'll be testing the out-of-process case.
@Benjol's detective work is correct.
Many users wanted to be able to use EventSource channels (which are supported by the EventSource NuGet package) with SLAB, so compatibility was improved to make compiling against the EventSource package at the time of release quite painless.
However, SLAB has not been updated recently, while the EventSource package continues to add and modify features, some of which could be breaking changes for the current SLAB implementation. Since compatibility with subsequent releases of EventSource might not have been tested (I'm not sure what the team has done on this), there could be potential issues.
At work we have a tracing library that has to be referenced for all applications.
Recently, following a major version change, the trace lib name changed as well:
From
dotnet_tracing-w32r-1-2
To
dotnet_tracing-w32r-2-0
This broke several of our pre-packaged solutions (projects with a main branch that gets forked for customization for specific customers).
What I'm trying to figure out: is there any way to (auto-magically) reference one OR the other? Having the version in the filename is screwing everything up, and I really don't want to maintain two separate branches of these projects.
To solve this problem, we used conditional references.
First, I created two different build configurations and, based upon those configurations, used conditional references to reference the proper assembly. Finally, we use some post-build scripting to generate our two different NuGet packages and publish them to our NuGet feed.
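A conditional reference keys the assembly choice off $(Configuration); here is a sketch with hypothetical configuration names and HintPaths:

```xml
<ItemGroup Condition="'$(Configuration)' == 'TracingV1'">
  <Reference Include="dotnet_tracing-w32r-1-2">
    <HintPath>..\libs\dotnet_tracing-w32r-1-2.dll</HintPath>
  </Reference>
</ItemGroup>
<ItemGroup Condition="'$(Configuration)' == 'TracingV2'">
  <Reference Include="dotnet_tracing-w32r-2-0">
    <HintPath>..\libs\dotnet_tracing-w32r-2-0.dll</HintPath>
  </Reference>
</ItemGroup>
```

Building each configuration then produces a binary against the matching tracing library, which the post-build scripts can package separately.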