NuGet Dependencies in .NET Core - C#

If I install some NuGet package Package1, it is added under Dependencies/Packages/Package1. When I then install another NuGet package Package2 that depends on Package1, Dependencies/Packages/Package2/Package1 appears as well.
In this case I have right now this:
Dependencies
|_Packages
  |_Package1
  |_Package2
    |_Package1
So Package1 appears twice. Should I remove Dependencies/Packages/Package1, or is it OK like this? Doesn't it take up more space?

It's fine, assuming both your direct dependency and the indirect one use the same major version. If they have different major versions, you could be in trouble, as they may well be incompatible. (This is a weakness in .NET versioning at the moment, IMO.)
You can remove the direct dependency if you want - unless you want a later version than Package2 depends on. For example, if Package2 depends on Package1 version 1.2.0, but you want something that's only in Package1 version 1.5.0, it's fine for you to state that dependency explicitly. Only one version of Package1 will end up being deployed.
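For instance, the explicit direct dependency from the example above could be stated in the csproj like this (package names and versions are illustrative):

```xml
<ItemGroup>
  <!-- Direct dependency pinned to the newer version you need -->
  <PackageReference Include="Package1" Version="1.5.0" />
  <!-- Package2 itself depends on Package1 1.2.0, but NuGet will
       resolve and deploy only Package1 1.5.0 -->
  <PackageReference Include="Package2" Version="1.0.0" />
</ItemGroup>
```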

This user-interface feature isn't showing you files on disk. It's a logical hierarchy of dependencies and Nuget doesn't store downloaded packages like this physically. You can't "remove" them because the UI is showing you a statement of fact - this package does depend on these other packages.
(It took me a while to understand what you were asking because I was looking for this structure on disk, and could not reproduce this.)

If Package2/Package1 contains everything Package1 contains by itself, you won't need to reference it twice.

Related

How to ignore NuGet NU1605?

I have a huge solution with many projects and in-house NuGet packages that has a pervasive dependency on Unity 4.0.1. We are evaluating migrating this solution to Unity 5.11.1 to improve performance and solve random DI-related crashes stemming from code that the Unity project outright deleted in the 5.0.0 release.
In searching for a way to ease the migration from the outside in, two tools have been developed:
A Roslyn-based source code converter
A bridge that implements the Unity 5 interface but in reality maps calls transparently to a wrapped Unity 4 container interface
Both tools pass their unit tests just fine, and the converter managed to convert one key "leaf" project. However, we've hit a roadblock when trying to reference the migrated leaf project from one inner project: the infamous NU1605.
I can absolutely see how the NU1605 error is warranted, as the inner project still references Unity 4.0.1 and the leaf project references Unity 5.11.1. However, this is one case of the tools getting in our way: I need both versions to "co-exist", as I am manually bridging their inconsistencies.
On paper, this should be plenty viable as the DLLs have different versions and even namespaces are different.
Is there any way to "force" nuget into accepting this weird setup?
You have two options to suppress that warning. One is to use the <NoWarn>NU1605</NoWarn> MSBuild property (it must be defined inside a PropertyGroup). Visual Studio's project properties probably has a way to edit it in the UI.
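The first option might look like this in the project file (appending to any NoWarn values already set):

```xml
<PropertyGroup>
  <!-- suppress the package-downgrade warning for the whole project -->
  <NoWarn>$(NoWarn);NU1605</NoWarn>
</PropertyGroup>
```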
The other option is to add the NoWarn="NU1605" metadata to your PackageReference item:
<PackageReference Include="package id" Version="1.2.3" NoWarn="NU1605" />
Finally, NuGet actually reports NU1605 as a warning, which you might notice if you read the docs page title carefully. The .NET Core SDK elevates it to an error using the WarningsAsErrors property. So, if you're sufficiently proficient with MSBuild, you could either remove it after the SDK adds it, or check how to prevent it from being added to the list. My guess as to the motivation: the BCL is distributed as packages for .NET Core 1.x and 2.x (it won't be for 3.x), and when there's a security update you don't want NuGet's nearest-wins rule to cause a package with a known vulnerability to be used accidentally.

Locally developed nuget with dependent projects

Let's say I have project A and project B. Project A depends on project B. So A would normally have a direct reference to B's DLL.
Then I decided to publish B as a nuget package. Now, A has a reference to B via nuget instead of a local DLL.
The downside to this arrangement is that if I update B, I need to upload a new nuget and wait for it to be available in order to use it from A.
I see that I can point to a local nuget package when updating A's reference to B so that does help a bit. However, if I make a change to B, I do still have to go through the steps of generating a new package and updating A's reference to the package before A will see the change. With the pre-nuget arrangement, I simply built B and A would see the changes.
An alternative is to remove A's reference to B's nuget package and revert to pointing to the local DLL while doing local development. However, if A is published to github, then the reference has to be reverted to a nuget reference before pushing to github.
What's the best practice in a situation like this? Surely many folks are dealing with this sort of thing with github and nuget being widely used.
UPDATE
This topic came up for discussion on the C# subreddit and some interesting approaches were pointed out.
Azure devops - original comment - thanks to B0dona
What we do is use Azure DevOps (https://azure.microsoft.com/en-us/services/devops/) to build and push our NuGet packages to our own repository (nuget.org is slow) automatically after a new commit has been pushed.
You set it up once, push project B, drink some coffee, and enjoy the joy that is updating packages.
git submodules - original comment - thanks to alkrun
You mention GitHub so I'll propose something a bit different:
In my opinion, if work on project A is likely to cause changes in
project B, A referencing B as a git submodule is much easier than
dealing with nuget. Git Submodules aren't without their headaches, but
this is what they were designed for. Some of the benefits:
1) The biggest is that if you say "If I need to change B then I'll just make the change and push to get a new package built, then test it out in A", that's not a very fluid model to work with, and it's asking developers to push untested code into B. Then that 1-3 minute turnaround for the CI build/package turns into 3 x 1-3 minute turnarounds, and it just feels horrible when I've tried it in the past. It's a massive productivity killer.
2) Other options that involve changing csproj files can work, but they're very error-prone. Developers aren't always great about reviewing all changes before they commit, so you're going to have developers forgetting, checking in the changed project reference, and causing build failures.
3) Using B as a submodule for A doesn't prevent you from having
automated builds on B that produce nuget packages, and maybe other
projects which are less likely to change B could/should refer to those
4) At some point in the development of A, if it matures and becomes
less likely to cause changes in B, then you can switch A->B to a nuget
package reference also
Another option, I remember reading an article years ago where someone
had created a tool that generated an msbuild xproj file that would
replace package references with project references. They had it set up
where the xproj file was on the .gitignore and if the file didn't
exist, the nuget package reference was used. In this way, a developer
would run a command to switch over to project references, make the
changes they need, commit them, push changes, then update the nuget
reference. This seemed fairly clean to me, but I can't find the
article and it was a bit more complicated than I'm making it sound.
So I'd still go with the git submodule route. There are some quirks
with git submodules but they have gotten a lot easier to deal with in
the last couple years and it feels like the quirks are easier to
explain than the other options. Having used this for a few projects
now, it's very intuitive. Understand what a Git Submodule in a
detached head state is and how to avoid it, make sure to use the -b
option when adding a submodule, and find a good tool that can
handle submodules well. For what it's worth, VS Code is the most
intuitive interface I've found for working with submodules. Open A in
VS Code then switch to the version control tab and you'll see A and
any submodules listed up top along with the branches they're tracking.
Commit changes to the submodule, then to the parent and you're good to
go.
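As a rough sketch of the submodule workflow described above (repository URL, branch, and paths are hypothetical):

```shell
# Add B as a submodule of A, tracking a branch (note the -b option
# mentioned above, which helps avoid a detached HEAD later).
git submodule add -b main https://github.com/example/B.git libs/B

# After cloning A elsewhere, fetch the submodule contents:
git submodule update --init

# Pull the latest commits of the tracked branch into the submodule:
git submodule update --remote libs/B
```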
The Node community demonstrates a more robust approach. The npm link command, available out of the box (see also this blog post), creates a symbolic link from the distributed package directory to the package source directory. In this way:
package consumers are transparently redirected to package source project
changes in package sources are automatically reflected on the consumer side
The npm link approach also has these advantages over reference switching:
no changes are made to the consumer's source code (which could otherwise be accidentally committed)
works cross-platform and doesn't need a specific IDE
can be scripted and shared with the team
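For reference, the npm link workflow on the Node side looks roughly like this (package and directory names are made up):

```shell
# In the package's source directory: create a global symlink to it.
cd ~/src/my-lib
npm link

# In the consuming project: point node_modules/my-lib at that symlink.
cd ~/src/my-app
npm link my-lib

# From now on, edits in ~/src/my-lib are visible in my-app immediately.
```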
This feature is clearly needed in the NuGet community too, but for some reason NuGet still lacks it. There is a feature request https://github.com/NuGet/Home/issues/1821, which indicates they have no plans to add it.
Meanwhile I created a tool, which works similarly to npm link for NuGet packages, you may want to give it a try: https://www.nuget.org/packages/NuLink
If you are using Visual Studio 2017, you can install NuGet Reference Switcher for Visual Studio 2017 extension. Here is a guide how to use it: https://github.com/RicoSuter/NuGetReferenceSwitcher/wiki/Guide

Minimal package for TFS API

I need my C# desktop application to create TFS work items depending on my data. It should not use any further functionality: only connect to TFS and create work items/tasks. I think the solution to this problem should be simple and not require a lot of code or a lot of referenced libraries.
According to https://www.visualstudio.com/en-us/integrate/get-started/client-libraries/dotnet I should use these NuGet packages.
The https://msdn.microsoft.com/en-us/library/bb130322(v=vs.120).aspx article suggests using the Microsoft.TeamFoundation.Common and
Microsoft.TeamFoundation.Client libraries.
I'm a bit confused about what to use, because in the first case the overhead is too big (all the libraries add up to 60 MB when my app is only 10 MB) and a lot of redundant packages are pulled in (WebAPI, SOAP, Azure features). In the second case I can't find those libraries except as parts of different packages.
I don't need help writing the code; I need advice on the lightest-weight functional package that can do this.
There is also the option to use the REST API instead of the client libraries. This removes the need to reference the Microsoft TFS libraries, though you might need other packages for things like JSON handling.
For example, a call to https://{instance}/defaultcollection/{project}/_apis/wit/workitems/${workitemtypename}?api-version={version} will create a WorkItem (source)
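A minimal sketch of that REST call with plain HttpClient, avoiding the TFS client libraries entirely. The server URL, project name, PAT, and api-version below are placeholders/assumptions and may differ for your TFS version:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

// TFS expects the fields of the new work item as a JSON-PATCH document:
// one "add" operation per field you want to set.
var body = "[{\"op\":\"add\",\"path\":\"/fields/System.Title\",\"value\":\"My task\"}]";

var request = new HttpRequestMessage(
    HttpMethod.Post,
    "https://tfs.example.com/DefaultCollection/MyProject/_apis/wit/workitems/$Task?api-version=1.0")
{
    Content = new StringContent(body, Encoding.UTF8, "application/json-patch+json")
};

// Basic auth with a personal access token: empty username, PAT as password.
var pat = Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + "YOUR_PAT"));
request.Headers.Authorization = new AuthenticationHeaderValue("Basic", pat);

// With a real server you would now send it:
// var response = await new HttpClient().SendAsync(request);
Console.WriteLine(request.Content.Headers.ContentType);
```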
That NuGet package includes lots of assemblies covering all aspects of TFS (source control, builds, work items, etc.). You can add the NuGet package to your project but reference only the assemblies you actually need for work items; that should cut down the size of your compiled application package.

Is it possible to get SLAB working with Microsoft.Diagnostics.Tracing.EventSource?

The official release notes say:
Improved compatibility with the EventSource nuget package
SLAB's source must be updated and rebuilt to work with the EventSource nuget package (which supports channels, but does not
support sampling). The process is now fairly painless.
Added references to the EventSource nuget package to all projects
Changed System.Diagnostics.Tracing to Microsoft.Diagnostics.Tracing in all source files
Defined the EVENT_SOURCE_PACKAGE constant in the unit test project (to disable tests that cannot possibly work with the nuget version).
This is a bit cryptic. Something seems backwards because I can't see any references at all to Microsoft.Diagnostics.Tracing in the Nuget download.
Or are the sub-bullets things that you have to do to get it to build (so it should say, Add, Change, Define instead of Added, Changed, Defined)?
Hm, well those instructions (if they are instructions) are not sufficient:
There are three places where Microsoft.Diagnostics.Tracing is already referenced, so that gives duplicate warnings
There are multiple places where ambiguities appear between Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings (which is a class) and Microsoft.Diagnostics.Tracing.EventSourceSettings (which is an enum).
A bit of detective work and common sense:
The last release date of SLAB is 25 July 2014, there have been a whole load of versions of Microsoft.Diagnostics.Tracing.EventSource, including one which presumably innocently introduced EventSourceSettings.
If I install and reference version 1.0.26, the instructions work.
Now just have to find out what things from version 1.1.28 are missing, and whether I miss them.
So, I just made SLAB work with the NuGet EventSource packages by following the directions above with SLAB 1.1.28 and the latest NuGet EventSource from the Microsoft.Diagnostics.Tracing.EventSource namespace.
Essentially, you need to fix up the ambiguous references between Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings and Microsoft.Diagnostics.Tracing.EventSourceSettings, just as it says above.
You want the Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings reference.
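One way to express that choice is a using alias, so the short name resolves to the SLAB class. In this sketch the two colliding types are stubbed so the snippet stands alone; in the real solution they come from the SLAB and EventSource assemblies:

```csharp
using System;

// Stand-ins for the two colliding names from the question.
namespace Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration
{
    public class EventSourceSettings { }          // SLAB's configuration class
}

namespace Microsoft.Diagnostics.Tracing
{
    public enum EventSourceSettings { Default }   // the NuGet package's enum
}

namespace Demo
{
    // The alias makes the short name unambiguous inside this file.
    using EventSourceSettings =
        Microsoft.Practices.EnterpriseLibrary.SemanticLogging.Etw.Configuration.EventSourceSettings;

    public static class Program
    {
        public static void Main()
        {
            var settings = new EventSourceSettings();  // resolves to the SLAB class
            Console.WriteLine(settings.GetType().FullName);
        }
    }
}
```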
It works, produces flat file logs and writes to the Event Viewer, and when used in conjunction with a controller like PerfView, produces ETL files for detailed analysis.
Next, I'll be testing the out-of-process case.
@Benjol's detective work is correct.
Many users wanted to be able to use EventSource channels (which is included in the EventSource NuGet package) with SLAB so compatibility was improved to make compiling against the EventSource package at the time of release quite painless.
However, SLAB has not been updated recently but the EventSource Package continues to add/modify features. Some of these could be breaking changes with the current SLAB implementation. Since compatibility with subsequent releases of EventSource might not have been tested (I'm not sure what the team has done on this) there could be potential issues.

How to resolve NuGet dependency hell

I develop a library with some functionality, named CompanyName.SDK, which must be integrated into the company project CompanyName.SomeSolution.
CompanyName.SDK.dll must be deployed via a NuGet package.
And the CompanyName.SDK package has dependencies on 3rd-party NuGet packages. As a good example, let's take Unity. The current dependency is on v3.5.1405-prerelease of Unity.
CompanyName.SomeSolution.Project1 depends on Unity v2.1.505.2.
CompanyName.SomeSolution.Project2 depends on Unity v3.0.1304.1.
Integrating CompanyName.SDK into this solution adds dependency on Unity v3.5.1405-prerelease.
Let's take that CompanyName.SomeSolution has one runnable output project CompanyName.SomeSolution.Application that depends on two above and on CompanyName.SDK
And here the problems begin. The Unity assemblies have the same names in all packages, without a version specifier, and in the target folder there will be only one version of the Unity assemblies: v3.5.1405-prerelease, via bindingRedirect in app.config.
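Such a redirect in app.config looks roughly like this (the publicKeyToken and exact version numbers are placeholders):

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Microsoft.Practices.Unity"
                        publicKeyToken="PLACEHOLDER" culture="neutral" />
      <!-- every request for an older Unity assembly is redirected
           to the single deployed version -->
      <bindingRedirect oldVersion="0.0.0.0-3.5.1405.0" newVersion="3.5.1405.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```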
How can code in Project1, Project2 and SDK use exactly needed versions of dependent packages they were coded, compiled and tested with?
NOTE1: Unity is just an example, real situation is 10 times worse with 3rdparty modules dependent on another 3rdparty modules which in turn has 3-4 versions simultaneously.
NOTE2: I cannot upgrade all packages to their latest versions because some packages depend on not-the-latest versions of other packages.
NOTE3: Suppose the dependent packages have breaking changes between versions. This is the real problem and the reason I'm asking this question.
NOTE4: I know about the question on conflicts between different versions of the same dependent assembly, but the answers there do not solve the root of the problem - they just hide it.
NOTE5: Where the hell is that promised solution to the "DLL Hell" problem? It is just reappearing from another direction.
NOTE6: If you think that using the GAC is somehow an option, then please write a step-by-step guide or give me a link.
The Unity package isn't a good example, because you should use it in only one place, called the Composition Root. And the Composition Root should be as close as possible to the application entry point. In your example that is CompanyName.SomeSolution.Application.
Apart from that, where I work now exactly the same problem appears. From what I see, the problem is often introduced by cross-cutting concerns like logging. The solution you can apply is to convert your third-party dependencies into first-party dependencies, by introducing abstractions for those concepts. Doing this has other benefits too:
more maintainable code
better testability
getting rid of unwanted dependencies (does every client of CompanyName.SDK really need the Unity dependency?)
So, let's take as an example an imaginary .NET Logging library:
CompanyName.SDK.dll depends on .NET Logging 3.0
CompanyName.SomeSolution.Project1 depends on .NET Logging 2.0
CompanyName.SomeSolution.Project2 depends on .NET Logging 1.0
There are breaking changes between versions of .NET Logging.
You can create your own first-party dependency by introducing ILogger interface:
public interface ILogger
{
    void LogWarning();
    void LogError();
    void LogInfo();
}
CompanyName.SomeSolution.Project1 and CompanyName.SomeSolution.Project2 should use the ILogger interface; they then depend on the first-party ILogger abstraction. Now you keep the .NET Logging library behind one place, and it's easy to perform an update because you have to do it in only one place. Breaking changes between versions are also no longer a problem, because only one version of the .NET Logging library is used.
The actual implementation of ILogger interface should be in different assembly and it should be only place where you reference .NET Logging library.
In CompanyName.SomeSolution.Application in place where you compose your application you should now map ILogger abstraction to concrete implementation.
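A minimal sketch of that layout, with a stand-in for the imaginary .NET Logging library (in a real solution the adapter assembly would be the only one referencing the NuGet package; the message parameter is added here for a slightly more realistic signature):

```csharp
using System;

// First-party abstraction, as above.
public interface ILogger
{
    void LogWarning(string message);
    void LogError(string message);
    void LogInfo(string message);
}

// Stand-in for the imaginary third-party ".NET Logging" v3.0 API.
public class DotNetLoggingV3
{
    public void Write(string level, string message) =>
        Console.WriteLine($"{level}: {message}");
}

// The adapter is the ONLY type that touches the third-party library,
// so a version bump of .NET Logging is absorbed in this one class.
public sealed class DotNetLoggingAdapter : ILogger
{
    private readonly DotNetLoggingV3 _inner = new DotNetLoggingV3();

    public void LogWarning(string message) => _inner.Write("WARN", message);
    public void LogError(string message)   => _inner.Write("ERROR", message);
    public void LogInfo(string message)    => _inner.Write("INFO", message);
}
```

In the composition root of CompanyName.SomeSolution.Application you would then map ILogger to DotNetLoggingAdapter in your container of choice.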
We are using that approach, and we also use NuGet to distribute our abstractions and our implementations. Unfortunately, version issues can appear with your own packages too. To avoid them, apply Semantic Versioning to the packages you deploy via NuGet for your company. If something changes in your code base that is distributed via NuGet, you should change the version in all of the packages that are distributed via NuGet. For example, we have on our local NuGet server:
DomainModel
Services.Implementation.SomeFancyMessagingLibrary (that references DomainModel and SomeFancyMessagingLibrary)
and more...
Versions between these packages are synchronized: if the version changes in DomainModel, Services.Implementation.SomeFancyMessagingLibrary gets the same version. If our applications need an update of our internal packages, all dependencies are updated to the same version.
You can work at post-compilation assembly level to solve this issue with...
Option 1
You could try merging the assemblies with ILMerge
ilmerge /target:winexe /out:SelfContainedProgram.exe Program.exe ClassLibrary1.dll ClassLibrary2.dll
The result will be an assembly that is the sum of your project and its required dependencies. This comes with some drawbacks, like sacrificing mono support and losing assembly identities (name, version, culture etc.), so this is best when all the assemblies to merge are built by you.
So here comes...
Option 2
You can instead embed the dependencies as resources within your projects as described in this article. Here is the relevant part:
At run-time, the CLR won’t be able to find the dependent DLL
assemblies, which is a problem. To fix this, when your application
initializes, register a callback method with the AppDomain’s
ResolveAssembly event. The code should look something like this:
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) => {
    String resourceName = "AssemblyLoadingAndReflection." +
        new AssemblyName(args.Name).Name + ".dll";
    using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName)) {
        Byte[] assemblyData = new Byte[stream.Length];
        stream.Read(assemblyData, 0, assemblyData.Length);
        return Assembly.Load(assemblyData);
    }
};
Now, the first time a thread calls a method that references a type in
a dependent DLL file, the AssemblyResolve event will be raised and the
callback code shown above will find the embedded DLL resource desired
and load it by calling an overload of Assembly’s Load method that
takes a Byte[] as an argument.
I think this is the option I would use if I were in your shoes, sacrificing some initial startup time.
Update
Have a look here. You could also try using the <probing> tag in each project's app.config to define a custom sub-folder for the CLR to search when it resolves assemblies.
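A <probing> element in app.config looks roughly like this (the folder names are examples):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- the CLR will also search these sub-folders of the app base
           directory when resolving assemblies -->
      <probing privatePath="libs;libs\thirdparty" />
    </assemblyBinding>
  </runtime>
</configuration>
```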
