Problem Statement:
Optimal way of importing dependent DLLs in a PowerShell script.
Explanation:
I have a DLL, a.dll, which has almost 10 dependencies (b.dll, c.dll, ...). When I import a.dll from NuGet package A in a PowerShell script, it throws an error because the dependent DLLs are not present in the same directory. I only have the DLLs inside NuGet packages, which are laid out as usual:
├── Nuget-A
│   └── 1.0
│       └── a.dll
├── Nuget-B
│   └── 2.0
│       └── b.dll
In the PowerShell script I import a.dll:
Import-Module "Nuget-A/1.0/a.dll"
This throws an error, so I do the following instead:
Import-Module "Nuget-B/2.0/b.dll"
Import-Module "Nuget-A/1.0/a.dll"
I have to do the same for each of the 10 DLLs.
What is the optimal way of handling this scenario? I am open to other approaches too.
I'm not terribly experienced with PowerShell, but I expect that if all the assemblies were in the same directory, then when you import a.dll, it will automatically find the referenced assemblies, such as b.dll. That's how the .NET runtime loads assemblies, more or less, regardless of whether the host is PowerShell, a console app, a web app, and so on.
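As a quick consumer-side illustration of that idea (a minimal sketch; the Nuget-A/Nuget-B paths are the ones from the question and the libs folder name is arbitrary), you can copy every DLL into one folder and import a.dll once:
# Gather all package DLLs into a single folder so the runtime can resolve b.dll, c.dll, ... itself.
$libDir = Join-Path $PSScriptRoot 'libs'
New-Item -ItemType Directory -Path $libDir -Force | Out-Null
Get-ChildItem -Path 'Nuget-A', 'Nuget-B' -Recurse -Filter '*.dll' |
    Copy-Item -Destination $libDir
Import-Module (Join-Path $libDir 'a.dll')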
Therefore, instead of creating a package A that contains only a.dll and declares a dependency on package B, make package A dependency-free and include all 10 DLLs in it.
There are a few ways to achieve that. One is to reuse csproj's PackAsTool. It publishes the project (using the equivalent of dotnet publish) and then packs everything in the publish folder. Since it's designed to pack console apps so that they can later be installed via dotnet tool install, you might have to do some hackery to get it to work. Another way is to run dotnet publish, get nuget.exe from nuget.org/downloads, cd to the publish directory, run nuget.exe spec to create a template .nuspec file, edit that nuspec with all the metadata you want, and then run nuget.exe pack. It will pack all the files in the directory into a nupkg. There are other ways too (for example, try nugetizer, a community-created tool), but this question is about how to solve referenced-assembly loading, not how to pack, so I'll leave it at that.
My point is that when you have a package that contains a PowerShell cmdlet, you shouldn't need to worry about loading all the dependencies in the PowerShell script/environment consuming the package, so make sure the package is "self contained". It shifts the burden from consuming time to packing time, but at packing time all the dependencies are known, so it's an easier problem to solve.
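For the packing side, here is a rough sketch of the dotnet publish + nuget spec/pack flow described above (the project name A.csproj and the paths are illustrative; it assumes dotnet and nuget.exe are on the PATH):
# Publish so that a.dll and all of its dependencies land in one folder, then pack that folder.
dotnet publish .\A.csproj -c Release -o .\publish
Set-Location .\publish
nuget spec A          # creates a template A.nuspec; edit it with the metadata you want
nuget pack A.nuspec   # with no <files> element, everything in this folder goes into the .nupkg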
Related
I am working on creating a sample Nuget package to test out the process of creating an internal Nuget package for use in another project of mine. My end goal is to create a simple Nuget package, which can be installed onto another simple C# project, and tested out.
I have been following the Microsoft tutorial to create & publish a package using VS:
https://learn.microsoft.com/en-us/nuget/quickstart/create-and-publish-a-package-using-visual-studio-net-framework
I successfully created & published my package on nuget.org, called MyNugetPackage, and attempted to install it onto my other C# project called TestingMyNugetPackage. I received an error in the NuGet package console stating:
Package does not support any target framework
This error makes sense, because I had read about supporting multiple .NET versions and specifying the version under the lib folder, and I definitely did not do that when creating my package:
https://learn.microsoft.com/en-us/nuget/create-packages/supporting-multiple-target-frameworks
This idea of a lib folder makes sense to me, and I think I understand how to add my target .NET version to it. However, I cannot find this folder anywhere! It's not anywhere in the C# project directory. I assume I may need to create it on my own, but I'm not sure where to put it.
Many tutorials and SO questions I have read about this topic talk about how to use the lib folder, but no one ever says where it is. I'm a complete beginner to this and I know I am missing something obvious here, but I'm not sure what it is.
Edit: I did try renaming my .nupkg file to .zip and extracting the contents in an attempt to view the lib folder. The extraction worked, but I did not see any lib folder after expanding the entire tree and searching for "lib".
Here is a quick layout of my C# solution tree:
Solution titled MyNugetPackage with a MyNugetPackage.sln file, a MyNugetPackage.csproj file, and a simple class Logger.cs that just has a public void Print(string text) { Console.WriteLine(text); } method:
MyNugetPackage
├── MyNugetPackage.csproj.1.0.0.nupkg
├── MyNugetPackage.nuspec
├── MyNugetPackage.sln
└── MyNugetPackage (folder)
    ├── bin (folder)
    │   ├── Debug (folder) -> .dll, .pdb
    │   └── Release (folder) -> .dll, .pdb
    ├── obj (folder)
    │   ├── Debug (folder)
    │   └── Release (folder)
    ├── Properties (folder)
    │   └── AssemblyInfo.cs
    ├── Logger.cs
    └── MyNugetPackage.csproj
Could someone direct me to where I need to place my lib folder, so that I can add my supported .NET Framework 4.7 reference and successfully install my package?
A NuGet package (.nupkg) is just a zip file. If you are trying to view the contents of this file, open it like a zip file (using 7-Zip or similar), or alternatively change the extension to .zip. In the package you will find the "lib" folder as well as the .nuspec and the package folder (among other contents). But this is the resulting package that is built when you pack your project; changes here would have no effect on your code.
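For example (the file name is illustrative), from PowerShell you can copy the package to a .zip and expand it to look inside:
Copy-Item .\MyNugetPackage.1.0.0.nupkg .\MyNugetPackage.zip
Expand-Archive .\MyNugetPackage.zip -DestinationPath .\MyNugetPackage-contents
Get-ChildItem .\MyNugetPackage-contents -Recurse   # the lib folder, .nuspec, etc. are in here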
If you're just trying to target one or more frameworks, edit your project file (.csproj) in VS. This file is XML with a PropertyGroup that contains either a "TargetFramework" or a "TargetFrameworks" element. To target a single framework, add a TargetFramework element; to target multiple frameworks, use TargetFrameworks instead.
To target a single .NET framework:
<PropertyGroup>
<TargetFramework>net472</TargetFramework>
</PropertyGroup>
Alternatively, you can target multiple frameworks.
<PropertyGroup>
<TargetFrameworks>net472;netcoreapp3.0;netcoreapp2.1</TargetFrameworks>
</PropertyGroup>
This would target .NET Framework 4.7.2, .NET Core 3.0, and .NET Core 2.1.
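As a follow-up, if the project is (or is converted to) an SDK-style project, the lib folder is generated for you at pack time rather than living in your source tree; a small sketch (configuration and file names illustrative):
dotnet pack .\MyNugetPackage.csproj -c Release
# The resulting .nupkg under bin\Release now contains lib\net472\MyNugetPackage.dll.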
I'm getting my knickers in a twist with 'project' versus 'package' (i.e. NuGet package) references in ASP.NET 5. I'd really like someone to explain a bit more fully the way references are pulled in in ASP.NET 5. How does a 'dnu restore' determine whether something is a project reference rather than a package reference?
I had thought that a reference would be pulled in as a project if the projects were in the same directory, but this is clearly not the whole story. It does appear that you can have a deeper directory nesting and still pick up the project reference.
Here is an outline of my common project structure:
I've got a set of projects, some of which reference one another. There are libraries called TextHelpers and MathHelpers and a project called MainProject. The libraries live in a folder called Libraries, and the MainProject lives in a folder called Tools. This separation is necessary as Libraries and Tools belong to different Git repos:
Root/Libraries/TextHelpers.Project1 - version 1.0.0-*
Root/Libraries/TextHelpers.Project2 - version 1.0.0-*
Root/Libraries/MathHelpers.Project1 - version 1.0.0-*
Root/Libraries/MathHelpers.Project2 - version 1.0.0-*
Root/Tools/MainProject - version 1.0.0-*
Usually MainProject references the libraries as Nuget packages from a private Nuget repository (just a folder on the file system) which serves the libraries.
While I'm building MainProject, however, sometimes I need to make a change to one of the library projects, or sometimes I'd like to step into the files without using a NuGet symbol server. For this reason, I'd like to switch to referencing the (live) projects rather than the (static) NuGet packages. How would I do this?
I've discovered this much so far: if I have a global.json file, a 'dnu restore' creates a project.lock.json with 'project' rather than 'package' references. Is this the whole story?
dnu and dnx look in the following folders:
The folder where the current project is (that means the parent folder of the folder containing the project.json of the current project). E.g. if you have repo/src/project1/project.json it will look in repo/src
Any other folder included in global.json
Then the algorithm is really naive: if it finds a folder whose name matches the package in any of the folders mentioned above, it will assume those are the sources for that package.
For example, if you have
src/P1/project.json
src/System.Collections/project.json
and in src/P1/project.json you have a reference to System.Collections, it will use src/System.Collections instead of the NuGet package System.Collections. Projects take precedence over packages.
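For reference, here is what a minimal global.json along these lines might look like (folder names illustrative; the "projects" key lists the folders dnu/dnx will scan for source projects), written out from PowerShell:
Set-Content -Path .\global.json -Value @'
{
  "projects": [ "src", "test" ]
}
'@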
Caveats:
Since the algorithm looks in the current project's parent folder and in everything listed in global.json, you might be able to reference some projects from one folder but not from another. In my previous example, if you added a test/T1/project.json project but src were not in global.json, then the projects in src would reference the System.Collections project while T1 would reference the package (installed in the global packages folder).
There's no verification to see if the project reference is actually that package. If the name matches, it's a match. So an empty project could replace any package.
If you have multiple projects with the same name, you can get into trouble.
Hope this helps and answers your question.
Side note: with dotnet (the tool replacing dnx) you can specify, for every reference, whether you want the project or the package to have higher priority.
I found several problems while using MSBuild from the command line, and I think they are all related. There are also separate threads about them. The problem occurs for an MVC project created in VS2013.
First, what the problem is:
My bin folder contains several *.nlp files plus some extra DLLs.
When project A references project B, which references some 3rd-party DLL, that DLL is not present in the package (or at least not on the server after deployment). However, log4net is also not referenced in project A but only in project B, yet it IS being copied to bin (and to the package).
Environment and settings
My run command is like this:
msbuild projectA.csproj /p:Configuration=Release /p:Platform=AnyCpu /p:VisualStudioVersion=12.0 /T:Build
My machine is running Win 8.1 with the latest updates, with VS2012 and MSBuild (4.0.30319.33440) installed. The server runs Windows Server 2008 R2 with VS2013 installed and a slightly updated MSBuild (4.0.30319.34209).
How it behaves
On my local machine, when I run this command, the build runs OK. When I open the bin folder I can see my 3rd-party DLLs (including log4net) with no extra files. All was built OK.
When I run this command on the server, the same bin folder is now missing my 3rd-party DLLs (but log4net is there!) and there are also some *.nlp files and mscorlib.dll. The build itself returns 9 warnings, mostly this one:
There was a mismatch between the processor architecture of the project
being built "MSIL" and the processor architecture of the reference
"{Several_System_Dll_Are_There}", "AMD64".
And one warning complains about a missing SDK. It is important to note that I can resolve these warnings and solve problem 1 above (the .nlp files) by appending this switch to the command:
/p:FrameworkPathOverride="C:\Program Files\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0"
Probably the weirdest thing is the fact that log4net works OK but my 3rd-party DLLs do not. I checked the csproj files. ProjectA has no reference to log4net; only projectB references it. And the 3rd-party DLLs are referenced exactly the same way as log4net (I checked the csproj in Notepad). The only two differences are that log4net was installed using NuGet, and that log4net is configured in projectA's web.config.
As I was searching through the internet, I found these issues in separate threads, and the solutions usually involved adding some extra settings to projects, editing the csproj, sln, registry, etc. I don't like these solutions, especially when the build works perfectly on my local machine.
The question is: why does the server need the framework path to be specified? Why is it still not copying some 3rd-party DLLs? Why does log4net work? And most importantly, why does it work on my local machine but not on the server?
Update - solution to problem 2
So, it looks like I have a solution for problem 2. The third-party DLL was referenced from the GAC, and based on "Include GAC Assemblies in Bin", it looks like MSBuild can't handle that. Adding <Private>True</Private> to the reference itself (see "MSBuild doesn't copy references (DLL files) if using project dependencies in solution") did not help. I finally had to add the reference to the third-party DLL to projectA as well and add the Private tag there. Now it works.
This is more of a hot-fix than a solution. I don't have time for this... if someone finds a really solid solution, that would be awesome! :-)
I am attempting to publish and consume versioned NuGet packages of class libraries while avoiding headaches for local development. Here is a sample Visual Studio solution layout:
| Libraries
| LibraryA
| LibraryB
| LibraryC
| Applications
| ApplicationD
| ApplicationE
This is a single solution containing both shared class libraries and multiple applications. Currently references to the class libraries by the applications are local in-solution references.
What I would like to do is to publish the libraries (A,B,C) as versioned NuGet packages which are then referenced by the applications as needed (D,E). This allows a change to a shared library to be independent from an update to an application which is deployed. Without this, changing one library could cause the binaries to change in a dozen or more applications, all of which would technically need to be tested. This is undesirable, and versioning with NuGet fixes this.
However, let us say that I want to update the content of LibraryA and ApplicationD at the same time. In order to do this after we have switched to NuGet, I will have to make changes to LibraryA, commit them, wait for the package to be created, tell ApplicationD to update its reference to LibraryA, and then test or develop in ApplicationD. This is far more complicated than simply working with both at the same time using local in-solution references.
What is a better way to get both the robustness of versioned NuGet packages for my shared class libraries while also keeping development simple even if it spans over multiple projects and applications? The only other solutions I have found all involve too much overhead or headache, such as having to constantly change the references for ApplicationD between the NuGet package and the local project.
EDIT: To clarify the premise, this question assumes the following:
The architecture (solution and project organization) cannot be significantly reorganized
Shared libraries are going to change at a non-trivial frequency
Changing a shared library cannot force any application to be updated
Applications can reference different versions of shared libraries
Although it takes some work, it is possible to hand-edit .csproj files in order to set up conditional referencing by adding a Condition attribute to the appropriate references.
EDIT I've moved these conditions onto the ItemGroups, as that is how the production code I mention below works, and there has been mention of this being a possible issue in VS 2013.
<ItemGroup Condition="'$(Configuration)' == 'Debug Local'">
<!-- Library A reference as generated by VS for an in-solution reference, children unmodified -->
<ProjectReference>...
</ItemGroup>
<ItemGroup Condition="'$(Configuration)' == 'Debug NuGet'">
<!-- Library A reference as generated by NuGet, child nodes unmodified -->
<Reference Include="LibraryA">...
</ItemGroup>
This would allow you to have, on projects D & E, configurations of "Debug NuGet" vs. "Debug Local" which reference the libraries differently. If you then have multiple solution files whose configurations are mapped to the appropriate configurations on the projects within, the end user would never see more than "Debug" and "Release" for most operations, since those are the solution configs, and would only need to open the full solution for editing the A, B, & C projects.
Now, as for getting the A, B, & C projects out of the way, you could set them up under a folder marked as a subrepo (assuming you're using an SCM that supports this, such as Git). Most users would never need to pull the subrepo since they're not accessing the ABC projects, and are instead grabbing from NuGet.
Maintenance-wise, I can guarantee that VS will not edit the conditional references and will respect them during compilation - I have gone through both VS 2010 and 2013 (EDIT: Professional version, though I have delved into doing the same with Express) with the same conditional-reference projects at work. Keep in mind that in VS, references can be made version-agnostic, making NuGet the only place where the version needs to be maintained, and that can be done like any other NuGet package. While I'm hopeful, I have NOT tested whether NuGet will fight with the conditional references.
EDIT It may also be prudent to note that conditional references can cause warnings about missing DLLs, but they do not actually hinder compilation or running.
EDIT For those still reading this, I'm now (7/2019) hearing that the IDE isn't as friendly to these changes anymore, and either it or the Package Manager may override them. Proceed with caution, and always read your commits!
Update for .NET Core (2.x ++)
.NET Core 2.x actually has this functionality built in!
If you have a project reference to project A in project B, and project A is a .NET Standard or Core project with proper package information (Properties -> Package with Package id set to your NuGet package ID), then you can have a regular project reference in project B's .csproj file:
<ItemGroup>
<ProjectReference Include="..\..\A\ProjectA.csproj" />
</ItemGroup>
When you pack (dotnet pack) project B, then because of the Package id in project A, the generated .nuspec file will be set up with a NuGet dependency on that package ID, together with any other NuGet references you might have, instead of just including the built DLL file.
<dependencies>
<group targetFramework=".NETStandard2.0">
<dependency id="Project.A" version="1.2.3" exclude="Build,Analyzers" />
<dependency id="Newtonsoft.Json" version="12.0.2" exclude="Build,Analyzers" />
</group>
</dependencies>
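A quick usage sketch (paths illustrative): packing project B is all that's needed for the dependency on Project.A to show up as above.
dotnet pack .\B\ProjectB.csproj -c Release -o .\artifacts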
I know this is a 2-year-old post, but I just found it while facing the same situation. I also found the extension below for VS2015; I'm in the process of testing it and will come back and adjust my answer accordingly.
https://marketplace.visualstudio.com/items?itemName=RicoSuter.NuGetReferenceSwitcherforVisualStudio2015
I also faced a similar problem. One approach that worked was using a local repository (which is basically just a folder on the local machine) and adding a post-build script to the libraries. For example, let's say you need to update your implementation of LibraryA; then include the following 3 steps in your post-build event for LibraryA:
Check if the local repository has that version of the package; if yes, delete it
rd /s /q "%userprofile%\.nuget\packages\LibraryA\$(VersionNumber)"
Create a nuget package
nuget pack LibraryA.csproj
Push it to local repository
nuget push LibraryA.$(VersionNumber).nupkg -Source %userprofile%\.nuget\packages
These steps will make sure that the package is always updated for that version after each build (we had to do this since NuGet packages are immutable).
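Put together, a rough PowerShell equivalent of the three steps might look like this (the version, project path and feed location are placeholders; a real post-build event would pass $(VersionNumber) in):
$version   = '1.0.1'                                   # placeholder version
$localFeed = Join-Path $env:USERPROFILE '.nuget\packages'
# 1. Delete the cached copy of this version, if any (NuGet packages are immutable).
Remove-Item -Recurse -Force -ErrorAction Ignore (Join-Path $localFeed "LibraryA\$version")
# 2. Create the NuGet package.
nuget pack .\LibraryA.csproj -Version $version
# 3. Push it to the local repository.
nuget push ".\LibraryA.$version.nupkg" -Source $localFeed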
Now in ApplicationD, you can point to the local repository (%userprofile%\.nuget\packages) to get LibraryA, so that after each build of LibraryA you will receive an updated version of it in ApplicationD.
PS: In order to get the version number of your library, you can use this: Determine assembly version during a post-build event
Unfortunately, there really isn't a way to have the best of both worlds. Internally in my company, we've mitigated it somewhat with a fast build/deploy process, which counteracts most of the burdens with always referencing a NuGet package. Basically, all of our applications use a different version of the same library hosted in a local NuGet repository. Since we use our own software to build, deploy, and host the packages, it makes it pretty quick to update the library, then update its NuGet package in another solution. Essentially, the fastest workflow we've found is this:
Make changes to library
Automatically build and deploy version of library incremented by 1 to internal NuGet feed
Update NuGet package in consumer application
The whole process from check-in to updating the consuming project takes around 3 minutes. The NuGet repository also has a symbol/source server which helps tremendously with debugging.
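Step 3 in the consuming application can then be as simple as this (package name, version and internal feed URL are illustrative):
dotnet add package MyLibrary --version 1.2.4 --source https://nuget.internal.example/v3/index.json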
In the properties of ApplicationD, go to the "Reference Paths" tab and add the path of the output folder of LibraryA. Then, if you change and build LibraryA, the next build of ApplicationD will use the modified LibraryA.
When you are finished, don't forget to remove the "Reference Paths" and update the referenced NuGet package version.
My not-so-clean yet fastest solution so far is:
Assuming the following two separate solutions:
VS Solution 1: contains libraries published as nuget packages:
Solution1
|_ my.first.library
|_ my.second.library
VS Solution 2: contains applications, which consume one or more of the above libraries as PackageReferences:
Solution2
|_ my.first.application
| |_ depends on nuget my.first.library (let us say v1.0.1)
|
|_ my.second.application
In case I'm making changes to my.first.library, I proceed as follows:
Make code changes to my.first.library and rebuild
Navigate to the build output directory of my.first.library (e.g. <Solution1 directory>/my.first.library/bin/debug/netstandard2.0) and copy the .dll and .pdb files
Navigate to my.first.library's directory inside the currently used local NuGet feed/cache (for example: C:\Users\user.name\.nuget\packages\my.first.library\1.0.1\lib\netstandard2.0) and replace the .dll and .pdb files there with the ones generated in step 1 (possibly making a backup). A scripted version follows below.
Changes get reflected in my.first.application. Continue working and repeat steps 1-4 when needed.
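Steps 2-3 can be scripted, for example (paths and version are illustrative, and you may want to back up the cached files first):
$buildOutput = 'C:\src\Solution1\my.first.library\bin\Debug\netstandard2.0'
$cacheDir    = "$env:USERPROFILE\.nuget\packages\my.first.library\1.0.1\lib\netstandard2.0"
Copy-Item (Join-Path $buildOutput 'my.first.library.dll') $cacheDir -Force
Copy-Item (Join-Path $buildOutput 'my.first.library.pdb') $cacheDir -Force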
Advantages:
being completely local. No secondary nuget feeds needed.
zero changes to .csproj/.sln files
Caution:
While this solution offers you flexibility, make sure you clear your NuGet cache before acting on those packages, for example by publishing to a NuGet server. Thanks @Benrobot
I have a hobby project that is written in C# using MonoDevelop. I've been trying for some time now to get my head around linux packaging, but I keep coming away feeling frustrated and overwhelmed.
My program consists of:
A library project ("Generator") that does stuff with the data created by my program.
An ui ("Interface") project using Gtk#. This project has two subdirectories: "glade" (xml files that gtk uses to build widgets) and "book" (data used by my program).
A utility project ("Utils") used by both the library and interface projects.
A main project ("MyProgramName") that just starts the interface.
What I want to do is (I think) really very simple:
Compile my application
Copy the .exe and .dll files (to /usr/local/bin?)
Copy the "book" directory (to /usr/local/bin?)
Copy the "glade" directory (to /usr/local/bin?)
Oh, and I want to do this as a .deb package. I think if I can get the tarball working, a .deb package shouldn't be too much trouble, but that's what I want to do eventually.
I'm still not really sure how to do this. I've used MonoDevelop to create a Tarball. When I install the tarball (using ./configure, make, sudo checkinstall), it seems to install the executable code (and even create a command to run the program), but forgets about the "book" and "glade" directories.
How would I go about doing this? Sorry if this is a basic/broad question. I've been googling around about this, and I can't seem to find anything that doesn't assume I know the basics of packaging (even if it claims it doesn't assume this).
Debian packages are like tar files - they contain a copy of the file system. To create a Debian package...
Install the tarball in a build directory.
Add a DEBIAN directory with the control files. I found this article helpful.
Create the package with dpkg --build.
I would start by learning GNU's autotools: autoconf and automake. They make it very easy to install the program into a build directory. You mentioned ./configure, so I assume this project already has some of that structure. From the description, it sounds like the project might need...
Entries in configure.in for files in "book" and "glade".
Makefile.am files in "book" and "glade".
Putting it all together, the following commands result in a package file named project.deb.
# ./configure --prefix=/usr
# make && make DESTDIR=$(pwd)/build install
# dpkg --build build project.deb
Perhaps this blog post may be of help to you.
It thoroughly describes the structure of a deb package, which is as follows:
<YOUR PACKAGE NAME>
└── deb
├── DEBIAN
│ ├── conffiles
│ ├── control
│ └── preinst
└── opt
└── <YOUR APPLICATION>
└── <Your Application Contents>
Basically, you have a deb folder inside the package with the following 2 mandatory folders inside:
DEBIAN - containing files that describe the deb package itself
a file-system structure mirroring the destination for the package installation. In the above example, the package will be deployed inside the /opt/<YOUR APPLICATION> directory.
In the DEBIAN directory, you must have at least the control file, which is plain text. It needs to contain entries in a specific format that is described in detail in the linked page. Here is an example (taken from there) of a sample control file:
Package: packagingmono
Version: 1.0
Maintainer: Mikael Chudinov <mikael@chudinov.net>
Architecture: amd64
Section: net
Description: Template for Debian packaged Mono application.
Depends: mono-complete (>=3)
Package must be your package name. Allowed are upper/lower-case Latin letters, numbers and "-".
Version - the package version. I'd recommend using the assembly version for that field.
Maintainer - the package developer(s) name and contact info.
Architecture - either i386 or amd64. If you want to distribute your application optimized for x86 and x64 as separate executables per platform (I mean built explicitly for x64 or x86, not using AnyCPU), then you should produce separate .deb packages for each, and set the Architecture field appropriately. The rest of the fields may still be the same.
Section - optional; can be any of the allowed package categories in the Debian apt system.
Description - consists of two parts: a short description (the text before the first newline) and optionally a longer one (the text after the first newline).
Depends - a list of dependencies of your package. The example states mono-complete, which is the package name for the Mono runtime, and further restricts it to version 3 or higher.
An important thing to know about a deb package is that you can actually put an entire application (the contents of the bin folder) into a single package.
There is no need to put the referenced libraries in separate packages and mark them as dependencies; the latter makes sense only if you plan to install other applications that would rely on the same libraries. Besides, packaging the application as a whole keeps the DLL-hell problem from one day becoming a package-hell problem. A drawback is that the package might become larger.
The article also recommends some native GNU/Linux tools that will aid you in the package creation. For example, xbuild can be used to run an MSBuild file that does the packaging for you, which helps make things more familiar to Windows developers. The lintian tool may also assist you in fixing issues with the produced .deb file. The rest of the tools are intermediate utilities that are invoked during the MSBuild packaging process.