I need to create a csproj file that will be usable as a project reference in VS2013 and will output a prebuilt binary as its "Build" result.
We use referenced projects for the build, but company policy doesn't allow everyone access to some of those projects. As a result, the projects need to be updated manually to make them build. This is a major inconvenience when switching branches and when editing project files, so I want to create a dummy project that is bound to pre-built binaries as its "output" and can be dropped in place of the real projects.
EDIT: Moving that assembly to a NuGet package is not an option for now, since NuGet has some issues with the dev flow (when you need to debug/test/develop the package). I saw a VS extension that implements switching between a NuGet package and a local project, which might solve this issue, but I'm not sure it will be accepted and want to explore other options.
To be clear, the thing I want to avoid is editing the project in any way, so that the project builds cleanly after pulling it from Git and I don't have to clean it up every time before a commit.
I haven't tested it thoroughly, but the solution seems really simple (if I understand the question correctly).
Just add this to the existing .csproj, overriding the Build target so that it simply returns the path to the pre-built assembly.
<Target Name="Build" Returns="$(TargetPath)" />
This assumes the TargetPath property is already defined, which it should be automatically if you're modifying the original .csproj. Otherwise, just define it yourself in a <PropertyGroup> before the Build target.
Note that having TargetPath defined is important so that ProjectReferences to this project resolve correctly.
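If you do need to define it yourself, a minimal sketch would look like the following (the relative path and file name are just placeholders for wherever your pre-built binary actually lives):
<PropertyGroup>
  <!-- placeholder: point this at the pre-built assembly checked in next to the dummy project -->
  <TargetPath>$(MSBuildThisFileDirectory)bin\MyPrebuilt.dll</TargetPath>
</PropertyGroup>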
How about having those restricted (binary-only) projects reside in an internal NuGet package feed, so that NuGet can install the packages as needed, on build?
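If you go that route, pointing consumers at the internal feed is just a nuget.config entry; a sketch, with the feed name and URL as placeholders:
<configuration>
  <packageSources>
    <!-- placeholder name and URL for the company-internal feed -->
    <add key="InternalFeed" value="https://nuget.mycompany.example/v3/index.json" />
  </packageSources>
</configuration>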
I have a VSIX extension which I have migrated to a new solution (basically to remove older projects targeting older VS versions no longer supported by my company) and to simplify the codebase for ease of maintenance.
Within the IDE, it does not matter whether I set the active configuration to Debug|x86 or Release|x86; it builds a VSIX artifact OK. All good so far.
If I use
MSBuild /t:Build /p:Configuration=Release /p:Platform=x86 -restore -detailedSummary MyExtension.sln
it will build without any errors, but no VSIX is produced.
I have pored over the terminal output: there are no warnings or errors, and the DLL outputs of the projects in the solution are produced.
I did read the following:
Project not selected to build for this solution configuration
The option to click deploy from the above link is not available for my VSIX - all the deploy options are disabled.
I have searched S.O. for similar issues regarding a VSIX not being produced, but none seem apt.
How should I debug this? What is different about a command-line MSBuild from the in-IDE build? Hopefully somebody has had a similar experience and can let me know what was causal for them, so that I can give something a try.
Update 1:
It transpired that although I am targeting .NET Framework 4.6, some .csproj references copied over from the migrated project had entries for net472, even though the NuGet packages themselves were selected for compatibility with .NET Framework 4.6.
I had to manually edit a few .csproj files. There were some reference issues in associated projects that then needed fixing.
The residual issue now is as follows:
The in-IDE build fails with a single error...
A PackageReference to Microsoft.Build.* without ExcludeAssets="runtime" exists in your project. This will cause MSBuild assemblies to be copied to your output directory, causing your application to load them at runtime. To use the copy of MSBuild registered by MSBuildLocator, set ExcludeAssets="runtime" on the MSBuild PackageReferences. To disable this check, set the property DisableMSBuildAssemblyCopyCheck=true in your project file (not recommended as you must distribute all of MSBuild + associated toolset). Package(s) referenced: Microsoft.Build.Framework
So I grepped my source code folder for <PackageReference Include="Microsoft.Build and only a single project was in the result list. When I checked this project file, the entry in question did have ExcludeAssets="runtime", so I am unsure why the error is reported. I have tried project cleans followed by rebuilds, and deleting bin and obj folders before building, to no avail.
I guess my question now is whether <Package Include="Microsoft.Build entries are relevant, since these are not <PackageReference Include elements as mentioned in the error message.
Update 2:
I hang my head in shame. PEBKAC regarding the Update 1 error. I had sent a copy of the code to a build engineer, who committed it to a branch in our VCS. I then cloned this branch to a different location and copy-pasted my more recent changes over the top. However, the grep tool (AstroGrep) I was using was still pointing at the older location, which is not in the VCS. The older location contained package references with ExcludeAssets="runtime" as required; the newer location did not. Once I noticed this, I corrected the faulty .csproj file and the error from Update 1 went away.
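For anyone hitting the same error, the corrected entry has this shape (the version number below is illustrative, not the one from my project):
<ItemGroup>
  <!-- ExcludeAssets="runtime" keeps MSBuild assemblies out of the output directory -->
  <PackageReference Include="Microsoft.Build.Framework" Version="16.10.0" ExcludeAssets="runtime" />
</ItemGroup>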
However, I still appear to have the original issue the question is about.
I am awaiting my company's security team to approve the use of MSBuildLog so that I can get more detail and hopefully find the cause.
One other commenter suggested moving to a PackageReference-based build rather than using packages.config. There is a question as to why this is needed. I am aware this could create a significant amount of extra work due to this, for which there are workarounds, but the commenter mentioned a "need" to use NuGet this way, when I think it is optional. I wish to understand more before committing to such a change.
Unfortunately, this is one of those things where it's a case of user beware.
When using NuGet, it can appear to have succeeded in updating a NuGet reference, but unless you check the underlying packages.config meticulously, you may not be getting what you think.
As I am migrating a solution that used packages.config instead of <PackageReference .../> elements in the .csproj files, I have been caught out by changes in the IDE's default behaviour.
NuGet seems to update the .csproj with <PackageReference .../> elements by default, but this does not amend any packages.config entries that already exist. As a result, I ended up with a mish-mash that MSBuild seemed confused about at build time. Rather than throwing an error, it just did not build what was expected.
The old packages.config files had entries targeting net472 in some cases. I was adding NuGet references to earlier package versions for net46, since that is what I need to target now, and this produced the problem behaviour: the unchanged net472 entries were no good for producing the build output.
Since the project also needs to support VS2015, I need to rely on the packages.config approach rather than the <PackageReference .../> approach, which was not updating the older references in the expected way.
As such, I had to remove the NuGet <PackageReference .../> elements and re-introduce the correct package versions in packages.config. Once these were all correct, the VSIX built OK.
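For reference, a packages.config entry pinned to the correct target framework has this shape (the package id and version are illustrative, not the actual ones from my solution):
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- targetFramework records the consuming project's framework at install time -->
  <package id="Newtonsoft.Json" version="9.0.1" targetFramework="net46" />
</packages>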
I need to use a NuGet package containing a utility for my project. It contains several binaries (EXEs and DLLs).
I've added it to my project successfully but I suspect the nupkg isn't formed correctly because I cannot use any of its DLLs or EXEs in my project without manually pointing to the package in my local NuGet cache. When compiling, none of its resources are added to the output (I assume this is because nothing is referenced in my code).
I'd like to create a wrapper project to call the binaries but I'd also like other project devs to be able to compile the solution without adjusting directory variables. Ideally, I could configure the csproj to pull in the bits directly from the local package cache. I think this would be possible by setting the Generate Path Property value to Yes in Visual Studio, but the variable cannot be found when I attempt to use an <Include/> statement in the csproj file.
Is what I'm asking possible? Namely, reference the NuGet package bits within my csproj to ensure the binaries are dropped in the compilation output? Can I do this with the Path Property, or is there something else I can do without directly committing the package's binaries into my project?
(I realize I need to work with the developer to fix whatever issue they have with their package, but I have no direct influence at the moment, so this is the best I can do for now.)
I figured this out; my problem was mostly a misunderstanding of how some of the different tags and attributes are meant to be used.
To achieve the desired effect, I did the following:
<ItemGroup>
  <Content Include="$(Pkg{PackageId})\**">
    <Link>{NameOfSolutionDirectory}\%(RecursiveDir)%(Filename)%(Extension)</Link>
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>
Where {PackageId} is the ID of the NuGet package (this step requires setting 'Generate Path Property' to 'Yes' in the package's properties via Solution Explorer), and {NameOfSolutionDirectory} is the name of a folder within the solution I'd like to use for containing those bits, if you're as concerned about keeping the project organized as I am. The braces should be excluded when replacing these values.
If you want to scope to a specific directory within the package contents, do it within the Include attribute. The ** is necessary if you want to include all files within that directory, or else you can scope by extension or whatever additional pattern you'd like.
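Putting it together with a hypothetical package ID of Contoso.Tools, the relevant parts of the .csproj might look like the sketch below (the package name, version, and the Tools link folder are placeholders). Note that dots in the package ID become underscores in the generated Pkg property name:
<ItemGroup>
  <!-- GeneratePathProperty="true" produces the $(PkgContoso_Tools) property used below -->
  <PackageReference Include="Contoso.Tools" Version="1.0.0" GeneratePathProperty="true" />
</ItemGroup>
<ItemGroup>
  <Content Include="$(PkgContoso_Tools)\tools\**">
    <Link>Tools\%(RecursiveDir)%(Filename)%(Extension)</Link>
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>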
I have a solution with an application project (ASP.NET Core) and multiple library projects. I want to separate some of the library projects into a separate solution and turn them into NuGet packages.
With the libraries in the same solution I could of course simply edit something in a library, run the application and see how it works (and debug, if necessary).
However, when I turn the libraries into a NuGet package, the application references the packages from our private NuGet feed instead of the project file.
My question is: is it possible to locally "override" the package reference and use the local source code instead? That way I could still edit the libraries and see the effects in the application. This is a lot easier than having to publish a new package for every small change (especially when trying to fix an issue or implementing a new feature).
DNT (Dot Net Tools) does this. You can specify which packages to switch and where they are.
See the 'switch-to-packages' and 'switch-to-projects' command line switches.
It's a bit fiddly, as (when I last tried) you had to create a config file that holds the mapping, and it seems to be easy to break the switching. But it's something.
https://github.com/RicoSuter/DNT
I've not tried it, but maybe you can use it to switch back to packages on commit so that the build server works correctly? (Or to ensure the references are correct in source control?)
If you want to consume a library as a NuGet package while also debugging and even modifying its source files, NuGet is not a good fit: for every change you would have to rebuild the package project (to generate the changed DLL), repack it as a NuGet package, and reinstall it before the change takes effect. That is too cumbersome.
Once the package is installed, any changes you make to its source have no effect; the installed package remains the version you packed before the changes. It stays frozen at that point in time unless you repack the project, generate a new .nupkg, and update the package version.
So NuGet is not a good choice for your situation; you should use a ProjectReference instead.
Reference the library's source project directly with a ProjectReference, so that it builds together with your application and you pick up changes immediately.
A ProjectReference can span two different solutions.
Add this to the main project:
<ItemGroup>
  <!-- add any NuGet project's .csproj file like this to debug its source code -->
  <ProjectReference Include="..\xxx\xxx.csproj">
  </ProjectReference>
</ItemGroup>
If the project is outside the solution, you can use the full path to the NuGet project's .csproj to reference it.
I'm not sure what you mean by "override", but you can always add the library project to your ASP.NET Core solution and reference it like a normal project reference. A project referenced within a solution doesn't have to be physically placed in the same folder as the solution itself.
This does, however, require that any developer on the project has both Git repositories cloned locally (given your two solutions are located in separate Git repos) in order to build the ASP.NET Core solution. But I don't really see that as a downside.
I am attempting to publish and consume versioned NuGet packages of class libraries while avoiding headaches for local development. Here is a sample Visual Studio solution layout:
| Libraries
| LibraryA
| LibraryB
| LibraryC
| Applications
| ApplicationD
| ApplicationE
This is a single solution containing both shared class libraries and multiple applications. Currently references to the class libraries by the applications are local in-solution references.
What I would like to do is to publish the libraries (A,B,C) as versioned NuGet packages which are then referenced by the applications as needed (D,E). This allows a change to a shared library to be independent from an update to an application which is deployed. Without this, changing one library could cause the binaries to change in a dozen or more applications, all of which would technically need to be tested. This is undesirable, and versioning with NuGet fixes this.
However, let us say that I want to update the content of LibraryA and ApplicationD at the same time. In order to do this after we have switched to NuGet, I will have to make changes to LibraryA, commit them, wait for the package to be created, tell ApplicationD to update its reference to LibraryA, and then test or develop in ApplicationD. This is far more complicated than simply working with both at the same time using local in-solution references.
What is a better way to get both the robustness of versioned NuGet packages for my shared class libraries while also keeping development simple even if it spans over multiple projects and applications? The only other solutions I have found all involve too much overhead or headache, such as having to constantly change the references for ApplicationD between the NuGet package and the local project.
EDIT: To clarify the premise, this question assumes the following:
The architecture (solution and project organization) cannot be significantly reorganized
Shared libraries are going to change at a non-trivial frequency
Changing a shared library cannot force any application to be updated
Applications can reference different versions of shared libraries
Although it takes some work, it is possible to hand-edit .csproj files in order to set up conditional referencing by adding a Condition attribute to the appropriate references.
EDIT I've moved these conditions onto ItemGroups, as it seems that is how my aforementioned production code works, and there has been mention of this being a possible issue in VS 2013.
<ItemGroup Condition="'$(Configuration)' == 'Debug Local'">
  <!-- Library A reference as generated by VS for an in-solution reference, children unmodified -->
  <ProjectReference>...
</ItemGroup>
<ItemGroup Condition="'$(Configuration)' == 'Debug NuGet'">
  <!-- Library A reference as generated by NuGet, child nodes unmodified -->
  <Reference Include="LibraryA">...
</ItemGroup>
This would allow you to have, on projects D & E, configurations of "Debug NuGet" vs. "Debug Local" which reference the libraries differently. If you then have multiple solution files whose configurations are mapped to the appropriate configurations on the projects within, the end user would never see more than "Debug" and "Release" for most operations, since those are the solution configs, and would only need to open the full solution when editing the A, B & C projects.
Now, as for getting the A, B, & C projects out of the way, you could set them up under a folder marked as a subrepo (assuming you're using an SCM that supports this, such as Git). Most users would never need to pull the subrepo since they're not accessing the ABC projects, and are instead grabbing from NuGet.
Maintenance-wise, I can guarantee that VS will not edit the conditional references and will respect them during compilation; I have gone through both VS 2010 and 2013 (EDIT: the Professional versions, though I have delved into doing the same with Express) with the same conditional-reference projects at work. Keep in mind that in VS, references can be made version-agnostic, making NuGet the only place where the version needs to be maintained, and that can be done like any other NuGet package. While I'm hopeful, I have NOT tested whether NuGet will fight with the conditional references.
EDIT It may also be prudent to note that conditional references can cause warnings about missing DLLs, but they do not actually hinder compilation or running.
EDIT For those still reading this, I'm now (7/2019) hearing that the IDE isn't as friendly to these changes anymore, and either it or the Package Manager may override them. Proceed with caution, and always read your commits!
Update for .NET Core (2.x ++)
.NET Core 2.x actually has this functionality built in!
If you have a project reference to project A in project B, and project A is a .NET Standard or Core project with proper package information (Properties -> Package with Package id set to your NuGet package ID), then you can have a regular project reference in project B's .csproj file:
<ItemGroup>
  <ProjectReference Include="..\..\A\ProjectA.csproj" />
</ItemGroup>
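For context, project A's own .csproj would carry the package metadata along these lines (a sketch; the id and version simply match the dependency snippet shown below):
<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
  <!-- this PackageId is what ends up as the NuGet dependency id when B is packed -->
  <PackageId>Project.A</PackageId>
  <Version>1.2.3</Version>
</PropertyGroup>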
When you pack project B (dotnet pack), then because of the Package id in project A, the generated .nuspec file will be set up with a NuGet dependency on that package ID, together with any other NuGet references you might have, instead of just including the built DLL file.
<dependencies>
  <group targetFramework=".NETStandard2.0">
    <dependency id="Project.A" version="1.2.3" exclude="Build,Analyzers" />
    <dependency id="Newtonsoft.Json" version="12.0.2" exclude="Build,Analyzers" />
  </group>
</dependencies>
I know this is a 2-year-old post, but I just found it while facing the same situation. I also found this for VS2015 and am in the process of testing it. I'll come back and adjust my answer accordingly.
https://marketplace.visualstudio.com/items?itemName=RicoSuter.NuGetReferenceSwitcherforVisualStudio2015
I also faced a similar problem. One approach that worked was using a local repository (which is basically just a folder on the local machine) and adding a post-build script to the libraries. For example, let's say you need to update the implementation of LibraryA; then include the following three steps in the post-build event for LibraryA:
Check if the local repository has that version of the package; if yes, delete it
rd /s /q %userprofile%\.nuget\packages\LibraryA\#(VersionNumber)
Create a NuGet package
nuget pack LibraryA.csproj
Push it to the local repository
nuget push LibraryA.#(VersionNumber).nupkg -Source %userprofile%\.nuget\packages
These steps make sure that the package is always up to date for that version after each build (we had to do this since NuGet packages are immutable).
Now in ApplicationD, you can point to the local repository (%userprofile%\.nuget\packages) to get LibraryA, so that after each build of LibraryA you will receive the updated version of it in ApplicationD.
PS: In order to get the version number of your library, you can use this: Determine assembly version during a post-build event
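Put together, the post-build event in LibraryA's .csproj might look roughly like this sketch. Here #(VersionNumber) is still a placeholder for however you resolve the version, nuget.exe is assumed to be on the PATH, and the "if exist" guard just keeps the delete from failing on a first build:
<PropertyGroup>
  <PostBuildEvent>
    if exist "%userprofile%\.nuget\packages\LibraryA\#(VersionNumber)" rd /s /q "%userprofile%\.nuget\packages\LibraryA\#(VersionNumber)"
    nuget pack "$(ProjectDir)LibraryA.csproj" -OutputDirectory "$(TargetDir)"
    nuget push "$(TargetDir)LibraryA.#(VersionNumber).nupkg" -Source "%userprofile%\.nuget\packages"
  </PostBuildEvent>
</PropertyGroup>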
Unfortunately, there really isn't a way to have the best of both worlds. Internally at my company, we've mitigated it somewhat with a fast build/deploy process, which counteracts most of the burden of always referencing a NuGet package. Basically, all of our applications use a different version of the same library hosted in a local NuGet repository. Since we use our own software to build, deploy, and host the packages, it is pretty quick to update the library and then update its NuGet package in another solution. Essentially, the fastest workflow we've found is this:
Make changes to library
Automatically build and deploy version of library incremented by 1 to internal NuGet feed
Update NuGet package in consumer application
The whole process from check-in to updating the consuming project takes around 3 minutes. The NuGet repository also has a symbol/source server which helps tremendously with debugging.
In the properties of ApplicationD, go to the "Reference Paths" tab and add the path of the output folder of LibraryA. Then, if you change and build LibraryA, the next build of ApplicationD will use the modified LibraryA.
When you are finished, don't forget to remove the "Reference Paths" and update the referenced NuGet package version.
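If you're curious what this does on disk: as far as I can tell, the Reference Paths tab writes a ReferencePath property into the project's .csproj.user file, something like the sketch below (the path is illustrative):
<PropertyGroup>
  <!-- semicolon-separated list of folders VS probes for references before the usual locations -->
  <ReferencePath>C:\src\Libraries\LibraryA\bin\Debug\</ReferencePath>
</PropertyGroup>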
My not-so-clean yet fastest solution so far is:
Assuming the following two separate solutions:
VS Solution 1: contains libraries published as nuget packages:
Solution1
|_ my.first.library
|_ my.second.library
VS Solution 2: contains applications, which consume one or more of the above libraries as PackageReferences:
Solution2
|_ my.first.application
| |_ depends on nuget my.first.library (let us say v1.0.1)
|
|_ my.second.application
In case I'm making changes to my.first.library, I proceed as follows:
Make code changes to my.first.library and rebuild
Navigate to the build output directory of my.first.library (e.g. <Solution1 directory>/my.first.library/bin/debug/netstandard2.0) and copy the .dll and .pdb files
Navigate to my.first.library's directory in the local cache of the NuGet feed currently in use (for example at: C:\Users\user.name\.nuget\packages\my.first.library\1.0.1\lib\netstandard2.0) and replace the .dll and .pdb files there with the ones generated in step 1 (possibly making a backup).
Changes get reflected in my.first.application. Continue working and repeat steps 1-4, when needed
Advantages:
completely local; no secondary NuGet feeds needed.
zero changes to .csproj/.sln files
Caution:
While this solution offers you flexibility, make sure you clear your NuGet cache before acting on them, for example by publishing to a NuGet server. Thanks @Benrobot
I'm trying to figure out what the best way to handle this scenario is.
Let's say I have a library that's referenced by multiple different non-related solutions, let's call it WebServiceInterface.dll. This library has a dependency on JSON.NET.
Before NuGet
The JSON.NET binary was referenced via an SVN external in the WebServiceInterface project. Other solutions which had a dependency on WebServiceInterface referenced the project (also as an SVN external) and as a result pulled in both the project and its dependencies.
With NuGet
I haven't figured out how to force the JSON.NET reference to be stored under the WebServiceInterface project (as opposed to the RandomSolution\packages location). I found references to NuGet project-level and solution-level packages, but I can't seem to find out how to specify this when I add a dependency via NuGet.
The goal here is that when someone checks out WebServiceInterface and adds it to a new solution, it builds (instead of having broken references to JSON.NET that point to the packages directory under whatever the last solution was that checked it in).
When I went to find out if Chris B had created a NuGet issue for this, I couldn't find one. EDIT: He did, see his comment below. But I did find a semi-documented feature of NuGet that I used to solve this problem: Allow specifying the folder where packages are installed
Let me break this question into 2 issues:
getting NuGet to allow for multiple solutions to use the same packages location
getting the NuGet packages to automagically fetch from source control when you include a project that has NuGet packages
Problem 1:
By default NuGet stores packages in a packages folder in the solution's folder. To change that location, create a nuget.config file in the solution's root folder with the following contents:
<settings>
  <repositoryPath>..\..\..\Utilities\Library\nuget.packages</repositoryPath>
</settings>
<repositoryPath> is relative to your solution, so make it whatever you want; just have each solution use its own relative path to the same packages folder.
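For what it's worth, newer NuGet versions use this form for the same setting (same relative path, just a different wrapper; I believe the older <settings> form still works, but this is the documented one):
<configuration>
  <config>
    <add key="repositoryPath" value="..\..\..\Utilities\Library\nuget.packages" />
  </config>
</configuration>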
From that point on, as far as NuGet's flow is concerned, the paths in repositories.config are relative to the folder containing repositories.config, not the solution, so all projects/packages are now managed independently of the solution location.
This allows multiple solutions to use the same packages in source control, and if those solutions use the same projects (that use NuGet packages), those solutions/projects will all be kept in sync no matter which solution updates the package.
Problem 1 completely solved.
Problem 2:
Let me address this from 2 perspectives. This applies to Visual Studio and TFS -- I'll leave SVN for someone else to address.
First: if you have no source code on your drive and do a get of a solution (not a project), I prefer to make it so that you get everything that solution needs to build. There shouldn't be any missing references to go manually grab. That much we can do by adding the package files as solution items. Yes, in each solution. A bit of work, yes, but when it's done the package files will fetch/update from source control automagically.
Second: In a new solution, when you include an existing source control project that has NuGet packages, you have to manually fetch the packages from source control and add them as solution items. At least anyone else getting your solution in the future will automagically get everything they need to successfully build. At least with VS/TFS, this is just the way it is, AFAIK. If projB depends on projA, and you add projB to a new solution, VS/TFS won't automatically grab projA from TFS. You have to do that manually. So then the same goes for dll references (like NuGet packages).
Summary of my solution:
Only one copy of packages in source control for all solutions
Any solution can update packages and all the other solutions will be kept in sync*
* Once one solution updates packages to new paths or file names, they will appear as missing references to the other solutions and you'll have to manually clean that up. But at least you know right where the packages are in source control "(as opposed to the RandomSolution\packages location)."
The packages are always stored at the solution level, so if you install a package into multiple projects, they come from the same place. I don't believe you can configure it so that each project has its own packages folder.
I'm not sure there's a nice way to do what you're trying. You could maybe have a build step on the project that fetches the package, but I don't know how well that will suit you.
I'd recommend posting in the NuGet Issue Tracker to get a discussion going. The people working on it seem pretty active, so it might be something they can add support for in a future version :-)