MSBuild error MSB4018 cannot access project.assets.json in NET 5 build operation
I am building 70 C# .NET 5 projects in parallel groups, and I sometimes get the following error on random projects within the build:
error MSB4018: The "GenerateDepsFile" task failed unexpectedly.
[c:\dev\highspeed\HsCore\hscore\HsCore.csproj]
C:..\sdk\6.0.202\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.Sdk.targets(172,5): error MSB4018: System.IO.IOException: The process cannot access the
file
'c:\dev\highspeed\HsCore\hscore\bin\Debug\net5.0-windows7.0\win-x64\HsCore.deps.json'
because it is being used by another process.
[c:\dev\highspeed\HsCore\hscore\HsCore.csproj]
The Microsoft doc says: This error is emitted when a task fails with an unhandled exception. This is generally a sign of a bug in the task. Error MSB4018 may be caused when running a task in an environment that it was not prepared for, for instance when a task has an x86 dependency but is running in a 64-bit MSBuild environment. This can manifest as a System.DllNotFoundException.
In my case, I am entirely in a Windows x64 environment, building and using AnyCPU libraries (and publishing to win-x64, but that does not matter before the build runs).
I invoke the build on xxx.sln files with the following arguments:
Exit Code 1: /t:Clean;Restore;Publish /p:Platform="Any CPU"
/p:Configuration=Debug /p:TargetFramework=net5.0-windows7.0
/p:RuntimeIdentifier=win-x64 "c:\path\MySolution.sln"
The code normally builds and runs fine, except for when this kind of an error occurs. Often, when I run the build a second time, the build succeeds.
I do not understand why the MSBuild process (or one of its child processes) cannot access the project.assets.json file, because MSBuild is the only thing that ever accesses that file in that project. None of my tools reference that file; Visual Studio does not have the file or project open; and no other project in the parallel build group would ever access this project's project.assets.json file. MSBuild is the only process working with the entire project tree.
The best I can think of is that the Restore target in the Clean;Restore;Publish chain might be locking the file and not releasing it fast enough for the Publish (which includes Build) operation. But wouldn't MSBuild be smart enough to manage that kind of thing?
Does anyone have any idea what might be going on? What other process could possibly be looking at (and locking) that file? Thank you
After days of debugging and investigating, I am convinced the problem is caused by using MSBuild's multi-core option (-m) when building my solution (.sln) files.
If I remove the -m MSBuild command line argument, all the weird "file being used by another process" errors go away instantly. If I enable the -m:2 argument (or higher, or unbounded -m), the errors come back, instantly.
My .sln files that fail typically have three .csproj projects - a library DLL project, a LibraryXxxTests project, and a thin console program interface to the library. The Tests and console projects typically use a project reference to the library project.
Maybe (but I can't imagine why) the project references that point to the library project open the gate for MSBuild parallelism errors. (But MSBuild -m should be smart enough to see the project references and build the library project first before the console and Tests projects that reference the library DLL, yes?)
The other kind of project that sometimes fails has only two projects (lib and tests project).
My solution and project files do nothing special or tricky; as far as I know, they are typical C# projects with 20 to 50 C# files. The errors occur on the main DLL (Library.dll) produced by the build, when some process tries to write to that .dll file and cannot because of a lock. I don't know why.
But I do know that if I remove the MSBuild -m command line argument, the errors go away. Back to serial builds at the solution level for me ...
UPDATE: I wanted to believe MSBuild could figure out the build order among projects in a solution file, so I did more searching on the net. I found this information in the MSBuild documentation on how it calculates builds.
Graph option
If you specify the graph build switch (-graphBuild or -graph), the
ProjectReference becomes a first-class concept used by MSBuild.
MSBuild will parse all the projects and construct the build order
graph, an actual dependency graph of projects, which is then traversed
to determine the build order. As with targets in individual projects,
MSBuild ensures that referenced projects are built after the projects
they depend on.
Using the -graph option with the -m option seemed to work for solution-level builds (passing .sln files to MSBuild). The graph option tells it to calculate a build order graph among projects within each solution file. The -m -graph combination seems to let me use multiple cores to do the builds in a shorter time without MSBuild getting access errors among projects within a solution.
Keep in mind that this method requires that you use ProjectReferences in your project XML files. If you use direct assembly references to the project.DLLs stored in some other location, the -graph option cannot calculate a proper build order within the solution.
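As an illustration of that distinction (project names and paths here are hypothetical), the two forms look like this inside a .csproj file; -graph can only compute a build order from the first form:

```xml
<ItemGroup>
  <!-- ProjectReference: visible to -graph, so the library project is
       ordered before this project in the dependency graph -->
  <ProjectReference Include="..\MyLibrary\MyLibrary.csproj" />
</ItemGroup>

<ItemGroup>
  <!-- Direct assembly reference: -graph cannot tell which project
       produces this DLL, so no ordering constraint is created -->
  <Reference Include="MyLibrary">
    <HintPath>..\SharedBin\MyLibrary.dll</HintPath>
  </Reference>
</ItemGroup>
```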
UPDATE 2:
The information about using -graph above seemed to help but did not solve all the problems. I was lucky enough to find the following information on GitHub:
I'm tempted to call this a duplicate of #9585, which is itself not the
first time the problem has been found but I can't find the original.
Problems arise whenever you explicitly pass a --framework (CLI) or
/p:TargetFramework= (direct MSBuild) to the build of a solution. The
issue is that global properties (such as that TF setting) are
inherited in most cases, but not all. Here, the ClassLibrary1 project
is built once with an explicit TF (inherited from the solution build)
and once with no TF (since it only has one, the ProjectReference from
WebApplication1 does not pass its TF down).
The best way out of this is to only pass --framework to builds of
individual projects, not solutions. Project-to-project references are
careful not to get into this situation.
Removing the /p:TargetFramework=net5.0-windows7.0 argument from the MSBuild command line when building .sln files seems to be working for me so far.
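For comparison, a solution-level invocation along the lines of the one shown earlier, with the global TargetFramework dropped (this is a sketch of my adjusted command line, not a general prescription):

```
msbuild -m -graph /t:Clean;Restore;Publish /p:Platform="Any CPU" /p:Configuration=Debug /p:RuntimeIdentifier=win-x64 "c:\path\MySolution.sln"
```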
Recently we had this case:
We have multiple projects in our solution that depend on each other. With project dependencies, we make sure each project is built in the right order.
In the case above, a developer forgot to add a project dependency. Therefore, two things could happen:
1. The build fails because MSBuild builds and outputs the projects in an order that does not work.
2. The build succeeds because MSBuild happens to build and output the projects in the right order (by accident).
As you can see above, the first nightly build succeeded, the second nightly did not, and the third succeeded again. This can be confusing. I would like to make the builds more repeatable, so that consecutive builds either all fail or all succeed.
Is there an approach I can take to make this better?
I read about deterministic builds, but I am unsure whether that would help my case:
https://github.com/dotnet/roslyn/blob/master/docs/compilers/Deterministic%20Inputs.md
Additional info:
- The code is checked out clean before each build
- We use a plugin-based model, so projects are not stitched together with project references; plugin projects are built separately, without dependencies, and we then reference the output DLL from our application projects. Therefore the plugin must be built before the application project, and we manually add project dependencies to make sure this happens.
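For reference, a solution-level dependency of this kind (what Visual Studio writes when you use the Project Dependencies dialog) appears in the .sln file as a ProjectSection; the GUIDs below are placeholders:

```
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Application", "Application\Application.csproj", "{11111111-1111-1111-1111-111111111111}"
	ProjectSection(ProjectDependencies) = postProject
		{22222222-2222-2222-2222-222222222222} = {22222222-2222-2222-2222-222222222222}
	EndProjectSection
EndProject
```

Here {22222222-2222-2222-2222-222222222222} would be the plugin project's GUID, forcing it to build before the application project.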
I am currently migrating a large solution from the old csproj file format to the new csproj format. I am doing this a few projects at a time, so I have a mixed environment with some projects using the old project file format and some projects using the new project file format.
I have started to notice some builds failing because files are in use. My theory (based on this answer) is that MSBuild is building some projects twice because the properties are different (i.e. the new project file format specifies the TargetFramework property while old projects do not).
The projects that seem to have concurrency issues are projects that are referenced by other projects, where the referencing projects are split between the project file formats.
The command I am using to build the project is:
msbuild.exe /maxcpucount:6 /property:Configuration=Debug;Platform=x64 /t:Rebuild my.sln
Is there a way to instruct MSBuild to only build a given project once (regardless of properties) until I am able to convert all of the projects in the solution over to the new project file format?
Note that building single threaded does correct the concurrency issues, but that significantly slows down the build and the projects are still built multiple times.
This is a bug in MSBuild when referencing C++ CLI projects from multi-targeting projects. It appears they are putting in a fix to address this.
To work around the bug, the property can be removed from the reference using the GlobalPropertiesToRemove attribute:
<ProjectReference Include="..\B.CppCLILibrary\B.CppCLILibrary.vcxproj" GlobalPropertiesToRemove="TargetFramework" />
Only you could have solved this, since we do not have access to your code. But generally, MSBuild and all build systems operate under the rule that a build 'target' is processed only once, no matter how many projects refer to it.
A build system should provide language for the user to specify dependencies between different 'targets'. It is then up to the build system to figure out the order in which to build them (via a topological sort), starting with the most independent targets and working toward the least independent target.
If something is getting built twice it could be:
1. A bug in the build system.
2. The user forcing the project to build twice.
Anyway, glad you got it sorted out.
I'm looking for a way to detect problems with assembly references in a large Visual Studio solution:
Binary references to bad locations, like a path not in source control or in the output of another project
Binary references to multiple versions of the same assembly across projects in the solution
Binary references without a path, that may be redirected to the GAC
Binary references that should have been project references
The whole story
I work on a large C# solution with almost 200 projects.
One of the problems that is creeping in over time is that references to assemblies are added but not always to the same version or to the correct location.
For example, a project may get a reference to System.Web.Mvc without a hint path, making it reference whatever version is in the GAC. Visual Studio (and ReSharper) will also offer to add a missing reference, but may do so by adding a reference to the output folder of another project.
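For illustration, the two kinds of reference look like this in the project file (the package path is hypothetical); the first resolves to whatever the loader finds, often the GAC:

```xml
<ItemGroup>
  <!-- No HintPath: resolves to whatever version is registered, e.g. in the GAC -->
  <Reference Include="System.Web.Mvc" />

  <!-- HintPath pins the reference to a known location (hypothetical package path) -->
  <Reference Include="System.Web.Mvc">
    <HintPath>..\packages\Microsoft.AspNet.Mvc.5.2.9\lib\net45\System.Web.Mvc.dll</HintPath>
  </Reference>
</ItemGroup>
```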
Now the recent Windows Update catastrophe left some team members dead in the water, unable to build the solution. As you can imagine, this bumped up the priority of assembly reference management for us.
To detect some of the most obvious problems I've already setup an msbuild file that can be included in every csproj file and will detect bad references.
However, new project files will need to be edited manually to include that script. So that will inevitably be forgotten.
What I would really like is to check all project files in a solution for 'bad' references during the continuous build, so that all projects will be checked always.
I've been googling for a solution like this for some time and found lots of static analysis and code analysis tools but nothing to analyze project files in a solution.
So, before I go off and roll my own solution, is there a way to do this already?
Update
In order to clean up the code base, I've created a bit of ScriptCS code that scans all csproj files for references to assemblies in NuGet packages and fixes them. It's up on GitHub.
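The author's ScriptCS tool is the real fix; as a rough illustration of the same idea in Python (element names follow the classic csproj schema, and the sample project XML below is made up), one can flag references that have no HintPath:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch (not the author's ScriptCS tool): flag assembly
# references that lack a HintPath in an old-style csproj, since those
# may silently resolve from the GAC.
NS = {"msb": "http://schemas.microsoft.com/developer/msbuild/2003"}

def references_without_hintpath(csproj_xml):
    """Return Include names of <Reference> items that have no <HintPath>."""
    root = ET.fromstring(csproj_xml)
    return [
        ref.get("Include", "")
        for ref in root.iterfind(".//msb:Reference", NS)
        if ref.find("msb:HintPath", NS) is None
    ]

sample = """\
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <Reference Include="System.Web.Mvc" />
    <Reference Include="Newtonsoft.Json">
      <HintPath>..\\packages\\Newtonsoft.Json.13.0.1\\lib\\net45\\Newtonsoft.Json.dll</HintPath>
    </Reference>
  </ItemGroup>
</Project>
"""

print(references_without_hintpath(sample))  # prints ['System.Web.Mvc']
```

A check like this can run as an ordinary unit test in the continuous build, iterating over every .csproj in the repository, so new projects are covered automatically.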
You can create a NuGet package where the sole purpose is incorporating a custom .targets file into a project. I recently used this strategy to solve another problem (error messages for missing .snk files).
Testing strong names of NuGet-distributed assemblies
Rackspace.KeyReporting source code
If you create a similar package, it's easy to right click on your solution node and verify that it is installed in all of your C# projects.
If your analysis is more complicated and requires the use of an assembly (custom build tasks) in addition to the .targets file, you can use an approach like I use for the Antlr4 NuGet package, which contains build tasks, resources, and custom .props and .targets files, but no actual assemblies that are referenced by the project it gets installed in.
ANTLR 4 C# Target source code (includes the Antlr4 package source and build scripts)
Instead of adding it to all projects in your solution, why not create some kind of test (unit test, build file, whatever) that can take a project file as input, parse it, and throw an error if one or more references are incorrect? Much easier than adding (and checking out, committing, etc.) custom build steps to project files.
Even if you used a NuGet package as proposed earlier, you would still have to check manually whether all your projects (200 projects? Really?) reference the package.
Do you use ILMerge? Do you use ILMerge to merge multiple assemblies to ease deployment of DLLs? Have you found problems with deployment/versioning in production after ILMerging assemblies together?
I'm looking for some advice in regards to using ILMerge to reduce deployment friction, if that is even possible.
I use ILMerge for almost all of my different applications. I have it integrated right into the release build process, so what I end up with is one .exe per application with no extra DLLs.
You can't ILMerge any C++ assemblies that have native code.
You also can't ILMerge any assemblies that contain XAML for WPF (at least I haven't had any success with that). It complains at runtime that the resources cannot be located.
I did write a wrapper executable for ILMerge where I pass in the startup .exe name for the project I want to merge and an output .exe name; it then reflects over the dependent assemblies and calls ILMerge with the appropriate command-line parameters. It is much easier now when I add new assemblies to the project; I don't have to remember to update the build script.
Introduction
This post shows how to replace all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
For Console Apps
Here is the basic Post Build String for Visual Studio 2010 SP1, using .NET 4.0. I am building a console .exe with all of the sub-.dll files included in it.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
Basic hints
The output is a file "AssemblyName.all.exe" which combines all sub-dlls into one .exe.
Notice the ILMerge\ directory. You need to either copy the ILMerge utility into your solution directory (so you can distribute the source without having to worry about documenting the installation of ILMerge), or change this path to point to where ILMerge.exe resides.
Advanced hints
If you have problems with it not working, turn on Output, and select Show output from: Build. Check the exact command that Visual Studio actually generated, and check for errors.
Sample Build Script
This script replaces all .exe + .dll files with a single combined .exe. It also keeps the debugging .pdb file intact.
To use, paste this into your Post Build step, under the Build Events tab in a C# project, and make sure you adjust the path in the first line to point to ILMerge.exe:
rem Create a single .exe that combines the root .exe and all subassemblies.
"$(SolutionDir)ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).all.exe" "$(TargetDir)$(TargetName).exe" "$(TargetDir)*.dll" /target:exe /targetplatform:v4,C:\Windows\Microsoft.NET\Framework64\v4.0.30319 /wildcards
rem Remove all subassemblies.
del "$(TargetDir)*.dll"
rem Remove all .pdb files (except the new, combined pdb we just created).
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).all.pdb.temp"
del "$(TargetDir)*.pdb"
ren "$(TargetDir)$(TargetName).all.pdb.temp" "$(TargetName).all.pdb"
rem Delete the original, non-combined .exe.
del "$(TargetDir)$(TargetName).exe"
rem Rename the combined .exe and .pdb to the original project name we started with.
ren "$(TargetDir)$(TargetName).all.pdb" "$(TargetName).pdb"
ren "$(TargetDir)$(TargetName).all.exe" "$(TargetName).exe"
exit 0
We use ILMerge on the Microsoft application blocks: instead of 12 separate DLL files, we have a single file that we can upload to our client areas, plus the file system structure is a lot neater.
After merging the files, I had to edit the Visual Studio project list, remove the 12 separate assemblies, and add the single file as a reference; otherwise it would complain that it couldn't find the specific assembly. I'm not too sure how this would work post-deployment, though; could be worth giving it a try.
I know this is an old question, but we not only use ILMerge to reduce the number of dependencies but also to internalise the "internal" dependencies (eg automapper, restsharp, etc) that are used by the utility. This means they are completely abstracted away, and the project using the merged utility doesn't need to know about them. This again reduces the required references in the project, and allows it to use / update its own version of the same external library if required.
We use ILMerge on quite a few projects. The Web Service Software Factory, for example produces something like 8 assemblies as its output. We merge all of those DLLs into a single DLL so that the service host will only have to reference one DLL.
It makes life somewhat easier, but it's not a big deal either.
We had the same problem combining WPF dependencies; ILMerge doesn't appear to deal with these. Costura.Fody worked perfectly for us, however, and took about five minutes to get going: a very good experience.
Just install with Nuget (selecting the correct default project in the Package Manager Console). It introduces itself into the target project and the default settings worked immediately for us.
It merges all the DLLs marked "Copy Local" = true and produces a merged .exe (alongside the standard output), which is nicely compressed in size (much smaller than the total output size).
The license is MIT, so you can modify/distribute as required.
https://github.com/Fody/Costura/
Note that for windows GUI programs (eg WinForms) you'll want to use the /target:winexe switch. The /target:exe switch creates a merged console application.
I'm just starting out using ILMerge as part of my CI build to combine a lot of finely grained WCF contracts into a single library. It works very well, however the new merged lib can't easily co-exist with its component libraries, or other libs that depend on those component libraries.
If, in a new project, you reference both your ILMerged lib and also a legacy library that depends on one of the inputs you gave to ILMerge, you'll find that you can't pass any type from the ILMerged lib to any method in the legacy library without doing some sort of type mapping (e.g. automapper or manual mapping). This is because once everything's compiled, the types are effectively qualified with an assembly name.
The names will also collide but you can fix that using extern alias.
My advice would be to avoid including in your merged assembly any publicly available lib that your merged assembly exposes (e.g. via a return type, method/constructor parameter, field, property, generic...) unless you know for sure that the user of your merged assembly does not and will never depend on the free-standing version of the same library.
We ran into problems when merging DLLs that have resources in the same namespace. In the merging process one of the resource namespaces was renamed and thus the resources couldn't be located. Maybe we're just doing something wrong there, still investigating the issue.
We just started using ILMerge in our solutions that are redistributed and used in our other projects and so far so good. Everything seems to work okay. We even obfuscated the packaged assembly directly.
We are considering doing the same with the MS Enterprise Library assemblies.
The only real issue I see with it is versioning of individual assemblies from the package.
I recently had an issue with an ILMerged assembly. The assembly contained some classes that were called via reflection in the Umbraco open-source CMS.
The information needed to make the reflection call was taken from a database table that held the assembly name and the namespace of the class implementing an interface. The reflection call would fail when the DLL was ILMerged, but if the DLL was kept separate it all worked fine. I think the issue may be similar to the one longeasy is having.
It seems to me like the #1 ILMerge Best Practice is Don't Use ILMerge. Instead, use SmartAssembly. One reason for this is that the #2 ILMerge Best Practice is to always run PEVerify after you do an ILMerge, because ILMerge does not guarantee it will correctly merge assemblies into a valid executable.
Other ILMerge disadvantages:
when merging, it strips XML Comments (if I cared about this, I would use an obfuscation tool)
it doesn't correctly handle creating a corresponding .pdb file
Another tool worth paying attention to is Mono.Cecil, along with the Mono.Linker tool: http://www.mono-project.com/Linker