At the moment I'm very frustrated while developing at work. We're working in a Visual Studio 2019 solution with 290 projects in it. Our software is written in C# on .NET.
I get even more frustrated every time I switch branches in the git repository, because Visual Studio then starts a full reload of the solution, which takes 5 to 10 minutes.
Do you know if there is a limit on the number of projects in a Visual Studio solution? Or has anyone worked with a solution containing more than 300 projects?
While there is no enforced limit (in theory you can have thousands of projects in a solution file and Visual Studio will try to load them), there are practical limits, as you can see here: Recommended number of projects in Visual Studio Solution
1. Machine limits. Beyond a certain number of projects, less powerful machines take too long to load the solution, or even crash.
2. Older Visual Studio versions have been reported to crash beyond a certain number of projects (even lower than yours).
3. Manageability is also a concern. Too many projects often means there should be a better separation of concerns; having them all in one solution can introduce unneeded coupling between unrelated codebases.
If number 3 is not the case, I'd advise you to create multiple solutions, each containing a subset of the projects for one domain area.
If it is, try to refactor and decouple. Each decoupled DLL should be moved out of the solution, packaged as a NuGet package, and consumed independently.
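As a first step toward either split, it helps to get an inventory of what is actually in the .sln. Here is a minimal C# sketch (the solution path is hypothetical) that lists the project entries so you can group them by domain; this is just one way to do it under the assumption that the standard .sln text format is used:

```csharp
// Minimal sketch: inventory the project entries in a .sln file so you can
// plan how to group them into smaller per-domain solutions.
// The solution path below is hypothetical.
using System;
using System.IO;
using System.Text.RegularExpressions;

class ListSolutionProjects
{
    static void Main()
    {
        var slnPath = @"C:\work\Big.sln"; // hypothetical path
        var entry = new Regex(
            @"^Project\(""\{[^}]+\}""\) = ""(?<name>[^""]+)"", ""(?<path>[^""]+)""");

        foreach (var line in File.ReadLines(slnPath))
        {
            var m = entry.Match(line);
            if (m.Success && m.Groups["path"].Value.EndsWith(
                    ".csproj", StringComparison.OrdinalIgnoreCase))
            {
                Console.WriteLine("{0}  ->  {1}",
                    m.Groups["name"].Value, m.Groups["path"].Value);
            }
        }
    }
}
```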
Related
We have a moderately sized solution with about 20 projects. One of them contains my business entities. When compiling any project, Visual Studio hangs for about a minute and a half on this BusinessEntities project.
I tried our solution in SharpDevelop, and it compiles the complete solution in 18 seconds. MSBuild shows similar timings.
My guess is that VS is trying to find out whether the project needs a compile, but this check is about 15 times slower than actually performing the compile!
I can't switch to SharpDevelop, great as it is, because it lacks some small but essential features for our debugging scenarios.
Can I prevent VS from checking this project and have it compile the projects without such a check, just like SharpDevelop does?
I already know about unchecking projects in Configuration Manager to prevent building some projects, but my developers will forget they need to compile this project after updating to the latest sources, and then they face problems that seem strange to them.
Edit: Some interesting results from an investigation: the delay happens with only one of the projects. In Configuration Manager I unchecked all projects, then compiled each of them individually. Every project compiles in a few seconds! The point is this: if that particular project is built directly, it compiles in a few seconds; if it is built (or skipped because it is up to date) as a dependency of another project, VS hangs for about a minute and a half before deciding to compile it (or skip it). My conclusion: Visual Studio checks whether any files have changed, but for some reason this check is extremely inefficient for this particular project!
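To make concrete what such a check involves, here is a minimal C# sketch of a timestamp-based up-to-date check (the paths are hypothetical, and this is a simplification of whatever VS actually does): it has to touch every input file on disk, so any per-file inefficiency multiplies across the whole project:

```csharp
// Minimal sketch of what a timestamp-based up-to-date check boils down to:
// find the newest input file and compare it against the output assembly.
// Note that it requires a file-system query per input file, which is where
// the time can go if the check is done inefficiently. Paths are hypothetical.
using System;
using System.IO;
using System.Linq;

class UpToDateCheck
{
    static void Main()
    {
        var projectDir = @"C:\work\BusinessEntities";                             // hypothetical
        var output = @"C:\work\BusinessEntities\bin\Debug\BusinessEntities.dll"; // hypothetical

        var newestInput = Directory
            .EnumerateFiles(projectDir, "*.cs", SearchOption.AllDirectories)
            .Select(File.GetLastWriteTimeUtc)
            .DefaultIfEmpty(DateTime.MinValue)
            .Max();

        var outputTime = File.Exists(output)
            ? File.GetLastWriteTimeUtc(output)
            : DateTime.MinValue;

        Console.WriteLine(newestInput > outputTime
            ? "Out of date: at least one source file is newer than the output."
            : "Up to date: the compile could be skipped.");
    }
}
```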
I recently (massively?) upgraded my system from a circa-2010 Core i7 980X with 12 GB of RAM to a dual dodeca-core system with hyperthreading (2 x 12 x 2 = 48 logical cores) and 128 GB of RAM.
On Visual Studio's end, I rarely see it go beyond 6% utilization (about 3 cores) during a build. Does anyone here know the caveats of its build pipeline: does it use just one core per project, does it distribute a single project across multiple cores, or is there something I'm missing?
I'm currently using Visual Studio 2013 and have shelved the 2015 upgrade until ... I recover from the system upgrade :)
On that note, does Visual Studio 2015 distribute its workload better thanks to the Roslyn pipeline, or does it still use the preexisting MSBuild architecture?
OP:
Does anyone here know the caveats of its build pipeline: does it use just one core per project, does it distribute a single project across multiple cores, or is there something I'm missing?
To change how many parallel project builds Visual Studio will run when compiling a .NET solution, check out Tools > Options > Projects and Solutions > Build and Run: https://msdn.microsoft.com/en-us/library/y0xettzf(v=vs.90).aspx
MSDN:
1. In the Visual Studio IDE, on the Tools menu, click Options.
The Options Dialog Box will appear.
2. Expand the Projects and Solutions folder, and then select the Build and Run property page.
3. Enter an integer in the text box for the Maximum number of parallel project builds property. The highest value that you can set this property to is 32.
Having said that, actual results will vary depending on how many of your projects are independent versus how many depend on other projects being built first. If you have many dependencies, you may not notice much gain.
OP:
I've already set this to 48; it was incorrectly set to 32 by default on this new machine.
EDIT: OP has indicated that the number of logical cores is 48.
Generally, setting this property to an inordinately large value has no benefit:
MSDN:
Build performance does not increase when you set the Maximum number of parallel project builds property to a value greater than the number of CPUs on your computer. https://msdn.microsoft.com/en-us/library/y0xettzf(v=vs.90).aspx
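A quick way to see the number that quote refers to is to print the count of logical processors the OS reports; per the MSDN note, any "maximum number of parallel project builds" value above this buys nothing. A trivial check:

```csharp
// Prints the number of logical processors the OS reports, i.e. the ceiling
// beyond which a higher "maximum number of parallel project builds" has no effect.
using System;

class CpuCount
{
    static void Main()
    {
        Console.WriteLine(Environment.ProcessorCount);
    }
}
```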
We are using Visual Studio 2013 with the latest updates installed. Our solutions contain about 20 to 30 C# library and ASP.NET projects. We also have some extensions installed, such as ReSharper, VSCommands, and NCrunch (all latest versions as well).
After working on a solution for a couple of minutes, Visual Studio's memory usage climbs to more than 2 GB and the IDE becomes very unresponsive.
Usually I have to close the solution after working with it for 1 or 2 hours.
If I don't close VS myself, RAM usage goes above 3 GB and VS crashes (probably because VS is still a 32-bit application).
How can I identify what causes Visual Studio to use so much memory?
Thanks
(I realized that NCrunch also uses a lot of RAM when executing our tests, so I disabled it. With NCrunch disabled it takes a few minutes longer for VS to reach that memory usage, but the problem remains.)
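One low-tech way to narrow this down is to log devenv's private bytes over time and correlate the jumps with what the IDE was doing (loading the solution, running tests, using a particular extension). A minimal sketch, run as a separate console app, assuming the process is named devenv:

```csharp
// Minimal sketch: poll every devenv.exe process once a minute and log its
// private bytes, so memory jumps can be correlated with IDE activity
// (solution load, running tests, a particular extension, etc.).
using System;
using System.Diagnostics;
using System.Threading;

class DevenvMemoryMonitor
{
    static void Main()
    {
        while (true)
        {
            foreach (var p in Process.GetProcessesByName("devenv"))
            {
                Console.WriteLine(string.Format(
                    "{0:HH:mm:ss}  PID {1}: {2} MB private",
                    DateTime.Now, p.Id, p.PrivateMemorySize64 / (1024 * 1024)));
                p.Dispose();
            }
            Thread.Sleep(TimeSpan.FromMinutes(1));
        }
    }
}
```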
I ran into a similar scenario on my end, but mine was purely ReSharper.
I personally used the following guide to optimize performance:
http://confluence.jetbrains.com/display/NETCOM/Ultimate+Guide+to+Speeding+Up+ReSharper+(and+Visual+Studio)
For huge projects, I usually turn off solution-wide analysis.
We identified the latest Perforce plugin for Visual Studio (P4VS: p4vs11_2014.2.93.1619.vsix) as the cause of the growing memory usage.
We downgraded to p4vs11_2013.3.78.1524.vsix. This fixed the problem.
(It also seems that p4vs11_2014.1.85.4506.vsix still works.)
A few months ago I started noticing that when a WPF project is out of date, it builds much more slowly than other projects. This seems to have started with Visual Studio 2012, though I'm not positive about that.
So my question is, are there any settings or tools I could use to help diagnose performance problems with the build process? I have enabled extra logging for C++ projects in the past to determine why VS thinks a project needs to be rebuilt when it hasn't changed. Is something like that available for other types of projects?
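One generic option, not specific to WPF, is MSBuild's diagnostic file logger: at that verbosity the log records the input/output comparisons and states why each target was rebuilt or skipped. A sketch that drives such a build from C# (the solution path is hypothetical, and it assumes msbuild.exe is on the PATH):

```csharp
// Sketch: run MSBuild with a diagnostic file logger. At this verbosity the
// log records input/output timestamp comparisons and explains why each
// target was rebuilt or skipped. Assumes msbuild.exe is on the PATH; the
// solution path is hypothetical.
using System.Diagnostics;

class DiagnosticBuild
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "msbuild",
            Arguments = @"C:\work\MyApp.sln /v:minimal " +
                        @"/flp:logfile=build-diag.log;verbosity=diagnostic",
            UseShellExecute = false
        };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
        }
    }
}
```

In the resulting build-diag.log, look for the lines that explain why a target was not skipped, typically naming the input file that is newer than the output.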
We are working with Visual Studio 2008 on a solution with 128 projects; almost half of them are unit test projects, for a total of more than 5,000 unit tests. For that reason we usually leave all unit test projects unloaded except the ones we are working with. Nevertheless, Visual Studio 2008 often blocks whenever we try to launch a single unit test or load/unload a unit test project. In Task Manager we see the devenv.exe process constantly taking 25% of the CPU, and through Process Explorer we have seen that devenv.exe loops through CreateFile, QueryBasicInformationFile, and CloseFile for each project file loaded in the solution, iterating over all project files again and again.
We have tried setting the registry key HKLM\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\EnableCMI to 0, as suggested in several forums where we searched for a solution or workaround, but with no luck.
As a workaround, we are working with solutions that contain only a small subset of the projects, but we don't see that as a definitive solution: we don't consider it very 'developer-friendly' to be switching between different solutions and different instances of VS2008 all the time.
Could anyone who has experienced this same behavior give us a clue on how to prevent Visual Studio 2008 from being unresponsive for so long when dealing with unit tests? It makes developing unit tests almost impossible.
Many thanks in advance.
Jose Antonio
Consider upgrading Visual Studio. 2008 is ancient and it had serious performance issues in large solutions. 2012 was better, but IIRC a lot of work was done in 2013 to make it workable with a lot of projects.
128 projects in one solution was, in Visual Studio 2008's day, really pushing it, not to say over the top by a large factor. With 2013 it should actually work.
In fact, there is not a lot you can do here: the problems with a significant number of projects in older versions of Visual Studio are well documented, and they were fixed a long time ago, in newer versions.