I am having a bit of trouble. I have no idea what question title is appropriate in this case; feel free to edit it if you have something better in mind.
So, basically this is the current situation:
Currently I have four projects (not the real names, but the architecture is identical):
Client (the main client logic)
Server.Main (the main server logic)
Server.Extensions (additional functions for the server, e.g. helpers; can be used standalone and shouldn't rely on anything from Server.Main)
Shared (shared code between client & server)
For each of these projects I create a NuGet package and upload it to my online repository. This repository is private for now and used only for development purposes.
Here is a summary of which project uses which NuGet packages:
Client uses the NuGet package of Shared.
Server.Main uses the NuGet packages of Shared & Server.Extensions.
Server.Extensions uses the NuGet package of Shared.
This works fine for me at the moment... I can easily update my repository for testing purposes and use the freshly updated version of my package.
But here comes the problem:
I would like to share my project with other people now (e.g. the GitHub community). But once they get the projects, they won't have access to my private repository, and the NuGet package manager will not find the packages.
Furthermore, there is another problem with my architecture: when they fix something, e.g. in Shared, they wouldn't be able to test the changes, because Client & Server would always use the NuGet package from the repository and not the fixed/changed local code.
I thought about referencing the Shared project directly in the other three projects. Would that mean that whenever I update Shared, I have to update the other three projects as well?
I think my whole NuGet architecture is wrong, but I don't know how to do it correctly / in a better way. Does anyone have a better approach for me?
I wouldn't say this is necessarily wrong. If someone is trying to consume and work on your solution, P2P (project-to-project) references are probably the best choice, since there is minimal overhead and a higher probability of catching issues early on during builds and subsequent debugging sessions.
You can still easily create NuGet packages for all three during the build and consume them in, let's say, an integration test by either packing them in a post-build step or using tools like NuProj.
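A post-build packing step could look roughly like the following sketch; the project names are the ones from the question, while the local feed folder and the use of dotnet pack (SDK-style projects) are assumptions about your setup.

# Sketch of a post-build packing step: pack every project into a local feed folder.
# The feed folder and the use of dotnet pack (SDK-style projects) are assumptions.
$feed = Join-Path $PSScriptRoot "local-feed"
New-Item -ItemType Directory -Force -Path $feed | Out-Null

foreach ($proj in "Shared", "Server.Extensions", "Server.Main", "Client") {
    dotnet pack ".\$proj\$proj.csproj" --configuration Release --output $feed
}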
Related
We have created multiple application solutions in ASP.NET Core 6 MVC.
I want to bring each solution under a single solution so that I can use common menus/submenus in all applications.
I am also not sure how to handle session state in this case.
Example: we have created 4 separate solution modules for Admin, Employees, Department, Students.
Now I am creating a new Login solution which will have login functionality and menus.
I tried creating a DLL for each solution and referencing them in the Login solution, but it's not working properly; also, the static files are not getting added.
All modules of a single application should rather go in a single solution with multiple projects, if feasible. Each project yields a binary assembly that can be directly referenced in other projects. This prevents code duplication by allowing multiple projects to reference the same code. Using multiple solutions can lead to problems with reference paths. There is also a big drawback to splitting your application into multiple solutions: your application's code cannot be accessed in full in the development environment, making refactoring harder.
Since Visual Studio 2022 is now 64-bit, you can have solutions with a large number of projects with little performance degradation.
If you really need to have dependencies across multiple solutions, you should turn towards NuGet. NuGet is the package manager for .NET. After compiling a project, you provide some metadata to create a package. Then you publish the package to a repository. Other solutions can reference the package, and NuGet restore will download the binaries from the repository. NuGet packages support semantic versioning, and you can reference specific versions of a library.
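As a rough sketch (the project name, version, feed URL, and API key are placeholders, and an SDK-style project is assumed), packing and publishing could look like this:

# Pack the shared library and push it to a private feed.
# "MySharedLib", the version, the feed URL and the API key variable are placeholders.
dotnet pack .\MySharedLib\MySharedLib.csproj -c Release -p:PackageVersion=1.0.0 -o .\artifacts
dotnet nuget push .\artifacts\MySharedLib.1.0.0.nupkg --source "https://your-feed/v3/index.json" --api-key $env:NUGET_API_KEY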
However, this will require you to write your code like a quality library. It means early design thinking, strong QA, and heavy testing are required so you don't ping-pong updates between your libs and their clients. This is why this strategy is better suited to sharing libs across multiple applications.
There are also on-premises solutions if you don't want to upload your binaries to the internet. You can create a NuGet repository as simply as creating a new directory and adding its path to the list of NuGet package sources in Visual Studio. NuGet packages can be shared across your intranet using a simple SMB file share. If you need better access control, you can install a local copy of NuGet Gallery.
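For example (the share path and source name are placeholders for your environment), a folder-based feed can be registered and used like this:

# Register a plain folder (or SMB share) as a NuGet package source.
# The UNC path and the source name are placeholders.
dotnet nuget add source "\\fileserver\nuget-packages" --name CompanyLocalFeed

# Publishing to a folder feed is just a push to that path (or a copy of the .nupkg).
dotnet nuget push .\artifacts\MySharedLib.1.0.0.nupkg --source "\\fileserver\nuget-packages"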
Details on NuGet usage are available in the Microsoft documentation on NuGet.
Using: .NET Core MVC, C#
I have a solution which has a .NET Core MVC web app & one class library. There is a shared project (class library) that I want to add to this solution; it is part of a different project (a different solution as well).
All of these projects are stored in our local Git repository.
If I add the external project as a project dependency in my existing project, then there would be two copies of the external project that we have to maintain. If some developer updates the external project, how does the change propagate to the other projects using it?
There is also the chance that some developer updates the external project while working in its local solution, which we want to prevent. Since everything is in Git, is it possible somehow to relate the dependency so that any change in the external project is known to the others?
So basically, how can we prevent anyone from making local updates to the external project, while also making sure any updates to the external project are available to any other project using it?
There are several approaches that you can use to achieve this.
Quick: Reference project in two solutions
The quickest is to reference the shared project from both solutions. This way, you can use it in both projects and the changes are propagated to the other solution because you are basically working on the same files. However, a huge drawback of this approach is that if you make changes in solution A that are not compatible with solution B (e.g. removing a method that is used in solution B), you will only find out when working on solution B.
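With the dotnet CLI this is a one-liner per solution; the solution and project paths below are illustrative only.

# Add the same shared project to both solutions (paths are illustrative).
dotnet sln .\SolutionA\SolutionA.sln add .\SharedLib\SharedLib.csproj
dotnet sln .\SolutionB\SolutionB.sln add .\SharedLib\SharedLib.csproj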
Easy: Single solution
To fix this, you could merge the solutions into a single one that contains the shared project and also the other projects from solutions A & B. This way, you still get the convenience of project references in a solution. In addition, you are notified about breaking changes immediately if you build the complete solution. If this approach is viable for you in terms of solution size and team structure, I'd favor it. As you already share a single Git repository, I think this approach is well worth considering.
Nuget Package
If you want to keep the solutions strictly separated, you'd need to follow a more complex procedure. You could for instance move the shared project into a solution of its own and create a Nuget package with a clear build and versioning strategy. You can host the Nuget package on a package feed (e.g. on Visual Studio Team Services). Solutions A and B can then reference the Nuget package from the feed and also update it if a new version becomes available.
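On the consuming side this could look roughly like the following; the package id, project path, and versions are placeholders.

# Solutions A and B reference a specific version of the shared package from the feed...
dotnet add .\SolutionA\WebApp\WebApp.csproj package My.Shared --version 1.2.0
# ...and bump it explicitly once a new version has been published.
dotnet add .\SolutionA\WebApp\WebApp.csproj package My.Shared --version 1.3.0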
Here is the official documentation on creating a NuGet package with a nuspec or csproj:
Create .NET Standard 2.0 packages with Visual Studio 2017 [CSPROJ]
Creating NuGet packages [NUSPEC]
We are developing a WPF application at work which has various "common" dependencies (Unity, Prism, etc.).
It's all fine when adding new projects and then setting up the NuGet package dependency per project, but when it comes to upgrades, it's really painful, as it means we have to go through each and every project, delete the old references, and then refetch the latest packages from NuGet.
Today, for instance, I was tasked with upgrading Prism from 5.0 to 6.0 (which has breaking changes anyway), and this meant, in addition to fixing all the namespace conflicts etc., that I had to go through every project, delete the old references, add the new dependencies, and rinse and repeat.
My question is, is there a smarter way to deal with this problem or is this the standard approach?
Many thanks in advance,
Update:
I am mostly concerned with "major" upgrades which don't show up in the package manager. A version 5.0 -> 6.0 upgrade would be treated as a major upgrade and hence would not have an automatic update applied to it in the NuGet package manager.
I don't expect NuGet to be able to do this automatically for me, since such upgrades may (and often do) include breaking changes, but I would like to know if there's a way to do the major upgrades less painfully than deleting the references from the projects and the packages.config for every project and then re-adding them using NuGet. For a relatively large project, this is very time-consuming, and I was wondering if anyone had a better way of managing such dependencies.
If you use VS2013 like you say, you can manage ALL your NuGet packages by right-clicking on your Solution and selecting 'Manage NuGet Packages For Solution'. This brings up a dialog where you can view all packages installed for all projects in the solution and all packages that have updates available. When you do upgrade the packages, VS takes care of all the reference changes required. If the package has breaking changes, then you're still on the hook for fixing those.
Disclaimer: I've never worked on a WPF project/solution but for Web/Forms apps, NuGet packages are handled this way.
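For reference, the solution-wide equivalent from the Package Manager Console looks roughly like this; the package id and version are simply the ones mentioned in the question, used for illustration.

# Without -ProjectName the update is applied to every project in the solution
# that has the package installed.
Update-Package Prism -Version 6.0.0

# Reinstall without changing the version (e.g. after retargeting):
Update-Package Prism -Reinstall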
I can understand your pain because I had a similar problem, but there is no easy way. You certainly need to treat your daily development and your dependency update roll-out as separate processes.
For the project I worked on, I used a common repository path shared among the solutions you work on, and you need to delete all the solutions' package folder references in order to get to a clean state.
For each solution you work on, you need to modify the property group that points to the common target repository (I'm using a relative path).
Once everything is set up, you can actually perform the update with a script (I'm using a Python run-time script).
You can look at setting up a common nuget-packages folder for reference updates for details, but it seems like what you're looking for is an automated process.
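A rough PowerShell equivalent of that idea (not the author's actual Python script; the source root and package id are placeholders, and nuget.exe is assumed to be on the PATH) might be:

# Update a given package in every solution under a common source root.
Get-ChildItem -Path "C:\src" -Recurse -Filter *.sln | ForEach-Object {
    nuget restore $_.FullName
    nuget update $_.FullName -Id Prism
}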
I had a similar problem when trying to upgrade multiple packages with alpha channel issues in Xamarin Studio, which also does not have the niceties of VS 2015 NuGet manager. I ended up writing a very simple PowerShell script that I run multiple times a day.
#
# This script updates local ibGib NuGet packages for mobileGib Android app solution.
# For convenience in copy+paste in manager console:
# ../UpdateLocalNugetPackages.ps1
Update-Package commonGib
Update-Package ibGib
Update-Package languageGib.Biz
Etc.
I believe you could tailor your NuGet commands to fit your needs.
Also, just in case you aren't aware of it, you should definitely read the NuGet command line reference. I may be mistaken, but it sounds like your scenario is doable with the Update command.
I have an independent solution with multiple projects including class libraries and control libraries. This solution and all its projects are under TFS source control.
I reference the output of one or more of these libraries in all new projects I develop. References are currently binary rather than project references.
The new projects are also always under source control and now I need to add debugging support for the libraries.
If I reference the library projects from them, the project file is modified and no longer works with the original library solution, since the source control providers for the library and the referencing project may be different.
Is there an easy way to accommodate this?
You should package the shared binaries, along with indexed PDBs, into a NuGet package. NuGet was specifically designed to solve these problems.
You can index your PDBs by running an indexing tool. TF Build can automatically index your PDBs.
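For example (the project name and output folder are placeholders; these commands include the PDBs in a symbols package, while the source indexing itself is done by the build/indexing step):

# Classic projects: produce a .symbols.nupkg alongside the normal package.
nuget pack .\MyLibrary\MyLibrary.csproj -Symbols

# SDK-style projects: the dotnet equivalent.
dotnet pack .\MyLibrary\MyLibrary.csproj --include-symbols -o .\artifacts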
Nope.
There are some strategies you can use, however. The easiest (possibly, but not in all cases) is to build the project you wish to debug, drop the binaries on top of the application that hosts them, and attach your debugger to the running application. This makes sure you have the correct version of the assembly under debug, but you might have to do unwanted things, such as making sure you're not targeting a specific version of the assembly, which may be bad news for an assembly under development. It also requires a lot of handiwork; depending on where your application runs, you may have to use remote debugging, deal with issues transmitting DLLs across untrusted networks, etc.
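The "drop the binaries on top of the host" step is essentially just a copy; a sketch, with the library name and host path as placeholders:

# Copy the freshly built debug assembly (and its PDB) over the copy used by the host app,
# then attach the debugger to the running host process (Debug > Attach to Process).
Copy-Item .\MyLibrary\bin\Debug\MyLibrary.* -Destination "C:\Program Files\HostApp" -Force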
We're building a .NET software platform for test automation for in house use in our company.
The application is composed of a GUI (WinForms) component, and various "Actions" that are dynamically being loaded into it to be executed.
There are already around 100 Action projects, and this number keeps increasing.
Some of these projects depend on other projects, and so on.
All loaded actions must reference our "SDK" DLL for various activities (reporting results to the main app, logging, etc.).
With this relatively simple design, we are facing some management decisions that we'd like to solve in the best way:
Should actions ("plugins") reference our SDK's project output, or some known stable version of it?
For example, when developing a large application (MS Office just for the example), not all teams work with source code for all components naturally.
What is the best solution for something like this, and why?
How do we properly verify that all needed dependencies (3rd-party libraries, for example) are indeed taken from the correct location?
What are common practices for managing a multitude of interlinked projects? Are there any tips for doing so?
This is a problem that doesn't have a clear answer, but...
There are two paths you can take. A strongly coupled system or a loosely coupled system.
For a strongly coupled system I can suggest two directories for binaries: a 3rd-party directory and a directory that houses DLLs that your company builds and that other developers can reference. The 3rd-party DLLs (from outside your company) should be located in source control so that all developers reference the same versions of the 3rd-party DLLs from the same location; this avoids developer-machine inconsistencies and the problems of installing 3rd-party software on every machine. The in-house DLLs should not be checked into source control and should be built on each developer's machine via an automated build batch file or similar. In a post-build step you can copy them all to the same directory, and as long as developers get the latest source and build, everyone has the same DLLs from within your company.
For example: get latest, build (using a batch file to build all the projects needed), and then as a post-build step copy the output to a common folder. Now all of your other projects can reference the common company DLLs and the third-party DLLs from the same location, and everyone is consistent.
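As an illustration only (the solution name, paths, and the common folder are placeholders, and msbuild is assumed to be on the PATH), the batch/post-build step could be approximated like this:

# Build the in-house libraries and copy their output to the shared "common" folder
# that every developer references. All names and paths are placeholders.
$common = "C:\dev\Common\Binaries"
New-Item -ItemType Directory -Force -Path $common | Out-Null

msbuild .\CompanyLibraries.sln /p:Configuration=Release
Copy-Item .\*\bin\Release\*.dll -Destination $common -Force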
The problem is that the references are strongly coupled, so changes can sometimes be problematic if not communicated properly.
A loosely coupled system uses a framework such as MEF (Managed Extensibility Framework), and your components reference "contract DLLs" which define the interfaces for your components. The projects reference the interface or contract DLLs and don't really care about the implementation, and MEF manages the plugins for you.
In this case, you reference the interface DLL but not the actual DLL that implements it.
For example, say I have an interface called ILog with a method called LogMessage.
[Import]              // with MEF, an available ILog implementation is injected here
private ILog _logger;

_logger.LogMessage();
So, in a strongly coupled case: Action.DLL references Logger.DLL directly.
In a loosely coupled case, Action.DLL references ILog.DLL (just the interface). Logger.DLL implements ILog.DLL. But Action.DLL has no reference to Logger.DLL directly.
Now I can have any number of DLLs that implement the ILog interface, but Action.DLL does not reference them directly. That's pretty cool, and one of the more exciting features of MEF and loose coupling in general: the ability not to have direct dependencies.
Whichever way you choose to go, either is acceptable; I think the loosely coupled idea fits your scenario best, as teams would only have to know the contracts rather than the actual implementations.
I wouldn't have one massive contract DLL; I would try to break the interfaces into logical groupings. For example, logging seems like a utility type of interface, so I would create a Utility contract DLL with an ILog interface. How it is split up depends on what you are trying to do. Or each interface could be its own contract DLL, but maybe that is a little extreme.
This is a somewhat complex topic, especially in .NET land. I don't know about the "best" solution, but I'll explain how we manage it; perhaps you will find it useful for yourself.
This allows you to build large systems with lots of linked projects, but it incurs a lot of complexity. As, I think, any solution of this kind would.
First: physical structure (we use SVN).
There is a source control root for each project
Each project has its own trunk, branches and tags
The trunk folder has a versioned \src and \build folder, and an unversioned \lib folder
The \lib folder contains binaries to reference.
Those binaries could be 3rd party libraries or other projects that you need to link to (e.g., your SDK). All binaries under \lib come from an enterprise ivy repository (see http://ant.apache.org/ivy/). There is a lot of movement these days in .NET land concerning NuGet, so you could check that out too.
Your versioned \build folder contains build scripts, for example to get binaries from ivy, publish your project to ivy, or compile each project. They will also come in handy when you want to point a Continuous Integration server at each of your projects.
Second: To define where dependencies come from
Answer: They come from your ivy repository (it could be as simple as a network shared file system).
You have created your repository, so you have control as to its contents.
Be very careful with 3rd party binaries that get installed in the GAC. Visual Studio is a pain in the a^^ to deal with this.
Specifically:
How to properly verify that all needed dependencies (3rd party libraries for example) are indeed taken from the correct location?
Ivy gives you great flexibility with dependencies, and it also resolves transitive dependencies; e.g., you can depend on SDK rev="1.+" status="latest.release", which would mean "the latest stable 1.x release of SDK", or on SDK rev="2.+" status="latest.integration", which would mean the latest available binary of the 2.x SDK (probably as produced by a continuous integration build).
So you will always depend on compiled binaries, never on project output. And you can control which version of the binaries to get. 3rd-party dependencies will probably be brought in transitively via your SDK.
This also means that the amount of code in your projects will stay as small as you need to have workable Visual Studio solutions. It also means that refactoring tools like ReSharper will be a lot less useful. There will also be a certain amount of complexity concerning your build scripts and your branching strategy. That depends a lot on the logic of your components.
This is a brief overview; if you think this is the sort of thing you want, I can expand the answer. Good luck; the .NET ecosystem, and Visual Studio in particular, isn't really designed to work like this.