My team has a fairly large set of desktop applications with many shared libraries between them, all in one common solution file in our repository. We'd like to use semantic versioning for a number of reasons, chief of which is making it easier for our users to install updates. However, given the number of assemblies we're dealing with, updating the AssemblyInfo files for each one is pretty tedious, especially when a library is a dependency of multiple applications.
I was wondering if there's an easy way to use git tags or some kind of external tool to tell the build server that, for example, XYZ has a bug fix and its patch number needs to be updated.
Use GitVersion: https://gitversion.readthedocs.io/en/latest/
It will derive the semantic version automatically from the most recent tag and the git history.
You could use GitVersionTask if you build with MSBuild, or (better) use it with build tools like FAKE or Cake.
Edit: there are now alternatives that are easier to use: https://www.nuget.org/packages/Nerdbank.GitVersioning/, https://www.nuget.org/packages/GitInfo/, ...
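For illustration, a minimal sketch of the workflow, assuming the GitVersion.Tool global tool and a v-prefixed tag convention:

    dotnet tool install --global GitVersion.Tool
    git tag v1.2.0                            # last release
    git commit -am "Fix crash in XYZ"         # bug fix lands after the tag
    dotnet-gitversion /showvariable SemVer    # prints e.g. 1.2.1-..., depending on the configured mode

GitVersionTask does the same computation at build time and stamps the result into each assembly, so the per-assembly AssemblyInfo edits go away.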
Firstly, I'm new to VSTS and Git, so apologies if my terminology gets muddled!
PROBLEM
My situation is that I have a VS/C# project (called "PluginBase") that is, essentially, "starting template" code for a plugin. Historically, I would just copy the PluginBase project code every time I wanted to create a new "tailored/derived" build for a particular customer.
What I would like to be able to do is, as and when bug fixes are resolved and features are added to the PluginBase project, have the option to migrate these changes to one or more of the "tailored/derived" builds. Likewise, if a bug is first found while developing a "tailored/derived" build, I'd like to migrate the fix back to the PluginBase project.
IDEAS
From my research, I've come across a few "possible" ways of achieving my goal, but I'm not sure which (if any) of these approaches are suitable.
Branches
Seems the common approach, perhaps the "best", but...
Seems to mean all code must be in the same repository (otherwise you can't "cherry-pick" across), which I'd prefer to avoid as this may not always be possible
Git Submodules
Seems intended more for projects sharing a common "library" than for projects deriving from the same code-base
Also not sure Visual Studio fully supports this feature
Cherry Pick
Doesn't seem possible to do this from one repository to another?
Git Patch
Doesn't seem Visual Studio supports this feature yet?
So, if anyone has any advice, guidance or new suggestions for approaches I could (or should) be using, I'd really appreciate your input.
Many thanks! :)
Git branches are definitely the way to go. The code does indeed have to be in the same repository: git stores change sets, and for a change set to be applied, git has to know what has happened since the code paths split, or it cannot replace the correct lines of code.
Make a branch each time you roll out a version to a customer; you can then cherry-pick across the different branches.
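A minimal sketch of that flow (branch name and commit sha are hypothetical):

    # the fix lands on the main PluginBase branch as commit abc1234
    git checkout customer-acme      # switch to a tailored/derived build's branch
    git cherry-pick abc1234         # replay just that bug-fix commit onto it

The same works in the other direction: cherry-pick a fix made on a customer branch back onto the main PluginBase branch.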
We currently use SourceSafe and are starting a migration to Subversion.
All external SDKs (> 500 MB) are held in SourceSafe now, and I am looking for a way to move them from VSS
to some repository.
We have C++ (mostly), C# (many), and Java (a few) projects. Hundreds of projects. Windows platform only.
I have looked at a couple of dependency managers but was not satisfied:
NuGet - good for .Net but painful for C++
Ivy - I have not looked in depth, but it seems unsuitable for C++
First question: what else should I evaluate? It should be easy for the end developer to use. Best case: a simple build from within the IDE.
Currently I am inclined toward the following solution:
Allocate some rarely used drive letter, like S:, and declare it to be the 'DEV HOME'.
Then place externals here:
S:\SDK\boost\1.30\...
S:\SDK\boost\1.45\...
S:\SDK\oracle\agile_9.0.0.0\...
S:\SDK\IBM\lotus_8.0\...
S:\SDK\IBM\lotus_9.0\...
S:\Tools\NuGet\nuget.exe
S:\Tools\clr\gacutil.exe
The autobuild machine will hold the master copy of this 'DEV HOME'. Every developer copies the SDKs they need from the autobuild machine to a local folder and creates the drive with subst.
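For example, a minimal sketch of the per-developer setup (the share and folder names are hypothetical):

    rem mirror the master copy of DEV HOME locally
    robocopy \\buildserver\DevHome C:\DevHome /MIR
    rem map it to the agreed drive letter
    subst S: C:\DevHome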
I can't see any big problems with this solution:
Branches. Projects in different branches can contain references to different versions of an SDK (boost, for example)
Versions of external components will not change very frequently, so there will not be hundreds of, say, boost versions.
Easy for developers to set up.
Absolute paths are supported by every tool.
No problems with disk space if you want to use a not-so-big SSD drive for sources. (Currently I move my externals to a separate drive with the help of symbolic links - see the sketch after these lists - but to other developers this looks like black magic.)
Minor problems:
Personally, I don't find it a beautiful solution.
The drive letter (S:) may already be in use.
Can't be used as-is on Linux (but we are not currently interested in that).
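For reference, the symlink trick mentioned above is just a one-liner (paths hypothetical):

    rem the externals actually live on D:, but tools still see them at the old path
    mklink /D C:\DevHome\SDK D:\SDK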
Second question: what problems could this solution run into?
Update 1: why not relative paths?
Should the externals live one directory above the source roots? Like this:
externals/...
branch-root-1.0/project_collection_1/project1/...
branch-root-2.0/project_collection_2/...
Here all projects have to be in one place, or the externals must be duplicated. That seems not much different from the absolute-path solution.
Or should the externals sit in the same folder as each source root? Like this:
branch-root-1.0/externals/...
branch-root-1.0/project_collection_1/project1/...
branch-root-1.0/project_collection_2/...
branch-root-2.0/externals/...
Then the externals are duplicated in every checked-out branch. That is +500 MB for every branch checkout, plus some additional work to set them up.
Well, this looks acceptable, but I do not see how it is better than absolute paths. I really want to know the advantages of relative paths, because I am also uncomfortable with absolute paths.
I have gone down the path you are on... it can work. However, I suggest you make everything relative paths and spend the time getting your projects sorted out for relative paths.
The problem with any fixed-directory system and source control is that you can branch, or have multiple checkouts of your projects, and a single fixed path can only point at one version of the externals at a time.
Also, while Subversion is good, it is worth considering Mercurial or Git. They allow a number of different kinds of workflows that Subversion doesn't. It takes a bit more work thinking about how to structure your repositories, but it's well worth it. It is a big jump from SourceSafe, and from my experience, many people coming from SourceSafe really struggle with or dislike Subversion/Git/Mercurial initially. They all require you to understand version control in a bit more detail, but that's a good thing, as these are very good tools.
I think that if your platform is Windows-only with Visual Studio, then NuGet is the best one. What I like about NuGet is that it needs almost no configuration. For example, you can use the Boost library immediately after you install the Boost NuGet package into your project.
You don't need to configure include/library paths (your current problem).
It automatically installs/configures/updates packages for your project on other computers as soon as you copy (store in SVN/Mercurial) the packages.config file.
It can resolve, or warn about, compatibility problems between packages.
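For illustration, the checked-in packages.config might look like this (the version number is just an example):

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <!-- native C++ packages are listed with the "native" target framework -->
      <package id="boost" version="1.55.0" targetFramework="native" />
    </packages>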
I'm not aware of any good cross-platform solution for this problem.
I am looking at moving our .NET (C#) projects from TFS to Git. The general consensus in the team is that we do not want to continue with TFS, and we wish to trial Git. We currently do not have that many projects to migrate, but we expect the number to grow as our old systems are replaced.
Currently we have a TFS project for everything we think will be needed by multiple projects: database stuff, third-party DLLs, etc. What is the best way to have a similar structure in Git?
The best way I can see is to mimic our current structure, with a separate repository for all the common files.
I have read about using submodules, but there seem to be a lot of complaints about them. Is it worth trying something like repo, or another alternative? Or is there a better way to handle this?
This question is going to be pretty subjective, but IMO I would solve this by having a separate repository for your common stuff.
Another option is to migrate your common stuff to NuGet packages, so you can move it forward without worrying about breaking all your existing projects.
In my experience, common projects in an enterprise environment tend to calcify your ability to respond to change quickly. Instead you spend lots of time worrying about how changes in your "Core" or "Lib" modules will affect the 80+ projects that use them. Worse, people start shoving everything into those modules even when it is only pertinent to a few projects, simply because it's easy.
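A rough sketch of the NuGet flow, assuming a hypothetical MyCompany.Common library and an internal feed URL:

    rem package the common library with an explicit semantic version
    nuget pack MyCompany.Common\MyCompany.Common.csproj -Version 2.1.0
    rem publish it to the team's internal feed
    nuget push MyCompany.Common.2.1.0.nupkg -Source https://nuget.example.com/feed

Consumers then pin a version in their own packages.config and upgrade on their own schedule, instead of being broken by every change to the common code.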
Good day experts!
I am about to start a new project and I would like to have a build script for my code. It will be a .NET project developed with VS2010.
Unfortunately, I have no idea where to start. What should the build script do? What are the best practices? How should I configure the projects/solutions?
Is there a how to guide for this? I was thinking about using msbuild.
Thanks
Depending on how big your product will be, I would suggest using a version control system like TFS, and once that is in place you could/should also use a build engine like Team Build. It may look like overkill, but my bet is that it is even easier than trying to figure out how MSBuild works...
Some good practices:
Aim for a "one click build" approach. Try to put all your projects for an app under a single solution. That way, you can build the whole stuff with a single command. Plus, with projects like SharePoint ones, you can create all packages during build (this requires customizing the .csproj files, but it's worthy), I have to try it but this may work
<PropertyGroup>
  <!-- run the CreatePackage target as part of the post-build step -->
  <PostBuildEventDependsOn>
    $(PostBuildEventDependsOn);
    CreatePackage;
  </PostBuildEventDependsOn>
</PropertyGroup>
Having everything in one solution also helps when searching across the "Entire Solution", so all devs stay in sync without ambiguities.
Make sure to have a good naming convention. For example, a solution called MyApp containing projects like MyApp.Model, MyApp.View and MyApp.Presenter if you are following an MVP pattern, etc.
Which brings us to another point: aim for a layered organization of your code. A project for utilities, another for your business model, another for presenters, yet another for your UI, etc. That facilitates testing, reusability, etc.
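For instance, such a layered solution might look like this (the project names are just illustrative):

    MyApp.sln
      MyApp.Core\         <- utilities
      MyApp.Model\        <- business model
      MyApp.Presenter\    <- presenters
      MyApp.UI\           <- user interface
      MyApp.Tests\        <- unit tests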
Either way, just try different approaches and evaluate the pros and cons yourself.
I'm very annoyed by C# and Java refactoring of namespaces and packages. If you reference, from many classes, a class in a common package used by many independent projects, and you decide to move that package to be a child of its current parent package, you have to modify all the clients, just because you cannot use generic imports like this:
import mypackage.*
which would allow refactoring without impacting clients.
So how do you manage refactoring when the impact of such a small change can be this big?
And what if it is a client's code, not under my control - am I stuck?
Use an IDE with support for refactoring. If you move a java file in Eclipse, all references are updated. Same for rename, package name changes, etc. Very handy.
It sounds like you're asking about packages that are compiled and deployed to other projects as, for instance, a jar file. This is one reason why getting your API as correct as possible is so important.
How to Design a Good API and Why it Matters
I think you could deprecate the existing structure and modify each old class to be a wrapper or facade over the new, refactored class. This gives you the flexibility to keep improving the new structure while slowly migrating the projects that use the old code.
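A minimal sketch of that idea (the class and package names are hypothetical):

    // old location, kept only so existing clients still compile
    package mypackage;

    /** @deprecated moved to mypackage.core.Widget (hypothetical new home) */
    @Deprecated
    public class Widget {
        private final mypackage.core.Widget delegate = new mypackage.core.Widget();

        // forward every public method to the relocated class
        public void render() {
            delegate.render();
        }
    }

Clients keep working against the old package until they are migrated, and all real changes happen in the new class.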
Imagine someone doing an import like import com.*. If it worked the way you want it to, it would load anything and everything in the com package, which means zillions of classes would be imported - and then you would complain about why it is so slow and why it requires so much memory...
In your case, if you use an IDE, it will take care of most of the work and make this very easy, but you will still need to ship new executables to your clients if your application architecture requires it.