I'm looking at migrating from TFS (Team Foundation Server) to Git, but can't find anything matching TFS' support for gated check-ins (also called pre-tested or delayed commits).
Atlassian Bamboo has no support for gated check-ins. TeamCity does support it ("delayed commits" using their terminology), but not for Git. Using Jenkins by itself or Jenkins+Gerrit has huge drawbacks and doesn't come close to the gated check-in functionality in TFS. (Drawbacks explained by the creator of Jenkins himself in this video: http://www.youtube.com/watch?v=LvCVw5gnAo0)
Git is very popular (for good reason), so how are people solving this problem? What is currently the best solution?
We have just started using Git and have implemented pre-tested commits using a two-repository workflow (I finished testing this just today).
Basically, each dev has a personal repository to which they have read/write access. The build server (TeamCity in our case) builds from these personal repositories and, if the build succeeds, pushes the changes to the 'green' repository. Devs have no write access to 'green'; only TeamCity build agents can write to it, but devs pull common updates from it.
So dev pulls from 'green', pushes to personal, TeamCity builds from personal, pushes to green.
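In command form, the daily loop looks roughly like this (the remote names 'green' and 'personal' are just our conventions, nothing special to Git):

git pull green master          # start from the latest verified code
# ...work and commit locally...
git push personal master       # this triggers the personal TeamCity build
# on a green build, TeamCity itself pushes the commits on to 'green'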
This blog post shows the basic model we are using, with GitHub forks for the personal repositories. Using forks means the number of repositories doesn't get out of hand and end up costing more, and it means developers can manage the personal builds themselves: they fork, then create the TeamCity build jobs that get their code pushed to 'green':
This is more work to set up in TeamCity, as each developer has to have their own build configuration. It actually has to be two configurations, because TeamCity seems to execute all build steps (including the final 'push to green' step) even if a previous step, such as the tests, fails. So we ended up with a personal build for each developer, plus a second build configuration dependent on it, which just does the push once the first build has succeeded.
Check out Verigreen, a lightweight, server-side gated check-in system. It verifies each commit before it finds its way into the branches the system protects; Verigreen will not allow any failed CI commit to break the integration, release, or any other branch that needs to be protected.
Moreover, it's a free, open-source project.
How it works:
Verigreen intercepts check-ins and runs verification on an ad-hoc branch, so that if a commit fails, only the relevant developer is affected.
A pre-receive hook intercepts the push and creates an ad-hoc branch of the code.
Verification is run via a Jenkins job; the verification job's content is fully configurable.
Verified code is merged back into the protected branch, while a failed commit is blocked and a notification is sent to the developer.
Decisions to merge, block, or notify follow a decision-flow diagram. For more information, please see the wiki or the Verigreen.io site.
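To give a feel for the mechanism (this is not Verigreen's actual code, just a minimal sketch of the kind of server-side hook that blocks direct pushes so a CI job can merge verified commits instead):

#!/bin/sh
# pre-receive: reject direct updates to the protected branch
PROTECTED="refs/heads/master"
while read oldrev newrev refname; do
  if [ "$refname" = "$PROTECTED" ]; then
    echo "Direct pushes to master are blocked." >&2
    echo "CI verifies your commit on an ad-hoc branch and merges it if green." >&2
    exit 1
  fi
done
exit 0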
I think that since October 23, 2013, the answer can also be Automatic Merge in TeamCity.
Git has a different philosophy: commits are cheap, so commit as you wish; if something is wrong, you can change it later, and merges are easy. So you can organize an appropriate workflow, e.g. developers commit to separate branches, and when a branch has been tested, it is merged into the main branch.
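For instance, a sketch of that branch-then-merge flow (branch names are illustrative):

git checkout -b feature/foo        # commit freely on a side branch
git commit -am "work in progress"
# once CI has built and tested feature/foo:
git checkout master
git merge --no-ff feature/foo      # bring the tested work into main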
Why not use TFS as the central repository and make use of Git as the local DVCS solution?
This would allow you to build and commit locally and then push what you want to the TFS server and do a gated build.
Sometimes it is good to have the best of both worlds...
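One way to wire this up is the open-source git-tfs bridge; a rough sketch (the collection URL and project path are placeholders):

git tfs clone http://tfs:8080/tfs/DefaultCollection $/MyProject/Main
cd Main                # the cloned working directory
# work with ordinary local git commits, then hand the result to TFS:
git tfs checkintool    # opens the TFS check-in dialog, where gated builds apply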
With VS Team Services (formerly Visual Studio Online) and TFS 2015 or newer, you can use branch policies that require a passing build for pull requests, which gives you a gated check-in workflow with Git.
I have been tasked with updating the company's outdated build process, which is all done in batch and Perl scripts. The current build process is:
Schedule a build through a web interface.
Build server takes the build process off the queue.
Build server checks out all of the files from the TFS source control.
Build server runs a couple of code injection scripts that modify the source before the build.
Build server updates versions and signs the code.
Build server uses Visual Studio to compile the projects.
After that is finished, the build server zips up the output and drops it in a network share location.
The really difficult part is the code injection scripts. They are three Perl scripts that modify a lot of code, and they are very machine-dependent in the way they were designed, so I can't just drop them into the new build process without a lot of modification.
My end goal is to be able to run the build process on local dev machines and also have CI running on the TFS server.
In my searching, it seems there is no way to emulate a TFS build on a local machine. So is my only option to use pre-/post-build command-line scripts in my .csproj files? Or is there a better way to do complex builds on the local machine and run the same builds on TFS?
I have seen Using TFS build definitions on a local machine, but that seems a bit hacky to me. I guess it wouldn't be a horrible solution if there isn't a better one.
I have tried to do something similar in the past myself. Unfortunately, there isn't a good way to go about it because of everything the TFS build workflow requires. What I found is that there are basically two ways to go about it:
Create an MSBuild script that will run on both the server and locally.
Create an MSBuild script for local builds and a custom activity for the server.
If you have a requirement to reproduce the build exactly on both the developer machine and the build server, then I would opt for #1; otherwise I would go with #2. The second option is nice because you stay within the TFS workflow for the main builds, which provides many of the objects you need and gives you a convenient place to configure settings without having to check files out and in to change how the build behaves.
For either method you will most likely have to modify your Perl scripts to take parameters, to account for any customization needed between systems. The user can pass these in (or the MSBuild script can default them) for local builds, and on the server they become parameters in the TFS build workflow, so they can be modified easily when needed. Regardless of the method, though, the only good way to do this is to standardize how things are set up on the developer machines and the build server, so that you need as little per-machine customization as possible.
If you do opt for the first option, you can use the legacy build configuration for TFS builds, which drives everything from an MSBuild script; you can then share that script between the developers and the build server. Be aware that if someone accidentally changes the script, it does run the risk of breaking the build.
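For option #1, the idea is a single parameterized script; as a rough sketch (the script and property names here are invented for illustration), a developer might run:

msbuild Build.proj /p:Configuration=Release /p:SourceRoot=C:\src\App /p:SignOutput=false

while the build server runs the same Build.proj with its own SourceRoot and /p:SignOutput=true, so both produce the build in exactly the same way.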
The question in short form, followed by the explanation:
We want to create patches that include only the files which have changed due to bug fixes in a .NET application. The patches should be built automatically in the continuous-integration process, which involves SVN, CruiseControl.NET, and MSBuild.
We have a scenario here:
We want to maintain a .NET application which runs on remote servers, using continuous integration. The source code is in SVN, with three different repositories for DEV, QA, and PROD.
Our developers fix bugs almost every day and merge the changes into the DEV repository after their initial testing is satisfactory.
Once a problem is solved or a feature has been added, the code is merged into the QA repository.
The QA code is built and tested on QA machines manually.
After QA testing we merge it into PROD. Along with that, QA also makes new patches for the files which have to be replaced or changed manually. The patches are then deployed on a staging server, where they are tested until perfect, and finally the patches are deployed on the actual remote servers.
In search of continuous integration, we are now trying to use a mix of CruiseControl.NET and MSBuild for this process. The process works well until the stage where we have to generate patches from the QA builds automatically. After the patches are generated, we will put them on an FTP server, from where they will be downloaded to the staging server to be tested.
The problem, i.e. generating patches from the new build, has a few aspects. The solution file for the application has many projects, and the DLLs are copied to the startup app's bin folder using post-build events. So we have a specific directory structure in the actual application, which is itself a combination of six solutions that are more or less independent of each other.
The way we are trying to create the patches is by searching the SVN logs to find which files have changed, parsing out the project names, and then copying all the files from that project's bin directory into the patch folder, arranged the specific way the release expects, using a mapping file which lists all the files of the application.
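For what it's worth, the change-detection half of that can be done with built-in SVN commands; a sketch (the URL and revision numbers are placeholders):

# list the files changed between the last two QA builds
svn diff --summarize -r 1200:1250 http://svnserver/qa/trunk
# each changed path is then mapped to its project, and that project's
# bin output is copied into the patch layout via the mapping file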
So can anyone please suggest a better or easier way to make a patch, given that we have SVN and CruiseControl.NET, or any other open-source tool that can do it?
I hope the problem is clear.
This whole process, in general, goes against established best practices. This is not necessarily a bad thing if you have good reasons for it, but I don't see them here.
In essence, you are not using the QA and DEV environments to secure the stability of production. Worse, you use different source trees to build the code for each of them, which introduces new points of possible failure into the deployment process.
A standard way of approaching this would be to have a single SVN code tree: tag a version when it is released to QA (using the already-built binaries!), and possibly tag it again when releasing to PROD. Don't rebuild the binaries; use the ones that you actually tested!
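In SVN that tagging is a cheap server-side copy; for example (the URLs and revision are placeholders):

# tag the exact revision QA tested, without rebuilding anything
svn copy -r 1250 http://svnserver/repo/trunk \
    http://svnserver/repo/tags/2.4.0-QA -m "Tag 2.4.0 as released to QA"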
If your MSBuild task performs a build instead of a rebuild, then unaffected DLLs will not have their modified date changed.
I would suggest this for the following reasons:
MSBuild will update the modified date of any assembly which has been affected by a change (e.g. an interface change).
The assembly is what you want to deploy, so check it rather than the source for modifications. Otherwise you need detailed knowledge of the build process (source file locations, references, etc.).
Your deployment would then include just the DLLs from the build directories whose modified date >= the build date.
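A sketch of that selection step (paths and tool locations are assumptions; the same idea works in PowerShell on Windows):

touch build-start.stamp                    # timestamp taken just before the build
msbuild MySolution.sln /t:Build /p:Configuration=Release
# copy only the assemblies the incremental build actually rewrote
mkdir -p patch
find . -path '*/bin/Release/*.dll' -newer build-start.stamp -exec cp {} patch/ \;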
I agree with skolima about the recommended build and patch process. You should create your patches from the initial tags plus the modifications, and then create a new version which has to be deployed in all environments.
In my company, we use this method:
Each successful build is automatically tagged by our CI server.
When a patch is needed for a specific version, the programmers check out the tagged version and apply the fixes to it.
Then we have a specific 'Patch' build on our CI server which does exactly the same thing as a normal build, but with a 'Patch' flag, and points at the patched sources.
The deployment target is the same and the build process is the same; only the sources change.
The plus is that the patches have their own build history on the CI server, because they are built separately, yet they are treated as normal builds.
Anyway, if you want to automate a patching process between two repositories via your CI, imho you have to create specific MSBuild tasks to do it. You can either try to merge the changes between the repositories, or look at the SVN diff and patch commands.
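For example, a patch branch taken from a tag, which the CI 'Patch' build then points at, might look like this (URLs are placeholders):

svn copy http://svnserver/repo/tags/2.4.0 \
    http://svnserver/repo/branches/2.4.0-patch1 -m "Patch branch for 2.4.0"
# fixes are committed to branches/2.4.0-patch1; the CI 'Patch' build checks it
# out, builds with the 'Patch' flag, and deploys to the same target as usual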
In my team we create assemblies to attach to extensible released software created and published elsewhere in my company. These assemblies are often specific for an individual client, though some are reused. I want to introduce a couple of standards into this environment - version numbers and installers.
Currently, many assemblies go to clients without adequate versioning. I want to institute automated version number updates so when a client has a problem we can be sure which source code was used in their software.
Currently, assemblies are installed by the individual copying them manually to the correct path and performing any necessary registration. I want to force people to use an installer package so the path and registration is handled automatically.
I could implement the first step by getting people to use:
[assembly: AssemblyVersion("1.0.*")]
But I'd prefer to update the AssemblyFileVersion rather than the AssemblyVersion, because I understand that advancing AssemblyVersion, combined with our manual installation, can lead to multiple versions of an assembly being registered. AssemblyFileVersion doesn't update automatically, and I'm wary of a solution that requires developers to install third-party tools. If we had a proper installation process, the problem with multiple versions would go away.
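One way to get automatic AssemblyFileVersion updates without third-party tools on developer machines is to stamp the file as a build-server step; a minimal sketch (the file path and version scheme are assumptions, and a PowerShell equivalent works on Windows):

VERSION="1.0.${BUILD_NUMBER}.0"    # BUILD_NUMBER supplied by the build server
sed -i "s/AssemblyFileVersion(\"[^\"]*\")/AssemblyFileVersion(\"$VERSION\")/" \
    Properties/AssemblyInfo.cs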
For the second step, if I use a Visual Studio setup project, then adding the assembly causes it to try to add other assemblies from the original published software, which I don't want. I assume I can create this as a patch somehow, but I haven't worked that out yet. Of course, an installer will require reliable version numbers or things will go badly.
It seems clear having written this that I need to advance both issues simultaneously, but I'd really rather approach one at a time.
Any thoughts for the best way to get over these two issues?
I don't have nearly enough information to point you to a solution. What are you using to build your application and installers? Desktop F5 build? Team Foundation Server? Cruise Control?
Things to realize:
1) Visual Studio deployment projects suck. Yes, I'll stick by that comment. In your case, the dependency-scanning problem you have is unfixable: even if you right-click | Exclude the dependency, it can scan a new dependency at build time. We even wrote Visual Studio automation to open the project, right-click | exclude everything, and then save it on the build machine to avoid this problem. Trust me, it's a horrible road to go down. Even Microsoft knows it sucks, which is why it won't be in the next release of Visual Studio anyway. Use other tools such as Windows Installer XML or InstallShield Limited Edition or Professional.
2) You must update AssemblyFileVersion. This is such a core/foundational tenet of change management, and it's critical to getting Windows Installer upgrades and patches to work. AssemblyVersion can be changed at your discretion and is only applicable to strong naming and IoC scenarios such as Prism, where you write rules on what constitutes a valid class for injection.
3) 1.0.* isn't what you want. You want a system that increments your version and passes it into your build automation. What you use will depend on what you are using for build automation; I use Team Foundation Server and a project from CodePlex to do my versioning.
4) You should never be building on a developer's machine. You should always be using a clean build machine with automated scripts, not F5.
If these are released applications, then the installer method is fine. If you are adding libraries through this method, and not necessarily the actual application, then something like NuGet (a package manager) is an option. NuGet itself is still a bit immature and needs to grow up, but I think it should fit your basic scenario.
If you have published software, a bootstrap on the client that calls for updates and then runs the update installer is a good pattern.
The basic answer is that you have options, depending on which bits you are employing, and you should take advantage of the one(s) that fit your needs.
Time to ask the pros, since I can't find a good answer anywhere else and I'm venturing into a side of the world that I'm just learning.
I'm in a primarily open-source shop that has recently begun taking on a lot of internal tools and partners that are .NET based. That got me thinking that I might be able to get the best of both worlds by leveraging C#/Mono in certain spaces. On a small scale I've been very successful, and it's working great. However, pressing 'Build' and scp'ing the exe into place isn't going to scale well.
I'd like to step it up a bit and get some more resources behind it, so here's my question: what are the baseline resources I need to establish a good dev/testing/staging environment?
I don't need uber-detailed information, and I'm willing to consider both commercial and open-source solutions; I guess I'm mostly looking for good advice on resources. 99% of the items developed on either side of the OS line will be services.
What sort of Unit/Regression testing tools are recommended, is NUnit the standard?
What sort of deployment mechanisms are recommended for service level software?
What, if any, additional tools have you found useful or indispensable during your development/design work?
The first two items are of most interest, since they are the last things I'm lacking before I have a workable, repeatable development and deployment process.
You might want to look into http://go-mono.com/monovs/
It will allow you to debug on Linux from within Visual Studio.
The unit-testing framework in Visual Studio is rather good as well, but if you use the Standard or free edition of Visual Studio, NUnit is a good option too (and there is the option of Visual Studio integration).
Aside from that, I've become quite attached to Refactor Pro (and other products by that company):
http://www.devexpress.com/Products/Visual_Studio_Add-in/Refactoring/
As for scp'ing the files to your Linux/Mac machines, it might be easier to configure MSBuild to do that for you automatically.
This might help: http://bartdesmet.net/blogs/bart/archive/2006/04/13/3896.aspx
Many more msbuild tasks can be found here: http://msbuildcontrib.codeplex.com/
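As a rough sketch of that build-then-copy step under Mono (the solution, paths, and host names are invented):

xbuild MyService.sln /p:Configuration=Release    # Mono's MSBuild equivalent
scp MyService/bin/Release/MyService.exe deploy@linuxbox:/opt/myservice/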
I hope this helps.
For build and deployment you might give NAnt a try. It'll handle your builds, and has tasks for running your tests, doing clean SVN checkouts, zipping up releases, and that kind of thing. You can embed C# too. Grab the nightlies rather than the releases, and don't worry too much about the lack of recent activity. Also, the NAnt-Contrib project is full of additional goodies.
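A typical invocation, just to give the flavor (the build file and target names are illustrative):

nant -buildfile:default.build clean checkout build test package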
Another option is to try MSBuild (I believe there's a Mono equivalent, although I'm not sure how complete it is). Truth be told, there's not a lot of difference between the two.
I have built effective build/test/deploy infrastructure with the following:
NUnit
CruiseControl.NET (or CruiseControl)
NAnt (and NAnt Contrib)
or MSBuild (depends on your environment)
We also use Subversion both for source control and for deployment (for things like CMS and website systems).
A few of the build tools we use are:
Simian
NCover
NDepend
Powershell (for both build automation as well as deploy automation and machine control)
Of course, any of these tools can be substituted with other tools you like (Perl, Python, Ruby, Ant, etc.).
This is roughly how I've set up my environment at work:
I use NUnit as the unit-testing platform.
I use TestDriven.NET as a plugin to easily run my unit tests from within the IDE.
I've set up a separate computer, which runs CruiseControl.NET
This CruiseControl.NET machine checks my source repository at regular intervals. When it sees that something has changed, it gets the latest version from the repository and builds it. It also runs the unit tests and FxCop over the targets.
In addition, I've configured it to perform a nightly build as well (a rough script equivalent follows the list below). This does roughly the same:
When something has changed during the day:
remove every file that exists locally
get the latest version from the source repository
build it
run the unit tests
run FxCop
create documentation using Sandcastle Help File Builder
when the build was successful, copy the build output to a separate folder named 'build-yyyymmdd'
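A rough shell equivalent of those nightly steps (the server URL, paths, and tool locations are all assumptions; the Sandcastle step is omitted):

rm -rf work
svn checkout http://svnserver/repo/trunk work && cd work
msbuild Solution.sln /t:Rebuild /p:Configuration=Release
nunit-console Tests/bin/Release/Tests.dll
fxcopcmd /file:bin/Release/MyApp.dll /out:fxcop-report.xml
cp -r bin/Release "../build-$(date +%Y%m%d)"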
I've set up my source repository so that I can keep different versions (branches) of my project.
In short, my source-repository looks like this:
I have a folder called 'devtrunk', which contains the actual codebase (the one I'm actively developing on).
I have a folder called 'releases'. Every time I release a new version, I make a branch of the trunk and put this branch in a new folder under 'releases'. This allows me to fix bugs in a version that has been released without disturbing my actual work on the trunk.
Since I'm working on the Windows platform, I use MSBuild to create my build scripts (which are executed by CruiseControl), but you can use NAnt instead (which I've used as well).
I want to create an ASP.NET build server for the first time since I've never used it.
Does anyone have a tutorial or resource on how to make an ASP.NET build server?
Or can anyone tell me how it's done?
If by "create" you mean "setup a build server" then I suggest you take a look at TeamCity from JetBrains.
TeamCity is a multi-purpose build server and can be used to build ASP.NET projects as well. You can get up and running for free, and it's very easy to set up compared to CruiseControl.NET.
Take a look at MSBuild to see how to do specific ASP.NET build stuff.
MSBuild reference
How to use MSBuild to do ASP.NET compilation (video)
You might need something from the msbuildtasks open source task collection
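For the precompilation piece specifically, ASP.NET ships a command-line compiler; for example (paths are placeholders):

# precompile the site; -f overwrites an existing target directory
aspnet_compiler -v / -p C:\src\MyWebSite -f C:\build\MyWebSite.Precompiled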
If you really want to create your own build server from scratch (but why?), I can't help you.
You could make a build server using CruiseControl.NET which can build your project.
CruiseControl.Net Tutorial – Part 1
CruiseControl.Net Tutorial – Part 2
There is no ASP.NET build server as such.
Do you have a one-click build script? If not, you should create that first. Once you are able to run a single command and get a complete build, then it is easy to set up CruiseControl or some other build server.
Given the strength of the build servers out there, it's really not sensible to spend any time developing your own.
You will, however, need at a minimum:
a one-click build script (a minimal sketch follows this list)
a source code repository (e.g. Subversion, TFS, or even [shudder] SourceSafe)
a server to use as a build box (I use a virtual image)
You may also find a one-click deployment script, written using something like PowerShell, to be useful too.
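A one-click build script can start out very small; a minimal sketch (the repository layout and test assembly name are assumptions):

svn update                                        # or the equivalent for your VCS
msbuild MySite.sln /t:Rebuild /p:Configuration=Release
nunit-console Tests/bin/Release/Tests.dll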
Note that a very effective alternative to CruiseControl.NET and TeamCity is Hudson. Although it's written in Java, it's ridiculously simple to get going with a servlet container like Tomcat.
The key strength of Hudson is the range of plug-ins, which allow you to monitor most version-control systems and then not just build (through MSBuild or even the command line) but also run unit tests, acceptance tests, and so on.
You might look at Web Deployment Projects: they allow you to build your site and merge all the DLLs into a single file, for a fully pre-compiled site. You can use them with MSBuild.
An option on the automation side is Team Foundation Server's (TFS) automated builds. TFS also includes source control, bug tracking and many other features you may or may not need.