Automated deployment to many servers with different app.config settings? - c#

I know there are many questions like this one on SO, but I haven't found a good solution so far. The best solutions I've seen are homegrown, but before implementing a custom tool, I'd like to hear your take. So here we go:
I have a .NET solution with a couple of web applications and a few Windows services. I want to automate the rollout of these applications to, say, 10-20 different servers - but the app/web.config files on each server may have different values.
Microsoft's answer to this issue is to keep 10-20 different web.config files locally on the dev machine and then use Visual Studio's Configuration Manager to choose the right one. But that's not good enough, because the developers don't know the production server settings, nor should they!
The ideal solution would include a "deployment model" of some sort, where the production servers and their settings are defined, and which could be used by a deployment script (could be PowerShell) as a build step on the build server (I'm using TeamCity). This could be done by replacing the config settings before XCopying the solution to the remote server, but doing that by hand is a tedious and time-consuming task.
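To make the "deployment model" idea concrete, here is a minimal PowerShell sketch. Everything in it is hypothetical: a servers.csv file holding the per-server values and a tokenized Web.template.config checked in alongside the solution.

    # Hypothetical deployment model: servers.csv with columns Server,ConnectionString,SmtpHost
    $servers = Import-Csv .\servers.csv

    foreach ($s in $servers) {
        # Read the tokenized template and substitute the per-server values
        $config = Get-Content .\Web.template.config -Raw
        $config = $config -replace '__CONNECTION_STRING__', $s.ConnectionString
        $config = $config -replace '__SMTP_HOST__', $s.SmtpHost

        # Write the resolved config, then copy the build output to the server
        Set-Content -Path .\BuildOutput\Web.config -Value $config
        Copy-Item -Path .\BuildOutput\* -Destination "\\$($s.Server)\wwwroot\MyApp" -Recurse -Force
    }

A script like this can run as a TeamCity build step, so the settings file lives with the ops side rather than with the developers.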
Another solution could be to use "configSource" to point to a folder with a fixed name, but the problem here is that configSource only works on individual sections, so some parts of the config file (such as the system.serviceModel section group) can't use it.
So I haven't found the best answer to this. Any ideas?

Part of our deployment model (also a TeamCity centralised build going to many different servers) was to automatically create deployment scripts as part of the MSBuild file and base the deployment around MSDeploy/Web Deploy 2.0.
The build would automatically produce a build candidate suitable for deployment with MSDeploy, and would also knock up a PowerShell/cmd scriptlet that would pick the appropriate config file and copy it into place.
A deployment to all servers then just becomes a case of stringing these individual deployment scripts together (i.e. with a batch file). Since MSDeploy only sends over file changes, it's normally quite speedy, and it can be used to take backups as well as do deployments. As part of the deploy script (sketched below), it will:
Take a backup of the appropriate server (e.g. Web1) and stick it on a network share
Deploy the appropriate package to the server (Web1), transforming any files as necessary (e.g. Web.Web1.Config -> Web.config)
Write any necessary logs
The build process also spits out an 'undo' script that will restore the appropriate server to the backup.
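For illustration, a stripped-down version of such a scriptlet in PowerShell. The site name, share path and Web Deploy install location are placeholders; check the msdeploy arguments against your Web Deploy version.

    # Per-server deploy scriptlet (all names are placeholders)
    $server   = 'Web1'
    $msdeploy = "$env:ProgramFiles\IIS\Microsoft Web Deploy V2\msdeploy.exe"

    # 1. Back up the current content from the target server to a network share
    & $msdeploy '-verb:sync' `
        "-source:contentPath=MySite,computerName=$server" `
        "-dest:package=\\backups\$server-$(Get-Date -Format yyyyMMdd-HHmm).zip"

    # 2. Drop the right config into the package folder, then sync it to the server
    Copy-Item ".\configs\Web.$server.config" '.\Package\Web.config' -Force
    & $msdeploy '-verb:sync' `
        '-source:contentPath=.\Package' `
        "-dest:contentPath=MySite,computerName=$server"

    # 3. Write any necessary logs
    Add-Content .\deploy.log "$(Get-Date -Format s) deployed to $server"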
There's more on MSDeploy here. It can also be used with databases etc.
Just a suggestion, it might be helpful :)

[Blatant vendor post] We use exactly this sort of deployment model in our uDeploy product.
The basic idea is that you define a single deployment process that includes steps for updating configuration files (app.config and web.config are the most common for ASP.NET apps). Those variances can be per server or per logical environment (dev, test, QA, stage, prod...). Alternatively, you can put a template app.config file directly in uDeploy and we will write it out at deployment time.
The tool integrates with TeamCity to retrieve builds, as well as with IIS for deploying and configuring application pools and servers. It's also designed to track how multiple services and web applications, which might be separate TeamCity builds, come together as a release set.
On the surface, it sounds like we might have a decent fit. Feel free to reach out to me directly at eric#urbancode.com. Cheers! [/blatant vendor post]

I'm using the XmlPreprocess tool for config file manipulation. It uses a single mapping file for multiple environments/servers, which you can edit in Excel. It is very easy to use.
Call XmlPreprocess in your custom PS scripts and pass the server name as the environment parameter.
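For example, something along these lines (the switch spellings are from memory and may differ by XmlPreprocess version, so verify them against the tool's help output):

    param([string]$ServerName)

    # Transform the config for the target server/environment using the Excel mapping file.
    # Switch names (/i, /o, /s, /e) are assumptions -- check XmlPreprocess.exe /?
    & .\XmlPreprocess.exe /i:Web.template.config /o:Web.config `
        /s:settings.xls "/e:$ServerName"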

Related

How to automate development machine for MVC IIS

I don't know if this is entirely possible as a one-click-and-done option, but what I would like is to automate our IIS MVC development setup for new developers.
Basically, what I am looking for is:
App pool creation if one is not already created (and make sure it targets .NET 4.0)
Creation of an IIS Application under the above app pool that points to the source code
Figure out if aspnet_regiis -i is needed (in case IIS was installed before .NET 4.0 was introduced)
I am not looking for a publish option, as that does not point to the source code. Also, I am not looking to use the built-in VS host, since I'm trying to use IIS to make sure everything is set up appropriately. Maybe I shouldn't impose these limits and don't need to worry about setting up the machine as described? But if this does sound correct, is there a way to set this up straight out of source control, or do I need an initial setup phase for all new developers?
Although I can't write out exactly what you want, PowerShell will do it. Here's a tutorial for creating web sites and app pools:
http://learn.iis.net/page.aspx/433/powershell-snap-in-creating-web-sites-web-applications-virtual-directories-and-application-pools/
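As a rough sketch of those three steps with the WebAdministration module (run elevated; all names and paths below are examples only):

    Import-Module WebAdministration

    $poolName = 'MvcDevPool'
    $appName  = 'MvcApp'
    $srcPath  = 'C:\Source\MyMvcApp'   # the working copy straight from source control

    # 1. Create the app pool if it doesn't exist, and pin it to .NET 4.0
    if (-not (Test-Path "IIS:\AppPools\$poolName")) {
        New-WebAppPool -Name $poolName
    }
    Set-ItemProperty "IIS:\AppPools\$poolName" -Name managedRuntimeVersion -Value 'v4.0'

    # 2. Create the IIS application under that pool, pointing at the source checkout
    if (-not (Get-WebApplication -Site 'Default Web Site' -Name $appName)) {
        New-WebApplication -Name $appName -Site 'Default Web Site' `
            -PhysicalPath $srcPath -ApplicationPool $poolName
    }

    # 3. Register ASP.NET 4.0 with IIS only if it isn't registered yet (-iru does that check);
    #    use the Framework (not Framework64) folder on 32-bit machines
    & "$env:windir\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe" -iru

A script like this could live in source control itself, so a new developer checks out once and runs it.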
What I would suggest is setting up the full environment on one computer, then making a boot disk/image of it and restoring that image onto each new machine you want the environment on (note: the machines must be of the same type).
If it must work across different computers and different operating systems, then the complexity of automating the entire environment becomes greater than just setting it up by hand, unless you are creating environments as frequently as daily. If that is the case, then perhaps you could write a custom installation program to do it, though that's not something I'm very familiar with off the top of my head.

Automated deployments with Kentico

Does anybody have experience automating deployments with Kentico? E.g. the difficulty of synchronizing document types, bizforms, etc. to another server?
I've used the built-in content staging module to do this sort of thing. Unfortunately it's not all unicorns and rainbows. There were definitely some bugs in the module, which essentially serializes the data on one server and deserializes it on the target server.
That was back in version 5.5 or 5.5R2 though, and they released version 6 a few months ago. I would take some time to look at the documentation for its limitations, and then maybe give it a test before committing to it. It can definitely work for some, but it may not be content-editor friendly.
Kentico Developer Documentation on Content Staging Module
Another possibility would be to utilize a tool that does database comparisons and syncing. I've used the SQL Examiner Suite before, but I've heard that Red Gate makes good tools too.
SQL Examiner
SQL Data Examiner
Red Gate Tools SQL Compare
While this probably isn't the best method, it can work. If you're not making significant changes on a regular basis, this can be good for one-off syncs between your local/dev server and production. It probably wouldn't be a good solution for "content staging", but more for changes that occurred due to development-oriented tasks.
Another option is to use the Export/Import feature in Kentico: http://devnet.kentico.com/docs/6_0/devguide/index.html?export_and_import_overview.htm.
I haven't automated this process, but you can have a look at the ExportManager class in Kentico's API Reference: http://devnet.kentico.com/Documentation.aspx.
Hope this helps
With Kentico 10 you could use the Continuous Integration feature. It now works much better than in Kentico 9.
With the Continuous Integration feature, database objects can be deployed together with the code files and are serialized automatically into the target database.
If you do not want to use this module, you need to use the object export feature in Kentico (Site => Export site or objects).
In both scenarios you have to know that content (pages) is difficult to stage between different servers. Content staging is only useful if you have a "real" staging server, where content editors prepare the content that should be staged to the live server on time.
In case you want to stage from a DEV server to the LIVE server, the pages will be overwritten by the dev version if the page GUIDs match.
If you use Continuous Integration, all pages which are not in the DEV server instance will be deleted!
All other objects (development objects like templates, web parts, page types, etc.) can be imported without any issues.

How to sync compiled code to multiple EC2 instances

We have several EC2 instances behind a load balancer. Each server has several ASP.NET applications deployed to it. I'm looking for an easy, realtime, automated way to deploy new compiled code to all instances simultaneously.
I've seen solutions using source control repositories like SVN or Git, but this doesn't seem like an appropriate use of the technology for us since we're deploying compiled code to the EC2 instances - not source code.
I've also set up Dropbox to accomplish the sync. It somewhat works, but has its quirks. For instance, you need to build your directory structure around the "one root sync folder" limitation. Any other reason why we definitely should NOT use Dropbox for this?
Writing a custom application using the S3 API is an option, but we'd prefer a third party solution over writing more code.
This seems like a common scenario, but I haven't found any good solutions yet.
Elastic Beanstalk seems to be the best route to go now. You simply push your web deploy project to an Elastic Beanstalk environment and it deploys the code to all of your instances. (It manages auto scaling for you.) It also makes sure that newly launched instances get your latest code, and it keeps previous versions which you can easily roll back to.
If your asp.net website needs to be auto scaled on AWS, Elastic Beanstalk is really the best end-to-end solution.
Since these are ASP.NET applications and IIS, why not use Web Deploy? It's MADE for this.
http://www.iis.net/download/webdeploy
Web Deploy allows you to efficiently synchronize sites, applications or servers across your IIS 7.0 server farm by detecting differences between the source and destination content and transferring only those changes which need synchronization. The tool simplifies the synchronization process by automatically determining the configuration, content and certificates to be synchronized for a specific site. In addition to the default behavior, you still have the option to specify additional providers for the synchronization, including databases, COM objects, GAC assemblies and registry settings.
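In practice you could push a new build to one instance and then fan it out to the rest with the webServer provider, which syncs IIS configuration and content between machines. A sketch in PowerShell, assuming Web Deploy is installed on every instance and the host names are known:

    # Hypothetical instance list; web1 is the box that already has the new build
    $instances = 'web2.example.com', 'web3.example.com'
    $msdeploy  = "$env:ProgramFiles\IIS\Microsoft Web Deploy V2\msdeploy.exe"

    foreach ($target in $instances) {
        # Sync IIS config and content from the source server to each target,
        # transferring only the files that changed
        & $msdeploy '-verb:sync' `
            '-source:webServer,computerName=web1.example.com' `
            "-dest:webServer,computerName=$target"
    }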
You can use Git, Mercurial or SVN to push compiled code to the servers, or to have the servers fetch code. Source control is not only for source code - it can be used for files of any type.
Also, one way around the Dropbox issue is to use multiple Dropbox accounts, if that's the problem. But Dropbox is a pretty easy solution because you never need to write any code. As long as Dropbox is up, it will work.
You might want to give AppHarbor a try. We take care of managing ASP.NET application servers, load balancers and all the other required infrastructure, leaving you to get on with developing your application. We also provide a convenient way for you to push new versions of your app using your choice of Git, Mercurial, Subversion and TFS.
Git or Mercurial will do a good job at that; Subversion is bad at handling blobs.
And you get very nice control and assurance that the code got deployed everywhere, by looking at the revisions.
Seems obvious but, shared filesystem? Or push out with scp or rsync?

Is Team Foundation Server the right solution to automatically publish .net website to remote server?

We currently build our .net website in C# in Visual Studio 2010 Pro on our dev server, then manually publish it and upload to the live server where it is copied over the current files to go live.
We want to automate this process as much as possible and if possible push it at a certain time, such as every day at midnight. We don't currently use any Source Control so this probably makes it essential anyway...
Is Team Foundation Server [TFS] the best solution to enable this? If so, how much would it cost our client for this or how can we find out? We're in the UK and they do have an MSDN subscription.
At this point, you need to slow down and set more realistic goals. Here's my biggest red flag:
"We don't currently use any Source
Control so this probably makes it
essential anyway..."
Without proper SCC, you're not capable of getting where you need to go. A full-scale TFS implementation can most certainly do what you want, and it has a couple of really nice features that you can use to integrate automated deployment scenarios, which is great, but you really need to learn to walk before you can learn to run.
I've commented on TFS cost before, so I won't do that in this post, but suffice it to say that a TFS implementation which does what you want will cost a significant amount of effort, especially if you factor in the time it will take you to set it up and script out the automated publishing workflow you want.
I don't know what your budgets are, how big your teams are, or the nature of your development strategy, or a number of other things that might actually change my answer, but I'm proceeding on the assumption that you have a limited budget and no dedicated staff to draw upon to set up a first-class TFS implementation. So here's what I would recommend (in this order!):
Set up version control using something that's free, such as Subversion or Git. For an organization that's just starting off with SCC, I'd recommend Subversion over Git, because it's conceptually a lot simpler to get started with. This is the bedrock of everything you're going to do. Unlike adding a fuze to a 2000-pound bomb or assembling a bicycle, I'd recommend that you read the manual before and during your SVN installation.
Make a build file using MSBuild. Yes, you could use NAnt, but MSBuild is fairly equivalent in most scenarios and is a bit more friendly with TFS, if you ever decide to go in that direction in the distant, distant future. Make sure your builds work properly on your development boxes and servers.
Come up with a deployment script (see the sketch after this list). This may very well just equate to a target in your MSBuild file, or it may be an MSI file -- I don't know your environment well enough to say, but guessing by the fact that you said you copied stuff over to production, an MSBuild target will likely suffice.
Set up a Continuous Integration server such as Hudson or CruiseControl.NET. I personally use CruiseControl, but the basic idea behind both is that they are automated services which watch your SCC system for changes and perform the builds for you. If you have set up a MSBuild target to perform your deployment, you can configure a "project" in CCNET (or probably Hudson) to do the deployment as well.
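As a toy version of steps 2 and 3 (every path and server name below is a placeholder), this is roughly the script your CI server would end up invoking:

    # Build-and-deploy sketch for a CI server (CCNET/Hudson) to invoke
    $msbuild = "$env:windir\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"

    # Step 2: build the solution
    & $msbuild .\MySite.sln /p:Configuration=Release /verbosity:minimal
    if ($LASTEXITCODE -ne 0) { throw 'Build failed; aborting deployment.' }

    # Step 3: mirror the publish output to the web server (note: /MIR deletes
    # files on the destination that aren't present in the source)
    robocopy .\PublishOutput \\liveserver\wwwroot\MySite /MIR /NP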
The total software cost of these solutions is $0, but you will likely face quite a learning curve on it all. TFS's learning curve, IMO, is even steeper, and the software cost is definitely north of $0. Either way, the takeaway is not to try to bite it all off in one chunk at one time, or you will probably fail. Go step by step, and you will get there. And have fun! I personally loved learning about all of this stuff!
If your client has MSDN then TFS is free!
Whether you have MSDN Professional, Premium or Ultimate, you get both a CAL to access ANY TFS server and a licence to run a TFS server in production included. You just need to make sure that all your users have MSDN. If they do not, then you can buy a retail TFS licence for $500, which covers the first 5 users without CALs. You can then add CAL packs, which are cheaper than MSDN, for users who need access to the data. If you have internal users that need to access only the work items that they have created, then they are also FREE.
As long as the person that kicks off the build has an MSDN licence, your build server is also free. You can use the sequence that Dean describes, but I would suggest shelling out a little cash for FinalBuilder and using it to customise the process. It integrates well with TFS and provides a nice UI.
The advantage is you get Dev->test->Deploy all recorded, audited and reportable in one product...
http://www.finalbuilder.com/download.aspx
Sounds like you are after a Continuous Integration (Build) server.
One I have used a lot with .NET is Hudson.
You can kick off a build with a number of triggers, such as at a particular time, and run various steps in sequence. These can include running batch commands (Windows or Linux, depending on the platform you run it on) or running MSBuild. It has a lot of plugins and most major tools are supported.
A common sequence for apps that we build is:
Update from source control (but that doesn't mean you can't do something like take a copy from a file share)
Compile using MSBuild
Run unit tests using NUnit
Deploy the built project to a test server
TFS Team Build is certainly capable of doing what you wish by setting up a build which executes the Deploy target of a web app. The key is MSDeploy, which in reality can be executed in many ways and is not dependent upon any one tool. You could simply schedule a task to execute MSDeploy.
See these two links for more information:
http://weblogs.asp.net/scottgu/archive/2010/07/29/vs-2010-web-deployment.aspx
http://www.hanselman.com/blog/WebDeploymentMadeAwesomeIfYoureUsingXCopyYoureDoingItWrong.aspx
There are a lot of really cool things you can do with the new Build system based on Windows Workflow in TFS 2010. Learning to customize your Build Process Templates is a worthwhile investment.
Well, TFS is the ultimate solution, but to reduce cost you can make use of MSBuild to achieve your task. You can create a Windows scheduled task which fires MSBuild at a particular time. There is an open-source MSBuild task library available at http://msbuildtasks.tigris.org/ through which you can even upload your files via FTP.
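For instance, the task could be registered with schtasks (paths are placeholders; quoting arguments inside /tr gets fiddly, so a small wrapper .cmd keeps it simple):

    # deploy.cmd would contain something like:
    #   %windir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe C:\Builds\MySite.sln /p:Configuration=Release

    # Register a task that fires the build/deploy every day at midnight
    schtasks /create /tn "NightlyWebDeploy" /tr 'C:\Builds\deploy.cmd' /sc daily /st 00:00 /ru SYSTEM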
You need to do Continuous Integration. TFS 2010 is fully capable of doing this for you, but before you continue you should move your sources to TFS Source Control Management. We are doing the same thing you need: all our sources reside in TFS; with each check-in, a build occurs on the build server and is then deployed to a remote live server.

Storing data in web.config (custom section/appSettings element) vs storing it in a class

Why is it better to store data inside an appSettings element (or inside a custom section) of a web.config file than to store it in a class?
One argument would be that by using custom sections we don’t have to recompile code when we change data, but that’s a weak argument, especially if we’re using Web Sites, which get recompiled automatically whenever code changes!
Thank you
Because you can change it on the fly and use it without regard to class structure. Your configuration can vary from each developer's machine to staging to the deployment environment by changing and maintaining a single file independently of the code, and you can take advantage of *.config masking with different areas of your site.
Hard-coding anything configurable is a recipe for failure, and it absolutely will bite you. This is just a matter of experience; if you don't believe it, you have but to wait a little while!
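To make the "change it on the fly" point concrete, here's a small PowerShell sketch (the path and key are hypothetical) that flips a deployed setting with no rebuild and no redeploy:

    # Change a single appSetting in a deployed web.config
    $path = 'C:\inetpub\wwwroot\MyApp\web.config'
    $xml  = [xml](Get-Content $path)

    $setting = $xml.configuration.appSettings.add | Where-Object { $_.key -eq 'SmtpHost' }
    $setting.value = 'mail.newhost.local'

    $xml.Save($path)   # IIS notices the web.config change and recycles the app

Had the value been hard-coded in a class, the same change would mean rebuilding and redeploying the binaries.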
By putting settings into web.config, you have them all in a centralized location.
Also, when deploying a web site you might want to precompile it once, and then you won't be able to change the source afterwards (without another recompilation).
It's not really a concern of just recompiling the code, it's more about re-deploying the code. Normally, you don't deploy code to the web server, you just deploy the binaries and aspx/html files. If you hard-code your config data in the code, you'll have to rebuild and redeploy the library or application to get the change up to the server, which is a lot more work than just updating the web.config.
Putting data in the web.config file also allows the same code to be run in different environments with different environment-dependent data. This can mean running the same website code in staging with a test database connection string and in the production environment with the production database connection string. Or it could mean allowing the developers to configure the data for their own tests without changing any code, as 'annakata' mentioned.
It's just a WHOLE lot easier to manage and update the settings.
If you're using Notepad to do your development and putting the code out on the server, I would agree that there is little benefit, but if you're using Visual Studio and you build your website and publish it, you're publishing the pre-compiled DLLs, not just updating text source code (.cs or .vb files) on the server. So when it comes time to update a setting, anything in the web.config can be changed by simply modifying the text file, whereas with other changes you have to re-compile the whole web site and publish it.
And from experience, that becomes tricky when taking over from other developers who weren't careful about ensuring everything needed to make a web site work is in source control. I'm now stuck with a web site where we can't update huge chunks of it because of (kindly putting it) non-standard practices in the past.
Being able to update something without re-publishing the site is a huge blessing in my situation, and you never know who the poor maintenance programmer will be that takes over on your code.
Be nice to him or her. Make it easy to make simple changes.
