How to sync compiled code to multiple EC2 instances - C#

We have several EC2 instances behind a load balancer. Each server has several ASP.NET applications deployed to it. I'm looking for an easy, real-time, automated way to deploy new compiled code to all instances simultaneously.
I've seen solutions using source control repositories like SVN or Git, but this doesn't seem like an appropriate use of the technology for us since we're deploying compiled code to the EC2 instances - not source code.
I've also set up Dropbox to accomplish the sync. It somewhat works, but has its quirks. For instance, you need to build your directory structure around the "one root sync folder" limitation. Is there any other reason why we definitely should NOT use Dropbox for this?
Writing a custom application using the S3 API is an option, but we'd prefer a third party solution over writing more code.
This seems like a common scenario, but I haven't found any good solutions yet.

Elastic Beanstalk seems to be the best route to go now. You simply push your web deploy project to an Elastic Beanstalk environment and it deploys the code to all of your instances. (It manages auto scaling for you.) It also makes sure that newly launched instances get your latest code, and it keeps previous versions that you can easily roll back to.
If your ASP.NET website needs to auto scale on AWS, Elastic Beanstalk is really the best end-to-end solution.

Since these are ASP.NET applications on IIS, why not use Web Deploy? It's MADE for this.
http://www.iis.net/download/webdeploy
Web Deploy allows you to efficiently synchronize sites, applications or servers across your IIS 7.0 server farm by detecting differences between the source and destination content and transferring only those changes which need synchronization. The tool simplifies the synchronization process by automatically determining the configuration, content and certificates to be synchronized for a specific site. In addition to the default behavior, you still have the option to specify additional providers for the synchronization, including databases, COM objects, GAC assemblies and registry settings.
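The core idea described above (detect differences between source and destination, transfer only the changes) can be sketched in a few lines. This is a minimal illustration in Python, not Web Deploy itself; the hashing scheme and function names are assumptions made for the example.

```python
import hashlib
import shutil
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def sync_changed(src: Path, dst: Path) -> list[str]:
    """Copy only files that are new or whose content differs.

    Returns the relative paths that were actually transferred.
    """
    transferred = []
    for src_file in src.rglob("*"):
        if not src_file.is_file():
            continue
        rel = src_file.relative_to(src)
        dst_file = dst / rel
        # Skip files whose content already matches the destination.
        if not dst_file.exists() or file_hash(src_file) != file_hash(dst_file):
            dst_file.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, dst_file)
            transferred.append(str(rel))
    return sorted(transferred)
```

Skipping unchanged files is what makes repeated deployments fast; Web Deploy applies the same principle, plus site configuration and certificates, across a server farm.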

You can use Git, Mercurial or SVN to push compiled code to the servers, or to have the servers fetch the code. Source control is not only for source code - it can be used for files of any type.
Also, one way around the "one root sync folder" limitation is to use multiple Dropbox accounts. Dropbox is a pretty easy solution because you never need to write any code; as long as Dropbox is up, it will work.

You might want to give AppHarbor a try. We take care of managing ASP.NET application servers, load balancers and all the other required infrastructure, leaving you to get on with developing your application. We also provide a convenient way for you to push new versions of your app using your choice of Git, Mercurial, Subversion or TFS.

Git or Mercurial will do a good job at that; Subversion is bad at handling blobs.
You also get very nice control and assurance that the code got deployed everywhere by looking at the revisions.

Seems obvious but, shared filesystem? Or push out with scp or rsync?


How to automate development machine for MVC IIS

I don't know if this is entirely possible as a one-click option, but what I would like is to automate our IIS MVC development setup for new developers.
Basically, what I am looking for is:
App pool creation if one does not already exist (making sure it uses the correct version, .NET 4.0)
Creation of an IIS Application under the above app pool that points to the source code
Figure out if aspnet_regiis -i is needed (in the case that IIS was installed before 4.0 code was introduced)
I am not looking for a publish option, as that does not point to the source code. Also, I am not looking to use the built-in VS host, since I am trying to use IIS to make sure everything is set up appropriately. Maybe I should not impose these limits and should not worry about setting up the machine as described? But if this does sound correct, I am currently looking for a way to set this up straight out of source control. Is that possible, or do I need an initial setup phase for all new developers?
Although I can't write out exactly what you want, PowerShell will do it. Here's a tutorial for creating web sites and app pools:
http://learn.iis.net/page.aspx/433/powershell-snap-in-creating-web-sites-web-applications-virtual-directories-and-application-pools/
What I would suggest is setting up the full environment on one computer and then making a disk image of it. Then restore that image onto another computer (note: it must be of the same type) where you want the environment.
If it must work across different computers and different operating systems, then automating the entire environment setup becomes more complex than just doing it by hand, unless you are setting up environments as frequently as daily. If that is the case, then perhaps you could write a custom installation program to do it, though I am not very familiar with that off the top of my head.

Automated deployments with Kentico

Does anybody have experience automating deployments with Kentico? E.g. the difficulty of synchronizing document types, bizforms etc to another server?
I've used the built-in content staging module to do this sort of thing. Unfortunately it's not all unicorns and rainbows. There were definitely some bugs in the module, which essentially serializes the data from one server and deserializes it on the target server.
That was back in version 5.5 or 5.5R2 though, and they released version 6 a few months ago. I would take some time to look at the documentation for its limitations, and then maybe give it a test before committing to it. It can definitely work for some, but it may not be content-editor friendly.
Kentico Developer Documentation on Content Staging Module
Another possibility would be to utilize a tool that does database comparisons and syncing. I've used the SQL Examiner Suite before, but I've heard that Red Gate makes good tools too.
SQL Examiner
SQL Data Examiner
Red Gate Tools SQL Compare
While this probably isn't the best method, it can work. If you're not making significant changes on a regular basis, this can be good for one-off syncs between your local/dev server and production. This probably wouldn't be a good solution for "content staging", but more for changes that occurred due to development-oriented tasks.
Another option is to use the Export/Import feature in Kentico: http://devnet.kentico.com/docs/6_0/devguide/index.html?export_and_import_overview.htm.
I haven't automated this process, but you can have a look at the ExportManager class in Kentico's API Reference: http://devnet.kentico.com/Documentation.aspx.
Hope this helps
With Kentico 10 you can use the Continuous Integration feature. It now works much better than in Kentico 9.
With the Continuous Integration feature, database objects can be deployed together with the code files and are serialized automatically into the target database.
If you do not want to use this module, you need to use the object export feature in Kentico (Site => Export site or objects).
In both scenarios you have to know that content (pages) is difficult to stage between different servers. Content staging is only useful if you have a "real" staging server, where content editors prepare the content that should be staged to the live server in time.
In case you want to stage from a DEV server to the LIVE server, the pages will be overwritten by the dev version if the GUID of the page matches.
If you use Continuous Integration, all pages which are not in the DEV server instance will be deleted!
All other objects (development objects like templates, web parts, page types, etc.) can be imported without any issues.

Automated deployment to many servers with different app.config settings?

I know there are many questions like this one on SO, but I haven't found a good solution so far. The best solutions I've seen are homegrown, but before implementing a custom tool, I'd like to hear your take. So here we go:
I have a .NET solution with a couple of web applications and a few windows services. I want to automate the rollout of these applications to, say, 10-20 different servers - but the app/web.config files on each server may have different values.
Microsoft's answer to this issue is to have 10-20 different web.config files locally on the dev machine and then use the configuration manager to choose the right one. But that's not good enough, because the developers don't know about the production server settings, nor should they!
The ideal solution would be to include a "deployment model" of some sort where the production servers and their settings are defined, and which could be used with some deployment script (could be PowerShell) as a step in the build server (I'm using TeamCity). This could be done by replacing the config settings before XCopying the solution to the remote server, but that's a tedious and time-consuming task.
Another solution could be to use "configSource" to point to a folder with a fixed name, but the problem here is that some parts of the config files (such as serviceModel) can't be used with configSource.
So I haven't found the best answer to this. Any ideas?
Part of our deployment model (also a TeamCity centralised build going to many different servers) was to automatically create deployment scripts as part of the MSBuild file and base the deployment around MSDeploy/Web Deploy 2.0.
The build would automatically produce a build candidate suitable for deployment with MSDeploy, and would also knock up a PowerShell/cmd scriptlet that would pick the appropriate config file and copy it into place.
A deployment to all servers then just becomes a case of stringing these individual deployment scripts together (i.e. with a batch file). Since MSDeploy only sends over file changes it's normally quite speedy, and it can be used to take backups as well as do deployments, so as part of the deploy script it will:
Take a backup of the appropriate server (eg Web1) and stick it on a network share
Deploy the appropriate package to the server (Web1) transforming any files as necessary (eg Web.Web1.Config -> Web.config)
Write any necessary logs
The build process also spits out an 'undo' script that will restore the appropriate server to the backup.
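The per-server steps above (back up, deploy the package, activate the server-specific config) can be sketched roughly as follows. This is a hedged Python illustration, not the actual MSBuild/MSDeploy scriptlets; the `Web.<server>.config` naming and the function signature are assumptions for the example.

```python
import shutil
from pathlib import Path

def deploy_to_server(package: Path, server_root: Path,
                     server_name: str, backup_root: Path) -> Path:
    """Back up the current site, deploy the package, and activate
    the server-specific config (Web.<server>.config -> Web.config).

    Returns the path of the backup, which an 'undo' script can restore.
    """
    # 1. Take a backup of the current deployment (e.g. onto a network share).
    backup = backup_root / f"{server_name}.bak"
    if server_root.exists():
        shutil.copytree(server_root, backup, dirs_exist_ok=True)
    # 2. Deploy the package over the existing site.
    shutil.copytree(package, server_root, dirs_exist_ok=True)
    # 3. Pick the config for this server and put it in place.
    per_server = server_root / f"Web.{server_name}.config"
    if per_server.exists():
        shutil.copyfile(per_server, server_root / "Web.config")
    return backup
```

Stringing a call like this together per server is the batch-file step the answer describes; the returned backup path is what makes the 'undo' script possible.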
There's more on MSDEPLOY here. It can also be used with databases etc.
Just a suggestion, it might be helpful :)
[Blatant vendor post] We use exactly this sort of deployment model in our uDeploy product.
The basic idea is that you define a single deployment process that includes steps for updating configuration files (app.config and web.config are the most common for ASP.NET apps). Those variances can be per server or per logical environment (dev, test, QA, stage, prod...). Alternatively, you can put a template app.config file directly in uDeploy and we will write it out at deployment time.
The tool integrates with TeamCity to retrieve builds, as well as with IIS for deployment and for configuring application pools and servers. It's also designed to track how multiple services and web applications, which might be different TeamCity builds, come together as a release set.
On the surface, it sounds like we might have a decent fit. Feel free to reach out to me directly at eric#urbancode.com. Cheers! [/blatant vendor post]
I'm using the XmlPreprocess tool for config file manipulation. It uses one mapping file for multiple environments/servers, which you can edit in Excel. It is very easy to use.
Call XmlPreprocess in your custom PS scripts and pass the server name as the environment parameter.
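The mapping-file idea can be approximated in a few lines of Python. This sketch is not XmlPreprocess itself; it assumes a simple CSV mapping (setting names in the first column, one column per environment) and `${name}` placeholders in the config template, both of which are inventions for the example.

```python
import csv
import io
import re

def load_mapping(csv_text: str, environment: str) -> dict[str, str]:
    """Read a mapping table: rows are settings, columns are environments."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header = rows[0]
    col = header.index(environment)  # pick the column for this environment
    return {row[0]: row[col] for row in rows[1:]}

def preprocess(template: str, values: dict[str, str]) -> str:
    """Replace ${name} placeholders in a config template."""
    return re.sub(r"\$\{(\w+)\}", lambda m: values[m.group(1)], template)
```

The point of the single mapping file is that adding a server or environment is one new column, not a whole new copy of web.config.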

patch creation for asp.net web application

I have a web application deployed on the client's intranet using a Web Setup project. Now I want any change or update to the application to be provided as a small release that the client installs through the deployed application, so that the changes are reflected rather than redeploying the whole application. It should also be possible to roll back the deployed patch if it is creating problems.
I need to ask: what would be the best and easiest way to create a patch that fulfills the requirements discussed above?
The minimum you will have to install is: replace the DLLs that have changed, and replace the markup (aspx/ascx) that has changed. However, it is rather pointless, as it really doesn't save you very much in most cases. The only time I ever do it is if I only have remote access, the connection is slow, and the site is large; it saves on the amount of data to transfer.

Is it ok to store a solution inside inetpub?

One of the other developers at my company wrote a .NET 2.0 web site. He stores everything...solution, project, source...everything inside of "inetpub\AppName" (the IIS share). I have never seen this done before. In fact I'm kinda surprised the website loads up in a browser. Are there any disadvantages to doing this over say...storing your solution in the visual studio 2010 projects folder and then publishing the website to inetpub (security, speed, etc)? Also, Why does this work?
It works because the site is compiled on the fly. This is bad from a performance point of view (because of the late compilation) and bad from a security point of view (you're exposing your code more than necessary).
From MSDN:
Because ASP.NET compiles your Web site on first user request, you can simply copy your application's source code to the production Web server. However, ASP.NET also provides precompilation options that allow you to compile your Web site before it has been deployed, or to compile it after it has been deployed but before a user requests it. Precompilation has several advantages.
There's really no reason it shouldn't work, but it's generally considered a bad idea. Is he developing directly on the shared site? That's scary. Even if he isn't, that's putting a lot of files on a shared site that shouldn't be there. The server may be configured not to return them, but one shouldn't get comfortable relying on that.
Even on his local machine, it's bad practice. If for no other reason than it doesn't properly mimic the published site and makes for a bad place to test things.
There's nothing special about the Inetpub folder—it's just the default web server root by convention. Nothing about it will prevent IIS from displaying ASPX pages if it's also part of a solution (which is only referenced in the project file's XML). You can also point IIS to the project directory in the Visual Studio Projects folder.
Storing user data on C: is usually bad practice (especially for a programmer).
Most of us have a data partition that contains just user data and is backed up frequently, or use a source repository on another server.
If you're on a secure LAN and just developing by yourself, there is really no problem putting the solution in inetpub. However, if you use the same IIS instance to publish to the world, I wouldn't recommend it. You never know who might get to your precious gems.
I'd say it's bad practice. Your entire code is at the mercy of the web server. If the server is hacked the code is a freebie reward.