Automated deployments with Kentico - C#

Does anybody have experience automating deployments with Kentico? For example, how difficult is it to synchronize document types, bizforms, etc. to another server?

I've used the built-in content staging module to do this sort of thing. Unfortunately it's not all Unicorns and Rainbows. There were definitely some bugs in the module, which essentially serializes the data on one server and deserializes it on the target server.
That was back in version 5.5 or 5.5R2 though, and they released version 6 a few months ago. I would take some time to look at the documentation for its limitations, and then maybe give it a test before committing to it. It can definitely work for some, but it may not be Content Editor friendly.
Kentico Developer Documentation on Content Staging Module

Another possibility would be to utilize a tool that does database comparisons and syncing. I've used the SQL Examiner Suite before, but I've heard that Red Gate makes good tools too.
SQL Examiner
SQL Data Examiner
Red Gate Tools SQL Compare
While this probably isn't the best method, it can work. If you're not making significant changes on a regular basis, this can be good for one-off syncs between your local/dev server and production. This probably wouldn't be a good solution for "content staging", but more for changes that occurred due to development-oriented tasks.

Another option is to use the Export/Import feature in Kentico: http://devnet.kentico.com/docs/6_0/devguide/index.html?export_and_import_overview.htm.
I haven't automated this process, but you can have a look at the ExportManager class in Kentico's API Reference: http://devnet.kentico.com/Documentation.aspx.
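I haven't automated it myself, so treat the following as a rough, hypothetical sketch only: the namespace, constructor and member names below are assumptions based on the ExportManager class name, and you should verify every one of them against the API reference.

    // HYPOTHETICAL sketch: all member names are assumptions, verify against the API reference.
    using CMS.CMSImportExport; // assumed namespace of the import/export API
    using CMS.CMSHelper;       // assumed home of CMSContext

    public static class NightlyObjectExport
    {
        public static void Run()
        {
            // Describe what to export and where the package should be written.
            var settings = new SiteExportSettings(CMSContext.CurrentUser) // assumed constructor
            {
                TargetPath = @"C:\ExportPackages",     // assumed property
                TargetFileName = "objects-export.zip"  // assumed property
            };

            // Kick off the export itself.
            var manager = new ExportManager(settings); // assumed constructor
            manager.Export(null);                      // assumed method
        }
    }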
Hope this helps

With Kentico 10 you could use the Continuous Integration feature. It works much better now than it did in Kentico 9.
With the Continuous Integration feature, database objects are serialized to files, deployed together with the code files, and restored automatically into the target database.
If you do not want to use this module, you need to use the object export feature in Kentico (Site => Export site or objects).
In both scenarios you have to know that content (pages) is difficult to stage between different servers. Content staging is only useful if you have a "real" staging server, where content editors prepare the content that should be staged to the live server at the right time.
In case you want to stage from a DEV server to the LIVE server, the pages will be overwritten by the dev version if the GUID of the page matches.
If you use Continuous Integration, all pages which are not in the DEV server instance will be deleted!
All other objects (development objects like templates, web parts, page types, etc.) can be imported without any issues.
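To illustrate how the restore side of Continuous Integration can be automated: Kentico ships a command-line utility, ContinuousIntegration.exe, in the web project's bin folder, and running it with -r restores the serialized objects into the target database. A minimal sketch of a deployment helper (the install path is a placeholder):

    using System;
    using System.Diagnostics;

    class CiRestore
    {
        static void Main()
        {
            // Run Kentico's CI restore utility; -r restores the serialized
            // objects from the repository folder into the target database.
            var psi = new ProcessStartInfo
            {
                FileName = @"C:\inetpub\wwwroot\Kentico\CMS\bin\ContinuousIntegration.exe", // placeholder path
                Arguments = "-r",
                UseShellExecute = false,
                RedirectStandardOutput = true
            };

            using (var process = Process.Start(psi))
            {
                Console.WriteLine(process.StandardOutput.ReadToEnd());
                process.WaitForExit();
                if (process.ExitCode != 0)
                    throw new InvalidOperationException("CI restore failed; see the output above.");
            }
        }
    }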

Related

Suggestions for a web-based reporting solution using the .NET Framework

I have a situation where we have a MS SQL Server 2012 database and the client is requesting a number of reports to be generated from this db. The client is also requesting that this data be visible in his current portal (custom web-based, using C# and ASP.NET) or in a hosted web application in his IIS.
I was thinking of going for something which lets you create queries on the fly with the least amount of effort and with the possibility to customize them through code (since the requirements always get changed, and certain reports are very easy to make). Just to give you an example of what I had in mind, see Ubiq (video here: https://www.youtube.com/watch?v=UgXrbdnsa9Y ).
Can you please give me some examples or solutions that could be used?
There are a lot of reporting solutions out there; I recommend you try them all and pick the one that best fits your needs. With that in mind, you can start by trying DBxtra, which lets you create queries easily (although not on the fly) and lets you customize them if you need/want to. It also runs on the .NET Framework, so you can easily host it alongside your customer's application.
P.S.: I'm a DBxtra evangelist.

Umbraco best practice for deploying content [closed]

So, I have a staging and live environment of Umbraco.
Our content guys make changes in Live because they need something to be visible straight away.
Now, to back this up, I'm currently copying and pasting what they've done onto our staging environment and putting it into source control.
Is there a better way of doing this?
From what I know of your situation, I would recommend setting up the staging site and the production site with the same database. Unless you are using the ContentService to pull content into your templates (which you should avoid because it hits the DB), your Umbraco site should only be hitting the App_Data/umbraco.config XML cache and the Examine indexes in App_Data/TEMP/ExamineIndexes. This means that even though your staging and production site will be sharing the same database, changes that you make on the staging site won't show up on the production site until you log in and republish the entire site or republish the specific node.
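To make the ContentService point concrete, here is a minimal sketch (Umbraco 7; the node id is a placeholder) of the difference between reading through the cache and reading through the shared database:

    using Umbraco.Core;
    using Umbraco.Web.Mvc;

    public class CacheDemoController : SurfaceController
    {
        public void Demo()
        {
            // Served from the local XML cache (App_Data/umbraco.config):
            // staging edits stay invisible here until the node is republished.
            var cached = Umbraco.TypedContent(1234); // placeholder node id

            // Goes straight to the (shared) database and bypasses the cache;
            // avoid this in templates.
            var uncached = ApplicationContext.Current.Services.ContentService.GetById(1234);
        }
    }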
This approach is definitely not appropriate for every scenario. For example, we have clients who won't like that the database is shared for security reasons. They want as much separation from the production site and the staging site as possible. I also wouldn't use this if the content on the site is very time sensitive. If content being accidentally published before it is ready would be very bad for your client, this might not be the best solution. We haven't experienced any trouble with the xml cache being automatically refreshed when we weren't ready, but I wouldn't trust a cache to protect sensitive information from being released early.
We have been using it and are very happy with the simplicity. There are very few moving parts, so compared to some of the other deployment methods below, this is a pretty safe way to deploy. To make things more user friendly for our clients, we rig up a button on our staging site that, when clicked, will republish the cache for that node on the production site. I hope to release this as a package and will update this answer with a link to the package when it is ready.
UPDATE
I would consider the above approach experimental. Umbraco has been putting a lot of work into load-balancing scenarios in later versions of Umbraco 7, and some of the work they have done may invalidate what I was talking about. Just keep that in mind if you do decide to try this out.
Here are some other tools that are interesting to think about when dealing with content deployment:
Conveyor: https://our.umbraco.org/projects/backoffice-extensions/conveyor
Courier/Umbraco Deploy: http://umbraco.com/products/more-add-ons/courier-2
Umbraco Cloud: https://umbraco.com/products/umbraco-cloud/
CMS Import: http://soetemansoftware.nl/cmsimport
uMirror: https://our.umbraco.org/projects/backoffice-extensions/umirror
uSync Content Edition: https://our.umbraco.org/projects/developer-tools/usynccontentedition
Conveyor is a young package (at least right now it is). It has a dashboard that you would be able to use to selectively export content from your production site. You can then log in to the backoffice on your staging site and import the content. I am trying this out for the first time this month. It looks very promising, so far, but I can't give you a lot of advice from experience.
Courier is meant to be the ultimate solution in content deployment. It is one of the few content deployment options that allows you to selectively deploy only the content you want to. You can right click on content and deploy from staging to production or from production to staging. Courier also tries to detect dependencies and deploy them along with your content selections. The trick with Courier is that when something goes wrong, it is a big deal. Sites can go down and, depending on what went wrong, it could take a lot of time to recover them. Courier might try to deploy a document type that it detected as a dependency and accidentally ruin things. I've also found that it requires a lot of training to use properly. I haven't had a lot of success allowing non-technical folk to use Courier. If you use Courier, set up a test environment and play around for a while. Make sure you know what workflows work for you and what will break things. Courier will let you shoot yourself in the foot.
Update: Umbraco has been using Courier a lot for their new Umbraco as a Service. They have been finding and fixing a lot of the bugs. The 2015 versions of Courier are much more stable. If you want to use Courier, make sure you are using the newest versions for Umbraco 7. I've recently been doing some testing on Courier version 2.50.1. Much, much better. I'd still tread carefully though.
Another update: Umbraco has been depending more and more heavily on Courier. They have announced a new and reworked Courier called Umbraco Deploy. I look forward to it. Once it is released, that will be a better choice than Courier, and I expect that it will function similarly.
Umbraco Cloud is a whole SaaS setup that Umbraco has been working on very heavily. They can host your Umbraco site in Azure and have a very neat UI and process for deploying not only the content and media of your site but also all of the code, document types, and data types. This is still somewhat new, and a lot of very complex sites may not be a good fit for Umbraco Cloud. Also sites that rely heavily on document type inheritance vs document type composition might have problems. As far as I can tell, Umbraco Cloud is nice for small to medium sized sites, but Umbraco does have some very very large sites hosted on Umbraco Cloud as well. Umbraco Cloud relies heavily on the new Umbraco Deploy that is based off of Courier. Chances are that if your site is having trouble with the new Courier, it will still have problems on Umbraco Cloud.
uMirror is one that I've never used, but it exists and could be useful.
uSync Content Edition is another one that I've never used. We do have experience using the regular uSync, and I've found that the author is very responsive to issues and questions.
It sounds like you are seeking something like uSync.ContentEdition, which will allow you to export the database content to disk.
You can copy the files over to staging, and then import them into the database.
Be careful though, the author himself states that it is "Experimental (but getting better)".
An alternative option would be to copy the database itself from live to staging every so often, assuming that the staging database can be overwritten. This is the approach I would take.
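If you script that copy, a minimal sketch could look like the following. It assumes both databases sit on the same SQL Server instance, and the logical file names are placeholders you would check with RESTORE FILELISTONLY:

    using System.Data.SqlClient;

    class CopyLiveToStaging
    {
        static void Main()
        {
            const string backupFile = @"D:\Backups\live.bak"; // placeholder path

            using (var conn = new SqlConnection("Server=.;Database=master;Integrated Security=true"))
            {
                conn.Open();
                Execute(conn, $"BACKUP DATABASE [UmbracoLive] TO DISK = '{backupFile}' WITH INIT");
                Execute(conn, "ALTER DATABASE [UmbracoStaging] SET SINGLE_USER WITH ROLLBACK IMMEDIATE");
                // The logical names 'UmbracoLive' and 'UmbracoLive_log' are placeholders.
                Execute(conn, $@"RESTORE DATABASE [UmbracoStaging] FROM DISK = '{backupFile}' WITH REPLACE,
                    MOVE 'UmbracoLive' TO 'D:\Data\UmbracoStaging.mdf',
                    MOVE 'UmbracoLive_log' TO 'D:\Data\UmbracoStaging_log.ldf'");
                Execute(conn, "ALTER DATABASE [UmbracoStaging] SET MULTI_USER");
            }
        }

        static void Execute(SqlConnection conn, string sql)
        {
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.CommandTimeout = 0; // backups/restores can run long
                cmd.ExecuteNonQuery();
            }
        }
    }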

Import data into sql server from files with different format

I have a program which watches a folder on the server. When new files (flat files) come in, the program (C#) reads the data and bulk inserts it into the table. It works fine.
Now we are extending the system. This means the data files could be in different formats (flat file, CSV, TXT, Excel...) or have different columns (we need to map them to the columns in the table).
My question is: is C# the best choice for this, or is SSIS a better choice?
Thanks
I wouldn't necessarily choose one or the other, but choose depending on the file type and the amount of processing. For some file types it's probably easier to go with C#, and for some others SSIS works better.
Do you have someone on your team who is good with SSIS? It's much easier to find a C# dev to do the job for you than to find someone who knows SSIS.
How likely is it that requirements/formats are going to be updated in the future? That's also an important thing to keep in mind.
I do agree with what others said, that SSIS is more powerful and offers support for more complex transformations, but the question is: do you really need that?
It depends on your context. Different file formats alone should not be the deciding factor for SSIS. With the C# program, you can continue with it because it has run stably so far; it is easy to deploy, specific to your domain, and easy to configure.
With SSIS, the configuration is more complicated and requires a developer with deep SSIS knowledge, and the administration cost is higher than for a C# program. However, it is easier to visualize (there are diagrams that let you see the integration flow more easily).
From my viewpoint, if the integration process does not involve complicated business rules, you should go with the C# program. Otherwise, SSIS is more powerful when the integration process requires complicated rules. Hope this helps.
In the C# application I guess you are using the SqlBulkCopy component, which is not as powerful as SSIS. So if your data size becomes huge, the C# application will become slower.
If you are familiar with SSIS, my suggestion is to go with SSIS. In SSIS you can implement an end-to-end solution just as you have developed in C#, right from checking the files in a specific folder to loading the data into the database.
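For completeness, the C# route usually boils down to something like this sketch: pick a parser per file format, then use SqlBulkCopy column mappings to reconcile the differing source columns (the table and column names are placeholders):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;

    static class FileImporter
    {
        public static void Import(string path, string connectionString)
        {
            // Choose a parser based on the file format.
            DataTable rows;
            switch (Path.GetExtension(path).ToLowerInvariant())
            {
                case ".csv": rows = ParseCsv(path); break;
                // case ".txt":  fixed-width flat-file parser goes here
                // case ".xlsx": Excel reader goes here
                default: throw new NotSupportedException(path);
            }

            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = "dbo.ImportedData";   // placeholder table
                // Map the file's column names onto the fixed table schema.
                bulk.ColumnMappings.Add("cust_id", "CustomerId"); // placeholder columns
                bulk.ColumnMappings.Add("amount", "OrderAmount");
                bulk.WriteToServer(rows);
            }
        }

        // Naive CSV parser for illustration; use a real CSV library for quoted fields.
        static DataTable ParseCsv(string path)
        {
            var table = new DataTable();
            var lines = File.ReadAllLines(path);
            foreach (var header in lines[0].Split(',')) table.Columns.Add(header.Trim());
            foreach (var line in lines.Skip(1)) table.Rows.Add(line.Split(','));
            return table;
        }
    }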

How to sync compiled code to multiple EC2 instances

We have several EC2 instances behind a load balancer. Each server has several ASP.NET applications deployed to it. I'm looking for an easy, realtime, automated way to deploy new compiled code to all instances simultaneously.
I've seen solutions using source control repositories like SVN or Git, but this doesn't seem like an appropriate use of the technology for us since we're deploying compiled code to the EC2 instances - not source code.
I've also set up Dropbox to accomplish the sync. It somewhat works, but it has its quirks. For instance, you need to build your directory structure around the "one root sync folder" limitation. Any other reason why we definitely should NOT use Dropbox for this?
Writing a custom application using the S3 API is an option, but we'd prefer a third party solution over writing more code.
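For what it's worth, the custom app we'd be writing looks roughly like this sketch using the AWS SDK for .NET (bucket, prefix and path are placeholders); each instance would run it on a schedule or in response to a notification:

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Transfer;

    class S3PullDeploy
    {
        static void Main()
        {
            using (var client = new AmazonS3Client(RegionEndpoint.USEast1))
            {
                var transfer = new TransferUtility(client);

                // Pull every object under the release prefix into the web root,
                // recreating the key structure as directories. Note this does
                // not delete local files that were removed from the bucket.
                transfer.DownloadDirectory(
                    "my-deploy-bucket",            // placeholder bucket
                    "releases/current",            // placeholder key prefix
                    @"C:\inetpub\wwwroot\MyApp");  // placeholder local path
            }
        }
    }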
This seems like a common scenario, but I haven't found any good solutions yet.
Elastic Beanstalk seems to be the best route to go now. You simply push your web deploy project to an Elastic Beanstalk environment and it deploys the code to all of your instances. (It manages auto scaling for you.) It also makes sure that newly launched instances will have your latest code, and it keeps previous versions which you can easily roll back to.
If your asp.net website needs to be auto scaled on AWS, Elastic Beanstalk is really the best end-to-end solution.
Since these are ASP.NET applications on IIS, why not use Web Deploy? It's MADE for this.
http://www.iis.net/download/webdeploy
Web Deploy allows you to efficiently synchronize sites, applications or servers across your IIS 7.0 server farm by detecting differences between the source and destination content and transferring only those changes which need synchronization. The tool simplifies the synchronization process by automatically determining the configuration, content and certificates to be synchronized for a specific site. In addition to the default behavior, you still have the option to specify additional providers for the synchronization, including databases, COM objects, GAC assemblies and registry settings.
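Beyond the command line, the same sync can be driven from code through Web Deploy's managed API (Microsoft.Web.Deployment.dll). Here's a rough sketch with server and path names as placeholders; you could loop it over each server in the farm:

    using Microsoft.Web.Deployment;

    class FarmContentSync
    {
        static void Main()
        {
            var sourceOptions = new DeploymentBaseOptions();
            var destOptions = new DeploymentBaseOptions
            {
                ComputerName = "WEB02" // placeholder: node running the Web Deploy service
            };

            // Sync local content to the remote node; only changed files
            // are transferred, as described above.
            using (var source = DeploymentManager.CreateObject(
                DeploymentWellKnownProvider.ContentPath,
                @"C:\inetpub\wwwroot\MyApp", // placeholder source path
                sourceOptions))
            {
                source.SyncTo(
                    DeploymentWellKnownProvider.ContentPath,
                    @"C:\inetpub\wwwroot\MyApp", // placeholder destination path
                    destOptions,
                    new DeploymentSyncOptions());
            }
        }
    }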
You can use Git, Mercurial or SVN to push compiled code to the servers, or to have the servers fetch code. Source control is not only for source code - it can be used for files of any type.
Also, one way around the Dropbox limitation is to use multiple Dropbox accounts, if that's the issue. But Dropbox is a pretty easy solution because you never need to write any code. As long as Dropbox is up, it will work.
You might want to give AppHarbor a try. We take care of managing ASP.NET application servers, loadbalancers and all the other required infrastructure, leaving you to get on with developing your application. We also provide a convenient way for you to push new versions of your app using your choice of Git, Mercurial, Subversion and TFS.
Git or Mercurial will do a good job at that; Subversion is bad at handling blobs.
And you get very nice control and assurance that the code got deployed everywhere by looking at the revisions.
Seems obvious but, shared filesystem? Or push out with scp or rsync?

Is Team Foundation Server the right solution to automatically publish .net website to remote server?

We currently build our .NET website in C# in Visual Studio 2010 Pro on our dev server, then manually publish it and upload it to the live server, where it is copied over the current files to go live.
We want to automate this process as much as possible and if possible push it at a certain time, such as every day at midnight. We don't currently use any Source Control so this probably makes it essential anyway...
Is Team Foundation Server [TFS] the best solution to enable this? If so, how much would this cost our client, and how can we find out? We're in the UK and they do have an MSDN subscription.
At this point, you need to slow down and set more realistic goals. Here's my biggest red flag:
"We don't currently use any Source
Control so this probably makes it
essential anyway..."
Without proper SCC, you're not capable of getting where you need to go. A full scale TFS implementation can most certainly do what you want to do, and it has a couple of really nice features that you can use to integrate automated deployment scenarios, which is great, but you really need to learn to walk before you can learn to run.
I've commented on TFS cost before, so I won't do that in this post, but suffice it to say that a TFS implementation which does what you want will cost a significant amount of effort, especially if you factor in the time it will take you to set it up and script out the automated publishing workflow you want.
I don't know what your budgets are or how big your teams are, or the nature of your development strategy, or a number of things that might actually change my answer, but I'm proceeding on the assumption that you have a limited budget and no dedicated staff of people that you can draw upon to set up a first-class TFS implementation, so here's what I would recommend (in this order!)
Set up version control using something that's free, such as Subversion or Git. For an organization that's just starting off with SCC, I'd recommend Subversion over Git, because it's conceptually a lot simpler to get started with. This is the bedrock of everything you're going to do. Unlike adding a fuze to a 2000 pound bomb or assembling a bicycle, I'd recommend that you read the manual before and during your SVN installation.
Make a build file using MSBuild. Yes, you can use NAnt, but MSBuild is fairly equivalent in most scenarios, and is a bit more friendly with TFS, if you ever decide to go in that direction in the distant, distant future. Make sure your builds work properly on your development boxes and servers.
Come up with a deployment script. This may very well just equate to a target in your MSBuild file. Or it may be an MSI file -- I don't know your environment well enough to say, but guessing by the fact that you said you copied stuff over to production, an MSBuild target will likely suffice.
Set up a Continuous Integration server such as Hudson or CruiseControl.NET. I personally use CruiseControl, but the basic idea behind both is that they are automated services which watch your SCC system for changes and perform the builds for you. If you have set up a MSBuild target to perform your deployment, you can configure a "project" in CCNET (or probably Hudson) to do the deployment as well.
The total software cost of these solutions is $0, but you will likely face quite a bit of a learning curve on it all. TFS's learning curve, IMO, is even steeper, and the software cost is definitely north of $0. Either way, the takeaway is not to try to bite it all off in one chunk at one time, or you will probably fail. Go step-by-step, and you will get there. And have fun! I personally loved learning about all of this stuff!
If your client has MSDN then TFS is free!
Whether you have MSDN Professional, Premium or Ultimate, you get both a CAL to access ANY TFS server and a licence to run a TFS server in production included. You just need to make sure that all your users have MSDN. If they do not, then you can buy a retail TFS licence for $500, which allows the first 5 users without a CAL. You can then add CAL packs, which are cheaper than MSDN, for users who need access to the data. If you have internal users that need to access only the work items that they have created, then they are also FREE.
As long as the person who kicks off the build has an MSDN licence, your build server is also free. You can have the sequence that Dean describes, but I would suggest shelling out a little cash for FinalBuilder and using it to customise the process. It integrates well into TFS and provides a nice UI.
The advantage is you get Dev -> Test -> Deploy all recorded, audited and reportable in one product...
http://www.finalbuilder.com/download.aspx
Sounds like you are after a Continuous Integration (Build) server.
One I have used a lot with .NET is Hudson.
You can kick off a build by a number of triggers, such as at a particular time, and can run various steps in sequence. These can include running batch commands (Windows or Linux, depending on what platform you run it on) or running MSBuild. It has a lot of plugins and most major tools are supported.
A common sequence for apps that we build is:
Update from source control (but that doesn't mean you can't do something like take a copy from a file share)
Compile using MSBuild
Run unit tests using NUnit
Deploy the built project to a test server
TFS Team Build is certainly capable of doing what you wish by setting up a build which executes the Deploy target of a web app. The key is MSDeploy, which in reality can be executed in many ways and is not dependent upon any one tool. You could simply schedule a task to execute MSDeploy.
See these two links for more information:
http://weblogs.asp.net/scottgu/archive/2010/07/29/vs-2010-web-deployment.aspx
http://www.hanselman.com/blog/WebDeploymentMadeAwesomeIfYoureUsingXCopyYoureDoingItWrong.aspx?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+ScottHanselman+(Scott+Hanselman+-+ComputerZen.com)
There are a lot of really cool things you can do with the new Build system based on Windows Workflow in TFS 2010. Learning to customize your Build Process Templates is a worthwhile investment.
Well, TFS is the ultimate solution. But to reduce cost you can make use of MSBuild to achieve your task. You can create a Windows scheduled task which fires MSBuild at a particular time. There is an open-source MSBuild task library available at http://msbuildtasks.tigris.org/ through which you can even upload your files via FTP.
You need to do Continuous Integration. TFS 2010 is fully capable of doing this for you. But before you continue, you should move your sources to TFS Source Control Management. We are doing the same thing you need: all our sources reside in TFS; with each check-in, a build occurs on the build server and is then deployed to a remote live server.
