TFS Deploy WITHOUT using WebDeploy - c#

I'm trying to set up our CI build environment and I'm having an issue.
First, I'm using VS and TFS 2012 so I can't use the *.12.xaml templates since those are for VS/TFS 2013.
Second, right now I'm configured to use just the defaulttemplate.11.xaml. Originally, I was using WebDeploy for the deployment method and that was working great. Since then, our web/server team has re-configured our test environment to use IIS Shared Configuration as well as DFS Replication to keep everything in sync.
Because of that, I'm no longer able to use WebDeploy (I passed this post over to the TFS admins, but they said no).
Is there a place where I can add some msbuild arguments, or a post-build event where I can send a *.cmd file with some arguments so I can get my code copied/deployed?
I've read Hanselman's (and everyone else that copied him) posts/blogs that say "if you're using xcopy, you're doing it wrong, etc...", but I believe in my case I CAN'T use Web Deploy.
Update:
So I thought I found my answer. Since the web deploy doesn't work for me, I found a workflow activity called CopyDirectory that sounded exactly like what I need.
I went through the process of updating my default template to add this additional step to the build process, which, by the way, does NOT work very well. After adding the step, saving, etc., the step never shows up in my build output. I gave up for a while to see if I could do this on our Jenkins build server, got some different errors over there, so I came back to TFS to make the changes and commit. Since the CI build was still set up in TFS (granted, failing), I noticed that a build got kicked off when I made my commit. I decided to watch for a while and IT FINISHED SUCCESSFULLY! Whoa, all right. So I checked through the build logs and found that it threw a WARNING saying "failed to copy. Ensure the source directory exists and that you have the appropriate permissions".
Well, since I just entered this value incorrectly, no big deal, just change to the correct BuildDetail.DropLocation, and we should be golden.
WRONG, after building again with my changes to the source and destination values, I come to find out that since I'm trying to deploy my files to a different domain, it still fails.
Oh, and in addition to that, YOU CAN'T PASS CREDENTIALS TO THE COPYDIRECTORY STEP! REALLY! Phew, I found some documentation though; it says "give the TFS build service/account permissions on the domain that you want to copy to." Well, that would be great, if my server team would allow that, but they don't.
Back to square one...(this is going to turn into a blog about me complaining about TFS...)

I believe you can do it using robocopy. You will want to update your build template to include a new InvokeProcess activity. Set the activity's FileName to "RoboCopy" (include the quotes) and its Arguments to something like the following:
String.Format(" ""{0}"" ""{1}"" /E /R:10 /W:10 /NFL /NDL ", BinariesDirectory, BuildDetail.DropLocation)
Of course, change the robocopy flags to suit your specific needs.
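For reference, once the build variables are expanded, that expression produces a command line roughly like this (the paths are placeholders, not values from your environment):

    RoboCopy "C:\Builds\1\MyProject\Binaries" "\\dropserver\Drops\MyProject\MyProject_20130101.1" /E /R:10 /W:10 /NFL /NDL

/E copies subdirectories (including empty ones), /R:10 and /W:10 cap the retries at 10 with a 10-second wait between them, and /NFL /NDL suppress the per-file and per-directory log output.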
I don't think you can pass credentials into robocopy either though, so you might still be SOL there.
One possible alternative though is that because your admins won't give the TFS Build User (i.e. tfsservice) permissions on the destination box, you could change the TFS Builds to run as a different User that does have permissions on that box. To do this I believe you just have to log onto your TFS Build machine, go to the Services, find the Visual Studio Team Foundation Build Service Host 2012 (or something similar), and change the Log On As user from tfsservice to whatever user has permissions on the box that you want to publish to. Of course you will also need to give that user permissions to do everything else that the build system needs to do (download source code, etc.).


Does a simple solution exist for programmatically detecting adds/edits/deletes locally and updating Visual Studio Online accordingly?

I have code that generates SQL scripts that will run nightly. I want to check this into source control each night, so I get a history of changes to tables etc. as well as picking up new tables and when tables are deleted.
I have a team project created in Visual Studio Online.
From looking online it looks like there's no reliable way of automatically picking up changes locally and committing them to VSO. I'd have to create something that compares what I have locally to what is in VSO, which to me seems error-prone.
If I use the command line utility it looks like I have to tell it what is added and deleted (I can't just check everything out, then add/edit/delete my local files, then commit).
I've also looked into the TeamFoundationServer class, but that's obsolete.
TL;DR: Is there anything I can do to easily sync local changes (add/edit/delete) to VSO, without having to tell it what's been changed?
Why not just check in the changes from your workspace?
If you have a Local Workspace that includes the folder that you generate the SQL into, you can just call tf.exe checkin to get all of the changes into TFS.
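As a rough sketch, the nightly job could run something like the following from the workspace folder (the *.sql filter and the comment text are illustrative assumptions, not from the question):

    tf add *.sql /recursive
    tf checkin /recursive /noprompt /comment:"Nightly generated SQL scripts"

In a local workspace, edits to existing files are detected automatically; tf add pends any newly generated files, and deletions can be pended with tf delete before the checkin.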
+Daniel is right.

How to handle switching between two databases

Let's assume I'm going to create a project in ASP.NET MVC 4 and users should be able to easily switch between a test/demo database and the production one. Both databases will have the same schema, but different users and data.
Should I simply use two versions of web.config with different connection strings and deploy them to two separate IIS instances?
I also thought about choosing the database from a dropdown list when logging in, so I would need only one IIS instance and one config with two connection strings. Do I gain anything from the latter approach other than more complex code to handle it?
In our system we use web.config transforms and use MSDeploy with Publish Profiles in the project for our two different servers, Dev and Production.
To deploy to Dev, you simply click Publish and select the "Dev Server" publish profile and out it goes. Same thing for production, but you select "Production Server" from the publish profiles.
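For example, a transform file tied to the Dev profile (or build configuration) can swap the connection string at publish time. A hedged sketch, where DefaultConnection, DevDbServer and MyAppDb are placeholders rather than values from the original setup:

    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <add name="DefaultConnection"
             connectionString="Server=DevDbServer;Database=MyAppDb;Integrated Security=True"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>

The xdt:Locator matches the entry by name and xdt:Transform overwrites its attributes, so the base web.config can hold one environment's values and the transform rewrites them for the other.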
Dev and Production are two different physical servers, which is ideal. If you need to reboot the dev server to test some big changes or new updates, you don't want production going down with it.
Our environment is virtualized, so it was easy to create a production server and a dev server. Really, we created a dev server and, once it was done, we based production off a snapshot of it, so they are basically identical with different code bases.
We also set up Visual Studio Remote Debugger on both servers so we can debug code without having to install Visual Studio on them.
Being on two different servers, they have different URLs, e.g.
something.com
login.something.com
admin.something.com
dev.something.com
dev.login.something.com
dev.admin.something.com
Now, we also use Twitter Bootstrap, and our design has 2 columns on the left and 2 columns on the right for spacing. So when on the dev server, I render giant "DEV" images vertically in the column spaces.
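A minimal sketch of that kind of environment indicator in the layout, assuming an "Environment" appSetting that isn't part of the original setup:

    @* _Layout.cshtml: only show the marker when the config says this is the dev box *@
    @if (System.Configuration.ConfigurationManager.AppSettings["Environment"] == "Dev")
    {
        <div class="dev-banner">DEV</div>
    }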
I should also mention source control. We use Subversion for the source control server and TortoiseSVN with the VisualSVN Visual Studio extension to keep our projects in source control.
It's set up to the point that any developer given access to the code can do a Get on the repository, open it in Visual Studio 2013, and click Publish. The code is very easy to move from developer to developer.
We also have some versioned assets the site uses, and we have the dev/prod servers set up with TortoiseSVN as well. E.g. EmployeePhotos are in source control, and a developer can add new photos to SVN, go on the server, and do a Get on the virtual directory containing them, and they are up to date. It handles deletes as well. If we delete a folder from source control and do an Update on it, SVN deletes the deleted images.
All code aside, I would host the two separately. The chance of someone screwing up production data when they think they're messing with test data is high enough that it's better to have explicitly different URLs. I'd probably even have visual cues in your Master layout (color differences, differences in the main page header, etc.) to make it clear to the user where they are and what they're expected to do there.
Even if you're not worried about that per se, you're right that managing it internally will be more complex as well, and error-prone. I'd steer clear of it.

Updating MSMQ permissions on a private queue via C#

We are using some private MSMQ queues with our production system. Since implementing them, we've had to update some stuff with the queues and re-create them to work with updated code. We have over 200 machines that need these updates, so I'm working on a program that will be pushed via SMS to do this update.
What I'm noticing is that the only person that can run my program to do the update is the one that originally set up the private queue on that particular machine. Because of the number of machines, there were multiple admins that set up these queues.
Since not all of the employees still work here, this is causing me [my program] a problem. The permissions are not allowing the program to update everything that I need. I googled and found this link http://social.msdn.microsoft.com/Forums/hu-HU/msmq/thread/36a3d910-d533-4af3-86dc-498d00c68fef that shows how to update the permissions by modifying the file that is created for each of the queues. Great! It works when I manually navigate to that path and do the update. Now I want to do it programmatically via SMS push.
When trying to run this from my program, I get an error back saying the directory does not exist: "C:\Windows\System32\msmq\storage\lqs". Huh? When I enter that path into Start->Run, it brings up the folder just fine. Well, breaking on the if (Directory.Exists(path)) part of my code definitely returns false. If I remove all of the folders beyond System32, then Directory.Exists works just fine.
Why can't my program determine whether or not that msmq folder exists? I've tried "running as administrator" and it still returns false. What do I need to do to get that check to work?
Thanks
Edit: This is really weird, I have a FindAndReplace API that I wrote that will take in a path (file or directory), find text, replace text, etc. When I'm running that program from my Find And Replace GUI wrapper, it works just fine. But when I'm calling this API from my update program, it says that directory doesn't exist. This is really confusing. (I should probably remove the MSMQ tag because it has nothing to do with my question...)
Project + Properties, Build tab. Change the "Platform target" setting from x86 to AnyCPU. This lets your program run in 64-bit mode so the c:\windows\system32 directory access doesn't get redirected to c:\windows\syswow64.
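To see the redirection in action, here's a small hedged console sketch using the path from the question; it isn't part of the fix, it just shows why Directory.Exists lies to an x86 build:

    using System;
    using System.IO;

    class RedirectionCheck
    {
        static void Main()
        {
            // When compiled as x86 on 64-bit Windows, file system redirection silently maps
            // C:\Windows\System32 to C:\Windows\SysWOW64, which has no msmq\storage\lqs folder.
            string lqsPath = @"C:\Windows\System32\msmq\storage\lqs";
            Console.WriteLine(Environment.Is64BitProcess);  // false for an x86 build
            Console.WriteLine(Directory.Exists(lqsPath));   // false under redirection, true once built as AnyCPU/x64
        }
    }

(A 32-bit process can also reach the real folder through the C:\Windows\Sysnative alias, but switching the platform target is the cleaner fix here.)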

Visual SVN - How to maintain separate user settings and publish settings?

For a new MVC web development project, I'm collaborating with a couple of other developers and we want to use Visual SVN to manage source control.
Following the "Getting Started" instructions at the VisualSVN website (http://www.visualsvn.com/visualsvn/getting-started/) seems to commit everything within the Solution folder, including all the settings files (.suo, .user, .Publish.xml).
However, we want to maintain separate Publish Settings within Visual Studio as we publish to our local machines for testing.
Is that possible?
P.S. Shouldn't VisualSVN Client automatically ignore the .suo and .user files?
It doesn't. You'll need to either:
add them to the ignore-on-commit list - you can do this while committing, but it's a per-user setting
remove them from SVN - delete them from SVN using TortoiseSVN, since VisualSVN can't see them (take copies first, as I think this will actually delete them), and commit the delete. Put them back into the folder and commit again; SVN will show these files as uncommitted. Right-click on them, select ignore in the commit window, and commit; this will apply to everyone. It's easier to not commit them in the first place :)
I use SVN as my source control as well. I also use VisualSVN (but only server side). The main thing I would suggest is to use VisualSVN to host your repositories, but use something else to commit/update/checkout your repositories to your local machine.
I would suggest TortoiseSVN for this. Use TortoiseSVN to control your workflow on local machines. You can then use it to simply right-click/ignore your *.suo files. Or any other files/folders you wish to keep out of the repository!
It may take a bit of research to get it setup. But this is what I use on an every day basis, and it is very user friendly.
I've never used VisualSVN, but I would be surprised... no, shocked if what you said was true.
Does VisualSVN really, by default, automatically add and commit user files? You'd think a solution that's built for Visual Studio would simply know better. I would call the company and verify this.
If VisualSVN does commit local user files, I would recommend that you use AnkhSVN instead.
Not only does AnkhSVN know better than to commit user files, it's also open source and you can save yourself the $49 per user you need for VisualSVN. And it's not just the $49 per user that you pay with VisualSVN, either. It's also the fact that you have another license you need to track while users come in and leave the project. Who do you think is going to get that fun job?
However, if you must use VisualSVN, and VisualSVN does commit user local files by default, you need to get my kitchen sink pre-commit hook. One of the things it does is allow you to completely ban the addition of files such as Visual Studio's *.csuser files and the other types of Visual Studio detritus.
Of course, you should let developers know how they can set global-ignores and autoproperties in Subversion. This will prevent them from accidentally adding them. But, there's no way you can configure that globally, or to prevent someone from purposefully adding them. Only my pre-commit hook can keep them out of your repository. After a few failed commits because your developers tried to add in these private user files, your developers will quickly fall into line and set up their global-ignores.
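For reference, the client-side setting lives in each developer's Subversion config file (the TortoiseSVN Settings dialog edits it, or it can be changed directly in %APPDATA%\Subversion\config / ~/.subversion/config); a hedged example of the relevant lines:

    [miscellany]
    global-ignores = *.suo *.user *.Publish.xml bin obj

This only helps developers avoid adding the files by accident; it isn't enforced server-side, which is why the pre-commit hook is the real gatekeeper.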

Releasing WinForm Program Updates

I'd like to release some updates for a WinForm program, but to date I have simply released an all-new compile. People have to uninstall the old version and install the new version.
EDIT: I'm using an auto-generated install wizard. It preserves my file structure and places the [PrimaryProgramOutput] in a particular directory. I forget what this is called.
I bet there's a way to get around this, but I don't know what it's called. As you may guess, searches for "updates", "new version", "install" and the other obvious things I've tried have generated an impressive number of irrelevant results. >_<
I suspect this process has a particular name, which should point me in the right direction, but if it doesn't please link to a tutorial or something.
I see from the tags you are using C#. Visual Studio can create Setup projects for these kinds of tasks. The setup projects also contain a property, RemovePreviousVersions, which will remove a previous version if the versioning of your setup is correct and the GUID of the application stays the same.
See this link for more information:
http://www.simple-talk.com/dotnet/visual-studio/updates-to-setup-projects/
ClickOnce deployment is a great solution most of the time...
You can deploy to the web, and whenever your users start the application it will check for updates and automatically update the application if there is a new version available.
It can also be configured not to update automatically but only to notify the user that there is a new version available and allow the user to control the update process.
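If you'd rather trigger the check from code than rely purely on the automatic behavior, ClickOnce exposes it through the System.Deployment.Application API; a rough sketch (error handling omitted, and the class/method names here are just illustrative):

    using System.Deployment.Application;
    using System.Windows.Forms;

    static class Updater
    {
        // Sketch only: call this at startup, before Application.Run.
        public static void UpdateIfAvailable()
        {
            if (!ApplicationDeployment.IsNetworkDeployed)
                return; // running outside ClickOnce, e.g. from the debugger

            ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
            if (deployment.CheckForUpdate())
            {
                deployment.Update();    // downloads the new version
                Application.Restart();  // restarts into the updated copy
            }
        }
    }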
