I was wondering if anyone knew of a good site showing examples of using TFS 2010's API.
I would like to develop a project that would allow a team to see which files/items other team members have checked out. It would simply be a system where users could see which projects developers are currently working on. Does anyone have any experience with this who could give advice on getting started?
I would be developing in .NET (C# or VB) and the application would ride on a SQL Server 2008 database.
As Alex mentions, TFS Sidekicks from Attrice has this functionality.
In addition, the TFS Power Tools allows you to use "Find in Source Control" to see what files are checked out by any (or all) users.
However, if you did want to roll your own solution, you could do so pretty easily using the TFS SDK. I'll let the documentation speak for itself, but you'll probably want to do something along the lines of:
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

TfsTeamProjectCollection projectCollection = new TfsTeamProjectCollection(new Uri("http://tfs.mycompany.com:8080/tfs/DefaultCollection"));
VersionControlServer vc = projectCollection.GetService<VersionControlServer>();

/* Get all pending sets for all items (note: you can filter to specific items in the first arg, e.g. new string[] { "$/MyProject" }) */
PendingSet[] pendingSets = vc.GetPendingSets(null, RecursionType.Full);

foreach (PendingSet set in pendingSets)
{
    /* Get each item in the pending set */
    foreach (PendingChange pc in set.PendingChanges)
    {
        Console.WriteLine(pc.ServerItem + " is checked out by " + set.OwnerName);
    }
}
(Note: totally untested)
But again, I'd recommend you check out those two existing projects to see if they fit your needs.
TFS Sidekicks by Attrice already does this and a lot more. Plus, it's free.
I'm currently searching for a way to deploy and update a C# .NET application over SFTP. The background is that most of my users do not have admin rights, internet access rights, or common file structures/group policies. The best I could actually get was the ability to reuse the SFTP infrastructure the application already uses for data transfers.
So I tried Visual Studio publish, which can deploy the application initially; however, it does not support the update mechanism (it only supports URLs and file paths). A manual update/deployment process is out of the question, though, purely due to the sheer size of the userbase (1000+ users).
I then had a look at wyBuild, a third-party build tool that can actually use SFTP to upload updates. It cannot, however, download them via SFTP.
So I'm somewhat at the end of my rope here. Writing an updater myself seems like a large time investment, but I could not find any other solution. How would this problem be solvable? Thanks for any help in advance.
Oh, and before somebody flags this as "asking for a tutorial": I tried hard to stay within the guidelines Stack Overflow provides here.
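For scale, the core download step of a hand-rolled updater would presumably look something like this sketch using the SSH.NET library; the host, credentials, and version/package layout here are just placeholders, not anything my infrastructure dictates:
using System;
using System.IO;
using System.Text;
using Renci.SshNet;

class UpdateDownloader
{
    static void Main()
    {
        using (var sftp = new SftpClient("sftp.example.com", 22, "updateuser", "secret"))
        {
            sftp.Connect();
            // Read the latest published version number from the server
            string remoteVersion;
            using (var ms = new MemoryStream())
            {
                sftp.DownloadFile("/updates/version.txt", ms);
                remoteVersion = Encoding.UTF8.GetString(ms.ToArray()).Trim();
            }
            // Pull the package only if it is newer than the running build
            if (new Version(remoteVersion) > new Version("1.0.0"))
            {
                string local = Path.Combine(Path.GetTempPath(), "update-" + remoteVersion + ".zip");
                using (var file = File.Create(local))
                {
                    sftp.DownloadFile("/updates/update-" + remoteVersion + ".zip", file);
                }
                // ...then verify, unpack, and swap binaries on the next start
            }
            sftp.Disconnect();
        }
    }
}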
I have code that generates SQL scripts that will run nightly. I want to check this into source control each night, so I get a history of changes to tables etc. as well as picking up new tables and when tables are deleted.
I have a team project created in Visual Studio Online.
From looking online it looks like there's no reliable way of automatically picking up changes locally and committing them to VSO. I'd have to create something that compares what I have locally to what is in VSO, which to me seems error-prone.
If I use the command-line utility, it looks like I have to tell it what was added and deleted (I can't just check everything out, then add/edit/delete my local files, then commit).
I've also looked into the Team Foundation Server class, but that's obsolete.
TL;DR: Is there anything I can do to easily sync local changes (add/edit/delete) to VSO, without having to tell it what's been changed?
Why not just check in the changes from your workspace?
If you have a Local Workspace that includes the folder you generate the SQL into, you can just call tf.exe checkin to get all of the changes into TFS.
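If it helps, a rough sketch of what the nightly step could look like; the local path and comment are placeholders, and this assumes a TFS 2012+ local workspace mapped to the script folder with tf.exe on the PATH:
rem Promote detected adds and deletes (candidate changes) to real pending changes
tf reconcile /promote /adds /deletes /noprompt C:\SqlScripts
rem Check in everything pending under the folder (edits are detected automatically)
tf checkin C:\SqlScripts /recursive /comment:"Nightly SQL script sync" /noprompt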
+Daniel is right.
I'm trying to setup our CI build environment and having an issue.
First, I'm using VS and TFS 2012, so I can't use the *.12.xaml templates since those are for VS/TFS 2013.
Second, right now I'm configured to use just the DefaultTemplate.11.xaml. Originally, I was using WebDeploy for the deployment method and that was working great. Since then, our web/server team has reconfigured our test environment to use IIS Shared Configuration as well as DFS Replication to keep everything in sync.
Because of that, I'm no longer able to use WebDeploy (I passed this post over to the TFS admins, but they said no).
Is there a place where I can add some MSBuild arguments, or a post-build event where I can call a *.cmd file with some arguments, so I can get my code copied/deployed?
I've read Hanselman's posts/blogs (and everyone else's that copied him) that say "if you're using xcopy, you're doing it wrong", etc., but I believe in my case I CAN'T use Web Deploy.
Update:
So I thought I found my answer. Since the web deploy doesn't work for me, I found a workflow activity called CopyDirectory that sounded exactly like what I need.
I went through the process of updating my default template to add this additional step to the build process, which, by the way, does NOT work very well. After adding the step, saving, etc., the step never showed up in my build output. I gave up for a while to see if I could do this on our Jenkins build server, got some different errors over there, and came back to TFS to make the changes and commit. Since the CI was still set up in TFS (granted, failing), I noticed that a build got kicked off when I made my commit. I decided to watch for a while, and IT FINISHED SUCCESSFULLY! Woah, all right. So I checked through the build logs and found that it threw a WARNING saying "failed to copy. Ensure the source directory exists and that you have the appropriate permissions".
Well, since I had just entered this value incorrectly, no big deal; just change it to the correct BuildDetail.DropLocation, and we should be golden.
WRONG. After building again with my changes to the source and destination values, I came to find out that since I'm trying to deploy my files to a different domain, it still fails.
Oh, and in addition to that, YOU CAN'T PASS CREDENTIALS TO THE COPYDIRECTORY STEP! REALLY! Phew, I found some documentation though; it says "give the TFS build service/account permissions on the domain that you want to copy to." Well, that would be great if my server team would allow that, but they don't.
Back to square one...(this is going to turn into a blog about me complaining about TFS...)
I believe you can do it using robocopy. You will want to update your build template to include a new InvokeProcess activity. Set the activity's FileName to "RoboCopy" (include the quotes) and its Arguments to something like the following:
String.Format(" ""{0}"" ""{1}"" /E /R:10 /W:10 /NFL /NDL ", BinariesDirectory, BuildDetail.DropLocation)
Of course, change the robocopy flags to fit your specific needs. One quirk to be aware of: robocopy uses exit codes 0-7 to indicate success (8 and above are failures), so if your template treats any nonzero exit code from the InvokeProcess activity as an error, a successful copy can still fail the build.
I don't think you can pass credentials into robocopy either though, so you might still be SOL there.
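That said, one workaround that sometimes gets around the cross-domain credential issue is to authenticate the target share first with net use (as another InvokeProcess step, for example). The server, share, and account below are placeholders:
rem Establish an authenticated session to the target share first
net use \\deployserver\wwwroot P@ssw0rd /user:OTHERDOMAIN\deployuser
robocopy "C:\Builds\Drop" "\\deployserver\wwwroot" /E /R:10 /W:10
rem Drop the mapped session when done
net use \\deployserver\wwwroot /delete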
One possible alternative though is that because your admins won't give the TFS Build User (i.e. tfsservice) permissions on the destination box, you could change the TFS Builds to run as a different User that does have permissions on that box. To do this I believe you just have to log onto your TFS Build machine, go to the Services, find the Visual Studio Team Foundation Build Service Host 2012 (or something similar), and change the Log On As user from tfsservice to whatever user has permissions on the box that you want to publish to. Of course you will also need to give that user permissions to do everything else that the build system needs to do (download source code, etc.).
I've been searching for a solution for a few days now. I've looked through the MSDN documentation for Interop.Outlook and I think I've found what I need, but I can't seem to implement it properly.
Here's the code I've come up with, based on something similar I saw in VBA.
using System;
using System.Windows.Forms;
using Outlook = Microsoft.Office.Interop.Outlook;

class Program
{
    static void Main(string[] args)
    {
        // Get a running (or new) Outlook instance and its MAPI namespace;
        // the Stores collection hangs off the namespace, not a bare new Stores()
        Outlook.Application app = new Outlook.Application();
        Outlook.NameSpace ns = app.GetNamespace("MAPI");

        foreach (Outlook.Store store in ns.Stores)
        {
            MessageBox.Show(store.FilePath);
        }
    }
}
This essentially needs to cycle through a list of computers and run this code against their Outlook (some 2003, some 2007) in order to inventory all connected PSTs in each Outlook profile. I'm sure there's more code to this, but I can't get this portion to work at all. There seems to be a lack of information on inventorying Outlook data files; most of what's out there is about reading e-mails from the mailboxes, not the data file itself.
If someone could shed some light on what I'm overlooking, it'd be greatly appreciated.
EDIT:
I've actually made a working piece of code now; however, I have a problem with compatibility. The program works as designed against Office 2010/2007, but it crashes when accessing a 2003 version. I imagine I need to reference the Microsoft Outlook 11.0 Object Library, but I only have the 12.0 Object Library listed - is there a way to get the 11.0 reference?
This may be of use: a pretty thorough object model comparison and development guide.
There is no reason to actually log in to any Outlook profiles (which might require an authentication prompt). All the information is already in the profile sections in the registry. The exact location is Outlook-version specific, and the profile section GUIDs are generated randomly, so the documented profile management API (IProfAdmin etc.) is the way to go; unfortunately, it is Extended MAPI and requires C++ or Delphi.
As of Outlook 2007, the Outlook Object Model exposes the Namespace.Stores collection and the Store.FilePath property, so you can loop through all stores and read the FilePath property for each store (be sure to filter out OST files).
Note that there can be multiple Outlook profiles (as shown in Control Panel | Mail | Show Profiles), but Outlook can only work with one profile at a time, so to use a different profile, you'd need to close Outlook.
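If a single binary has to cope with both Outlook 2003 and 2007+ (per the edit above), one sketch-level workaround is to skip the version-specific interop reference entirely and late-bind, checking the version before touching Stores:
using System;

class PstInventory
{
    static void Main()
    {
        // Late binding avoids compiling against a specific Outlook PIA version
        Type outlookType = Type.GetTypeFromProgID("Outlook.Application");
        dynamic app = Activator.CreateInstance(outlookType);

        // Namespace.Stores (and Store.FilePath) only exist in Outlook 2007 (12.0) and later
        int major = int.Parse(((string)app.Version).Split('.')[0]);
        if (major >= 12)
        {
            foreach (dynamic store in app.GetNamespace("MAPI").Stores)
            {
                Console.WriteLine(store.FilePath);
            }
        }
        else
        {
            // Outlook 2003: fall back to the registry/ProfMan approach described here
            Console.WriteLine("Stores collection not available in Outlook " + app.Version);
        }
    }
}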
If using Redemption is an option (I am its author), it includes the ProfMan library (accessible from any language), which will let you extract all PST file locations from all local profiles without actually logging in:
'Print the path to all the PST files in all profiles
PR_PST_PATH = &H6700001E
set Profiles = CreateObject("ProfMan.Profiles")
for i = 1 to Profiles.Count
  set Profile = Profiles.Item(i)
  set Services = Profile.Services
  Debug.Print "------ Profile: " & Profile.Name & " ------"
  for j = 1 to Services.Count
    set Service = Services.Item(j)
    If (Service.ServiceName = "MSPST MS") or (Service.ServiceName = "MSUPST MS") Then
      'there should be only one provider for this service,
      'but we should really loop through all the providers
      Debug.Print Service.Providers.Item(1).ProfSect.Item(PR_PST_PATH)
    End If
  next
next
I'd like to release some updates for a WinForms program, but to date I have simply released an all-new compile. People have to uninstall the old version and install the new version.
EDIT: I'm using an auto-generated InstallWizard. It preserves my file structure and places the [PrimaryProgramOutput] in a particular directory. I forget what this is called.
I bet there's a way to get around this, but I don't know what it's called. As you may guess, searches for "updates", "new version", "install", and the other obvious things I've tried have generated an impressive number of irrelevant results. >_<
I suspect this process has a particular name, which should point me in the right direction, but if it doesn't, please link to a tutorial or something.
I see from the tags you are using C#. Visual Studio can create Setup projects for these kinds of tasks. Setup projects also contain a property, RemovePreviousVersions, which will remove a previous version if the versioning of your setup is correct and the UpgradeCode of the application stays the same.
See this link for more information:
http://www.simple-talk.com/dotnet/visual-studio/updates-to-setup-projects/
ClickOnce deployment is a great solution most of the time...
You can deploy to the web, and whenever your users start the application it will check for updates and automatically update the application if there is a new version available.
It can also be configured not to update automatically but only to notify the user that there is a new version available and allow the user to control the update process.
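If you go the programmatic route, the update check is only a few lines. A sketch using the standard System.Deployment API (it only works when the app was actually launched via ClickOnce):
using System.Deployment.Application;
using System.Windows.Forms;

// e.g. called from the main form's Load event
private static void UpdateIfAvailable()
{
    if (ApplicationDeployment.IsNetworkDeployed)
    {
        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())
        {
            deployment.Update();
            Application.Restart();  // run the freshly installed version
        }
    }
}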