So I have a .NET app which goes through and generates a series of files, outputs them to a local directory, and then determines whether it needs to update an existing file in a TFS (Team Foundation Server) project or add a new one.
I have a single workspace on my local machine, and it contains 10 different working folders for other coding projects I have worked on from this particular machine. My problem happens when I check whether a file already exists in the TFS project (and therefore needs an update) or needs to be added to the project as a new file.
snippet:
static string TFSProject = @"$/SQLScripts/";
static WorkspaceInfo wsInfo;
static VersionControlServer versionControl;
static string argPath = "E:\\SQLScripts\\";
wsInfo = Workstation.Current.GetLocalWorkspaceInfo(argPath);
TeamFoundationServer tfs = new TeamFoundationServer(wsInfo.ServerUri.AbsoluteUri);
versionControl = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));
Workspace workspace = versionControl.GetWorkspace(wsInfo);
workspace.GetLocalItemForServerItem(TFSProject);
At this point I check if the file exists and do one of two things: if the file exists, I mark it for EDIT and then write out the file to the local directory; otherwise I script out the file first and then ADD it to the workspace. I don't care whether the physical file is identical to the one I am generating, because I am doing this to satisfy a SAS 70 requirement to 'track changes'.
If it exists I do:
workspace.PendEdit(filename, RecursionType.Full);
scriptoutthefile(filename);
or if it doesn't exist
scriptoutthefilename(filename);
workspace.PendAdd(filename,true);
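For reference, here is a minimal sketch of how those two branches might be combined into one helper. ScriptOutTheFile is just a stand-in name for the generation routine above, and the recursion flags simply mirror the calls already shown:
// Sketch only: ScriptOutTheFile is a hypothetical stand-in for the file-generation code.
// Requires: using System.IO; using Microsoft.TeamFoundation.VersionControl.Client;
static void ScriptAndPend(Workspace workspace, string filename)
{
    if (File.Exists(filename))
    {
        // Existing file: pend an edit first, then regenerate it on disk.
        workspace.PendEdit(filename, RecursionType.Full);
        ScriptOutTheFile(filename);
    }
    else
    {
        // New file: generate it on disk first, then pend an add.
        ScriptOutTheFile(filename);
        workspace.PendAdd(filename, true);
    }
}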
Ok, all of that to get to the problem: when I check pending changes against the PROJECT, I get the pending changes for all of the projects I have in the workspace on my local machine.
// Show our pending changes.
PendingChange[] pendingChanges = workspace.GetPendingChanges();
foreach (PendingChange pendingChange in pendingChanges)
{
    // do something...
}
I thought that calling workspace.GetLocalItemForServerItem(TFSProject) would give me ONLY the objects for that particular working folder.
Is there any way to force the workspace object to deal with only a particular working folder?
Did that make any sense? Thanks in advance...
Using LINQ, you can get the pending changes for a particular folder:
PendingChange[] pendingChanges = workspace
    .GetPendingChanges()
    .Where(x => x.LocalOrServerFolder.Contains(argPath))
    .ToArray();
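If I remember the client API correctly, GetPendingChanges also has an overload that takes an item path and a RecursionType, which would scope the query itself without the LINQ filter (untested sketch):
// Untested: scope the pending-changes query to the working folder itself.
PendingChange[] pendingChanges =
    workspace.GetPendingChanges(argPath, RecursionType.Full);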
Here is another way to refresh your workspace cache:
tf workspaces /s:http://SomeTFSServer:8080/
Also, Workspace has a method called UpdateWorkspaceInfoCache(..) that seems to do the same thing. I haven't tested this, though.
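For what it's worth, the cache-refresh call I'd expect to use from code lives on the Workstation class; something like the following, where the owner argument is an assumption on my part (untested):
// Untested sketch: refresh the local workspace cache from the server
// (versionControl is the VersionControlServer instance from the question).
Workstation.Current.UpdateWorkspaceInfoCache(versionControl, versionControl.AuthorizedUser);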
Ok, so it turned out that I had a corrupted TFS workspace cache on my machine.
Here was the solution (from a command prompt):
SET AppDataTF=%USERPROFILE%\Local Settings\Application Data\Microsoft\Team Foundation
SET AppDataVS=%APPDATA%\Microsoft\VisualStudio
IF EXIST "%AppDataTF%\1.0\Cache" rd /s /q "%AppDataTF%\1.0\Cache" > NUL
IF EXIST "%AppDataTF%\2.0\Cache" rd /s /q "%AppDataTF%\2.0\Cache" > NUL
IF EXIST "%AppDataVS%\8.0\Team Explorer" rd /s /q "%AppDataVS%\8.0\Team Explorer" > NUL
IF EXIST "%AppDataVS%\9.0\Team Explorer" rd /s /q "%AppDataVS%\9.0\Team Explorer" > NUL
The other part of the issue is that, because the project folder is in the same workspace, it WILL pull in all pending changes from the other projects. I created a secondary workspace containing only the SQLScripts project and local folder, and everything works like a charm.
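For anyone who would rather script that second workspace instead of creating it in Team Explorer, a rough, untested sketch (the workspace name and paths are illustrative):
// Untested sketch: a dedicated workspace that maps only the SQLScripts project.
Workspace scriptsWorkspace = versionControl.CreateWorkspace(
    "SQLScriptsOnly",                    // illustrative workspace name
    versionControl.AuthorizedUser);
scriptsWorkspace.Map(@"$/SQLScripts", @"E:\SQLScripts");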
Maybe someone else will find this useful...
Am I right in thinking that if you create a self-hosted Nancy console app and want to serve up HTML, JavaScript, and CSS files, you have to go through all of those files (there could be quite a few) and mark them all for copy to the output directory?
public class HomeModule : NancyModule
{
    public HomeModule()
    {
        Get["/"] = v => View["index.html"];
    }
}
This will not be found if index.html is in the project folder and is not marked 'Copy to Output Directory' in its properties.
Edit: I stand corrected, I misunderstood the question.
Yes, you need to set all static content to copy. However, when I set up my projects (I can't copy-paste an example for you at the moment), I just add a Build Event in the project file, or I set up a build task for CI / deployment.
Nope, you don't need to mark every file individually.
https://github.com/NancyFx/Nancy/wiki/Managing-static-content
You can mark an entire directory.
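Roughly, the convention-based approach looks like this in a custom bootstrapper. This is an untested sketch, and the "Content" folder name is just an example:
using Nancy;
using Nancy.Conventions;

public class Bootstrapper : DefaultNancyBootstrapper
{
    protected override void ConfigureConventions(NancyConventions nancyConventions)
    {
        base.ConfigureConventions(nancyConventions);

        // Serve everything under /Content from the project's Content folder.
        nancyConventions.StaticContentsConventions.Add(
            StaticContentConventionBuilder.AddDirectory("Content", @"Content"));
    }
}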
Alternatively, if you're using OWIN, you can use the Static Content middleware.
Something like:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var fileSystem = new FileServerOptions
        {
            EnableDirectoryBrowsing = false,
            FileSystem = new PhysicalFileSystem("....")
        };

        app.UseFileServer(fileSystem);
        app.UseNancy();
    }
}
I had the same issue and I found a workaround that others might find useful:
Instead of copying the files to the output directory on each build, I created a directory junction in it, targeting the original static-files directories.
This allows real-time editing of the static content in Visual Studio (without needing to rebuild in order to copy the edited files to the output directory).
e.g. (Post-build command line):
if not exist "$(TargetDir)Web" md "$(TargetDir)Web"
if not exist "$(TargetDir)Web\Content" mklink /j "$(TargetDir)Web\Content" "$(ProjectDir)Content"
if not exist "$(TargetDir)Web\Scripts" mklink /j "$(TargetDir)Web\Scripts" "$(ProjectDir)Scripts"
if not exist "$(TargetDir)Web\Fonts" mklink /j "$(TargetDir)Web\Fonts" "$(ProjectDir)Fonts"
if not exist "$(TargetDir)Web\Static" mklink /j "$(TargetDir)Web\Static" "$(ProjectDir)Web\Static"
You can use Visual Studio build events and add an xcopy command like this:
xcopy /E /Y "$(ProjectDir)\Views" "$(ProjectDir)\bin\$(ConfigurationName)\Views\*"
xcopy /E /Y "$(ProjectDir)\Content" "$(ProjectDir)\bin\$(ConfigurationName)\Content\*"
When the project is built, xcopy is executed and the files are copied into the output directory, so your self-host exe can see those files.
I have a piece of code that queries TFS history, but on my machine I get an exception saying:
There is no working folder mapping for C:\SDAM.
However, if I run this same piece of code on a colleague's machine, there is no problem.
I am using:
Microsoft.TeamFoundation.Client; version 10
Microsoft.TeamFoundation.VersionControl.Client; version 10
VS2012 Project Update 4
4.5 Framework
Things I have tried:
Refreshing the cache.
Deleting the cache.
Checking the working folders and adding them using Team Foundation Sidekicks, to be sure I have the source control folder and local folder mapped.
I have passed in the source control folder path $/SDAM and I get history. I go into the TFS explorer and check that $/SDAM is mapped to C:\SDAM.
I am completely baffled and any suggestions would be appreciated. This code is already being used widely by us and I need to reuse it.
The results of
tf workspaces /format:detailed /collection:http://XXXX/tfs/
Workspace : XXXXXXXX
Owner : Domain\zzzzzzz
Computer : XXXXXXXX
Comment :
Collection : cccc\ddddd
Permissions: Private
Location : Server
File Time : Current
Working folders:
$/SDAM: C:\SDAM
IEnumerable tfsHistory;

string SourceControlPath = @"C:\SDAM";
var tfsUri = new Uri(@"http://XXXX/tfs/");
var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri);
var vcs = tfs.GetService<VersionControlServer>();

VersionSpec fromVersion = null, toVersion = null;
fromVersion = new ChangesetVersionSpec(1);
toVersion = new ChangesetVersionSpec(2);

tfsHistory = vcs.QueryHistory(
    SourceControlPath,
    LatestVersionSpec.Instance,
    0,
    RecursionType.Full,
    null,
    fromVersion,
    toVersion,
    Int32.MaxValue,
    true,
    false);

if (tfsHistory != null)
{
    // Do something
}
Your tool cannot find a working folder mapping because it is not in the working folder cache for the version of the SDK you're building against. If you do not specify a Team Project Collection and want to connect to TFS using only a local path, TFS will look in the working folder cache to determine which server and server path correspond to that local path.
If you're building against version 10.0 of the SDK, then it's looking for the working folder cache created by Visual Studio 10.0 (i.e., Visual Studio 2010).
However, if you're running Visual Studio 11.0 and tf 11.0, then they will store the working folder information in the working folder cache for Visual Studio 11.0 (i.e., Visual Studio 2012).
Thus, your tool cannot bootstrap itself with only a working folder mapping. You need to either:
Have your tool connect to the TFS server in question so that it will obtain a fresh copy of the working folder information
Match the version of the SDK you build against to the version of TFS you use with Visual Studio
If you want to dynamically load the newest SDK, you may be able to bind an assembly resolution handler.
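As a rough illustration of that last option, here is an untested sketch of an AssemblyResolve handler. The probe path and the idea of redirecting only the TeamFoundation assemblies are assumptions, not a recipe:
// Untested sketch: redirect loads of the TFS client assemblies to a newer install.
// Requires: using System; using System.IO; using System.Reflection;
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    var name = new AssemblyName(args.Name);
    if (!name.Name.StartsWith("Microsoft.TeamFoundation"))
        return null;

    // Hypothetical path to the newer client assemblies; adjust for your machine.
    string candidate = Path.Combine(
        @"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\ReferenceAssemblies\v2.0",
        name.Name + ".dll");

    return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
};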
On one of my machines, I get a return value of null from any GetLocalWorkspaceInfo call. I have isolated the problem to where it even fails for this simple program:
namespace WorkstationTest
{
    using Microsoft.TeamFoundation.VersionControl.Client;

    class Program
    {
        static void Main()
        {
            string workspaceLocalPath = @"C:\Dev";

            var info = Workstation.Current
                .GetLocalWorkspaceInfo(workspaceLocalPath);

            // info is always null here
        }
    }
}
What I have already checked:
The exact same code works on my other machine the way it should.
I have verified that I have a workspace at C:\Dev
I have created a new workspace in a different directory and changed the workspaceLocalPath variable in the code to match.
I have consulted the documentation, which states that the return value will be null if the path is not in a workspace. From what I can see of my workspace mappings, the path should be in a workspace.
Yet, everything seems to suggest this should work. Is there anything I could be missing?
After migrating from TFS 2013 to TFS 2017 at the company I work for, I had the same problem with Workstation.Current.GetLocalWorkspaceInfo.
What worked for me is a call to Workstation.EnsureUpdateWorkspaceInfoCache:
TfsTeamProjectCollection tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("<your-tfs-uri-here>"));
VersionControlServer tfServer = tpc.GetService<VersionControlServer>();
Workstation.Current.EnsureUpdateWorkspaceInfoCache(tfServer, tfServer.AuthorizedUser);
I added the above code lines to the constructor of my TFS proxy class that uses GetLocalWorkspaceInfo.
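In case it helps, here is roughly what that looks like. The class name and structure are illustrative, not the poster's actual code:
// Illustrative sketch of putting the cache refresh in a proxy class constructor.
public class TfsProxy
{
    private readonly TfsTeamProjectCollection _tpc;

    public TfsProxy(Uri tfsUri)
    {
        _tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri);
        VersionControlServer vcs = _tpc.GetService<VersionControlServer>();

        // Populate the local workspace cache before any
        // Workstation.Current.GetLocalWorkspaceInfo calls.
        Workstation.Current.EnsureUpdateWorkspaceInfoCache(vcs, vcs.AuthorizedUser);
    }
}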
When executing tf workspaces (on my computer) in the Visual Studio 2010 command prompt, it says No workspace matching * found on this computer, but when executing the same command in the Visual Studio 2012 command prompt it returns all my expected workspaces.
The issue can be resolved by doing any of the following:
Reference the version of the Microsoft.TeamFoundation.VersionControl.Client DLL associated with Visual Studio 2012 instead of the DLL associated with Visual Studio 2010.
Open Visual Studio 2010 and connect it to TFS so that it creates the workspaces for Visual Studio 2010.
I know this is an old post, but I'd just like to share the workaround we use: VersionControlServer.QueryWorkspaces can query all the workspaces the user has on his/her machine.
private static Workspace FindWorkspaceByPath(TfsTeamProjectCollection tfs, string workspacePath)
{
    VersionControlServer versionControl = tfs.GetService<VersionControlServer>();
    WorkspaceInfo workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(workspacePath);
    if (workspaceInfo != null)
    {
        return versionControl.GetWorkspace(workspaceInfo);
    }

    // No workspace found using method 1; try to query all workspaces the user has on this machine.
    Workspace[] workspaces = versionControl.QueryWorkspaces(null, Environment.UserName, Environment.MachineName);
    foreach (Workspace w in workspaces)
    {
        foreach (WorkingFolder f in w.Folders)
        {
            if (f.LocalItem.Equals(workspacePath))
            {
                return w;
            }
        }
    }

    throw new Exception(String.Format("TFS Workspace cannot be determined for {0}.", workspacePath));
}
In my case, this issue occurred because the VersionControl.config file under the TFS cache folder (C:\Users\DeepakR\AppData\Local\Microsoft\Team Foundation\5.0\Cache\Volatile\0cb76a25-2556-4bd6-adaa-5e755ac07355_http) had gone for a toss, i.e. the configured workspace information wasn't available as expected.
So it basically needs a refresh of the VersionControl.config file. An automatic refresh happens when Visual Studio is loaded again, i.e. it pulls the configured workspace information from the server and updates the config file; the same happens if we execute the tf command-line utility (tf.exe workspaces /collection:TFSURL).
Microsoft.TeamFoundation.VersionControl.Client's (v12.0.0.0) Workstation class has a function, EnsureUpdateWorkspaceInfoCache, which does the same trick:
VersionControlServer vcs = (VersionControlServer)tpc.GetService(typeof(VersionControlServer));
Workstation.Current.EnsureUpdateWorkspaceInfoCache(vcs, Environment.UserName);
https://msdn.microsoft.com/en-us/library/microsoft.teamfoundation.versioncontrol.client.workstation.ensureupdateworkspaceinfocache(v=vs.120).aspx
Hope the suggestion helps to resolve the issue.
I had this issue recently (today) using Visual Studio 2017, plus several other versions installed and a number of local workspaces.
I ended up updating the 'Team Foundation Server Client' NuGet package to the latest version (15.x) through the 'Manage NuGet Packages' menu and that fixed it.
I did also remove the existing project references first but that part might depend on what you need.
Nothing is going to work properly without the proper DLL references. The steps below fixed the same issue for me after it had cost me five days.
Place the DLLs below in the bin folder of your project and add a reference to each of them for the whole solution. If an error such as 'Reference could not be given' comes up, skip adding the reference for that DLL and just place it in the bin folder as well; the project will pick it up automatically during the build.
DLLs:
Microsoft.TeamFoundation.Client.dll
Microsoft.TeamFoundation.Common.dll
Microsoft.TeamFoundation.Core.WebApi.dll
Microsoft.TeamFoundation.TestManagement.Client.dll
Microsoft.TeamFoundation.TestManagement.Common.dll
Microsoft.TeamFoundation.Work.WebApi.dll
Microsoft.TeamFoundation.WorkItemTracking.Client.DataStoreLoader.dll
Microsoft.TeamFoundation.WorkItemTracking.Client.dll
Microsoft.TeamFoundation.WorkItemTracking.Common.dll
Microsoft.TeamFoundation.WorkItemTracking.Controls.dll
Microsoft.TeamFoundation.WorkItemTracking.Proxy.dll
Microsoft.TeamFoundation.WorkItemTracking.WebApi.dll
Microsoft.VisualStudio.Services.Client.Interactive.dll
Microsoft.VisualStudio.Services.Common.dll
Microsoft.VisualStudio.Services.WebApi.dll
Microsoft.WITDataStore32.dll
Microsoft.WITDataStore64.dll
The above DLLs can be found in the path below if the system has MTM or TFS installed:
Path: C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer
In my C:\Users\<username>\AppData\Local\Microsoft\Team Foundation folder I had 2 folders:
7.0
8.0
Within the 8.0 folder was the following folder:
\Cache\Volatile\c1dbda02-c575-4dd2-b221-e83f7cb63665_http
But within the 7.0 folder, the \Cache\Volatile folder was empty.
So all I did was copy the c1dbda02-c575-4dd2-b221-e83f7cb63665_http folder into 7.0\Cache\Volatile\.
After this, the GetLocalWorkspaceInfo call returned the workspace info successfully.
This is how to find the workspace when you have a server path:
Workspace[] workspaces = _versionControl.QueryWorkspaces(null, Environment.UserName, Environment.MachineName);
return workspaces.FirstOrDefault(w => !string.IsNullOrEmpty(w.TryGetLocalItemForServerItem(ConstDefaultFlowsTfsPath)));
Where ConstDefaultFlowsTfsPath is a server path starting with "$", for instance: "$/MyCompany/Services/DiagnosticsFlows".
You could also replace the last line with:
return workspaces.FirstOrDefault(w => !string.IsNullOrEmpty(w.GetServerItemForLocalItem(myLocalPath)));
and that should work for you too.
I have a console application that is available via NuGet or on its own. It gets installed into the tools directory of the NuGet package. The application requires 3 pieces of 'configuration' information:
a database connection string
a folder path
one more configuration option (string)
Currently, I store these configuration values in a text file called settings.js, right next to the exe, serialized as JSON.
When the application first runs, if the file is not present, it creates one with default values.
I keep the settings.js file in this location so the file will get checked into source control.
My question is about maintaining the settings file across versions.
If you Update-Package via NuGet, everything works great, except the new version doesn't have any of the settings I had configured, because a new folder is created for the new version.
I have written a PowerShell script to run in init.ps1 that pulls the settings from the previous version of the package, and it seems to work. However, this feels kind of dirty, and I am wondering if there is a better way to solve this problem when using NuGet to deliver my application.
param($installPath, $toolsPath, $package)

Set-Alias hump (Join-Path $toolsPath hump.exe)

$sorted_list = new-object system.collections.SortedList
$parent_path = Join-Path $installPath ".."
foreach($f in Get-ChildItem $parent_path -Filter Humpback* | Foreach {$_.FullName}){
    $sorted_list.Add($f,$f)
}
if($sorted_list.Count -gt 1){
    $old_path = $sorted_list.Values[$sorted_list.Count - 2]
    $new_path = Join-Path $installPath "tools"
    $current_settings = Join-Path $new_path "settings.js"
    $has_current_settings = Test-Path $current_settings
    if($has_current_settings -eq $false){
        $old_settings = Join-Path $old_path "tools\settings.js"
        Copy-Item $old_settings $new_path
    }
}
Also, init.ps1 doesn't appear to run when installing a package using the command-line tool (nuget.exe). Is this expected behavior?
Can you access System.Environment.GetFolderPath? I'd just create a folder under the ApplicationData special folder and store the settings there.
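A minimal sketch of that idea; the "Humpback" folder name and the default JSON content are just placeholders:
// Sketch: keep settings.js in a per-user folder that survives package updates.
// Requires: using System; using System.IO;
string settingsDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
    "Humpback");                                  // illustrative folder name
Directory.CreateDirectory(settingsDir);           // no-op if it already exists

string settingsPath = Path.Combine(settingsDir, "settings.js");
if (!File.Exists(settingsPath))
{
    // Write default values on first run (placeholder JSON).
    File.WriteAllText(settingsPath,
        "{ \"connectionString\": \"\", \"folderPath\": \"\", \"option\": \"\" }");
}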
I am creating an MSI installer via VS 2008. I try to delete the temp folder at the end of the installation. That temp folder is created by my installer to hold some batch files for database creation. It always shows that another process is accessing it and doesn't allow my code to delete it. I have called Close() on the accessing process. I have put a sleep before the code that deletes it. Nothing helps.
Do you have any idea how I can delete it at the end of the installation?
thanks,
Did you try Filemon to see who is accessing the temp folder when delete is called on the folder?
It's better to use the system temp folder path:
System.Environment.GetEnvironmentVariable("TEMP")
That way you need not worry about cleaning it up.
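In C#, that would look something like this (the subfolder name is illustrative):
// Sketch: stage the database batch files under the user's %TEMP% folder.
// Requires: using System.IO;
string stagingDir = Path.Combine(Path.GetTempPath(), "DbCreateScripts"); // illustrative name
Directory.CreateDirectory(stagingDir); // no-op if it already exists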
I can think of creating a batch file in the temp folder that runs as the last step. You put a pause in it by using ping (http://www.robvanderwoude.com/wait.php), and then after a few seconds (once the installer has exited) it deletes the folder passed as a parameter:
PING 1.1.1.1 -n 1 -w 60000 >NUL
rd "%1"
This is really a hack. It is better to root out what locks your folder.
I'll address the conceptual problems in your setup first:
First off, as "Vinay B R" said, be sure your "temp" folder is beneath the Windows %TEMP% folder. That way, you can leave the files there if you fail to delete them.
Why exactly do you want to delete the batch files when you're done? There is no expectation that you clean up after yourself inside of the %TEMP% folder.
If you want to ensure the user doesn't run them again, then you could name them with a different file extension (e.g. ".tmp" instead of ".bat"), execute them using the method described here, then leave them behind:
cmd < "%TEMP%\foo.tmp"
If you are trying to delete files because you don't want the user to have access to them, then by deleting them you will only protect yourself against casual users.
If you still want to delete the files, then:
In all likelihood your own process is locking your folder. Using Process Explorer will likely point to msiexec.exe or cmd.exe. No doubt you are able to manually delete the folder once MSI and SQL have exited, right? If so, then your own process isn't terminating right away. Find out why. Is SQL taking longer than you think, perhaps?
As an alternative to Aliostad's method, here is the "other flavor" listed in this article. As he wrote, however, it would be best for you to determine why it's locked.
Process.Start("cmd.exe", "/C choice /C Y /N /D Y /T 3 & Del " +
Application.ExecutablePath);
Application.Exit();
Here is a working sample in C#. If your users will have .NET installed, then you can invoke this as a custom action using the WiX DTF (install WiX, then in Visual Studio select New Project -> Windows Installer XML -> C# Custom Action Project):
// Note: This can also be used to delete this .exe (i.e.
// System.Windows.Forms.Application.ExecutablePath).
//
public static void AsynchDeleteFolder(string myTempFolderPath)
{
    ProcessStartInfo info = new ProcessStartInfo();

    // Don't create a visible DOS box.
    info.WindowStyle = ProcessWindowStyle.Hidden;
    info.CreateNoWindow = true;

    // Wait 3 seconds ("/T 3").
    info.Arguments = @"/C choice /C Y /N /D Y /T 3 & rmdir /S /Q """ +
        myTempFolderPath + @"""";
    info.FileName = "cmd.exe";

    Process.Start(info);
}
If you would rather execute just the applicable portion as a batch file, then you can avoid the DOS window by following this method:
' Filename: Run_a_batch_file_with_no_popup_dos_box.vbs
'
' Invoke like this from the command line:
' wscript.exe Run_a_batch_file_with_no_popup_dos_box.vbs "c:\path with spaces to my file name.bat"
'
Set WshShell = CreateObject("WScript.Shell")
WshShell.Run chr(34) & WScript.Arguments.Item(0) & Chr(34), 0
Set WshShell = Nothing