Running .nunit files programmatically - C#

.nunit files are not running when using NUnit in C#
I have tried running the code below, but the error returned is that the file is not supported. All the examples on the internet use a .dll as the path for a TestPackage, not a .nunit file.
According to the sites below, it should be possible to run .nunit files:
https://github.com/nunit/docs/wiki/Writing-Engine-Extensions and
https://github.com/nunit/nunit-project-loader/tree/master/src/extension
var path = @"path to nunit file";
var package = new TestPackage(path);
var engine = TestEngineActivator.CreateInstance();
using (var runner = engine.GetRunner(package))
{
    // "this" must implement ITestEventListener
    var result = runner.Run(this, TestFilter.Empty);
}
I have tried the following by referencing the nunit-project-loader DLL:
var path = @"C:\Users\Administrator\Desktop\nunit-project-loader-master\nunit-project-loader-master\ConsoleApplication1\bin\Debug\DEV.nunit";
NUnitProjectLoader pl = new NUnitProjectLoader();
var project = pl.LoadFrom(path);
var package = project.GetTestPackage();
var engine = TestEngineActivator.CreateInstance();
using (var runner = engine.GetRunner(package))
{
    // execute the tests ("this" must implement ITestEventListener)
    var result = runner.Run(this, TestFilter.Empty);
}
Doing this gives a new error:
It tries to run NBi.NUnit.Runtime.dll, which is referenced in the .nunit file.
Everything works fine when using the console runner.
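For reference, this is a sketch of what I would expect to work if the assemblies listed in the .nunit file are passed to the engine directly, bypassing the project loader. The assembly paths are placeholders and the caller has to implement ITestEventListener; I have not verified this:
using System;
using NUnit.Engine;

public class DirectRunner : ITestEventListener
{
    public void OnTestEvent(string report)
    {
        // The engine reports progress as XML fragments.
        Console.WriteLine(report);
    }

    public void RunAll()
    {
        // List the assemblies that DEV.nunit references instead of the project file itself.
        var package = new TestPackage(new[] { @"path\to\FirstTests.dll", @"path\to\SecondTests.dll" });
        var engine = TestEngineActivator.CreateInstance();
        using (var runner = engine.GetRunner(package))
        {
            var result = runner.Run(this, TestFilter.Empty);
        }
    }
}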

Roslyn / MSBuildWorkspace Does Not Load Documents on Non-Dev Computer

MSBuildWorkspace does not show documents on computers other than the one I'm using to build the application.
I have seen Roslyn workspace for .NET Core's new .csproj format, but in my case it works fine on the development computer (Documents is populated), while Documents is empty for the same project on a different computer.
So I am not sure why it would work on my development computer but not on my other computer?
This is the code :
public static IReadOnlyList<CodeFile> ReadSolution(string path)
{
    List<CodeFile> codes = new List<CodeFile>();
    using (MSBuildWorkspace workspace = MSBuildWorkspace.Create())
    {
        var solution = workspace.OpenSolutionAsync(path).Result;
        foreach (var project in solution.Projects)
        {
            // project.Documents.Count() is 0
            foreach (var doc in project.Documents)
            {
                if (doc.SourceCodeKind == SourceCodeKind.Regular)
                {
                    StringBuilder sb = new StringBuilder();
                    using (var sw = new StringWriter(sb))
                    {
                        var source = doc.GetTextAsync().Result;
                        source.Write(sw);
                        sw.WriteLine();
                    }
                    codes.Add(new CodeFile(doc.FilePath, sb.ToString()));
                }
            }
        }
    }
    return codes;
}
It turns out that newer versions of MSBuildWorkspace no longer throw exceptions on failures; instead they raise an event, WorkspaceFailed (https://github.com/dotnet/roslyn/issues/15056).
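A minimal sketch of hooking that event inside the ReadSolution method from the question, so the underlying load failures become visible; the handler here just writes the diagnostics to the console:
using (MSBuildWorkspace workspace = MSBuildWorkspace.Create())
{
    // Log project/document load failures that are no longer thrown as exceptions.
    workspace.WorkspaceFailed += (sender, e) =>
        Console.WriteLine(e.Diagnostic.Kind + ": " + e.Diagnostic.Message);

    var solution = workspace.OpenSolutionAsync(path).Result;
    // ... enumerate solution.Projects and project.Documents as before ...
}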
This indicated that the required build tools (v15 / VS2017) were not available. The new 2017 Build Tools installer is both too complicated for our end users and much slower to install than the 2015 build tools installer was.
Instead, I came up with this less robust method to avoid that dependency:
public static IReadOnlyList<CodeFile> ReadSolution(string path)
{
    List<CodeFile> codes = new List<CodeFile>();
    var dirName = System.IO.Path.GetDirectoryName(path);
    var dir = new DirectoryInfo(dirName);
    var csProjFiles = dir.EnumerateFiles("*.csproj", SearchOption.AllDirectories);
    foreach (var csProjFile in csProjFiles)
    {
        var csProjPath = csProjFile.Directory.FullName;
        using (var fs = new FileStream(csProjFile.FullName, FileMode.Open, FileAccess.Read))
        {
            using (var reader = XmlReader.Create(fs))
            {
                while (reader.Read())
                {
                    if (reader.Name.Equals("Compile", StringComparison.OrdinalIgnoreCase))
                    {
                        var fn = reader["Include"];
                        var filePath = Path.Combine(csProjPath, fn);
                        var text = File.ReadAllText(filePath);
                        codes.Add(new CodeFile(fn, text));
                    }
                }
            }
        }
    }
    return codes;
}
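Usage is unchanged. Note that this approach only finds files listed as explicit Compile items, so SDK-style projects that rely on implicit globbing will come back empty. A minimal call site, with a placeholder solution path:
// Hypothetical call site; the .sln path is a placeholder.
IReadOnlyList<CodeFile> files = ReadSolution(@"C:\src\MySolution.sln");
Console.WriteLine(files.Count + " source files read");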
I had the same problem.
I was using a .NET Core console app.
I noticed that Microsoft.CodeAnalysis.Workspaces.MSBuild had the warning:
Package Microsoft.CodeAnalysis.Workspaces.MSBuild 2.10.0 was restored using '.NETFramework,Version=v4.6.1' instead of the project target framework '.NETCoreApp,Version=v2.1'. This package may not be fully compatible with your project.
I changed to a .NET Framework console app and I could then access the documents.

Get working directory of TFS XAML Build Agents using Powershell

My goal is to come up with a PowerShell script to clean the working directories of XAML build agents.
To get the working directory of the build agents I can use the C# code below, which works fine.
I would like to implement the same in PowerShell.
static void Main(string[] args)
{
    var TPC = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("CollectionURI"));
    IBuildServer buildServer = TPC.GetService<IBuildServer>();
    // {Microsoft.TeamFoundation.Build.Client.BuildServer}
    var buildController = buildServer.GetBuildController("ControllerName");
    var buildAgents = buildController.Agents;
    var workingFolder = string.Empty;
    List<string> list = new List<string>();
    foreach (IBuildAgent agent in buildAgents)
    {
        list.Add(agent.BuildDirectory);
    }
}
If there is no PowerShell equivalent, I will have to consume the C# from PowerShell through an exe or dll.
It is similar to your C# code:
$url = "http://xxxx:8080/tfs/CollectionName/";
$tfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($url);
$buildservice = $tfs.GetService("Microsoft.TeamFoundation.Build.Client.IBuildServer");
$buildcontroller = $buildservice.GetBuildController("ControllerName");
$buildagents = $buildcontroller.Agents;
foreach($buildagent in $buildagents)
{
    Write-Host $buildagent.BuildDirectory;
}

Build cannot be found when publishing results to TFS

I'm attempting to publish test results to Microsoft Test Manager through TFS but I'm getting the following error when attempting to publish results through the TFS API:
Microsoft.TeamFoundation.TestManagement.Client.TestObjectNotFoundException:
Build vstfs:///Build/Build/### cannot be found.
I'm getting my test plan and creating a test run the way Microsoft suggests:
plan = teamProject.TestPlans.Query(
    "SELECT * FROM TestPlan WHERE PlanName = '" + planName + "'")[0];
ITestRun run = plan.CreateTestRun(true);
run.AddTestPoints(testPoints, null);
run.Save();
So why is it saying the build can't be found?
Builds are periodically cleared out of the system, so you may want to be sure you're using the latest build. You can then get the latest build's URI and associate it with your run.
Uri GetLatestBuildURI(string projectName) {
    Uri buildUri = null;
    IBuildServer buildServer = _tfs.GetService<IBuildServer>();
    IBuildDetailSpec detailSpec = buildServer.CreateBuildDetailSpec(projectName);
    detailSpec.MaxBuildsPerDefinition = 1;
    detailSpec.QueryOrder = BuildQueryOrder.FinishTimeDescending;
    IBuildQueryResult results = buildServer.QueryBuilds(detailSpec);
    if (results.Builds.Length == 1) {
        IBuildDetail detail = results.Builds[0];
        buildUri = detail.Uri;
    }
    return buildUri;
}
...
ITestRun run = plan.CreateTestRun(true);
run.BuildUri = GetLatestBuildURI(projectName);
run.AddTestPoints(testPoints, null);
run.Save();

How to execute multiple SSIS packages from C#

I have created 10 different packages and I want to execute them from C# code. Can someone post some screenshots of how to achieve this?
I have tried this:
Application app = new Application();
TraceService("loading system From File system");
// Create a package container to hold the package
// and load the package using the Application object.
Package package = app.LoadPackage(@"C:\User\Kiran\Documents\Visual Studio 2012\Projects\WindowsServiceTest\WindowsServiceTest\Package1.dtsx", null);
TraceService("Execution Started");
DTSExecResult result = package.Execute();
// print the result
TraceService(result.ToString());
TraceService("Execution Completed");
Here I have to get the file name at run time, not by hard-coding it.
The following code will execute all packages from a given folder.
var pkgLocation = @"C:\User\Kiran\Documents\Visual Studio 2012\Projects\WindowsServiceTest\WindowsServiceTest\";
foreach (var file in Directory.EnumerateFiles(pkgLocation, "*.dtsx"))
{
    using (var pkg = new Application().LoadPackage(file, null))
    {
        var pkgResults = pkg.Execute();
        Console.WriteLine("Package File Name: {0}, Result: {1}", file, pkgResults);
    }
}
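If a package fails, the DTSExecResult value alone does not say why. A small sketch, meant to sit inside the using block above and reusing the same pkg and pkgResults variables, that dumps the package's error collection:
// Sketch: log error details when a package does not succeed.
if (pkgResults == DTSExecResult.Failure)
{
    foreach (DtsError error in pkg.Errors)
    {
        Console.WriteLine("  {0}", error.Description);
    }
}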
Executing SSIS packages from C# and VB is well documented on the official site. This is my complete code in a Script Task to execute multiple SSIS packages.
string packagesFolder = Dts.Variables["User::packagesFolder"].Value.ToString();
string rootFolder = Dts.Variables["User::rootFolder"].Value.ToString();
Package pkg;
Microsoft.SqlServer.Dts.Runtime.Application app;
DTSExecResult pkgResults;
foreach (var pkgLocation in Directory.EnumerateFiles(packagesFolder + "\\", "ValidateDataMigration-*.dtsx"))
{
    try
    {
        app = new Microsoft.SqlServer.Dts.Runtime.Application();
        pkg = app.LoadPackage(pkgLocation, null);
        pkgResults = pkg.Execute();
        File.AppendAllText(rootFolder + "\\DataValidationProgress.log", pkgLocation + "=>" + pkgResults + Environment.NewLine);
    }
    catch (Exception e)
    {
        File.AppendAllLines(rootFolder + "\\DataValidationErrors.log", new string[] { e.Message, e.StackTrace });
    }
}

C# TFS API "There is no working folder C:\TFS" on a system with VS2008 installed

I want to find the files that were recently checked in, using C# and the TFS API against TFS 2010. It works fine where MS Visual Studio 2010 is installed; this was developed using VS2010 and .NET 3.5.
When I run this exe on a system that has VS2008 installed, it throws the error "There is no working folder C:\TFS".
That system has only the 3.5 Framework.
I copied all the files from C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0 along with the executable.
C:\TFS is the actual mapped folder. I tried an inner folder as well.
Any suggestions? Is there any way to get the result from TFS without relying on local mappings?
TeamFoundationServer tfsServer = new TeamFoundationServer("http://snchndevtfsapp:8080/tfs/defaultcollection");
WorkItemStore workItemStore = new WorkItemStore(tfsServer);
VersionControlServer vcServer = tfsServer.GetService(typeof(VersionControlServer)) as VersionControlServer;
var usersWorkspaces = vcServer.QueryWorkspaces(null, vcServer.AuthorizedUser, Environment.MachineName).ToList();
List<ChangedTFSItem> foundPastChanges = new System.Collections.Generic.List<ChangedTFSItem>();
var allPastChangesets = vcServer.QueryHistory(@"C:\TFS",
    VersionSpec.Latest,
    0,
    RecursionType.Full,
    null,
    null,
    null,
    1000,
    true,
    false).Cast<Changeset>();
    //.Where(x => x.Committer.Contains(Environment.UserName));
List<ChangedTFSItem> _foundPastChanges = allPastChangesets
    .SelectMany(x => x.Changes)
    .Where(x => x.Item.CheckinDate.Date >= ((DateTime)dateEdit1.EditValue))
    //.DistinctBy(x => x.Item.ServerItem, x => x.Item.ServerItem.GetHashCode())
    .Select(x => new ChangedTFSItem()
    {
        FileName = Path.GetFileName(x.Item.ServerItem),
        ServerItem = usersWorkspaces[0].GetLocalItemForServerItem(x.Item.ServerItem).Replace(textEdit1.Text, ""),
        LocalPath = usersWorkspaces[0].GetLocalItemForServerItem(x.Item.ServerItem),
        ChangeTypeName = x.ChangeType.ToString(),
        ChangeDate = x.Item.CheckinDate.ToString()
    }).ToList();
Instead of placing a physical path (@"C:\TFS") as the first argument to QueryHistory, try using a source control path. If you are interested in all changesets, simply pass the root "$/". For the task you are trying to accomplish, you can skip any local workspace connection by doing something like this:
using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace GetFilesOfLatestChangesets
{
    class Program
    {
        static void Main()
        {
            TfsTeamProjectCollection teamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("TFS_URI"));
            var vcS = teamProjectCollection.GetService(typeof(VersionControlServer)) as VersionControlServer;
            var changesets =
                vcS.QueryHistory("$/", VersionSpec.Latest, 0, RecursionType.Full, null, null, null, 10, true, false).
                Cast<Changeset>();
            foreach (var changeset in changesets)
            {
                Console.WriteLine("Changes for " + changeset.ChangesetId);
                foreach (var change in changeset.Changes)
                {
                    Console.WriteLine(change.Item.ServerItem);
                }
            }
        }
    }
}
Note, though, that you'll retrieve the server paths of the changed items, not the locations where they are mapped on your workstation.
One final remark: since you have to call QueryHistory with includeChanges = true, asking for the last 1000 changesets can be rather expensive.
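If local paths are needed, the server paths can still be translated through a workspace mapping, much like the original code does. A rough sketch, reusing the change variable from the inner loop above and assuming the authorized user has exactly one workspace on this machine:
// Sketch: map a server item to its local path via the user's workspace.
var workspaces = vcS.QueryWorkspaces(null, vcS.AuthorizedUser, Environment.MachineName);
if (workspaces.Length > 0)
{
    string localPath = workspaces[0].GetLocalItemForServerItem(change.Item.ServerItem);
    Console.WriteLine(localPath);
}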
