Roslyn / MSBuildWorkspace Does Not Load Documents on Non-Dev Computer - C#

MSBuildWorkspace does not show documents on computers other than the one I used to build the application.
I have seen Roslyn workspace for .NET Core's new .csproj format, but in my case it works fine on the development computer (Documents is populated), while Documents is empty for the same project on a different computer.
So I am not sure why it works on my development computer but not on the other one.
This is the code:
public static IReadOnlyList<CodeFile> ReadSolution(string path)
{
    List<CodeFile> codes = new List<CodeFile>();
    using (MSBuildWorkspace workspace = MSBuildWorkspace.Create())
    {
        var solution = workspace.OpenSolutionAsync(path).Result;
        foreach (var project in solution.Projects)
        {
            // project.Documents.Count() is 0
            foreach (var doc in project.Documents)
            {
                if (doc.SourceCodeKind == SourceCodeKind.Regular)
                {
                    StringBuilder sb = new StringBuilder();
                    using (var sw = new StringWriter(sb))
                    {
                        var source = doc.GetTextAsync().Result;
                        source.Write(sw);
                        sw.WriteLine();
                    }
                    codes.Add(new CodeFile(doc.FilePath, sb.ToString()));
                }
            }
        }
    }
    return codes;
}

It turns out that newer versions of MSBuildWorkspace no longer throw exceptions on failures; instead they raise an event, WorkspaceFailed (https://github.com/dotnet/roslyn/issues/15056).
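To see why loading fails, you can subscribe to that event before opening the solution. A minimal sketch (the Console logging is only a placeholder):

using (MSBuildWorkspace workspace = MSBuildWorkspace.Create())
{
    // Surface load/parse failures that would otherwise be swallowed silently.
    workspace.WorkspaceFailed += (sender, e) =>
        Console.WriteLine(e.Diagnostic.Kind + ": " + e.Diagnostic.Message);
    var solution = workspace.OpenSolutionAsync(path).Result;
    // ... inspect solution.Projects as before ...
}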
In my case, the diagnostics indicated that the required build tools (v15 / VS 2017) were not installed. The 2017 Build Tools installer is both too complicated for our end users and much slower to install than the 2015 Build Tools installer was.
Instead, I came up with this less robust method to avoid that dependency:
public static IReadOnlyList<CodeFile> ReadSolution(string path)
{
    List<CodeFile> codes = new List<CodeFile>();
    var dirName = System.IO.Path.GetDirectoryName(path);
    var dir = new DirectoryInfo(dirName);
    var csProjFiles = dir.EnumerateFiles("*.csproj", SearchOption.AllDirectories);
    foreach (var csProjFile in csProjFiles)
    {
        var csProjPath = csProjFile.Directory.FullName;
        using (var fs = new FileStream(csProjFile.FullName, FileMode.Open, FileAccess.Read))
        {
            using (var reader = XmlReader.Create(fs))
            {
                while (reader.Read())
                {
                    // Old-style .csproj files list each source file as a <Compile Include="..." /> item.
                    if (reader.NodeType == XmlNodeType.Element &&
                        reader.Name.Equals("Compile", StringComparison.OrdinalIgnoreCase))
                    {
                        var fn = reader["Include"];
                        if (fn == null)
                        {
                            continue;
                        }
                        var filePath = Path.Combine(csProjPath, fn);
                        var text = File.ReadAllText(filePath);
                        codes.Add(new CodeFile(fn, text));
                    }
                }
            }
        }
    }
    return codes;
}
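Note that SDK-style .csproj files usually have no explicit Compile items (source files are globbed implicitly), so the XML scan above finds nothing for them. A rough fallback for that case, sketched here under the assumption that everything under the project folder except bin/obj belongs to the project:

// Sketch: for SDK-style projects, fall back to enumerating *.cs files directly.
var sourceFiles = csProjFile.Directory
    .EnumerateFiles("*.cs", SearchOption.AllDirectories)
    .Where(f => !f.FullName.Contains(@"\bin\") && !f.FullName.Contains(@"\obj\"));
foreach (var sourceFile in sourceFiles)
{
    codes.Add(new CodeFile(sourceFile.FullName, File.ReadAllText(sourceFile.FullName)));
}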

I had the same problem.
I was using a .NET Core console app.
I noticed that the Microsoft.CodeAnalysis.Workspaces.MSBuild package came with this warning:
Package Microsoft.CodeAnalysis.Workspaces.MSBuild 2.10.0 was restored using '.NETFramework,Version=v4.6.1' instead of the project target framework '.NETCoreApp,Version=v2.1'. This package may not be fully compatible with your project.
I changed to a .NET Framework console app and I could access the documents.
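For what it's worth, later Microsoft.CodeAnalysis.Workspaces.MSBuild packages (3.x) can also be made to work from a .NET Core app if an MSBuild instance is registered first with the Microsoft.Build.Locator package; a sketch, assuming both packages are installed:

// Register the installed MSBuild before any MSBuildWorkspace code runs.
Microsoft.Build.Locator.MSBuildLocator.RegisterDefaults();

using (var workspace = MSBuildWorkspace.Create())
{
    var solution = workspace.OpenSolutionAsync(solutionPath).Result;
    // project.Documents should now be populated
}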

Related

Download only files with specified extensions from Azure Git repository

I need to download files with specified extensions from an Azure Git repository programmatically (C#, .NET Framework 4.8). The files are located in different server folders and sometimes in different branches. I was able to achieve this goal with the following code:
var connection = new VssConnection(new Uri(collectionUri), new VssBasicCredential("", personalAccessToken));
using (var ghc = connection.GetClient<GitHttpClient>())
{
    string[] extensionsToDownload = { "xml", "xslt" }; // just for example, real cases are not limited to these extensions
    string branch = "my-branch-name";
    GitVersionDescriptor version = new GitVersionDescriptor { Version = branch };
    // Get all items information
    GitItemRequestData data = new GitItemRequestData
    {
        ItemDescriptors = new[]
        {
            new GitItemDescriptor
            {
                Path = "/Some/Path",
                RecursionLevel = VersionControlRecursionType.Full,
                VersionType = GitVersionType.Branch,
                Version = branch
            },
            new GitItemDescriptor
            {
                Path = "/Another/Path",
                RecursionLevel = VersionControlRecursionType.Full,
                VersionType = GitVersionType.Branch,
                Version = branch
            }
        }
    };
    var items = ghc.GetItemsBatchAsync(data, project: projectName, repositoryId: repoName).Result;
    // filter returned items by extension
    List<GitItem> filteredItems = items
        .SelectMany(item => item)
        .Where(item => item.GitObjectType == GitObjectType.Blob && extensionsToDownload.Contains(item.Path.Split('.').Last()))
        .ToList();
    // download zipped items one by one and extract
    foreach (var item in filteredItems)
    {
        using (var stream = ghc.GetItemZipAsync(
            project: projectName, repositoryId: repoName, path: item.Path, includeContent: true, versionDescriptor: version).Result)
        {
            ZipArchive archive = new ZipArchive(stream);
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                entry.ExtractToFile(Path.Combine(localFolder, entry.FullName.Trim('/')), true);
            }
        }
    }
}
However, this means that each item requires a separate API call, which is definitely not good for performance. I thought, "There should be a way to batch-download all items at once."
The GetBlobsZipAsync method seemed like exactly what I needed. However, my attempt to use it failed miserably. All I got was VssUnauthorizedException: 'VS30063: You are not authorized to access https://dev.azure.com'. Very strange, because calling GetBlobZipAsync for each individual item id works perfectly (but that is almost the same as the initial solution, with the same far-from-ideal performance).
Dictionary<string, string> idToNameMappings = filteredItems.ToDictionary(k => k.ObjectId, v => Path.Combine(localFolder, v.Path.Trim('/')));
// single batched call for all blob ids
using (var stream = ghc.GetBlobsZipAsync(idToNameMappings.Select(i => i.Key), project: projectName, repositoryId: repoName).Result)
{
    ZipArchive archive = new ZipArchive(stream);
    foreach (ZipArchiveEntry entry in archive.Entries)
    {
        // entries in the blobs zip are named by blob id, so map them back to file paths
        entry.ExtractToFile(idToNameMappings[entry.FullName], true);
    }
}
Another option is to download all items as a zip archive and filter them on the client side:
foreach (var desc in data.ItemDescriptors)
{
    using (var stream = ghc.GetItemZipAsync(projectName, repoName, null, desc.Path, desc.RecursionLevel, versionDescriptor: version).Result)
    {
        ZipArchive archive = new ZipArchive(stream);
        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            if (extensionsToDownload.Contains(entry.FullName.Split('.').Last()))
            {
                entry.ExtractToFile(Path.Combine(localFolder, entry.FullName.Trim('/')), true);
            }
        }
    }
}
But this is even worse, because the repository contains a large amount of data files (including some binary content). Downloading several hundred MB of data to get less than 10 MB of XML files doesn't seem very efficient.
So for the moment I gave up and decided to stick with the initial solution. But maybe there's something I overlooked?
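One way to at least soften the per-item cost (a sketch only, reusing the ghc client, filteredItems, and version from the code above) is to issue the per-item downloads concurrently instead of one at a time:

// Sketch: fire the per-item requests concurrently instead of sequentially.
var downloadTasks = filteredItems.Select(async item =>
{
    using (var stream = await ghc.GetItemZipAsync(
        project: projectName, repositoryId: repoName, path: item.Path,
        includeContent: true, versionDescriptor: version))
    {
        var archive = new ZipArchive(stream);
        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            entry.ExtractToFile(Path.Combine(localFolder, entry.FullName.Trim('/')), true);
        }
    }
});
await Task.WhenAll(downloadTasks);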

Running .nunit files programmatically

.nunit files are not running when using NUnit in C#.
I have tried running the code below, but the error returned is that the file is not supported. All the examples on the internet use a .dll as the path for a TestPackage, not a .nunit file.
According to the sites below, it should be possible to run .nunit files:
https://github.com/nunit/docs/wiki/Writing-Engine-Extensions and
https://github.com/nunit/nunit-project-loader/tree/master/src/extension
var path = @"path to nunit file";
var package = new TestPackage(path);
var engine = TestEngineActivator.CreateInstance();
using (var runner = engine.GetRunner(package))
{
    var result = runner.Run(this, TestFilter.Empty);
}
I have tried the following by referencing the nunit-project-loader dll:
var path = #"C:\Users\Administrator\Desktop\nunit-project-loader-master\nunit-project-loader-master\ConsoleApplication1\bin\Debug\DEV.nunit";
NUnitProjectLoader pl = new NUnitProjectLoader();
pl.LoadFrom(path);
var package = pl.GetTestPackage(path);
var engine = TestEngineActivator.CreateInstance();
using (var runner = engine.GetRunner(package))
{
// execute the tests
var result = runner.Run(this, TestFilter.Empty);
}
Doing this produces a new error: it tries to run NBi.NUnit.Runtime.dll, which is referenced in the .nunit file.
Everything works fine when using the console runner.
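Since the console runner handles .nunit project files without trouble, one workaround (a sketch; the runner path below is just a placeholder) is to shell out to nunit3-console.exe from the application:

// Sketch: delegate the .nunit project to the console runner.
var psi = new System.Diagnostics.ProcessStartInfo
{
    FileName = @"C:\Tools\NUnit.Console\nunit3-console.exe", // placeholder path
    Arguments = "\"" + path + "\" --result=TestResult.xml",
    UseShellExecute = false
};
using (var proc = System.Diagnostics.Process.Start(psi))
{
    proc.WaitForExit();
    Console.WriteLine("Runner exit code: " + proc.ExitCode);
}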

MSBuildWorkspace: solution contains no documents and no projects on Visual Studio 2019

I want to create an analyzer using Roslyn, but first I need to get all the documents (.cs files) from the target solution.
I used the following code from Josh Varty's tutorial:
string solutionPath = @"C:\Users\hamza\Desktop\TestSolution\TestSolution.sln";
var msWorkspace = MSBuildWorkspace.Create();
var solution = msWorkspace.OpenSolutionAsync(solutionPath).Result;
foreach (var project in solution.Projects)
{
    Console.WriteLine(project);
    foreach (var document in project.Documents)
    {
        Console.WriteLine(project.Name + "\t\t\t" + document.Name);
    }
}
But the result is empty: I don't get any documents or projects.
The MSBuildWorkspace version is 3.0.0; I also tried 2.10.0, but the result is the same.
Does anyone have an idea about this, or how to fix it?
After more research I found this helpful issue on GitHub:
https://github.com/dotnet/roslyn/issues/24767
This code worked fine:
var projectPath = @"C:\Users\hamza\Desktop\TestSolution\TestSolution.sln";
using (var workspace = MSBuildWorkspace.Create())
{
    var solution = workspace.OpenSolutionAsync(projectPath).Result;
    foreach (var project in solution.Projects)
    {
        foreach (var document in project.Documents)
        {
            Console.WriteLine(project.Name + "\t\t\t" + document.Name);
        }
    }
}
Finally, to make things work, I added this package:
Install-Package Buildalyzer.Workspaces -Version 2.2.0
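With Buildalyzer.Workspaces the documents come out of an AdhocWorkspace instead of MSBuildWorkspace; roughly like this (a sketch based on the Buildalyzer 2.x API, which may differ in other versions):

// Sketch: let Buildalyzer design-time-build the projects and fill a Roslyn workspace.
var manager = new Buildalyzer.AnalyzerManager(projectPath); // the .sln path
var workspace = new AdhocWorkspace();
foreach (var analyzer in manager.Projects.Values)
{
    analyzer.AddToWorkspace(workspace); // extension method from Buildalyzer.Workspaces
}
foreach (var project in workspace.CurrentSolution.Projects)
{
    foreach (var document in project.Documents)
    {
        Console.WriteLine(project.Name + "\t\t\t" + document.Name);
    }
}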

Build a Visual Studio solution from code

I'm writing a console application to get a solution from a TFS server, build it, and publish it on IIS, but I'm stuck at the build step...
I found this code, which works like a charm:
public static void BuildProject()
{
    string solutionPath = Path.Combine(@"C:\MySolution\Common\Common.csproj");
    List<ILogger> loggers = new List<ILogger>();
    loggers.Add(new ConsoleLogger());
    var projectCollection = new ProjectCollection();
    projectCollection.RegisterLoggers(loggers);
    var project = projectCollection.LoadProject(solutionPath);
    try
    {
        project.Build();
    }
    finally
    {
        projectCollection.UnregisterAllLoggers();
    }
}
But my solution is pretty big and contains multiple projects that depend on each other (e.g. project A has a reference to project B).
How do I get the correct order to build each project?
Is there a way to build the entire solution from the .sln file?
Try using the following code to load a solution and compile it:
string projectFilePath = Path.Combine(@"c:\solutions\App\app.sln");
ProjectCollection pc = new ProjectCollection();
// THERE ARE A LOT OF PROPERTIES HERE, THESE MAP TO THE MSBUILD CLI PROPERTIES
Dictionary<string, string> globalProperty = new Dictionary<string, string>();
globalProperty.Add("OutputPath", @"c:\temp");
BuildParameters bp = new BuildParameters(pc);
BuildRequestData buildRequest = new BuildRequestData(projectFilePath, globalProperty, "4.0", new string[] { "Build" }, null);
// THIS IS WHERE THE MAGIC HAPPENS - IN PROCESS MSBUILD
BuildResult buildResult = BuildManager.DefaultBuildManager.Build(bp, buildRequest);
// A SIMPLE WAY TO CHECK THE RESULT
if (buildResult.OverallResult == BuildResultCode.Success)
{
    //...
}
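To see build output from that in-process build, a logger can be attached to the BuildParameters as well (a short sketch):

// Sketch: attach a console logger so the in-process build writes its output.
BuildParameters bp = new BuildParameters(pc)
{
    Loggers = new List<ILogger> { new ConsoleLogger(LoggerVerbosity.Minimal) }
};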

How to execute multiple SSIS packages from C#

I have created 10 different packages and I want to execute them from C# code. Can someone show how to achieve this?
I have tried this:
Application app = new Application();
TraceService("loading system From File system");
// Create package Container to hold the package.
// And Load the Package Using the Application Object.
Package package = app.LoadPackage(@"C:\User\Kiran\Documents\Visual Studio 2012\Projects\WindowsServiceTest\WindowsServiceTest\Package1.dtsx", null);
TraceService("Execution Started");
DTSExecResult result = package.Execute();
// print the result
TraceService(result.ToString());
TraceService("Execution Completed");
Here I need to get the file names at run time, not by hard-coding them.
The following code will execute all packages from a given folder.
var pkgLocation = @"C:\User\Kiran\Documents\Visual Studio 2012\Projects\WindowsServiceTest\WindowsServiceTest\";
foreach (var file in Directory.EnumerateFiles(pkgLocation, "*.dtsx"))
{
    using (var pkg = new Application().LoadPackage(file, null))
    {
        var pkgResults = pkg.Execute();
        Console.WriteLine("Package File Name: {0}, Result: {1}", file, pkgResults);
    }
}
Executing SSIS packages from C# and VB is well documented on the official site. Here is my complete Script Task code to execute multiple SSIS packages.
string packagesFolder = Dts.Variables["User::packagesFolder"].Value.ToString();
string rootFolder = Dts.Variables["User::rootFolder"].Value.ToString();
Package pkg;
Microsoft.SqlServer.Dts.Runtime.Application app;
DTSExecResult pkgResults;
foreach (var pkgLocation in Directory.EnumerateFiles(packagesFolder + "\\", "ValidateDataMigration-*.dtsx"))
{
    try
    {
        app = new Microsoft.SqlServer.Dts.Runtime.Application();
        pkg = app.LoadPackage(pkgLocation, null);
        pkgResults = pkg.Execute();
        File.AppendAllText(rootFolder + "\\DataValidationProgress.log", pkgLocation + " => " + pkgResults + Environment.NewLine);
    }
    catch (Exception e)
    {
        File.AppendAllLines(rootFolder + "\\DataValidationErrors.log", new string[] { e.Message, e.StackTrace });
    }
}
