We are using TeamCity for continuous integration, our source control is Git, and we have one major repository that contains multiple .sln files (around 10).
All in all, this repository holds roughly 100-200 C# projects.
Upon a push to master, TeamCity triggers a build that compiles all projects in the repository.
I'd like to be able to tell which projects were actually affected by a particular commit, and thus publish only those projects' outputs as artifacts of the current build.
For this, I've designed a solution that integrates NDepend into our build process and generates a diff report between the current and previous build outputs.
The outputs that were changed/added will be published as the build outputs.
I have little experience with NDepend; from what I've seen, all of its true power comes from the query language that is baked into it.
I am wondering how (if possible) I can achieve the following:
Diff between a folder containing the previous build's outputs and the current folder of build outputs.
Have NDepend generate a report in a consumable format so I can determine the files that need to be copied.
Is this scenario possible? How easy/hard would that be?
So the simple answer is to do it the Reporting Code Diff way, as explained in this documentation. The problem with this basic answer is that it presupposes two NDepend projects that always refer to the same two sets of assemblies.
Certainly, the number and names of the assemblies vary in your context, so we need to build the two projects (old/new) on the fly and analyze them through NDepend.API.
Here is the NDepend.API source code for that. For an It-Just-Works experience, in the PowerTools source code (in $NDependInstallDir$\NDepend.PowerTools.SourceCode\NDepend.PowerTools.sln), just call the FoldersDiff.Main() method after the AssemblyResolve registration call in Program.cs.
...
AppDomain.CurrentDomain.AssemblyResolve += AssemblyResolverHelper.AssemblyResolveHandler;
FoldersDiff.Main();
...
Here is the source code that harnesses NDepend.API.
Note that much more can be done through the two codeBase objects and the compareContext object. Instead of just showing the three lists of assemblies added/removed/code-changed, you could show API breaking changes, new methods and types added, modified classes and methods, code quality regressions... For that, just look at the default code rules concerning diff, which are based on the same NDepend.CodeModel API (a speculative sketch follows the code below).
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using NDepend;
using NDepend.Analysis;
using NDepend.CodeModel;
using NDepend.Path;
using NDepend.Project;
class FoldersDiff {
private static readonly NDependServicesProvider s_NDependServicesProvider = new NDependServicesProvider();
internal static void Main() {
var dirOld = @"C:\MyProduct\OldAssembliesDir".ToAbsoluteDirectoryPath();
var dirNew = @"C:\MyProduct\NewAssembliesDir".ToAbsoluteDirectoryPath();
Console.WriteLine("Analyzing assemblies in " + dirOld.ToString());
var codeBaseOld = GetCodeBaseFromAsmInDir(dirOld, TemporaryProjectMode.TemporaryOlder);
Console.WriteLine("Analyzing assemblies in " + dirNew.ToString());
var codeBaseNew = GetCodeBaseFromAsmInDir(dirNew, TemporaryProjectMode.TemporaryNewer);
var compareContext = codeBaseNew.CreateCompareContextWithOlder(codeBaseOld);
// So much more can be done by exploring fine-grained diff in codeBases and compareContext
Dump("Added assemblies", codeBaseNew.Assemblies.Where(compareContext.WasAdded));
Dump("Removed assemblies", codeBaseOld.Assemblies.Where(compareContext.WasRemoved));
Dump("Assemblies with modified code", codeBaseNew.Assemblies.Where(compareContext.CodeWasChanged));
Console.Read();
}
internal static ICodeBase GetCodeBaseFromAsmInDir(IAbsoluteDirectoryPath dir, TemporaryProjectMode temporaryProjectMode) {
Debug.Assert(dir.Exists);
var dotNetManager = s_NDependServicesProvider.DotNetManager;
var assembliesPath = dir.ChildrenFilesPath.Where(dotNetManager.IsAssembly).ToArray();
Debug.Assert(assembliesPath.Length > 0); // Make sure we found assemblies
var projectManager = s_NDependServicesProvider.ProjectManager;
IProject project = projectManager.CreateTemporaryProject(assembliesPath, temporaryProjectMode);
// In PowerTool context, better call:
// var analysisResult = ProjectAnalysisUtils.RunAnalysisShowProgressOnConsole(project);
var analysisResult = project.RunAnalysis();
return analysisResult.CodeBase;
}
internal static void Dump(string title, IEnumerable<IAssembly> assemblies) {
Debug.Assert(!string.IsNullOrEmpty(title));
Debug.Assert(assemblies != null);
Console.WriteLine(title);
foreach (var @assembly in assemblies) {
Console.WriteLine(" " + @assembly.Name);
}
}
}
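For example, here is a purely speculative sketch of a finer-grained dump that could be appended at the end of Main() above. Whether ICodeBase exposes a Types collection, and whether the compareContext predicates accept types the way they accept assemblies, are assumptions mirroring the assembly-level calls, not verified NDepend.API members:
// Hypothetical finer-grained diff, mirroring the assembly-level Dump() calls.
// codeBaseNew.Types and passing types to compareContext.CodeWasChanged are assumptions.
Console.WriteLine("Types with modified code");
foreach (var type in codeBaseNew.Types.Where(compareContext.CodeWasChanged)) {
    Console.WriteLine("   " + type.FullName);
}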
My question does not target a problem; it is more a kind of "Do you know something that...?". All my applications are built and deployed using CI/CD with Azure DevOps. I like to have all build information handy in the created binary and to read it during runtime. Those applications are mainly .NET Core 2 applications written in C#. I am using the default build system, MSBuild, supplied with the .NET Core SDK. The project should be buildable on Windows AND Linux.
Information I need:
GitCommitHash: string
GitCommitMessage: string
GitBranch: string
CiBuildNumber: string (only set when built via CI, not locally)
IsCiBuild: bool (detecting this should work by checking for env variables which are only available in CI builds)
Current approach:
In each project in the solution there is a class BuildConfig à la
public static class BuildConfig
{
public const string BuildNumber = "#{Build.BuildNumber}#"; // These are the names of the variables within the CI
// and the remaining information...
}
Here, tokens are used which get replaced with the corresponding values during the CI build. To achieve this, an add-on task is used. Sadly, this only fills in the values for CI builds and not for local ones. When running locally and requesting the build information, it only contains the tokens, as they are not replaced during a local build.
It would be cool to either have the BuildConfig.cs generated during the build or have the values of the variables set during the local build (IntelliSense would be very cool and would prevent some "BuildConfig class could not be found" errors). The values could be set by an MSBuild task (?). That would be one (or two) possibilities to solve this. Do you have ideas/experience regarding this? I did not find much during my internet research; I only stumbled over this question, which did not really help me as I have zero experience with MSBuild tasks/customization.
Then I decided to have a look at build systems in general, namely FAKE and Cake. Cake has a Git add-in, but I did not find anything regarding code generation/manipulation. Do you know of some resources on that?
So here's the thing...
A short time ago I had to work with Android apps, namely Java and the Gradle build system. So I wanted to inject the build information there too during the CI build. After a short time I found an (imo) better and more elegant solution, which was modifying the build script in the following way (the scripting language used is Groovy, which is based on Java):
def getGitHash = { ->
def stdout = new ByteArrayOutputStream()
exec {
commandLine 'git', 'rev-parse', '--short', 'HEAD'
standardOutput = stdout
}
return stdout.toString().trim().replace("\"", "\\\"")
}
def getGitBranch = { ->
def fromEnv = System.getenv("BUILD_SOURCEBRANCH")
if (fromEnv) {
return fromEnv.substring("refs/heads/".length()).replace("\"", "\\\"");
} else {
def stdout = new ByteArrayOutputStream()
exec {
commandLine 'git', 'rev-parse', '--abbrev-ref', 'HEAD'
standardOutput = stdout
}
return stdout.toString().trim().replace("\"", "\\\"")
}
}
def getIsCI = { ->
return System.getenv("BUILD_BUILDNUMBER") != null;
}
// And the other functions work very similarly
android {
// ...
buildConfigField "String", "GitHash", "\"${getGitHash()}\""
buildConfigField "String", "GitBranch", "\"${getGitBranch()}\""
buildConfigField "String", "BuildNumber", "\"${getBuildNumber()}\""
buildConfigField "String", "GitMessage", "\"${getGitCommitMessage()}\""
buildConfigField "boolean", "IsCIBuild", "${getIsCI()}"
// ...
}
The result after the first build is the following Java code:
public final class BuildConfig {
// Some other fields generated by default
// Fields from default config.
public static final String BuildNumber = "Local Build";
public static final String GitBranch = "develop";
public static final String GitHash = "6c87e82";
public static final String GitMessage = "Merge branch 'hotfix/login-failed' into 'develop'";
public static final boolean IsCIBuild = false;
}
Getting the required information is done by the build script itself, without depending on the CI engine to fulfill this task. The class can be used after the first build; it's generated and stored in a "hidden" directory which is included in code analysis but excluded from your code in the IDE, and it is not pushed to Git. But there is IntelliSense support. In a C# project this would be the obj/ folder, I guess. It is very easy to access the information, as the values are constant and static (so no reflection or similar is required).
So here the summarized question: "Do you know something to achieve this behaviour/mechanism in a .NET environment?"
Happy to discuss some ideas/approaches... :)
It becomes much easier if at runtime you are willing to use reflection to read assembly attribute values. For example:
using System.Linq;
using System.Reflection;
var assembly = Assembly.GetExecutingAssembly();
var descriptionAttribute = (AssemblyDescriptionAttribute)assembly
.GetCustomAttributes(typeof(AssemblyDescriptionAttribute), false).FirstOrDefault();
var description = descriptionAttribute?.Description;
For most purposes, the performance impact of this approach can be satisfactorily addressed by caching the values so they only need to be read once.
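For instance, a minimal sketch of such caching; the class and member names here are illustrative, not part of the original answer:
using System;
using System.Reflection;
public static class BuildInfo
{
    // Lazy<T> defers the reflection call until first use and caches the
    // result, so the attribute is read only once per process.
    private static readonly Lazy<string> s_description = new Lazy<string>(() =>
        Assembly.GetExecutingAssembly()
                .GetCustomAttribute<AssemblyDescriptionAttribute>()?
                .Description ?? string.Empty);

    public static string Description => s_description.Value;
}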
One way to embed the desired values into assembly attributes is to use the MSBuild WriteCodeFragment task to create a class file that sets assembly attributes to the values of project and/or environment variables. You would need to ensure that you do this in a Target that executes before compilation occurs (e.g. <Target BeforeTargets="CoreCompile" ...>). You would also need to set the property <GenerateAssemblyInfo>false</GenerateAssemblyInfo> to avoid conflicting with the functionality referenced in the next option.
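As a rough sketch of that approach: the target name and output file name below are illustrative, BUILD_BUILDNUMBER is the Azure DevOps environment variable mentioned in the question, and WriteCodeFragment's convention of mapping _ParameterN metadata to positional attribute arguments does the rest:
<Target Name="AddBuildMetadata" BeforeTargets="CoreCompile">
  <ItemGroup>
    <AssemblyAttribute Include="System.Reflection.AssemblyMetadataAttribute">
      <_Parameter1>CiBuildNumber</_Parameter1>
      <_Parameter2>$(BUILD_BUILDNUMBER)</_Parameter2>
    </AssemblyAttribute>
  </ItemGroup>
  <WriteCodeFragment AssemblyAttributes="@(AssemblyAttribute)"
                     Language="C#"
                     OutputDirectory="$(IntermediateOutputPath)"
                     OutputFile="CustomAssemblyInfo.cs">
    <Output TaskParameter="OutputFile" ItemName="Compile" />
  </WriteCodeFragment>
</Target>
At runtime the value would then be read back via AssemblyMetadataAttribute, as shown above.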
Alternatively, you may be able to leverage the plumbing in the dotnet SDK for including metadata in assemblies. It embeds the values of many of the same project variables documented for the NuGet Pack target. As implied above, this would require the GenerateAssemblyInfo property to be set to true.
Finally, consider whether GitVersion would meet your needs.
Good luck!
I need to compile the source code of a big project dynamically, and the output type can be Windows Application or Class Library.
The code executes nicely and it's possible to make .dll or .exe files, but the problem is that when I'm trying to make an .exe file, it loses resources like the project icon. The resulting file doesn't include the assembly information either.
Is there any way to solve this? (The expected result should be the same as manually invoking Build on the project in Visual Studio 2015.)
Thank you!
var workspace = MSBuildWorkspace.Create();
//Locating project file that is WindowsApplication
var project = workspace.OpenProjectAsync(@"C:\RoslynTestProjectExe\RoslynTestProjectExe.csproj").Result;
var metadataReferences = project.MetadataReferences;
// removing all references
foreach (var reference in metadataReferences)
{
project = project.RemoveMetadataReference(reference);
}
//getting new path of dlls location and adding them to project
var param = CreateParamString(); //my own function that returns list of references
foreach (var par in param)
{
project = project.AddMetadataReference(MetadataReference.CreateFromFile(par));
}
//compiling
var projectCompilation = project.GetCompilationAsync().Result;
using (var stream = new MemoryStream())
{
var result = projectCompilation.Emit(stream);
if (result.Success)
{
/// Getting result
//writing exe file
using (var file = File.Create(Path.Combine(_buildPath, fileName)))
{
stream.Seek(0, SeekOrigin.Begin);
stream.CopyTo(file);
}
}
}
We never really designed the workspace API to include all the information you need to emit like this; in particular when you're calling Emit there's an EmitOptions you can pass that includes, amongst other things, resource information. But we don't expose that information since this scenario wasn't hugely considered. We've done some of the work in the past to enable this but ultimately never merged it. You might wish to consider filing a bug so we officially have the request somewhere.
So what can you do? I think there are a few options. You might consider not using Roslyn at all, but rather modifying the project file and building that with the MSBuild APIs. Unfortunately, I don't know what you're ultimately trying to achieve here (it would help if you mentioned it), but there's a lot more than just the compiler invocation involved in building a project. Changing references potentially changes other things too.
It'd also be possible, of course, to update MSBuildWorkspace yourself to pass this through. If you were to modify the Roslyn code, you'll see we implement a series of interfaces named "ICscHostObject#" (where # is a number), and we get passed the information from MSBuild through them. It looks like we already stash that in the command line arguments, so you might be able to pass those to our command line parser and get back the data you need that way.
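To illustrate that last suggestion, here is an untested sketch. The /resource switch value and paths are placeholders, and projectCompilation and stream refer to the question's code above:
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
// Parse csc-style arguments (as MSBuild would pass them) to recover the
// resource descriptions, then hand them directly to Emit.
var parsedArgs = CSharpCommandLineParser.Default.Parse(
    new[] { "/resource:Resources.resources" },  // placeholder switch
    baseDirectory: @"C:\RoslynTestProjectExe",
    sdkDirectory: null);
var result = projectCompilation.Emit(
    stream,
    manifestResources: parsedArgs.ManifestResources);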
It appears that I can run all my tests in the solution in one go from the command line using MSTest if I use the /testmetadata flag as described here: http://msdn.microsoft.com/en-us/library/ms182487.aspx
I'm running SQL Server DB unit tests in Visual Studio 2013, wherein I don't seem to have a .vsmdi file at all, and I'm unable to find a way to add one either. I tried creating a .testsettings file, but it doesn't discover any tests (it shows "No tests to run") when I invoke MSTest.
Is there a way I can have MSTest run all my tests in a VS2013 solution?
I wanted to close this open question. My intention was to run all tests in one go from our Hudson CI server, so I wrote a basic console app to find and invoke MSTest on all DLL files inside the solution folder. This app is executed after the project is built in release mode.
string execId = null;
string className = null;
string testName = null;
string testResult = null;
string resultLine = null;
List<string> results = new List<string>();
XmlDocument resultsDoc = new XmlDocument();
XmlNode executionNode = null;
XmlNode testMethodNode = null;
// Define the test instance settings
Process testInstance = null;
ProcessStartInfo testInfo = new ProcessStartInfo()
{
UseShellExecute = false,
CreateNoWindow = true,
};
// Fetch project list from the disk
List<string> excluded = ConfigurationManager.AppSettings["ExcludedProjects"].Split(',').ToList();
DirectoryInfo assemblyPath = new DirectoryInfo(Assembly.GetExecutingAssembly().Location);
DirectoryInfo[] directories = assemblyPath.Parent.Parent.Parent.Parent.GetDirectories();
// Create a test worklist
List<string> worklist = directories.Where(t => !excluded.Contains(t.Name))
.Select(t => String.Format(ConfigurationManager.AppSettings["MSTestCommand"], t.FullName, t.Name))
.ToList();
// Start test execution
Console.WriteLine("Starting Execution...");
Console.WriteLine();
Console.WriteLine("Results Top Level Tests");
Console.WriteLine("------- ---------------");
// Remove any existing run results
if (File.Exists("UnitTests.trx"))
{
File.Delete("UnitTests.trx");
}
// Run each project in the worklist
foreach (string item in worklist)
{
testInfo.FileName = item;
testInstance = Process.Start(testInfo);
testInstance.WaitForExit();
if (File.Exists("UnitTests.trx"))
{
resultsDoc = new XmlDocument();
resultsDoc.Load("UnitTests.trx");
foreach (XmlNode result in resultsDoc.GetElementsByTagName("UnitTestResult"))
{
// Get the execution ID for the test
execId = result.Attributes["executionId"].Value;
// Find the execution and test method nodes
executionNode = resultsDoc.GetElementsByTagName("Execution")
.OfType<XmlNode>()
.Where(n => n.Attributes["id"] != null && n.Attributes["id"].Value.Equals(execId))
.First();
testMethodNode = executionNode.ParentNode
.ChildNodes
.OfType<XmlNode>()
.Where(n => n.Name.Equals("TestMethod"))
.First();
// Get the class name, test name and result
className = testMethodNode.Attributes["className"].Value.Split(',')[0];
testName = result.Attributes["testName"].Value;
testResult = result.Attributes["outcome"].Value;
resultLine = String.Format("{0} {1}.{2}", testResult, className, testName);
results.Add(resultLine);
Console.WriteLine(resultLine);
}
File.Delete("UnitTests.trx");
}
}
// Calculate passed / failed test case count
int passed = results.Where(r => r.StartsWith("Passed")).Count();
int failed = results.Where(r => r.StartsWith("Failed")).Count();
// Print the summary
Console.WriteLine();
Console.WriteLine("Summary");
Console.WriteLine("-------");
Console.WriteLine("Test Run {0}", failed > 0 ? "Failed." : "Passed.");
Console.WriteLine();
if (passed > 0)
Console.WriteLine("\tPassed {0,7}", passed);
if (failed > 0)
Console.WriteLine("\tFailed {0,7}", failed);
Console.WriteLine("\t--------------");
Console.WriteLine("\tTotal {0,8}", results.Count);
if (failed > 0)
Environment.Exit(-1);
else
Environment.Exit(0);
My App.config file:
<appSettings>
<add key="ExcludedProjects" value="UnitTests.Bootstrap,UnitTests.Utils" />
<add key="MSTestCommand" value="&quot;c:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe&quot; /testcontainer:&quot;{0}\bin\Release\{1}.dll&quot; /nologo /resultsfile:&quot;UnitTests.trx&quot;" />
</appSettings>
MSTest is kind of the "deprecated" test framework of Visual Studio 2013. It's still used for some types of tests, but many of the other test types now live in the new Agile Test Runner. SQL Server database unit tests seem to still be in the "some types" category and need to be executed through MSTest.exe.
The easiest way is to use the /testcontainer command-line switch and use a naming pattern for your test projects. That way you can quickly grab all the assemblies with a specific naming pattern and feed them to MSTest. A simple PowerShell command could be used to grab all the files that adhere to your pattern and feed them to the command line.
The vsmdi will still work in Visual Studio 2013, but the editor has been removed from the tool, plus there is no template for it anymore, so it's very hard to use. This is what Microsoft has to say about VSMDIs:
Caution
Test lists are no longer fully supported in Visual Studio 2012:
You cannot create new test lists.
You cannot run test list tests from within Visual Studio.
If you upgraded from Visual Studio 2010, and had test list in your solution, you can continue to edit it in Visual Studio.
You can continue to run test list using mstest.exe from the command line, as described above.
If you were using a test list in your build definition, you can continue to use it.
Basically, they're telling you to stop using this technique and to use a combination of TestCategory attributes to create easy-to-execute test groups.
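For example, a small sketch of that category-based grouping; the names are illustrative:
using Microsoft.VisualStudio.TestTools.UnitTesting;
[TestClass]
public class OrderDatabaseTests
{
    // Categories slice one big test container into separately executable groups.
    [TestMethod]
    [TestCategory("Database")]
    public void InsertOrder_WritesRow() { /* ... */ }
}
MSTest.exe can then select a group from a container with the /category switch, e.g. MsTest /testcontainer:UnitTests.dll /category:"Database".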
As you can add the testcontainer parameter multiple times, you can group them all into one call:
/testcontainer:[file name]  Load a file that contains tests. You can
                            specify this option more than once to
                            load multiple test files.
Examples:
/testcontainer:mytestproject.dll
/testcontainer:loadtest1.loadtest
MsTest /testcontainer:assemblyone.dll /testcontainer:assemblytwo.dll /testcontainer:assembly3.dll
to run MSTest on multiple assemblies at once. And do not (yet) make use of xUnit.net or NUnit, as their results cannot be combined into one report without switching to the new Agile test runner.
I don't know if this will help you or not, but I use Invoke-MsBuild. It's a PowerShell module that does exactly what you need. I don't know if you were looking for a PowerShell solution or not, but it works great!
It also has a sister script, Invoke-MsTest, for running MsTest instead of MsBuild.
Just for completeness: I often want to run tests as a console application, simply because I find it much easier to debug that way for some reason... Over the years I've created a few small test helpers to help me out; I suppose you can use them with your CI solution quite easily.
I understand this isn't your question entirely; however, since you're looking for a CI solution and mention Visual Studio, this should solve it quite nicely.
Just to let you know, my little framework is a bit bigger than this, but the things that are missing are quite easy to add. Basically what I left out is everything for logging and the fact that I test different assemblies in different app domains (because of possible DLL conflicts and state). More on that below.
One thing to notice is that I do not catch any exceptions in the code below. My main focus is making it easy to debug your application when troubleshooting. I have a separate (but similar) implementation for CI that basically adds try/catch blocks at the points marked in the comments below.
There's only one catch to this method: Visual Studio won't copy all the assemblies you reference; it'll only copy assemblies that you use in the code. A simple workaround is to introduce a method (which is never called) that uses one type from the DLL you're testing. That way, your assembly will be copied and everything will work nicely.
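For example (MyTestedLibrary.SomeClass is a placeholder for any type in the assembly under test):
// Never called; it exists only to create a hard reference so Visual Studio
// copies MyTestedLibrary.dll to the test runner's output folder.
private static Type ForceCopyOfTestedAssembly()
{
    return typeof(MyTestedLibrary.SomeClass);
}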
Here's the code:
static class TestHelpers
{
public static void TestAll(this object o)
{
foreach (MethodInfo meth in o.GetType().GetMethods().
Where((a) => a.GetCustomAttributes(true).
Any((b) => b.GetType().Name.Contains("TestMethod"))))
{
Console.WriteLine();
Console.WriteLine("--- Testing {0} ---", meth.Name);
Console.WriteLine();
// Add exception handling here for your CI solution.
var del = (Action)meth.CreateDelegate(typeof(Action), o);
del();
// NOTE: Don't use meth.Invoke(o, new object[0]); ! It'll eat your exception!
Console.WriteLine();
}
}
public static void TestAll(this Assembly ass)
{
HashSet<AssemblyName> visited = new HashSet<AssemblyName>();
Stack<Assembly> todo = new Stack<Assembly>();
todo.Push(ass);
HandleStack(visited, todo);
}
private static void HandleStack(HashSet<AssemblyName> visited, Stack<Assembly> todo)
{
while (todo.Count > 0)
{
var assembly = todo.Pop();
// Collect all assemblies that are related
foreach (var refass in assembly.GetReferencedAssemblies())
{
TryAdd(refass, visited, todo);
}
foreach (var type in assembly.GetTypes().
Where((a) => a.GetCustomAttributes(true).
Any((b) => b.GetType().Name.Contains("TestClass"))))
{
// Add exception handling here for your CI solution.
var obj = Activator.CreateInstance(type);
obj.TestAll();
}
}
}
public static void TestAll()
{
HashSet<AssemblyName> visited = new HashSet<AssemblyName>();
Stack<Assembly> todo = new Stack<Assembly>();
foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
{
TryAdd(assembly.GetName(), visited, todo);
}
HandleStack(visited, todo);
}
private static void TryAdd(AssemblyName ass, HashSet<AssemblyName> visited, Stack<Assembly> todo)
{
try
{
var reference = Assembly.Load(ass);
if (reference != null &&
!reference.GlobalAssemblyCache && // Ignore GAC
reference.FullName != null &&
!reference.FullName.StartsWith("ms") && // mscorlib and other microsoft stuff
!reference.FullName.StartsWith("vshost") && // visual studio host process
!reference.FullName.StartsWith("System")) // System libraries
{
if (visited.Add(reference.GetName())) // We don't want to test assemblies twice
{
todo.Push(reference); // Queue assembly for processing
}
}
}
catch
{
// Perhaps log something here... I currently don't because I don't care...
}
}
}
How to use this code:
You can simply call TestHelpers.TestAll() to test all assemblies, referenced assemblies, indirectly referenced assemblies, etc. This is probably what you want to do in CI.
You can call TestHelpers.TestAll(assembly) to test a single assembly with all referenced assemblies. This can be useful when you're splitting tests across multiple assemblies and/or when you're debugging.
You can call new MyObject().TestAll() to invoke all tests in a single object. This is particularly helpful when debugging.
If you're using appdomains like me, you should make a single appdomain for each DLL you load dynamically from a folder and use TestAll on that. Also, if you use a scratch folder, you might want to empty it between tests. That way, multiple test framework versions and multiple tests won't interact with each other. Particularly if your tests use state (e.g. static variables), this is a good practice. There are tons of examples for CreateInstanceAndUnwrap online that will help you out with this.
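A rough sketch of that isolation, assuming a hypothetical TestRunner class (deriving from MarshalByRefObject) that calls TestAll inside the new domain:
using System;
// One appdomain per dynamically loaded test DLL keeps static state and
// conflicting dependency versions from leaking between test assemblies.
var setup = new AppDomainSetup { ApplicationBase = @"C:\scratch\run1" }; // placeholder path
var domain = AppDomain.CreateDomain("TestDomain", null, setup);
try
{
    var runner = (TestRunner)domain.CreateInstanceAndUnwrap(
        "MyTestRunner",         // hypothetical assembly name
        "MyTests.TestRunner");  // hypothetical MarshalByRefObject type
    runner.Run(@"C:\scratch\run1\MyTests.dll");
}
finally
{
    AppDomain.Unload(domain);
}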
One thing to note is that I use a delegate instead of method.Invoke. This basically means your exception object won't be eaten by reflection, which means your debugger won't be broken. Also note that I check attributes by name, which means this will work with different frameworks, as long as the attribute names match.
HTH
Read the Caution part in your own link. It's no longer supported in VS 2012 the way you're doing it.
Maybe this updated version can help:
http://msdn.microsoft.com/en-us/library/ms182490.aspx
I'm looking for a method that lets me validate code and generate code as part of the build process, using Visual Studio 2010 (not Express) and MSBuild.
Background Validation:
I'm writing a RESTful web service using the WCF Web API. Inside the service class that represents the web service I have to define an endpoint, declaring additional parameters as plain text. When a parameter name inside the endpoint declaration differs from the parameter of the C# method, I get an error - unfortunately at run time when accessing the web service, not at compile time. So I thought it would be nice to analyze the web service class as part of the compile step for flaws like this, returning an error when something is not right.
Example:
[WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
public string MyMethod(string param1, string parameter2) {
// Accessing the web service now will result in an error,
// as there's no fitting method parameter named "param2".
return param1 + parameter2;
}
Also I'd like to enforce some naming rules, such as GET-Methods must start with the "Get" word. I believe this will help the service to remain much more maintainable when working with several colleagues.
Background Generation:
I will be using this REST web service in a few other projects, therefore I need to write a client to access this service. But I don't want to write a client for each of them, always adjusting whenever the service changes. I'd like the clients to be generated automatically, based upon the web service code files.
Previous approach:
So far I tried to use a T4 template using the DTE interface to parse the code file and validate it, or generate the client. This worked fine in Visual Studio when saving manually, but integrating it into the build process turned out not to work well, as the Visual Studio host is not available from MSBuild.
Any suggestion is welcome. :)
Instead of using DTE or some other means to parse the C# code, you could use reflection (with a reflection-only context) to examine the assembly after it's compiled. Using reflection is a more robust solution and probably faster too (especially if you use Mono.Cecil to do the reflecting). A rough sketch of the idea follows.
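For example; the assembly path and the naming rule below are placeholders, and a real reflection-only context would also need a ReflectionOnlyAssemblyResolve handler for dependencies:
using System;
using System.Linq;
using System.Reflection;
// Inspect the compiled assembly without executing any of its code and
// flag [WebGet] methods whose names break the "Get" naming rule.
var asm = Assembly.ReflectionOnlyLoadFrom(@"bin\Release\MyService.dll");
foreach (var type in asm.GetTypes())
{
    foreach (var method in type.GetMethods(BindingFlags.Public | BindingFlags.Instance))
    {
        // CustomAttributeData works in reflection-only contexts, where
        // GetCustomAttributes(true) would have to instantiate the attributes.
        bool hasWebGet = CustomAttributeData.GetCustomAttributes(method)
            .Any(a => a.Constructor.DeclaringType.Name == "WebGetAttribute");
        if (hasWebGet && !method.Name.StartsWith("Get"))
            Console.Error.WriteLine(string.Format(
                "{0}.{1} should start with 'Get'.", type.FullName, method.Name));
    }
}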
For the MSBuild integration, I would recommend writing a custom MSBuild task - it's fairly easy and more robust/elegant than writing a command-line utility that's executed by MSBuild.
This may be a long shot but still qualifies as "any suggestion" :)
You could compile the code, then run a post-build command: a tool that you'd have to write, which uses reflection to compare the parsed UriTemplate text with the method parameter names, catching errors and outputting them in a manner that MSBuild will pick up. Look at this link for information on how to write output so MSBuild will put the errors in the Visual Studio error list. The post-build tool could then delete the compiled assemblies if errors were found, thus "simulating" a failed build.
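For reference, the line format that MSBuild and the Visual Studio error list recognize is origin(line,column): error code: message; the file, code and message below are made-up examples:
// Anything written to standard output in this shape shows up in the error list:
Console.WriteLine(
    "MyService.cs(12,9): error WS0001: UriTemplate parameter 'param2' " +
    "has no matching method parameter.");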
Here's the SO link that led me to the MSBuild blog, just for reference.
HTH
For the enforcement side of things, custom FxCop rules would probably be a very good fit.
For the client code generation, there are quite a few possibilities. If you like the T4 approach, there is probably a way to get it working with MSBuild (but you would definitely need to provide a bit more detail regarding what isn't working now). If you want an alternative anyway, a reflection-based post-build tool is yet another way to go...
Here is a short, extremely ugly program that you can run over an assembly or group of assemblies (just pass the DLLs as arguments) to perform the WebGet UriTemplate check. If you don't pass anything, it runs on itself (and fails, appropriately, as it is its own unit test).
The program will print to stdout the names of the methods that are missing parameters and the names of the missing parameters, and if any are found, it will return a non-zero exit code (standard for a failing program), making it suitable as a post-build event. I am not responsible if your eyes bleed:
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using System.ServiceModel.Web;
namespace ConsoleApplication1
{
class Program
{
static int Main(string[] args)
{
var failList = new ConcurrentDictionary<MethodInfo, ISet<String>>();
var assembliesToRunOn = (args.Length == 0 ? new[] {Assembly.GetExecutingAssembly()} : args.Select(Assembly.LoadFrom)).ToList();
assembliesToRunOn.AsParallel().ForAll(
a => Array.ForEach(a.GetTypes(), t => Array.ForEach(t.GetMethods(BindingFlags.Public | BindingFlags.Instance),
mi =>
{
var miParams = mi.GetParameters();
var attribs = mi.GetCustomAttributes(typeof (WebGetAttribute), true);
if (attribs.Length <= 0) return;
var wga = (WebGetAttribute)attribs[0];
wga.UriTemplate
.Split('/')
.ToList()
.ForEach(tp =>
{
if (tp.StartsWith("{") && tp.EndsWith("}"))
{
var tpName = tp.Substring(1, tp.Length - 2);
if (!miParams.Any(pi => pi.Name == tpName))
{
failList.AddOrUpdate(mi, new HashSet<string> {tpName}, (miv, l) =>
{
l.Add(tpName);
return l;
});
}
}
});
})));
if (failList.Count == 0) return 0;
failList.ToList().ForEach(kvp => Console.Out.WriteLine("Method " + kvp.Key + " in type " + kvp.Key.DeclaringType + " is missing the following expected parameters: " + String.Join(", ", kvp.Value.ToArray())));
return failList.Count;
}
[WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
public void WillPass(String param1, String param2) { }
[WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
public void WillFail() { }
[WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
public void WillFail2(String param1) { }
}
}
I have a requirement to install multiple web setup projects (using VS2005 and ASP.Net/C#) into the same virtual folder. The projects share some assembly references (the file systems are all structured to use the same 'bin' folder), making deployment of changes to those assemblies problematic since the MS installer will only overwrite assemblies if the currently installed version is older than the one in the MSI.
I'm not suggesting that the pessimistic installation scheme is wrong - only that it creates a problem in the environment I've been given to work with. Since there are a sizable number of common assemblies and a significant number of developers who might change a common assembly but forget to update its version number, trying to manage versioning manually will eventually lead to massive confusion at install time.
On the flip side of this issue, it's also important not to spontaneously update version numbers and replace all common assemblies with every install, since that could (temporarily at least) obscure cases where actual changes were made.
That said, what I'm looking for is a means to update assembly version information (preferably using MSBuild) only in cases where the assembly constituents (code modules, resources, etc.) have actually changed.
I've found a few references that are at least partially pertinent here (AssemblyInfo task on MSDN) and here (looks similar to what I need, but more than two years old and without a clear solution).
My team also uses TFS version control, so an automated solution should probably include a means by which the AssemblyInfo can be checked out/in during the build.
Any help would be much appreciated.
Thanks in advance.
I cannot answer all your questions, as I don't have experience with TFS.
But I can recommend a better approach for updating your AssemblyInfo.cs files than the AssemblyInfo task. That task appears to just recreate a standard AssemblyInfo file from scratch, and it loses any custom portions you may have added.
For that reason, I suggest you look into the FileUpdate task, from the MSBuild Community Tasks project. It can look for specific content in a file and replace it, like this:
<FileUpdate
Files="$(WebDir)\Properties\AssemblyInfo.cs"
Regex="(\d+)\.(\d+)\.(\d+)\.(\d+)"
ReplacementText="$(Major).$(ServicePack).$(Build).$(Revision)"
Condition="'$(Configuration)' == 'Release'"
/>
There are several ways you can control the incrementing of the build number. Because I only want the build number to increment if the build is completely successful, I use a two-step method:
read a number from a text file (the only thing in the file is the number) and add 1 without changing the file;
as a final step in the build process, if everything succeeded, save the incremented number back to the text file.
There are tasks such as ReadLinesFromFile, that can help you with this, but I found it easiest to write a small custom task:
using System;
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;
namespace CredibleCustomBuildTasks
{
public class IncrementTask : Task
{
[Required]
public bool SaveChange { get; set; }
[Required]
public string IncrementFileName { get; set; }
[Output]
public int Increment { get; set; }
public override bool Execute()
{
if (File.Exists(IncrementFileName))
{
string lines = File.ReadAllText(IncrementFileName);
int result;
if(Int32.TryParse(lines, out result))
{
Increment = result + 1;
}
else
{
Log.LogError("Unable to parse integer '{0}' (contents of {1})", lines, IncrementFileName);
return false;
}
}
else
{
Increment = 1;
}
if (SaveChange)
{
File.Delete(IncrementFileName);
File.WriteAllText(IncrementFileName, Increment.ToString());
}
return true;
}
}
}
I use this before the FileUpdateTask to get the next build number:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="false">
<Output TaskParameter="Increment" PropertyName="Build" />
</IncrementTask>
and as my final step (before notifying others) in the build:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="true"
Condition="'$(Configuration)' == 'Release'" />
Your other question, of how to update the version number only when source code has changed, is highly dependent on how your build process interacts with your source control. Normally, checking in source file changes should initiate a continuous integration build. That is the one to use to update the relevant version number.
I have written a custom task; you can refer to the code below. It creates a utility to which you can pass the AssemblyInfo path and the major, minor and build numbers. You can modify it to get the revision number. Since in my case that part was handled by the developer, I search for the whole AssemblyVersion string and replace it.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;
namespace UpdateVersion
{
class SetVersion
{
static void Main(string[] args)
{
String FilePath = args[0];
String MajVersion=args[1];
String MinVersion = args[2];
String BuildNumber = args[3];
string RevisionNumber = null;
StreamReader Reader = File.OpenText(FilePath);
string contents = Reader.ReadToEnd();
Reader.Close();
MatchCollection match = Regex.Matches(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", RegexOptions.IgnoreCase);
if (match.Count > 0)
{
string strRevisionNumber = match[0].Value;
RevisionNumber = strRevisionNumber.Substring(strRevisionNumber.LastIndexOf(".") + 1, (strRevisionNumber.LastIndexOf("\"")-1) - strRevisionNumber.LastIndexOf("."));
String replaceWithText = String.Format("[assembly: AssemblyVersion(\"{0}.{1}.{2}.{3}\")]", MajVersion, MinVersion, BuildNumber, RevisionNumber);
string newText = Regex.Replace(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", replaceWithText);
StreamWriter writer = new StreamWriter(FilePath, false);
writer.Write(newText);
writer.Close();
}
else
{
Console.WriteLine("No matching values found");
}
}
}
}
I hate to say this, but it seems that you may be doing it wrong. It is much easier if you generate the assembly versions on the fly instead of trying to patch them.
Take a look at https://sbarnea.com/articles/easy-windows-build-versioning/
Why do I think you are doing it wrong?
* A build should not modify the version number.
* If you build the same changeset twice, you should get the same build numbers.
* If you put the build number inside what Microsoft calls the build number (proper naming would be PATCH level), you will eventually reach the 65535 limitation.