I'm looking for a method that lets me validate code and generate code as part of the build process, using Visual Studio 2010 (not Express) and MSBuild.
Background Validation:
I'm writing a RESTful web service using the WCF Web API. Inside the service class that represents the web service I have to define an endpoint, declaring additional parameters as plain text. When a parameter name inside the endpoint declaration differs from the parameter of the C# method, I get an error - unfortunately at run time when accessing the web service, not at compile time. So I thought it would be nice to analyze the web service class as part of the compile step for flaws like this, returning an error when something is not right.
Example:
[WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
public string MyMethod(string param1, string parameter2) {
    // Accessing the web service now will result in an error,
    // as there's no fitting method parameter named "param2".
}
Also I'd like to enforce some naming rules, such as: GET methods must start with the word "Get". I believe this will help the service remain much more maintainable when working with several colleagues.
Background Generation:
I will be using this REST web service in a few other projects, therefore I need to write a client to access this service. But I don't want to write a client for each of them, adjusting it whenever the service changes. I'd like the clients to be generated automatically, based upon the web service code files.
Previous approach:
So far I tried to use a T4 template using the DTE interface to parse the code file and validate it, or generate the client. This worked fine in Visual Studio when saving manually, but integrating it into the build process turned out not to work well, as the Visual Studio host is not available under MSBuild.
Any suggestion is welcome. :)
Instead of using DTE or some other means to parse the C# code, you could use reflection (with a reflection-only context) to examine the assembly after it's compiled. Using reflection is a more robust solution and probably faster too (especially if you use Mono.Cecil to do the reflecting).
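For illustration, a reflection-only inspection pass might look like the sketch below. The assembly path is an assumption, and note that in a reflection-only context attributes cannot be instantiated, so they have to be read through CustomAttributeData:
using System;
using System.Linq;
using System.Reflection;

static class ReflectionOnlyCheck
{
    static void Inspect(string assemblyPath)
    {
        // Load for inspection only; no code from the assembly can execute.
        // Dependencies may require a ReflectionOnlyAssemblyResolve handler.
        var asm = Assembly.ReflectionOnlyLoadFrom(assemblyPath);

        foreach (var method in asm.GetTypes().SelectMany(t => t.GetMethods()))
        foreach (var cad in CustomAttributeData.GetCustomAttributes(method))
        {
            if (cad.Constructor.DeclaringType.Name != "WebGetAttribute") continue;

            // In a reflection-only context the attribute cannot be instantiated,
            // so read UriTemplate from the attribute's named arguments instead.
            var uriTemplate = cad.NamedArguments
                .Where(na => na.MemberInfo.Name == "UriTemplate")
                .Select(na => (string)na.TypedValue.Value)
                .FirstOrDefault();

            // ...compare the {placeholders} in uriTemplate against method.GetParameters() here.
        }
    }
}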
For the MSBuild integration I would recommend writing a custom MSBuild task - it's fairly easy and more robust/elegant than writing a command line utility that's executed by MSBuild.
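A custom task is little more than a class deriving from Task; here is a minimal skeleton (the task and property names are my own invention, not an existing API). You would register it in the project file with a UsingTask element and call it from an AfterBuild target:
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// Hypothetical task; the reflection-based validation logic itself is left out here.
public class ValidateUriTemplatesTask : Task
{
    [Required]
    public string AssemblyPath { get; set; } // compiled service assembly to inspect

    public override bool Execute()
    {
        // Run the checks on AssemblyPath and report findings;
        // errors logged this way land in the Visual Studio error list.
        // Log.LogError("UriTemplate parameter '{0}' has no matching method parameter.", ...);

        // Returning false fails the build.
        return !Log.HasLoggedErrors;
    }
}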
This may be a long shot but still qualifies as "any suggestion" :)
You could compile the code, then run a post-build command which would be a tool that you'd have to write, which uses reflection to compare the parsed UriTemplate text with the method parameter names, catching errors and outputting them in a manner that MSBuild will pick up. Look at this link for information on how to output messages so MSBuild will put the errors in the Visual Studio error list. The post-build tool could then delete the compiled assemblies if errors were found, thus "simulating" a failed build.
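The shape MSBuild and Visual Studio recognize is the usual origin(line,col): error code: text convention; for example (the WG001 error code is made up):
// Written to stdout by the post-build tool; MSBuild parses this shape into a build error.
Console.WriteLine(
    "MyService.cs(12,5): error WG001: UriTemplate parameter 'param2' " +
    "has no matching parameter on method MyMethod");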
Here's the SO link that led me to the MSBuild blog too, just for reference.
HTH
For the enforcement side of things, custom FxCop rules would probably be a very good fit.
For the client code generation, there are quite a few possibilities. If you like the T4 approach, there is probably a way to get it working with MSBuild (but you would definitely need to provide a bit more detail regarding what isn't working now). If you want an alternative anyway, a reflection-based post-build tool is yet another way to go...
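As a sketch of the enforcement part from the question (GET methods must start with "Get"), such a post-build tool could contain something like this; the helper method is hypothetical and matches the attribute by name:
using System;
using System.Linq;
using System.Reflection;

// Hypothetical helper to drop into the post-build tool's Program class.
static int CheckNamingRules(string assemblyPath)
{
    var offenders = Assembly.LoadFrom(assemblyPath)
        .GetTypes()
        .SelectMany(t => t.GetMethods(BindingFlags.Public | BindingFlags.Instance))
        .Where(m => m.GetCustomAttributes(true).Any(a => a.GetType().Name == "WebGetAttribute"))
        .Where(m => !m.Name.StartsWith("Get"))
        .ToList();

    offenders.ForEach(m => Console.Error.WriteLine(
        "Naming rule violation: " + m.DeclaringType + "." + m.Name + " should start with 'Get'."));

    return offenders.Count; // non-zero exit code fails the post-build step
}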
Here is a short, extremely ugly program that you can run over an assembly or group of assemblies (just pass the DLLs as arguments) to perform the WebGet UriTemplate check. If you don't pass anything, it runs on itself (and fails, appropriately, as it is its own unit test).
The program prints to stdout the names of the methods that are missing parameters, together with the names of the missing parameters, and if any are found, it returns a non-zero exit code (standard for a failing program), making it suitable as a post-build event. I am not responsible if your eyes bleed:
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using System.ServiceModel.Web;
namespace ConsoleApplication1
{
    class Program
    {
        static int Main(string[] args)
        {
            var failList = new ConcurrentDictionary<MethodInfo, ISet<String>>();
            var assembliesToRunOn = (args.Length == 0
                ? new[] { Assembly.GetExecutingAssembly() }
                : args.Select(Assembly.LoadFrom)).ToList();
            assembliesToRunOn.AsParallel().ForAll(
                a => Array.ForEach(a.GetTypes(), t => Array.ForEach(t.GetMethods(BindingFlags.Public | BindingFlags.Instance),
                    mi =>
                    {
                        var miParams = mi.GetParameters();
                        var attribs = mi.GetCustomAttributes(typeof(WebGetAttribute), true);
                        if (attribs.Length <= 0) return;
                        var wga = (WebGetAttribute)attribs[0];
                        wga.UriTemplate
                            .Split('/')
                            .ToList()
                            .ForEach(tp =>
                            {
                                if (tp.StartsWith("{") && tp.EndsWith("}"))
                                {
                                    var tpName = tp.Substring(1, tp.Length - 2);
                                    if (!miParams.Any(pi => pi.Name == tpName))
                                    {
                                        failList.AddOrUpdate(mi, new HashSet<string> { tpName }, (miv, l) =>
                                        {
                                            l.Add(tpName);
                                            return l;
                                        });
                                    }
                                }
                            });
                    })));
            if (failList.Count == 0) return 0;
            failList.ToList().ForEach(kvp => Console.Out.WriteLine(
                "Method " + kvp.Key + " in type " + kvp.Key.DeclaringType +
                " is missing the following expected parameters: " + String.Join(", ", kvp.Value.ToArray())));
            return failList.Count;
        }

        [WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
        public void WillPass(String param1, String param2) { }

        [WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
        public void WillFail() { }

        [WebGet(UriTemplate = "Endpoint/{param1}/{param2}")]
        public void WillFail2(String param1) { }
    }
}
Intro
I am looking for a more customized solution for translating my app. I will be using Humanizer and Smart.Format after obtaining entries. The problem is to define the keys used to obtain those entries in the first place.
Requirements
The requirements are:
1. Language keys must be defined in-code, preferably near the place where they are used
2. Language keys must contain default English values
3. All language keys must be listed (XML, CSV, JSON, anything) after building the app suite
4. Language entries must be provided from an external source (like a JSON file), without the need for any kind of recompilation
5. The app may contain multiple executables, shared libraries, etc., all of them C# apps
Discarded solutions
First, the things I discarded:
Built-in C# Resources.dll; They violate (1) and (4)
External file with keys. Violates (1)
My idea for handling the problem
Now, my idea for the solution looks like this (and is inspired by C++ GetText):
There is a template class which contains keys:
private sealed class Module1Keys : LocalizationKeys<Module1Keys>
{
    public static readonly LocalizationKey HelloWorld = DefineKey("/foo", "Hello World!");
    public static readonly LocalizationKey HelloWorld2 = DefineKey("/bar", "Hello World2!");
}
And the class LocalizationKeys contains a static method that actually registers keys in a simple collection:
public abstract class LocalizationKeys<T> where T : LocalizationKeys<T>
{
    protected static LocalizationKey DefineKey(string path, string english)
    {
        var ret = new LocalizationKey(typeof(T), path, english);
        // The following registers the localization key at runtime:
        Localization.Instance.RegisterLocalizableKey(ret);
        return ret;
    }
}
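For completeness, a minimal sketch of the supporting types assumed above (their exact shape doesn't matter for the problem):
using System;
using System.Collections.Generic;

public sealed class LocalizationKey
{
    public Type Owner { get; private set; }
    public string Path { get; private set; }
    public string English { get; private set; }

    public LocalizationKey(Type owner, string path, string english)
    {
        Owner = owner;
        Path = path;
        English = english;
    }
}

public sealed class Localization
{
    public static readonly Localization Instance = new Localization();
    private readonly List<LocalizationKey> _keys = new List<LocalizationKey>();

    // Collects every key defined via DefineKey so the full list is available at runtime.
    public void RegisterLocalizableKey(LocalizationKey key)
    {
        _keys.Add(key);
    }
}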
Problem
The only thing left to handle in this approach is listing the localizable keys during build... which is where I hit the wall. It is very easy to list them at runtime, but I cannot run the code at build time (particularly since it may be built as a shared library).
Maybe I am overthinking this and there is a better, cleaner solution - I don't need to stick with this approach, but Googling around has not yielded anything better...
Nailed it. In GetText times we had to resort to manually parsing code...
... but now, with C#, we have Roslyn and its CodeAnalysis API.
Solution
Wire up a custom console build tool that references the Microsoft.CodeAnalysis NuGet package and contains code like the following (the sourcePath and references inputs stand in for your own project's files and metadata references):
// Assumed usings: System.IO, System.Linq, Microsoft.CodeAnalysis,
// Microsoft.CodeAnalysis.CSharp, Microsoft.CodeAnalysis.CSharp.Syntax
var tree = CSharpSyntaxTree.ParseText(File.ReadAllText(sourcePath));
var compilation = CSharpCompilation.Create("LocalizationScan", new[] { tree }, references);
var root = tree.GetRoot();
var model = compilation.GetSemanticModel(tree);
var methods = root.DescendantNodes().OfType<InvocationExpressionSyntax>();
foreach (var method in methods)
{
    if (model.GetSymbolInfo(method).Symbol is IMethodSymbol symbol &&
        symbol.ContainingNamespace.Name == "MyProject" &&
        symbol.ContainingType.Name == "LocalizationKeys" &&
        symbol.Name == "DefineKey")
    {
        var key = method.ArgumentList.Arguments.FirstOrDefault();
        var eng = method.ArgumentList.Arguments.Skip(1).FirstOrDefault();
        if (key?.Expression is LiteralExpressionSyntax literalKey &&
            eng?.Expression is LiteralExpressionSyntax literalEng)
        {
            // "/foo" -> "Hello World!"
            // "/bar" -> "Hello World2!"
            Console.WriteLine(literalKey + " -> " + literalEng);
        }
        else
        {
            // Bonus: detect violation of the key definition rule. It is not a literal!
        }
    }
}
Compile this console tool as an executable and add it as a post-build step. Profit.
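To also satisfy requirement (3), the tool could write the collected pairs to a file instead of the console; for instance with Json.NET (the library choice is just an example, not part of the original answer):
// Assumes the Json.NET package (using Newtonsoft.Json;).
var keys = new Dictionary<string, string>();
// ...inside the loop shown above:
// keys[literalKey.Token.ValueText] = literalEng.Token.ValueText;
File.WriteAllText("keys.json", JsonConvert.SerializeObject(keys, Formatting.Indented));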
It appears that I can run all my tests in the solution in one go from the command line using MSTest if I use the /testmetadata flag as described here: http://msdn.microsoft.com/en-us/library/ms182487.aspx
I'm running SQL Server DB Unit tests in Visual Studio 2013, wherein I don't seem to have a vsmdi file at all, and I'm unable to find a way to add one either. I tried creating a testsettings file, but it doesn't discover any tests (shows "No tests to run") when I invoke MSTest.
Is there a way I can have MSTest run all my tests in a VS2013 solution?
I wanted to close this open question. My intention was to run all tests in one go from our Hudson CI server, so I wrote a basic console app to find and invoke MSTest on all DLL files inside the solution folder. This app is executed after the project is built in release mode.
string execId = null;
string className = null;
string testName = null;
string testResult = null;
string resultLine = null;
List<string> results = new List<string>();
XmlDocument resultsDoc = new XmlDocument();
XmlNode executionNode = null;
XmlNode testMethodNode = null;
// Define the test instance settings
Process testInstance = null;
ProcessStartInfo testInfo = new ProcessStartInfo()
{
    UseShellExecute = false,
    CreateNoWindow = true,
};
// Fetch project list from the disk
List<string> excluded = ConfigurationManager.AppSettings["ExcludedProjects"].Split(',').ToList();
DirectoryInfo assemblyPath = new DirectoryInfo(Assembly.GetExecutingAssembly().Location);
DirectoryInfo[] directories = assemblyPath.Parent.Parent.Parent.Parent.GetDirectories();
// Create a test worklist
List<string> worklist = directories.Where(t => !excluded.Contains(t.Name))
                                   .Select(t => String.Format(ConfigurationManager.AppSettings["MSTestCommand"], t.FullName, t.Name))
                                   .ToList();
// Start test execution
Console.WriteLine("Starting Execution...");
Console.WriteLine();
Console.WriteLine("Results Top Level Tests");
Console.WriteLine("------- ---------------");
// Remove any existing run results
if (File.Exists("UnitTests.trx"))
{
    File.Delete("UnitTests.trx");
}
// Run each project in the worklist
foreach (string item in worklist)
{
    testInfo.FileName = item;
    testInstance = Process.Start(testInfo);
    testInstance.WaitForExit();
    if (File.Exists("UnitTests.trx"))
    {
        resultsDoc = new XmlDocument();
        resultsDoc.Load("UnitTests.trx");
        foreach (XmlNode result in resultsDoc.GetElementsByTagName("UnitTestResult"))
        {
            // Get the execution ID for the test
            execId = result.Attributes["executionId"].Value;
            // Find the execution and test method nodes
            executionNode = resultsDoc.GetElementsByTagName("Execution")
                                      .OfType<XmlNode>()
                                      .Where(n => n.Attributes["id"] != null && n.Attributes["id"].Value.Equals(execId))
                                      .First();
            testMethodNode = executionNode.ParentNode
                                          .ChildNodes
                                          .OfType<XmlNode>()
                                          .Where(n => n.Name.Equals("TestMethod"))
                                          .First();
            // Get the class name, test name and result
            className = testMethodNode.Attributes["className"].Value.Split(',')[0];
            testName = result.Attributes["testName"].Value;
            testResult = result.Attributes["outcome"].Value;
            resultLine = String.Format("{0} {1}.{2}", testResult, className, testName);
            results.Add(resultLine);
            Console.WriteLine(resultLine);
        }
        File.Delete("UnitTests.trx");
    }
}
// Calculate passed / failed test case count
int passed = results.Where(r => r.StartsWith("Passed")).Count();
int failed = results.Where(r => r.StartsWith("Failed")).Count();
// Print the summary
Console.WriteLine();
Console.WriteLine("Summary");
Console.WriteLine("-------");
Console.WriteLine("Test Run {0}", failed > 0 ? "Failed." : "Passed.");
Console.WriteLine();
if (passed > 0)
    Console.WriteLine("\tPassed {0,7}", passed);
if (failed > 0)
    Console.WriteLine("\tFailed {0,7}", failed);
Console.WriteLine("\t--------------");
Console.WriteLine("\tTotal {0,8}", results.Count);
if (failed > 0)
    Environment.Exit(-1);
else
    Environment.Exit(0);
My App.config file:
<appSettings>
  <add key="ExcludedProjects" value="UnitTests.Bootstrap,UnitTests.Utils" />
  <add key="MSTestCommand" value="&quot;c:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe&quot; /testcontainer:&quot;{0}\bin\Release\{1}.dll&quot; /nologo /resultsfile:&quot;UnitTests.trx&quot;" />
</appSettings>
MSTest is kind of the "deprecated" test framework of Visual Studio 2013. It's still used for some types of tests, but many of the other test types now live in the new Agile test runner. SQL Server Database Unit Tests still seem to be in the "some types" category and need to be executed through MSTest.exe.
The easiest way is to use the /testcontainer command-line switch and use a naming pattern for your test projects. That way you can quickly grab all the assemblies with a specific naming pattern and feed them to MSTest. A simple PowerShell command could be used to grab all the files that adhere to your pattern and feed them to the command line.
The vsmdi file will still work in Visual Studio 2013, but the editor has been removed from the tool, plus there is no template for it anymore, so it's very hard to use. This is what Microsoft has to say about VSMDIs:
Caution
Test lists are no longer fully supported in Visual Studio 2012:
You cannot create new test lists.
You cannot run test list tests from within Visual Studio.
If you upgraded from Visual Studio 2010, and had test list in your solution, you can continue to edit it in Visual Studio.
You can continue to run test list using mstest.exe from the command line, as described above.
If you were using a test list in your build definition, you can continue to use it.
Basically they're telling you to stop using this technique and use a combination of TestCategory attributes to create easy-to-execute test groups.
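For illustration, a categorized test looks like this (the category name is just an example); MSTest.exe can then filter on it with its /category switch:
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerDatabaseTests
{
    [TestMethod]
    [TestCategory("Database")] // group tests by category instead of by test list
    public void InsertCustomer_PersistsRow()
    {
        // ...
    }
}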
As you can add multiple test container parameters, you can group them all into one call:
/testcontainer:[file name]    Load a file that contains tests. You can
                              specify this option more than once to
                              load multiple test files.
Examples:
/testcontainer:mytestproject.dll
/testcontainer:loadtest1.loadtest
MsTest /testcontainer:assemblyone.dll /testcontainer:assemblytwo.dll /testcontainer:assembly3.dll
This runs MSTest on multiple assemblies at once. Do not (yet) make use of xUnit.net or NUnit, as these cannot be combined into one report without switching to the new Agile test runner.
I don't know if this will help you or not, but I use Invoke-MsBuild. It's a PowerShell module that does exactly what you need. I don't know if you were looking for a PowerShell solution or not, but it works great!
It also has a sister script, Invoke-MsTest for running MsTest instead of MsBuild.
Just for completeness: I often want to run tests as a console application, simply because I find it much easier to debug this way for some reason... Over the years I've created a few small test helpers, and I suppose you can use them with your CI solution quite easily.
I understand this isn't your question entirely; however, since you're looking for a CI solution and mention Visual Studio, this should solve it quite nicely.
Just to let you know, my little framework is a bit bigger than this, but the missing pieces are quite easy to add. Basically, what I left out is everything for logging, and the fact that I test different assemblies in different app domains (because of possible DLL conflicts and state). More on that below.
One thing to notice is that I do not catch any exceptions in the process below. My main focus is making it easy to debug your application when troubleshooting. I have a separate (but similar) implementation for CI that basically adds try/catches on the comment points below.
There's only one catch to this method: Visual Studio won't copy all the assemblies you reference; it'll only copy assemblies that you use in the code. A simple workaround is to introduce a method (which is never called) that uses one type from the DLL you're testing, as shown below. That way, your assembly will be copied and everything will work nicely.
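For example (the referenced type is a placeholder for any type in the assembly under test):
// Never called; its only purpose is to force a hard reference so that
// Visual Studio copies the test assembly to the output folder.
static void ForceAssemblyCopy()
{
    var dummy = typeof(MyTests.SomeTestClass); // placeholder type
}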
Here's the code:
static class TestHelpers
{
    public static void TestAll(this object o)
    {
        foreach (MethodInfo meth in o.GetType().GetMethods().
            Where((a) => a.GetCustomAttributes(true).
                Any((b) => b.GetType().Name.Contains("TestMethod"))))
        {
            Console.WriteLine();
            Console.WriteLine("--- Testing {0} ---", meth.Name);
            Console.WriteLine();

            // Add exception handling here for your CI solution.
            var del = (Action)meth.CreateDelegate(typeof(Action), o);
            del();
            // NOTE: Don't use meth.Invoke(o, new object[0]); ! It'll eat your exception!

            Console.WriteLine();
        }
    }

    public static void TestAll(this Assembly ass)
    {
        HashSet<AssemblyName> visited = new HashSet<AssemblyName>();
        Stack<Assembly> todo = new Stack<Assembly>();
        todo.Push(ass);
        HandleStack(visited, todo);
    }

    private static void HandleStack(HashSet<AssemblyName> visited, Stack<Assembly> todo)
    {
        while (todo.Count > 0)
        {
            var assembly = todo.Pop();

            // Collect all assemblies that are related
            foreach (var refass in assembly.GetReferencedAssemblies())
            {
                TryAdd(refass, visited, todo);
            }

            foreach (var type in assembly.GetTypes().
                Where((a) => a.GetCustomAttributes(true).
                    Any((b) => b.GetType().Name.Contains("TestClass"))))
            {
                // Add exception handling here for your CI solution.
                var obj = Activator.CreateInstance(type);
                obj.TestAll();
            }
        }
    }

    public static void TestAll()
    {
        HashSet<AssemblyName> visited = new HashSet<AssemblyName>();
        Stack<Assembly> todo = new Stack<Assembly>();
        foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
        {
            TryAdd(assembly.GetName(), visited, todo);
        }
        HandleStack(visited, todo);
    }

    private static void TryAdd(AssemblyName ass, HashSet<AssemblyName> visited, Stack<Assembly> todo)
    {
        try
        {
            var reference = Assembly.Load(ass);
            if (reference != null &&
                !reference.GlobalAssemblyCache &&           // Ignore GAC
                reference.FullName != null &&
                !reference.FullName.StartsWith("ms") &&     // mscorlib and other Microsoft stuff
                !reference.FullName.StartsWith("vshost") && // Visual Studio host process
                !reference.FullName.StartsWith("System"))   // System libraries
            {
                if (visited.Add(reference.GetName()))       // We don't want to test assemblies twice
                {
                    todo.Push(reference);                   // Queue assembly for processing
                }
            }
        }
        catch
        {
            // Perhaps log something here... I currently don't because I don't care...
        }
    }
}
How to use this code:
You can simply call TestHelpers.TestAll() to test all assemblies, referenced assemblies, indirectly referenced assemblies, etc. This is probably what you want to do in CI.
You can call TestHelpers.TestAll(assembly) to test a single assembly with all referenced assemblies. This can be useful when you're splitting tests across multiple assemblies and/or when you're debugging.
You can call new MyObject().TestAll() to invoke all tests in a single object. This is particularly helpful when debugging.
If you're using app domains like me, you should create a single app domain for each DLL you load dynamically from a folder and use TestAll on that. Also, if you use a scratch folder, you might want to empty it between tests. That way, multiple test framework versions and multiple tests won't interact with each other. Particularly if your tests use state (e.g. static variables), this might be a good practice. There are tons of examples for CreateInstanceAndUnwrap online that will help you out with this; a minimal sketch follows below.
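A bare-bones sketch of that app domain pattern, with placeholder names, might look like this:
// Placeholder runner; must derive from MarshalByRefObject to cross the domain boundary.
public class TestRunner : MarshalByRefObject
{
    public void RunAll(string assemblyPath)
    {
        Assembly.LoadFrom(assemblyPath).TestAll(); // reuses the TestAll(this Assembly) helper above
    }
}

// Usage: one throwaway AppDomain per test assembly, unloaded afterwards.
var domain = AppDomain.CreateDomain("TestDomain");
try
{
    var runner = (TestRunner)domain.CreateInstanceAndUnwrap(
        typeof(TestRunner).Assembly.FullName,
        typeof(TestRunner).FullName);
    runner.RunAll(@"C:\Tests\Scratch\MyTests.dll"); // path is an example
}
finally
{
    AppDomain.Unload(domain); // throws away static state between runs
}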
One thing to note is that I use a delegate instead of the method.Invoke. This basically means your exception object won't be eaten by Reflection, which means your debugger won't be broken. Also note that I check attributes by name, which means this will work with different frameworks - as long as the attribute names match.
HTH
Read the Caution part in your own link. It's no longer supported in VS 2012 the way you are doing it.
Maybe this update version can help:
http://msdn.microsoft.com/en-us/library/ms182490.aspx
We are using TeamCity for continuous integration, our source control is Git, and we have 1 major repository that contains multiple .sln files (around 10).
All in all, this repository has about 100-200 C# projects.
Upon a push to the master repository, TeamCity triggers a build that will compile all projects in the repository.
I'd like to be able to tell which projects were actually affected by a particular commit, and thus publish only those projects' outputs as artifacts of the current build.
For this, I've designed a solution to integrate NDepend into our build process and generate a diff report between the current and latest build outputs.
The outputs that were changed/added will be published as the build outputs.
I have little experience with NDepend; from what I've seen, all of its true power comes from the query language that is baked into it.
I am wondering how (if possible) I can achieve the following:
Diff between a folder containing previous build's outputs and current folder of build outputs.
Have NDepend generate a report in a consumable format so i can determine the files that need to be copied.
Is this scenario possible? How easy/hard would that be?
So the simple answer is to do the Reporting Code Diff way, as explained in this documentation. The problem with this basic answer is that it presupposes two NDepend projects that always refer to the same two sets of assemblies.
Certainly, the number and names of assemblies vary in your context, so we need to build the two projects (old/new) on the fly and analyze them through NDepend.API.
Here is the NDepend.API source code for that. For an It-Just-Works experience, in the Power Tools source code (in $NDependInstallDir$\NDepend.PowerTools.SourceCode\NDepend.PowerTools.sln) just call the FoldersDiff.Main() method after the AssemblyResolve registration call, in Program.cs.
...
AppDomain.CurrentDomain.AssemblyResolve += AssemblyResolverHelper.AssemblyResolveHandler;
FoldersDiff.Main();
...
Here is the source code that harnesses NDepend.API.
Note that so much more can be done through the two codeBase objects and the compareContext object. Instead of just showing the three lists of assemblies (added/removed/code was changed), you could show API breaking changes, new methods and types added, modified classes and methods, code quality regressions... For that, just look at the default code rules concerning diff, which are based on the same NDepend.CodeModel API.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using NDepend;
using NDepend.Analysis;
using NDepend.CodeModel;
using NDepend.Path;
using NDepend.Project;
class FoldersDiff {

    private static readonly NDependServicesProvider s_NDependServicesProvider = new NDependServicesProvider();

    internal static void Main() {
        var dirOld = @"C:\MyProduct\OldAssembliesDir".ToAbsoluteDirectoryPath();
        var dirNew = @"C:\MyProduct\NewAssembliesDir".ToAbsoluteDirectoryPath();

        Console.WriteLine("Analyzing assemblies in " + dirOld.ToString());
        var codeBaseOld = GetCodeBaseFromAsmInDir(dirOld, TemporaryProjectMode.TemporaryOlder);

        Console.WriteLine("Analyzing assemblies in " + dirNew.ToString());
        var codeBaseNew = GetCodeBaseFromAsmInDir(dirNew, TemporaryProjectMode.TemporaryNewer);

        var compareContext = codeBaseNew.CreateCompareContextWithOlder(codeBaseOld);

        // So much more can be done by exploring fine-grained diff in codeBases and compareContext
        Dump("Added assemblies", codeBaseNew.Assemblies.Where(compareContext.WasAdded));
        Dump("Removed assemblies", codeBaseOld.Assemblies.Where(compareContext.WasRemoved));
        Dump("Assemblies with modified code", codeBaseNew.Assemblies.Where(compareContext.CodeWasChanged));

        Console.Read();
    }

    internal static ICodeBase GetCodeBaseFromAsmInDir(IAbsoluteDirectoryPath dir, TemporaryProjectMode temporaryProjectMode) {
        Debug.Assert(dir.Exists);
        var dotNetManager = s_NDependServicesProvider.DotNetManager;
        var assembliesPath = dir.ChildrenFilesPath.Where(dotNetManager.IsAssembly).ToArray();
        Debug.Assert(assembliesPath.Length > 0); // Make sure we found assemblies

        var projectManager = s_NDependServicesProvider.ProjectManager;
        IProject project = projectManager.CreateTemporaryProject(assembliesPath, temporaryProjectMode);

        // In PowerTool context, better call:
        // var analysisResult = ProjectAnalysisUtils.RunAnalysisShowProgressOnConsole(project);
        var analysisResult = project.RunAnalysis();

        return analysisResult.CodeBase;
    }

    internal static void Dump(string title, IEnumerable<IAssembly> assemblies) {
        Debug.Assert(!string.IsNullOrEmpty(title));
        Debug.Assert(assemblies != null);
        Console.WriteLine(title);
        foreach (var @assembly in assemblies) {
            Console.WriteLine("  " + @assembly.Name);
        }
    }
}
IronRuby and VS2010 noob question:
I'm trying to do a spike to test the feasibility of interop between a C# project and an existing RubyGem rather than re-invent that particular wheel in .NET. I've downloaded and installed IronRuby and the RubyGems package, as well as the gem I'd ultimately like to use.
Running .rb files or working in the iirb Ruby console works without problems. I can load both the RubyGems package and the gem itself and use it, so, at least for that use case, my environment is set up correctly.
However, when I try to do the same sort of thing from within a C# (4.0) console app, it complains about the very first line:
require 'RubyGems'
With the error:
no such file to load -- rubygems
My Console app looks like this:
using System;
using IronRuby;

namespace RubyInteropSpike
{
    class Program
    {
        static void Main(string[] args)
        {
            var runtime = Ruby.CreateRuntime();
            var scope = runtime.ExecuteFile("test.rb");
            Console.ReadKey();
        }
    }
}
Removing the dependencies and just doing some basic self-contained Ruby stuff works fine, but including any kind of 'require' statement seems to cause it to fail.
I'm hoping that I just need to pass some additional information (paths, etc) to the ruby runtime when I create it, and really hoping that this isn't some kind of limitation, because that would make me sad.
Short answer: yes, this will work how you want it to. You need to use the engine's SetSearchPaths method to do what you wish.
A more complete example
(Assumes you loaded your IronRuby to C:\IronRubyRC2 as the root install dir)
var engine = IronRuby.Ruby.CreateEngine();
engine.SetSearchPaths(new[] {
    @"C:\IronRubyRC2\Lib\ironruby",
    @"C:\IronRubyRC2\Lib\ruby\1.8",
    @"C:\IronRubyRC2\Lib\ruby\site_ruby\1.8"
});
engine.Execute("require 'rubygems'"); // without SetSearchPaths, you get a LoadError

/*
engine.Execute("require 'restclient'"); // install through igem, then check with igem list
engine.Execute("puts RestClient.get('http://localhost/').body");
*/

Console.ReadKey();
I have a requirement to install multiple web setup projects (using VS2005 and ASP.NET/C#) into the same virtual folder. The projects share some assembly references (the file systems are all structured to use the same 'bin' folder), making deployment of changes to those assemblies problematic, since the MS installer will only overwrite assemblies if the currently installed version is older than the one in the MSI.
I'm not suggesting that the pessimistic installation scheme is wrong - only that it creates a problem in the environment I've been given to work with. Since there are a sizable number of common assemblies and a significant number of developers who might change a common assembly but forget to update its version number, trying to manage versioning manually will eventually lead to massive confusion at install time.
On the flip side of this issue, it's also important not to spontaneously update version numbers and replace all common assemblies with every install, since that could (temporarily at least) obscure cases where actual changes were made.
That said, what I'm looking for is a means to update assembly version information (preferably using MSBuild) only in cases where the assembly's constituents (code modules, resources, etc.) have actually changed.
I've found a few references that are at least partially pertinent here (AssemblyInfo task on MSDN) and here (looks similar to what I need, but more than two years old and without a clear solution).
My team also uses TFS version control, so an automated solution should probably include a means by which the AssemblyInfo can be checked out/in during the build.
Any help would be much appreciated.
Thanks in advance.
I cannot answer all your questions, as I don't have experience with TFS.
But I can recommend a better approach to use for updating your AssemblyInfo.cs files than using the AssemblyInfo task. That task appears to just recreate a standard AssemblyInfo file from scratch, and loses any custom portions you may have added.
For that reason, I suggest you look into the FileUpdate task, from the MSBuild Community Tasks project. It can look for specific content in a file and replace it, like this:
<FileUpdate
Files="$(WebDir)\Properties\AssemblyInfo.cs"
Regex="(\d+)\.(\d+)\.(\d+)\.(\d+)"
ReplacementText="$(Major).$(ServicePack).$(Build).$(Revision)"
Condition="'$(Configuration)' == 'Release'"
/>
There are several ways you can control the incrementing of the build number. Because I only want the build number to increment if the build is completely successful, I use a 2-step method:
1. Read a number from a text file (the only thing in the file is the number) and add 1, without changing the file.
2. As a final step in the build process, if everything succeeded, save the incremented number back to the text file.
There are tasks such as ReadLinesFromFile that can help you with this, but I found it easiest to write a small custom task:
using System;
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

namespace CredibleCustomBuildTasks
{
    public class IncrementTask : Task
    {
        [Required]
        public bool SaveChange { get; set; }

        [Required]
        public string IncrementFileName { get; set; }

        [Output]
        public int Increment { get; set; }

        public override bool Execute()
        {
            if (File.Exists(IncrementFileName))
            {
                string lines = File.ReadAllText(IncrementFileName);
                int result;
                if (Int32.TryParse(lines, out result))
                {
                    Increment = result + 1;
                }
                else
                {
                    // Pass the format arguments so the message is actually filled in.
                    Log.LogError("Unable to parse integer in '{0}' (contents of {1})", lines, IncrementFileName);
                    return false;
                }
            }
            else
            {
                Increment = 1;
            }

            if (SaveChange)
            {
                File.Delete(IncrementFileName);
                File.WriteAllText(IncrementFileName, Increment.ToString());
            }
            return true;
        }
    }
}
I use this before the FileUpdateTask to get the next build number:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="false">
<Output TaskParameter="Increment" PropertyName="Build" />
</IncrementTask>
and as my final step (before notifying others) in the build:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="true"
Condition="'$(Configuration)' == 'Release'" />
Your other question, how to update the version number only when source code has changed, is highly dependent on how your build process interacts with your source control. Normally, checking in source file changes should initiate a Continuous Integration build. That is the build to use to update the relevant version number.
I have written a custom task; you can refer to the code below. It builds a utility to which you can pass the AssemblyInfo path and the major, minor, and build numbers. You can modify it to get the revision number. Since in my case this task was done by the developer, I search for the whole AssemblyVersion string and replace it.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;

namespace UpdateVersion
{
    class SetVersion
    {
        static void Main(string[] args)
        {
            String FilePath = args[0];
            String MajVersion = args[1];
            String MinVersion = args[2];
            String BuildNumber = args[3];
            string RevisionNumber = null;

            StreamReader Reader = File.OpenText(FilePath);
            string contents = Reader.ReadToEnd();
            Reader.Close();

            MatchCollection match = Regex.Matches(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", RegexOptions.IgnoreCase);
            if (match[0].Value != null)
            {
                string strRevisionNumber = match[0].Value;
                RevisionNumber = strRevisionNumber.Substring(strRevisionNumber.LastIndexOf(".") + 1, (strRevisionNumber.LastIndexOf("\"") - 1) - strRevisionNumber.LastIndexOf("."));
                String replaceWithText = String.Format("[assembly: AssemblyVersion(\"{0}.{1}.{2}.{3}\")]", MajVersion, MinVersion, BuildNumber, RevisionNumber);
                string newText = Regex.Replace(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", replaceWithText);
                StreamWriter writer = new StreamWriter(FilePath, false);
                writer.Write(newText);
                writer.Close();
            }
            else
            {
                Console.WriteLine("No matching values found");
            }
        }
    }
}
I hate to say this, but it seems that you may be doing it wrong. It is much easier to generate the assembly versions on the fly instead of trying to patch them.
Take a look at https://sbarnea.com/articles/easy-windows-build-versioning/
Why do I think you are doing it wrong?
* A build should not modify the version number.
* If you build the same changeset twice, you should get the same build numbers.
* If you put the build number inside what Microsoft calls the build number (proper naming would be PATCH level), you will eventually reach the 65535 limitation.