My question does not target a problem; it is more a "Do you know something that...?" kind of question. All my applications are built and deployed using CI/CD with Azure DevOps. I like to have all build information handy in the created binary and to read it at runtime. Those applications are mainly .NET Core 2 applications written in C#. I am using the default build system, MSBuild, supplied with the .NET Core SDK. The project should be buildable on Windows AND Linux.
Information I need:
GitCommitHash: string
GitCommitMessage: string
GitBranch: string
CiBuildNumber: string (only when built via CI not locally)
IsCiBuild: bool (detection should work by checking for env variables which are only available in CI builds; see the snippet below)
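A minimal sketch of that check, using the Azure DevOps variable that also appears in the Gradle script further down (treat the class name as illustrative):
using System;

public static class CiDetection
{
    // BUILD_BUILDNUMBER is set by the Azure DevOps agent but not on developer machines.
    public static bool IsCiBuild =>
        Environment.GetEnvironmentVariable("BUILD_BUILDNUMBER") != null;
}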
Current approach:
In each project in the solution there is a class BuildConfig à la
public static class BuildConfig
{
    public const string BuildNumber = "#{Build.BuildNumber}#"; // These are the names of the variables within the CI
    // and the remaining information...
}
Here tokens are used, which get replaced with the corresponding values during the CI build. To achieve this, an add-on task is used. Sadly this only fills in the values for CI builds and not for local ones. When running locally and requesting the build information, it only contains the tokens, as they are not replaced during a local build.
It would be cool to either have the BuildConfig.cs generated during the build or have the values of the variables set during the local build (IntelliSense would be very cool and would prevent some "BuildConfig class could not be found" errors). The values could be set by an MSBuild task (?). That would be one (or two) possibilities to solve this. Do you have ideas/experience regarding this? I did not find much during my internet research. I only stumbled over this question, which did not really help me, as I have zero experience with MSBuild tasks/customization.
Then I decided to have a look at build systems in general, namely FAKE and Cake. Cake has a Git add-in, but I did not find anything regarding code generation/manipulation. Do you know some resources on that?
So here's the thing...
A short time ago I had to work with Android apps, namely Java and the Gradle build system. I wanted to inject the build information there too during the CI build. After a short time I found an (IMO) better and more elegant solution, which was modifying the build script in the following way (the scripting language is Groovy, which runs on the JVM):
def getGitHash = { ->
    def stdout = new ByteArrayOutputStream()
    exec {
        commandLine 'git', 'rev-parse', '--short', 'HEAD'
        standardOutput = stdout
    }
    return stdout.toString().trim().replace("\"", "\\\"")
}

def getGitBranch = { ->
    def fromEnv = System.getenv("BUILD_SOURCEBRANCH")
    if (fromEnv) {
        return fromEnv.substring("refs/heads/".length()).replace("\"", "\\\"");
    } else {
        def stdout = new ByteArrayOutputStream()
        exec {
            commandLine 'git', 'rev-parse', '--abbrev-ref', 'HEAD'
            standardOutput = stdout
        }
        return stdout.toString().trim().replace("\"", "\\\"")
    }
}

def getIsCI = { ->
    return System.getenv("BUILD_BUILDNUMBER") != null;
}
// And the other functions work very similarly

android {
    // ...
    buildConfigField "String", "GitHash", "\"${getGitHash()}\""
    buildConfigField "String", "GitBranch", "\"${getGitBranch()}\""
    buildConfigField "String", "BuildNumber", "\"${getBuildNumber()}\""
    buildConfigField "String", "GitMessage", "\"${getGitCommitMessage()}\""
    buildConfigField "boolean", "IsCIBuild", "${getIsCI()}"
    // ...
}
The result after the first build is the following Java code:
public final class BuildConfig {
    // Some other fields generated by default
    // Fields from default config.
    public static final String BuildNumber = "Local Build";
    public static final String GitBranch = "develop";
    public static final String GitHash = "6c87e82";
    public static final String GitMessage = "Merge branch 'hotfix/login-failed' into 'develop'";
    public static final boolean IsCIBuild = false;
}
Getting the required information is done by the build script itself, without depending on the CI engine to fulfill this task. The class can be used after the first build, as it is generated and stored in a "hidden" directory which is included in code analysis but excluded from your code in the IDE and also not pushed to Git. But there is IntelliSense support. In a C# project this would be the obj/ folder, I guess. It is very easy to access the information, as the values are constant and static (so no reflection or similar is required).
So here the summarized question: "Do you know something to achieve this behaviour/mechanism in a .NET environment?"
Happy to discuss some ideas/approaches... :)
It becomes much easier if at runtime you are willing to use reflection to read assembly attribute values. For example:
using System.Reflection;
var assembly = Assembly.GetExecutingAssembly();
var descriptionAttribute = (AssemblyDescriptionAttribute)assembly
.GetCustomAttributes(typeof(AssemblyDescriptionAttribute), false).FirstOrDefault();
var description = descriptionAttribute?.Description;
For most purposes the performance impact of this approach can be satisfactorily addressed by caching the values so they only need to be read once.
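For instance, a minimal caching wrapper could look like this (the class and member names are only illustrative):
using System;
using System.Linq;
using System.Reflection;

public static class BuildInfo
{
    // Read the attribute once, on first access, and cache the result.
    private static readonly Lazy<string> description = new Lazy<string>(() =>
        Assembly.GetExecutingAssembly()
            .GetCustomAttributes(typeof(AssemblyDescriptionAttribute), false)
            .OfType<AssemblyDescriptionAttribute>()
            .FirstOrDefault()?.Description);

    public static string Description => description.Value;
}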
One way to embed the desired values into assembly attributes is to use the MSBuild WriteCodeFragment task to create a class file that sets assembly attributes to the values of project and/or environment variables. You would need to ensure that you do this in a Target that executes before compilation occurs (e.g. <Target BeforeTargets="CoreCompile" ...). You would also need to set the property <GenerateAssemblyInfo>false</GenerateAssemblyInfo> to avoid conflicting with the functionality referenced in the next option.
Alternatively, you may be able to leverage the plumbing in the dotnet SDK for including metadata in assemblies. It embeds the values of many of the same project variables documented for the NuGet Pack target. As implied above, this would require the GenerateAssemblyInfo property to be set to true.
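To illustrate the idea: if the generated code fragment (or your own attribute file) contains AssemblyMetadata attributes like the commented lines below, they can be read back into a dictionary once at runtime. The keys are the ones from the question, not something the SDK emits on its own, so treat this as a sketch:
// What a generated attribute file might contain (placeholder values):
// [assembly: System.Reflection.AssemblyMetadata("GitCommitHash", "6c87e82")]
// [assembly: System.Reflection.AssemblyMetadata("GitBranch", "develop")]
// [assembly: System.Reflection.AssemblyMetadata("IsCiBuild", "false")]

using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static class BuildMetadata
{
    // Collects all [AssemblyMetadata] key/value pairs from the current assembly.
    public static IReadOnlyDictionary<string, string> Values { get; } =
        Assembly.GetExecutingAssembly()
            .GetCustomAttributes<AssemblyMetadataAttribute>()
            .ToDictionary(a => a.Key, a => a.Value);
}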
Finally, consider whether GitVersion would meet your needs.
Good luck!
Is it possible to embed a pre-existing DLL into a compiled C# executable (so that you only have one file to distribute)? If it is possible, how would one go about doing it?
Normally, I'm cool with just leaving the DLLs outside and having the setup program handle everything, but there have been a couple of people at work who have asked me this and I honestly don't know.
I highly recommend using Costura.Fody - by far the best and easiest way to embed resources in your assembly. It's available as a NuGet package.
Install-Package Costura.Fody
After adding it to the project, it will automatically embed all references that are copied to the output directory into your main assembly. You might want to clean the embedded files by adding a target to your project:
Install-CleanReferencesTarget
You'll also be able to specify whether to include the PDBs, exclude certain assemblies, or extract the assemblies on the fly. As far as I know, unmanaged assemblies are also supported.
Update
Currently, some people are trying to add support for DNX.
Update 2
For the latest Fody version, you will need MSBuild 16 (so Visual Studio 2019). Fody version 4.2.1 will work with MSBuild 15. (Reference: Fody is only supported on MSBuild 16 and above. Current version: 15)
Just right-click your project in Visual Studio, choose Project Properties -> Resources -> Add Resource -> Add Existing File…
Then include the code below in your App.xaml.cs or equivalent.
public App()
{
    AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);
}

System.Reflection.Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    string dllName = args.Name.Contains(",") ? args.Name.Substring(0, args.Name.IndexOf(',')) : args.Name.Replace(".dll", "");
    dllName = dllName.Replace(".", "_");
    if (dllName.EndsWith("_resources")) return null;
    System.Resources.ResourceManager rm = new System.Resources.ResourceManager(GetType().Namespace + ".Properties.Resources", System.Reflection.Assembly.GetExecutingAssembly());
    byte[] bytes = (byte[])rm.GetObject(dllName);
    return System.Reflection.Assembly.Load(bytes);
}
Here's my original blog post:
http://codeblog.larsholm.net/2011/06/embed-dlls-easily-in-a-net-assembly/
If they're actually managed assemblies, you can use ILMerge. For native DLLs, you'll have a bit more work to do.
See also: How can a C++ windows dll be merged into a C# application exe?
Yes, it is possible to merge .NET executables with libraries. There are multiple tools available to get the job done:
ILMerge is a utility that can be used to merge multiple .NET assemblies into a single assembly.
Mono mkbundle, packages an exe and all assemblies with libmono into a single binary package.
IL-Repack is a FLOSS alternative to ILMerge, with some additional features.
In addition, this can be combined with the Mono Linker, which removes unused code and therefore makes the resulting assembly smaller.
Another possibility is to use .NETZ, which not only allows compressing an assembly, but can also pack the DLLs straight into the exe. The difference from the solutions mentioned above is that .NETZ does not merge them; they stay separate assemblies but are packed into one package.
.NETZ is an open source tool that compresses and packs the Microsoft .NET Framework executable (EXE, DLL) files in order to make them smaller.
ILMerge can combine assemblies into one single assembly, provided the assemblies contain only managed code. You can use the command-line app, or add a reference to the exe and merge programmatically. For a GUI version there is Eazfuscator, and also .Netz, both of which are free. Paid apps include BoxedApp and SmartAssembly.
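If you want the programmatic route, a rough sketch of driving ILMerge from code is below; the assembly names are examples, and the API surface should be double-checked against the current ILMerge documentation:
using ILMerging; // from the ILMerge NuGet package

class MergeExample
{
    static void Main()
    {
        // Merge the main exe and one dependency into a new output file (names are placeholders).
        var merge = new ILMerge();
        merge.SetInputAssemblies(new[] { "MyApp.exe", "SomeDependency.dll" });
        merge.OutputFile = "MyApp.Merged.exe";
        merge.Merge();
    }
}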
If you have to merge assemblies with unmanaged code, I would suggest SmartAssembly. I never had hiccups with SmartAssembly, but I did with all the others. Here, it can embed the required dependencies as resources in your main exe.
You can do all this manually, without needing to worry whether the assembly is managed or in mixed mode, by embedding the DLLs in your resources and then relying on the AppDomain's AssemblyResolve handler. This is a one-stop solution by adopting the worst case, i.e. assemblies with unmanaged code.
static void Main()
{
    AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
    {
        string assemblyName = new AssemblyName(args.Name).Name;
        if (assemblyName.EndsWith(".resources"))
            return null;

        string dllName = assemblyName + ".dll";
        string dllFullPath = Path.Combine(GetMyApplicationSpecificPath(), dllName);

        using (Stream s = Assembly.GetEntryAssembly().GetManifestResourceStream(typeof(Program).Namespace + ".Resources." + dllName))
        {
            byte[] data = new byte[s.Length];
            s.Read(data, 0, data.Length);
            //or just byte[] data = new BinaryReader(s).ReadBytes((int)s.Length);
            File.WriteAllBytes(dllFullPath, data);
        }

        return Assembly.LoadFrom(dllFullPath);
    };
}
The key here is to write the bytes to a file and load from its location. To avoid a chicken-and-egg problem, you have to ensure that you declare the handler before accessing the assembly, and that you do not access the assembly members (or instantiate anything that has to deal with the assembly) inside the loading (assembly-resolving) part. Also take care to ensure GetMyApplicationSpecificPath() is not a temp directory, since temp files could be erased by other programs or by yourself (not that they will get deleted while your program is accessing the DLL, but at least it's a nuisance; AppData is a good location). Also note that you have to write the bytes each time; you can't just load from the location because the DLL already happens to reside there.
For managed DLLs, you need not write the bytes to disk; you can load directly from the location of the DLL, or just read the bytes and load the assembly from memory. Like this or so:
using (Stream s = Assembly.GetEntryAssembly().GetManifestResourceStream(typeof(Program).Namespace + ".Resources." + dllName))
{
    byte[] data = new byte[s.Length];
    s.Read(data, 0, data.Length);
    return Assembly.Load(data);
}

//or just
return Assembly.LoadFrom(dllFullPath); //if location is known.
If the assembly is fully unmanaged, you can see this link or this as to how to load such dlls.
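The linked answers are not reproduced here, but the usual Windows-only trick is to pre-load the unpacked native DLL via LoadLibrary, so that later [DllImport] calls resolve against the already loaded module. A rough sketch, building on the dllFullPath written above:
using System;
using System.ComponentModel;
using System.Runtime.InteropServices;

static class NativeLoader
{
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    private static extern IntPtr LoadLibrary(string lpFileName);

    // Call once, after the unmanaged DLL bytes have been written to dllFullPath.
    public static void Preload(string dllFullPath)
    {
        if (LoadLibrary(dllFullPath) == IntPtr.Zero)
            throw new Win32Exception(Marshal.GetLastWin32Error());
    }
}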
.NET Core 3.0 natively supports compiling to a single .exe
The feature is enabled by the usage of the following property in your project file (.csproj):
<PropertyGroup>
<PublishSingleFile>true</PublishSingleFile>
</PropertyGroup>
This is done without any external tool.
See my answer for this question for further details.
The excerpt by Jeffrey Richter is very good. In short, add the libraries as embedded resources and add a callback before anything else. Here is a version of the code (found in the comments on his page) that I put at the start of the Main method for a console app (just make sure that any calls that use the libraries are in a different method from Main).
AppDomain.CurrentDomain.AssemblyResolve += (sender, bargs) =>
{
    String dllName = new AssemblyName(bargs.Name).Name + ".dll";
    var assem = Assembly.GetExecutingAssembly();
    String resourceName = assem.GetManifestResourceNames().FirstOrDefault(rn => rn.EndsWith(dllName));
    if (resourceName == null) return null; // Not found, maybe another handler will find it
    using (var stream = assem.GetManifestResourceStream(resourceName))
    {
        Byte[] assemblyData = new Byte[stream.Length];
        stream.Read(assemblyData, 0, assemblyData.Length);
        return Assembly.Load(assemblyData);
    }
};
To expand on @Bobby's answer above, you can edit your .csproj to use IL-Repack to automatically package all files into a single assembly when you build.
Install the ILRepack.MSBuild.Task NuGet package with Install-Package ILRepack.MSBuild.Task
Edit the AfterBuild section of your .csproj
Here is a simple sample that merges ExampleAssemblyToMerge.dll into your project output.
<!-- ILRepack -->
<Target Name="AfterBuild" Condition="'$(Configuration)' == 'Release'">
  <ItemGroup>
    <InputAssemblies Include="$(OutputPath)\$(AssemblyName).exe" />
    <InputAssemblies Include="$(OutputPath)\ExampleAssemblyToMerge.dll" />
  </ItemGroup>
  <ILRepack
      Parallel="true"
      Internalize="true"
      InputAssemblies="@(InputAssemblies)"
      TargetKind="Exe"
      OutputFile="$(OutputPath)\$(AssemblyName).exe"
  />
</Target>
The following method does NOT use external tools and AUTOMATICALLY includes all needed DLLs (no manual action required, everything is done at compilation).
I read a lot of answers here saying to use ILMerge, ILRepack or the Jeffrey Richter method, but none of that worked with WPF applications, nor was it easy to use.
When you have a lot of DLLs it can be hard to manually include the ones you need in your exe. The best method I found was explained by Wegged here on Stack Overflow.
Copy-pasted his answer here for clarity (all credit to Wegged):
1) Add this to your .csproj file:
<Target Name="AfterResolveReferences">
<ItemGroup>
<EmbeddedResource Include="#(ReferenceCopyLocalPaths)" Condition="'%(ReferenceCopyLocalPaths.Extension)' == '.dll'">
<LogicalName>%(ReferenceCopyLocalPaths.DestinationSubDirectory)%(ReferenceCopyLocalPaths.Filename)%(ReferenceCopyLocalPaths.Extension)</LogicalName>
</EmbeddedResource>
</ItemGroup>
</Target>
2) Make your Main Program.cs look like this:
[STAThreadAttribute]
public static void Main()
{
    AppDomain.CurrentDomain.AssemblyResolve += OnResolveAssembly;
    App.Main();
}
3) Add the OnResolveAssembly method:
private static Assembly OnResolveAssembly(object sender, ResolveEventArgs args)
{
    Assembly executingAssembly = Assembly.GetExecutingAssembly();
    AssemblyName assemblyName = new AssemblyName(args.Name);

    var path = assemblyName.Name + ".dll";
    if (assemblyName.CultureInfo.Equals(CultureInfo.InvariantCulture) == false)
        path = String.Format(@"{0}\{1}", assemblyName.CultureInfo, path);

    using (Stream stream = executingAssembly.GetManifestResourceStream(path))
    {
        if (stream == null) return null;

        var assemblyRawBytes = new byte[stream.Length];
        stream.Read(assemblyRawBytes, 0, assemblyRawBytes.Length);
        return Assembly.Load(assemblyRawBytes);
    }
}
You could add the DLLs as embedded resources, and then have your program unpack them into the application directory on startup (after checking to see if they're there already).
Setup files are so easy to make, though, that I don't think this would be worth it.
EDIT: This technique would be easy with .NET assemblies. With non-.NET DLLs it would be a lot more work (you'd have to figure out where to unpack the files and register them and so on).
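For managed dependencies, a minimal sketch of that idea could look like the following (the resource naming convention "MyApp.Resources.<file>" is an assumption; adjust it to your project):
using System;
using System.IO;
using System.Reflection;

static class EmbeddedDependencies
{
    // Call at startup, before any type from the embedded DLLs is touched.
    public static void UnpackIfMissing(params string[] dllNames) // e.g. "ThirdParty.dll"
    {
        var assembly = Assembly.GetExecutingAssembly();
        string targetDir = AppDomain.CurrentDomain.BaseDirectory;

        foreach (string dllName in dllNames)
        {
            string targetPath = Path.Combine(targetDir, dllName);
            if (File.Exists(targetPath)) continue; // already unpacked earlier

            // The resource name must match how the DLL was embedded in your project.
            using (Stream resource = assembly.GetManifestResourceStream("MyApp.Resources." + dllName))
            using (FileStream file = File.Create(targetPath))
            {
                resource.CopyTo(file);
            }
        }
    }
}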
Another product that can handle this elegantly is SmartAssembly, at SmartAssembly.com. This product will, in addition to merging all dependencies into a single DLL, (optionally) obfuscate your code, remove extra meta-data to reduce the resulting file size, and can also actually optimize the IL to increase runtime performance.
There is also some kind of global exception handling/reporting feature it adds to your software (if desired) that could be useful. I believe it also has a command-line API so you can make it part of your build process.
Neither the ILMerge approach nor Lars Holm Jensen's handling of the AssemblyResolve event will work for a plugin host. Say executable H loads assembly P dynamically and accesses it via interface IP defined in a separate assembly. To embed IP into H, one needs a small modification to Lars's code:
Dictionary<string, Assembly> loaded = new Dictionary<string, Assembly>();

AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    Assembly resAssembly;
    string dllName = args.Name.Contains(",") ? args.Name.Substring(0, args.Name.IndexOf(',')) : args.Name.Replace(".dll", "");
    dllName = dllName.Replace(".", "_");

    if (!loaded.ContainsKey(dllName))
    {
        if (dllName.EndsWith("_resources")) return null;
        System.Resources.ResourceManager rm = new System.Resources.ResourceManager(GetType().Namespace + ".Properties.Resources", System.Reflection.Assembly.GetExecutingAssembly());
        byte[] bytes = (byte[])rm.GetObject(dllName);
        resAssembly = System.Reflection.Assembly.Load(bytes);
        loaded.Add(dllName, resAssembly);
    }
    else
    {
        resAssembly = loaded[dllName];
    }
    return resAssembly;
};
The trick is to handle repeated attempts to resolve the same assembly and return the existing one instead of creating a new instance.
EDIT:
Lest it spoil .NET's serialization, make sure to return null for all assemblies not embedded in yours, thereby defaulting to the standard behaviour. You can get a list of these libraries by:
static HashSet<string> IncludedAssemblies = new HashSet<string>();

string[] resources = System.Reflection.Assembly.GetExecutingAssembly().GetManifestResourceNames();
for (int i = 0; i < resources.Length; i++)
{
    IncludedAssemblies.Add(resources[i]);
}
and just return null if the passed assembly does not belong to IncludedAssemblies.
It may sound simplistic, but WinRar gives the option to compress a bunch of files to a self-extracting executable.
It has lots of configurable options: final icon, extract files to given path, file to execute after extraction, custom logo/texts for popup shown during extraction, no popup window at all, license agreement text, etc.
May be useful in some cases.
I use the csc.exe compiler called from a .vbs script.
In your xyz.cs script, add the following lines after the directives (my example is for the Renci SSH):
using System;
using Renci;//FOR THE SSH
using System.Net;//FOR THE ADDRESS TRANSLATION
using System.Reflection;//FOR THE Assembly
//+ref>"C:\Program Files (x86)\Microsoft\ILMerge\Renci.SshNet.dll"
//+res>"C:\Program Files (x86)\Microsoft\ILMerge\Renci.SshNet.dll"
//+ico>"C:\Program Files (x86)\Microsoft CAPICOM 2.1.0.2 SDK\Samples\c_sharp\xmldsig\resources\Traffic.ico"
The ref, res and ico tags will be picked up by the .vbs script below to form the csc command.
Then add the assembly resolver caller in the Main:
public static void Main(string[] args)
{
AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);
.
...and add the resolver itself somewhere in the class:
static Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    String resourceName = new AssemblyName(args.Name).Name + ".dll";
    using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName))
    {
        Byte[] assemblyData = new Byte[stream.Length];
        stream.Read(assemblyData, 0, assemblyData.Length);
        return Assembly.Load(assemblyData);
    }
}
I name the vbs script to match the .cs filename (e.g. ssh.vbs looks for ssh.cs); this makes running the script numerous times a lot easier, but if you aren't an idiot like me then a generic script could pick up the target .cs file from a drag-and-drop:
Dim name_,oShell,fso
Set oShell = CreateObject("Shell.Application")
Set fso = CreateObject("Scripting.fileSystemObject")
'TAKE THE VBS SCRIPT NAME AS THE TARGET FILE NAME
'################################################
name_ = Split(wscript.ScriptName, ".")(0)
'GET THE EXTERNAL DLL's AND ICON NAMES FROM THE .CS FILE
'#######################################################
Const OPEN_FILE_FOR_READING = 1
Set objInputFile = fso.OpenTextFile(name_ & ".cs", 1)
'READ EVERYTHING INTO AN ARRAY
'#############################
inputData = Split(objInputFile.ReadAll, vbNewline)
For each strData In inputData
if left(strData,7)="//+ref>" then
csc_references = csc_references & " /reference:" & trim(replace(strData,"//+ref>","")) & " "
end if
if left(strData,7)="//+res>" then
csc_resources = csc_resources & " /resource:" & trim(replace(strData,"//+res>","")) & " "
end if
if left(strData,7)="//+ico>" then
csc_icon = " /win32icon:" & trim(replace(strData,"//+ico>","")) & " "
end if
Next
objInputFile.Close
'COMPILE THE FILE
'################
oShell.ShellExecute "c:\windows\microsoft.net\framework\v3.5\csc.exe", "/warn:1 /target:exe " & csc_references & csc_resources & csc_icon & " " & name_ & ".cs", "", "runas", 2
WScript.Quit(0)
If you are using .NET Core 3.0
You can do this with the dotnet publish command with PublishSingleFile property:
dotnet publish -r win-x64 -c Release /p:PublishSingleFile=true
The only downside is you end up with a single EXE file with a huge size.
It's possible, but not all that easy, to create a hybrid native/managed assembly in C#. Were you using C++ instead, it would be a lot easier, as the Visual C++ compiler can create hybrid assemblies as easily as anything else.
Unless you have a strict requirement to produce a hybrid assembly, I'd agree with MusiGenesis that this isn't really worth the trouble to do with C#. If you need to do it, perhaps look at moving to C++/CLI instead.
Generally you would need some form of post-build tool to perform an assembly merge like you are describing. There is a free tool called Eazfuscator (eazfuscator.blogspot.com/) which is designed for bytecode mangling and also handles assembly merging. You can add this to a post-build command line in Visual Studio to merge your assemblies, but your mileage will vary due to issues that will arise in any non-trivial assembly merging scenarios.
You could also check whether the build utility NAnt has the ability to merge assemblies after building, but I am not familiar enough with NAnt myself to say whether the functionality is built in or not.
There are also many many Visual Studio plugins that will perform assembly merging as part of building the application.
Alternatively if you don't need this to be done automatically, there are a number of tools like ILMerge that will merge .net assemblies into a single file.
The biggest issue I've had with merging assemblies is if they use any similar namespaces. Or worse, reference different versions of the same dll (my problems were generally with the NUnit dll files).
Try this:
https://github.com/ytk2128/dll-merger
Here you can merge all 32-bit DLLs/EXEs, even non-.NET DLLs, so for me it is better than ILMerge, for example.
I want to include the current time and date in a .NET application so I can include it in the start-up log to show the user what version they have. Is it possible to retrieve the current time during compilation, or would I have to get the creation/modification time of the executable?
E.g.
Welcome to ApplicationX. This was built day-month-year at time.
If you're using reflection to get your build number, you can use that to figure out when a build was compiled.
Version information for an assembly consists of the following four values:
Major Version
Minor Version
Build Number
Revision
You can specify all the values or you can accept the default build number, revision number, or both by using an asterisk (*). Build number and revision are based off Jan 1, 2000 by default.
The following attribute will set Major and minor, but then increment build number and revision.
[assembly: AssemblyVersion("5.129.*")]
Then you can use something like this:
public static DateTime CompileTime
{
    get
    {
        System.Version MyVersion = System.Reflection.Assembly.GetExecutingAssembly().GetName().Version;
        // MyVersion.Build = days after 2000-01-01
        // MyVersion.Revision*2 = seconds after 0-hour (NEVER daylight saving time)
        DateTime compileTime = new DateTime(2000, 1, 1).AddDays(MyVersion.Build).AddSeconds(MyVersion.Revision * 2);
        return compileTime;
    }
}
The only way I know of doing this is somewhat convoluted -
You can have a pre-build event that runs a small application which generates the source code on the fly. An easy way to do this is to just overwrite a very small file that includes a class (or partial class) with the day/month/year hardcoded as a string constant.
If you set this to run as a pre-build event, it will rewrite that file before every build.
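A minimal sketch of such a generator, run from the pre-build event (file and class names are just examples):
using System;
using System.IO;

class WriteBuildDate
{
    static void Main(string[] args)
    {
        // The pre-build event could call: WriteBuildDate.exe "$(ProjectDir)BuildDate.cs"
        string path = args.Length > 0 ? args[0] : "BuildDate.cs";
        string source =
            "// <auto-generated>Overwritten by the pre-build step; do not edit.</auto-generated>\n" +
            "internal static class BuildDate\n" +
            "{\n" +
            "    public const string Value = \"" + DateTime.Now.ToString("dd-MM-yyyy 'at' HH:mm") + "\";\n" +
            "}\n";
        File.WriteAllText(path, source);
    }
}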
You could use PostSharp to weave in the date immediately post-build. PostSharp comes with a lightweight aspect-oriented programming library, but it can be extended to weave in anything you need in a wide variety of ways. It works at the IL level, but the API abstracts you a bit from that.
http://www.postsharp.org/
There's nothing built into the language to do this.
You could write a pre-build step to write out the current date and time to a source file though (in a string literal, for example, or as source code to generate a DateTime), and then compile that as part of your build.
I would suggest you make this source file as simple as possible, containing nothing but this information. Alternatively it could edit an existing file.
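As a sketch, the generated file might contain nothing more than this (the names and the timestamp are illustrative):
// Overwritten by the pre-build step on every build; do not edit by hand.
using System;

internal static class BuildTimestamp
{
    public static readonly DateTime Value = new DateTime(2019, 5, 14, 13, 37, 0);
}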
For an example of this, see the build file for MiscUtil which embeds the current SVN revision into the AssemblyFileVersion attribute. Some assorted bits of the build file:
<!-- See http://msbuildtasks.tigris.org -->
<Import
Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets"/>
<!-- Find out what the latest version is -->
<SvnInfo RepositoryPath="$(SvnUrl)">
<Output TaskParameter="LastChangedRevision" PropertyName="Revision" />
</SvnInfo>
<!-- Update the AssemblyInfo with the revision number -->
<FileUpdate Files="$(OutputDirectory)\MiscUtil\MiscUtil\Properties\AssemblyInfo.cs"
Regex='(\[\s*assembly:\s*AssemblyFileVersion\(\s*"[^\.]+\.[^\.]+)\.([^\.]+)(\.)([^\.]+)("\)\s*\])'
ReplacementText='$1.$2.$(Revision)$5' />
In makefiles for C programs, it is common to see something like this:
echo char * gBuildSig ="%DATE% %TIME%"; > BuildTimestamp.c
And then the resulting C source file is compiled into the image. The above works on Windows because the %date% and %time% variables are known in cmd.exe; a similar thing would work on Unix using echo and the date command.
You can do the same thing using C#. Once again, this is how it would look if you are using a makefile. You need a class, and a public static property.
BuildTimestamp.cs:
echo public static class Build { public static string Timestamp = "%DATE% %TIME%";} > BuildTimestamp.cs
And then for the thing you are building, a dependency and a delete:
MyApp.exe: BuildTimestamp.cs MyApp.cs
$(_CSC) /target:exe /debug+ /optimize- /r:System.dll /out:MyApp.exe MyApp.cs BuildTimestamp.cs
-del BuildTimestamp.cs
Be sure to delete the BuildTimestamp.cs file after you compile it; you don't want to re-use it. Then, in your app, just reference Build.Timestamp.
Using MSBuild or Visual Studio, it is more complicated. I couldn't get %date% or %time% to resolve. Those things are pseudo environment variables, I guess that is why. So I resorted to an indirect way to get a timestamp, using the Touch task with AlwaysCreate = true. That creates an empty file. The next step writes source code into the same file, referencing the timestamp of the file. One twist - I had to escape the semicolon.
Your pre-build step should build the target "BuildTimestamp". And be sure to include that file into the compile. And delete it afterwards, in the post-build step.
<ItemGroup>
  <StampFile Include="BuildTimestamp.cs"/>
</ItemGroup>

<Target Name="BuildTimestamp"
        Outputs="@(StampFile)">
  <Message Text="Building timestamp..." />
  <Touch
      AlwaysCreate="true"
      Files="@(StampFile)" />
  <WriteLinesToFile
      File="@(StampFile)"
      Lines='public static class Build { public static string Timestamp = "%(StampFile.CreatedTime)" %3B }'
      Overwrite="true" />
</Target>
You could update the Assembly version in AssemblyInfo.cs as part of your build. Then you could do something like this
FileVersionInfo lvar = FileVersionInfo.GetVersionInfo(FileName);
FileVersionInfo has the information (build/version, etc.) that you are looking for. See if this works out for you.
I used the following method for the same purpose:
private DateTime ExecutableInfo()
{
    System.IO.FileInfo fi = new System.IO.FileInfo(Application.ExecutablePath.Trim());
    try
    {
        return fi.CreationTime;
    }
    catch
    {
        throw;
    }
    finally
    {
        fi = null;
    }
}
I have a requirement to install multiple web setup projects (using VS2005 and ASP.Net/C#) into the same virtual folder. The projects share some assembly references (the file systems are all structured to use the same 'bin' folder), making deployment of changes to those assemblies problematic since the MS installer will only overwrite assemblies if the currently installed version is older than the one in the MSI.
I'm not suggesting that the pessimistic installation scheme is wrong - only that it creates a problem in the environment I've been given to work with. Since there are a sizable number of common assemblies and a significant number of developers who might change a common assembly but forget to update its version number, trying to manage versioning manually will eventually lead to massive confusion at install time.
On the flip side of this issue, it's also important not to spontaneously update version numbers and replace all common assemblies with every install, since that could (temporarily at least) obscure cases where actual changes were made.
That said, what I'm looking for is a means to update assembly version information (preferably using MSBuild) only in cases where the assembly constituents (code modules, resources etc) has/have actually changed.
I've found a few references that are at least partially pertinent here (AssemblyInfo task on MSDN) and here (looks similar to what I need, but more than two years old and without a clear solution).
My team also uses TFS version control, so an automated solution should probably include a means by which the AssemblyInfo can be checked out/in during the build.
Any help would be much appreciated.
Thanks in advance.
I cannot answer all your questions, as I don't have experience with TFS.
But I can recommend a better approach to use for updating your AssemblyInfo.cs files than using the AssemblyInfo task. That task appears to just recreate a standard AssemblyInfo file from scratch, and loses any custom portions you may have added.
For that reason, I suggest you look into the FileUpdate task, from the MSBuild Community Tasks project. It can look for specific content in a file and replace it, like this:
<FileUpdate
Files="$(WebDir)\Properties\AssemblyInfo.cs"
Regex="(\d+)\.(\d+)\.(\d+)\.(\d+)"
ReplacementText="$(Major).$(ServicePack).$(Build).$(Revision)"
Condition="'$(Configuration)' == 'Release'"
/>
There are several ways you can control the incrementing of the build number. Because I only want the build number to increment if the build is completely successful, I use a 2-step method:
read a number from a text file (the only thing in the file is the number) and add 1 without changing the file;
as a final step in the build process, if everything succeeded, save the incremented number back to the text file.
There are tasks such as ReadLinesFromFile, that can help you with this, but I found it easiest to write a small custom task:
using System;
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

namespace CredibleCustomBuildTasks
{
    public class IncrementTask : Task
    {
        [Required]
        public bool SaveChange { get; set; }

        [Required]
        public string IncrementFileName { get; set; }

        [Output]
        public int Increment { get; set; }

        public override bool Execute()
        {
            if (File.Exists(IncrementFileName))
            {
                string lines = File.ReadAllText(IncrementFileName);
                int result;
                if (Int32.TryParse(lines, out result))
                {
                    Increment = result + 1;
                }
                else
                {
                    Log.LogError("Unable to parse integer in '{0}' (contents of {1})", lines, IncrementFileName);
                    return false;
                }
            }
            else
            {
                Increment = 1;
            }

            if (SaveChange)
            {
                File.Delete(IncrementFileName);
                File.WriteAllText(IncrementFileName, Increment.ToString());
            }
            return true;
        }
    }
}
I use this before the FileUpdateTask to get the next build number:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="false">
<Output TaskParameter="Increment" PropertyName="Build" />
</IncrementTask>
and as my final step (before notifying others) in the build:
<IncrementTask
IncrementFileName="$(BuildNumberFile)"
SaveChange="true"
Condition="'$(Configuration)' == 'Release'" />
Your other question, how to update the version number only when source code has changed, is highly dependent on how your build process interacts with your source control. Normally, checking in source file changes should initiate a Continuous Integration build. That is the one to use to update the relevant version number.
I have written a custom task; you can refer to the code below. It creates a utility to which you can pass the AssemblyInfo path, major, minor and build number. You can modify it to also pass the revision number. Since in my case that part was handled by the developer, the tool searches for the existing version string and replaces the whole string.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;

namespace UpdateVersion
{
    class SetVersion
    {
        static void Main(string[] args)
        {
            String FilePath = args[0];
            String MajVersion = args[1];
            String MinVersion = args[2];
            String BuildNumber = args[3];
            string RevisionNumber = null;

            StreamReader Reader = File.OpenText(FilePath);
            string contents = Reader.ReadToEnd();
            Reader.Close();

            MatchCollection match = Regex.Matches(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", RegexOptions.IgnoreCase);
            if (match.Count > 0 && match[0].Value != null)
            {
                string strRevisionNumber = match[0].Value;
                RevisionNumber = strRevisionNumber.Substring(strRevisionNumber.LastIndexOf(".") + 1, (strRevisionNumber.LastIndexOf("\"") - 1) - strRevisionNumber.LastIndexOf("."));
                String replaceWithText = String.Format("[assembly: AssemblyVersion(\"{0}.{1}.{2}.{3}\")]", MajVersion, MinVersion, BuildNumber, RevisionNumber);
                string newText = Regex.Replace(contents, @"\[assembly: AssemblyVersion\("".*""\)\]", replaceWithText);
                StreamWriter writer = new StreamWriter(FilePath, false);
                writer.Write(newText);
                writer.Close();
            }
            else
            {
                Console.WriteLine("No matching values found");
            }
        }
    }
}
I hate to say this, but it seems that you may be doing it wrong. It is much easier if you generate the assembly versions on the fly instead of trying to patch them.
Take a look at https://sbarnea.com/articles/easy-windows-build-versioning/
Why do I think you are doing it wrong?
* A build should not modify the version number.
* If you build the same changeset twice, you should get the same build numbers.
* If you put the build number inside what Microsoft calls the build number (a more proper name would be PATCH level), you will eventually hit the 65535 limit.