C# Code Generation

I am looking at creating a small class generator for a project. I have been reading about CodeDOM, so the semantics of creating the classes do not appear to be an issue, but I am unsure of how best to integrate the generation into the development and deployment process.
How should I trigger the creation of the classes? I have read it should be part of the build process, how should I do this?
Where should the classes be created? I read that the files should not be edited by hand, and should never be checked into source control. Should I even worry about this and just generate the classes into the same directory as the generator engine?

Take a look at T4 templates (built into VS2008). They allow you to create "template" classes that generate code for you. Oleg Sych is an invaluable resource for this.
Link for Oleg's tutorial on code generation.
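For a flavor of what a template looks like, here is a minimal .tt sketch (the namespace and class names are invented for illustration):

```
<#@ template language="C#" #>
<#@ output extension=".cs" #>
// <auto-generated/>
namespace MyProject.Generated
{
<# foreach (var name in new[] { "Customer", "Order" }) { #>
    public partial class <#= name #>Dto
    {
        public int Id { get; set; }
    }
<# } #>
}
```

Saving the template regenerates the .cs output file next to it in the project.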

The answers to your question depend partly on the purpose of your generated classes.
If the classes are generated as a part of the development, they should be generated as text files and checked into your SCM like any other class.
If your classes are generated dynamically at runtime as a part of the operation of your system, I wouldn't use the CodeDOM at all. I'd use Reflection.
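For runtime generation, a minimal Reflection.Emit sketch might look like the following (the type and property names are illustrative, and this uses the modern static `AssemblyBuilder.DefineDynamicAssembly` API rather than the older `AppDomain`-based one):

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

static class RuntimeTypeFactory
{
    // Build a new public type with a single read/write property at runtime.
    public static Type CreateTypeWithProperty(string typeName, string propName, Type propType)
    {
        var asm = AssemblyBuilder.DefineDynamicAssembly(
            new AssemblyName("DynamicTypes"), AssemblyBuilderAccess.Run);
        var module = asm.DefineDynamicModule("Main");
        var tb = module.DefineType(typeName, TypeAttributes.Public);

        var field = tb.DefineField("_" + propName, propType, FieldAttributes.Private);
        var prop = tb.DefineProperty(propName, PropertyAttributes.None, propType, null);

        // Getter: return the backing field.
        var getter = tb.DefineMethod("get_" + propName,
            MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig,
            propType, Type.EmptyTypes);
        var il = getter.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldfld, field);
        il.Emit(OpCodes.Ret);
        prop.SetGetMethod(getter);

        // Setter: store the argument into the backing field.
        var setter = tb.DefineMethod("set_" + propName,
            MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig,
            null, new[] { propType });
        il = setter.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldarg_1);
        il.Emit(OpCodes.Stfld, field);
        il.Emit(OpCodes.Ret);
        prop.SetSetMethod(setter);

        return tb.CreateType();
    }
}
```

The created type is then used through reflection: `Activator.CreateInstance(t)` and `t.GetProperty(...)`, since no compile-time type exists for it.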

I know of the presence of T4 templates (and know many people use them), but I have not used them myself. Aside from those, you have two main options:
Use a SingleFileGenerator to transform the source right inside the project. Whenever you save the document that you edit, it will automatically regenerate the code file. If you use source control, the generated file will be checked in as part of the project. There are a few limitations with this:
You can only generate one output for each input.
Since you can't control the order in which files are generated, and the files are not generated at build time, your output can only effectively be derived from a single input file.
The single file generator must be installed on the developer's machine if they plan to edit the input file. Since the generated code is in source control, if they don't edit the input then they won't need to regenerate the output.
Since the output is generated only when the input is saved, the output shouldn't depend on any state other than the exact contents of the input file (not even the system clock).
Generate code as part of the build. For this, you write an MSBuild targets file. You have full control of input(s) and output(s), so dependencies can be handled. System state can be treated as an input dependency when necessary, but remember that every build that requires code generation takes longer than a build which uses a previously generated result. The results (generated source files) are generally placed in the obj directory and added to the list of inputs going to csc (the C# compiler). Limitations of this method:
It's more difficult to write a targets file than a SingleFileGenerator.
The build depends on generating the output, regardless of whether the user will be editing the input.
Since the generated code is not part of the project, it's a little more difficult to view the generated code for things like setting breakpoints.
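A build-time hookup might look roughly like this (a sketch, not a drop-in file: the input file name and the generator invocation are assumptions, and the Inputs/Outputs attributes are what make the target incremental):

```xml
<!-- CodeGen.targets -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <GeneratedFile>$(IntermediateOutputPath)Generated.cs</GeneratedFile>
  </PropertyGroup>
  <Target Name="GenerateCode"
          BeforeTargets="CoreCompile"
          Inputs="Model.xml"
          Outputs="$(GeneratedFile)">
    <!-- Replace with your own generator invocation -->
    <Exec Command="MyGenerator.exe Model.xml $(GeneratedFile)" />
    <ItemGroup>
      <Compile Include="$(GeneratedFile)" />
    </ItemGroup>
  </Target>
</Project>
```

Because Inputs/Outputs are declared, MSBuild skips the target when Model.xml is older than the generated file, so unchanged builds stay fast.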


Is there a way to modify a Settings.Designer.cs file?

So the thing is, at work I was assigned an old project which basically reads data from a table in one database, and with the data read from that first table it makes an update to a second table in another database. I was looking at a file called Settings.Designer.cs, which has auto-generated code.
Looking at the file, I saw it holds the values it reads the data from (for the first table), and the strings are "hardcoded".
Is there a way to make those values dynamic? Because this file is an internal sealed partial class, and I have understood that these types of classes can't be accessed.
Well, the settings as a general rule are generated from the project settings page:
so, in VS go to Project, and the last option will be "my project name here" Properties.
From there, things like company name, connection strings, constant values and what not can be created, edited, and maintained.
It's not clear if you attempted to see/use/look at/change settings using the above, or if there are settings that exist in the designer file that don't exist in the above.
There "might" be a good case - but we assume you at least checked the above first?
I mean, you can add a connection string, or even just some particular constant, or even say some web site company name for the whole site.
When you build the project, those settings get shoved in automatically and transferred into web.config for you.
So, at least give the settings page in your project a good look over before you do this. As a general rule, I would not modify that designer file until you are BEYOND 100% sure that is your only option. Remember, when you re-build the project, the designer file will get overwritten and re-created; any changes you made to it will be lost.
Use the "source means" to work on that project. Be it a simple webform or what not, you want to REALLY try to use the VS built-in design surfaces that in turn crank out those files. To be honest, I have not looked at such files for 10+ years, and I never care about these types of generated bits and parts, since I as a developer don't have to create such files. They could be instructions for how to peel a banana for all I care. They are not my concern.
They should pretty much be considered temporary files that are transient in nature - changes to them in general will be lost on the next project re-build.
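Also note that the generated class is declared partial, so you can put your own members in a separate file that survives regeneration. A toy sketch (the "generated" half below is a stand-in invented for illustration; the real Settings.Designer.cs looks different):

```csharp
// Settings.Designer.cs - generated half (do not edit; stand-in for illustration)
internal sealed partial class Settings
{
    public string SourceTable => "dbo.Orders"; // the "hardcoded" value
}

// SettingsExtensions.cs - your half, never overwritten by a rebuild
internal sealed partial class Settings
{
    // Prefer an override (e.g. from config or an environment variable) when present,
    // falling back to the generated value otherwise.
    public string ResolveSourceTable(string overrideValue) =>
        string.IsNullOrWhiteSpace(overrideValue) ? SourceTable : overrideValue;
}
```

This way the dynamic behavior lives in your own file, and rebuilding the project can regenerate the designer half freely.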

Output a Roslyn MSBuildWorkspace to different folder

When executing
mSBuildWorkspace.TryApplyChanges(solution);
Visual Studio changes the solution in place. This means that if I want to output to a different location, I need to first copy the whole solution to the requested target and only then work on it. This is error prone as the solution might have relative path links to dependencies, which can break when moving the solution.
So is there a way to tell MSBuildWorkspace to output the changes to a different folder than the source?
There's no built-in support for this.
Option #1: Instead of calling TryApplyChanges, you could call Solution.GetChanges to figure out what changed compared to what was originally loaded, and then call the various methods to get the changed documents and apply the edits yourself. This means you're on the hook to actually apply the edits -- source file edits are easy (just write the updated text), but if you care about more complicated things like project changes (adding/removing references), you don't really have a way to leverage MSBuildWorkspace's support for those sorts of things.
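Option #1 could be sketched like this. `RedirectPath` is a hypothetical helper (it relies on `Path.GetRelativePath`, available on .NET Core 2.0+); the Roslyn calls in the commented section are the real Workspaces API, but error handling and non-text changes are omitted:

```csharp
using System.IO;

static class OutputRedirector
{
    // Map a file under sourceRoot to the same relative location under targetRoot.
    public static string RedirectPath(string sourceRoot, string targetRoot, string filePath)
    {
        var relative = Path.GetRelativePath(sourceRoot, filePath);
        return Path.Combine(targetRoot, relative);
    }

    // Sketch of the apply loop, where 'newSolution' is the edited Solution and
    // 'workspace' is the MSBuildWorkspace it was loaded from:
    /*
    var changes = newSolution.GetChanges(workspace.CurrentSolution);
    foreach (var projectChanges in changes.GetProjectChanges())
    {
        foreach (var docId in projectChanges.GetChangedDocuments())
        {
            var doc = newSolution.GetDocument(docId);
            var text = doc.GetTextAsync().Result;
            var outPath = RedirectPath(sourceRoot, targetRoot, doc.FilePath);
            Directory.CreateDirectory(Path.GetDirectoryName(outPath));
            File.WriteAllText(outPath, text.ToString());
        }
    }
    */
}
```

The original solution stays untouched on disk; only the redirected copies of changed documents are written.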
Option #2: Roslyn's open source, so you could modify MSBuildWorkspace yourself to allow such a redirection, which would let you potentially reuse some of the more complicated logic around project manipulation. Or you can just copy/paste the implementation of the applying logic, and then use Solution.GetChanges with the reused code.

Roslyn: Access XAML of a partial class from code analyzer

The context: we are currently using a solution where all localizable strings are in XAML files which are translated. For translating strings in code, we use a function that will search from associated resource dictionary:
MessageBox.Show(this.i18n("my message"));
I would like to implement a code analyzer that will check if the "my message" string is actually declared in associated XAML file. The problem is that I can't find anything in compilation context that would lead me to the correct XAML file.
If the resource management is outside of the scope for Roslyn I could use DTE Interface for my task but I would like to know if there are better solutions for it.
Roslyn exposes an AdditionalFiles mechanism where you can specify some additional files to be passed into your analyzer which you need the content of. XAML files for what you're doing would be a perfect example. We have one Roslyn analyzer that we run on Roslyn itself that verifies that the types we have in our API match an additional file (called PublicAPI.Shipped.txt). If you look at this as a sample it'll show you how to read in extra files.
This doesn't give you any help at interpreting the files (you'll need to parse them yourself), but this at least gives you the mechanism to get the contents of them. We'll take care of all the mucking around reading the file from disk and everything for you.
You still have to specify that you actually want the files to be included in the AdditionalFiles list in the first place. If you look here you can see that you can specify an MSBuild item group name that will get passed through everything.
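In the project file, wiring the XAML files into AdditionalFiles can amount to something like this (the path pattern is an example; older Roslyn versions instead used an MSBuild property to name a custom item group that gets passed through):

```xml
<ItemGroup>
  <AdditionalFiles Include="Resources\**\*.xaml" />
</ItemGroup>
```

Inside the analyzer, those files then show up on the analysis context's Options.AdditionalFiles collection.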

Create Intellisense from stored dynamic objects

I'm about to create some settings for MVC projects and sites, based on dynamic variables etc.
These settings will be stored in XML for easy reading and writing.
My question, after reading about extending IntelliSense in this question:
Is it possible to provide intellisense for dynamic objects in visual studio?
is whether it's possible to read my saved settings (which are stored at runtime) and then, for the next run, build IntelliSense from them.
I.e. for each of these site.settings.layout.width entries, a list of the "older" saved xml-defined dynamics would be able to show up?
If all you need is xml "intellisense" then consider designing xml schemas and dump them in Visual Studio installation Folder\xml\Schemas or include them in your solution and VS will do the rest if the namespaces match appropriately.
Edit:
Coming back to this after a while. No other answer appears to have been given so I'll try to be more creative.
Visual Studio has an option to generate an XSD from an XML file. Note that the schema will be mostly an approximation, but it will match the file and will be a good description of its structure. If you could find a way to call that from the command line (or find a similar tool for that step), you could then chain it with xsd.exe and generate C# classes from it at build time (as a pre-build step).
If point one is too cumbersome, you could try to write a T4 template that reads a previous configuration file and generates your custom code based on that. Generating a POCO property structure based on some XML should be fairly simple with T4. The template should be run as a pre-build step.
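As a rough sketch of that chain in a pre-build step (paths are illustrative and the switches are from memory; check `xsd.exe /?` on your SDK version):

```bat
rem 1. Infer a schema from the previous run's settings file
xsd.exe settings.xml /outputdir:obj
rem 2. Generate C# classes from the inferred schema
xsd.exe obj\settings.xsd /classes /namespace:MySite.Settings /outputdir:obj
```

The generated .cs file can then be included in the compile, giving you static types (and therefore IntelliSense) matching the last saved settings.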
Note that both suggestions involve static code generation. A full dynamic solution could be done with F# type providers but that is not available for C#.

How to always produce byte-for-byte identical .exe on rebuild of C# application?

I'll give you a little bit of background first as to why I'm asking this question:
I am currently working in a strictly regulated industry and as such our code is quite carefully looked over by official test houses. These test houses expect to be able to build the code and generate an .exe or .dll which is EXACTLY the same each and every time (without changing any code, obviously!). They check the MD5 and the SHA1 of the executables that they create to ensure this.
Up until this point I have predominantly been coding in C++, where (after a few project setting tweaks) I managed to get the projects to rebuild consistently to the same MD5/SHA1. I am now using C# in a project and am having great difficulty getting the MD5s to match after a rebuild. I am aware that there are "Time-Stamps" in the PE header of the file, and they have been cleared to 0. I am also aware that there is a GUID for the .exe, which again has been cleared to 00 00 00... etc. However, the files still don't match.
I'm using CFF Explorer to view and edit the PE header to remove the time and date stamps. After using a binary comparison tool, there are only 2 blocks of bytes in the .exe's that are different (both very small).
One of the inconsistent blocks appears just before some binary code, which in ASCII details the path of the *Project*\obj\Release\xxx.pdb file.
EDIT: This is now known to be the GUID of the *.pdb file, however I still don't know if I can modify it without causing any errors!?
The other block appears in the middle of what looks to be function names, ie. (a typical section) AssemblyName.GetName.Version.get_Version.System.IO.Ports.SerialPort.Parity.Byte.<PrivateImplementationDetails>{
then the different code block:
4A134ACE-D6A0-461B-A47C-3A4232D90816
followed by:
"}.ValueType.__StaticArrayInitTypeSize=7.$$method0x60000ab-1.RuntimeFieldHandle.InitializeArray`... etc..
Any ideas or suggestions would be most welcome!
Update: Roslyn seems to have a /feature:deterministic compiler flag for reproducible builds, although it's not 100% working yet.
You should be able to get rid of the debug GUID by disabling PDB generation. If not, setting the GUID to zeroes is fine - only debuggers look at that section (you won't be able to debug the assembly anymore, but it should still run fine).
The PrivateImplementationDetails are a bit more difficult - these are internal helper classes generated by the compiler for certain language constructs (array initializers, switch statements using strings, etc.). Because they are only used internally, the class name doesn't really matter, so you could just assign a running number to them.
I would do this by going through the #Strings metadata stream and replacing all strings of the form "<PrivateImplementationDetails>{GUID}" with "<PrivateImplementationDetails>{running number, padded to same length as a GUID}".
The #Strings metadata stream is simply the list of strings used by the metadata, encoded in UTF-8 and separated by \0; so finding and replacing the names should be easy once you know where the #Strings stream is inside the executable file.
Unfortunately the "metadata stream headers" containing this information are quite buried inside the file format. You'll have to start at the NT Optional Header, find the pointer to the CLI Runtime Header, resolve it to a file position using the PE section table (it's an RVA, but you need a position inside the file), then go to the metadata root and read the stream headers.
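A byte-level sketch of the rename step itself (the heap is treated here as decoded UTF-8 text, which is safe because these names are ASCII; locating the #Strings stream inside the PE file is the part this deliberately skips):

```csharp
using System.Text.RegularExpressions;

static class StringsHeapPatcher
{
    // Replace each <PrivateImplementationDetails>{GUID} name with a running
    // number padded to the same length (36 chars), so every string keeps its
    // exact byte length and the heap offsets stored in the metadata stay valid.
    public static string Stabilize(string heapText)
    {
        int counter = 0;
        return Regex.Replace(
            heapText,
            @"<PrivateImplementationDetails>\{[0-9A-Fa-f\-]{36}\}",
            _ => "<PrivateImplementationDetails>{" +
                 (counter++).ToString().PadLeft(36, '0') + "}");
    }
}
```

Keeping the replacement the same length is essential: metadata tables reference strings by offset into the heap, so shifting any bytes would corrupt the assembly.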
I'm not sure about this, but just a thought: are you using any anonymous types for which the compiler might generate names behind the scenes, which might be different each time the compiler runs? Just a possibility which occurred to me. Probably one for Jon Skeet ;-)
Update: You could perhaps also use Reflector addins for comparison and disassembly.
Regarding the PDB GUID problem, if you specify that a PDB shouldn't be generated at compilation for Release builds, does the binary still contain the PDB's file system GUID?
To disable PDB generation:
Right-click your project in Solution Explorer and select Properties.
From the menu along the left, select Build.
Ensure that the Configuration selection is Release (you'll still want a PDB for debugging).
Click the Advanced button in the bottom right.
Under Output / Debug Info, select None.
If you're building from the console, use /debug- to get the same result.
Take a look at the answers from this question. Especially on the external link provided in the 3rd one.
EDIT:
I actually wanted to link to this article.
You said that after a few project tweaks you were able to get C++ apps to compile repeatably to the same SHA1/MD5 values. I'm in the same boat as you in being in an industry with a third party test lab that needs to rebuild exactly the same executables repeatably.
In researching how to make this happen in VS2005, I came across your post here. Could you share the project tweaks you did to make the C++ apps build to the same SHA1/MD5 values consistently? It would be of great help to myself and perhaps any others that share this requirement.
Use ildasm.exe to fully disassemble both programs and compare the IL. Then you can "clean" the code using text-based methods and (predictably) recompile it again.
