Resource file (.resx) vs reflection to access an embedded resource - c#

I may be missing something very simple here, but what's the benefit of using reflection to retrieve an embedded resource from the same assembly that contains the resource, as opposed to simply retrieving it via a .resx file? I see this a lot but don't get it - is there a reason to prefer Assembly.GetExecutingAssembly().GetManifestResourceStream(resource) over the .resx file's generated Resources.resource property? Even Microsoft does it: How to embed and access resources.
What I mean exactly: suppose I have an assembly MyAssembly that contains an embedded resource Config.xml. The assembly has MyClass that implements a method that returns said resource as a string:
public string GetConfigXML() // returns the content of Config.xml as a string
Often, I see this implemented like this, using reflection to retrieve the resource:
public string GetConfigXML()
{
    Stream xmlStream = Assembly.GetExecutingAssembly().GetManifestResourceStream("MyAssembly.Config.xml");
    string xml = GetStringFromStream(xmlStream);
    return xml;
}
Why use GetManifestResourceStream() when you can:
add a resource file (Resource.resx) to the MyAssembly project in Visual Studio;
add Config.xml to the resource's 'Files';
get the content of Config.xml in a much simpler way: string xml = Resource.Config;
I don't know how Visual Studio handles .resx files internally, but I doubt it simply copies the resource into the .resx file (in which case you'd end up with duplicated resources). I assume it doesn't use reflection internally either, so why not simply use .resx files in situations like this? That seems much more performance-friendly to me.

but what's the benefit of using reflection to retrieve an embedded resource
The common benefit is the one behind any reason to convert data from one format to another: speed, speed, speed - and convenience.
XML is a pretty decent format to keep your resources stored in. You'll have a very good guarantee that you can still retrieve the original resource 10 years from now, when the original got lost in the fog of time and a couple of machine changes without good backups. But it is quite a sucky format to have to read from: XML is very verbose, and locating a fragment requires reading from the start of the file.
Problems that disappear when Resgen.exe compiles the .resx file into a .resources file - a binary format that's fit to be linked into your assembly metadata and contains the original bytes of the resource. And it is directly mapped into memory when your assembly is loaded; no need to find another file, open it, read it and convert the data. Big difference.
Do use the Resource Designer to avoid having to use GetManifestResourceStream() directly. Yet more convenience.
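Side by side, the two access styles from the question look like this. MyAssembly and the Config resource name are assumptions carried over from the question, and the designer-generated property only exists once Config.xml has been added to a .resx file via the Resource Designer:

```csharp
using System.IO;
using System.Reflection;

public static class ConfigReader
{
    // Manual route: the manifest resource name is
    // "<default namespace>.<file name>", and you do the stream
    // handling and decoding yourself.
    public static string ViaManifestStream()
    {
        using (Stream stream = Assembly.GetExecutingAssembly()
                   .GetManifestResourceStream("MyAssembly.Config.xml"))
        using (var reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }

    // Designer route: the generated Resources class hides the same
    // ResourceManager lookup behind a strongly-typed property.
    // (Requires the designer-generated Properties.Resources class.)
    public static string ViaDesigner()
    {
        return Properties.Resources.Config;
    }
}
```

Either way the bytes come out of the same linked-in resource blob; the designer route just spares you the magic string and the stream plumbing.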

Related

How can I get the FileVersion information of an executable stored in memory as an array of bytes?

I need to read the FileVersion of an executable. The problem is that I don't have an actual file on disk, only an array of bytes. The FileVersion API only has a GetVersionInfo(string fileName) method, it doesn't have any method to grab the version from the file.
I tried looking into the source with a decompiler, but it looks more complicated than a simple copy/paste can handle.
Is there any way to read the FileVersion of a file, given that I have the bytes of file contents, without writing the file to disk?
Unfortunately, after lots of digging and looking into the .NET Core source, available JavaScript libraries, etc., I decided that this is not easily (if at all) possible and not worth the pain to figure out.
Instead, I went with the simpler approach of:
Write the file to disk in some sort of temporary location
Read the version from the file on disk
Delete the file
The code is simple and straightforward .NET code, no fancy tricks or complicated JS libraries required.
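The temp-file approach above could be sketched like this (the byte array comes from wherever you hold the executable in memory; the helper name is just an assumption):

```csharp
using System;
using System.Diagnostics;
using System.IO;

public static class VersionReader
{
    // Writes the bytes to a temp file, reads the Win32 file version
    // via FileVersionInfo, and always deletes the file afterwards.
    public static string GetFileVersion(byte[] executableBytes)
    {
        string tempPath = Path.GetTempFileName();
        try
        {
            File.WriteAllBytes(tempPath, executableBytes);
            return FileVersionInfo.GetVersionInfo(tempPath).FileVersion;
        }
        finally
        {
            File.Delete(tempPath);
        }
    }
}
```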
This works for me. It might be worth noting that my byteArray comes from a .NET assembly that is an embedded resource.
Dim assembly = System.Reflection.Assembly.Load(byteArray)
Dim currentVersion = assembly.GetName.Version

How to use strongly-typed satellite assembly?

In my particular scenario, different users have different requirements of what the text of messages, labels, etc are, even though there's no language change. ie: the language always remains en-US.
Currently, I have all my string resources in an internal resources file and I use strongly-typed access in my code.
To move the string resources to a satellite assembly, I'm following this MSDN article. So far I've managed to create a .resources file and the corresponding satellite assembly. In that article, the example for getting a string resource uses GetString() instead of strongly-typed access.
So how do I tell the app to use a different satellite assembly without losing the ability to use strongly-typed access?
It may not be the ideal solution, but our approach to this problem was simply to have the resource keys as public static strings, and this list is the canonical source for key value names.
So while they are just string values, it's easy to verify in code review that names aren't being simply free-formed in, and an acceptance test can verify that all of the keys which should be present in the source file do exist.
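A minimal sketch of that convention; the key names and the wrapper class are assumptions, not part of the original answer:

```csharp
using System.Globalization;
using System.Resources;

// Canonical list of key names. Code review checks that nothing
// bypasses this class with a free-formed string, and an acceptance
// test can verify every key exists in the source .resources file.
public static class ResourceKeys
{
    public const string SaveButtonLabel = "SaveButtonLabel";
    public const string WelcomeMessage = "WelcomeMessage";
}

public class Messages
{
    private readonly ResourceManager _resources;

    public Messages(ResourceManager resources)
    {
        _resources = resources;
    }

    // Falls back to the key itself so a missing entry is visible
    // in the UI instead of throwing.
    public string Get(string key)
    {
        return _resources.GetString(key, CultureInfo.CurrentUICulture) ?? key;
    }
}
```

Swapping satellite assemblies then only changes which ResourceManager is handed to Messages; the call sites keep compile-time-checked key names.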

Roslyn: Access XAML of a partial class from code analyzer

The context: we are currently using a solution where all localizable strings live in XAML files, which are translated. For translating strings in code, we use a function that searches the associated resource dictionary:
MessageBox.Show(this.i18n("my message"));
I would like to implement a code analyzer that will check if the "my message" string is actually declared in associated XAML file. The problem is that I can't find anything in compilation context that would lead me to the correct XAML file.
If the resource management is outside of the scope for Roslyn I could use DTE Interface for my task but I would like to know if there are better solutions for it.
Roslyn exposes an AdditionalFiles mechanism where you can specify some additional files to be passed into your analyzer which you need the content of. XAML files for what you're doing would be a perfect example. We have one Roslyn analyzer that we run on Roslyn itself that verifies that the types we have in our API match an additional file (called PublicAPI.Shipped.txt). If you look at this as a sample it'll show you how to read in extra files.
This doesn't give you any help at interpreting the files (you'll need to parse them yourself), but this at least gives you the mechanism to get the contents of them. We'll take care of all the mucking around reading the file from disk and everything for you.
You still have to specify that you actually want the files to be included in the AdditionalFiles list in the first place. If you look here you can see that you can specify an MSBuild item group name that will get passed through everything.
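A sketch of the AdditionalFiles side of such an analyzer; the diagnostic ID and the XAML-key checking are assumptions, only the AdditionalFiles plumbing is the mechanism described above:

```csharp
using System.Collections.Immutable;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Diagnostics;
using Microsoft.CodeAnalysis.Text;

[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class XamlStringAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        "XAML001", "Undeclared localized string",
        "String '{0}' is not declared in the associated XAML file",
        "Localization", DiagnosticSeverity.Warning, isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        context.EnableConcurrentExecution();
        context.RegisterCompilationStartAction(startContext =>
        {
            // AdditionalFiles contains whatever the project lists as
            // <AdditionalFiles> items -- the XAML files in this scenario.
            foreach (AdditionalText xaml in startContext.Options.AdditionalFiles
                         .Where(f => f.Path.EndsWith(".xaml")))
            {
                SourceText text = xaml.GetText(startContext.CancellationToken);
                // Roslyn won't interpret XAML for you: parse 'text' with
                // your own parser to collect the declared resource keys,
                // then register a syntax-node action that checks i18n()
                // call arguments against that set and reports Rule.
            }
        });
    }
}
```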

C# localization [duplicate]

This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
C# localization, really confusing me
Could someone please share their localization steps for huge C# applications?
I'm pretty sure that the basic resource-based strategy might work when talking about small to medium projects.
However, if we speak about large products, this approach should be paired up with custom build steps and some 3rd party applications used specifically by linguists.
So, could you please advise / share some global localization strategy that is used in your applications (big enough, obviously :)
Thank you.
The basic resource-based strategy works even in large enterprise applications. It is a built-in and easily understandable solution, so each and every programmer can use it without a problem.
The only problem is, you need to somehow transform your resource files into Translation Memory files (e.g. TMX) and back - so that translators can use their standard tools.
So what you need is actually a localization process. Is it different for large applications? Well, if you set up a correct process, it will scale. Now, onto the process. From my point of view it should look like this:
Copy resource files into an appropriate folder structure (Localization Engineers should not work directly with the application code base). The folder structure should look something like:
[Project Name]
├── [neutral]
├── [German]
├── [Japanese]
└── [French]
(each folder contains translatable resources in the given language; neutral is usually English)
Of course you would need to transform your code base into folder structure somehow, but this could be automated.
Process your translatable resources and create transkits - zip archives containing the files that need to be translated (in this case it seems like all of them). The files should probably be transformed, so you won't end up sending out resx files. The transformation application should read the contents of the resx files and put the translatable strings into some file format agreed with the translators (it could simply be Excel, but I wouldn't recommend that solution). Now, I can't give you the names of such tools; although I know some commercial applications exist, I have only worked with custom ones.
Send transkits to the translators (most likely translation vendors).
Upon receiving the translated files (transkit) back, you need to verify them (this step is crucial). You need to ensure that the transkit is complete (i.e. no translatable strings are missing) and technically correct (i.e. the file encoding is correct, usually UTF-8 or UTF-16). Also, it is at least good to glance at the file to see whether there are any strange characters like ½ or ¾ - this usually means broken encoding.
Import your transkit. This is the reverse of step 2 - you need to put the translated strings back into the appropriate files.
Copy translated files back to the original code base and run "Localization" build.
Test your application for localization problems (i.e. overlapping controls, clipped strings, incorrect encoding, etc. - these usually mean that i18n was not done right).
Fix Localization/Internationalization (Localizability) defects.
Proceed to step 1 until the UI/string freeze period. This assumes the translators use a Translation Memory of some kind and won't charge you (or will charge less) for re-translating previously translated strings.
Automate all possible steps and you're done.
Apart from that, you might want to establish a common glossary of terms and do a linguistic review of the translated content.
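The transkit-creation step (reading resx contents and emitting a translator-friendly file) could be sketched like this; the tab-separated output is just an assumption, a real transkit would use whatever format is agreed with the vendor:

```csharp
using System.Collections;
using System.IO;
using System.Resources;

public static class TranskitExporter
{
    // Reads every string entry from a .resx file and writes
    // key<TAB>value lines for the translators.
    // ResXResourceReader lives in System.Windows.Forms.dll.
    public static void ExportStrings(string resxPath, string outputPath)
    {
        using (var reader = new ResXResourceReader(resxPath))
        using (var writer = new StreamWriter(outputPath))
        {
            foreach (DictionaryEntry entry in reader)
            {
                if (entry.Value is string s)
                    writer.WriteLine($"{entry.Key}\t{s}");
            }
        }
    }
}
```

The import step is the mirror image: read the translated lines back and write them into the per-language resx files.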
I think you can rely heavily on the resource framework provided by .NET with a few modifications to make it more appropriate for large projects, namely to build and maintain resources independently of the application and to eliminate the generated properties that refer to each resource by name. If there are other goals appropriate for large project localization that aren't addressed below, please describe them so I can consider them too.
Create a stand-alone project to represent your resources that can be loaded as a separate DLL.
Add a "Resources" file to your project by selecting the link on the Resources tab of the project properties: "This project does not contain a default resources file. Click here to create one."
Add another resource with the same root name to represent another language, for example "Resource.de.resx" for German. (Visual Studio apparently uses the filename to determine the language that the resource file represents). Move it to the same directory/folder as the default Resources file. (Repeat for every language.)
In the properties of the Resources.resx file, delete "ResXFileCodeGenerator" from the Custom Tool property to prevent default code generation in the Application's "Properties" namespace. (Repeat for every language.)
Explicitly/manually declare your own resource manager that loads the newly created resources with a line like:
static System.Resources.ResourceManager resourceMan =
    new System.Resources.ResourceManager(
        "LocalizeDemo.Properties.Resources", typeof(Resources).Assembly);
Implement a file that can be generated that contains a list of all the resources you can refer to (see figure 1)
Implement a function to retrieve and format strings (see figure 2).
Now you have enough that you can refer to translated strings from any number of applications (see figure 3).
Use System.Resources.ResXResourceWriter (from System.Windows.Forms.dll) or System.Resources.ResourceWriter (System.dll) to generate the resources instead of having the Resx files be your primary source. In our project, we have an SQL database that defines all of our strings in each language and part of our build process generates all the Resx files before building the resources project.
Now that you can generate your Resx files from any format, you can use any format you want (in our case an SQL database, which we export to and import from Excel spreadsheets) to provide files to send out to translators.
Also notice that the translated resources are building as satellite DLLs. You could conceivably build each language independently with the right command line tools. If that is part of your question (how to do that) let me know. But for the moment, I'll assume you know about that since you already mentioned custom build steps.
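The generation step mentioned above (building resx files from your own store instead of hand-editing them) might look like this; the key/value source is an assumption, in our case it was an SQL database:

```csharp
using System.Collections.Generic;
using System.Resources;

public static class ResxGenerator
{
    // Writes one .resx file from whatever store holds the strings.
    // ResXResourceWriter lives in System.Windows.Forms.dll; for
    // binary .resources files use System.Resources.ResourceWriter
    // from System.dll with the same Add/Generate pattern.
    public static void WriteResx(
        string path, IEnumerable<KeyValuePair<string, string>> strings)
    {
        using (var writer = new ResXResourceWriter(path))
        {
            foreach (var pair in strings)
                writer.AddResource(pair.Key, pair.Value);
            writer.Generate();
        }
    }
}
```

Run once per language (Resources.resx, Resources.de.resx, ...) as a pre-build step and the rest of the pipeline stays standard .NET.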
Figure 1 - enum identifying all available resources:
namespace MyResources
{
    public enum StrId
    {
        Street,
        ....
    }
}
Figure 2 - Code to load and return formatted resource strings:
namespace MyResources
{
    public class Resources
    {
        static System.Resources.ResourceManager resourceMan =
            new System.Resources.ResourceManager("MyResources.Properties.Resources",
                typeof(Resources).Assembly);

        public static string GetString(StrId name,
            System.Globalization.CultureInfo culture = null, params string[] substitutions)
        {
            if (culture == null) culture = System.Threading.Thread.CurrentThread.CurrentUICulture;
            string format = resourceMan.GetString(name.ToString(), culture);
            if (format != null)
            {
                return string.Format(format, substitutions);
            }
            return name.ToString();
        }
    }
}
Figure 3 - accessing resources:
using System;
using MyResources;

namespace LocalizationDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            System.Threading.Thread.CurrentThread.CurrentUICulture =
                new System.Globalization.CultureInfo("de-DE");
            Console.WriteLine(Resources.GetString(StrId.Street));
        }
    }
}

Is there a XML to LINQ Generator?

I have an XML file that I want to base some unit tests off of. Currently I load the XML file from disk in the class initialize method. I would rather have this XML generated in the test instead of reading the file from disk. Are there any utilities that will automatically generate the LINQ to XML code to generate a given XML file?
Or are there better ways to do this? Is loading from disk OK for unit tests?
I would embed the XML file directly into the assembly - no need for a string resource or anything like that, just include it as an embedded resource (create a file, go to the properties in Visual Studio, and select "Embedded Resource").
Then you can read it using Assembly.GetManifestResourceStream, load the XML from that as you would any other stream, and you're away.
I've used this technique several times - it makes it a lot easier to see the data you're interested in.
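A sketch of that technique; "MyTests.TestData.xml" stands in for whatever manifest name your project actually produces (default namespace plus the embedded file's name):

```csharp
using System.IO;
using System.Reflection;
using System.Xml.Linq;

public static class TestData
{
    // Loads the embedded XML test fixture straight from the test
    // assembly - no disk access, no configuration.
    public static XDocument LoadXml()
    {
        Assembly assembly = Assembly.GetExecutingAssembly();
        using (Stream stream =
            assembly.GetManifestResourceStream("MyTests.TestData.xml"))
        {
            return XDocument.Load(stream);
        }
    }
}
```

A test can then call TestData.LoadXml() and query the document with LINQ to XML as it would any other XDocument.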
It's probably better to use a resource file, for example a .resx file where you put the XML as a string resource. That's fast enough for a unit test and you don't have to do any magic. Reading from disk is not OK for various reasons (speed, the need for configuration, etc.).