C# generics code-bloat - should I be worried? - c#

I'd like to ask something about generics.
I am trying to keep the code simple, and thus I will be making a single class to handle load/save for a game's savegame files. As each portion of the game has different requirements I'd like to keep this as easily accessible as possible:
public void Load<T>(string path, out T obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (FileStream file = File.Open(Application.persistentDataPath + path, FileMode.Open))
    {
        obj = (T)bf.Deserialize(file);
    }
}
Now I can call this with a simple
TurnData x; s.Load("test.txt", out x);
The alternative would be to make the Load function return the object and then convert it to a TurnData type.
TurnData x = (TurnData)s.Load("test.txt");
I do not know much about C#. I assume that the code inside using(...) { ... } does not get executed if there is an error opening the file, for example? If someone could confirm this, that would be nice. The example code I have seen did not have any error handling, which seemed weird to me, which is why I added the using block.
So the alternative version, where the function returns the object instead of using an out parameter, would need more complicated code for error checking and might have to return null? That doesn't seem great.
So the real question is: can I use the version I have here, or are there concerns I should have about the use of generics?

There is no generic code bloat for reference types - the compiled code is shared and reused. With value types, though, the CLR will generate a separate method body for each type. See
.NET Generics and Code Bloat.
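A minimal sketch of what that means in practice (SaveSystem and Roundtrip are made-up names standing in for your class and Load<T>; the comments describe what the JIT does, not anything you have to write):
public class SaveSystem
{
    // One generic method in source code...
    public T Roundtrip<T>(T value)
    {
        return value;
    }
}
// ...but at run time:
//   Roundtrip<string> and Roundtrip<object> share ONE compiled native body (reference types)
//   Roundtrip<int> and Roundtrip<double> each get their OWN specialized native body (value types)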

The using statement has nothing to do with error handling. With the File.Open method you can expect to get the exceptions you will find here. You can keep any such exception from abruptly stopping your program by wrapping your using statement in a try/catch construct, like below:
public T Load<T>(string path)
{
    T obj = default(T);
    var bf = new BinaryFormatter();
    try
    {
        using (var file = File.Open(Application.persistentDataPath + path, FileMode.Open))
        {
            obj = (T)bf.Deserialize(file);
        }
    }
    catch (Exception exception)
    {
        // Log the exception
    }
    return obj;
}
Essentially you attempt to open the file specified by the path. If that fails, you just log the failure and return default(T) from the function, which is null for reference types.
Regarding the using statement, it provides "a convenient syntax that ensures the correct use of IDisposable objects", as you can read more thoroughly here.
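Concretely, the compiler expands your using block into roughly this try/finally pattern (a simplified sketch of the Load<T> body above), which is why the stream gets closed even when Deserialize throws:
BinaryFormatter bf = new BinaryFormatter();
FileStream file = File.Open(Application.persistentDataPath + path, FileMode.Open);
try
{
    obj = (T)bf.Deserialize(file);
}
finally
{
    // Dispose runs whether or not the body threw.
    if (file != null)
        ((IDisposable)file).Dispose();
}
Note that if File.Open itself throws, nothing inside the block runs at all, which answers the original question about whether the body executes when the open fails.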
As a side note regarding the signature of your method, I would make a few comments. Consider the following method body and spot the differences from the one above.
public T Load<T>(string path, IFormatter formatter)
{
    if (path == null) throw new ArgumentNullException(nameof(path));
    if (formatter == null) throw new ArgumentNullException(nameof(formatter));

    T obj = default(T);
    try
    {
        using (var file = File.Open(path, FileMode.Open))
        {
            obj = (T)formatter.Deserialize(file);
        }
    }
    catch (Exception exception)
    {
        // Log the exception
    }
    return obj;
}
and
var path = Path.Combine(Application.persistentDataPath, "test.txt");
var binaryFormatter = new BinaryFormatter();
var x = s.Load(path, binaryFormatter);
Making the above changes makes your method easier to test with a unit test, and more reliable, since you do some precondition checking before the meat and potatoes of the method. What would have happened if you had passed a null path? What would have happened if you had passed a null formatter? And so on.
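For example, the precondition checks become trivial to cover with a test (a rough sketch using NUnit; SaveSystem and TurnData stand in for whatever class hosts Load<T> and for your data type):
[Test]
public void Load_throws_when_path_is_null()
{
    var s = new SaveSystem();

    // The guard clause should fire before any file or formatter work happens.
    Assert.Throws<ArgumentNullException>(
        () => s.Load<TurnData>(null, new BinaryFormatter()));
}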

Related

Having some trouble deleting a file using FileStreams in C#

I'm writing a program that uses text files in C#.
I use a parser class as an interface between the file structure and the program.
This class contains a StreamReader, a StreamWriter and a FileStream. I use the FileStream as a common stream for the reader and the writer, else these two will conflict when both of them have the file open.
The parser class has a class variable called m_path, which is the path to the file. I've checked it extensively, and the path is correct. OpenStreams() and ResetStreams() work perfectly; however, after calling CloseStreams() in the delete() function, the program goes to the catch clause, so File.Delete(m_path) never gets executed. In other situations the CloseStreams() function works perfectly. It goes wrong when I'm trying to close the StreamWriter (m_writer): it throws an exception ("File is already closed").
/**
 * Function to close the streams.
 */
private void closeStreams() {
    if (m_streamOpen) {
        m_fs.Close();
        m_reader.Close();
        m_writer.Close(); // Goes wrong
        m_streamOpen = false;
    }
}
/**
 * Deletes the file.
 */
public int delete() {
    try {
        closeStreams(); // Catch after this
        File.Delete(m_path);
        return 0;
    }
    catch { return -1; }
}
I call the function like this:
parser.delete();
Could anybody give me some tips?
Your File.Delete(m_path); will never be called, because you get an exception here:
private void closeStreams() {
    if (m_streamOpen) {
        m_fs.Close();
        m_reader.Close();
        m_writer.Close(); // throws an exception here
        m_streamOpen = false;
    }
}
The exception is "Cannot access a closed file"
The cause is explained in the documentation of Close() in StreamReader:
Closes the System.IO.StreamReader object and the underlying stream, and releases any system resources associated with the reader.
There are also some articles about this behaviour:
Does disposing streamreader close the stream?
Is there any way to close a StreamWriter without closing its BaseStream?
Can you keep a StreamReader from disposing the underlying stream?
Avoiding dispose of underlying stream
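One takeaway from those articles: on .NET 4.5 and later, StreamReader and StreamWriter have constructor overloads with a leaveOpen flag, so closing them no longer closes the shared FileStream (a sketch under that assumption):
m_fs = new FileStream(m_path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None);
m_reader = new StreamReader(m_fs, Encoding.UTF8, true, 1024, leaveOpen: true);
m_writer = new StreamWriter(m_fs, Encoding.UTF8, 1024, leaveOpen: true);

// Closing the reader/writer now flushes them but leaves m_fs open,
// so the FileStream itself is only closed once, explicitly:
m_writer.Close();
m_reader.Close();
m_fs.Close();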
You should consider rewriting your code to use using() statements.
However, I experimented a bit with your code, and it worked when calling Close() in a different order:
m_writer.Close();
m_reader.Close();
m_fs.Close();
However, I assume that this only works by coincidence (I used .NET 4.0, and it probably will not work in another .NET version). I would strongly advise against doing it this way.
I tested this:
using (FileStream fs = new FileStream(m_path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None))
using (StreamReader reader = new StreamReader(fs))
using (StreamWriter writer = new StreamWriter(fs))
{
    // do some work here
}
File.Delete(m_path);
But, I know that this may not be for you, since you may want the read and write streams available as fields in your class.
At least, you have some samples to start with ...
File.Delete should work; either you didn't call your delete method, or m_path is an invalid path.

c# binary serialization "unexpected binary element: 225"

I am trying to save my custom class using binary serialization in c#. I have the serialization working fine (as far as I can tell, I get no errors) but when I try to deserialize it I get the following error:
SerializationException: Unexpected binary element: 255
Does anybody have any ideas as to what could be causing this? I'm using C# with the Unity game engine.
EDIT:
Here's the code, it's inside the class for my editor window.
public static LevelMetaData LoadAllLevels()
{
    string filepath = Path.Combine(Application.dataPath, "Levels/meta.dat");
    BinaryFormatter serializer = new BinaryFormatter();
    if (File.Exists(filepath))
    {
        using (StreamReader sr = new StreamReader(filepath))
        {
            return (LevelMetaData)serializer.Deserialize(sr.BaseStream);
        }
    }
    else
    {
        LevelMetaData temp = new LevelMetaData();
        using (StreamWriter sw = new StreamWriter(filepath))
        {
            serializer.Serialize(sw.BaseStream, temp);
        }
        return temp;
    }
}
EDIT 2:
Here's the LevelMetaData class:
[Serializable]
public class LevelMetaData
{
    public LevelMetaData()
    {
        keys = new List<string>();
        data = new List<LevelController>();
    }

    public LevelController this[string key]
    {
        get
        {
            if (keys.Contains(key))
                return data[keys.IndexOf(key)];
            else
                throw new KeyNotFoundException("Level with key \"" + key + "\" not found.");
        }
    }

    public void Add(string key, LevelController level)
    {
        if (!keys.Contains(key))
        {
            keys.Add(key);
            data.Add(level);
        }
    }

    public bool Contains(string key)
    {
        return keys.Contains(key);
    }

    public List<string> keys;
    public List<LevelController> data;
}
After some sleep and some more googling I found that I should be using the FileStream class instead of StreamReader and StreamWriter. Changing this makes it work.
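For anyone hitting the same thing, the fix amounts to roughly this (a sketch of the same method rewritten with FileStream; everything else stays as it was):
public static LevelMetaData LoadAllLevels()
{
    string filepath = Path.Combine(Application.dataPath, "Levels/meta.dat");
    BinaryFormatter serializer = new BinaryFormatter();
    if (File.Exists(filepath))
    {
        // Read the raw bytes directly; no text reader buffering in the way.
        using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
        {
            return (LevelMetaData)serializer.Deserialize(fs);
        }
    }

    LevelMetaData temp = new LevelMetaData();
    using (FileStream fs = new FileStream(filepath, FileMode.Create, FileAccess.Write))
    {
        serializer.Serialize(fs, temp);
    }
    return temp;
}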
I got this same error, but it was due to a corrupt file created by the binary formatter serialization method. In my case, I observed that I had a JIT compilation error when serializing. This was somehow creating a file, but with no correct data in it, so whenever I tried to deserialize I got this "unexpected binary element: 255" message.
Please, read this if you also find this behaviour on your serialization method.
It all reduces to using this line in the MonoBehaviour that uses the formatter:
// Forces a different code path in the BinaryFormatter that doesn't rely on run-time code generation (which would break on iOS).
Environment.SetEnvironmentVariable("MONO_REFLECTION_SERIALIZER", "yes");
For a further explanation of why you need this, please visit the referenced link.
If you add the code above and it still doesn't seem to help, you MUST uninstall the app from your device.
That's because of the data corruption in the file you attempted to create in the first place.
I had this issue, and that solved the problem.

Serialize and Deserialize Oracle.DataAccess.OracleException in C#

OracleException has no public constructors nor any way to get a new instance. I tried my XmlSerializerHelper class, but it requires a public parameterless constructor.
I used BinaryFormatter to serialize the OracleException and wrote it to a file.
How can I serialize an OracleException to a file, and deserialize it too, using XmlSerializer, for testing reasons?
Reference:
http://geekswithblogs.net/WillSmith/archive/2008/07/25/testing-oracleexception.aspx
PS: Which is better, SoapFormatter or BinaryFormatter?
Code
SerializationHelper.Serialize(@"C:\Temp\ExcepcionOracle.bin", ex);
var exOra = SerializationHelper.Deserialize(@"C:\Temp\ExcepcionOracle.bin");

public static void Serialize(string fileName, Object obj)
{
    var binaryFormatter = new BinaryFormatter();
    var fileStream = new FileStream(fileName, FileMode.Create);
    try
    {
        binaryFormatter.Serialize(fileStream, obj);
    }
    catch (SerializationException ex)
    {
        throw new ApplicationException("The object graph could not be serialized", ex);
    }
    finally
    {
        fileStream.Close();
    }
}

public static object Deserialize(string fileName)
{
    var binaryFormatter = new BinaryFormatter();
    var fileStream = new FileStream(fileName, FileMode.Open);
    try
    {
        fileStream.Seek(0, SeekOrigin.Begin);
        return binaryFormatter.Deserialize(fileStream);
    }
    catch (SerializationException ex)
    {
        throw new ApplicationException("Serialization Exception: " + ex.Message);
    }
    finally
    {
        fileStream.Close();
    }
}
Things like Exception simply aren't very suitable for XML serializers (and XmlSerializer in particular). In addition to the constructor issues (which some serializers can work around, and some can't), you are also likely to get issues with unexpected subclasses and arbitrary data in the Data collection.
If you are serializing as XML, you should probably just capture the key information you need - maybe the .Message and a few other things. Note also that in a client/server application the client doesn't really need to know much of the particulars of the failure - that should remain at the server. Either it is an expected error (invalid parameters, login issues, quota restrictions, etc.), or it is an unexpected error. In the latter case, just say an unexpected error happened. The details would only be useful to a developer, and a developer should already have access to the server's error log.
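A minimal sketch of that idea: copy the handful of fields you actually need into a plain class that XmlSerializer is happy with (the class name and the chosen fields here are arbitrary; add OracleException.Number or anything else you care about):
public class ErrorInfo
{
    // Public parameterless constructor and public properties: all XmlSerializer needs.
    public string Message { get; set; }
    public string StackTrace { get; set; }

    public ErrorInfo() { }

    public static ErrorInfo From(Exception ex)
    {
        return new ErrorInfo { Message = ex.Message, StackTrace = ex.StackTrace };
    }
}

// Usage:
// var serializer = new XmlSerializer(typeof(ErrorInfo));
// using (var stream = File.Create(@"C:\Temp\ExcepcionOracle.xml"))
//     serializer.Serialize(stream, ErrorInfo.From(ex));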

How to save a human readable file

Currently I have an application that reads and writes several properties from one or two basic classes to a .txt file using the binary serializer.
I've opened up the .txt file in NotePad and as it's formatted for the application it's not very readable to the human eye, not for me anyway =D
I've heard of using XML but pretty much most of my searches seem to overcomplicate things.
The kind of data I'm trying to save is simply a collection of "Person.cs" classes, nothing more than a name and address - all private strings, but with properties, and marked as Serializable.
What would be the best way to actually save my data in a way that can be easily read by a person? It would also make it easier to make small changes to the application's data directly in the file instead of having to load it, change it and save it.
Edit:
I have added the current way I am saving and loading my data; my _userCollection is as it suggests, and nUser/nMember are integers.
#region I/O Operations
public bool SaveData()
{
    try
    {
        //Open the stream using the Data.txt file
        using (Stream stream = File.Open("Data.txt", FileMode.Create))
        {
            //Create a new formatter
            BinaryFormatter bin = new BinaryFormatter();
            //Copy data in collection to the file specified earlier
            bin.Serialize(stream, _userCollection);
            bin.Serialize(stream, nMember);
            bin.Serialize(stream, nUser);
            //Close stream to release any resources used
            stream.Close();
        }
        return true;
    }
    catch (IOException ex)
    {
        throw new ArgumentException(ex.ToString());
    }
}

public bool LoadData()
{
    //Check if file exists, otherwise skip
    if (File.Exists("Data.txt"))
    {
        try
        {
            using (Stream stream = File.Open("Data.txt", FileMode.Open))
            {
                BinaryFormatter bin = new BinaryFormatter();
                //Copy data back into collection fields
                _userCollection = (List<User>)bin.Deserialize(stream);
                nMember = (int)bin.Deserialize(stream);
                nUser = (int)bin.Deserialize(stream);
                stream.Close();
                //Sort data to ensure it is ordered correctly after being loaded
                _userCollection.Sort();
                return true;
            }
        }
        catch (IOException ex)
        {
            throw new ArgumentException(ex.ToString());
        }
    }
    else
    {
        //Console.WriteLine present for testing purposes
        Console.WriteLine("\nLoad failed, Data.txt not found");
        return false;
    }
}
Replace your BinaryFormatter with XmlSerializer and run the same exact code.
The only change you need to make is that BinaryFormatter takes an empty constructor, while for XmlSerializer you need to declare the type in the constructor:
XmlSerializer serializer = new XmlSerializer(typeof(Person));
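For a collection like yours, the whole save roughly becomes (a sketch; it assumes User is public and has a parameterless constructor, which XmlSerializer requires):
var serializer = new XmlSerializer(typeof(List<User>));

// Save
using (Stream stream = File.Open("Data.xml", FileMode.Create))
{
    serializer.Serialize(stream, _userCollection);
}

// Load
using (Stream stream = File.Open("Data.xml", FileMode.Open))
{
    _userCollection = (List<User>)serializer.Deserialize(stream);
}
Note that a single XML file holds one root element, so the nMember/nUser counters would need to be wrapped into the same object (or written to a separate file) rather than serialized one after another into the same stream.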
Using XmlSerializer is not really complicated. Have a look at this MSDN page for an example: http://msdn.microsoft.com/en-us/library/system.xml.serialization.xmlserializer.aspx
You could implement your own PersonsWriter, that takes a StreamWriter as constructor argument and has a Write method that takes an IList<Person> as input to parse out a nice text representation.
For example:
public class PersonsWriter : IDisposable
{
    private StreamWriter _wr;

    public PersonsWriter(StreamWriter writer)
    {
        this._wr = writer;
    }

    public void Write(IList<Person> people)
    {
        foreach (Person dude in people)
        {
            _wr.Write("{0} {1}\n{2}\n{3} {4}\n\n",
                dude.FirstName,
                dude.LastName,
                dude.StreetAddress,
                dude.ZipCode,
                dude.City);
        }
    }

    public void Dispose()
    {
        _wr.Flush();
        _wr.Dispose();
    }
}
YAML is another option for human-readable markup that is also easy to parse. There are libraries available for C# as well as almost all other popular languages (see the sketch after the sample below). Here's a sample of what YAML looks like:
invoice: 34843
date   : 2001-01-23
bill-to: &id001
    given  : Chris
    family : Dumars
    address:
        lines: |
            458 Walkman Dr.
            Suite #292
        city   : Royal Oak
        state  : MI
        postal : 48046
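For completeness, a sketch of reading and writing YAML from C#; the YamlDotNet library and its builder API are my own assumption here, not something the question requires:
// Serialize the collection to a YAML file.
var serializer = new YamlDotNet.Serialization.SerializerBuilder().Build();
File.WriteAllText("Data.yaml", serializer.Serialize(_userCollection));

// Read it back into strongly typed objects.
var deserializer = new YamlDotNet.Serialization.DeserializerBuilder().Build();
var users = deserializer.Deserialize<List<User>>(File.ReadAllText("Data.yaml"));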
Frankly, as a human, I don't find XML to be all that readable. In fact, it's not really designed to be read by humans.
If you want a human readable format, then you have to build it.
Say you have a Person class that has a First Name, a Last Name and an SSN as properties. Create your file, and have it write out 3 lines, with a description of the field in the first fifty characters (random number off the top of my head), and then have the value start at character 51.
This will produce a file that looks like:
First Name-------Stephen
Last Name -------Wrighton
SSN -------------XXX-XX-XXXX
Then, reading it back in, your program would know where the data begins on each line, and what each line is for (the program would know that Line 3 is the SSN value).
But remember, to truly gain human readability, you sacrifice data portability.
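A rough sketch of that layout, following the 17-character label column actually used in the example above rather than the 50 mentioned in passing (FirstName, LastName and SSN are assumed Person properties):
const int LabelWidth = 17;

// Writing: pad each label with dashes so every value starts at the same column.
using (var writer = new StreamWriter("person.txt"))
{
    writer.WriteLine("First Name".PadRight(LabelWidth, '-') + person.FirstName);
    writer.WriteLine("Last Name ".PadRight(LabelWidth, '-') + person.LastName);
    writer.WriteLine("SSN ".PadRight(LabelWidth, '-') + person.SSN);
}

// Reading: slice each line at the known column.
string[] lines = File.ReadAllLines("person.txt");
string firstName = lines[0].Substring(LabelWidth);
string lastName  = lines[1].Substring(LabelWidth);
string ssn       = lines[2].Substring(LabelWidth);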
Try the DataContractSerializer
It serializes objects to XML and is very easy to use
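A minimal sketch of that, assuming the same Person class (DataContractSerializer lives in System.Runtime.Serialization and works with [DataContract]/[DataMember] attributes, or with plain [Serializable] types):
var serializer = new DataContractSerializer(typeof(List<Person>));

// Write the whole list out as XML.
using (var stream = File.Create("Data.xml"))
{
    serializer.WriteObject(stream, people);
}

// Read it back.
using (var stream = File.OpenRead("Data.xml"))
{
    var loaded = (List<Person>)serializer.ReadObject(stream);
}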
Write a CSV reader/writer if you want a good compromise between human- and machine-readable in a Windows environment.
It loads into Excel too.
There's a discussion about it here:
http://knab.ws/blog/index.php?/archives/3-CSV-file-parser-and-writer-in-C-Part-1.html
EDIT
That is a C# article... it just confusingly has "C" in the URL.
I really think you should go with XML (look into DataContractSerializer). It's not that complicated. You could probably even just replace the BinarySerializer with XmlSerializer and go.
If you still don't want to do that, though, you can write a delimited text file. Then you'll have to write your own reader method (although it could almost just use the Split method; see the reader sketch after the code below).
//Inside the Person class:
public override string ToString()
{
    List<String> propValues = new List<String>();
    // Get the type.
    Type t = this.GetType();
    // Cycle through the properties.
    foreach (PropertyInfo p in t.GetProperties())
    {
        propValues.Add(String.Format("{0}:={1}", p.Name, p.GetValue(this, null)));
    }
    return String.Join(",", propValues.ToArray());
}
using (System.IO.TextWriter tw = new System.IO.StreamWriter("output.txt"))
{
    tw.WriteLine(person.ToString());
}
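Reading a line like that back is then mostly a matter of splitting twice (a sketch; it assumes property values themselves contain no commas or ":=" sequences):
// Turn "Prop1:=Value1,Prop2:=Value2" back into name/value pairs.
string line = File.ReadAllLines("output.txt")[0];
var values = new Dictionary<string, string>();

foreach (string pair in line.Split(','))
{
    string[] parts = pair.Split(new[] { ":=" }, StringSplitOptions.None);
    values[parts[0]] = parts[1];
}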

will a mocked interface still serialize to a file?

I'm trying to unit test saving of a file. I have an interface that defines a document, and I pass a concrete object that implements that interface to the Save method, and it works in practice, but I'm trying to unit test it to ensure that it will always work (and I'm desperately trying to catch up on the unit tests after a period of 'crunch time').
My save method is pretty simple, it works like so:
public Boolean SaveDocument(IDocument document)
{
    BinaryFormatter bFormatter = new BinaryFormatter();
    FileStream fs = null;
    try
    {
        if (!Directory.Exists(folderName))
            Directory.CreateDirectory(folderName);

        String path = Path.Combine(Path.Combine(folderName, document.FileName), document.Extension);
        using (fs = new FileStream(path, FileMode.OpenOrCreate))
        {
            bFormatter.Serialize(fs, document);
        }
    }
    catch (IOException ioex)
    {
        LOG.Write(ioex.Message);
        return false;
    }
    return true;
}
and my test is:
[Test]
public void can_save_a_document()
{
    String testDirectory = "C:\\Test\\";
    m_DocumentHandler = new DocumentHandler(testDirectory);

    DynamicMock mock = new DynamicMock(typeof(IDocument));
    mock.ExpectAndReturn("get_FileName", "Test_File");
    mock.ExpectAndReturn("get_Extension", ".TST");

    m_DocumentHandler.SaveDocument(mock.MockInstance as IDocument);

    try
    {
        Assert.IsTrue(Directory.Exists(testDirectory), "Directory was not created");
        String[] filesInTestDir = Directory.GetFiles(testDirectory);
        Assert.AreEqual(1, filesInTestDir.Length, "there is " + filesInTestDir.Length.ToString() + " files in the folder, instead of 1");
        Assert.AreEqual(Path.GetFileName(filesInTestDir[0]), "Test_File.TST");
    }
    finally
    {
        Directory.Delete(testDirectory);
        Assert.IsFalse(Directory.Exists(testDirectory), "folder was not cleaned up");
    }
}
I'm aware that serializing an interface preserves the concrete data, but will the mocked interface serialize?
The BinaryFormatter uses the actual type of the object passed in - not the interface - when it serializes the data. So internally it will write something like Type:MyLib.Objects.MyObj,MyLib when you pass a real object and Type:Moq.ConcreteProxy,Moq etc. when you pass a mock object.
Using the BinaryFormatter for persistence is going to get you in trouble either way as you have to deal with versioning and memory layout differences between releases. You would be much better off establishing a well defined format for your document and writing the objects and fields manually.
If you need to test how serialization works with a class, then create some real classes for the test and use those. Mocking is useful for testing interactions between collaborating objects.
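For instance, a tiny serializable fake written just for the test sidesteps the proxy problem entirely (a sketch; it assumes IDocument only exposes FileName and Extension, so add whatever else the real interface declares):
[Serializable]
public class FakeDocument : IDocument
{
    public string FileName { get; set; }
    public string Extension { get; set; }
}

// In the test, instead of the DynamicMock:
// var doc = new FakeDocument { FileName = "Test_File", Extension = ".TST" };
// m_DocumentHandler.SaveDocument(doc);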
