Lazy, stream-driven object serialization with protobuf-net - C#

We are developing a WCF service for streaming a large amount of data, so we have chosen to combine WCF streaming with protobuf-net serialization.
Context:
The general idea is to serialize objects in the service, write them into a stream and send them.
On the other end the caller receives a Stream object and can read all of the data.
So currently the service method looks something like this:
public Result TestMethod(Parameter parameter)
{
// Create response
var responseObject = new BusinessResponse { Value = "some very large data"};
// The response has to be serialized in advance into an intermediate MemoryStream
var stream = new MemoryStream();
serializer.Serialize(stream, responseObject);
stream.Position = 0;
// ResultBody is a stream, Result is a MessageContract
return new Result {ResultBody = stream};
}
The BusinessResponse object is serialized to a MemoryStream, which is then returned from the method.
On the client side the calling code looks like this:
var parameter = new Parameter();
// Call the service method
var methodResult = channel.TestMethod(parameter);
// protobuf-net deserializer reads from a stream received from a service.
// while reading is performed by protobuf-net,
// on the service side WCF is actually reading from a
// memory stream where serialized message is stored
var result = serializer.Deserialize<BusinessResponse>(methodResult.ResultBody);
return result;
So when serializer.Deserialize() is called, it reads from the methodResult.ResultBody stream; at the same time, on the service side, WCF is reading the MemoryStream that was returned from TestMethod.
Problem:
What we would like to achieve is to get rid of the MemoryStream and of serializing the whole object up front on the service side.
Since we use streaming, we would like to avoid keeping the serialized object in memory before sending it.
Idea:
The perfect solution would be to return from TestMethod() an empty, custom-made Stream object holding a reference to the object that is to be serialized (the BusinessResponse object in my example).
Then, when WCF calls the Read() method of my stream, I would internally serialize a piece of the object using protobuf-net and return it to the caller without keeping it all in memory.
And here is the problem: what we actually need is the ability to serialize an object piece by piece, at the moment the stream is read.
I understand that this is a totally different way of serializing - instead of pushing an object into a serializer, I'd like to request the serialized content piece by piece.
Is that kind of serialization somehow possible using protobuf-net?

I cooked up some code that is probably along the lines of Marc's gate idea.
public class PullStream : Stream
{
    private byte[] internalBuffer;
    // Signalled when Write has published a buffer for the reader.
    private readonly ManualResetEvent dataAvailable = new ManualResetEvent(false);
    // Signalled when the reader has consumed the buffer and a new Write may proceed.
    private readonly ManualResetEvent dataEmpty = new ManualResetEvent(true);

    public override bool CanRead
    {
        get { return true; }
    }

    public override bool CanSeek
    {
        get { return false; }
    }

    public override bool CanWrite
    {
        get { return true; }
    }

    public override void Flush()
    {
        // Nothing to do: data is handed over synchronously in Write.
    }

    public override long Length
    {
        get { throw new NotSupportedException(); }
    }

    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        dataAvailable.WaitOne();
        if (count >= internalBuffer.Length)
        {
            // The caller takes everything we have; unblock the writer.
            var retVal = internalBuffer.Length;
            Array.Copy(internalBuffer, 0, buffer, offset, retVal);
            internalBuffer = null;
            dataAvailable.Reset();
            dataEmpty.Set();
            return retVal;
        }
        else
        {
            // Hand over part of the buffer and keep the rest for the next Read.
            Array.Copy(internalBuffer, 0, buffer, offset, count);
            internalBuffer = internalBuffer.Skip(count).ToArray(); // i know
            return count;
        }
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        throw new NotSupportedException();
    }

    public override void SetLength(long value)
    {
        throw new NotSupportedException();
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        dataEmpty.WaitOne();
        dataEmpty.Reset();
        internalBuffer = new byte[count];
        Array.Copy(buffer, offset, internalBuffer, 0, count);
        Debug.WriteLine("Writing some data");
        dataAvailable.Set();
    }

    public void End()
    {
        // Publish an empty buffer so the next Read returns 0 (end of stream).
        dataEmpty.WaitOne();
        dataEmpty.Reset();
        internalBuffer = new byte[0];
        Debug.WriteLine("Ending writes");
        dataAvailable.Set();
    }
}
This is a simple Stream descendant that only implements Read and Write (and End). Read blocks while no data is available, and Write blocks while previously written data has not yet been consumed, so only one byte buffer is involved at a time. The LINQ copying of the remainder is open for optimization ;-) The End method is added so that a Read does not block forever once no more data will be written.
You have to write to this stream from a separate thread. I show this below:
// create a large object
var obj = new List<ToSerialize>();
for(int i = 0; i <= 1000; i ++)
obj.Add(new ToSerialize { Test = "This is my very loooong message" });
// create my special stream to read from
var ms = new PullStream();
new Thread(x =>
{
ProtoBuf.Serializer.Serialize(ms, obj);
ms.End();
}).Start();
var buffer = new byte[100];
// stream to write back to (just to show deserialization is working too)
var ws = new MemoryStream();
int read;
while ((read = ms.Read(buffer, 0, 100)) != 0)
{
ws.Write(buffer, 0, read);
Debug.WriteLine("read some data");
}
ws.Position = 0;
var back = ProtoBuf.Serializer.Deserialize<List<ToSerialize>>(ws);
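To tie this back to the original service method, a rough sketch of how PullStream might be used inside TestMethod could look like the code below. It assumes the binding uses streamed transfer, reuses the serializer instance from the question, and leaves out error handling (if WCF never drains the stream, the queued work item would block):
public Result TestMethod(Parameter parameter)
{
    var responseObject = new BusinessResponse { Value = "some very large data" };
    var pullStream = new PullStream();
    // Serialize on a background thread; WCF pulls the bytes through Read()
    // as it sends the streamed response, so the full serialized payload is
    // never held in memory at once.
    ThreadPool.QueueUserWorkItem(_ =>
    {
        serializer.Serialize(pullStream, responseObject);
        pullStream.End();
    });
    return new Result { ResultBody = pullStream };
}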
I hope this solves your problem :-) It was fun to code this anyway.
Regards, Jacco

Related

Hash a file as it's being received

End goal:
Users are uploading a large number of files of different sizes to my web site, and I don't want duplicate files on the disk.
The solution I have been using is a simple SHA1 hash of the file once it has been uploaded, with code like this:
public static string HashFile(string FileName)
{
using (FileStream stream = File.OpenRead(FileName))
{
SHA1Managed sha = new SHA1Managed();
byte[] checksum = sha.ComputeHash(stream);
string sendCheckSum = BitConverter.ToString(checksum).Replace("-",string.Empty);
return sendCheckSum;
}
}
This "works" fine for smaller files, but it's a big pain when the file is 30 GB. So I would like to hash the file as I'm receiving it from the client. I get the file from the client in "chunks", and the size of a chunk is not always the same.
The code that receives the file:
int chunk = context.Request["chunk"] != null ? int.Parse(context.Request["chunk"]) : 0;
int chunks = context.Request["chunks"] != null ? int.Parse(context.Request["chunks"]) : 0;
string fileName = context.Request["name"] != null ? context.Request["name"] : string.Empty;
HttpPostedFile fileUpload = context.Request.Files[0];
string fullFilePath = Path.Combine(SiteSettings.UploadTempFolder, fileName);
using (var fs = new FileStream(fullFilePath, chunk == 0 ? FileMode.Create : FileMode.Append))
{
var buffer = new byte[fileUpload.InputStream.Length];
fileUpload.InputStream.Read(buffer, 0, buffer.Length);
fs.Write(buffer, 0, buffer.Length);
// Here I want the hash, while I have the file data in memory.
}
You can always create your own stream :)
public class ActionStream : Stream
{
private readonly Stream _innerStream;
private readonly Action<byte[], int, int> _readAction;
public ActionStream(Stream innerStream, Action<byte[], int, int> readAction)
{
_innerStream = innerStream;
_readAction = readAction;
}
public override bool CanRead => true;
public override bool CanSeek => false;
public override bool CanWrite => false;
public override long Length => _innerStream.Length;
public override long Position
{
get { return _innerStream.Position; }
set { throw new NotSupportedException(); }
}
public override void Flush() { }
public override int Read(byte[] buffer, int offset, int count)
{
var bytesRead = _innerStream.Read(buffer, offset, count);
_readAction(buffer, offset, bytesRead);
return bytesRead;
}
public override long Seek(long offset, SeekOrigin origin)
{
throw new NotSupportedException();
}
protected override void Dispose(bool disposing)
{
if (disposing)
{
_innerStream.Dispose();
}
base.Dispose(disposing);
}
public override void SetLength(long value) { throw new NotSupportedException(); }
public override void Write(byte[] buffer, int offset, int count)
{
throw new NotSupportedException();
}
}
This allows you to bind together the two stream operations you're doing:
using (var fs = new FileStream(path, chunk == 0 ? FileMode.Create : FileMode.Append))
{
var actionStream = new ActionStream(fileUpload.InputStream,
(buffer, offset, bytesRead) =>
{
// Every block the hasher reads is also written straight to disk.
fs.Write(buffer, offset, bytesRead);
});
var sha = new SHA1Managed();
// Hashing drives the read, which in turn appends the data to the file.
var checksum = sha.ComputeHash(actionStream);
}
This assumes that SHA1Managed reads through every single byte of the input stream in order - you should check that. I'm pretty sure that is how it works, though :)
This is a cut and paste from:
Compute a hash from a stream of unknown length in C#
MD5, like other hash functions, does not require two passes.
To start:
HashAlgorithm hasher = ..;
hasher.Initialize();
As each block of data arrives:
byte[] buffer = ..;
int bytesReceived = ..;
hasher.TransformBlock(buffer, 0, bytesReceived, null, 0);
To finish and retrieve the hash:
hasher.TransformFinalBlock(new byte[0], 0, 0);
byte[] hash = hasher.Hash;
This pattern works for any type derived from HashAlgorithm, including MD5CryptoServiceProvider and SHA1Managed.
HashAlgorithm also defines a method ComputeHash which takes a Stream object; however, this method will block the thread until the stream is consumed. Using the TransformBlock approach allows an "asynchronous hash" that is computed as data arrives without using up a thread.
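As a concrete illustration (my own example, not part of the quoted answer), this is how a single stream can be hashed incrementally in fixed-size chunks. The chunked upload case works the same way, except the HashAlgorithm instance would have to be kept alive across chunk requests, with TransformBlock called once per received chunk and TransformFinalBlock called after the last one:
public static string HashStreamIncrementally(Stream input)
{
    using (HashAlgorithm hasher = new SHA1Managed())
    {
        byte[] buffer = new byte[81920];
        int bytesRead;
        // Feed each chunk into the hash as it becomes available.
        while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            hasher.TransformBlock(buffer, 0, bytesRead, null, 0);
        }
        // Finalize with an empty block, then read the computed hash.
        hasher.TransformFinalBlock(new byte[0], 0, 0);
        return BitConverter.ToString(hasher.Hash).Replace("-", string.Empty);
    }
}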

Redirect console output to readable memory not to file?

I was using this code to redirect my console output to a file and then read and display it. I want to move away from using files because I'm polluting my folders with those console files. How can I do this in memory? I don't want any files polluting the system. Maybe I'm trying something weird here. I just want 1 thread to read the console output of the very same application:
1 application
multiple threads write to console
1 thread reads from console
My working file code:
private StreamWriter currentOut = null;
private void RedirectConsole()
{
currentOut = new StreamWriter(new FileStream(filename,
FileMode.Create, FileAccess.Write, FileShare.Read));
currentOut.AutoFlush = true;
Console.SetOut(currentOut);
ThreadPool.QueueUserWorkItem(o => { Listen(); });
}
private void Listen()
{
StreamReader fileIn = new StreamReader(new FileStream(filename,
FileMode.Open, FileAccess.Read, FileShare.ReadWrite));
while (true)
{
try
{
if (!fileIn.EndOfStream)
{
string a = fileIn.ReadLine();
MessageBox.Show(a);
}
Thread.Sleep(25);
}
catch { }
}
}
This seems to be what I want, but I'm unable to implement it (help?). A file is like a buffer: you write to it from one end and read from the other. I need the same thing in memory.
Try:
private StreamWriter currentOut = null;
private MemoryStream ms = new MemoryStream();
private void RedirectConsole()
{
currentOut = new StreamWriter(ms);
currentOut.AutoFlush = true;
Console.SetOut(currentOut);
ThreadPool.QueueUserWorkItem(o => { Listen(); });
}
private void Listen()
{
StreamReader fileIn = new StreamReader(ms);
// ...
}
The problem with using a MemoryStream is that the reader and the writer share a single position, so the read position does not advance independently of the write position. Pipes (the System.IO.Pipes namespace) are a better choice for a temporary buffer where the read position needs to advance independently of the write position. Admittedly, this more or less does exactly what your working solution does, but it removes the need to implement the buffer yourself.
class ConsoleRedirector : IDisposable
{
private TextWriter originalOut = Console.Out;
private AnonymousPipeServerStream consoleOutServerPipe;
private StreamWriter currentOut;
public ConsoleRedirector()
{
this.consoleOutServerPipe = new AnonymousPipeServerStream(PipeDirection.Out);
this.currentOut = new StreamWriter(this.consoleOutServerPipe);
this.currentOut.AutoFlush = true;
Console.SetOut(this.currentOut);
ThreadPool.QueueUserWorkItem(o => { this.Listen(); });
}
private void Listen()
{
AnonymousPipeClientStream consoleOutClientPipe = new AnonymousPipeClientStream(PipeDirection.In, this.consoleOutServerPipe.ClientSafePipeHandle);
using (StreamReader fileIn = new StreamReader(consoleOutClientPipe))
{
// ...
}
}
public void Dispose()
{
this.currentOut.Dispose();
Console.SetOut(this.originalOut);
}
}
I ended up writing a derived stream class and replacing the FileStream with my own stream. I probably should have avoided that, but since I couldn't find a working solution it was also good practice. Something like this:
public class MyStream: Stream
{
private byte[] internalBuffer = new byte[4096];
// ...
public override int Read(byte[] buffer, int offset, int count)
{
// used by StreamReader
}
public override void Write(byte[] buffer, int offset, int count)
{
// used by StreamWriter
}
}
Override all the other members and handle multi-threading while enlarging internalBuffer and disposing of data that has already been passed on.
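For reference, below is a minimal sketch of a lock-based variant of that idea (here called InMemoryConsoleBuffer, a name of my choosing), using a Queue<byte> instead of a manually grown array. The reading thread should simply call ReadLine and block, rather than polling EndOfStream, because this stream never reports end-of-stream:
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

public class InMemoryConsoleBuffer : Stream
{
    // FIFO byte buffer shared between the writing and the reading thread.
    private readonly Queue<byte> internalBuffer = new Queue<byte>();
    private readonly object sync = new object();

    public override int Read(byte[] buffer, int offset, int count)
    {
        lock (sync)
        {
            // Block the reader until the writer has produced something.
            while (internalBuffer.Count == 0)
                Monitor.Wait(sync);
            int read = 0;
            while (read < count && internalBuffer.Count > 0)
                buffer[offset + read++] = internalBuffer.Dequeue();
            return read;
        }
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        lock (sync)
        {
            for (int i = 0; i < count; i++)
                internalBuffer.Enqueue(buffer[offset + i]);
            // Wake up a reader blocked in Read.
            Monitor.PulseAll(sync);
        }
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return true; } }
    public override void Flush() { }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
}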

Serial Ports - There is an error in XML document (5, 3870)

Here I have a class I've created for the purpose of sending text-files and images over a serial port:
public class SendItem
{
private byte[] bytes;
private string _fileName;
private string _extension;
private int bytesSin;
public string Extension
{
get { return _extension; }
set { _extension = value; }
}
public string FileName
{
get { return _fileName; }
set { _fileName = value; }
}
public byte[] Bytes
{
get { return bytes; }
set { bytes = value; }
}
public SendItem()
{
}
public void SendFile(SerialPort serialPort1)
{
if (serialPort1.IsOpen)
{
OpenFileDialog OFDialog = new OpenFileDialog();
OFDialog.Title = "Open File";
OFDialog.Filter = "Text Files (*.txt)" + "|*.txt|All files (*.*)|*.*";
OFDialog.InitialDirectory = @"C:\";
bool? userClickedOK = OFDialog.ShowDialog();
if (userClickedOK == true)
{
serialPort1.DiscardInBuffer();
string chosenFile = OFDialog.FileName;
string ext = Path.GetExtension(OFDialog.FileName);
_extension = ext;
byte[] data = File.ReadAllBytes(chosenFile);
SendItem newSendItem = new SendItem();
newSendItem._extension = ext;
newSendItem._fileName = System.IO.Path.GetFileNameWithoutExtension(chosenFile);
newSendItem.bytes = data;
byte[] view = newSendItem.Serialize();
string test = Convert.ToBase64String(data);
serialPort1.Write(newSendItem.Serialize(), 0, newSendItem.Serialize().Length);
//serialPort1.Write(data, 0, data.Length);
}
}
}
public override string ToString()
{
return string.Format("File: {0}{1}",_fileName, _extension);
}
}
There are quite a few redundant things in the SendFile method, but I ask that they be ignored in favour of the following issue I keep having.
Whenever I send a text file or just plain chat text from a textbox, the following block of code executes without triggering the catch:
void _sPort_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
SerialPort sp = (SerialPort)sender;
byte[] data = new byte[sp.BytesToRead];
string message = string.Empty;
sp.Read(data, 0, data.Length);
try
{
SendItem receivedObject = data.Deserialize<SendItem>();
File.WriteAllBytes(@"d:\" + receivedObject.FileName + receivedObject.Extension, receivedObject.Bytes);
message = "File has been received.";
sp.Write("File sent.");
}
catch (Exception exp)
{
errors = exp.Message;
message = Encoding.UTF8.GetString(data);
}
App.Current.Dispatcher.Invoke(new Action(() => _response.Add("Friend: " + message)));
}
The problem comes in when I try to send an image... It triggers the catch and gives the exception "There is an error in XML document (5, 3870)".
The serializer and deserializer I've written are as follows:
public static byte[] Serialize<T>(this T source)
{
XmlSerializer serializer = new XmlSerializer(typeof(T));
MemoryStream stream = new MemoryStream();
serializer.Serialize(stream, source);
byte[] buffer = stream.GetBuffer();
stream.Close();
return buffer;
}
public static T Deserialize<T>(this byte[] source)
{
XmlSerializer serializer = new XmlSerializer(typeof(T));
MemoryStream stream = new MemoryStream(source);
T result = (T)serializer.Deserialize(stream);
stream.Close();
return result;
}
Can anyone point out where I'm making my mistake? I've been debugging it for ages and I can't wrap my head around it.
EDIT:
I am also including the serialized data for an image I tried to send. After looking at it, it appears my byte array might be too big for the serial port - is there any way to adjust this, or to allow the serial port to take all of the data sent over?
[Several kilobytes of Base64-encoded JPEG data omitted.]
It may or may not be the whole problem, but this is definitely a problem:
byte[] buffer = stream.GetBuffer();
That's quite possibly returning more data than you want - because it's not limiting itself to the length of the stream. MemoryStream has the handy MemoryStream.ToArray method to simplify this:
public static byte[] Serialize<T>(this T source)
{
XmlSerializer serializer = new XmlSerializer(typeof(T));
MemoryStream stream = new MemoryStream();
serializer.Serialize(stream, source);
return stream.ToArray();
}
(There's no benefit from closing a MemoryStream that you've created like this - and if you did feel you needed to, you should use a using statement.)
Likewise your Deserialize method can be simplified to:
public static T Deserialize<T>(this byte[] source)
{
XmlSerializer serializer = new XmlSerializer(typeof(T));
return (T) serializer.Deserialize(new MemoryStream(source));
}
Given that you were effectively trying to deserialize the XML document and then probably a load of "blank" data, that may well be the issue.
EDIT: Now there's a second problem:
byte[] data = new byte[sp.BytesToRead];
string message = string.Empty;
sp.Read(data, 0, data.Length);
You're assuming that all of the data is ready to read in a single go. That won't be true if there's more data than the buffer size of the serial port. Instead, if you're trying to put multiple "messages" onto what is effectively a stream, you'll need to either indicate the end of a message (and keep reading until you find that) or write the length of the message before the message itself - then when reading, read that length and then make sure you read that many bytes.
This is a standard issue when you try to use a stream-oriented protocol as a message-oriented protocol.
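For illustration, here is a rough sketch of the length-prefix approach using the Serialize/Deserialize extensions from the question. The receiveBuffer field and the 4-byte little-endian prefix are my own choices, and in real code access to receiveBuffer would need to be synchronized, because DataReceived fires on a worker thread:
// Sender: prefix the serialized payload with its length so the receiver
// knows how many bytes belong to one message.
byte[] data = newSendItem.Serialize();
serialPort1.Write(BitConverter.GetBytes(data.Length), 0, 4);
serialPort1.Write(data, 0, data.Length);

// Receiver: accumulate bytes and only deserialize once a whole message has arrived.
private readonly List<byte> receiveBuffer = new List<byte>();

void _sPort_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    SerialPort sp = (SerialPort)sender;
    byte[] chunk = new byte[sp.BytesToRead];
    int read = sp.Read(chunk, 0, chunk.Length);
    for (int i = 0; i < read; i++)
        receiveBuffer.Add(chunk[i]);

    // Keep extracting messages while a complete one is buffered.
    while (receiveBuffer.Count >= 4)
    {
        int length = BitConverter.ToInt32(receiveBuffer.GetRange(0, 4).ToArray(), 0);
        if (receiveBuffer.Count < 4 + length)
            break; // wait for the rest of this message
        byte[] payload = receiveBuffer.GetRange(4, length).ToArray();
        receiveBuffer.RemoveRange(0, 4 + length);
        SendItem receivedObject = payload.Deserialize<SendItem>();
        File.WriteAllBytes(@"d:\" + receivedObject.FileName + receivedObject.Extension, receivedObject.Bytes);
    }
}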
You should use ToArray() instead of GetBuffer().

Load File from SDCard

I integrated this EPUB Reader into my project and it is working fine. Now I want to load the file from the SD card instead of the device's isolated storage.
To open a file from isolated storage we have IsolatedStorageFileStream, like this:
IsolatedStorageFileStream isfs;
using (IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication())
{
try
{
isfs = isf.OpenFile([Path to file], FileMode.Open);
}
catch
{
return;
}
}
ePubView.Source = isfs;
For a file on the SD card I tried this:
ExternalStorageDevice sdCard = (await ExternalStorage.GetExternalStorageDevicesAsync()).FirstOrDefault();
// If the SD card is present, get the file from the SD card.
if (sdCard != null)
{
    ExternalStorageFile sdFile = await sdCard.GetFileAsync(_sdFilePath);
    // _sdFilePath is a string holding the path of the file on the SD card
    // Create a stream for the file.
    Stream fileStream = await sdFile.OpenForReadAsync();
    // Read the file data.
    ePubView.Source = fileStream;
}
Here I am getting a System.IO.EndOfStreamException.
If you want to try it, here is my project sample link.
Question: how can I give my file as the source to the ePubView control?
Is this the proper way? Please give me a suggestion regarding this.
Thanks
Although I've not tried your approach, I cannot say exactly where the error is (maybe the file from the SD card is read asynchronously and thus you get EndOfStream; also keep in mind that, as it says on the EPUB Reader site, it's under heavy development). Check whether you can use the file after copying it to isolated storage. In this case I would first try copying from the SD card to a MemoryStream, like this:
ExternalStorageDevice sdCard = (await ExternalStorage.GetExternalStorageDevicesAsync()).FirstOrDefault();
if (sdCard != null)
{
MemoryStream newStream = new MemoryStream();
using (ExternalStorageFile file = await sdCard.GetFileAsync(_sdFilePath))
using (Stream SDfile = await file.OpenForReadAsync())
newStream = await ReadToMemory(SDfile);
ePubView.Source = newStream;
}
And ReadToMemory:
private async Task<MemoryStream> ReadToMemory(Stream streamToRead)
{
MemoryStream targetStream = new MemoryStream();
const int BUFFER_SIZE = 1024;
byte[] buf = new byte[BUFFER_SIZE];
int bytesread = 0;
while ((bytesread = await streamToRead.ReadAsync(buf, 0, BUFFER_SIZE)) > 0)
{
targetStream.Write(buf, 0, bytesread);
}
targetStream.Position = 0; // rewind so the consumer reads from the beginning
return targetStream;
}
Maybe it will help.
There's a bug with the stream returned from ExternalStorageFile. There are two options to get around it...
If the file is small then you can simply copy the stream to a MemoryStream:
Stream s = await file.OpenForReadAsync();
MemoryStream ms = new MemoryStream();
s.CopyTo(ms);
ms.Position = 0; // rewind the copy before handing it to the reader
However, if the file is too large you'll run into memory issues, so the following stream wrapper class can be used to correct Microsoft's bug (though in future versions of Windows Phone you'll need to disable this fix once the bug has been fixed):
using System;
using System.IO;
namespace WindowsPhoneBugFix
{
/// <summary>
/// Stream wrapper to circumnavigate buggy Stream reading of stream returned by ExternalStorageFile.OpenForReadAsync()
/// </summary>
public sealed class ExternalStorageFileWrapper : Stream
{
private Stream _stream; // Underlying stream
public ExternalStorageFileWrapper(Stream stream)
{
if (stream == null)
throw new ArgumentNullException("stream");
_stream = stream;
}
// Workaround described here - http://stackoverflow.com/a/21538189/250254
public override long Seek(long offset, SeekOrigin origin)
{
ulong uoffset = (ulong)offset;
ulong fix = ((uoffset & 0xffffffffL) << 32) | ((uoffset & 0xffffffff00000000L) >> 32);
return _stream.Seek((long)fix, origin);
}
public override bool CanRead
{
get { return _stream.CanRead; }
}
public override bool CanSeek
{
get { return _stream.CanSeek; }
}
public override bool CanWrite
{
get { return _stream.CanWrite; }
}
public override void Flush()
{
_stream.Flush();
}
public override long Length
{
get { return _stream.Length; }
}
public override long Position
{
get
{
return _stream.Position;
}
set
{
_stream.Position = value;
}
}
public override int Read(byte[] buffer, int offset, int count)
{
return _stream.Read(buffer, offset, count);
}
public override void SetLength(long value)
{
_stream.SetLength(value);
}
public override void Write(byte[] buffer, int offset, int count)
{
_stream.Write(buffer, offset, count);
}
}
}
Code is available here to drop into your project:
https://github.com/gavinharriss/ExternalStorageFileWrapper-wp8
Example of use:
ExternalStorageFile file = await device.GetFileAsync(filename); // device is an instance of ExternalStorageDevice
Stream streamOriginal = await file.OpenForReadAsync();
ExternalStorageFileWrapper streamToUse = new ExternalStorageFileWrapper(streamOriginal);
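From there, presumably the wrapped stream can be handed to the control the same way as the isolated-storage stream in the question (assuming ePubView.Source accepts any readable Stream):
// Hypothetical final step, mirroring the question's isolated-storage example.
ePubView.Source = streamToUse;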

C# POCO to a Stream/Byte[] and process while reading with a Silverlight 4 client over WCF

For me, one of the annoying parts of Silverlight is that it doesn't allow you to return a Stream directly, since Silverlight will turn it into a Byte[], even when using TransferMode.StreamedResponse. Using that technique it is possible to process something like a FileStream, in a streaming fashion, from a Silverlight client (see code below).
However, I would like to do something similar with really large C# POCOs. If I could receive a Stream I could use something like XML serialization on both ends and process the records as they were read. However, since the return value has to be a Byte[] I have to read to the end of the array before I can process it. This method does at least allow me to provide some feedback to the user.
I came up with two possible solutions. One would be to write my own serializer, or at least deserializer, that processes data as it is being read (I am thinking delimited text). The other would be some type of variable-bit Stream class, which I believe would require a steep learning curve on my part. I am looking for any ideas or links on how to process values in a Byte[] as they are read.
Service Contract:
public static class Action {
public const string val = "http://tempuri.org/stream";
}
// Share contract between client and server
[ServiceContract]
public interface IFileService {
#if SILVERLIGHT
[OperationContract(AsyncPattern = true, Action = Action.val, ReplyAction = Action.val)]
IAsyncResult BeginReadFile(Message request, AsyncCallback callback, object AsyncState);
System.ServiceModel.Channels.Message EndReadFile(IAsyncResult result);
#else
[OperationContract(AsyncPattern = true, Action = Action.val, ReplyAction = Action.val)]
IAsyncResult BeginReadFile(string filePath, AsyncCallback callback, object AsyncState);
System.IO.Stream EndReadFile(IAsyncResult result);
#endif
}
[DataContract(Name = "ReadFile", Namespace = "http://tempuri.org/")]
public class StreamRequest
{
[DataMember(Order = 1)]
public string FileName;
}
Service Implementation:
public IAsyncResult BeginReadFile(string filePath, AsyncCallback callback, object AsyncState) {
return new CompletedAsyncResult<Stream>(File.OpenRead(filePath));
}
public Stream EndReadFile(IAsyncResult result) {
return ((CompletedAsyncResult<Stream>)result).Data;
}
Client:
public void ReadFile( string filePath ) {
BackToUIThread backToUIThread = byteCount => {
this.OnBytesReceived(this, new AsyncEventArgs<object>(byteCount));
};
StreamRequest request = new StreamRequest() { FileName = "C:\\test.txt" };
Message mIn = Message.CreateMessage(
_myFactory.Endpoint.Binding.MessageVersion, Action.val, request);
_myService.BeginReadFile(mIn,
(asyncResult) =>
{
Message mOut = _myService.EndReadFile(asyncResult);
System.Xml.XmlDictionaryReader r = mOut.GetReaderAtBodyContents();
r.ReadToDescendant("ReadFileResult"); // move to stream
r.Read(); // move to content
int bytesRead = 1;
long totalBytesRead = 0;
byte[] buffer = new byte[100];
do {
bytesRead = r.ReadContentAsBase64(buffer, 0, buffer.Length);
totalBytesRead += bytesRead;
if (bytesRead > 0) {
System.Windows.Deployment.Current.Dispatcher.BeginInvoke(
backToUIThread, totalBytesRead);
}
} while (bytesRead > 0);
r.Close();
mOut.Close();
}, null);
}
You could have your Silverlight app pass in the offset and the number of bytes you want read to your service.
You could have something like
int offset = 0;
int length = 1024;
List<byte> bytes = new List<byte>();
_myService.ReadFileComplete += (s, e)=>
{
bytes.AddRange(e.Result.Bytes);
if (e.Result.HasMore)
{
offset += length;
_myService.ReadFileAsync(fileName, offset, length);
}
else
{
DoSomethingWithBytes(bytes);
}
}
_myService.ReadFileAsync(fileName, offset, length);
Then on your WCF Service, just use the Stream.Read method with the given offset and length.
You could improve this by having the service return the next offset to be read.
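For illustration, a matching service operation might look like the sketch below; the FileChunk type, the HasMore flag and the parameter names are my own invention to line up with the client pseudocode above:
[DataContract]
public class FileChunk
{
    [DataMember] public byte[] Bytes { get; set; }
    [DataMember] public bool HasMore { get; set; }
}

public FileChunk ReadFile(string fileName, int offset, int length)
{
    using (FileStream fs = File.OpenRead(fileName))
    {
        // Position the stream at the requested offset and read at most 'length' bytes.
        fs.Seek(offset, SeekOrigin.Begin);
        byte[] buffer = new byte[length];
        int read = fs.Read(buffer, 0, length);
        if (read < length)
            Array.Resize(ref buffer, read);
        return new FileChunk
        {
            Bytes = buffer,
            // More data remains if we have not yet reached the end of the file.
            HasMore = fs.Position < fs.Length
        };
    }
}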
