Hashing serialized data in Silverlight - C#

I have a Silverlight app. This app calls back to a web service (that I've written) to get a list of customers. I'm trying to generate a hash code that represents the list of customers. In an attempt to do this, I have the following code written in my Silverlight app, and in my server-side code:
public static string GetHashCode<T>(T data)
{
    SHA256Managed crypto = new SHA256Managed();
    try
    {
        // Serialize the object
        byte[] serializedBytes;
        using (var memoryStream = new MemoryStream())
        {
            var serializer = new DataContractSerializer(typeof(T));
            serializer.WriteObject(memoryStream, data);
            memoryStream.Seek(0, SeekOrigin.Begin);
            serializedBytes = memoryStream.ToArray();
        }
        byte[] hashed = crypto.ComputeHash(serializedBytes);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < hashed.Length; i++)
            sb.Append(hashed[i].ToString("X2"));
        return sb.ToString();
    }
    catch (Exception)
    {
        return string.Empty;
    }
}
My method on the server-side looks like this:
[OperationContract]
public List<Customer> GetCustomers()
{
    List<Customer> customers = Customer.GetAll();
    string hash = GetHashCode<List<Customer>>(customers);
    return customers;
}
On the client-side, in my Silverlight app, I have the following in my GetCustomersCompleted event handler:
string hash = GetHashCode<List<Customer>>(e.Result.ToList());
Please notice that I am converting the event handler Result to a list first. The reason is that the Result is actually an ObservableCollection, not a List. I bring this up because I believe this is where my problem lies.
My problem is, even though the data is the same, my hash values are different on the server and client-side. This is based on what I'm seeing in the watch window. I have a feeling it has something to do with the serialization that occurs when the results get passed from my server to my app. However, I'm not sure how to get around this. I need a hash of the data, but I can't figure out how to get them to line up. Thank you for any help you can provide.
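One way around serializer-dependent digests (a sketch, not the asker's code) is to hash a canonical, field-by-field encoding of the customers instead of the serializer's output, so the result no longer depends on the concrete collection type or on how DataContractSerializer renders the graph. The Customer fields below are assumed for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

// Hypothetical minimal Customer type, for illustration only.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerHasher
{
    // Hash a canonical, field-by-field encoding instead of the serializer's
    // raw output, so the digest does not depend on the concrete collection
    // type (List vs. ObservableCollection) or on serializer internals.
    public static string Hash(IEnumerable<Customer> customers)
    {
        var sb = new StringBuilder();
        foreach (var c in customers)
            sb.Append(c.Id).Append('|').Append(c.Name).Append('\n');

        // On Silverlight, SHA256.Create() may be unavailable; substitute
        // new SHA256Managed() there.
        using (var sha = SHA256.Create())
        {
            byte[] digest = sha.ComputeHash(Encoding.UTF8.GetBytes(sb.ToString()));
            return BitConverter.ToString(digest).Replace("-", "");
        }
    }
}
```

As long as both sides iterate the customers in the same order, the digest is identical on client and server regardless of which collection type holds them.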

Related

File API seems to always write corrupt files when used in a loop, except for the last file

I know the title is long, but it describes the problem exactly. I didn't know how else to explain it because this is totally out there.
I have a utility written in C# targeting .NET Core 2.1 that downloads and decrypts (AES encryption) files originally uploaded by our clients from our encrypted store, so they can be reprocessed through some of our services in the case that they fail. This utility is run via CLI using database IDs for the files as arguments, for example download.bat 101 102 103 would download 3 files with the corresponding IDs. I'm receiving byte data through a message queue (really not much more than a TCP socket) which describes a .TIF image.
I have a good reason to believe that the byte data is not ever corrupted on the server. That reason is when I run the utility with only one ID parameter, such as download.bat 101, then it works just fine. Furthermore, when I run it with multiple IDs, the last file that is downloaded by the utility is always intact, but the rest are always corrupted.
This odd behavior has persisted across two different implementations for writing the byte data to a file. Those implementations are below.
File.WriteAllBytes implementation:
private static void WriteMessageContents(FileServiceResponseEnvelope envelope, string destination, byte[] encryptionKey, byte[] macInitialVector)
{
    using (var inputStream = new MemoryStream(envelope.Payload))
    using (var outputStream = new MemoryStream(envelope.Payload.Length))
    {
        var sha512 = YellowAesEncryptor.DecryptStream(inputStream, outputStream, encryptionKey, macInitialVector, 0);
        File.WriteAllBytes(destination, outputStream.ToArray());
        _logger.LogStatement($"Finished writing [{envelope.Payload.Length} bytes] to [{destination}].", LogLevel.Debug);
    }
}
FileStream implementation:
private static void WriteMessageContents(FileServiceResponseEnvelope envelope, string destination, byte[] encryptionKey, byte[] macInitialVector)
{
    using (var inputStream = new MemoryStream(envelope.Payload))
    using (var outputStream = new MemoryStream(envelope.Payload.Length))
    {
        var sha512 = YellowAesEncryptor.DecryptStream(inputStream, outputStream, encryptionKey, macInitialVector, 0);
        using (FileStream fs = new FileStream(destination, FileMode.Create))
        {
            var bytes = outputStream.ToArray();
            fs.Write(bytes, 0, envelope.Payload.Length);
            _logger.LogStatement($"File byte content: [{string.Join(", ", bytes.Take(16))}]", LogLevel.Trace);
            fs.Flush();
        }
        _logger.LogStatement($"Finished writing [{envelope.Payload.Length} bytes] to [{destination}].", LogLevel.Debug);
    }
}
This method is called from a for loop which first receives the messages I described earlier and then feeds their payloads to the above method:
using (var requestSocket = new RequestSocket(fileServiceEndpoint))
{
    // Envelopes is constructed beforehand
    foreach (var envelope in envelopes)
    {
        var timer = Stopwatch.StartNew();
        requestSocket.SendMoreFrame(messageTypeBytes);
        requestSocket.SendMoreFrame(SerializationHelper.SerializeObjectToBuffer(envelope));
        if (!requestSocket.TrySendFrame(_timeout, signedPayloadBytes, signedPayloadBytes.Length))
        {
            var message = $"Timeout exceeded while processing [{envelope.ActionType}] request.";
            _logger.LogStatement(message, LogLevel.Error);
            throw new Exception(message);
        }
        var responseReceived = requestSocket.TryReceiveFrameBytes(_timeout, out byte[] responseBytes);
        ...
        var responseEnvelope = SerializationHelper.DeserializeObject<FileServiceResponseEnvelope>(responseBytes);
        ...
        _logger.LogStatement($"Received response with payload of [{responseEnvelope.Payload.Length} bytes].", LogLevel.Info);
        var destDir = downloadDetails.GetDestinationPath(responseEnvelope.FileId);
        if (!Directory.Exists(destDir))
            Directory.CreateDirectory(destDir);
        var dest = Path.Combine(destDir, idsToFileNames[responseEnvelope.FileId]);
        WriteMessageContents(responseEnvelope, dest, encryptionKey, macInitialVector);
    }
}
I also know that TIFs have a very specific header, which looks something like this in raw bytes:
[73, 73, 42, 0, 8, 0, 0, 0, 20, 0...
It always begins with "II" (73, 73) or "MM" (77, 77) followed by 42 (probably a Hitchhiker's reference). I analyzed the bytes written by the utility. The last file always has a header that resembles this one. The rest are always random bytes; seemingly jumbled or mis-ordered image binary data. Any insight on this would be greatly appreciated because I can't wrap my mind around what I would even need to do to diagnose this.
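As an aside, the header rule described above is easy to turn into a quick sanity check when diagnosing this kind of corruption (a sketch; the helper name is mine):

```csharp
using System;

public static class TiffCheck
{
    // Returns true if the buffer starts with a plausible TIFF header:
    // "II" (little-endian) or "MM" (big-endian) followed by the value 42.
    public static bool LooksLikeTiff(byte[] data)
    {
        if (data == null || data.Length < 4)
            return false;
        bool littleEndian = data[0] == 73 && data[1] == 73; // "II"
        bool bigEndian = data[0] == 77 && data[1] == 77;    // "MM"
        if (!littleEndian && !bigEndian)
            return false;
        // 42 is stored in the header's own byte order.
        return littleEndian ? (data[2] == 42 && data[3] == 0)
                            : (data[2] == 0 && data[3] == 42);
    }
}
```

Running a check like this on each file right after decryption would have flagged every corrupted output immediately.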
UPDATE
I was able to figure out this problem with the help of elgonzo in the comments. Sometimes it isn't a direct answer that helps, but someone picking your brain until you look in the right place.
All right, as I suspected this was a dumb mistake (I had severe doubts that the File API was simply this flawed for so long). I just needed help thinking through it. There was an additional bit of code which I didn't post that was biting me, when I was retrieving the metadata for the file so that I could then request the file from our storage box.
byte[] encryptionKey = null;
byte[] macInitialVector = null;
...
using (var conn = new SqlConnection(ConnectionString))
using (var cmd = new SqlCommand(uploadedFileQuery, conn))
{
    conn.Open();
    var reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        FileServiceMessageEnvelope readAllEnvelope = null;
        var originalFileName = reader["UploadedFileClientName"].ToString();
        var fileId = Convert.ToInt64(reader["UploadedFileId"].ToString());
        //var originalFileExtension = originalFileName.Substring(originalFileName.IndexOf('.'));
        //_logger.LogStatement($"Scooped extension: {originalFileExtension}", LogLevel.Trace);
        envelopes.Add(readAllEnvelope = new FileServiceMessageEnvelope
        {
            ActionType = FileServiceActionTypeEnum.ReadAll,
            FileType = FileTypeEnum.UploadedFile,
            FileName = reader["UploadedFileServerName"].ToString(),
            FileId = fileId,
            WorkerAuthorization = null,
            BinaryTimestamp = DateTime.Now.ToBinary(),
            Position = 0,
            Count = Convert.ToInt32(reader["UploadedFileSize"]),
            SignerFqdn = _messengerConfig.FullyQualifiedDomainName
        });
        readAllEnvelope.SignMessage(_messengerConfig.PrivateKeyBytes, _messengerConfig.PrivateKeyPassword);
        signedPayload = new SecureMessage { Payload = new byte[0] };
        signedPayload.SignMessage(_messengerConfig.PrivateKeyBytes, _messengerConfig.PrivateKeyPassword);
        signedPayloadBytes = SerializationHelper.SerializeObjectToBuffer(signedPayload);
        encryptionKey = (byte[])reader["UploadedFileEncryptionKey"];
        macInitialVector = (byte[])reader["UploadedFileEncryptionMacInitialVector"];
    }
    conn.Close();
}
Eagle-eyed observers might realize that I have not properly coupled the encryptionKey and macInitialVector to the correct record, since each file has a unique key and vector. This means I was using the key for one of the files to decrypt all of them which is why they were all corrupt except for one file -- they were not properly decrypted. I solved this issue by coupling them together with the ID in a simple POCO and retrieving the appropriate key and vector for each file upon decryption.
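A minimal sketch of that fix (the type and property names below are assumed, not the poster's actual code): a POCO tying each file ID to its own key and IV, indexed by ID so the decryption step looks up the matching pair instead of reusing whichever key the reader loop saw last:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical POCO coupling each file's ID with its own key and
// MAC initial vector, as described in the fix above.
public class FileCryptoInfo
{
    public long FileId { get; set; }
    public byte[] EncryptionKey { get; set; }
    public byte[] MacInitialVector { get; set; }
}

public static class CryptoLookup
{
    // Index the per-file crypto material by ID so decryption can fetch
    // the correct key/IV for each file.
    public static Dictionary<long, FileCryptoInfo> Index(IEnumerable<FileCryptoInfo> infos)
    {
        var map = new Dictionary<long, FileCryptoInfo>();
        foreach (var info in infos)
            map[info.FileId] = info;
        return map;
    }
}
```

Inside the reader loop, each record's key and IV would be stored in one of these objects, and WriteMessageContents would receive the entry matching the envelope's FileId.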

Deserializing a BinaryFormatter result into another type

We had recently rewritten our system's architecture and ran into a pretty big issue the day we went live.
We save a user's information as they are filling out their forms by serializing the form's model using BinaryFormatter and storing the result in the database:
private byte[] Serialize<T>(T model)
{
    try
    {
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, model);
            return stream.ToArray();
        }
    }
    catch (System.Exception ex)
    {
        _log.LogException(ex);
    }
    return null;
}
When the user needed the information again, we would simply deserialize back into the same type and that worked great:
public T Deserialize<T>(byte[] serializedData)
{
    try
    {
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream(serializedData))
        {
            var result = (T)formatter.Deserialize(stream);
            return result;
        }
    }
    catch (System.Exception ex)
    {
        _log.LogException(ex);
        return default(T);
    }
}
The problem came in after the architecture rewrite: we had 3 clients that still had legacy form data saved. The updated architecture's form models have the same property names, but obviously new namespaces in a new assembly. As a result, formatter.Deserialize fails because it tries to deserialize into a type from an assembly that no longer exists in the new architecture.
Is there any possible way I can somehow deserialize into a generic object and then do a deep copy into the new model (or use automapper or something like that)? Or is the BinaryFormatter tightly tied to object and I would need the dll that contained the type definition that was used to do the serialization in order to do the deserialization?
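One commonly used escape hatch for this situation is a SerializationBinder that redirects the legacy type name onto the new type, assigned to the formatter's Binder property before calling Deserialize. This only works if the serialized field names still line up with the new type's fields. A sketch, with both the legacy name ("Legacy.Forms.FormModel") and the new type invented for illustration:

```csharp
using System;
using System.Runtime.Serialization;

// Hypothetical new-model type standing in for the rewritten architecture.
namespace NewForms
{
    [Serializable]
    public class FormModel
    {
        public string Name;
    }
}

// A binder that maps the old type name onto the new type; everything
// else falls back to default resolution.
public class LegacyFormBinder : SerializationBinder
{
    public override Type BindToType(string assemblyName, string typeName)
    {
        if (typeName == "Legacy.Forms.FormModel")
            return typeof(NewForms.FormModel);
        return Type.GetType($"{typeName}, {assemblyName}");
    }
}

// Usage sketch:
//   var formatter = new BinaryFormatter();
//   formatter.Binder = new LegacyFormBinder();
//   var model = (NewForms.FormModel)formatter.Deserialize(stream);
```

If the field layouts have diverged too far for the binder approach, the fallback is deserializing in a small shim project that still references the old assembly, then mapping across (by hand or with AutoMapper) and re-serializing.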

Send object over a TCP connection

I started Windows Mobile programming today and I have successfully connected to my server.
The application I am making in Visual Studio is not a Universal application, but a Windows Mobile application.
The DataWriter API is used to write data to an output stream; in this application's scenario, the output stream is the socket, i.e.:
DataWriter dw = new DataWriter(clientSocket.OutputStream);
Two of the methods I have been looking at are WriteBytes and WriteBuffer (see the API documentation for DataWriter).
Which method do I use, and why?
How can I convert this class and send it to my server using the methods mentioned above?
public class Message
{
    public string pas { get; internal set; }
    public int type { get; internal set; }
    public string us { get; internal set; }
}

// the code below goes in a separate function
DataWriter dw = new DataWriter(clientSocket.OutputStream);
Message ms = new Message();
ms.type = 1;
ms.us = usernameTextBox.Text;
ms.pas = usernameTextBox.Text;
// TODO: send ms to the server
Between the two methods, WriteBytes() seems like the simpler approach. WriteBuffer() offers you more control over the output buffer/stream, which you can certainly use if and when you need to. But, for all intents and purposes, if you just need to simply open a connection and send it a byte stream, WriteBytes() does the job.
How can I convert this class and send it to my server?
That's entirely up to you, really. What you have to do is define how you're going to "serialize" your class to transmit over the connection (and thereby have to "deserialize" it when the other code receives the data).
There are a few ways to do that, among many others. A straightforward approach (taken from the top answer on that linked question), would be to use the BinaryFormatter class. Something like this:
var ms = new Message();
ms.type = 1;
ms.us = usernameTextBox.Text;
ms.pas = usernameTextBox.Text;

byte[] serializedMessage;
var formatter = new BinaryFormatter();
using (var stream = new MemoryStream())
{
    formatter.Serialize(stream, ms);
    serializedMessage = stream.ToArray(); // note: the stream, not the message
}
// now serializedMessage is a byte array to be sent
Then on the other end you'd need to deserialize it back to an object instance. Which might look something like this:
// assuming you have a variable called serializedMessage as the byte array received
Message ms;
using (var stream = new MemoryStream())
{
    var formatter = new BinaryFormatter();
    stream.Write(serializedMessage, 0, serializedMessage.Length);
    stream.Seek(0, SeekOrigin.Begin);
    ms = (Message)formatter.Deserialize(stream);
}
You can of course abstract these behind a simpler interface. Or if you're looking for any kind of human readability in the serialization you might try something like a JSON serializer and directly convert the string to a byte array, etc.
Edit: Note that this is really just an example of one of many ways to "serialize" an object. And, as pointed out by a user in comments below, there could be drawbacks to using this binary serializer.
You can use any serializer, really. You can even make your own. Technically overriding .ToString() to print all the properties and then calling that is a form of serialization. The concept is always the same... Convert the in-memory object to a transmittable piece of data, from which an identical in-memory object can later be built. (Technically, saving to a database is a form of serialization.)
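As one concrete alternative to BinaryFormatter, here is a hedged sketch of the JSON route using DataContractJsonSerializer. The Message class is the one from the question, except its setters are made public here so the deserializer can populate them; the helper names are mine:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Json;

// Same shape as the question's Message class, with public setters so
// the serializer can populate a deserialized instance.
public class Message
{
    public string pas { get; set; }
    public int type { get; set; }
    public string us { get; set; }
}

public static class MessageCodec
{
    // Serialize the message to a JSON byte array suitable for WriteBytes.
    public static byte[] ToBytes(Message ms)
    {
        var serializer = new DataContractJsonSerializer(typeof(Message));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, ms);
            return stream.ToArray();
        }
    }

    // Rebuild the message from the received byte array.
    public static Message FromBytes(byte[] data)
    {
        var serializer = new DataContractJsonSerializer(typeof(Message));
        using (var stream = new MemoryStream(data))
            return (Message)serializer.ReadObject(stream);
    }
}
```

Unlike the binary formatter, the bytes on the wire are human-readable JSON, and the receiving side does not need the exact same assembly to decode them.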

Serve a complex object with an image using REST

Noob WCF/REST question here!
I need to build a simple web service (one method for now), my preference would be for a REST/JSON type of architecture but I'm not sure this can be achieved in my case.
I know that it's easy enough to serialize/deserialize complex objects into/from JSON while using a REST-based service. Even though I haven't tested it yet, it also looks easy enough for a REST-based service to return an image.
However, my scenario may need to serve a mix of both. Below is an example of an object definition that can be returned:
class Response
{
    string myTitle;
    string myDate;
    Object myImage;
}
I realize I could store the physical image wherever accessible and then simply return the URL as a string but I would like to avoid the overhead as much as possible.
Is it even possible?
Also note that I'm not committed to REST or JSON in any way, it's simply that all the cool kids are using it so...
If you want the image as part of the JSON object, convert it into a serializable type. Easiest way is just to use its byte representation.
System.Drawing.Image img = System.Drawing.Image.FromFile("filename");
byte[] imgContent;
using (System.IO.MemoryStream m = new System.IO.MemoryStream())
{
    img.Save(m, img.RawFormat);
    m.Position = 0; // rewind before reading the stream back
    imgContent = new byte[m.Length];
    const int count = 4096;
    byte[] buffer = new byte[count];
    for (int i = 0; i < m.Length; i += count)
    {
        // Read into the buffer at offset 0, then copy into place.
        int read = m.Read(buffer, 0, (int)Math.Min(count, m.Length - i));
        Array.Copy(buffer, 0, imgContent, i, read);
    }
}
myResponse.myImage = imgContent;
EDIT: As the OP found, there is a much simpler/quicker way to write:
System.Drawing.Image img = System.Drawing.Image.FromFile("filename");
using (System.IO.MemoryStream m = new System.IO.MemoryStream())
{
    img.Save(m, img.RawFormat);
    myResponse.myImage = m.ToArray();
}
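Worth knowing: JSON serializers typically transmit a byte[] as a Base64 string, so the image payload grows by roughly a third on the wire. A small sketch of that encoding (the helper names are mine):

```csharp
using System;

// Roughly what a JSON serializer does with the myImage byte array:
// it travels as a Base64 string and is decoded back on the client.
public static class ImagePayload
{
    public static string Encode(byte[] imgContent) =>
        Convert.ToBase64String(imgContent);

    public static byte[] Decode(string payload) =>
        Convert.FromBase64String(payload);
}
```

If that size overhead matters, serving the image from a separate URL (the option the asker wanted to avoid) keeps the JSON small at the cost of a second request.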

How can I access an existing byte[] in C#?

I have a small issue accessing a byte[]:
I have a binary object (a byte[] saved to an MSSQL db) which I get from the db and want to read. Whenever I access it, for its length or via its Read() method, I get a "Cannot access a closed Stream" exception.
What's the best way to treat binaries if they have to be updated in the code and then saved again to the db?
Thanks.
Edit - code
In this application we convert a test object into a generic data object we've created to simplify things. This is the data object:
public class DataObject
{
    public Stream Content { get; set; }
    public Descriptor Descriptor { get; set; }
}
The descriptor contains metadata only (currently only name and description strings) and is not relevant, I think.
The test object is more complicated; I'll start with the mapping into the data object. The serializer mentioned is NetDataContractSerializer.
public DataObject Map(Test test)
{
    using (var stream = new MemoryStream())
    {
        Serialize(test, stream);
        return new DataObject { Content = stream, Descriptor = test.Descriptor };
    }
}

private void Serialize(Test test, MemoryStream stream)
{
    serializer.WriteObject(stream, test);
    stream.Flush();
    stream.Position = 0;
}
and vice versa:
public Test Build(DataObject data)
{
    using (var stream = data.Content)
    {
        var test = Deserialize(stream);
        test.Descriptor = data.Descriptor;
        return test;
    }
}

private Test Deserialize(Stream stream)
{
    return serializer.ReadObject(stream) as Test;
}
Edit II - trying to change the test's content:
This is my first attempt handling streams, I'm not sure I'm doing it right, so I'll explain first what I want to do: The information in data field should be saved into the test's data object.
private static void UpdateTestObject(DataObject data, Test test)
{
    var testData = new byte[data.Content.Length];
    data.Content.Read(testData, 0, (int)data.Content.Length);
    test.TestObject = testData;
}
The exception is thrown in UpdateTestObject when accessing data.Content. I get it after creating some test, mapping it, and trying to save it.
data.Content.Read(testData, 0, (int) data.Content.Length);
Here we go: the data object has a Content stream that is closed.
Result: error.
The reason? Totally unrelated to serialization itself. Basically, find out why and where your data handling closes the stream.
It could be a design flaw in which the stream is no longer available after a certain point, and your usage of the object happens past that point.
So the problem is caused by the Map() method - as far as I could understand, since it used:
using (var stream = new MemoryStream())
{ ... }
The stream was disposed of at the end of the block. Changing it to declaring a MemoryStream and then using it afterwards worked.
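The failure mode is easy to reproduce in isolation. This minimal sketch (no serializer involved; the names are mine) shows why a stream created inside a using block cannot be handed to the caller:

```csharp
using System;
using System.IO;

// Minimal reproduction of the bug: a MemoryStream created inside a
// using block is disposed when the block exits, so later reads fail.
public static class StreamLifetimeDemo
{
    public static Stream Broken()
    {
        using (var stream = new MemoryStream(new byte[] { 1, 2, 3 }))
        {
            return stream; // disposed as soon as the using block exits
        }
    }

    public static Stream Fixed()
    {
        var stream = new MemoryStream(new byte[] { 1, 2, 3 });
        return stream; // the caller is now responsible for disposing it
    }
}
```

Any read on the stream returned by Broken() throws ObjectDisposedException, which is exactly the "Cannot access a closed Stream" the asker was seeing; the fixed version works, at the price of moving disposal responsibility to the consumer of DataObject.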
Thanks to everyone who gave it a thought (not mentioning reading all this code)! :)
