I am trying to post a byte array or memory stream with RestSharp.
I have tried the following:
request.AddFile("stream", x => new MemoryStream(blocks.First().Value), "stream", "application/binary");
And
request.AddFile("stream", blocks.First().Value, "stream", "application/binary");
where blocks.First().Value is a byte array.
On the server end I expect a form with a stream parameter in it from which I can extract the bytes.
Additional information:
Passing null or string.Empty to AddFile does send the byte array:
request.AddFile("stream", blocks.First().Value, string.Empty);
The problem is that this adds two bytes to every byte array sent (one for the carriage return and one for the newline), and I cannot strip them on the server side for every post, since other clients do not behave this way.
Thank you for any input on this!
We fixed it with a pull request in Nancy 0.11 so this is no longer an issue.
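For older Nancy versions, one workaround sometimes used is to skip multipart form data entirely and post the bytes as the raw request body, so no multipart boundary (and no trailing CRLF) is involved. This is only a sketch; whether it fits depends on what the server expects to parse:
request.AddParameter("application/binary", blocks.First().Value, ParameterType.RequestBody);
With ParameterType.RequestBody, RestSharp uses the first argument as the content type and writes the value directly as the request body.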
Related
I have a bitmap that I converted to bytes using BinaryFormatter and a MemoryStream, then sent the bytes to a TCP server. When I try to convert it back to a Bitmap I get this error: System.Runtime.Serialization.SerializationException: End of Stream encountered before parsing was completed. I tried converting the bitmap to bytes and then back to a bitmap on the client side, just to check whether the error was caused by the conversion, but everything worked fine. So I think the problem is that the server is receiving the byte array in chunks rather than as one big array. My question is: how can I check whether the byte array has been fully sent?
As you say, the data can absolutely arrive in multiple chunks, so you need a way of knowing when all of it has been received. HTTP uses the Content-Length header to let the receiver know when all data has arrived. Since you control both sides, you can transform your image into a byte array and check its size; let's say it is 5000 bytes.
Then you create an int (or a long if necessary, though probably not in this case), set it to 5000, and send that first (as bytes), followed by the rest of the data. You have then created your own header. On the other side you start by reading exactly the number of bytes an int occupies (or a long, etc., if that is what you chose). You convert those bytes into an int, and now you know that 5000 bytes will follow. Then keep reading until you have received all 5000 bytes. This can always be optimized, but it is a simple way to do it.
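A minimal sketch of this length-prefix approach over the TCP stream (networkStream, imageBytes and ReadExactly are names introduced here purely for illustration, not from the question):
// Sender: write a 4-byte length header first, then the payload itself.
byte[] payload = imageBytes; // the serialized bitmap bytes
networkStream.Write(BitConverter.GetBytes(payload.Length), 0, 4);
networkStream.Write(payload, 0, payload.Length);

// Receiver: read the 4-byte header, then keep reading until the full payload has arrived.
byte[] header = new byte[4];
ReadExactly(networkStream, header, 4);
int expected = BitConverter.ToInt32(header, 0);
byte[] data = new byte[expected];
ReadExactly(networkStream, data, expected);

static void ReadExactly(Stream stream, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0) throw new EndOfStreamException("Connection closed before all data arrived.");
        offset += read;
    }
}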
I am making an audio chat program, so I tried sending audio bytes over a WebSocket.
First I tried getting the audio bytes and sending them directly, but that failed (maybe they cannot get through completely).
Second, I tried converting the bytes to a string with BitConverter and then converting that back to a byte array with the Encoding.UTF8.GetBytes method.
This is my code:
var pcmAudio = stream.ToByteArray();
var audio = Encoding.UTF8.GetBytes(BitConverter.ToString(pcmAudio));
If I send that audio array it works: I can convert it back to a byte array and play the audio.
But if I send pcmAudio there is an error.
Stream ms = new MemoryStream(Encoding.UTF8.GetBytes(data));
The line above is my receiving code; data is a string, and there is no way to receive it as a byte type,
so I had to convert data to bytes.
Unfortunately it doesn't work.
The error message is 'Wave header is corrupted'.
I want to send the byte array completely.
Your question 1: Why do you want to send bytes? You already know a way to send the audio using BitConverter.
My answer 1: The converted data is larger than the raw, unconverted bytes.
Thank you.
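If the transport really only accepts strings, Base64 round-trips the bytes more compactly than BitConverter.ToString (which produces hyphen-separated hex, roughly three characters per byte). A minimal sketch, reusing the pcmAudio and data names from the snippets above:
// Sender: encode the raw PCM bytes as a Base64 string (about 4 characters per 3 bytes).
string data = Convert.ToBase64String(pcmAudio);

// Receiver: decode the string back into the original bytes and wrap them in a stream.
byte[] pcmBytes = Convert.FromBase64String(data);
Stream ms = new MemoryStream(pcmBytes);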
I am trying to send batches of files, along with various other form elements, over an asynchronous XMLHttpRequest. Each batch can contain files of up to 15 MB, and there can be multiple batches.
I am reading the files in JavaScript and converting them into Base64 strings, then receiving the files in the controller (using a for loop) and converting them into byte arrays.
string fileValue1 = form["fileName1"];
string fileValue2 = form["fileName2"];
The file value has the format below, depending on the type of the attachment:
"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAABkC...
I use the Substring call below to remove the content before the ',' and do some processing before converting it into a byte array:
fileValue1 = fileValue1.Substring(fileValue1.IndexOf(',') + 1);
fileValue1 = fileValue1.Trim().Replace(" ", "+");
byte[] fileBytes1 = Convert.FromBase64String(fileValue1);
Since I am sending files through multiple batches (say there are 10 batches), there will be a total of around 150 to 200 MB of files reaching the controller through asynchronous AJAX calls.
While sending, I get the error below:
System.OutOfMemoryException was thrown while converting the Base64 string to a byte array, occurring at the Replace statement.
I have followed various posts looking for a workaround, but nothing seems to work for me. I tried increasing "httpRuntime maxRequestLength" and "maxAllowedContentLength" in web.config to 4 GB to allow huge requests, but that did not help either.
If I remove the line
fileValue1 = fileValue1.Substring(fileValue1.IndexOf(',') + 1);
I get the error below:
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
The maximum total size of the files can reach up to 500 MB when sent through multiple batches.
I am not sure how to read this string block by block to make sure memory is available before converting it to a byte array. Any help would be greatly appreciated.
You probably need to change your server configuration to accept bigger requests, or to increase the request timeout: when there are several requests the server tries to serve them all, the processing is slow, the requests expire, and you do not receive all the data.
This link can be helpful, or this other one.
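For the OutOfMemoryException at the Replace call itself, another option (a sketch only, not taken from the answer above) is to avoid building a second full-size copy of the string and instead decode the Base64 payload in chunks whose length is a multiple of 4, writing the result to a stream:
// Hypothetical helper: decodes a large Base64 string in fixed-size chunks.
// chunkSize must be a multiple of 4 so every chunk is a complete Base64 block.
static void DecodeBase64ToStream(string base64, Stream output, int chunkSize = 4096)
{
    char[] chunk = new char[chunkSize];
    for (int pos = 0; pos < base64.Length; pos += chunkSize)
    {
        int len = Math.Min(chunkSize, base64.Length - pos);
        base64.CopyTo(pos, chunk, 0, len);
        // Undo the '+' -> ' ' substitution that form posting may have introduced,
        // without allocating a new string.
        for (int i = 0; i < len; i++)
            if (chunk[i] == ' ') chunk[i] = '+';
        byte[] decoded = Convert.FromBase64CharArray(chunk, 0, len);
        output.Write(decoded, 0, decoded.Length);
    }
}
The data-URI prefix (everything up to and including the ',') is assumed to have been stripped already, as in the Substring line above. The output stream could be a FileStream over a temporary file, so the decoded bytes never have to sit in memory all at once.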
I'm writing a C# client that needs to communicate with a C++ server. I'm trying to find a way to send a string to the server, and I'm stuck, since a char in C# is 2 bytes while in C++ it's 1.
How can I convert my strings so they are sent as a char array the server can read?
Thanks a lot!
PS: I think I will have the same problem with other types, like int.
In C++ you can use std::wstring, which uses wide characters (two bytes per character on Windows).
You can quite easily convert your string to an ASCII byte array before sending it to the server:
string message = ...
byte [] data = Encoding.ASCII.GetBytes(message);
server.Send(data);
Be sure, however, that the messages you send consist of characters covered by the ASCII table; characters outside that table can bring some surprises when converted to ASCII.
Converting the received answer from the server back into a string:
byte [] received = ...
string response = Encoding.ASCII.GetString(received);
Generally speaking, sending data from client to server and back through some kind of connection is not the easiest thing to do.
I can share my experience, in which I needed to serialize properties of serializable classes into a stream of bytes to be sent across a generic connection.
Using System.BitConverter you can get the representation of the basic data types (bool, char, double, float, ...) as arrays of bytes:
byte[] f1 = BitConverter.GetBytes(MsgID); // MsgID is a ulong
For string objects you could use the UTF-8 encoding:
// == Payload is a C# string ==
// calculates how many bytes we need to stream the Payload
payloadBufferSize = Encoding.UTF8.GetByteCount(Payload);
// create a suitable buffer
payloadBuffer = new byte[payloadBufferSize];
// Encode the Payload in the buffer
Encoding.UTF8.GetBytes(Payload, 0, Payload.Length, payloadBuffer, 0);
Doing so, you have an array of bytes you can send through your connection, given that on the other side there is some kind of object able to decode a UTF-8 stream of bytes.
If you just want a plain ASCII stream, you could use the Encoding.ASCII encoder instead of Encoding.UTF8 in the sample above, but any Unicode character outside the ASCII range will come out as a '?'.
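On the receiving side the same values can be reconstructed from the byte stream. A minimal sketch, assuming (purely for illustration) that the 8-byte MsgID comes first in the received buffer and the UTF-8 encoded Payload fills the rest:
// received is the byte[] read from the connection (assumed layout: 8-byte MsgID, then payload).
ulong msgId = BitConverter.ToUInt64(received, 0);
string payload = Encoding.UTF8.GetString(received, 8, received.Length - 8);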
I am working in C#, trying the code below:
byte[] buffer = new byte[str.Length];
buffer = Encoding.UTF8.GetBytes(str);
In str I have lengthy data, but I have a problem getting the complete encoded bytes.
Please tell me what's going wrong and how I can overcome this problem.
Why are you creating a new byte array and then ignoring it? The value of buffer before the call to GetBytes is being replaced with a reference to a new byte array returned by GetBytes.
However, you shouldn't expect the UTF-8 encoded version of a string to be the same length in bytes as the original string's length in characters, unless it's all ASCII. Any character over U+007F takes up at least 2 bytes.
What's the bigger picture here? What are you trying to achieve, and why does the length of the byte array matter to you?
The proper use is:
byte[] buffer = Encoding.UTF8.GetBytes(str);
In general, you should not make any assumptions about length/size/count when working with encoding, bytes and chars/strings. Let the Encoding objects do their work and then query the resulting objects for that info.
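For example (a quick illustration, not from the original post), a single non-ASCII character already makes the byte count differ from the character count, which is why querying the encoding is the reliable approach:
string str = "héllo";                            // 5 characters
int byteCount = Encoding.UTF8.GetByteCount(str); // 6 bytes: 'é' takes two bytes in UTF-8
byte[] buffer = Encoding.UTF8.GetBytes(str);     // buffer.Length == byteCount, not str.Length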
Having said that, I don't believe there is an inherent length restriction in the encoding classes. I have several production apps doing the same work in the opposite direction (bytes decoded back to chars) which process byte arrays in the tens of megabytes.