Conversion of a memory stream to BitmapData - C#

I make a web request to receive a large JPEG as a byte array, which in turn can be written to a MemoryStream. I need to get this data into a BitmapData so that I can Marshal.Copy it to a byte array again. Am I right in assuming that a byte array returned from a MemoryStream is not the same as a byte array produced by a Marshal.Copy of BitmapData?
I do not want to write the memory stream out to an image, as that throws an out-of-memory error due to its size and the fact that I am using the .NET Compact Framework (C# 2.0).
This is my call to the server:
HttpWebRequest _request = (HttpWebRequest)WebRequest.Create("A url/00249.jpg");
_request.Method = "GET";
_request.Timeout = 5000;
_request.ReadWriteTimeout = 20000;
byte[] _buffer;
int _blockLength = 1024;
int _bytesRead = 0;
MemoryStream _ms = new MemoryStream();
using (Stream _response = ((HttpWebResponse)_request.GetResponse()).GetResponseStream())
{
    do
    {
        _buffer = new byte[_blockLength];
        _bytesRead = _response.Read(_buffer, 0, _blockLength);
        _ms.Write(_buffer, 0, _bytesRead);
    } while (_bytesRead > 0);
}
This is my code to read a byte array from a BitmapData:
Bitmap Sprite = new Bitmap(_file);
BitmapData RawOriginal = Sprite.LockBits(new Rectangle(0, 0, Sprite.Width, Sprite.Height), ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
int origByteCount = RawOriginal.Stride * RawOriginal.Height;
SpriteBytes = new Byte[origByteCount];
System.Runtime.InteropServices.Marshal.Copy(RawOriginal.Scan0, SpriteBytes, 0, origByteCount);
Sprite.UnlockBits(RawOriginal);
Note:
I do not want to use this:
Bitmap Sprite = new Bitmap(_file);
I want to go from:
MemoryStream _ms = new MemoryStream();
to
System.Runtime.InteropServices.Marshal.Copy(RawOriginal.Scan0, SpriteBytes, 0, origByteCount);
using whatever conversions are required, without writing the stream out to a file first.

What you're asking is going to be difficult. The data you're receiving from the response object is a full JPEG image, which has a header and then a bunch of compressed data bytes. The byte array addressed by Scan0 is uncompressed and quite possibly includes some padding bytes at the end of each scan line.
Most importantly, you definitely cannot use Marshal.Copy to copy the received bytes to Scan0.
To do what you're asking will require you to parse the header of the JPEG that you receive and decompress the image bits directly to Scan0, padding each scan line as appropriate. There is nothing in the .NET Framework that will do that for you.
The accepted answer to this question has a link to a library that might help you out.
Even if that works, I'm not certain it will solve your problem. If calling the Bitmap constructor to create the image causes you to run out of memory, it's almost certain that this roundabout method will as well.
Is the problem that you have so many sprites that you can't keep them all in memory, uncompressed? If so, you'll probably have to find some other way to solve your problem.
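For reference, the conventional route from the MemoryStream to a Marshal.Copy (the one the question is trying to avoid because of the memory constraints) would be a sketch along these lines; variable names mirror the question's code and the memory caveats above still apply:

// _ms is the MemoryStream filled from the response above.
_ms.Position = 0;
using (Bitmap sprite = new Bitmap(_ms)) // decodes the JPEG in memory
{
    BitmapData raw = sprite.LockBits(
        new Rectangle(0, 0, sprite.Width, sprite.Height),
        ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
    int byteCount = raw.Stride * raw.Height; // Stride includes any per-line padding
    byte[] spriteBytes = new byte[byteCount];
    System.Runtime.InteropServices.Marshal.Copy(raw.Scan0, spriteBytes, 0, byteCount);
    sprite.UnlockBits(raw);
}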
By the way, you can save yourself a lot of trouble by changing your code that reads the image to the following (note that Stream.CopyTo was introduced in .NET 4.0, so it may not be available on the Compact Framework you mention):
MemoryStream _ms = new MemoryStream();
using (Stream _response = ((HttpWebResponse)_request.GetResponse()).GetResponseStream())
{
_response.CopyTo(_ms);
}


File.ReadAllBytes doesn't read the PNG image pixels properly

I am trying to read the bytes of a .png image using the File.ReadAllBytes(string) method, without success.
My images are 2464x2056x3 (15,197,952 bytes), but this method returns an array of about 12,000,000 bytes.
I tried with a white image of the same size and got a byte array of 25,549 bytes; checking the contents, I can see all kinds of values, which is obviously not correct for a white image.
The code I am using is:
var frame = File.ReadAllBytes("C:\\workspace\\white.png");
I've also tried to open the image first as an Image object and then get the byte array with the following:
using (var ms = new MemoryStream())
{
    var imageIn = Image.FromFile("C:\\workspace\\white.png");
    imageIn.Save(ms, imageIn.RawFormat);
    var array = ms.ToArray();
}
But the result is the same as before...
Any idea of what's happening?
How can I read the byte array?
PNG is a compressed format.
See some info about it: Portable Network Graphics - Wikipedia.
This means that the binary representation is not the actual pixel values that you expect.
You need some kind of a PNG decoder to get the pixel values from the compressed data.
This post might steer you in the right direction: Reading a PNG image file in .Net 2.0. Note that it's quite old; there may be newer methods for doing this.
On a side note: even an uncompressed format like BMP has a header, so you can't simply read the binary file and get the pixel values in a trivial way.
Update:
One way to get the pixel values from a PNG file is demonstrated below:
using System;
using System.Drawing;
using System.Drawing.Imaging;

byte[] GetPngPixels(string filename)
{
    byte[] rgbValues = null;

    // Load the png and convert to Bitmap. This will use a .NET PNG decoder:
    using (var imageIn = Image.FromFile(filename))
    using (var bmp = new Bitmap(imageIn))
    {
        // Lock the pixel data to gain low level access:
        BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadWrite, bmp.PixelFormat);

        // Get the address of the first line.
        IntPtr ptr = bmpData.Scan0;

        // Declare an array to hold the bytes of the bitmap.
        int bytes = Math.Abs(bmpData.Stride) * bmp.Height;
        rgbValues = new byte[bytes];

        // Copy the RGB values into the array.
        System.Runtime.InteropServices.Marshal.Copy(ptr, rgbValues, 0, bytes);

        // Unlock the pixel data:
        bmp.UnlockBits(bmpData);
    }

    // Here rgbValues is an array of the pixel values.
    return rgbValues;
}
This method will return a byte array with the size that you expect.
In order to use the data with OpenCV (or anything similar), I advise you to enhance my code example to also return the image metadata (width, height, stride, pixel format). You will need this metadata to construct a cv::Mat.
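For example, a small wrapper along these lines could carry that metadata next to the bytes. The PixelData container and the GetPngPixelData name are my own illustration, not part of the original answer:

using System;
using System.Drawing;
using System.Drawing.Imaging;

// Illustrative container bundling the pixel bytes with the metadata needed to
// rebuild the image elsewhere (for example as a cv::Mat).
class PixelData
{
    public byte[] Bytes;
    public int Width;
    public int Height;
    public int Stride;
    public PixelFormat PixelFormat;
}

PixelData GetPngPixelData(string filename)
{
    using (var imageIn = Image.FromFile(filename))
    using (var bmp = new Bitmap(imageIn))
    {
        // Lock the pixels, copy them out, and record the layout information.
        BitmapData bmpData = bmp.LockBits(
            new Rectangle(0, 0, bmp.Width, bmp.Height),
            ImageLockMode.ReadOnly, bmp.PixelFormat);
        var result = new PixelData
        {
            Width = bmp.Width,
            Height = bmp.Height,
            Stride = Math.Abs(bmpData.Stride),
            PixelFormat = bmp.PixelFormat,
            Bytes = new byte[Math.Abs(bmpData.Stride) * bmp.Height]
        };
        System.Runtime.InteropServices.Marshal.Copy(bmpData.Scan0, result.Bytes, 0, result.Bytes.Length);
        bmp.UnlockBits(bmpData);
        return result;
    }
}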

Convert streamed PNG data to an image: data streamed via TCP using Unreal Engine 4.10

I have created a small application using Unreal Engine 4.10 (UE4). Within that application, I am grabbing the colorBuffer via ReadPixels. I am then compressing the colorBuffer to PNG. Finally, the compressed colorBuffer is sent via TCP to another application. The UE4 application is written in C++, not that that should matter. The compressed colorBuffer is sent every "tick" of the application, roughly 30 times a second.
My client, the one receiving the streamed PNG, is written in C# and does the following:
Connect to the server.
If connected:
- get the stream (a memory stream)
- read the stream into a byte array
- convert the byte array to an image
Client implementation:
private void Timer_Tick(object sender, EventArgs e)
{
    var connected = tcp.IsConnected();
    if (connected)
    {
        var stream = tcp.GetStream(); // simply returns client.GetStream();
        int BYTES_TO_READ = 16;
        var buffer = new byte[BYTES_TO_READ];
        var totalBytesRead = 0;
        int bytesRead;
        do
        {
            // You have to do this in a loop because there's no
            // guarantee that all the bytes you need will be ready when
            // you call.
            bytesRead = stream.Read(buffer, totalBytesRead, BYTES_TO_READ - totalBytesRead);
            totalBytesRead += bytesRead;
        } while (totalBytesRead < BYTES_TO_READ);
        Image x = byteArrayToImage(buffer);
    }
}

public Image byteArrayToImage(byte[] byteArrayIn)
{
    var converter = new ImageConverter();
    Image img = (Image)converter.ConvertFrom(byteArrayIn);
    return img;
}
The problem is that Image img = (Image)converter.ConvertFrom(byteArrayIn); throws an ArgumentException telling me "Parameter is not valid".
The data being sent looks like this:
My byteArrayIn and buffer look like this:
I have also tried both:
Image.FromStream(stream); and Image.FromStream(new MemoryStream(bytes));
Image.FromStream(stream); causes it to read forever... and Image.FromStream(new MemoryStream(bytes)); results in the same exception as mentioned above.
Some questions:
What size should I set BYTES_TO_READ to? I set it to 16 because when I check the size of the byte array being sent in the UE4 application (dataSize in the first image), it says the length is 16... I'm not sure what to set this to.
Is the process that I have followed correct?
What am I doing wrong?
UPDATE
@RonBeyer asked if I could verify that the data sent from the server matches what is received. I have tried to do that, and here is what I can say:
The data sent, as far as I can tell, looks like this (sorry for the formatting):
The data being received looks like this:
var stream = tcp.GetStream();
int BYTES_TO_READ = 512;
var buffer = new byte[BYTES_TO_READ];
Int32 bytes = stream.Read(buffer, 0, buffer.Length);
var responseData = System.Text.Encoding.ASCII.GetString(buffer, 0, bytes);
//responseData looks like this (has been formatted for easier reading)
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
If I try to take a single line from the responseData and turn that into an image:
var stringdata = "?PNG\r\n\u001a\n\0\0\0\rIHDR";
var data = System.Text.Encoding.ASCII.GetBytes(stringdata);
var ms = new MemoryStream(data);
Image img = Image.FromStream(ms);
data has a length of 16... the same length as the dataSize variable on the server. However, I again get the exception "Parameter is not valid".
UPDATE 2
@Darcara helped by suggesting that what I was actually receiving was the header of the PNG file and that I needed to send the size of the image first. I have now done that and made progress:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt; ++ClientIt)
{
    FSocket *Client = *ClientIt;
    int32 SendSize = 2 * x * y;
    Client->SetNonBlocking(true);
    Client->SetSendBufferSize(SendSize, SendSize);
    Client->Send(data, SendSize, bytesSent);
}
With this, I am now getting the image on the first go; however, subsequent attempts fail with the same "Parameter is not valid". Upon inspection, I noticed that the stream now appears to be missing the header ("?PNG\r\n\u001a\n\0\0\0\rIHDR"). I came to this conclusion when I converted the buffer to a string using Encoding.ASCII.GetString(buffer, 0, bytes);
Any idea why the header is only being sent the first time and never again? What can I do to fix it?
First of all, thank you to @Darcara and @RonBeyer for your help.
I now have a solution:
Server:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt; ++ClientIt)
{
    FSocket *Client = *ClientIt;
    int32 SendSize = x * y; // Image is 512 x 512
    Client->SetSendBufferSize(SendSize, SendSize);
    Client->Send(data, SendSize, bytesSent);
}
The first issue was that the size of the image needed to be correct:
int32 SendSize = 2 * x * y;
The line above is wrong. The image is 512 by 512 and so SendSize should be x * y where x & y are both 512.
The other issue was how I was handling the stream client side.
Client:
var connected = tcp.IsConnected();
if (connected)
{
    var stream = tcp.GetStream();
    var BYTES_TO_READ = 512 * 512; // matches the SendSize used on the server
    var buffer = new byte[BYTES_TO_READ];
    var bytes = stream.Read(buffer, 0, BYTES_TO_READ);
    Image returnImage = Image.FromStream(new MemoryStream(buffer));
    // Apply the image to a picture box. Note, this is running on a separate thread.
    UpdateImageViewerBackgroundWorker.ReportProgress(0, returnImage);
}
BYTES_TO_READ now matches the SendSize sent by the server (512 * 512), so the buffer is large enough to hold the whole image.
I now have Unreal Engine 4 streaming its frames.
You are only reading the first 16 bytes of the stream. I'm guessing that is not intentional.
If the stream ends/connection closes after the image is transferred, use stream.CopyTo to copy it into a MemoryStream. Image.FromStream(stream) might also work
If the stream does not end, you need to know the size of the transferred object beforehand, so you can copy it read-by-read into another array / memory stream or directly to disk. In that case a much larger read buffer should be used (the default is 8192, I think). This is a lot more complicated though.
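For the first case (the connection is closed once the image has been sent), a minimal sketch, assuming stream is the NetworkStream returned by tcp.GetStream() as in the question, could be:

// Copy everything until the sender closes the connection, then decode.
var imageStream = new MemoryStream();
stream.CopyTo(imageStream);
imageStream.Position = 0;
Image image = Image.FromStream(imageStream); // keep imageStream alive while the image is in use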
To manually read from the stream, you need to prepend your data with the size. A simple Int32 should suffice. Your client code might look something like this:
var stream = tcp.GetStream();

// This is our temporary read buffer.
int BYTES_TO_READ = 8196;
var buffer = new byte[BYTES_TO_READ];
int bytesRead;

// Read the size of the data object.
stream.Read(buffer, 0, 4); // read 4 bytes into the beginning of the empty buffer
// TODO: check that we actually received 4 bytes.
var totalBytesExpected = BitConverter.ToInt32(buffer, 0);

// This will be the stream we will save our received bytes to;
// it could also be a file stream.
var imageStream = new MemoryStream(totalBytesExpected);
var totalBytesRead = 0;
do
{
    // Read as much as the buffer can hold or the remaining bytes.
    bytesRead = stream.Read(buffer, 0, Math.Min(BYTES_TO_READ, totalBytesExpected - totalBytesRead));
    totalBytesRead += bytesRead;
    // Write the bytes to the image stream.
    imageStream.Write(buffer, 0, bytesRead);
} while (totalBytesRead < totalBytesExpected);
I glossed over a lot of error handling here, but that should give you the general idea.
If you want to transfer more complex objects, look into proper protocols like Google Protocol Buffers or MessagePack.

How to pass unsafe pointer to FileStream in C#

I am trying to write images acquired from a webcam to a file via FileStream in C#. They are 16-bit monochrome so I cannot just write out the Bitmap object. I am using Marshal.Copy() in order to work around this as follows:
unsafe private void RecordingFrame()
{
    Bitmap bm16;
    BitmapData bmd;
    Emgu.CV.Image<Gray, UInt16> currentFrame;
    const int ORIGIN_X = 0;
    const int ORIGIN_Y = 0;

    // get image here and put it in bm16...

    bmd = bm16.LockBits(new Rectangle(ORIGIN_X, ORIGIN_Y, bm16.Width, bm16.Height),
        ImageLockMode.ReadOnly, bm16.PixelFormat);
    var length = bmd.Stride * bmd.Height;
    byte[] bytes = new byte[length];
    Marshal.Copy(bmd.Scan0, bytes, 0, length);
    fsVideoWriter.Write(bytes, 0, length);
    bm16.UnlockBits(bmd);
}
Is this the best way to accomplish this? I wanted to simply pass the BitmapData's Scan0 member as a pointer to FileStream but I couldn't figure out how to do this so I copied the data into a byte buffer. This reduces performance slightly so if I can improve it to achieve a higher frame rate I'd like to do so.
You could create an UnmanagedMemoryStream from bmd.Scan0 and then call CopyTo(fsVideoWriter). But I'm not sure if this would be any faster than what you have now.
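A minimal sketch of that idea, reusing bmd and fsVideoWriter from the question and assuming the surrounding method is still compiled as unsafe:

// Wrap the locked bitmap memory in a stream and copy it straight to the file,
// avoiding the intermediate byte[] (requires .NET 4.0+ for Stream.CopyTo).
int length = bmd.Stride * bmd.Height;
using (var ums = new UnmanagedMemoryStream((byte*)bmd.Scan0.ToPointer(), length))
{
    ums.CopyTo(fsVideoWriter);
}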

Bytes to image conversion error

I am creating a client-server application. The client asks the server for a certain image and the server sends it; when the client receives it, it displays it in a PictureBox.
So this is my code:
string line = null;
line = textBox3.Text;
socket.Send(Encoding.ASCII.GetBytes(line));
data = new byte[1024];
dataSize = socket.Receive(data);
//string s = Encoding.ASCII.GetString(data, 0, dataSize);
//textBox4.Text = s;
Image newImage;
using (MemoryStream ms = new MemoryStream(data, 0, dataSize))
{
    ms.Write(data, 0, dataSize);
    newImage = Image.FromStream(ms, true); //HERE I GOT THE PROBLEM
}
pictureBox1.Image = newImage;
Then it returns the error "Parameter is not valid", and I don't know what is wrong here.
It's hard to believe the image is less than 1 KB in size. Use a bigger buffer:
data = new byte[1024 * 500]; //limit to 500KB
Having a buffer smaller than the actual size of the image probably results in incomplete data, which is indeed an invalid stream for the image.
You need to reset the memory stream's position back to the start after writing to it:
...
ms.Write(data,0,dataSize);
ms.Position = 0;
newImage = Image.FromStream(ms,true); //HERE I GOT THE PROBLEM
...
Your network code is buggy in two ways:
1) If the data is larger than 1024 bytes, it won't work at all.
2) If the incoming data gets fragmented, it will break (one Send call does NOT map to one Receive call). TCP is a stream-based, not packet-based, protocol.
To fix it, first write the byte size of the image, and when reading, keep reading until you have enough bytes and only then construct the image from them, as sketched below.
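A minimal sketch of that length-prefix approach, assuming a connected System.Net.Sockets.Socket on both ends. The SendImage/ReceiveImage names are illustrative only, and error handling (including a zero return from Receive when the connection closes) is omitted:

using System;
using System.Drawing;
using System.IO;
using System.Net.Sockets;

// Sender: prefix the encoded image with its length as a 4-byte Int32.
static void SendImage(Socket socket, byte[] imageBytes)
{
    socket.Send(BitConverter.GetBytes(imageBytes.Length));
    socket.Send(imageBytes);
}

// Receiver: read the 4-byte length first, then loop until all bytes have arrived.
static Image ReceiveImage(Socket socket)
{
    byte[] lengthBuffer = new byte[4];
    int read = 0;
    while (read < 4)
        read += socket.Receive(lengthBuffer, read, 4 - read, SocketFlags.None);
    int length = BitConverter.ToInt32(lengthBuffer, 0);

    byte[] imageBuffer = new byte[length];
    int total = 0;
    while (total < length)
        total += socket.Receive(imageBuffer, total, length - total, SocketFlags.None);

    return Image.FromStream(new MemoryStream(imageBuffer));
}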

Cannot render image to HttpContext.Response.OutputStream

Basically I am trying to render a simple image in an ASP.NET handler:
public void ProcessRequest(HttpContext context)
{
    Bitmap image = new Bitmap(16, 16);
    Graphics graph = Graphics.FromImage(image);
    graph.FillEllipse(Brushes.Green, 0, 0, 16, 16);
    context.Response.ContentType = "image/png";
    image.Save(context.Response.OutputStream, ImageFormat.Png);
}
But I get the following exception:
System.Runtime.InteropServices.ExternalException: A generic error occurred in GDI+.
at System.Drawing.Image.Save(Stream stream, ImageCodecInfo encoder, EncoderParameters encoderParams)
The solution is to use this instead of having the image write to the OutputStream directly:
MemoryStream temp = new MemoryStream();
image.Save(temp, ImageFormat.Png);
byte[] buffer = temp.GetBuffer();
context.Response.OutputStream.Write(buffer, 0, buffer.Length);
So I'm just curious as to why the first variant is problematic?
Edit: The HRESULT is 80004005 which is just "generic".
The writer indeed needs to seek to write the stream properly.
But in your last code sample, make sure you either use MemoryStream.ToArray() to get the proper data or, if you do not want to copy the data, use MemoryStream.GetBuffer() together with MemoryStream.Length rather than the length of the returned array.
GetBuffer returns the internal buffer used by the MemoryStream, and its length is generally greater than the length of the data that has been written to the stream.
This avoids sending garbage at the end of the stream and tripping up a strict image decoder that does not tolerate trailing garbage. (And it transfers less data...)
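A minimal sketch of the two safe options described above, reusing the image, temp, and context variables from the question:

using (MemoryStream temp = new MemoryStream())
{
    image.Save(temp, ImageFormat.Png);

    // Option 1: ToArray() copies exactly Length bytes into a new array.
    byte[] copy = temp.ToArray();
    context.Response.OutputStream.Write(copy, 0, copy.Length);

    // Option 2: GetBuffer() avoids the copy, but only the first temp.Length
    // bytes are valid; the internal buffer is usually larger than that.
    //byte[] raw = temp.GetBuffer();
    //context.Response.OutputStream.Write(raw, 0, (int)temp.Length);
}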
Image.Save(Stream stream, ImageFormat format) requires a stream that supports seeking. The context.Response.OutputStream is forward-only and doesn't support seeking, so you need an intermediate stream. However, you don't need the byte array buffer: you can write directly from the temporary memory stream into context.Response.OutputStream:
/// <summary>
/// Sends a given image to the client browser as a PNG encoded image.
/// </summary>
/// <param name="image">The image object to send.</param>
private void SendImage(Image image)
{
    // Get the PNG image codec
    ImageCodecInfo codec = GetCodec("image/png");

    // Configure to encode at high quality
    using (EncoderParameters ep = new EncoderParameters())
    {
        ep.Param[0] = new EncoderParameter(Encoder.Quality, 100L);

        // Encode the image
        using (MemoryStream ms = new MemoryStream())
        {
            image.Save(ms, codec, ep);

            // Send the encoded image to the browser
            HttpContext.Current.Response.Clear();
            HttpContext.Current.Response.ContentType = "image/png";
            ms.WriteTo(HttpContext.Current.Response.OutputStream);
        }
    }
}
A fully functional code sample is available here:
Auto-Generate Anti-Aliased Text Images with ASP.NET
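The GetCodec helper is not shown in the snippet above; a plausible implementation (my assumption, not taken from the original answer or the linked article) simply looks up the encoder by MIME type:

// Returns the ImageCodecInfo whose MIME type matches, or null if none is installed.
private static ImageCodecInfo GetCodec(string mimeType)
{
    foreach (ImageCodecInfo codec in ImageCodecInfo.GetImageEncoders())
    {
        if (codec.MimeType == mimeType)
            return codec;
    }
    return null;
}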
I believe the problem is that the Response.OutputStream does not support seeking. In order to save a PNG (or JPEG), the image object needs to be able to write the output non-sequentially. If I remember correctly, it would have worked if you saved the image as a BMP since that image format can be written without seeking the stream.
OK, I used a wrapper for Stream (it implements Stream and passes calls through to an underlying stream) to determine that Image.Save() calls the Position and Length properties without checking CanSeek, which returns false. It also tries to set Position to 0.
So it seems an intermediate buffer is required.
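The wrapper itself is not shown; a minimal sketch of the idea (not the author's original code) forwards every call to an inner stream and logs the seek-related members, so you can wrap Response.OutputStream and watch what Image.Save touches:

using System;
using System.Diagnostics;
using System.IO;

class LoggingStream : Stream
{
    private readonly Stream _inner;
    public LoggingStream(Stream inner) { _inner = inner; }

    public override bool CanRead => _inner.CanRead;
    public override bool CanWrite => _inner.CanWrite;
    public override bool CanSeek
    {
        get { Debug.WriteLine("CanSeek queried"); return _inner.CanSeek; }
    }
    public override long Length
    {
        get { Debug.WriteLine("Length queried"); return _inner.Length; }
    }
    public override long Position
    {
        get { Debug.WriteLine("Position read"); return _inner.Position; }
        set { Debug.WriteLine("Position set to " + value); _inner.Position = value; }
    }

    public override void Flush() => _inner.Flush();
    public override int Read(byte[] buffer, int offset, int count) => _inner.Read(buffer, offset, count);
    public override void Write(byte[] buffer, int offset, int count) => _inner.Write(buffer, offset, count);
    public override void SetLength(long value) => _inner.SetLength(value);
    public override long Seek(long offset, SeekOrigin origin)
    {
        Debug.WriteLine("Seek(" + offset + ", " + origin + ")");
        return _inner.Seek(offset, origin);
    }
}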
