I want to convert a byte array to a bitmap. I get this byte array from a capture card; it contains RGB24 data. When I convert this array to a Bitmap object I get a "Parameter is not valid" error.
This is my byte array:
myByteArray {byte[921600]}
MemoryStream mStream = new MemoryStream(myByteArray);
Bitmap bi = new Bitmap(mStream);
and
using (MemoryStream mStream = new MemoryStream(myByteArray))
{
    Bitmap bi = (Bitmap)System.Drawing.Image.FromStream(mStream);
}
and
using (MemoryStream mStream = new MemoryStream())
{
    mStream.Write(myByteArray, 0, myByteArray.Length);
    mStream.Seek(0, SeekOrigin.Begin);
    Bitmap bm = new Bitmap(mStream);
    return bm;
}
Does this happen because of the size of the array? Can anyone suggest a method to do this task? It would be greatly appreciated, thank you.
Those constructors fail because they expect an encoded image stream (BMP, PNG, JPEG, ...), not raw pixels. If your myByteArray is raw image data, this should work:
Bitmap bmp = null;
unsafe
{
    fixed (byte* p = myByteArray)
    {
        IntPtr unmanagedPointer = (IntPtr)p;
        // Deduced from your buffer size: 921600 = 640 * 480 * 3 bytes
        int width = 640;
        int height = 480;
        // Note: this constructor does not copy the pixels - the Bitmap reads from
        // the pinned buffer, so use (or copy) it before the fixed block ends.
        bmp = new Bitmap(width, height, width * 3, System.Drawing.Imaging.PixelFormat.Format24bppRgb, unmanagedPointer);
    }
}
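Since the Bitmap above only references the pinned buffer, a sketch of a variant that copies the pixels into a Bitmap the GC can manage on its own (assuming the same 640x480 RGB24, top-down layout; the method name is illustrative):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

static Bitmap CopyToBitmap(byte[] pixels, int width, int height)
{
    var bmp = new Bitmap(width, height, PixelFormat.Format24bppRgb);
    var rect = new Rectangle(0, 0, width, height);
    BitmapData data = bmp.LockBits(rect, ImageLockMode.WriteOnly, bmp.PixelFormat);
    try
    {
        int srcStride = width * 3; // source buffer has no row padding
        // Copy row by row: the bitmap's stride may include padding bytes.
        for (int y = 0; y < height; y++)
            Marshal.Copy(pixels, y * srcStride,
                IntPtr.Add(data.Scan0, y * data.Stride), srcStride);
    }
    finally
    {
        bmp.UnlockBits(data);
    }
    return bmp;
}
```

After this the byte array can be freed or reused; the Bitmap owns its own copy of the data.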
Currently I'm making a function that takes a base64 image and crops it to the desired rectangle (X, Y, Width, Height). However, the code below doesn't seem to do the trick and I don't know why: it returns the image unchanged and uncropped.
Can anyone see the issue? :)
public static string CropImage(string base64, int x, int y, int width, int height)
{
    byte[] bytes = Convert.FromBase64String(base64);
    using (var ms = new MemoryStream(bytes))
    {
        Bitmap bmp = new Bitmap(ms);
        Rectangle rect = new Rectangle(x, y, width, height);
        Bitmap croppedBitmap = new Bitmap(rect.Width, rect.Height, bmp.PixelFormat);
        using (Graphics gfx = Graphics.FromImage(croppedBitmap))
        {
            gfx.DrawImage(bmp, 0, 0, rect, GraphicsUnit.Pixel);
        }
        using (MemoryStream ms2 = new MemoryStream())
        {
            bmp.Save(ms2, ImageFormat.Jpeg);
            byte[] byteImage = ms2.ToArray();
            var croppedBase64 = Convert.ToBase64String(byteImage);
            return croppedBase64;
        }
    }
}
The cropped image is in croppedBitmap; bmp is the original image. I think you want to use croppedBitmap in the second memory stream:
using (MemoryStream ms2 = new MemoryStream())
{
    croppedBitmap.Save(ms2, ImageFormat.Jpeg);
    byte[] byteImage = ms2.ToArray();
    var croppedBase64 = Convert.ToBase64String(byteImage);
    return croppedBase64;
}
Image image = new Image();
Bitmap bitmap = Bitmap.CreateBitmap(200, 100, Bitmap.Config.Argb8888);
Canvas canvas = new Canvas(bitmap);
var paint = new Paint();
paint.Color = Android.Graphics.Color.Red;
paint.SetStyle(Paint.Style.Fill);
Rect rect = new Rect(0, 0, 200, 100);
canvas.DrawRect(rect, paint);
Android.Widget.ImageView contains the method SetImageBitmap.
What is the best way to set a Xamarin.Forms.Image from my bitmap?
Convert the Bitmap to byte[] via http://forums.xamarin.com/discussion/5950/how-to-convert-from-bitmap-to-byte-without-bitmap-compress
Two solutions are mentioned there.
var byteArray = ByteBuffer.Allocate(bitmap.ByteCount);
bitmap.CopyPixelsToBuffer(byteArray);
byte[] bytes = byteArray.ToArray<byte>();
return bytes;
(in case the first solution is still broken)
ByteBuffer buffer = ByteBuffer.Allocate(bitmap.ByteCount);
bitmap.CopyPixelsToBuffer(buffer);
buffer.Rewind();
IntPtr classHandle = JNIEnv.FindClass("java/nio/ByteBuffer");
IntPtr methodId = JNIEnv.GetMethodID(classHandle, "array", "()[B");
IntPtr resultHandle = JNIEnv.CallObjectMethod(buffer.Handle, methodId);
byte[] byteArray = JNIEnv.GetArray<byte>(resultHandle);
JNIEnv.DeleteLocalRef(resultHandle);
And then use
var image = new Image();
image.Source = ImageSource.FromStream(() => new MemoryStream(byteArray));
to create an Image.
I tried @Wosi's answer, but for some reason rendering the image after that didn't work, and the code is specific to Android. I needed to go from a byte array to a bitmap and then back again. This is what I did:
Code for turning a bitmap into a byte array:
byte[] bitmapData;
using (var stream = new MemoryStream())
{
    tempBitmap.Compress(Android.Graphics.Bitmap.CompressFormat.Png, 0, stream);
    bitmapData = stream.ToArray();
}
And the code for turning a byte array into a bitmap:
Android.Graphics.Bitmap tempBitmap = Android.Graphics.BitmapFactory.DecodeByteArray(imageByteArray, 0, imageByteArray.Length);
If you only need the image dimensions, decode with options whose InJustDecodeBounds flag is set:
Android.Graphics.BitmapFactory.Options options = new Android.Graphics.BitmapFactory.Options
{
    InJustDecodeBounds = true
};
Android.Graphics.BitmapFactory.DecodeByteArray(bitmapArray, 0, byteArrayLength, options);
int imageHeight = options.OutHeight;
int imageWidth = options.OutWidth;
With InJustDecodeBounds set, the call returns null instead of a Bitmap; it only fills in the image height and width properties. In my case I required this information to encode it as a byte array again.
With this it is possible to encode a byte array to a string and back again.
Setting an image source from a byte array is done as follows:
var imageSource = ImageSource.FromStream(() => new MemoryStream(ImageByteArray, 0, ImageByteArray.Length));
I'm having some trouble converting an image to a video using the SharpAVI.dll.
I have managed to produce a video file using a randomly generated byte array by using the documentation on SharpAVI's website:
Getting Started with SharpAVI
So the next step I thought I'd take was to load an Image, create a Bitmap from it, convert the bitmap to a byte array, and then write that byte array to each frame of the video file. When I run the program I get no errors, and a video file of an appropriate size is produced, but the file is unreadable and will not open. I'm really struggling to see why this won't work. Any help would be greatly appreciated!
My Code:
private void GenerateSingleImageVideo()
{
    string imagePath = textBoxImagePath.Text;
    Bitmap thisBitmap;
    // generate bitmap from image file
    using (Stream BitmapStream = System.IO.File.Open(imagePath, FileMode.Open))
    {
        Image img = Image.FromStream(BitmapStream);
        thisBitmap = new Bitmap(img);
    }
    // convert the bitmap to a byte array
    byte[] byteArray = BitmapToByteArray(thisBitmap);
    // creates the writer of the file (to save the video)
    var writer = new AviWriter(textBoxFileName.Text + ".avi")
    {
        FramesPerSecond = int.Parse(textBoxFrameRate.Text),
        EmitIndex1 = true
    };
    var stream = writer.AddVideoStream();
    stream.Width = thisBitmap.Width;
    stream.Height = thisBitmap.Height;
    stream.Codec = KnownFourCCs.Codecs.Uncompressed;
    stream.BitsPerPixel = BitsPerPixel.Bpp32;
    int numberOfFrames = ((int.Parse(textBoxFrameRate.Text)) * (int.Parse(textBoxVideoLength.Text)));
    int count = 0;
    while (count <= numberOfFrames)
    {
        stream.WriteFrame(true, byteArray, 0, byteArray.Length);
        count++;
    }
    writer.Close();
    MessageBox.Show("Done");
}
private byte[] BitmapToByteArray(Bitmap img)
{
    ImageConverter converter = new ImageConverter();
    return (byte[])converter.ConvertTo(img, typeof(byte[]));
}
You're wrong in assuming that you can pass the output of ImageConverter to WriteFrame: that gives you an encoded image, not raw pixels. WriteFrame expects pixel data in bottom-to-top 32bpp format. See the example:
// Buffer for pixel data
var buffer = new byte[width * height * 4];
...
// Copy pixels from the Bitmap, assuming it has the expected 32bpp pixel format
var bits = bitmap.LockBits(new Rectangle(0, 0, width, height), ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
Marshal.Copy(bits.Scan0, buffer, 0, buffer.Length);
bitmap.UnlockBits(bits);
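Because the frame data must be bottom-to-top while LockBits on a GDI+ Bitmap yields top-down rows, the rows may need to be reversed while copying. A hedged sketch of that (the method name is illustrative; verify against the SharpAVI sample whether your pipeline needs the flip):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

// Copy pixels from a top-down Bitmap into a bottom-up 32bpp buffer.
static byte[] BitmapToBottomUp32bpp(Bitmap bitmap)
{
    int width = bitmap.Width, height = bitmap.Height;
    var buffer = new byte[width * height * 4];
    BitmapData bits = bitmap.LockBits(new Rectangle(0, 0, width, height),
        ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
    try
    {
        for (int y = 0; y < height; y++)
        {
            // Row y of the bitmap goes to row (height - 1 - y) of the frame.
            Marshal.Copy(IntPtr.Add(bits.Scan0, y * bits.Stride),
                buffer, (height - 1 - y) * width * 4, width * 4);
        }
    }
    finally
    {
        bitmap.UnlockBits(bits);
    }
    return buffer;
}
```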
You can see the code of a sample app as a reference:
https://github.com/baSSiLL/SharpAvi/blob/master/Sample/Recorder.cs
I have a FileStream which I'm trying to convert into an image. I have the following code:
FileStream imageFile = image["file"] as FileStream;
image is a map holding information on the image. Please can someone direct me on what I should do next?
Image.FromStream will take a stream and return an image. Incidentally, whether you use Bitmap.FromStream or Image.FromStream (indeed any of these methods), they all actually return a Bitmap, so the result can be cast from Image to Bitmap if you need it as one.
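A minimal sketch of that, assuming image["file"] really holds an open, readable stream:

```csharp
using System;
using System.Drawing;
using System.IO;

// Decode the stream into an Image; cast to Bitmap if you need Bitmap members.
using (FileStream imageFile = image["file"] as FileStream)
using (Image img = Image.FromStream(imageFile))
{
    Bitmap bmp = (Bitmap)img; // FromStream returns a Bitmap under the hood
    Console.WriteLine($"{bmp.Width}x{bmp.Height}");
}
```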
You can do the following:
int width = 128;
int height = width;
int stride = width / 8;
byte[] pixels = new byte[height * stride];
// Define the image palette
BitmapPalette myPalette = BitmapPalettes.Halftone256;
// Create a new empty image with the pre-defined palette
BitmapSource image = BitmapSource.Create(
    width,
    height,
    96,
    96,
    PixelFormats.Indexed1,
    myPalette,
    pixels,
    stride);
using (FileStream stream = new FileStream("new.jpg", FileMode.Create))
{
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    encoder.FlipHorizontal = true;
    encoder.FlipVertical = false;
    encoder.QualityLevel = 30;
    encoder.Rotation = Rotation.Rotate90;
    encoder.Frames.Add(BitmapFrame.Create(image));
    encoder.Save(stream);
}
I have an image in a byte[] array (I don't know the type of the image). I have to produce another byte[] array whose size is at most 50 kB.
How can I do some kind of scaling?
Unless you want to get into some serious math, you need to load your byte array into a memory stream, load an image from that memory stream, and use the built-in GDI functions in the System.Drawing namespace.
Doing a 25%, or 50% scale is easy. Beyond that, you need to start doing interpolation and differencing to make anything look halfway decent in binary data manipulation. You'll be several days into it before you can match what's already available in GDI.
System.IO.MemoryStream myMemStream = new System.IO.MemoryStream(myBytes);
System.Drawing.Image fullsizeImage = System.Drawing.Image.FromStream(myMemStream);
System.Drawing.Image newImage = fullsizeImage.GetThumbnailImage(newWidth, newHeight, null, IntPtr.Zero);
System.IO.MemoryStream myResult = new System.IO.MemoryStream();
newImage.Save(myResult, System.Drawing.Imaging.ImageFormat.Gif); // Or whatever format you want.
return myResult.ToArray(); // Returns a new byte array.
BTW - if you really need to figure out your source image type, see: How to check if a byte array is a valid image
OK, so after some experimenting, I have something like this:
public static byte[] Resize2Max50Kbytes(byte[] byteImageIn)
{
    byte[] currentByteImageArray = byteImageIn;
    double scale = 1f;
    if (!IsValidImage(byteImageIn))
    {
        return null;
    }
    MemoryStream inputMemoryStream = new MemoryStream(byteImageIn);
    Image fullsizeImage = Image.FromStream(inputMemoryStream);
    while (currentByteImageArray.Length > 50000)
    {
        using (Bitmap fullSizeBitmap = new Bitmap(fullsizeImage, new Size((int)(fullsizeImage.Width * scale), (int)(fullsizeImage.Height * scale))))
        using (MemoryStream resultStream = new MemoryStream())
        {
            fullSizeBitmap.Save(resultStream, fullsizeImage.RawFormat);
            currentByteImageArray = resultStream.ToArray();
        }
        scale -= 0.05f;
    }
    return currentByteImageArray;
}
Does anyone have another idea? Unfortunately Image.GetThumbnailImage() was producing very dirty images.
The updated answer below, using SixLabors.ImageSharp, also works in Docker.
The solution for .NET Core 3.1 and greater:
Install the System.Drawing.Common NuGet package:
Install-Package System.Drawing.Common
The code to change the size of the image from a byte array:
byte[] ReduceImage(byte[] bytes, int reducedWidth, int reducedHeight)
{
    using var memoryStream = new MemoryStream(bytes);
    using var originalImage = new Bitmap(memoryStream);
    var resized = new Bitmap(reducedWidth, reducedHeight);
    using var graphics = Graphics.FromImage(resized);
    graphics.CompositingQuality = CompositingQuality.HighSpeed;
    graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
    graphics.CompositingMode = CompositingMode.SourceCopy;
    graphics.DrawImage(originalImage, 0, 0, reducedWidth, reducedHeight);
    using var stream = new MemoryStream();
    resized.Save(stream, ImageFormat.Png);
    return stream.ToArray();
}
Update:
The approach above may not work on Linux, so a cross-platform solution is:
Install the SixLabors.ImageSharp NuGet package:
Install-Package SixLabors.ImageSharp
Then write the following code:
private static byte[] ReduceImage(byte[] bytes)
{
    using var memoryStream = new MemoryStream(bytes);
    using var image = Image.Load(memoryStream);
    image.Mutate(x => x.Resize(ReducedWidth, ReducedHeight));
    using var outputStream = new MemoryStream();
    image.Save(outputStream, new PngEncoder() /* or another encoder */);
    return outputStream.ToArray();
}
Suppose you read a file from disk:
FileStream streamObj = System.IO.File.OpenRead(@"C:\Files\Photo.jpg");
Byte[] newImage = UploadFoto(streamObj);
public static Byte[] UploadFoto(FileStream fileUpload)
{
    Byte[] imgByte = null;
    imgByte = lnkUpload(fileUpload);
    return imgByte;
}

private static Byte[] lnkUpload(FileStream img)
{
    byte[] resizedImage;
    using (Image orginalImage = Image.FromStream(img))
    {
        ImageFormat orginalImageFormat = orginalImage.RawFormat;
        int orginalImageWidth = orginalImage.Width;
        int orginalImageHeight = orginalImage.Height;
        int resizedImageWidth = 60; // Type here the width you want
        int resizedImageHeight = Convert.ToInt32(resizedImageWidth * orginalImageHeight / orginalImageWidth);
        using (Bitmap bitmapResized = new Bitmap(orginalImage, resizedImageWidth, resizedImageHeight))
        {
            using (MemoryStream streamResized = new MemoryStream())
            {
                bitmapResized.Save(streamResized, orginalImageFormat);
                resizedImage = streamResized.ToArray();
            }
        }
    }
    return resizedImage;
}
I have no definite implementation for you, but I would approach it this way:
Uncompressed, you can store 51200 values (50 kB), and you know the aspect ratio of the original, ratio = x / y. From
size = x * y and x = y * ratio
it follows that
y = sqrt(size / ratio)
x = y * ratio
For resampling the values I would use a filter kernel, e.g. Lanczos:
http://en.wikipedia.org/wiki/Lanczos_resampling
I haven't used it myself yet, but it sounds promising.
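A sketch of that dimension calculation (assuming ratio = x / y, an uncompressed budget of 51200 values, and illustrative names):

```csharp
using System;

// Largest (x, y) with x * y <= sizeBudget that preserves the original aspect ratio.
static (int x, int y) FitToBudget(int originalWidth, int originalHeight, int sizeBudget)
{
    double ratio = (double)originalWidth / originalHeight; // ratio = x / y
    int y = (int)Math.Sqrt(sizeBudget / ratio);            // size = x * y = ratio * y^2
    int x = (int)(y * ratio);
    return (x, y);
}

// Example: a 4:3 source with a 51200-value budget.
var (w, h) = FitToBudget(800, 600, 51200);
Console.WriteLine($"{w}x{h}"); // w * h stays at or under 51200
```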
I used this:
public static byte[] ImagenToByteArray(System.Drawing.Image imageIn)
{
    using (MemoryStream ms = new MemoryStream())
    {
        imageIn.Save(ms, System.Drawing.Imaging.ImageFormat.Gif);
        return ms.ToArray();
    }
}