I have an image stored in a byte[] array (I don't know the image type). I have to produce another byte[] array whose size is at most 50 kB.
How can I do some kind of scaling?
Unless you want to get into some serious math, you need to load your byte array into a memory stream, load an image from that memory stream, and use the built-in GDI functions in the System.Drawing namespace.
Doing a 25% or 50% scale is easy. Beyond that, you need to start doing interpolation and differencing to make anything look halfway decent when manipulating the binary data yourself. You'll be several days into it before you can match what's already available in GDI.
System.IO.MemoryStream myMemStream = new System.IO.MemoryStream(myBytes);
System.Drawing.Image fullsizeImage = System.Drawing.Image.FromStream(myMemStream);
System.Drawing.Image newImage = fullsizeImage.GetThumbnailImage(newWidth, newHeight, null, IntPtr.Zero);
System.IO.MemoryStream myResult = new System.IO.MemoryStream();
newImage.Save(myResult, System.Drawing.Imaging.ImageFormat.Gif); // Or whatever format you want.
return myResult.ToArray(); //Returns a new byte array.
BTW - if you really need to figure out your source image type, see: How to check if a byte array is a valid image
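For completeness, a byte-signature check like the one that question describes can be sketched as follows (the helper name and the set of formats covered are my own choices, not from the linked question):

```csharp
static class ImageTypeSniffer
{
    // Returns a rough format name based on well-known magic bytes, or null if unrecognized.
    public static string Detect(byte[] bytes)
    {
        if (bytes == null || bytes.Length < 4) return null;
        if (bytes[0] == 0xFF && bytes[1] == 0xD8) return "jpeg"; // JPEG SOI marker
        if (bytes[0] == 0x89 && bytes[1] == 0x50 && bytes[2] == 0x4E && bytes[3] == 0x47) return "png"; // \x89PNG
        if (bytes[0] == 0x47 && bytes[1] == 0x49 && bytes[2] == 0x46) return "gif"; // "GIF"
        if (bytes[0] == 0x42 && bytes[1] == 0x4D) return "bmp"; // "BM"
        return null;
    }
}
```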
OK, so after some experiments, I have something like this:
public static byte[] Resize2Max50Kbytes(byte[] byteImageIn)
{
    if (!IsValidImage(byteImageIn))
    {
        return null;
    }
    byte[] currentByteImageArray = byteImageIn;
    double scale = 1.0;
    using (MemoryStream inputMemoryStream = new MemoryStream(byteImageIn))
    using (Image fullsizeImage = Image.FromStream(inputMemoryStream))
    {
        while (currentByteImageArray.Length > 50000 && scale > 0)
        {
            using (Bitmap fullSizeBitmap = new Bitmap(fullsizeImage, new Size((int)(fullsizeImage.Width * scale), (int)(fullsizeImage.Height * scale))))
            using (MemoryStream resultStream = new MemoryStream())
            {
                fullSizeBitmap.Save(resultStream, fullsizeImage.RawFormat);
                currentByteImageArray = resultStream.ToArray();
            }
            scale -= 0.05;
        }
    }
    return currentByteImageArray;
}
Does anyone have another idea? Unfortunately, Image.GetThumbnailImage() was producing very dirty images.
The updated solution below, using SixLabors.ImageSharp, also works in Docker.
The solution for .NET Core 3.1 and greater:
Install the System.Drawing.Common NuGet package:
Install-Package System.Drawing.Common
The code to change the size of the image from a byte array:
byte[] ReduceImage(byte[] bytes)
{
    using var memoryStream = new MemoryStream(bytes);
    using var originalImage = new Bitmap(memoryStream);
    using var resized = new Bitmap(reducedWidth, reducedHeight);
    using var graphics = Graphics.FromImage(resized);
    graphics.CompositingQuality = CompositingQuality.HighSpeed;
    graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
    graphics.CompositingMode = CompositingMode.SourceCopy;
    graphics.DrawImage(originalImage, 0, 0, reducedWidth, reducedHeight);
    using var stream = new MemoryStream();
    resized.Save(stream, ImageFormat.Png);
    return stream.ToArray();
}
Update:
The approach above may not work on Linux, so a universal solution is:
Install the SixLabors.ImageSharp NuGet package:
Install-Package SixLabors.ImageSharp
Write the following code:
private static byte[] ReduceImage(byte[] bytes)
{
    using var memoryStream = new MemoryStream(bytes);
    using var image = Image.Load(memoryStream);
    image.Mutate(x => x.Resize(ReducedWidth, ReducedHeight));
    using var outputStream = new MemoryStream();
    image.Save(outputStream, new PngEncoder()); // or another encoder
    return outputStream.ToArray();
}
Suppose you read a file from disk:
FileStream streamObj = System.IO.File.OpenRead(@"C:\Files\Photo.jpg");
Byte[] newImage = UploadFoto(streamObj);
public static Byte[] UploadFoto(FileStream fileUpload)
{
    Byte[] imgByte = lnkUpload(fileUpload);
    return imgByte;
}

private static Byte[] lnkUpload(FileStream img)
{
    byte[] resizedImage;
    using (Image originalImage = Image.FromStream(img))
    {
        ImageFormat originalImageFormat = originalImage.RawFormat;
        int originalImageWidth = originalImage.Width;
        int originalImageHeight = originalImage.Height;
        int resizedImageWidth = 60; // Type here the width you want
        int resizedImageHeight = Convert.ToInt32(resizedImageWidth * originalImageHeight / originalImageWidth);
        using (Bitmap bitmapResized = new Bitmap(originalImage, resizedImageWidth, resizedImageHeight))
        using (MemoryStream streamResized = new MemoryStream())
        {
            bitmapResized.Save(streamResized, originalImageFormat);
            resizedImage = streamResized.ToArray();
        }
    }
    return resizedImage;
}
I have no definite implementation for you, but I would approach it this way:
Uncompressed, you can store 51200 byte values (50 kB), and you know the aspect ratio of the original, ratio = x / y.
Calculate the new dimensions from the ratio and the target size:
size = x * y
x = ratio * y
size = ratio * y * y
y = sqrt(size / ratio)
x = ratio * y
For resampling the pixel values I would use a filter kernel:
http://en.wikipedia.org/wiki/Lanczos_resampling
I haven't used it yet, but it sounds promising.
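The dimension calculation above can be sketched in code. This is only a sketch under the assumption of 3 bytes per pixel (uncompressed 24-bit RGB); compressed formats will not hit the byte budget exactly:

```csharp
using System;
using System.Drawing;

static class TargetSizeCalculator
{
    // Estimate dimensions whose uncompressed 24bpp pixel data fits within maxBytes,
    // preserving the original aspect ratio (ratio = x / y).
    public static Size ForByteBudget(int originalWidth, int originalHeight, int maxBytes)
    {
        double ratio = (double)originalWidth / originalHeight;
        double pixelBudget = maxBytes / 3.0;          // 3 bytes per pixel at 24bpp
        int y = (int)Math.Sqrt(pixelBudget / ratio);  // y = sqrt(size / ratio)
        int x = (int)(ratio * y);                     // x = ratio * y
        return new Size(x, y);
    }
}
```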
I used this:
public static byte[] ImagenToByteArray(System.Drawing.Image imageIn)
{
    using (MemoryStream ms = new MemoryStream())
    {
        imageIn.Save(ms, System.Drawing.Imaging.ImageFormat.Gif);
        return ms.ToArray();
    }
}
This question already has answers here:
A generic error occurred in GDI+, JPEG Image to MemoryStream
(36 answers)
Closed 3 years ago.
I'm resizing an image, what could be wrong with my code?
var newSize = ResizeImageFile(ConvertToBytes(myFile), 2048);
using (MemoryStream ms = new MemoryStream(newSize, 0, newSize.Length))
{
    ms.Write(newSize, 0, newSize.Length);
    using (Image image = Image.FromStream(ms, true))
    {
        image.Save(targetLocation, ImageFormat.Jpeg);
    }
}
I have used this function to resize my image
public static byte[] ResizeImageFile(byte[] imageFile, int targetSize) // Set targetSize to 1024
{
    using (Image oldImage = Image.FromStream(new MemoryStream(imageFile)))
    {
        Size newSize = CalculateDimensions(oldImage.Size, targetSize);
        using (Bitmap newImage = new Bitmap(newSize.Width, newSize.Height, PixelFormat.Format24bppRgb))
        {
            using (Graphics canvas = Graphics.FromImage(newImage))
            {
                canvas.SmoothingMode = SmoothingMode.AntiAlias;
                canvas.InterpolationMode = InterpolationMode.HighQualityBicubic;
                canvas.PixelOffsetMode = PixelOffsetMode.HighQuality;
                canvas.DrawImage(oldImage, new Rectangle(new Point(0, 0), newSize));
                MemoryStream m = new MemoryStream();
                newImage.Save(m, ImageFormat.Jpeg);
                return m.GetBuffer();
            }
        }
    }
}
I have been looking for answers for a long time, please help me.
Thank you
Since both the CalculateDimensions and ConvertToBytes methods are not shown, I assumed they are something like the following:
// Calculate the Size at which the image width and height are lower than the specified value
// (keeping the aspect ratio)
private static Size CalculateDimensions(Size size, int targetSize)
{
    double rate = Math.Max(size.Width * 1.0 / targetSize, size.Height * 1.0 / targetSize);
    int w = (int)Math.Floor(size.Width / rate);
    int h = (int)Math.Floor(size.Height / rate);
    return new Size(w, h);
}

// Convert an image file to a byte array
private static byte[] ConvertToBytes(string fileName)
{
    var result = File.ReadAllBytes(fileName);
    return result;
}
If your code does not work well, the problem must be somewhere in those methods.
Image image = new Image();
Bitmap bitmap = Bitmap.CreateBitmap(200, 100, Bitmap.Config.Argb8888);
Canvas canvas = new Canvas(bitmap);
var paint = new Paint();
paint.Color = Android.Graphics.Color.Red;
paint.SetStyle(Paint.Style.Fill);
Rect rect = new Rect(0, 0, 200, 100);
canvas.DrawRect(rect, paint);
Android.Widget.ImageView contains the method SetImageBitmap.
What is the best way to set a Xamarin.Forms.Image from my bitmap?
Convert the Bitmap to byte[] via http://forums.xamarin.com/discussion/5950/how-to-convert-from-bitmap-to-byte-without-bitmap-compress
There are two solutions mentioned.
var byteArray = ByteBuffer.Allocate(bitmap.ByteCount);
bitmap.CopyPixelsToBuffer(byteArray);
byte[] bytes = byteArray.ToArray<byte>();
return bytes;
(in case the first solution is still broken)
ByteBuffer buffer = ByteBuffer.Allocate(bitmap.ByteCount);
bitmap.CopyPixelsToBuffer(buffer);
buffer.Rewind();
IntPtr classHandle = JNIEnv.FindClass("java/nio/ByteBuffer");
IntPtr methodId = JNIEnv.GetMethodID(classHandle, "array", "()[B");
IntPtr resultHandle = JNIEnv.CallObjectMethod(buffer.Handle, methodId);
byte[] byteArray = JNIEnv.GetArray<byte>(resultHandle);
JNIEnv.DeleteLocalRef(resultHandle);
And then use
var image = new Image();
image.Source = ImageSource.FromStream(() => new MemoryStream(byteArray));
to create an Image.
I tried @Wosi's answer, but for some reason the image didn't render afterwards, and the code is specific to Android. I needed to go from a byte array to a bitmap and back again. This is what I did:
Code for turning a bitmap into a byte array:
byte[] bitmapData;
using (var stream = new MemoryStream())
{
tempBitmap.Compress(Android.Graphics.Bitmap.CompressFormat.Png, 0, stream);
bitmapData = stream.ToArray();
}
And the code for turning a byte array into a bitmap:
Android.Graphics.Bitmap tempBitmap = Android.Graphics.BitmapFactory.DecodeByteArray(imageByteArray, 0, imageByteArray.Length, options);
Where "options" is defined as follows:
Android.Graphics.BitmapFactory.Options options = new Android.Graphics.BitmapFactory.Options
{
InJustDecodeBounds = true
};
Android.Graphics.Bitmap result = Android.Graphics.BitmapFactory.DecodeByteArray(bitmapArray, 0, byteArrayLength, options);
//int imageHeight = options.OutHeight;
//int imageWidth = options.OutWidth;
With InJustDecodeBounds = true the decoder only reads the image bounds: it returns null instead of a decoded Bitmap and fills in options.OutHeight and options.OutWidth. In my case I needed the width and height to encode the image as a byte array again.
With this it is also possible to encode a byte array to a string and back again.
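If you also need the decoded pixels, and not just the bounds, the usual pattern is a two-pass decode: read the bounds first, then decode for real. A sketch, assuming the Xamarin.Android BitmapFactory API:

```csharp
var options = new Android.Graphics.BitmapFactory.Options { InJustDecodeBounds = true };
// First pass: returns null; only options.OutWidth / options.OutHeight are filled in.
Android.Graphics.BitmapFactory.DecodeByteArray(imageByteArray, 0, imageByteArray.Length, options);
int width = options.OutWidth;
int height = options.OutHeight;

// Second pass: actually decode the pixels.
options.InJustDecodeBounds = false;
var bitmap = Android.Graphics.BitmapFactory.DecodeByteArray(imageByteArray, 0, imageByteArray.Length, options);
```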
Setting an image source from a byte array is done as follows:
var imageSource = ImageSource.FromStream(() => new MemoryStream(ImageByteArray, 0, ImageByteArray.Length));
I'm having some trouble converting an image to a video using the SharpAVI.dll.
I have managed to produce a video file using a randomly generated byte array by using the documentation on SharpAVI's website:
Getting Started with SharpAVI
So the next step I thought I would take was to take an image, create a Bitmap from it, convert the bitmap to a byte array, and then simply write the byte array to each frame of the video file. When I run the program I get no errors, and a video file of an appropriate size is produced; however, the file is unreadable and will not open. I'm really struggling to see why this won't work. Any help would be greatly appreciated!
My Code:
private void GenerateSingleImageVideo()
{
    string imagePath = textBoxImagePath.Text;
    Bitmap thisBitmap;

    // Generate a bitmap from the image file.
    using (Stream bitmapStream = System.IO.File.Open(imagePath, FileMode.Open))
    {
        Image img = Image.FromStream(bitmapStream);
        thisBitmap = new Bitmap(img);
    }

    // Convert the bitmap to a byte array.
    byte[] byteArray = BitmapToByteArray(thisBitmap);

    // Create the writer of the file (to save the video).
    var writer = new AviWriter(textBoxFileName.Text + ".avi")
    {
        FramesPerSecond = int.Parse(textBoxFrameRate.Text),
        EmitIndex1 = true
    };
    var stream = writer.AddVideoStream();
    stream.Width = thisBitmap.Width;
    stream.Height = thisBitmap.Height;
    stream.Codec = KnownFourCCs.Codecs.Uncompressed;
    stream.BitsPerPixel = BitsPerPixel.Bpp32;

    int numberOfFrames = int.Parse(textBoxFrameRate.Text) * int.Parse(textBoxVideoLength.Text);
    int count = 0;
    while (count <= numberOfFrames)
    {
        stream.WriteFrame(true, byteArray, 0, byteArray.Length);
        count++;
    }
    writer.Close();
    MessageBox.Show("Done");
}

private byte[] BitmapToByteArray(Bitmap img)
{
    ImageConverter converter = new ImageConverter();
    return (byte[])converter.ConvertTo(img, typeof(byte[]));
}
You're wrong in assuming that you can pass the encoded bytes of a Bitmap object to the WriteFrame method. It expects raw pixel data in bottom-to-top 32bpp format. See the example below:
// Buffer for pixel data
var buffer = new byte[width * height * 4];
...
// Copy pixels from Bitmap assuming it has expected 32bpp pixel format
var bits = bitmap.LockBits(new Rectangle(0, 0, width, height), ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
Marshal.Copy(bits.Scan0, buffer, 0, buffer.Length);
bitmap.UnlockBits(bits);
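Since GDI+ bitmaps are stored top-down while the uncompressed AVI frame is expected bottom-up, the rows also need to be flipped. One simple (if not the fastest) way is to flip the bitmap before locking it; a sketch, assuming width, height, buffer, and bitmap come from the snippet above:

```csharp
// Flip vertically so the row order matches the bottom-up frame layout.
bitmap.RotateFlip(RotateFlipType.RotateNoneFlipY);
var bits = bitmap.LockBits(new Rectangle(0, 0, width, height),
                           ImageLockMode.ReadOnly,
                           PixelFormat.Format32bppRgb);
Marshal.Copy(bits.Scan0, buffer, 0, buffer.Length);
bitmap.UnlockBits(bits);
// Flip back if the bitmap will be reused.
bitmap.RotateFlip(RotateFlipType.RotateNoneFlipY);
```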
You can see the code of a sample app as a reference:
https://github.com/baSSiLL/SharpAvi/blob/master/Sample/Recorder.cs
I want to convert a byte array to a bitmap. I get this byte array from a capture card; it contains RGB24 data. When I convert the array to a Bitmap object I get the "Parameter is not valid" error.
This is my byte array: myByteArray {byte[921600]}
MemoryStream mStream = new MemoryStream(myByteArray);
Bitmap bi = new Bitmap(mStream);
and
using (MemoryStream mStream = new MemoryStream(myByteArray))
{
    Bitmap bi = (Bitmap)System.Drawing.Image.FromStream(mStream);
}
and
using (MemoryStream mStream = new MemoryStream())
{
    mStream.Write(myByteArray, 0, myByteArray.Length);
    mStream.Seek(0, SeekOrigin.Begin);
    Bitmap bm = new Bitmap(mStream);
    return bm;
}
Is this happening because of the size of the array?
Can anyone give me a method to do this task?
It would be greatly appreciated.
Thank you.
If your myByteArray is raw image data, this should work:
Bitmap bmp = null;
unsafe
{
    fixed (byte* p = myByteArray)
    {
        IntPtr unmanagedPointer = (IntPtr)p;
        // Deduced from your buffer size: 921600 = 640 * 480 * 3 bytes
        int width = 640;
        int height = 480;
        bmp = new Bitmap(width, height, width * 3, System.Drawing.Imaging.PixelFormat.Format24bppRgb, unmanagedPointer);
    }
}
I am currently trying to resize an image to a thumbnail, to show as a preview when it's done uploading. I am using the Fine Uploader plugin for the uploading part. I consistently keep getting a "Parameter is not valid" error. I've seen many posts related to this and tried most of the solutions, but with no success. Here is a snippet of the code:
public static byte[] CreateThumbnail(byte[] PassedImage, int LargestSide)
{
    byte[] ReturnedThumbnail = null;
    using (MemoryStream StartMemoryStream = new MemoryStream(),
                        NewMemoryStream = new MemoryStream())
    {
        StartMemoryStream.Write(PassedImage, 0, PassedImage.Length); // error being fired on this line
        System.Drawing.Bitmap startBitmap = new Bitmap(StartMemoryStream);
        int newHeight;
        int newWidth;
        double HW_ratio;
        if (startBitmap.Height > startBitmap.Width)
        {
            newHeight = LargestSide;
            HW_ratio = (double)LargestSide / (double)startBitmap.Height;
            newWidth = (int)(HW_ratio * (double)startBitmap.Width);
        }
        else
        {
            newWidth = LargestSide;
            HW_ratio = (double)LargestSide / (double)startBitmap.Width;
            newHeight = (int)(HW_ratio * (double)startBitmap.Height);
        }
        System.Drawing.Bitmap newBitmap = new Bitmap(newWidth, newHeight);
        newBitmap = ResizeImage(startBitmap, newWidth, newHeight);
        newBitmap.Save(NewMemoryStream, System.Drawing.Imaging.ImageFormat.Jpeg);
        ReturnedThumbnail = NewMemoryStream.ToArray();
    }
    return ReturnedThumbnail;
}
I'm out of ideas, any help is appreciated.
Your error is in the new Bitmap(startMemoryStream) line, not the line above.
The documentation states that this exception can occur when:
stream does not contain image data or is null.
-or-
stream contains a PNG image file with a single dimension greater than 65,535 pixels.
You should check that you have a valid PNG file in there. For example, write it to a file and try opening it in an image viewer.
That code is dangerous - every instance of a System.Drawing class must be wrapped in a using(){} block.
Here's an alternate solution that uses the ImageResizer NuGet package and resizes the image safely.
var ms = new MemoryStream();
ImageResizer.ImageBuilder.Current.Build(PassedImage, ms, new ResizeSettings() { MaxWidth = LargestSide, MaxHeight = LargestSide });
return ImageResizer.ExtensionMethods.StreamExtensions.CopyToBytes(ms);