I'm trying to write an image to the output stream through an HttpHandler, but it keeps throwing an ArgumentException. I've googled the issue and tried many things, but I still can't fix the problem. Anyway, here's the code:
public void ProcessRequest(HttpContext context)
{
    Int32 imageId = context.Request.QueryString["id"] != null ?
        Convert.ToInt32(context.Request.QueryString["id"]) : default(Int32);
    if (imageId != 0)
    {
        //context.Response.ContentType = "image/jpeg";
        Byte[] imageData = this._imageInfoManager.GetTradeMarkImage(imageId);
        using (MemoryStream ms = new MemoryStream(imageData, 0, imageData.Length))
        {
            using (Image image = Image.FromStream(ms, true, true)) // this line throws
            {
                image.Save(context.Response.OutputStream, ImageFormat.Jpeg);
            }
        }
    }
    throw new ArgumentException("Image could not be found.");
}
Note that the imageData byte array is not empty and the memory stream is being filled up correctly. Any thoughts?
UPDATE:
Here's the code for GetTradeMarkImage... Note that the images are stored in an SQL Server database in an image column.
public Byte[] GetTradeMarkImage(Int32 id)
{
    object result = DB.ExecuteScalar(SqlConstants.SqlProcedures.Request_GetImageById, id);
    if (result != null)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            BinaryFormatter bf = new BinaryFormatter();
            bf.Serialize(ms, result);
            return ms.ToArray();
        }
    }
    return null;
}
Okay, now that you've posted the GetTradeMarkImage code, that's almost certainly the problem:
using (MemoryStream ms = new MemoryStream())
{
    BinaryFormatter bf = new BinaryFormatter();
    bf.Serialize(ms, result);
    return ms.ToArray();
}
Why would you expect the output of BinaryFormatter to be a valid image stream? It's not clear what's in your database (a BLOB?) or what the execution-time type of result is here (which you should find out by debugging), but you shouldn't be using BinaryFormatter here. I suspect you just want to get the raw data out of the database and put that into the byte array.
If you're lucky, you may just be able to cast result to byte[] to start with. (I don't know what ExecuteScalar does with blobs, and this obviously isn't the "normal" ExecuteScalar method anyway). Otherwise, you may need to use an alternative approach, opening a DataReader and getting the value that way.
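If that assumption holds, a minimal sketch of GetTradeMarkImage without the BinaryFormatter might look like this (it assumes ExecuteScalar already hands back the raw image bytes; only the identifiers from your own code are reused, everything else is illustrative):

public Byte[] GetTradeMarkImage(Int32 id)
{
    // Assumption: the stored procedure selects the raw image column, so
    // ExecuteScalar returns the BLOB as a byte[] (or DBNull when no row matches).
    object result = DB.ExecuteScalar(SqlConstants.SqlProcedures.Request_GetImageById, id);
    if (result == null || result == DBNull.Value)
        return null;

    // No BinaryFormatter: the bytes stored in the database are already the image data.
    return (byte[])result;
}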
Let's assume I let the user choose an image from their computer. I load the file into a picture box. Here is the loading method:
public static Image LoadImageFromFile(string fileName)
{
    Image result = null;
    if (!File.Exists(fileName))
        return result;
    FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read);
    try
    {
        using (Image image = Image.FromStream(fs))
        {
            ImageManipulation.RotateImageByExifOrientationData(image);
            result = (Image)image.Clone();
            image.Dispose();
        }
    }
    finally
    {
        fs.Close();
    }
    return result;
}
Then, when the user clicks the Save button, I convert the image into a byte array and save it to the database. Here is the conversion code:
public static byte[] ImageToByteArray(Image image)
{
    if (image == null)
        return null;
    using (var ms = new MemoryStream())
    {
        ImageFormat imageFormat = image.RawFormat;
        ImageCodecInfo codec = ImageCodecInfo.GetImageDecoders().FirstOrDefault(c => c.FormatID == imageFormat.Guid);
        var encoderParameters = new EncoderParameters(1);
        encoderParameters.Param[0] = new EncoderParameter(Encoder.Quality, 100L);
        if (codec != null)
            image.Save(ms, codec, encoderParameters);
        else
            image.Save(ms, ImageFormat.Jpeg);
        return ms.ToArray();
    }
}
But here is the problem:
I have a JPG file on disk. I can load it into my picture box without any problem; the picture is perfectly visible in it. But when I save it, the code gives me "A generic error occurred in GDI+." at the 'image.Save(ms, codec, encoderParameters)' line.
Even odder: I don't get this error all the time; it happens only with specific images. For example, I downloaded an image from the internet, cropped it in Paint, and saved it as a JPG: the error happened. I opened it in Paint again and saved it as a PNG: no error! That is why I am really confused. And yes, I have already tried not disposing the image; it doesn't help.
I know it might be a stupid question but I am desperately stuck here :)
Thank you in advance.
I didn't find out why my code wasn't working, but I substituted my loading method with the code below. It releases the file and at the same time works as it should.
public static Image LoadImageFromFile(string fileName)
{
    using (Bitmap bmb = new Bitmap(fileName))
    {
        MemoryStream m = new MemoryStream();
        bmb.Save(m, ImageFormat.Bmp);
        return Image.FromStream(m);
    }
}
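A likely explanation for the original failure (hedged, since the question never pins it down): GDI+ reads lazily from the stream passed to Image.FromStream, so that stream must stay open for the lifetime of the Image, and a Clone() does not always fully detach from it; closing the FileStream in the finally block can therefore break a later Save() for some inputs. A minimal alternative sketch under that assumption (the method name is illustrative), which releases the file handle immediately while keeping an in-memory stream alive:

public static Image LoadImageFromMemorySketch(string fileName)
{
    // Read the file fully so the file handle is released right away.
    byte[] bytes = File.ReadAllBytes(fileName);

    // The MemoryStream is intentionally not disposed: GDI+ may keep reading
    // from it for as long as the returned Image is in use.
    var ms = new MemoryStream(bytes);
    return Image.FromStream(ms);
}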
For two days I have been trying to manage my images in a WPF application, but I keep running into error after error.
The image is shown in a System.Windows.Controls.Image.
For this reason I am trying to work with a BitmapImage variable.
Now I get the error "Cannot access a closed Stream" and I cannot find a solution.
I have created two functions for the conversion:
public static BitmapImage ConvertToBitMapImage(byte[] bytes)
{
    if (bytes == null || bytes.Length == 0) return null;
    var image = new BitmapImage();
    using (var mem = new MemoryStream(bytes))
    {
        mem.Position = 0;
        image.BeginInit();
        image.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
        image.CacheOption = BitmapCacheOption.OnLoad;
        image.UriSource = null;
        image.StreamSource = mem;
        image.EndInit();
    }
    //image.Freeze();
    return image;
}
public static byte[] ImageToByte(BitmapImage imageSource)
{
    Stream stream = imageSource.StreamSource;
    Byte[] buffer = null;
    if (stream != null && stream.Length > 0)
    {
        using (BinaryReader br = new BinaryReader(stream))
        {
            buffer = br.ReadBytes((Int32)stream.Length);
        }
    }
    return buffer;
}
In my object I have a property:
public BitmapImage MiniatureApp
{
    get
    {
        if (IMAGES != null)
            _MiniatureApp = Tools.BinaryImageConverter.ConvertToBitMapImage(IMAGES.IMAGE);
        return _MiniatureApp;
    }
    set
    {
        if (this.IMAGES != null)
            this.IMAGES.IMAGE = Tools.BinaryImageConverter.ImageToByte((BitmapImage)value);
        else
        {
            IMAGES img = new IMAGES();
            img.NOM = "";
            img.IMAGE = Tools.BinaryImageConverter.ImageToByte((BitmapImage)value);
            this.IMAGES = img;
        }
        NotifyPropertyChanged();
    }
}
And in my main I do this:
FileStream fs = new FileStream(pathImage, FileMode.Open, FileAccess.Read);
byte[] data = new byte[fs.Length];
fs.Read(data, 0, System.Convert.ToInt32(fs.Length));
fs.Close();
VMAppareil.VMApp.CurrentAppareil.MiniatureApp = Tools.BinaryImageConverter.ConvertToBitMapImage(data);
Is there a solution to my problem, or a better way to do this?
Gobelet.
You can fix your solution by replacing the ImageToByte method with the following (borrowed from https://stackoverflow.com/a/6597746/5574010):
public static byte[] ImageToByte(BitmapImage imageSource)
{
    var encoder = new JpegBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(imageSource));
    using (var ms = new MemoryStream())
    {
        encoder.Save(ms);
        return ms.ToArray();
    }
}
Your previous solution did not work because the ConvertToBitMapImage method closes the stream that you assign to image.StreamSource as soon as your code exits the using statement. When you call the ImageToByte method as part of the MiniatureApp setter, the StreamSource is already closed and you get the error.
I'm not really sure what you're doing with the getter-setter.
For getting a byte array from a file, I would have this:
public static byte[] FileToByteArray(string fileName)
{
    byte[] fileData = null;
    using (FileStream fs = File.OpenRead(fileName))
    using (var binaryReader = new BinaryReader(fs))
    {
        fileData = binaryReader.ReadBytes((int)fs.Length);
    }
    return fileData;
}
And once you have the bytes from that function, you can use ConvertToBitMapImage() and assign that to the Image.Source, like this:
byte[] bytes = FileToByteArray(fileName);
YourImage.Source = ConvertToBitMapImage(bytes);
I'm not sure why you would need to convert a BitmapImage into bytes, but you should be able to call your function directly for that:
BitmapImage bImg = ConvertToBitMapImage(bytes);
byte[] roundTripped = ImageToByte(bImg); // these should be the same bytes as went in
Set it up like this and step through the code to see where it snags. It's possible the bytes might need encoding when you grab it in ImageToByte().
But also, don't just set the byte[] directly on IMAGES or this.IMAGES.IMAGE, because that isn't an Image object, it's a byte[]. Without knowing what your IMAGES model looks like and what its property types are, it's hard to tell what is being set to what and why, but this looks like an over-complication: just make the function calls where they are needed, without a getter-setter and an IMAGES model in the way.
My end goal is to use protobuf-net and GZipStream in an attempt to compress a List<MyCustomType> object to store in a varbinary(max) field in SQL Server. I'm working on unit tests to understand how everything works and fits together.
Target .NET framework is 3.5.
My current process is:
1. Serialize the data with protobuf-net (good).
2. Compress the serialized data from #1 with GZipStream (good).
3. Convert the compressed data to a base64 string (good).
At this point, the value from step #3 will be stored in a varbinary(max) field. I have no control over this. The steps resume with taking a base64 string and deserializing it to a concrete type.
4. Convert the base64 string to a byte[] (good).
5. Decompress the data with GZipStream (good).
6. Deserialize the data with protobuf-net (bad).
Can someone assist with why the call to Serializer.Deserialize<string> returns null? I'm stuck on this one and hopefully a fresh set of eyes will help.
FWIW, I tried another version of this using List<T>, where T is a custom class I created, and Deserialize<> still returns null.
FWIW 2, data.txt is a 4 MB plaintext file residing on my C: drive.
[Test]
public void ForStackOverflow()
{
    string data = "hi, my name is...";
    //string data = File.ReadAllText(@"C:\Temp\data.txt");

    string serializedBase64;
    using (MemoryStream protobuf = new MemoryStream())
    {
        Serializer.Serialize(protobuf, data);
        using (MemoryStream compressed = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(compressed, CompressionMode.Compress))
            {
                byte[] s = protobuf.ToArray();
                gzip.Write(s, 0, s.Length);
                gzip.Close();
            }
            serializedBase64 = Convert.ToBase64String(compressed.ToArray());
        }
    }

    byte[] base64byteArray = Convert.FromBase64String(serializedBase64);
    using (MemoryStream base64Stream = new MemoryStream(base64byteArray))
    {
        using (GZipStream gzip = new GZipStream(base64Stream, CompressionMode.Decompress))
        {
            using (MemoryStream plainText = new MemoryStream())
            {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
                {
                    plainText.Write(buffer, 0, read);
                }

                // why does this call to Deserialize return null?
                string deserialized = Serializer.Deserialize<string>(plainText);
                Assert.IsNotNull(deserialized);
                Assert.AreEqual(data, deserialized);
            }
        }
    }
}
Because you didn't rewind plainText after writing to it. Actually, that entire Stream is unnecessary - this works:
using (MemoryStream base64Stream = new MemoryStream(base64byteArray))
{
    using (GZipStream gzip = new GZipStream(
        base64Stream, CompressionMode.Decompress))
    {
        string deserialized = Serializer.Deserialize<string>(gzip);
        Assert.IsNotNull(deserialized);
        Assert.AreEqual(data, deserialized);
    }
}
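(If you prefer to keep the intermediate plainText buffer from the original test, rewinding it before deserializing also works; a minimal sketch:)

// Rewind the buffer so protobuf-net reads from the start rather than the end.
plainText.Position = 0;
string deserialized = Serializer.Deserialize<string>(plainText);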
Likewise, this should work for the serialize:
using (MemoryStream compressed = new MemoryStream())
{
    using (GZipStream gzip = new GZipStream(
        compressed, CompressionMode.Compress, true))
    {
        Serializer.Serialize(gzip, data);
    }
    serializedBase64 = Convert.ToBase64String(
        compressed.GetBuffer(), 0, (int)compressed.Length);
}
After serializing an object in C# I upload it to my database; each of the hex strings looks like this:
0x0001000000FFFFFFFF01000000000000000C0200000046496E76656E746F72794F626A6563742C2056657273696F6E3D312E302E302E302C2043756C747572653D6E65757472616C2C205075626C69634B6579546F6B656E3D6E756C6C050100000019496E76656E746F72792E496E76656E746F72794F626A65637404000000
I was expecting each of the objects to be the same when I deserialized them (meaning an error in my code). But they are not; instead they represent the correct, unique objects.
Is there some kind of pointer association happening in SQL Server?
Serialize method:
public byte[] serialize()
{
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, this); // NOTE: this refers to an InventoryObject
    byte[] returnVal = ms.ToArray();
    ms.Close();
    return returnVal;
}
Deserialize method:
public static InventoryObject deserialize(byte[] arrBytes)
{
    MemoryStream memStream = new MemoryStream();
    BinaryFormatter binForm = new BinaryFormatter();
    memStream.Write(arrBytes, 0, arrBytes.Length);
    memStream.Seek(0, SeekOrigin.Begin);
    InventoryObject obj = binForm.Deserialize(memStream) as InventoryObject;
    return obj;
}
I figured it out. SQL Server 2000 does not show you the entire hex string in Query Analyzer if it is longer than a certain length. Thus my hex strings actually are different.
I have run into an issue with casting from HttpInputStream to FileStream.
What did I do?
I have an HttpPostedFileBase object and I want a FileStream.
I wrote:
public void Test(HttpPostedFileBase postedFile) {
    FileStream fileStream = (FileStream)(postedFile.InputStream); // throws an exception
    FileStream anotherFileStream = postedFile.InputStream as FileStream; // null
}
I also tried:
public void Test(HttpPostedFileBase postedFile) {
    Stream stream = postedFile.InputStream as Stream;
    FileStream myFile = (FileStream)stream;
}
But no success.
Why is postedFile.InputStream of type HttpInputStream?
And how can I solve this issue?
Thanks
public byte[] LoadUploadedFile(HttpPostedFileBase uploadedFile)
{
    var buf = new byte[uploadedFile.InputStream.Length];
    uploadedFile.InputStream.Read(buf, 0, (int)uploadedFile.InputStream.Length);
    return buf;
}
The stream that you get from the HTTP call is read-only and sequential (non-seekable), while a FileStream is read/write and seekable. You will first need to read the entire stream from the HTTP call into a byte array, then create the FileStream from that array.
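A minimal sketch of that approach, assuming you really need a FileStream rather than just the bytes (the temp-file location and method name are illustrative, and Stream.CopyTo requires .NET 4 or later):

public FileStream ToFileStream(HttpPostedFileBase postedFile)
{
    // Write the uploaded content to a temporary file on disk...
    string tempPath = Path.GetTempFileName();
    using (FileStream target = File.Create(tempPath))
    {
        postedFile.InputStream.CopyTo(target);
    }

    // ...then hand back a seekable, file-backed stream over that copy.
    return File.OpenRead(tempPath);
}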
I used the following and it worked just fine for the same situation
MemoryStream streamIWant = new MemoryStream();
using (Stream mystream = (Stream)AmazonS3Service.GetObjectStream(AWSAlbumBucketName, ObjectId))
{
    mystream.CopyTo(streamIWant);
}
streamIWant.Position = 0; // rewind so the caller reads from the beginning
return streamIWant;
GetObjectStream returns the same type of stream mentioned in the question.
You can use the .SaveAs method to save the file content. The stream is an HttpInputStream, probably because the file is uploaded through HTTP (the browser).
postedFile.SaveAs("Full Path to file name");
You can also use CopyTo
FileStream f = new FileStream(fullPath, FileMode.CreateNew);
postedFile.InputStream.CopyTo(f);
f.Close();
The code below worked for me.
Use a BinaryReader object to get a byte array from the stream, like this:
byte[] fileData = null;
using (var binaryReader = new BinaryReader(Request.Files[0].InputStream))
{
    fileData = binaryReader.ReadBytes(Request.Files[0].ContentLength);
}
How to create byte array from HttpPostedFile
It will work for you.
// ASP.NET Core: IFormFile, typically bound as an action-method parameter
IFormFile file;
if (file != null)
{
    await using var memoryStream = new MemoryStream();
    await file.CopyToAsync(memoryStream);
    byte[] image = memoryStream.ToArray();
}
First convert to a byte array, then convert to a Stream.
[HttpPost]
[Route("api/TestReader/UploadTestReaderStudentsGrade")]
public IHttpActionResult UploadTestReaderStudentsGrade()
{
    HttpPostedFile httpPostedFile = HttpContext.Current.Request.Files[0];

    var fileBytes = new byte[httpPostedFile.InputStream.Length];
    httpPostedFile.InputStream.Read(fileBytes, 0, (int)httpPostedFile.InputStream.Length); // to byte array

    Stream stream = new MemoryStream(fileBytes); // to Stream
    Workbook workbook = new Workbook(stream);

    return Ok();
}