Bitmap.Save() method exception - C#

I am trying to write a Windows application to convert .avi videos to bitmap frames. I am able to get the bitmaps, but I am having problems saving them.
Frames are saved perfectly up to the 1649th frame. After that I get this exception:
Attempted to read or write protected memory. This is often an indication that other memory is corrupt
I ran the code several times; it always throws the above exception when processing the 1649th frame. The output folder is empty at the beginning, and its size is 389 MB when the program stops.
I am guessing that Windows does not allow a program to write this amount of data in a short interval, but I am not sure, and I don't know how to fix it. Can anyone help?
for (counter = reader.Start; counter < (reader.Start + reader.Length); counter++)
{
    DummyBitmap = reader.GetNextFrame();
    DummyBitmap.Save(folderBrowserDialog2.SelectedPath + "\\" + counter.ToString() + ".bmp");
    reader.Position++;
}

The Bitmap class implements the IDisposable interface, so it would be wise to use it like this:
using (var b = new Bitmap(...))
{
}
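Applied to the loop from the question, a minimal sketch might look like this (assuming reader.GetNextFrame() returns a new Bitmap that the caller owns):
for (counter = reader.Start; counter < (reader.Start + reader.Length); counter++)
{
    // Dispose each frame as soon as it is saved so its unmanaged GDI+ memory is released.
    using (Bitmap frame = reader.GetNextFrame())
    {
        frame.Save(folderBrowserDialog2.SelectedPath + "\\" + counter.ToString() + ".bmp");
    }
    reader.Position++;
}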
Also, maybe this post can give you some answers: Bitmap memory leak.

Related

C# System.OutOfMemoryException when Creating New Array

I am reading files into an array; here is the relevant code. A new DiskReader is created for each file, and the path is determined using an OpenFileDialog.
class DiskReader
{
    // from variables section:
    long MAX_STREAM_SIZE = 300 * 1024 * 1024; // 300 MB
    FileStream fs;
    public Byte[] fileData;
    ...

    // Get file size, check it is within the allowed size (MAX_STREAM_SIZE), start process including progress bar.
    using (fs = File.OpenRead(path))
    {
        if (fs.Length < MAX_STREAM_SIZE)
        {
            long NumBytes = (fs.Length < MAX_STREAM_SIZE ? fs.Length : MAX_STREAM_SIZE);
            updateValues[0] = (NumBytes / 1024 / 1024).ToString("#,###.0");
            result = LoadData(NumBytes);
        }
        else
        {
            // Need something to handle big files
        }
        if (result)
        {
            mainForm.ShowProgress(true);
            bw.RunWorkerAsync();
        }
    }
    ...

    bool LoadData(long NumBytes)
    {
        try
        {
            fileData = new Byte[NumBytes];
            fs.Read(fileData, 0, fileData.Length);
            return true;
        }
        catch (Exception e)
        {
            return false;
        }
    }
The first time I run this, it works fine. The second time I run it, sometimes it works fine, but most times it throws a System.OutOfMemoryException at
[Edit:
"first time I run this" was a bad choice of words. I meant that starting the programme and opening a file works fine; the problem comes when I try to open a different file without exiting the programme. When I open the second file, I am setting the DiskReader to a new instance, which means the fileData array is also a new instance. I hope that makes it clearer.]
fileData = new Byte[NumBytes];
There is no obvious pattern to it running and throwing an exception.
I don't think it's relevant, but although the maximum file size is set to 300 MB, the files I am using to test this are between 49 and 64 MB.
Any suggestions on what is going wrong here and how I can correct it?
If the exception is being thrown at that line only, then my guess is that you've got a problem somewhere else in your code, as the comments suggest. Reading the documentation of that exception here, I'd bet you call this function one too many times somewhere and simply go over the limit on object length in memory, since there don't seem to be any problem spots in the code that you posted.
The fs.Length property requires the whole stream to be evaluated, which means reading the file anyway. Try doing something like:
byte[] result;
if (new FileInfo(path).Length < MAX_STREAM_SIZE)
{
    result = File.ReadAllBytes(path);
}
Also, depending on your needs, you might avoid using a byte array and read the data directly from the file stream. This should have a much lower memory footprint.
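For instance, a minimal sketch of chunked processing straight from the stream (ProcessChunk is hypothetical, standing in for whatever is done with the data):
using (FileStream stream = File.OpenRead(path))
{
    byte[] buffer = new byte[64 * 1024]; // 64 KB chunks instead of one large array
    int bytesRead;
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ProcessChunk(buffer, bytesRead); // hypothetical: consume the data incrementally
    }
}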
If I understand correctly what you want to do, I have this proposal: the best option is to allocate one static array of a defined MAX size at the beginning, and then keep that array, only filling it with new data from another file. This way your memory usage should be absolutely fine. You just need to store the file size in a separate variable, because the array will always have the same MAX size.
This is a common approach in systems with automatic memory management: the program is faster when you allocate a constant amount of memory at the start and then never allocate anything during the computation, because the garbage collector does not have to run as often.
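A minimal sketch of that idea (the class and member names are illustrative, not from the original code):
using System;
using System.IO;

class PreallocatedDiskReader
{
    const int MAX_STREAM_SIZE = 300 * 1024 * 1024; // 300 MB, allocated exactly once
    readonly byte[] fileData = new byte[MAX_STREAM_SIZE];
    int fileLength; // number of valid bytes currently in fileData

    public void LoadFile(string path)
    {
        using (FileStream fs = File.OpenRead(path))
        {
            fileLength = (int)Math.Min(fs.Length, (long)MAX_STREAM_SIZE);
            int offset = 0;
            // FileStream.Read may return fewer bytes than requested, so loop until done.
            while (offset < fileLength)
            {
                int read = fs.Read(fileData, offset, fileLength - offset);
                if (read == 0) break;
                offset += read;
            }
        }
    }
}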

Writing to file, memory used steadily increasing

I have an application where I need to write binary data to a file constantly. The pieces of data are small, about 1K each. The computers this runs on aren't great and are running XP. I've run into the problem that when I turn on logging, the computers get totally hosed; I watch Task Manager and see the memory usage going up and up until the application crashes.
A coworker suggested that I keep the packets in memory until a certain amount of time has passed and then write them all at once instead of writing each one separately. I tried that; same issue.
This is the code (loggingBuffer is the List<byte[]> I'm storing the packets in while the interval passes):
if ((DateTime.Now - lastStoreTime).TotalSeconds > 10)
{
    string fileName = @"C:\Storage\file";
    FileMode fm = File.Exists(fileName) ? FileMode.Append : FileMode.Create;
    using (BinaryWriter w = new BinaryWriter(File.Open(fileName, fm), Encoding.ASCII))
    {
        foreach (byte[] packetData in loggingBuffer)
        {
            w.Write(packetData);
        }
    }
    loggingBuffer.Clear();
    lastStoreTime = DateTime.Now;
}
Is there anything different I should be doing to accomplish this?
Seems to me that, while you're writing every 10 seconds, you could close the file in between and clean up all related file-writing things. Perhaps that would solve your problem.
Secondly, I'd suggest creating the BinaryWriter outside the function where you actually write the data. It'll keep things clearer. In your current code you're checking each time whether to append data or to create a new file and then writing to it. If you do this outside the function and call it just once, perhaps that will save memory too. All untested by me, that is :)
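A minimal sketch of that suggestion, with the writer opened once and reused (names are illustrative, not from the original code):
// Opened once, e.g. at application startup; disposed at shutdown.
BinaryWriter logWriter;

void OpenLog(string fileName)
{
    FileMode fm = File.Exists(fileName) ? FileMode.Append : FileMode.Create;
    logWriter = new BinaryWriter(File.Open(fileName, fm), Encoding.ASCII);
}

void FlushBuffer(List<byte[]> loggingBuffer)
{
    foreach (byte[] packetData in loggingBuffer)
    {
        logWriter.Write(packetData);
    }
    logWriter.Flush(); // push the data to disk without reopening the file
    loggingBuffer.Clear();
}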

GhostScript Rasterizer Out of Memory Exception

I am working with a 32-bit console application that operates as a background processor. The part I am working on uses Ghostscript to perform OCR on PDFs. Each page of a PDF is rendered to a PNG image in a temp folder, which the OCR reader then reads. The OCR text is saved to a database, and the files in the temp folder are then deleted.
The problem is the GhostscriptRasterizer object eating all of the memory the processor has available. When I call the GhostscriptRasterizer.GetPage(dpi, dpi, pageNumber) method, I either get an OutOfMemoryException or a System.ArgumentException with the message "Parameter is not valid". My research on the second exception tells me it is really a symptom of the first. The method call eats all of the available memory.
The GetPage method is creating a System.Drawing.Bitmap image which requires contiguous unfragmented memory. The problem code begins here.
try
{
    img = rasterizer.GetPage(dpi, dpi, pageNumber);
}
catch (OutOfMemoryException ex)
{
    img = GetImage(rasterizer, dpi, pageNumber, ms);
}
catch (System.ArgumentException ex)
{
    img = GetImage(rasterizer, dpi, pageNumber, ms);
}
The GetImage method I wrote looks like this.
public Image GetImage(GhostscriptRasterizer rasterizer, int dpi, int pageNumber, MemoryStream ms)
{
    rasterizer.Close();
    rasterizer.Dispose();
    rasterizer = new GhostscriptRasterizer();
    rasterizer.Open(ms);
    dpi = dpi - 50;
    Image image = null;
    if (dpi > 0)
    {
        try
        {
            image = rasterizer.GetPage(dpi, dpi, pageNumber);
        }
        catch (OutOfMemoryException ex)
        {
            image = GetImage(rasterizer, dpi, pageNumber, ms);
        }
        catch (System.ArgumentException ex)
        {
            image = GetImage(rasterizer, dpi, pageNumber, ms);
        }
    }
    return image;
}
The dpi I start with is 300, and it has worked for 95% of the documents we have run through the first test of this system. However, for certain pages 300 dpi is clearly too high, as I get the OutOfMemoryException. It looks like some of the pages are about 35 x 59 inches; I have no control over this. The solution for me is to keep trying at a lower and lower dpi until I have something that doesn't eat all of the memory. However, all of that memory remains in the rasterizer object, so I need to dispose of it somehow. Calling rasterizer.Close() gives me the following error.
Managed Debugging Assistant 'FatalExecutionEngineError' has detected a problem in 'F:\Development\bin\Debug\Processor.Run.vshost.exe'.
Additional information: The runtime has encountered a fatal error. The address of the error was at 0x7331e8c6, on thread 0x3e90. The error code is 0xc0000005. This error may be a bug in the CLR or in the unsafe or non-verifiable portions of user code. Common sources of this bug include user marshaling errors for COM-interop or PInvoke, which may corrupt the stack.
Removing the Close() call and calling rasterizer.Dispose() gives me:
An unhandled exception of type 'System.AccessViolationException' occurred in Ghostscript.NET.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
I even tried to just break on an exception and return the file list, and this still required me not to use a using declaration for the rasterizer, because I got the same exception at the end of the using block, where of course it tries to Dispose of the object. It appears the garbage collector picked up that memory later down the line, but that does not in any way solve my problem: I still have no way of rasterizing the page within the same job.
The only solution I can think of is somehow resizing the PDF ahead of time, but I'm hoping someone knows a way of disposing that memory and re-rasterizing at a new, lower dpi.
You can write PostScript that alters the media size when a PDF requests a large media. But that would require some PostScript programming knowledge.
I believe the actual problem is not Ghostscript, however, because when exceeding memory limits Ghostscript will switch to a display list model where it outputs the page to disk in bands (running the display list as many times as there are bands to output). Provided you actually have a disk, which you clearly do, and there's enough memory for one raster line, then it will (eventually in the case of one band per line) output the whole thing.
Which suggests to me that the actual problem is with the C++ or C# wrapper you are using, not Ghostscript itself.
I suspect that your wrapper is trying to create a huge bitmap in memory to hold the rendered output before writing it to disk. That isn't required.
Try running Ghostscript directly from the command line with one of your failing files; if that works, then you can simply use Ghostscript, since it's perfectly capable of producing a PNG file as output. For what it's worth, I have used Ghostscript to output media of that size, and larger, at 600 dpi.
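For example, a hedged sketch of shelling out to the Ghostscript executable instead of rasterizing in-process (the executable path and the input/output file names are assumptions; -dNOPAUSE, -dBATCH, -sDEVICE=png16m and -r are standard Ghostscript switches):
using System.Diagnostics;

// Render each PDF page to a PNG on disk, letting Ghostscript manage its own memory.
var psi = new ProcessStartInfo
{
    FileName = @"C:\Program Files\gs\bin\gswin32c.exe", // assumed install path
    Arguments = "-dNOPAUSE -dBATCH -sDEVICE=png16m -r300 " +
                "-sOutputFile=page-%03d.png input.pdf",  // placeholder file names
    UseShellExecute = false,
    CreateNoWindow = true
};
using (Process gs = Process.Start(psi))
{
    gs.WaitForExit();
}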
I have a similar issue: I get the "Attempted to read or write protected memory" error when disposing memory after an exception occurs. This happens when I am trying to convert a password-protected PDF; even after catching the exception, the above access violation occurs and crashes the program.
The solution I used:
I am also using iTextSharp in my program. So I wrote a method using iTextSharp to check if the PDF file is password protected first, using help from this thread: https://stackoverflow.com/questions/11298651/checking-if-pdf-is-password-protected-using-itextsharp#=
So now I am checking for the problem before I run into it. It's the only way I've found around this problem; I don't think the Ghostscript.NET wrapper is being updated or maintained any more.
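A minimal sketch of such a check, assuming iTextSharp 5 (BadPasswordException lives in iTextSharp.text.exceptions; the method name is mine):
using iTextSharp.text.pdf;
using iTextSharp.text.exceptions;

// Returns true if the PDF cannot be opened without a password.
static bool IsPasswordProtected(string path)
{
    PdfReader reader = null;
    try
    {
        reader = new PdfReader(path);
        return false;
    }
    catch (BadPasswordException)
    {
        return true;
    }
    finally
    {
        if (reader != null) reader.Close();
    }
}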
I used the HandleProcessCorruptedStateExceptions and SecurityCritical attributes on top of the method that calls the Ghostscript code.
This fixed the issue for me; I no longer get this exception.
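A minimal sketch of what that can look like, assuming the GhostscriptRasterizer from the question (RasterizePage is a hypothetical wrapper method):
using System.Drawing;
using System.Runtime.ExceptionServices;
using System.Security;

[HandleProcessCorruptedStateExceptions]
[SecurityCritical]
public Image RasterizePage(GhostscriptRasterizer rasterizer, int dpi, int pageNumber)
{
    // With these attributes, corrupted-state exceptions such as
    // AccessViolationException raised in native code can be caught here.
    try
    {
        return rasterizer.GetPage(dpi, dpi, pageNumber);
    }
    catch (AccessViolationException)
    {
        return null; // caller decides how to retry or skip the page
    }
}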

"The process cannot access the file because it is being used by another process" error [duplicate]

This question already has answers here: IOException: The process cannot access the file 'file path' because it is being used by another process (12 answers)
Closed 2 years ago.
EDIT: I finally found the error. It is totally unrelated to the bitmaps, streams, or statics. It turns out one of my colleagues had forgotten to dispose of the email attachment after sending the mail, so the attachment kept the file open. I wrapped the whole mail-sending process in a using statement, and the problem is solved. Thanks everyone.
I know you might say that there are billions of threads with the same title and this is a duplicate, but believe me it is not. I have been searching for the solution for about 7 hours, and nothing has helped so far.
The problem is the following: this is a photo capture application which uses WebcamSource as the webcam. The application runs well when the first photo is taken and emailed to the user. However, when the user starts the process all over again (from where it started before the first run), the application throws the error below. The erroneous code follows.
public static void SaveImageCapture(BitmapSource bitmap)
{
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    // bitmap = BitmapFrame.Create(BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
    encoder.Frames.Add(BitmapFrame.Create(bitmap));
    encoder.QualityLevel = 100;
    encoder.Rotation = Rotation.Rotate270;
    try
    {
        using (FileStream fstream = new FileStream("Photos\\" + fileName + ".jpg", FileMode.Create))
        {
            encoder.Save(fstream);
            fstream.Close();
        }
    }
    catch (Exception e)
    {
        System.Windows.Forms.MessageBox.Show(e.ToString());
    }
}
The code crashes at FileStream fstream = new FileStream("Photos\\" + fileName + ".jpg", FileMode.Create) and gives the error:
The process cannot access the file "C:\Users\[username]\Dropbox\[projectname]\[projectname]\bin\Debug\Photos" because it is being used by another process.
I tried closing the webcam stream, surrounding the code with try/catch, putting it into a using statement, adding FileAccess and FileShare fields, trying to add BitmapCreateOptions.None and BitmapCacheOption.OnLoad (it did not allow me), creating new images with different names rather than overwriting the same image, deleting the image after sending the email (it gave me the same error), and some small arrangements that may cause file access problems.
I would suggest making the filename more unique, e.g. by appending a timestamp; additionally, inside the using block call fstream.Flush() before fstream.Close().
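A minimal sketch of that suggestion, reusing encoder and fileName from the question (the timestamp format is illustrative):
// Append a timestamp so each capture gets a unique file name.
string path = "Photos\\" + fileName + "_" + DateTime.Now.ToString("yyyyMMdd_HHmmss_fff") + ".jpg";
using (FileStream fstream = new FileStream(path, FileMode.Create))
{
    encoder.Save(fstream);
    fstream.Flush(); // make sure everything is on disk before the stream closes
}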
Do you reference that file location (mainly the Photos folder) anywhere else in your code before it reaches this point?
It seems like you have accessed it elsewhere in your code and the connection to it has not been closed off; it's handy to use a using block whenever calling IO methods.
Although it's not ideal, try to call GC.Collect() after encoder.Rotation = Rotation.Rotate270;

OutOfMemory Exception when loading lots of Images from Isolated storage

EDIT: I keep getting "OutOfMemoryException was unhandled". I think it's down to how I am saving the image to isolated storage, and that is where I can solve my problem: how do I reduce the size of the image before I save it? (I have added the code where I save the image.)
I am opening images from isolated storage, sometimes over 100 of them, and I want to loop over those images, but I get an OutOfMemoryException when there are around 100 to 150 images loaded into a storyboard. I have already brought down the resolution of the images. How can I handle this exception and stop my app from crashing?
I get the exception at this line:
image.SetSource(isStoreTwo.OpenFile(projectFolder + "\\MyImage" + i + ".jpg", FileMode.Open, FileAccess.Read));//images from isolated storage
Here's my code:
private void OnLoaded(object sender, RoutedEventArgs e)
{
    IsolatedStorageFile isStoreTwo = IsolatedStorageFile.GetUserStoreForApplication();
    try
    {
        storyboard = new Storyboard
        {
            //RepeatBehavior = RepeatBehavior.Forever
        };
        var animation = new ObjectAnimationUsingKeyFrames();
        Storyboard.SetTarget(animation, projectImage);
        Storyboard.SetTargetProperty(animation, new PropertyPath("Source"));
        storyboard.Children.Add(animation);
        for (int i = 1; i <= savedCounter; i++)
        {
            BitmapImage image = new BitmapImage();
            image.SetSource(isStoreTwo.OpenFile(projectFolder + "\\MyImage" + i + ".jpg", FileMode.Open, FileAccess.Read)); // images from isolated storage
            var keyframe = new DiscreteObjectKeyFrame
            {
                KeyTime = KeyTime.FromTimeSpan(TimeSpan.FromMilliseconds(100 * i)),
                Value = image
            };
            animation.KeyFrames.Add(keyframe);
        }
    }
    catch (OutOfMemoryException exc)
    {
        //throw;
    }
    Resources.Add("ProjectStoryBoard", storyboard);
    storyboard.Begin();
}
EDIT: This is how I am saving the image to isolated storage. I think this is where I can solve my problem: how do I reduce the size of the image when saving it to isolated storage?
void cam_CaptureImageAvailable(object sender, Microsoft.Devices.ContentReadyEventArgs e)
{
    string fileName = folderName + "\\MyImage" + savedCounter + ".jpg";
    try
    {
        // Save picture to the library camera roll.
        //library.SavePictureToCameraRoll(fileName, e.ImageStream);
        // Set the position of the stream back to start
        e.ImageStream.Seek(0, SeekOrigin.Begin);
        // Save picture as JPEG to isolated storage.
        using (IsolatedStorageFile isStore = IsolatedStorageFile.GetUserStoreForApplication())
        {
            using (IsolatedStorageFileStream targetStream = isStore.OpenFile(fileName, FileMode.Create, FileAccess.Write))
            {
                // Initialize the buffer for 4KB disk pages.
                byte[] readBuffer = new byte[4096];
                int bytesRead = -1;
                // Copy the image to isolated storage.
                while ((bytesRead = e.ImageStream.Read(readBuffer, 0, readBuffer.Length)) > 0)
                {
                    targetStream.Write(readBuffer, 0, bytesRead);
                }
            }
        }
    }
    finally
    {
        // Close image stream
        e.ImageStream.Close();
    }
}
I would appreciate it if you could help me. Thanks.
It doesn't matter how large your images are on disk, because when you load them into memory they're going to be uncompressed. The memory required for the image will be approximately stride * height, where stride is (width * bitsPerPixel) / 8 rounded up to the next multiple of 4 bytes. So an image that's 1024x768 at 24 bits per pixel will take up about 2.25 MB.
You should figure out how large your images are, uncompressed, and use that number to determine the memory requirements.
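A quick sketch of that calculation:
// Approximate uncompressed in-memory size of a bitmap.
static long UncompressedBytes(int width, int height, int bitsPerPixel)
{
    int stride = (width * bitsPerPixel + 7) / 8; // bytes per row
    stride = (stride + 3) / 4 * 4;               // rounded up to a multiple of 4
    return (long)stride * height;
}

// UncompressedBytes(1024, 768, 24) == 2359296 bytes, i.e. about 2.25 MB.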
You are getting the OutOfMemoryException because you are storing all the images in memory at the same time in order to create your StoryBoard. I don't think you will be able to get around the uncompressed bitmap size that the images require to be displayed on screen.
So to get past this, we must think about your goal rather than trying to fix the error. If your goal is to show a new image in sequence every X ms, then you have a few options.
1. Keep using StoryBoards, but chain them using the Completed event. This way you don't have to create them all at once and can just generate the next few; see the sketch after this list. It might not be fast enough, though, if you're changing images every 100 ms.
2. Use CompositionTarget.Rendering, as mentioned in my answer here. This would probably take the least amount of memory if you just preload the next image (as opposed to having them all preloaded, as your current solution does). You'd need to manually check the elapsed time, though.
3. Rethink what you're doing. If you state what you are going after, people might have more alternatives.
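A hedged sketch of option 1, chaining storyboards with the Completed event (BuildStoryboardForBatch is a hypothetical helper that builds a storyboard for just the next handful of frames, loading only those images):
int nextFrame = 1;
const int BatchSize = 10;

void PlayNextBatch()
{
    if (nextFrame > savedCounter) return; // all frames have been shown

    // Hypothetical helper: creates a storyboard covering frames
    // [nextFrame, nextFrame + BatchSize), loading only those images.
    Storyboard batch = BuildStoryboardForBatch(nextFrame, BatchSize);
    nextFrame += BatchSize;

    batch.Completed += (s, e) => PlayNextBatch(); // chain to the next batch
    batch.Begin();
}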
To answer the edit at the top of your post, try ImageResizer. There's a NuGet package, and a HanselBlog episode on it. Obviously, this is ASP.NET based, but I'm sure you could butcher it to work in your scenario.
Tackling these kinds of problems at the design layer usually works better.
Making the application smart about its running environment via some configuration makes it more robust. For example, you can define variables like image size, image count, image quality... based on available memory, and set these variables at run-time in your app. That way your application always works: fast on high-memory machines and slower on low-memory ones, but it never crashes. (Don't assume that working in a managed environment means you never have to worry about the environment... design always matters.)
Also, there are some known design patterns, like lazy loading, that you can benefit from.
I don't know about Windows Phone in particular, but in .NET WinForms you need to use a separate thread when doing a long-running task. Are you using a BackgroundWorker or equivalent? The finalizer thread can become blocked, which will prevent the resources for the images from being disposed. Using a separate thread from the UI thread will allow the Dispose method to be run automatically.
OK, an image (1024x768) has a memory size of at least 3 MB (ARGB).
I don't know how ObjectAnimationUsingKeyFrames works internally. Maybe you can force the GC by destroying the instances of BitmapImage (and the keyframes) without losing their data in the animation.
(Not possible, see comments!)
Based on one of your comments, you are building a time-lapse app. Commercial time-lapse apps for WP7 compress the images to video, not stills, e.g. Time Lapse Pro.
The whole point of video playback is to reduce similar, or time-related, images to a highly compressed stream that does not require massive amounts of memory to play back.
If you can add the ability to encode to video in your app, you will avoid the problem of trying to emulate a video player (using hundreds of full-resolution frames as a flick-book).
Processing the images into video server-side may be another option (though not as friendly as doing it in-camera).
