Why C# OutOfMemory exception in System.Drawing.dll? - c#

Let's simplify the model.
class Container
{
    // other members
    public byte[] PNG;
}

class Producer
{
    public byte[] Produce(byte[] ImageOutside)
    {
        using (MemoryStream bmpStream = new MemoryStream(ImageOutside),
                            pngStream = new MemoryStream())
        {
            System.Drawing.Bitmap bitmap = new System.Drawing.Bitmap(bmpStream);
            bitmap.Save(pngStream, System.Drawing.Imaging.ImageFormat.Png);
            pngStream.Seek(0, System.IO.SeekOrigin.Begin);
            byte[] PNG = new byte[pngStream.Length];
            pngStream.Read(PNG, 0, (int)pngStream.Length);
            bitmap.Dispose();
            GC.Collect();
            GC.WaitForPendingFinalizers();
            return PNG;
        }
    }
}
The main function keeps creating Container container = new Container();, calling Produce to fill container.PNG, and then calling Queue.Enqueue(container).
Using a using() clause doesn't help at all.
After this repeats about 40+ times (it varies), an exception is thrown. Sometimes it is an OutOfMemoryException and sometimes it is something like "A generic error occurred in GDI+" (I translated the message, so the exact English wording may differ).
If I catch the exception and simply ignore it, it can keep producing a bit further, but not indefinitely.
The memory usage shown in Task Manager is only about 600-700 MB when the first exception is thrown, and it finally stops at about 1.2 GB. I have also tried this:
while (true)
{
    Bitmap b = new Bitmap(4500, 5000);
    list.Add(b);
    Invoke((MethodInvoker)delegate { textBox1.Text = list.Count.ToString(); });
}
It never throws any exception even though 99% of memory (about 11 GB) has been allocated to the program; all that happens is that the number in textBox1 stops increasing.
The way to avoid this may simply be not to produce so many objects, but I would still like to understand the underlying reason. Thank you for your help.

With byte[] PNG = new byte[pngStream.Length]; you allocate a large block of memory to store the image.
The following calls are useless; at that point you have already disposed of the bitmap:
GC.Collect();
GC.WaitForPendingFinalizers();
The memory used by the PNG array cannot be released, because the function returns an active reference to it.
I suggest returning a stream instead of an array of bytes.
Otherwise, after you call the Produce method, remember to drop the reference to the returned array before calling it again.
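For the first suggestion, a minimal sketch of a stream-returning variant might look like this (an illustration, not the original poster's code; the ProduceStream name is made up, and the caller becomes responsible for disposing the returned stream):

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

class Producer
{
    // Sketch: return the PNG data as a stream so no extra byte[] copy is allocated.
    // The caller must dispose the returned MemoryStream when it is done with it.
    public MemoryStream ProduceStream(byte[] imageBytes)
    {
        using (var bmpStream = new MemoryStream(imageBytes))
        using (var bitmap = new Bitmap(bmpStream))
        {
            var pngStream = new MemoryStream();
            bitmap.Save(pngStream, ImageFormat.Png);
            pngStream.Position = 0;
            return pngStream;
        }
    }
}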
A sample of the second approach, dropping the reference between calls:
while (true)
{
    Byte[] b = new Byte[1000];
    b = this.Produce(b);
    // Use your array as you need, but do not assign it to an external property,
    // otherwise the memory cannot be released.
    b = null; // Remove the reference (in this example assigning null is not strictly
              // necessary, because b is overwritten on the next loop iteration).
    GC.Collect(); // Force the garbage collector; probably not necessary, but can be useful.
    GC.WaitForPendingFinalizers();
}
The platform target can also affect the maximum available memory:
In a 32-bit application you have a maximum of 2 GiB of available memory.
In a 64-bit application you have up to 2 TiB of available memory, but a single object cannot exceed 2 GiB.
In a UWP application there are other limitations depending on the device.
AnyCPU is compiled just-in-time when you launch the application and can run as either 32-bit or 64-bit, depending on the machine architecture and system configuration.
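If you are not sure which case applies to your process at runtime, a quick check with standard .NET APIs (nothing specific to the code above) is:

// Prints whether the current process is running as 32-bit or 64-bit,
// which determines which of the limits above applies.
Console.WriteLine(Environment.Is64BitProcess ? "64-bit process" : "32-bit process");
Console.WriteLine("Pointer size: " + IntPtr.Size + " bytes");
Console.WriteLine("64-bit OS: " + Environment.Is64BitOperatingSystem);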

Related

Converting byte array to memory stream and bitmap causing high memory usage

I have a list of byte arrays, and I am using it to generate bitmap images via a memory stream.
While saving the images, memory usage goes very high, and at some point it causes an out of memory exception.
I tried commenting out the file saving to see whether that was causing the problem, and I also called the GC manually. Nothing changed; memory usage is still high. My latest code is like this:
List<byte[]> byteArrayList = helper.GetArrayList(); // Gets approximately 10k items.

for (int i = 0; i < byteArrayList.Count; i++)
{
    using (MemoryStream ms = new MemoryStream(byteArrayList[i]))
    {
        using (Bitmap bm = new Bitmap(ms))
        {
            bm.Save(fileLocation);
            bm.Dispose();
        }
        ms.Dispose();
    }

    byteArrayList[i] = null;
    byteArrayList.Remove(byteArrayList[i]);
}

byteArrayList.Dispose();
How can I solve this issue?
I have tested your code and saw that the system cannot collect your garbage inside the loop. If you create that many bitmaps in a loop, memory climbs to peak levels (2-4 GB) until the garbage collector runs, and only when the loop ends does memory drop back to normal, which is too late. When I ran your code in a BackgroundWorker instead of on the main thread, the GC was no longer stuck behind the loop and ran as it is supposed to: the byte arrays were converted to bitmaps and saved without any extreme memory consumption.
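A rough sketch of that suggestion (helper and fileLocation come from the question; the BackgroundWorker wiring itself is just illustrative):

// Run the conversion loop off the UI thread in a BackgroundWorker.
var worker = new System.ComponentModel.BackgroundWorker();

worker.DoWork += (sender, e) =>
{
    List<byte[]> byteArrayList = helper.GetArrayList();
    for (int i = 0; i < byteArrayList.Count; i++)
    {
        using (var ms = new MemoryStream(byteArrayList[i]))
        using (var bm = new Bitmap(ms))
        {
            bm.Save(fileLocation);
        }
        byteArrayList[i] = null; // drop the reference so the array can be collected
    }
};

worker.RunWorkerCompleted += (sender, e) => { /* update the UI here if needed */ };
worker.RunWorkerAsync();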
If you change the helper method to return a Queue<T> instead...
Queue<byte[]> byteArrayQueue = helper.GetQueue(); // Gets approximately 10k items.

while (byteArrayQueue.Any())
{
    using (var ms = new MemoryStream(byteArrayQueue.Dequeue()))
    {
        using (var bm = new Bitmap(ms))
        {
            bm.Save(fileLocation);
        }
    }
}

Memory Mapped File gets deleted from memory

For some reason, when I read from a memory mapped file a couple of times, it just gets randomly deleted from memory, and I don't know what's going on. Is the kernel or the GC deleting it from memory? If so, how do I prevent them from doing that?
I am serializing an object to JSON and writing it to the memory mapped file.
After a couple of reads I get an exception: FileNotFoundException: Unable to find the specified file.
private const String Protocol = @"Global\";
Code to write to memory mapped file:
public static Boolean WriteToMemoryFile<T>(List<T> data)
{
    try
    {
        if (data == null)
        {
            throw new ArgumentNullException("Data cannot be null", "data");
        }

        var mapName = typeof(T).FullName.ToLower();
        var mutexName = Protocol + typeof(T).FullName.ToLower();
        var serializedData = JsonConvert.SerializeObject(data);
        var capacity = serializedData.Length + 1;

        var mmf = MemoryMappedFile.CreateOrOpen(mapName, capacity);

        var isMutexCreated = false;
        var mutex = new Mutex(true, mutexName, out isMutexCreated);

        if (!isMutexCreated)
        {
            var isMutexOpen = false;
            do
            {
                isMutexOpen = mutex.WaitOne();
            }
            while (!isMutexOpen);

            var streamWriter = new StreamWriter(mmf.CreateViewStream());
            streamWriter.WriteLine(serializedData);
            streamWriter.Close();
            mutex.ReleaseMutex();
        }
        else
        {
            var streamWriter = new StreamWriter(mmf.CreateViewStream());
            streamWriter.WriteLine(serializedData);
            streamWriter.Close();
            mutex.ReleaseMutex();
        }

        return true;
    }
    catch (Exception ex)
    {
        return false;
    }
}
Code to read from memory mapped file:
public static List<T> ReadFromMemoryFile<T>()
{
    try
    {
        var mapName = typeof(T).FullName.ToLower();
        var mutexName = Protocol + typeof(T).FullName.ToLower();

        var mmf = MemoryMappedFile.OpenExisting(mapName);
        var mutex = Mutex.OpenExisting(mutexName);

        var isMutexOpen = false;
        do
        {
            isMutexOpen = mutex.WaitOne();
        }
        while (!isMutexOpen);

        var streamReader = new StreamReader(mmf.CreateViewStream());
        var serializedData = streamReader.ReadLine();
        streamReader.Close();
        mutex.ReleaseMutex();

        var data = JsonConvert.DeserializeObject<List<T>>(serializedData);
        mmf.Dispose();

        return data;
    }
    catch (Exception ex)
    {
        return default(List<T>);
    }
}
The process that created the memory mapped file must keep a reference to it for as long as you want it to live. Using CreateOrOpen is a bit tricky for exactly this reason - you don't know whether disposing the memory mapped file is going to destroy it or not.
You can easily see this at work by adding an explicit mmf.Dispose() to your WriteToMemoryFile method - it will close the file completely. The Dispose method is called from the finalizer of the mmf instance some time after all the references to it drop out of scope.
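A minimal sketch of keeping the writer's MemoryMappedFile alive (this is not the original code; the MmfHolder class, the UTF-8 encoding and the capacity margin are all illustrative):

using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

static class MmfHolder
{
    // Hold the MemoryMappedFile in a field so the GC cannot finalize it
    // (and destroy the mapping) while readers still need the data.
    private static MemoryMappedFile _mmf;

    public static void WriteToMemoryFile(string mapName, string serializedData)
    {
        int capacity = Encoding.UTF8.GetByteCount(serializedData) + 16; // margin for the newline

        _mmf = MemoryMappedFile.CreateOrOpen(mapName, capacity);

        using (MemoryMappedViewStream stream = _mmf.CreateViewStream())
        using (var writer = new StreamWriter(stream))
        {
            writer.WriteLine(serializedData);
        }
        // Do NOT dispose _mmf here; dispose it only once no reader needs the data any more.
    }

    public static void Release()
    {
        if (_mmf != null)
        {
            _mmf.Dispose();
            _mmf = null;
        }
    }
}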
Or, to make it even more obvious that GC is the culprit, you can try invoking GC explicitly:
WriteToMemoryFile("Hi");
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
ReadFromMemoryFile().Dump(); // Nope, the value is lost now
Note that I changed your methods slightly to work with simple strings; you really want to produce the simplest possible code that reproduces the behaviour you observe. Even just having to get JsonConverter is an unnecessary complication, and might cause people to not even try running your code :)
And as a side note, you want to check for AbandonedMutexException when you're doing Mutex.WaitOne - it's not a failure, it means you took over the mutex. Most applications handle this wrong, leading to issues with deadlocks as well as mutex ownership and lifetime :) In other words, treat AbandonedMutexException as success. Oh, and it's a good idea to put stuff like Mutex.ReleaseMutex in a finally clause, to make sure it actually happens even if you get an exception. A dead thread or process doesn't matter (that will just cause one of the other contenders to get AbandonedMutexException), but if you get an exception that you "handle" with your return false;, the mutex will not be released until you close all your applications and start again fresh :)
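A hedged sketch of that advice (the surrounding names are illustrative, not from the original code):

// Treat AbandonedMutexException as "we now own the mutex", and always release
// in a finally block so an exception cannot leave the mutex held forever.
bool owned = false;
try
{
    try
    {
        mutex.WaitOne();
        owned = true;
    }
    catch (AbandonedMutexException)
    {
        // The previous owner died without releasing; we own the mutex now.
        owned = true;
    }

    // ... work with the shared resource here ...
}
finally
{
    if (owned)
        mutex.ReleaseMutex();
}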
Clearly, the problem is that the MMF loses its backing context, as explained by Luaan. But nobody has explained how to handle it:
The 'Write to MMF file' code must run on a separate async thread.
The 'Read from MMF' code notifies, once the read has completed, that the MMF has been read. The notification can be a flag in a file, for example.
The async thread running 'Write to MMF file' therefore keeps running for as long as it takes the second part to read the MMF. We have thereby created the context within which the memory mapped file remains valid.
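A rough sketch of that idea, assuming a named EventWaitHandle is used as the "read completed" notification instead of a flag file (the names and the Task-based threading are illustrative; requires System.IO.MemoryMappedFiles, System.Threading and System.Threading.Tasks):

// Writer side: keep the MemoryMappedFile alive on a background task until the
// reader signals that it has consumed the data.
public static void WriteAndWaitForReader(string mapName, byte[] payload)
{
    Task.Run(() =>
    {
        using (var mmf = MemoryMappedFile.CreateOrOpen(mapName, payload.Length))
        using (var doneSignal = new EventWaitHandle(false, EventResetMode.ManualReset,
                                                    mapName + "_read"))
        {
            using (var stream = mmf.CreateViewStream())
            {
                stream.Write(payload, 0, payload.Length);
            }

            // Block here; the mmf reference (and therefore the mapping) stays valid
            // until the reader sets the event after it has finished reading.
            doneSignal.WaitOne();
        }
    });
}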

Which Queue keeps a copy of the Object instead of just Reference?

I am currently using System.Collections.Concurrent.BlockingCollection, and it's very good at what it does.
However, it seems that it only keeps a reference to an object.
So if I have one byte[] object which is written to and added to the queue 100 times,
and I then want to read all 100 entries, I will only get 100 copies of whatever data that byte[] currently holds.
I hope that explains it; at least that is what it seems to be doing from my tests.
So if that is the case, is there another collection that keeps copies of the data, so I can just keep adding until I read it?
For example, I could take 100 byte[] buffers, write them to a MemoryStream in the correct order, and then read them back in that order.
A MemoryStream isn't what I would prefer to use, but it works as an example.
Here is my code:
try
{
    Thread.Sleep(100);
    for (int i = Queue.Count; i <= Queue.Count; i++)
        if (Queue.TryTake(out AudioData, 300))
        {
            if (Record)
                waveWriter.Write(AudioData, 0, AudioData.Length);
        }
}
catch (Exception e)
{
    if (e is ArgumentNullException)
        return;
}
Here is the part which receives the data:
using (ms = new MemoryStream(TcpSize))
using (var tt1 = tcplisten.AcceptTcpClient())
{
    ReceiveData = new byte[TcpSize];
    tt1.NoDelay = true;

    using (var tcpstream = tt1.GetStream())
        while (connect)
        {
            if (Record)
                Queue.Add(ReceiveData);

            tcpstream.Read(ReceiveData, 0, TcpSize);
            waveProvider.AddSamples(ReceiveData, 0, TcpSize);
        }
}
You may wonder why I use a for loop and all that for the writing; it's just there for debugging purposes. I wanted to test whether the objects in the queue were copies, because if so it shouldn't matter when I write them, but it does matter, which means they must be references.
Thanks
If you want to queue copies of the data, just make a copy and then queue the copy.
Queue.Add((byte[])ReceiveData.Clone());
But I think you also need to sort out the fact that you're writing the data to the queue before filling the buffer...
Alternatively, create a new buffer on each iteration and queue that instead:
while (connect)
{
    ReceiveData = new byte[TcpSize];
    tcpstream.Read(ReceiveData, 0, TcpSize);
    waveProvider.AddSamples(ReceiveData, 0, TcpSize);

    if (Record)
        Queue.Add(ReceiveData);
}

Write file need to optimised for heavy traffic part 4

This is a continuation of part 3:
Write file need to optimised for heavy traffic part 3
As my code has changed somewhat, I think it is better to open a new thread.
public class memoryStreamClass
{
    static MemoryStream ms1 = new MemoryStream();
    static MemoryStream ms2 = new MemoryStream();
    static int c = 1;

    public void fillBuffer(string outputString)
    {
        byte[] outputByte = Encoding.ASCII.GetBytes(outputString);

        if (c == 1)
        {
            ms1.Write(outputByte, 0, outputByte.Length);
            if (ms1.Length > 8100)
            {
                c = 2;
                Thread thread1 = new Thread(() => emptyBuffer(ref ms1));
                thread1.Start();
            }
        }
        else
        {
            ms2.Write(outputByte, 0, outputByte.Length);
            if (ms2.Length > 8100)
            {
                c = 1;
                Thread thread2 = new Thread(() => emptyBuffer(ref ms2));
                thread2.Start();
            }
        }
    }

    void emptyBuffer(ref MemoryStream ms)
    {
        FileStream outStream = new FileStream(string.Format("c:\\output.txt"), FileMode.Append);
        ms.WriteTo(outStream);
        outStream.Flush();
        outStream.Close();
        ms.SetLength(0);
        ms.Position = 0;
        Console.WriteLine(ms.Position);
    }
}
There are two things I have changed from the code in part 3.
The class and methods are now non-static; the variables are still static though.
I have moved the MemoryStream length reset into the emptyBuffer method, and I use a ref parameter to pass the reference to the method instead of a copy.
This code compiles fine and runs OK. However, I ran it side by side with my single-threaded program on two computers on the same network: one computer ran the single-threaded version and the other ran the multithreaded version, for around 5 minutes. The single-threaded version collected 8333 KB of data while the multithreaded version collected only 8222 KB (98.6% of the single-threaded version).
It is the first time I have done any performance comparison between the two versions, and maybe I should run more tests to confirm it. But based on looking at the code, can any masters out there point out any problem?
I haven't put in any locking or thread pooling at the moment; maybe I should, but if the code runs fine I don't want to change it and break it. The only thing I will change is the buffer size, to eliminate any chance of one buffer filling up before the other is emptied.
Any comments on my code?
The problem is still static state. You're clearing buffers that could have data that wasn't written to disk.
I imagine this scenario is happening 1.4% of the time.
ms1 fills up, empty buffer1 thread started, switch to ms2
empty buffer1 is writing to disk
ms2 fills up, empty buffer2 thread started, switch to ms1
empty buffer1 to disk finishes
ms1 is cleared while it is the active stream
When doing multi-threaded programming, static classes are fine but static state is not. Ideally you have no shared memory between threads at all, and your code does not depend on it.
Think of it this way: if you're expecting a value to consistently change, it's not exactly static, is it?
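A minimal sketch (not the poster's code; the BufferedWriter name and the 8100-byte threshold are only carried over as illustrative values) of one way to avoid clearing a stream that might still be in use: hand the filled stream off to the writer thread and start filling a fresh one, with one lock guarding the handoff and another guarding the file.

using System.IO;
using System.Text;
using System.Threading;

public class BufferedWriter
{
    private readonly object _bufferSync = new object();
    private readonly object _fileSync = new object();
    private MemoryStream _current = new MemoryStream();

    public void FillBuffer(string outputString)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(outputString);
        MemoryStream toFlush = null;

        lock (_bufferSync)
        {
            _current.Write(bytes, 0, bytes.Length);
            if (_current.Length > 8100)
            {
                toFlush = _current;             // hand the full buffer off...
                _current = new MemoryStream();  // ...and keep filling a fresh one
            }
        }

        if (toFlush != null)
            new Thread(() => EmptyBuffer(toFlush)).Start();
    }

    private void EmptyBuffer(MemoryStream ms)
    {
        lock (_fileSync) // only one thread appends to the file at a time
        {
            using (var outStream = new FileStream("c:\\output.txt", FileMode.Append))
            {
                ms.WriteTo(outStream);
            }
        }
        ms.Dispose(); // this thread owns the buffer now; nobody else can clear it mid-write
    }
}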

I get System.OutOfMemoryException. There is a way to make my code lighter?

I want to write a routine that receives some JPEG frames from a server (remote-desktop-like), converts them to bitmap images and then displays them on a Windows form. I am trying to make the routine as light as possible, but perhaps I am doing it wrong, as I always receive a System.OutOfMemoryException. My code follows:
EDIT: added a part that is related to this exception
private void WatcherRoutine()
{
    Boolean lLoopEnd = false;
    Bitmap lCurrent = null;
    //Graphics lGraphics = null;
    Image lImg = null;
    BinaryReader lBRVideo = new BinaryReader(this._state.Video.GetStream());

    while (lLoopEnd == false)
    {
        try
        {
            // Reads frame type
            switch (lBRVideo.ReadByte())
            {
                // Frame received is a big frame (ie a desktop screenshot)
                case Constants.BIGFRAME:
                {
                    // Reads frame size in bytes
                    Int32 lVideoLength = lBRVideo.ReadInt32();
                    if (lVideoLength > 0)
                    {
                        // Stores frame in a stream
                        MemoryStream ms = new MemoryStream(lBRVideo.ReadBytes(lVideoLength));
                        // Creates image from stream
                        lImg = Image.FromStream(ms);
                        ms.Dispose();
                        // Creates bitmap from image
                        lCurrent = new Bitmap(lImg);
                        lImg.Dispose();
                        // Passes image to windows form to display it
                        this.Invoke(this._state.dUpdateVideo, lCurrent);
                        ////lGraphics = Graphics.FromImage(lImg);
                        //lGraphics.Dispose();
                    }
                }
                break;
                // Diff frame (ie a small part of desktop that has changed)
                // Commenting this part makes the exception disappear :|
                case Constants.DIFFFRAME:
                {
                    Int16 lX = lBRVideo.ReadInt16(),
                          lY = lBRVideo.ReadInt16();
                    Int32 lVideoLength = lBRVideo.ReadInt32();
                    if (lVideoLength > 0)
                    {
                        //Byte[] lVideoImg = lBRVideo.ReadBytes(lVideoLength);
                        //Image lImgDiff = Image.FromStream(new MemoryStream(lVideoImg));
                        ////if(lGraphics != null)
                        //{
                        //    lGraphics.DrawImage(lImgDiff, lX, lY);
                        //    this.Invoke(this._state.dUpdateVideo, new Bitmap(lImg));
                        //}
                    }
                }
                break;
                case Constants.CURSOR:
                {
                    Int16 lX = lBRVideo.ReadInt16(),
                          lY = lBRVideo.ReadInt16();
                    // TODO
                }
                break;
                default:
                    break;
            }
        }
        catch (Exception e)
        {
            if (this._state.WorkEnd == false)
            {
                this._state.WorkEnd = true;
                this.BeginInvoke(this._state.dDisconnect);
            }
            lLoopEnd = true;
            SmartDebug.DWL(e.Message);
        }
    }
}
dUpdateVideo is a delegate that points to this small routine. Perhaps I have to free pBmp?
private void UpdateVideo(Bitmap pBmp)
{
    this.VideoPictureBox.Image = pBmp;
}
When you're using GDI+ based APIs (System.Drawing), an OutOfMemory exception doesn't necessarily mean that you're out of memory. It can also mean that parameters passed to GDI+ are invalid, or some other cause. GDI+ is pretty OutOfMemory happy.
You should also reuse your memory stream, if possible. That reduces GC pressure a lot. You're allocating many large objects, and the GC is pretty bad in that scenario.
Also I think you're never disposing lCurrent.
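A minimal sketch of those two suggestions (reusing one MemoryStream across frames and disposing the bitmap that was previously on screen); the field and method names here are illustrative, not from the asker's code:

// Hypothetical sketch: reuse one MemoryStream for every frame and dispose the
// bitmap that was previously displayed before replacing it.
private readonly MemoryStream _frameBuffer = new MemoryStream();

private Bitmap DecodeFrame(byte[] frameBytes)
{
    _frameBuffer.SetLength(0);                            // reuse the same buffer
    _frameBuffer.Write(frameBytes, 0, frameBytes.Length);
    _frameBuffer.Position = 0;

    using (Image img = Image.FromStream(_frameBuffer))    // the stream stays open while img is alive
    {
        return new Bitmap(img);                           // copy, so the stream can be reused afterwards
    }
}

private void UpdateVideo(Bitmap pBmp)
{
    Image old = this.VideoPictureBox.Image;
    this.VideoPictureBox.Image = pBmp;
    if (old != null)
        old.Dispose();                                    // release the GDI+ handle of the old frame
}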
Then you violate the contract of Image.FromStream:
You must keep the stream open for the lifetime of the Image:
lImg = Image.FromStream(ms);
ms.Dispose();
lCurrent = new Bitmap(lImg); // lImg is used here, but ms is already disposed
lImg.Dispose();
The documentation for Image.FromStream states:
You must keep the stream open for the lifetime of the Image.
Move ms.Dispose() so that it comes after lImg.Dispose().
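In terms of the original snippet, that reordering looks roughly like this (a sketch using the asker's variable names):

MemoryStream ms = new MemoryStream(lBRVideo.ReadBytes(lVideoLength));
lImg = Image.FromStream(ms);
lCurrent = new Bitmap(lImg); // lImg (and therefore ms) is still needed here
lImg.Dispose();
ms.Dispose();                // dispose the stream only after the Image is gone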
Once I wrote a program which processed lots of images loaded from files. I disposed everything I could, as soon as possible, and left the rest to the GC. This wasn't enough; memory usage profiling showed clearly that the GC was too slow relative to the image loading speed of my program. The solution was to call GC.Collect() manually each time I finished processing a given number of images. Note that this is not good practice, but it sometimes helps. At least it is worth trying.
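A sketch of that workaround, forcing a collection every N images (imageFiles, Process and the batch size of 50 are all made up for illustration):

int processed = 0;
foreach (var file in imageFiles)
{
    using (var bmp = new Bitmap(file))
    {
        Process(bmp); // hypothetical per-image processing step
    }

    if (++processed % 50 == 0)
    {
        GC.Collect(); // not good practice in general, but can help with heavy image churn
        GC.WaitForPendingFinalizers();
    }
}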
The problem may also be related to a binary protocol error (the video length somehow gets messed up; see the lBRVideo.ReadInt16 and ReadInt32 calls you commented out).
