Magick.NET C# - huge memory usage

I'm struggling with the Magick.NET library: when converting image files to PDFs, my memory usage goes over 4 GB and CPU usage hits 100%. When the conversion is done everything returns to normal, but because I'm using this library in many instances of one application, it causes serious memory pressure.
The problem occurs on the line images.Write(newPdfPath);
using (MagickImageCollection images = new MagickImageCollection())
{
    images.Read(originalImage);
    images.Write(newPdfPath);
}
The images are different sizes, and size really doesn't matter: the issue also occurs when converting a 7 KB JPG.
Please help!!!!

The CPU and memory usage are not related to the size of the image on your hard drive; they are entirely related to the number of pixels. If you have a completely white image of 20,000 × 20,000 pixels, the file on your hard drive might be only 6 MB, but when you load it into memory with Magick.NET it will take gigabytes. So first check what size (in pixels) the images are, and then we can judge the performance.
Then you can use these approaches to improve performance (a short sketch of the .mpc approach follows this list):
Once you have loaded the image into memory, you can write it to disk in .mpc format and then reload it very quickly (useful if you need to load the same image a few times).
Use Magick.NET Q8 instead of Q16.
If you can run the conversion in a parallel loop, Magick.NET version 7 can run almost four times faster.
And, as the other answer says, dispose of each image when you are done with it.
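For the .mpc idea, a minimal sketch (hypothetical file names, using Magick.NET's MagickImage class) might look like this:

using ImageMagick;

// Write the decoded image out once as a Magick Persistent Cache file.
// "input.jpg" / "cache.mpc" / "output.pdf" are placeholder names.
using (var image = new MagickImage("input.jpg"))
{
    image.Write("cache.mpc"); // also creates a companion .cache file next to it
}

// Later reads of the .mpc copy skip the expensive decode step.
using (var cached = new MagickImage("cache.mpc"))
{
    cached.Write("output.pdf");
}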

If you convert images in a loop, then disposing of each image after its conversion is done might help. Use the
Image.Dispose();
method in order to free the unmanaged memory resources used by the images.
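For example, a loop along these lines (a sketch with a hypothetical input folder; the using block calls Dispose automatically after every file):

using System.IO;
using ImageMagick;

var inputFolder = @"C:\images"; // hypothetical folder
foreach (var sourceFile in Directory.GetFiles(inputFolder, "*.jpg"))
{
    var targetPdf = Path.ChangeExtension(sourceFile, ".pdf");
    using (var images = new MagickImageCollection())
    {
        images.Read(sourceFile);
        images.Write(targetPdf);
    } // Dispose runs here, releasing the native pixel buffers before the next file
}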

Related

Decrease Memory Usage Loading Bitmap (Win2D)

I'm writing a program that uses Win2D to load an image file and display it on the screen. However, the image file itself is only about 5 MB in size, but when I load it using CanvasBitmap.LoadAsync, the process memory jumps to over 600 MB and then settles down to around 300 MB. Is there any way to reduce the process memory without having to resize the image manually in an image editor? I've seen code for resizing other types of bitmaps, and I was wondering if that's also possible in Win2D.
Regards,
Alex
Update (1/27/2020)
Realized that bitmaps are uncompressed image files, so the only available options are either to reduce the image size somehow or to use a different file format. Decided to use the latter because I'm working with PDF files. They can be converted into SVG files using Inkscape. Also, SVG files conveniently happen to be supported by Win2D.

How to store image in memory more compact than usual Bitmap instance and access it fast enough?

I have images of around 5000 × 4000 pixels that I render every frame on a form.
Such an image, loaded with Bitmap bmp = Image.FromFile("test.png");, uses around 100 MB of RAM, while the file size is 40-50 MB! That way my program already uses ~500 MB of RAM.
How can I store these images more RAM-efficiently, but still be able to cut parts from them fast enough? I have one huge image and cut parts from it to render according to the view position. I call the render method at 50 FPS and can't afford to load images from file every frame, since that is too slow.
Is there any other way to store an image in memory?
You are stuck with having the uncompressed image loaded into memory if you wish to display and manipulate it. There is no way around this.
The only sure way to reduce memory use is to reduce the resolution of the image being displayed. You could also store the compressed image in memory and decompress it from memory rather than from disk each time you need it.
Having said that, 500 MB of RAM is not that much these days, when computers have several GB of RAM available.
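A rough sketch of the "keep the compressed image in memory" idea, assuming a hypothetical file name:

using System.Drawing;
using System.IO;

// Keep only the compressed bytes (~40-50 MB) resident.
byte[] compressed = File.ReadAllBytes("test.png");

// Decode on demand; only now does the ~100 MB uncompressed bitmap exist.
// GDI+ needs the stream to stay alive for the bitmap's lifetime, so dispose of both together.
using (var ms = new MemoryStream(compressed))
using (var bmp = new Bitmap(ms))
{
    // draw the currently visible part of bmp here...
}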

Opening Large PNG in .NET Targeting AnyCPU Throws Out of Memory Exception

I am attempting to open a PNG image in code that is 8,733 x 12,945 pixels. By my calculations, this should require 431.247 MiB (8733 * 12945 * 4) of memory to store the pixels. However, when my build platform is AnyCPU, any attempt to open the file from C# (using new Bitmap(string filename)) results in an OutOfMemoryException. When I switch the platform to x64, the image is opened without issue.
Does anyone know why I would be getting such an exception trying to open an image that should require significantly less memory than the 2 GiB (2^31 bytes) threshold?
-- BEGIN EDIT --
Yes, I am aware of the concept of memory fragmentation. I have a single WebForm, and the ONLY action that it takes is to call new Bitmap once the client selects a file from a file picker dialog. My expectation is that the app does the following:
1) Allocates enough memory for a Windows Form (give it 20kB to be safe)
2) The user selects a file, so the app opens a stream (probably 8kB buffer) and reads the header information (first couple hundred bytes).
3) The header indicates the image size and pixel format, so the constructor then allocates a buffer large enough to fit the pixels (450 MB).
I understand that my memory is either very fragmented, or GDI+ is trying to allocate the space in a strange way. What I don't get is why this is happening. There is nothing resource-intensive in my application, so the memory should not be fragmented enough to prevent an allocation of about 20% of the addressable space. What could be going on to prevent such an allocation? Does the Bitmap constructor require more memory than I calculated?
You should use streams to read big files. This exception is not related to the amount of available physical memory. If you need more information, you should read articles like this one:
http://blogs.msdn.com/b/ericlippert/archive/2009/06/08/out-of-memory-does-not-refer-to-physical-memory.aspx
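A minimal sketch of a stream-based load (hypothetical file name; passing validateImageData: false skips GDI+'s extra validation pass during construction):

using System;
using System.Drawing;
using System.IO;

using (var stream = new FileStream("large.png", FileMode.Open, FileAccess.Read))
using (var image = Image.FromStream(stream, useEmbeddedColorManagement: false, validateImageData: false))
{
    Console.WriteLine($"{image.Width} x {image.Height}");
}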

Resizing large images with ImageResizer and out of memory exceptions

I'm encountering out-of-memory exceptions when I resize 9000 × 9000 pixel images using ImageResizer.Net (on a 32-bit system):
ImageBuilder.Current.Build(imageFileName, outputFileName, settings, true);
I am able to successfully resize the large images using a stream though:
using (var stream = new FileStream(imageFileName, FileMode.Open, FileAccess.Read))
using (var img = Image.FromStream(stream, true, false))
{
ImageBuilder.Current.Build(img, outputFileName, settings);
stream.Close();
}
but this last method still hits an out-of-memory exception after x loops. Is there a huge memory leak in ImageResizer, or is there an error in my code?
Either way, is there a workaround?
You need a 64-bit system if you're going to be processing 81 megapixel images. Just decompressing the image will require between 350 and 800 contiguous megabytes of RAM.
On a 32-bit system (even with 16GB of ram installed), only 1200MB or so are initially available to any .NET process. Due to fragmentation (not memory leaks!) that 1200MB will be split up into small 50-100MB chunks by any activity. Since you need your memory in 800MB blocks (since you're processing massive images), that stops working quickly.
To allow .NET to combat memory fragmentation, you need to give it (a) time and (b) plenty of extra space.
On a 64-bit system, the process should be able to access enough RAM for the .NET runtime to not starve under these workloads.
I would think you're maxing out your RAM with some of the large image resizing (not sure though).
Try to use the DiskCache plugin:
http://imageresizing.net/plugins/diskcache
http://imageresizing.net/download
It will write the resized images directly to disk bypassing any RAM issues. Plus it's really fast too.
I'm sorry to answer with a "try this" suggestion, but I'm unable to write comments just yet due to insufficient SO reputation.
As mentioned by Computer Linguist in a comment on their own answer, libvips is capable of resizing large images while using (much) less memory. [I actually ended up using the command-line version, as I didn't come across any .NET wrappers for the library itself.]
In my case, the image I was trying to resize was 21,920 × 14,610 pixels. ImageResizer successfully resized a PNG version (15+ MB) of the image but threw an OutOfMemoryException for a JPEG version (22+ MB).
I'm running Windows 7 64-bit on a computer with 8 GB of RAM.

Image.FromFile is very SLOW. Any alternatives, optimizations?

I have a WinForms image list which contains, say, 200 images at 256x256.
I use the method Image.FromFile to load the images and then add them to the image list.
According to the ANTS .NET profiler, half of the program's time is spent in Image.FromFile. Is there a better way to load an image to add to an image list?
Another thing that might be optimized: the images that are loaded are larger than 256x256. Is there a way to resize them as they are loaded? I just want to uniformly scale them if their height is larger than 256 pixels.
Any ideas on how to optimize this?
EDIT: They are JPEGs.
You don't say how much bigger than 256x256 the images actually are; modern digital camera images are much bigger than this.
Disk I/O can be very slow, and I would suggest you first get a rough idea how many megabytes of data you're actually reading.
Then you can decide if there's a subtle 'Image.FromFile' problem or a simple 'this is how slow my computer/drives/anti-virus scanner/network actually is' problem.
A simple test of the basic file I/O performance would be to do File.ReadAllBytes() for each image instead of Image.FromFile(); that will tell you what proportion of the time was spent on the disk and what on the image handling. I suspect you'll find it's largely disk, at which point your only chance to speed it up might be one of the techniques for getting JFIF thumbnails out of the files. Or perhaps one could imagine clever tricks with partial reads of progressive JPEGs, though I don't know if anyone does that, nor whether your files are progressive (they're probably not).
I don't really know how fast you need these to load, but if the problem is that an interactive application is hanging while you load the files, then think of ways to make that better for the user - perhaps use a BackgroundWorker to load them asynchronously, perhaps sort the images by ascending file-size and load the small ones first for a better subjective performance.
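As a rough sketch of that test (hypothetical folder path), something like this separates raw disk time from Image.FromFile time:

using System;
using System.Diagnostics;
using System.Drawing;
using System.IO;

var files = Directory.GetFiles(@"C:\photos", "*.jpg"); // hypothetical folder

var sw = Stopwatch.StartNew();
long totalBytes = 0;
foreach (var f in files)
    totalBytes += File.ReadAllBytes(f).Length;
Console.WriteLine($"Disk only: {totalBytes / 1048576} MB in {sw.ElapsedMilliseconds} ms");

sw.Restart();
foreach (var f in files)
    using (var img = Image.FromFile(f)) { }
Console.WriteLine($"Image.FromFile: {sw.ElapsedMilliseconds} ms");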
If you are trying to make thumbnails, then try this code here; it will let you extract thumbnails without completely loading the image.
You can use FreeImage.NET, which is great for loading images in the background.
Image.FromFile contains a hidden mutex which locks your app if you try to load large images, even on a background thread.
As for JPEGs, the FreeImage library uses the OpenJPEG library, which can load JPEG images at a smaller scale more quickly. It can also use embedded thumbnails.
The WPF classes also allow loading images at a smaller resolution, but this cannot be used if you are restricted to WinForms.
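For the WPF route, a minimal sketch (hypothetical file name) using BitmapImage.DecodePixelWidth, which asks the decoder to produce a small bitmap directly instead of decoding the full image and scaling afterwards:

using System;
using System.Windows.Media.Imaging;

var bmp = new BitmapImage();
bmp.BeginInit();
bmp.UriSource = new Uri(@"C:\photos\big.jpg");   // hypothetical path
bmp.DecodePixelWidth = 256;                      // decode straight to thumbnail width
bmp.CacheOption = BitmapCacheOption.OnLoad;      // read the file fully, then release it
bmp.EndInit();
bmp.Freeze();                                    // makes it usable from other threads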
If you want speed, prescale your images; don't do it at runtime.
You didn't mention what type of images you are loading (JPEG, PNG, GIF, BMP); of course BMP is the fastest one, since it has no (or almost no) compression.
Are your images 256 colors (8-bit with a palette)? BMP, GIF and PNG support that format, which can load pretty fast.
It sounds like you need some sort of image thumbnails? Don't forget that JPEG images often already contain a thumbnail inside, so you can extract only this small image and you do not need to scale. Such thumbnails are, however, smaller than 256x256.
Another option is to move the loading logic into a separate thread; it will not be faster, but from the user's perspective it can look like a significant speedup.
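For the embedded-thumbnail idea, a sketch using GDI+'s Image.GetThumbnailImage (hypothetical file names); it returns the JPEG's embedded thumbnail when one is present and scales the full image otherwise:

using System;
using System.Drawing;

using (var original = Image.FromFile(@"C:\photos\photo.jpg"))        // hypothetical path
using (var thumb = original.GetThumbnailImage(256, 256, () => false, IntPtr.Zero))
{
    thumb.Save(@"C:\photos\photo_thumb.png");
}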
I have a winforms image list
I would avoid using ImageList in this scenario. Internally, it is an old Windows common control intended for working with GUI icons, usually 32x32 pixels or less.
Another thing that might be optimized is, the images that are loaded are larger than 256x256.
Be aware that GDI+ is not optimized for working with very large bitmaps of the kind that you would normally edit using photo software. Photo editing software packages generally have sophisticated algorithms that divide the image into smaller parts, swap parts to and from disk as needed to efficiently utilize memory, and so on.
Resizing an image is CPU intensive, especially when using a good quality interpolation algorithm. Take a cue from how Windows Explorer does this -- it saves the thumbnails to a file on disk for future access, and it does the processing in the background to avoid monopolizing the system.
This might be too late, but using the ImageLocation property will do two things:
speed up image loading
work around the bug where the image file is locked when you set the PictureBox Image property from a file
pictureBox1.ImageLocation = "image.jpg";
pictureBox1.SizeMode = PictureBoxSizeMode.StretchImage;
