I am working on a C# WPF application which uses pixel data from many images to process one image.
It stores every image as a System.Drawing.Bitmap, and each bitmap is locked into memory.
The user is able to open any number of images.
The question is: what should normally happen when the user opens so many images that memory fills up during processing?
On my Windows 8.1 computer, when this happens, I see in Task Manager that the memory usage keeps climbing, the application slows down, freezes for a minute, and then exits.
However, on my Windows 8.1 (non-RT) tablet, when this happens, I see in Task Manager that the memory usage climbs, then suddenly drops, then climbs again, and so on two or three times (this is very strange to me, because I think all the images should be kept in memory and only released when no longer needed). The speed stays normal, there is no freeze, and then an AccessViolationException occurs.
So I would like to know whether these behaviors are normal, and if not, what the normal behavior is and why it does not happen for me.
C# is not a good language for memory-hungry applications. So, as a suggestion, I would say:
Validate whether you really need all the images in memory at the same time for the processing, or whether only two of them, or some subset, need to be in memory simultaneously.
If the answer is yes, you may look at memory-mapped files (see the sketch after this list).
If the answer is no, rearchitect your code.
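A minimal sketch of the memory-mapped route, assuming each image's pixels have already been saved to a raw file on disk (the file name, width, and pixel format below are placeholders, not taken from the question):

```csharp
using System.IO;
using System.IO.MemoryMappedFiles;

// "frame0.raw" is a placeholder: one image saved as raw 32-bit BGRA bytes.
using (var mmf = MemoryMappedFile.CreateFromFile("frame0.raw", FileMode.Open))
using (var view = mmf.CreateViewAccessor())   // maps the whole file
{
    // The OS pages pixel data in on demand, so the managed heap never has to
    // hold a locked Bitmap for every open image.
    const long bytesPerPixel = 4;
    const long widthInPixels = 1920;                           // placeholder width
    long offset = (100 * widthInPixels + 50) * bytesPerPixel;  // pixel (x=50, y=100)
    uint bgraPixel = view.ReadUInt32(offset);
}
```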
To answer your question: no, it is not normal behaviour, and the only right way to deal with memory consumption that leads to undefined application behavior is to fix the application's architecture.
Related
As far as I know, a single .NET application can allocate a lot of the available memory. It will be released by the GC at some point. I never had to care much about the details. It just works.
But what happens if a native application is started while most or all memory is used by the .NET application? Will the GC respect this and free memory first? Or will Windows "take care" of it and swap the .NET application's memory into the swap file?
I have a single PC with a slow HDD, where my WPF application (MVVM, bitmaps, database, pretty memory-intensive) occupies 200-2000 MB (up to 80%) of RAM, and I get reports that the PC becomes slow when running Office, antivirus, etc.
In Photoshop, for example, there is a setting to limit the amount of RAM used. Now I am wondering whether such a thing makes sense in my WPF application.
Is uncontrolled GC memory allocation a problem or not? Should I limit the amount of memory my application uses?
The problem described does not look related to .NET memory management or the GC, assuming the application just holds its operational data (data in use) in memory.
A .NET app is no different from any other user app, so the OS will treat them all the same way: move infrequently used memory blocks to the swap file.
If the application occupies 80% of RAM and works with memory intensively, competing with other applications, the whole process will generate a lot of page faults causing large traffic between swap file and memory. This leads to severe performance degradation, especially in case of slow HDD.
The .NET part in this game is just to clean the application's memory from time to time, removing data that is no longer in use. If a large amount of data is genuinely required for the app to run and adding more RAM is not an option, then an application redesign (one that limits the amount of loaded data somehow) is an approach worth considering.
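If a redesign is the route taken, one common pattern is to cap how much image data stays resident and evict the least recently used entries, reloading them from disk when needed. A rough sketch, assuming images can be reduced to byte arrays (the class and its byte budget are illustrative, not from the original answer):

```csharp
using System;
using System.Collections.Generic;

// Keeps decoded image data under a fixed byte budget; the least recently used
// entries are evicted and must be re-loaded from disk when needed again.
class BoundedImageCache
{
    private readonly long _maxBytes;
    private long _currentBytes;
    private readonly LinkedList<string> _lru = new LinkedList<string>();
    private readonly Dictionary<string, (byte[] Pixels, LinkedListNode<string> Node)> _items =
        new Dictionary<string, (byte[], LinkedListNode<string>)>();

    public BoundedImageCache(long maxBytes) => _maxBytes = maxBytes;

    public byte[] GetOrAdd(string path, Func<string, byte[]> load)
    {
        if (_items.TryGetValue(path, out var hit))
        {
            _lru.Remove(hit.Node);
            _lru.AddFirst(hit.Node);              // mark as most recently used
            return hit.Pixels;
        }

        byte[] pixels = load(path);               // e.g. decode the file to raw bytes
        while (_currentBytes + pixels.Length > _maxBytes && _lru.Count > 0)
        {
            string oldest = _lru.Last.Value;      // evict least recently used
            _currentBytes -= _items[oldest].Pixels.Length;
            _items.Remove(oldest);
            _lru.RemoveLast();
        }

        _items[path] = (pixels, _lru.AddFirst(path));
        _currentBytes += pixels.Length;
        return pixels;
    }
}
```

Usage would be something like `var cache = new BoundedImageCache(500L * 1024 * 1024);` with a loader delegate that decodes the file; the budget is whatever fraction of RAM you decide the app may keep.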
I am working to diagnose a series of OutOfMemoryException problems within an application of ours. This is an internal 32-bit (x86) OWIN-hosted WebAPI that runs within a console application and talks to a series of hardware components in parallel. For that reason it's creating around 20 instances of a library, and the sharp increase in "virtual size" memory matches when those instances are created.
From the output of Process Explorer, and dotMemory, it does not appear that we're allocating that much actual memory within this application.
From reading many, many SO answers I think I understand that our problem is either from fragmentation within the Gen 0, Gen 1, Gen 2 & LOH heaps, or we're possibly bumping into the 2GB addressable memory limit for a 32-bit process running on Windows 7. This application works in batches where it collects a bunch of data from hardware devices, creates collections in memory to aggregate that data into a single object, and then saves it to be retrieved by a client app. This activity is the cause of the spikes in the dotMemory visual, but these data structures are not enormous, which I think the dotMemory chart shows.
Looking at the heaps has shown they rarely grow beyond 10-15MB in size, and I don't see much evidence that the LOH is growing too large or being severely fragmented. I'm really struggling with how to proceed to better understand what's happening here.
So my question is two-fold:
Is it conceivable that we could be hitting that 2GB limit for virtual memory, and that's a cause for these memory exceptions?
If that is a possible cause then am I right in thinking a 64-bit build would get around that?
We are exploring moving to a 64-bit build, but that would require updating some low-level libraries we use to also be 64-bit. It's certainly an option we will explore eventually (if not sooner), but we're trying to understand this situation better before investing the time required.
Update after setting the LARGEADDRESSAWARE flag
Based on a recommendation I set that flag on the binary, and interestingly saw the virtual size jump immediately to nearly 3GB. I don't know if I should be alarmed by that?!
I will monitor the application with this configuration for the next several hours.
In my case the advice provided by @ThomasWeller was indeed correct, and enabling the "large address aware" flag has allowed this application to run for several days without throwing memory exceptions.
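For anyone who runs into the same exceptions before a 64-bit build is feasible, one defensive option (not part of the advice above, just a hedged sketch) is to gate each batch with System.Runtime.MemoryFailPoint, so the work fails up front with InsufficientMemoryException instead of dying mid-aggregation; the 200 MB estimate is a placeholder:

```csharp
using System;
using System.Runtime;

// Ask the CLR up front whether an estimated amount of memory is likely to be
// available before starting an expensive batch.
try
{
    using (new MemoryFailPoint(sizeInMegabytes: 200))   // placeholder estimate
    {
        // run the batch: collect device data, aggregate, persist
    }
}
catch (InsufficientMemoryException)
{
    // not enough memory/address space right now: defer, shrink the batch, or retry later
}
```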
I've designed a logging service for multiple applications in C#.
For performance reasons, all logs are stored in a buffer first and then written to the log file when the buffer is full.
However, some extension cards (PCI / PCI-e) that are not under my control cause BSoDs. The logs in the buffer are lost when a BSoD occurs, but I want to find a way to keep them.
I've found some articles discussing how to dump data when software crashes. However, the minidump approach requires me to dump everything myself, and I think it will cause performance issues; the other articles (A)(B) are only suitable for a single application crash.
Does anyone have any suggestion for saving my logs even if a BSoD occurs?
EDIT: any suggestion that reduces the loss of data to a minimum is also welcome.
Since the buffer you have in your C# application is not written to disk for performance reasons, the only other place left for it is memory (RAM). Since you don't know how Windows manages your memory at the moment of the crash, we have to consider two cases: a) the log is really in RAM and b) that RAM has been swapped to disk (page file). To get access to all RAM at the time of a BSoD, you have to configure Windows to create a full memory dump instead of a kernel minidump.
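As a hedged aside (not in the original answer): the dump type is controlled by the CrashDumpEnabled value under HKLM\SYSTEM\CurrentControlSet\Control\CrashControl, where 1 means a complete memory dump. A quick check from C#:

```csharp
using System;
using Microsoft.Win32;

// Reads the configured crash dump type:
// 0 = none, 1 = complete, 2 = kernel, 3 = small (minidump), 7 = automatic.
// A complete dump (1) is required to capture user-mode buffers.
object value = Registry.GetValue(
    @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl",
    "CrashDumpEnabled",
    null);
Console.WriteLine($"CrashDumpEnabled = {value ?? "(not set)"}");
```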
At the time of a blue screen, the operating system stops relying on almost anything, even most kernel drivers. The only attempt it makes is to write the contents of the physical RAM onto disk. Furthermore, since it cannot even rely on valid NTFS data structures, it writes to the only place of contiguous disk space it knows: the page file. That's also the reason why your page file needs to be at least as large as physical RAM plus some metadata, otherwise it won't be able to hold the information.
At this point we can already give an answer to case b): if your log was actually swapped to the page file, it will likely be overwritten by the dump.
If the buffer was really part of the working set (RAM), that part will be included in the kernel dump. Debugging .NET applications from a kernel dump is almost impossible, because the SOS commands to analyze the .NET heaps only work for a user mode full memory dump. If you can identify your log entries by some other means (e.g. a certain substring), you may of course do a simple string search on the kernel dump.
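If the string-search route sounds workable, here is a small sketch of how buffered entries could be made easy to find in a raw dump (the marker text and class are made up for illustration; remember that .NET strings live in memory as UTF-16, so search the dump for a Unicode string):

```csharp
using System;
using System.Collections.Concurrent;

// Every buffered entry carries a fixed, unusual marker so that a plain
// string search over a complete memory dump can recover the lines that
// never reached the log file.
static class CrashSearchableLog
{
    private const string Marker = "##MYAPP-LOG##|";   // illustrative marker
    private static readonly ConcurrentQueue<string> Buffer = new ConcurrentQueue<string>();

    public static void Add(string message)
    {
        Buffer.Enqueue(Marker + DateTime.UtcNow.ToString("o") + "|" + message);
    }
}
```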
All in all, what you're trying to achieve sounds like an XY-problem. If you want to test your service, why don't you remove or replace the unrelated problematic PCI cards, or test on a different PC?
If blue screen logging is an explicit feature of your logging service, you should have considered this as a risk and evaluated it before writing the service. That's a project management issue and off-topic for Stack Overflow.
Unfortunately I have to confirm what @MobyDisk said: it's (almost) impossible, and at least unreliable.
I'm trying to run hundreds of instances of the same app simultaneously (using C#), and after about 200 instances the GUI starts to slow down dramatically, to the point that the load time of the next instance climbs up to 20s (from 1s).
The test machine is:
Xeon 5520
12 GB RAM
Windows 2008 Web 64-bit
At max load (200 instances) the CPU is at about 20% and RAM at 45%, so I'm sure it's not a hardware issue.
I already tried configuring the session size and SharedSection in the Windows registry, but it doesn't seem to help.
I also tried running the app in the background and across multiple (different) sessions, and it's still the same (I thought maybe it was a per-session limitation).
When the slowdown occurs on one session, for example, I can log in to another session and that desktop works without a problem (the first desktop becomes unusable).
My question is: is there a way to strip the GDI objects or maybe eliminate the use of the GUI? Or is it a Windows limitation?
P.S. - I can't change the app since it's third-party.
Thanks in advance.
With 200 instances running, the constant context switching is probably hurting performance. Context switching isn't counted in CPU load.
Edit: whoops, wrong link.
Try monitoring context switching on your system
http://technet.microsoft.com/en-us/library/cc938606.aspx
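If watching this from code is more convenient than Performance Monitor, here is a small sketch using the documented "System" performance counter category (counter names as they appear in perfmon):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Samples the system-wide context switch rate once per second.
using (var counter = new PerformanceCounter("System", "Context Switches/sec"))
{
    counter.NextValue();                      // first call only primes the counter
    for (int i = 0; i < 5; i++)
    {
        Thread.Sleep(1000);
        Console.WriteLine($"Context switches/sec: {counter.NextValue():F0}");
    }
}
```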
I doubt it's GDI - if you run out of GDI handles/resources you'll notice vast chunks of your windows failing to redraw, rather than everything slowing down.
The most likely reason for a sudden drop in performance is that you are maxing out your RAM and thrashing your Virtual Memory as all your processes fight for CPU time. Check memory usage, and if it's high, see if you can reduce the footprint of your application. Or apply a "hardware fix" by installing more RAM. Or add Sleeps into your Apps where possible so that they aren't demanding constant timeslices from your CPU (and thus needing to be constantly paged in from VM).
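To rule GDI in or out concretely, you can ask Windows how many GDI and USER handles each instance is holding (the default per-process limit is 10,000 GDI objects). Here is a sketch using the documented GetGuiResources API; the process name is a placeholder:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class GuiResourceCheck
{
    private const uint GR_GDIOBJECTS = 0;
    private const uint GR_USEROBJECTS = 1;

    [DllImport("user32.dll")]
    private static extern uint GetGuiResources(IntPtr hProcess, uint uiFlags);

    static void Main()
    {
        // "ThirdPartyApp" is a placeholder for the real process name.
        foreach (var p in Process.GetProcessesByName("ThirdPartyApp"))
        {
            uint gdi = GetGuiResources(p.Handle, GR_GDIOBJECTS);
            uint user = GetGuiResources(p.Handle, GR_USEROBJECTS);
            Console.WriteLine($"PID {p.Id}: GDI handles = {gdi}, USER handles = {user}");
        }
    }
}
```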
I wrote a hello world program (a Windows application without any UI) in C#. The release-build executable doesn't do anything but Thread.Sleep(50000) // 50 seconds.
I opened a Sysinternals tool (a profiler like Task Manager). This executable ate 7 MB of memory (private bytes)!!
Can anybody explain what is happening and how to make the memory usage smaller?
P.S. I also tried to use NGEN to pre-compile the .exe but still got the same memory usage.
Thanks a lot
C# (and other JIT compiled/interpreted languages) usually end up doing a lot of things for you automatically in the background. While what you've written is simple, there's a lot of stuff going on in the background to support the JIT nature of the application.
That 7MB of memory is relatively small given 2GB of RAM is fairly commonplace these days. And it probably won't go up more unless you do something unusual or allocate lots of arrays and data structures.
If it's a Hello World based on the C# WindowsApplication project type, with a Main method in Program.cs calling Application.Run on a Windows Form, then not only is there a lot of JIT overhead, but there's a lot of Windows Forms overhead too.
At the end of the day, I'm sure everything is dandy.
.Net apps have a lot of basic overhead, but your memory increase should be relatively small after that point.
Example: one of my apps consumes 10MB of memory on load, but only 40MB after loading 60k rows from a database, each row containing multiple strings and many values.
A small fixed upfront cost isn't that bad on modern computers. Scalability is the primary issue nowadays. .NET scales fairly well for being managed.
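If you want to see where the memory actually sits in your own process, here is a quick way to split the managed heap from the overall private bytes (a sketch, not from the answers above):

```csharp
using System;
using System.Diagnostics;

// Managed heap vs. total private memory; the difference is mostly CLR/JIT
// infrastructure, loaded assemblies, and native allocations.
long managedBytes = GC.GetTotalMemory(forceFullCollection: true);
long privateBytes = Process.GetCurrentProcess().PrivateMemorySize64;
Console.WriteLine($"Managed heap:  {managedBytes / 1024.0 / 1024.0:F1} MB");
Console.WriteLine($"Private bytes: {privateBytes / 1024.0 / 1024.0:F1} MB");
```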