Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question appears to be off-topic because it lacks sufficient information to diagnose the problem. Describe your problem in more detail or include a minimal example in the question itself.
Closed 8 years ago.
I'm using Visual Studio 2008 to work on a WinForms/WPF project. It is built from multiple projects and classes into a working product.
My problem is that we have noticed a 4-8 KB per second leak in memory usage. Granted, it is a small leak, but it is continuous and non-stop. Our application runs overnight and even for a few days at a time. After a few days, it has eaten up more memory than the computer can handle (usually 2-3 GB), and a forced restart of the PC is the only solution. This leak occurs even while nothing is happening except network communication with our host.
After further analysis of the project with ANTS Memory Profiler, we have discovered that the private bytes figure is continuously growing. Is there any way to tell where these private bytes are being allocated? I haven't had much luck tracking this down with ANTS. Steps would help greatly!
Image of the private bytes increasing (~45 minutes):
Image of the Time line growth (~45 minutes):
Thanks in advance!
If the private bytes keep increasing, you have a memory leak. Try DebugDiag; it is free, from Microsoft, and a very good tool for tracking down memory leaks on Windows.
Using the tool is simple: first, create a rule in DebugDiag Collection to monitor your process. It will create memory dumps according to your rule, or you can create a dump manually. Then use DebugDiag Analysis to analyze the dump; be sure to set the correct symbol path before running the analysis.
The MSDN article Identify And Prevent Memory Leaks In Managed Code might help too. It points out how to determine whether the leak is native or managed. If it is a purely managed .NET leak, you can also use CLR Profiler to debug the problem.
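A very common cause of slow, steady managed leaks like this is an event subscription that is never removed: the publisher keeps every subscriber alive for its own lifetime. A minimal sketch of the pattern and its fix (the Publisher/Subscriber names are hypothetical, not from your code):

```csharp
using System;

class Publisher
{
    public event EventHandler Tick;

    public void RaiseTick()
    {
        EventHandler handler = Tick;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

class Subscriber : IDisposable
{
    private readonly Publisher publisher;
    public int TicksSeen;

    public Subscriber(Publisher p)
    {
        publisher = p;
        publisher.Tick += OnTick;   // the publisher now holds a reference to us
    }

    private void OnTick(object sender, EventArgs e)
    {
        TicksSeen++;
    }

    // Without this unsubscribe, every Subscriber ever created stays
    // reachable for as long as the Publisher lives: a slow, steady leak.
    public void Dispose()
    {
        publisher.Tick -= OnTick;
    }
}
```

If ANTS shows instances of a class accumulating, checking its retention graph for an event handler delegate is often the quickest win.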
Closed 6 years ago.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Two days ago we released a new build of an existing ASP.NET 2.0 application, now converted to 4.5. Suddenly the app pool for this application is consuming heavy memory on the production server, more than 30 GB I guess. Code-wise there is nothing this app was doing earlier, or that was added with this release, that should consume anything heavy; it also has no file uploads/downloads and no heavy caching.
Since I am not able to find anything in the code that could cause this, I need a profiler that can tell me a few details on the server.
So is there any free or open-source tool that can help us find details such as all session data (with sizes) stored on the server for this app pool, and any other details that can help explain why this specific app pool is taking this much memory?
Apart from a tool, any other directions/suggestions would be helpful as well. Thanks...
These kinds of issues can often be traced with the WinDbg debugger from Microsoft, which is free.
First of all, you should create a dump file of your current w3wp process.
https://msdn.microsoft.com/en-us/library/d5zhxt22.aspx
After that, you'll be able to load current state of that process into WinDbg.
https://developer.microsoft.com/en-us/windows/hardware/windows-driver-kit
https://blogs.msdn.microsoft.com/jankrivanek/2012/11/15/setting-up-managed-code-debugging-with-sos-and-sosex/
https://theartofdev.com/windbg-cheat-sheet/
http://windbg.info/
Keep in mind that WinDbg is a low-level tool, so you will need to spend some time learning it and getting used to it.
Example of usage:
Create a process dump (*.DMP) via Task Manager; it will be located in the C:\Users\{username}\AppData\Local\Temp folder.
Open WinDbg (x64) -> Open Crash Dump -> Select created *.DMP
After that you need to setup symbols:
.symfix
.reload
Next, you should load .net runtime:
.loadby sos clr
You could get an exception if the server and your machine don't have the same version of the .NET CLR (see "What to do with "The version of SOS does not match the version of CLR you are debugging" in WinDbg?"). Test it with the !clrstack command.
If you want to load an additional module with extended commands (http://www.stevestechspot.com/):
.load PathToFile\sosex.dll
!sosex.help
Now you have everything in place and you can start to analyze the memory heap, threads, locks, etc.
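For a memory problem specifically, a typical SOS sequence once the dump is loaded looks something like this (the `<MT>` and `<address>` values are placeholders you copy from the previous command's output):

```
!eeheap -gc          ; size of the managed GC heap, per generation
!dumpheap -stat      ; histogram of live objects by type; check the biggest totals
!dumpheap -mt <MT>   ; list all instances of one suspicious type
!gcroot <address>    ; show what is keeping a given instance alive
```

If `!eeheap -gc` reports far less than the 30 GB the process is using, the growth is likely on the native side rather than in managed objects, which changes where you look next.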
You could also find extremely helpful information, tips and tricks on a blog by Tess Ferrandez https://blogs.msdn.microsoft.com/tess/tag/debugging/
Closed 6 years ago.
I am using ASP.NET with C# to re-rank a collection of images based on their content. When I run it, I get the following error, even though my laptop has 4 GB of RAM and a 320 GB hard disk:
exception of type 'system.outofmemoryexception' was thrown
How can I give my program more memory to run with?
It is nearly impossible to give you a good answer without seeing some code, but odds are that you are not actually running out of memory.
GDI will throw an OutOfMemoryException for many problems that are not related to memory at all. It can happen when you try to process a file that isn't actually an image, when the file is corrupt, or when it is an image format that GDI doesn't support.
First, check to make sure that every file or data stream you are processing is actually a real image file. If you are absolutely sure that the files are valid, and the format is supported by GDI, only then would I start looking at actual memory problems.
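One cheap way to do that first check, before ever handing a file to GDI, is to sniff the file's "magic number" bytes. This is a sketch, not an exhaustive validator; the ImageCheck helper is hypothetical and only recognizes PNG and JPEG:

```csharp
using System;

static class ImageCheck
{
    // Inspect the first bytes of a file to see whether it plausibly
    // is a PNG or JPEG before passing it to GDI+ (which throws
    // OutOfMemoryException for unrecognized or corrupt formats).
    public static bool LooksLikeImage(byte[] header)
    {
        // PNG signature: 0x89 'P' 'N' 'G'
        if (header.Length >= 4 &&
            header[0] == 0x89 && header[1] == 0x50 &&
            header[2] == 0x4E && header[3] == 0x47)
            return true;

        // JPEG start-of-image marker: FF D8 FF
        if (header.Length >= 3 &&
            header[0] == 0xFF && header[1] == 0xD8 && header[2] == 0xFF)
            return true;

        return false;
    }
}
```

Files that fail this check are almost certainly the ones triggering the misleading OutOfMemoryException, and you can skip or log them instead of crashing.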
Two options -
1) there's a bug in your code and it isn't freeing things up.
2) 4GB of RAM isn't a lot.
Your program can only use as much memory as the laptop has. But you can "extend" it by enabling virtual memory, which I suspect is disabled on your computer. Virtual memory (aka the paging file) allows the operating system to use disk space as if it were RAM.
However, it will be slow, because memory pages are written to and read from disk as needed. Your laptop is probably already slow enough.
Your best bet is to buy more RAM for your laptop. 8GB would be good enough (it is what I have on my "play around" laptop) --- 16GB is even better!
To enable virtual memory on Windows 7: open System Properties (search for it, or press Win+Pause). Select the "Advanced" tab and open Performance Settings. There, select the "Advanced" tab again and press "Change...". Choose "Automatically manage paging file size" and/or "System managed size" ("No paging file" disables virtual memory).
It is disabled because of the performance impact: your PC will be slower because it reads/writes to disk, which is many times slower than RAM. But it works.
If you can - Buy more memory. You'll be happier.
Closed 7 years ago.
Over the past couple months, I have been working on an application which converts PS files to PNG using Ghostscript.net, and then merges these into another PDF document.
There are 100+ PS files which get converted to PNG. This is done with Ghostscript.net and is multithreaded. I have not posted any code, because I don't believe my issue is necessarily code related. I have tested this program hundreds of times over the past couple months without issue. Yesterday I went to test it again, and the multithreaded portion was going incredibly slow. The CPU usage hovers around 100% during this portion, but the output is taking exponentially longer to produce.
I attempted to run an older, more stable, version of this application, and have the same issue. I also reduced the number of files it is running to just 1 PS file, and it still runs incredibly slow. Anything over 4 PS files ends up "timing out" or throwing an exception after attempting to run for about 5 minutes.
I ran my application on other computers, with less memory and processing power, and the other computers run it without issue.
My question is basically: what could have happened to my machine to cause these issues, and where is a good place to start investigating? I am using a work computer, so my level of access is limited. When I ran the program with only 4 PS files, the conversion process took a very long time, but the code executed afterwards seems to have no performance issues. I am using a Windows 7 machine with an Intel Core i7 and 16 GB of RAM.
EDIT: So I have identified the issue (and think it makes this question more relevant to this forum). Normally, when using the Ghostscript application to convert .ps to .pdf, a temporary file is created for each output file. Once the application is terminated, the temp files created in the session are deleted.
I went digging into my temp files in the AppData\Temp\ (after a little more research, I now know that Ghostscript will use the TEMP environment variable to determine where these files will be stored) and noticed that there were a bunch of _teXXXX.tmp files in there (a little over 65,000). I also noticed that these _teXXXX.tmp files were all in groups of a few hundred with the same creation date/time.
Apparently, when using the Ghostscript.net DLL, these files do not get disposed of. For a while you would not see a slowdown, because when the DLL requested a temp file name it would just continue counting up and return the next unused name. After months of testing, it must have run out of available names in the naming convention it was using, causing it first to slow down (searching through the whole range to find an unused name) and then finally to stop running altogether.
After I deleted all of these temp files, my application ran normally.
The Ghostscript.NET Processor object has a "Dispose()" method, but this did not seem to solve the issue. As I'm doing these conversions in a multithreaded environment, I am going to try to run them in a single threaded manner and see if the issue still occurs. I am submitting this as a bug, as I can't seem to find a solution for this at the moment. For now, I will just implement some code and delete these files myself after the run is complete.
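A sketch of that workaround: sweep the temp folder for the leftover _teXXXX.tmp files after each run. The `_te*.tmp` pattern matches the files observed above; adjust it if your version of Ghostscript names them differently:

```csharp
using System;
using System.IO;

static class GhostscriptTempCleanup
{
    // Delete the leftover _teXXXX.tmp files that Ghostscript leaves
    // in the given folder (normally the one pointed to by the TEMP
    // environment variable). Returns how many files were removed.
    public static int DeleteLeftoverTempFiles(string tempDir)
    {
        int deleted = 0;
        foreach (string file in Directory.GetFiles(tempDir, "_te*.tmp"))
        {
            try
            {
                File.Delete(file);
                deleted++;
            }
            catch (IOException)
            {
                // A file may still be locked by a running conversion; skip it.
            }
        }
        return deleted;
    }
}
```

Calling `GhostscriptTempCleanup.DeleteLeftoverTempFiles(Path.GetTempPath())` after the conversion batch completes keeps the folder from accumulating the 65,000+ files seen here.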
Closed 8 years ago.
I am currently analyzing my code and application for resource leaks. How do I monitor the currently running threads of a C# process?
There's already a tool for this built in: Parallel Stacks. To open it, click
Debug -> Windows -> Parallel Stacks
You can have a quick view on how to work with Parallel Stacks at MSDN.
This is the approach I use when I need to locate a leak:
Open a memory profiler.
I use perfmon.
This article has some material about setting perfmon and #fmunkert also explains it rather well.
Locate an area in the code that you suspect that it is likely that the leak is in that area. This part is mostly depending on you having good guesses about the part of the code that is responsible for the issue.
Push the leak to the extreme: use labels and "goto" to isolate an area or function and repeat the suspicious code many times (a loop will work too; I find goto more convenient for this).
In the loop, I used a breakpoint that halted every 50 hits to examine the delta in memory usage. Of course you can change that value to fit a noticeable leak change in your application.
If you have located the area that causes the leak, the memory usage should rapidly spike.
If the Memory usage does not spike, repeat stages 1-4 with another area of code that you suspect being the root cause. If it does, continue to 6.
In the area you have found to be the cause, use the same technique (goto + labels) to zoom in and isolate smaller parts until you find the source of the leak (please don't downvote me for the recursive step... :0) ).
Please note that the down sides of this method are:
If you are allocating an object in the loop, its disposal should also be contained in the loop.
If you have more than one source of leak, it is harder to spot (yet still possible).
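With those caveats, the loop-and-measure idea can be sketched with a plain loop instead of goto. `SuspiciousCode` below is a hypothetical stand-in for whatever you are isolating; here it leaks on purpose so the delta is visible:

```csharp
using System;
using System.Collections.Generic;

static class LeakHunter
{
    // Hypothetical stand-in for the code under suspicion; it leaks on
    // purpose by keeping every allocation alive in a static list.
    private static readonly List<byte[]> retained = new List<byte[]>();

    private static void SuspiciousCode()
    {
        retained.Add(new byte[64 * 1024]);
    }

    // Run the suspect many times and report how much managed memory
    // survived a full garbage collection afterwards.
    public static long MeasureLeak(int iterations)
    {
        long before = GC.GetTotalMemory(true);   // true = force full GC first
        for (int i = 0; i < iterations; i++)
            SuspiciousCode();
        long after = GC.GetTotalMemory(true);
        return after - before;   // a large positive delta points at a leak
    }
}
```

If the code under test were disposing its allocations properly, the delta would stay near zero; a delta that grows linearly with the iteration count is the spike this answer tells you to look for.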
Good luck...
If you have Visual Studio 2013, you can download the Microsoft Concurrency Visualizer for Visual Studio:
https://msdn.microsoft.com/en-us/library/dd537632.aspx
It gives great insight into the application and the threads currently running: synchronization, sleeping, blocking, and so on.
Next to that, you can also download the extension (found on the same page).
In my opinion it is a great tool (and best of all, free).
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 9 years ago.
This is a really general learning-based question, not a technical problem.
I'm making a game in Unity. The game involves a lot of pretty complex, high-object-count processes, and as such, I generate a lot of garbage. As everyone who makes games that use managed code knows, garbage collection spikes are a HUGE buzzkill. I am also trying to target mobile devices, so this pain is amplified.
My goal is simple: track down the procedures that generate the most garbage and pool and reuse the objects involved in order to reduce the load on the garbage collector. I have tackled all of the obvious classes, but my GC issues persist. If anyone can offer some greater wisdom here, please bring it on.
What I have been unable to track down in my research is a good way to measure GC load and track down problem spots in the code itself. I can get metrics from Unity and from diagnostics classes about the size of the memory pool (though I'm no expert on what any of those numbers really mean) and I can display the total number of GC collections that have happened since the start, but my knowledge of debugging this issue kind of ends there.
I know there MUST be better ways to solve this problem. Can anyone point me at the right tools or framework classes (or libraries?) that can help me out here? I've seen mention of a debugger program (sgen?) that's part of Mono, but I can't find a download, let alone a guide for how to hook it up to a Unity game.
Any help would be greatly appreciated. Thanks.
For C# and Unity3D specifically, the profiler built into Unity3D is the best tool you'll have available to you. You should be able to get very far with the Unity Profiler. Specifically the CPU Profiler:
I have highlighted the most important column - GC Alloc - Which, on a frame-by-frame basis, shows memory that has been allocated for garbage collection. The aim of the game here is to make that column as consistently close to zero as possible.
I'd suggest spending a few hours in there, playing your game, pausing it and digging into areas of your code where GC Alloc is showing numbers.
Finally, I strongly suggest you check out this video tutorial by one of the developers of Unity on memory debugging and using the Unity Editor profiler properly.
Caveat: it's all down to details: general principles are nice but you're right - real data is the right thing. Unfortunately it's hard to get for Unity: I've seen reports that DotTrace works for some aspects of Unity games but that most of Unity looks like a black box to it.
A useful rule of thumb is to look for new statements. It's a rule of thumb, not a 100% scientific principle (structs are created with new but they go on the stack... mostly...), but it's a good leading indicator of possible causes of fragmentation. If you "new" something up and let it just disappear (go out of scope, be dereferenced, etc.), it will end up in the garbage collector. Moreover, if you new up a batch of items and dereference only some of them, you'll also fragment the heap, which makes the collection spikes worse as the collector tries to defragment the memory.
This is why most recommendations in this space will be to use pools, as you're already doing.
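A minimal generic pool along those lines might look like this. It is a sketch, not Unity-specific API; instead of new-ing objects every frame and letting them become garbage, you rent them from the pool and return them when done:

```csharp
using System;
using System.Collections.Generic;

// Minimal object pool: reuses returned instances instead of letting
// them become garbage, so the GC has nothing new to collect.
class ObjectPool<T> where T : new()
{
    private readonly Stack<T> items = new Stack<T>();

    // Hand out a pooled instance if one exists, otherwise create one.
    public T Rent()
    {
        return items.Count > 0 ? items.Pop() : new T();
    }

    // Give an instance back for later reuse. Callers must reset its
    // state themselves; the pool does not clear it.
    public void Return(T item)
    {
        items.Push(item);
    }
}
```

In a Unity game the Rent/Return calls would replace the new statements that the GC Alloc column flags; a frame that only rents and returns pooled objects shows zero in that column.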
Here are some general purpose links that might be useful in tackling memory management in Unity:
http://www.gamasutra.com/blogs/WendelinReich/20131109/203841/C_Memory_Management_for_Unity_Developers_part_1_of_3.php
http://andrewfray.wordpress.com/2013/02/04/reducing-memory-usage-in-unity-c-and-netmono/