I have written a C# Windows service that collects syslog messages and inserts them into a SQL Server database.
The average rate is 5-10 inserts per second, and each time I have one read operation and one insert operation.
I should note that I use the "using" statement to manage SQL Server connections.
The only problem I face is that Windows memory usage increases steadily and even reaches 6 GB!
Is this a normal case?
And if it is, what's the solution to decrease memory usage?
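The read-plus-insert pattern, with connections wrapped in "using", presumably looks roughly like the sketch below; the table and column names are invented for illustration.

```
using System;
using System.Data.SqlClient;

class SyslogWriter
{
    private readonly string _connectionString;

    public SyslogWriter(string connectionString)
    {
        _connectionString = connectionString;
    }

    // One read + one insert per message; the connection and commands are
    // disposed as soon as each "using" block ends.
    public void Store(string host, string message)
    {
        using (var connection = new SqlConnection(_connectionString))
        {
            connection.Open();

            // Hypothetical lookup standing in for the question's "read operation".
            int? deviceId;
            using (var read = new SqlCommand(
                "SELECT Id FROM Devices WHERE Host = @host", connection))
            {
                read.Parameters.AddWithValue("@host", host);
                deviceId = (int?)read.ExecuteScalar();
            }

            using (var insert = new SqlCommand(
                "INSERT INTO SyslogEntries (DeviceId, Message, ReceivedAt) " +
                "VALUES (@deviceId, @message, @receivedAt)", connection))
            {
                insert.Parameters.AddWithValue("@deviceId", (object)deviceId ?? DBNull.Value);
                insert.Parameters.AddWithValue("@message", message);
                insert.Parameters.AddWithValue("@receivedAt", DateTime.UtcNow);
                insert.ExecuteNonQuery();
            }
        }
    }
}
```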
The only problem I face is that Windows memory usage increases steadily and even reaches 6 GB! Is this a normal case?
This is normal, until you face issues with memory pressure.
How do you identify and troubleshoot whether you have memory pressure?
How SQL Server manages memory is a complex topic, but there are many links on SO that will help you.
And if it is, what's the solution to decrease memory usage?
It's good to follow best practices: leave about 10% of RAM to the OS and give the rest to SQL Server by setting a max server memory limit.
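To make that concrete, the cap is the "max server memory (MB)" sp_configure option. A minimal sketch of setting it from C# follows; the 6144 MB value and connection string are placeholders, and a DBA would usually run the same batch directly in SSMS.

```
using System.Data.SqlClient;

class SqlServerMemoryCap
{
    // Caps SQL Server's memory usage. The right value depends on how much RAM
    // the box has and what else runs on it.
    public static void SetMaxServerMemory(string connectionString, int megabytes)
    {
        const string sql = @"
            EXEC sp_configure 'show advanced options', 1;
            RECONFIGURE;
            EXEC sp_configure 'max server memory (MB)', @mb;
            RECONFIGURE;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@mb", megabytes);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

// Example: SqlServerMemoryCap.SetMaxServerMemory(connectionString, 6144);
```

SQL Server will still use up to that cap, but it will no longer crowd out the OS and other processes.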
As far as I know, a single .NET application can allocate a lot of the available memory. It will be released by the GC at some point. I never had to care much about the details; it just works.
But what happens if a native application is started while most or all memory is used by a .NET application? Will the GC respect this and free memory first? Or will Windows "take care" of it and swap the .NET application's memory into the swap file?
I have a single PC with a slow HDD where my WPF application (MVVM, bitmaps, database, pretty memory-intensive) occupies 200-2000 MB (up to 80%) of RAM, and I get reports that the PC becomes slow when running Office, antivirus, etc.
In Photoshop, for example, there is a setting to limit the amount of RAM used. Now I am wondering whether such a thing makes sense in my WPF application.
Is uncontrolled GC memory allocation a problem or not? Should I limit the amount of memory used by my application?
The problem described does not look related to .NET memory management/GC, assuming that the application just holds its operational (in-use) data in memory.
A .NET app is no different from any other user app, so the OS will treat them the same way: move infrequently used memory blocks to the swap file.
If the application occupies 80% of RAM and works with memory intensively, competing with other applications, the whole process will generate a lot of page faults, causing heavy traffic between the swap file and memory. This leads to severe performance degradation, especially with a slow HDD.
The .NET part in this game is just to clean the application's memory from time to time of data no longer in use. If a large amount of data is genuinely required for the app to run and adding more RAM is not an option, then an application redesign that somehow limits the amount of loaded data is an approach worth considering.
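One possible shape for such a redesign on .NET 4 and later, offered only as a hedged sketch, is to put the heavy data behind System.Runtime.Caching.MemoryCache and give it explicit limits so old entries are trimmed instead of accumulating; the limit values below are just examples.

```
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

static class BoundedDataCache
{
    // MemoryCache polls its own size and trims entries when it approaches these
    // limits, so the application does not grow without bound just because data
    // was loaded once.
    private static readonly MemoryCache Cache = new MemoryCache(
        "data",
        new NameValueCollection
        {
            { "CacheMemoryLimitMegabytes", "512" },     // hard cap for this cache (example value)
            { "PhysicalMemoryLimitPercentage", "40" }   // also trim if the whole machine gets tight
        });

    public static T GetOrAdd<T>(string key, Func<T> load) where T : class
    {
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        var value = load();
        if (value != null)
            Cache.Set(key, value, new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(10) });
        return value;
    }
}
```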
We are designing an enterprise application which caches a lot of data from the back end. Users are allowed to open an arbitrary number of app windows, and each loads its own data and caches it. To manage memory consumption somehow and prevent overall OS performance from degrading, we decided to write a cache manager that automatically monitors the app's memory footprint and removes data from the cache when needed.
The problem is that we have difficulty identifying when it is time to free up memory. Currently we use a very simple approach: we just start throwing stuff away from the cache when the app's memory usage exceeds 80% of physical memory.
Are there any (alternative?) established practices for dealing with such kind of problem?
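A minimal sketch of the check described above, assuming the 80% threshold from the question and a trim callback supplied by the cache manager; how the total physical memory figure is obtained is left as an assumption (for example GC.GetGCMemoryInfo().TotalAvailableMemoryBytes on newer runtimes, or a WMI/ComputerInfo query on .NET Framework).

```
using System;
using System.Diagnostics;
using System.Threading;

class CacheMemoryMonitor : IDisposable
{
    private readonly Timer _timer;
    private readonly long _limitBytes;
    private readonly Action _trimCache;

    public CacheMemoryMonitor(long totalPhysicalBytes, Action trimCache)
    {
        _limitBytes = (long)(totalPhysicalBytes * 0.8);   // the question's 80% threshold
        _trimCache = trimCache;
        _timer = new Timer(_ => Check(), null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }

    private void Check()
    {
        using (var me = Process.GetCurrentProcess())
        {
            // Private bytes is a reasonable proxy for what this app costs the machine.
            if (me.PrivateMemorySize64 > _limitBytes)
                _trimCache();
        }
    }

    public void Dispose()
    {
        _timer.Dispose();
    }
}
```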
This is basically OK. There is no really good strategy: if there are multiple competing applications, this can lead to cache competition and false evictions.
If you pick the threshold too low, you waste cache space. If it's too high, nothing else might fit into memory, including the file cache, DLLs, ...
What do you mean by "available physical memory"? Do you mean installed memory or memory that's free? How can an app use 80% of free memory? I'm unclear on the definition that you are using.
SQL Server uses memory until the OS signals that it's low on memory (I believe that happens when 95% of "something" is being used).
You certainly do not want to use the GC to free memory. It will routinely kill your entire cache.
Maybe you can move the cache contents to disk entirely? Or you could share the cache between .NET processes by having a hidden cache-server process that can be queried by the app processes.
I want to stress that if your app consumes 99% of installed RAM (as an example), performance will be very bad because the file cache is almost empty. This means that even DLLs and .NET NGEN'ed code will be paged out and in frequently.
Maybe a better strategy is to assume that 1 GB will be needed to appropriately cache the OS and app files. So you can consume memory until only 10% of the installed RAM minus 1 GB remains free.
I have an ASP.NET/C# 4.0 website on a shared server.
The OS is Windows Server 2012 running IIS 7, with 32 GB of RAM and a 3.29 GHz processor.
The server runs into difficulty now and again, such as problems RDP'ing in and other PHP websites running slowly.
Our sysadmin has suggested that my website may be a memory hog and the cause of these issues.
At any given time the site's "Memory (private working set)" is 2 GB, and it can peak as high as 15 GB.
I ran a trial version of JetBrains dotMemory on the server, attached to the website's w3wp.exe process. This was my first time using the program; I am a complete novice here.
I took two memory snapshots using dotMemory.
And the basic snapshot comparison can be seen here: http://i.imgur.com/0Jk8yYE.jpg
From the above comparison we can see that System.String and BusinessObjects.Item are the two items with the most survived bytes.
Drilling down on System.String, I could see that the dominant surviving object was System.Web.Caching.CacheEntry at 135 MB. See the screengrab: http://i.imgur.com/t1hs8nd.jpg
This leads me to suspect that maybe I cache too much.
I cache all data that comes out of my database: HTML page content, nav-menu items, relationships between pages and their children, articles, etc., using HttpContext.Current.Cache.Insert,
with the cache timeout set to 10080 minutes.
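With that timeout, the calls presumably look something like this sketch (the wrapper and key names are illustrative, not taken from the real site); 10080 minutes is 7 days of absolute expiration, so everything inserted stays rooted inside w3wp.exe for a week unless ASP.NET evicts it under memory pressure.

```
using System;
using System.Web;
using System.Web.Caching;

static class SiteCache
{
    // 10080 minutes = 7 days. With an absolute expiration this long, everything the
    // site has touched in the last week stays referenced by the cache.
    private static readonly TimeSpan Timeout = TimeSpan.FromMinutes(10080);

    public static void Put(string key, object value)
    {
        HttpContext.Current.Cache.Insert(
            key,
            value,
            null,                                // no CacheDependency
            DateTime.UtcNow.Add(Timeout),        // absolute expiration
            Cache.NoSlidingExpiration);
    }

    public static object Get(string key)
    {
        return HttpContext.Current.Cache[key];
    }
}
```

If this cache accounts for most of the private working set, the usual levers are shorter or sliding expirations, CacheItemPriority.Low so ASP.NET scavenges these entries first, or the privateBytesLimit attribute of the cache element in web.config.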
My Questions:
1 - Is 2 GB memory (private working set), peaking as high as 15 GB, too much for a server with 32 GB of RAM?
2 - Have I used dotMemory correctly to identify an issue?
3 - Is my caching an issue?
4 - Possible other causes?
5 - Possible solutions?
Thanks
Usage of a large amount of memory cannot by itself slow down a program. It can be slowed down:
1) if Windows is actively using the swap file (ask an admin to check this case);
2) if the .NET program causes too much garbage collection (that is, produces too much memory traffic).
You can see memory traffic information in dotMemory. Unfortunately, it can't gather this information when attached to a running program; the only way to collect object-creation stack traces and memory traffic information is to launch your program under the dotMemory profiler with the corresponding settings enabled.
BUT!
The problem is that you can't tell whether a memory traffic amount "N" is high or not; the only way to find the root of a performance problem is to use a performance profiler.
For example, JetBrains dotTrace can show you how much time your program spends in garbage collection, and only if this turns out to be a bottleneck should you use a memory profiler to find the source of the traffic.
Conclusion: try to find the bottleneck using a performance profiler first.
Then, if you still have questions about dotMemory, ask me and I'll try to help you :)
We have a SQL Server 2008 machine. For some reason we still haven't figured out, every two weeks this server stops responding at around the same time and on around the same day (either Saturday or Sunday). We have checked the logs, and the only message we have found is this:
A significant part of sql server process memory has been paged out.
In the operating system log we also found a message:
Application popup: Windows - Virtual Memory Minimum Too Low : Your system is low on virtual memory. Windows is increasing the size of your virtual memory paging file. During this process, memory requests for some applications may be denied. For more information, see Help.
So it looks like the operating system is out of physical memory. We do not understand why this happens every two weeks; it seems as if memory never gets freed and two weeks is how long it takes to fill up. Is there a way we could diagnose this better? We are also wondering whether it is related to how we are using NHibernate, or whether there is some other cause.
SQL Server consumes more and more memory over time; this is normal. We faced this issue after the server had been up for a couple of MONTHS: SQL Server's memory consumption went up to several GB and Windows eventually cut it down...
Setting a "max server memory" limit for your SQL Server should help. On our 8 GB server we've set it to 5.5 GB.
PS: Setting up a "low memory" email alert is good practice. It will let you know just before things are about to go wrong. This blog post explains how to do this.
1) Identify the process that consumes the memory. Use the Process performance object and identify the process that consumes memory (large Private Bytes and Virtual Bytes).
2) If the process turns out to be SQL Server, follow the SQL Server memory diagnostics steps. See Monitoring Memory Usage and Using DBCC MEMORYSTATUS to Monitor SQL Server Memory Usage.
Depending on the identified memory-consuming process, and possibly on the identified SQL Server memory clerk, appropriate actions and remedies can be recommended; but until you do the due diligence and find the cause, no useful advice can be given.
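If the consumer does turn out to be SQL Server, one way to watch its memory from code, alongside DBCC MEMORYSTATUS, is the sys.dm_os_process_memory DMV (available from SQL Server 2008 onward); a small sketch:

```
using System;
using System.Data.SqlClient;

class SqlMemoryCheck
{
    // Reads SQL Server's own view of its memory use. Column names are from
    // the sys.dm_os_process_memory DMV.
    public static void PrintProcessMemory(string connectionString)
    {
        const string sql = @"
            SELECT physical_memory_in_use_kb,
                   memory_utilization_percentage,
                   process_physical_memory_low
            FROM sys.dm_os_process_memory;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(
                        "In use: {0} KB, utilization: {1}%, physical memory low: {2}",
                        reader["physical_memory_in_use_kb"],
                        reader["memory_utilization_percentage"],
                        reader["process_physical_memory_low"]);
                }
            }
        }
    }
}
```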
I was hoping someone could explain why my application uses varying amounts of RAM when loaded. I'm talking about a compiled version that runs the exe directly. It's a pretty basic application and there are no conditional branches in its startup, yet every time I start it up the RAM usage varies between 6 MB and 16 MB.
I know this is on the small end of usage anyway, but I'm curious why it happens.
Edit: to clarify a bit what the app actually does:
It is a WinForms project.
It connects to a database using SqlClient to retrieve a list of servers.
Based on that list, a series of buttons is created to start and stop a service on those servers.
It uses a System.Timers.Timer to audit the status of the service on those servers every 20 seconds.
At this point the application sits there and waits for user input, via one of the button clicks, to start or stop the service.
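The polling part described above presumably looks roughly like the sketch below, assuming ServiceController is used for the status/start/stop work; the server list and service name are placeholders.

```
using System;
using System.ServiceProcess;
using System.Timers;

class ServiceAuditor
{
    private readonly string[] _servers;
    private readonly string _serviceName;
    private readonly Timer _timer;

    public ServiceAuditor(string[] servers, string serviceName)
    {
        _servers = servers;
        _serviceName = serviceName;

        // Poll every 20 seconds with System.Timers.Timer, as in the question.
        _timer = new Timer(20000);
        _timer.Elapsed += (sender, args) => Audit();
        _timer.AutoReset = true;
        _timer.Start();
    }

    private void Audit()
    {
        foreach (var server in _servers)
        {
            using (var controller = new ServiceController(_serviceName, server))
            {
                controller.Refresh();
                Console.WriteLine("{0}: {1}", server, controller.Status);
                // The real app would update the corresponding button's state here instead.
            }
        }
    }
}
```

Since the Elapsed event fires on a thread-pool thread, the real app would need Control.Invoke (or similar) to update the buttons from Audit.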
The trick here is that the amount of RAM reported by Task Manager is not the amount of RAM used by your application; rather, it is the amount of RAM reserved for use by your application.
Remember that with managed frameworks like .NET, you don't request or release memory directly. Rather, a garbage collector manages the memory for you. The amount of memory reserved for your application at a given time can vary and depends on a lot of different factors, including the memory pressure created at the time by other programs.
Think of it this way: if you need 10 MBs of RAM for your app, is it faster to request and return it to the operating system 1 MB at a time over 10 requests/releases or reserve the block at once with one request/release? Now extend that to a scenario where you don't know exactly how much RAM you'll need, only that it's somewhere in the neighborhood of 10 MB. Additionally, your computer has 1 GB sitting there unused. Of course the best thing to do is take a good-sized chunk of that available RAM. Even 20 or 30 MB wouldn't be unreasonable relative to the ram that's sitting there unused, because unused RAM is wasted performance.
If your system later starts to feel some memory pressure, then .NET can easily return some RAM to the system. This is one of the ways managed languages can sometimes give better performance than languages like C++ with traditional memory management: a garbage collector can more easily take the health of the entire system into account when allocating memory.
What are you using to determine how much memory is being "used"? Even with regular applications, Windows will aggressively allocate unused memory in advance; with .NET applications it's even more complicated to tell how much memory is actually being used and how much Windows is just tacking on so that it will be available instantly when needed. If another application actually asks for memory, this reserved memory will be repurposed.
One way to check is to minimize the application (at least on XP). If you are looking at the memory use in something like Task Manager, you'll notice it drops off right away, eliminating the seemingly "random" amount allocated.
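To see concretely which number is which, the Process class exposes both figures; a small sketch (the GC heap line is just the managed portion):

```
using System;
using System.Diagnostics;

class MemoryNumbers
{
    // The working set is what Windows currently keeps resident for the process;
    // private bytes is what the process has committed. The gap between them is
    // why the "RAM" column can look different from run to run.
    public static void Print()
    {
        using (var me = Process.GetCurrentProcess())
        {
            Console.WriteLine("Working set:   {0:N0} bytes", me.WorkingSet64);
            Console.WriteLine("Private bytes: {0:N0} bytes", me.PrivateMemorySize64);
            Console.WriteLine("GC heap:       {0:N0} bytes", GC.GetTotalMemory(false));
        }
    }
}
```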
It may be related to the JIT compiler: after the first load the jitter has already created a compiled version, so it doesn't need to run again. Other than that, you would have to give us some more details about the app and which kind of memory you are referring to.