I get this: the process terminates with all the records inserted into the database, but when I look at Task Manager in Windows, the sqlserver.exe process still occupies 3.663.263 MB of my memory and it is not released...
You can control the server memory configuration by setting MAX SERVER MEMORY. Here is a good overview document: https://msdn.microsoft.com/en-us/library/ms178067.aspx
The minimum allowed values for max server memory are: 32-bit: 64 MB; 64-bit: 128 MB.
Set min server memory and max server memory to span a range of memory
values. This method is useful for system or database administrators to
configure an instance of SQL Server in conjunction with the memory
requirements of other applications that run on the same computer.
Use min server memory to guarantee a minimum amount of memory
available to the SQL Server Memory Manager for an instance of SQL
Server. SQL Server will not immediately allocate the amount of memory
specified in min server memory on startup. However, after memory usage
has reached this value due to client load, SQL Server cannot free
memory unless the value of min server memory is reduced.
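If you prefer to script this instead of using SSMS, the option can be changed with sp_configure. Here is a minimal C# sketch (the connection string and the 4096 MB value are placeholders, not recommendations):

    using System.Data.SqlClient;

    // Sets 'max server memory' to 4096 MB via sp_configure.
    // The connection string and the 4096 value are placeholders.
    using (var conn = new SqlConnection("Server=.;Integrated Security=true"))
    {
        conn.Open();
        var sql = @"
            EXEC sp_configure 'show advanced options', 1;
            RECONFIGURE;
            EXEC sp_configure 'max server memory (MB)', 4096;
            RECONFIGURE;";
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }

Lowering max server memory caps the buffer pool, so SQL Server will hand memory back to the OS over time.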
Related
Using .NET in an Azure cloud service, with a process that has many SQL Server connection strings (SQL authentication, not Windows), what is the maximum number of connections that can be made to different databases?
I am putting together a system that needs to read/write to many MSSQL instances on different hosts at the same time, and I am looking to gather information/documentation on the limits. This is not the same as multiple connections going to the same database; this is, for example, 40 strings (and therefore 40 connection pools) to 40 different databases under different security contexts.
Thanks.
Maximum Capacity Specifications for SQL Server
User connections: 32767
See also
Maximum number of concurrent users connected to SQL Server 2008
For your question (maximum number of connection pools): I looked at the ADO.NET code and found that the pools are stored (_poolCollection) in a ConcurrentDictionary, so the maximum number of pools is the maximum number of entries in the dictionary. The documentation says:
For very large ConcurrentDictionary objects, you can increase the maximum array size to 2 gigabytes (GB) on a 64-bit system by setting the <gcAllowVeryLargeObjects> configuration element to true in the run-time environment.
I think there is no hard limit in practice; it depends on the machine.
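You can see the pool-per-connection-string behavior directly: ADO.NET keys each pool on the exact connection string, so 40 distinct strings mean 40 pools. A quick sketch (the host, database, and user names below are made up):

    using System.Data.SqlClient;

    // Each distinct connection string gets its own pool in ADO.NET.
    // The host/database/user names here are placeholders for illustration.
    for (int i = 0; i < 40; i++)
    {
        string cs = $"Server=host{i};Database=db{i};User Id=user{i};Password=***;";
        using (var conn = new SqlConnection(cs))
        {
            // conn.Open() would create (or reuse) the pool for this exact string.
        }
    }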
We're using an out-of-proc session state service/ASP.NET session state. We know we're having problems with this, as it's been abused in the past (too much stored in session state, too often), so we're in the process of moving to a more scalable system.
In the meantime, though, we're trying to get our heads around how a session state service manages its memory and what limits we have. But none of the Microsoft docs seem to go into any detail.
Specifically I want to know:
What are the limits on how much the "standard" out-of-proc session state service (installed along with IIS in the Windows management console) can store?
(x64)
Is there a per-user limit?
By the "standard" service, I mean the ASP.NET State Service listed in the Windows services console.
There is no limit beyond that of the machine hosting the service. If it has 16 gigs of RAM, assuming a few gigs are used for other processes / the OS / etc., there would be something like 13 GB of memory available for the session data. The data is not persisted to disk so the data only ever exists in RAM / memory; this is why when you restart the service all sessions are gone. The memory is volatile and works like a RAM disk.
If you're reaching the memory limits of the machine hosting your session state service, you are either storing too much data per user, or have too many users storing a little data. You're already on the right track as the next step is moving to a distributed session state provider to scale correctly. This is often achieved via a distributed caching system which comes with a session state provider, or by writing your own provider against said system.
There is no per-user limit on data, but note that out of process communication always happens via serialization. Therefore, there is a practical limit as serializing/deserializing a gig of user data per request is going to be very slow no matter how you approach it.
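If you want a rough feel for that per-user serialization cost, you can measure the serialized size of what you put into session. A sketch (BinaryFormatter is used here as a stand-in; the state service's actual wire format differs):

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    // Rough estimate of the serialized size of a session value.
    // BinaryFormatter approximates, but is not, the state service's format.
    static long EstimateSerializedSize(object sessionValue)
    {
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, sessionValue);
            return ms.Length;
        }
    }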
I have a memory-intensive C# app that resides on the same server as SQL Server.
I have tweaks in my application to limit in-memory caching and I am aware of how to set (and have set) maximum memory limits on my SQL Server.
PROBLEM: When the C# app wants to use more memory than is available due to SQL Server's caching, my app slows down greatly, presumably because it ventures into virtual memory (paging to disk). I can prevent this most of the time by eyeballing how much RAM is available and setting the appropriate values in my custom C# code (whether to cache to RAM and/or to SQL Server) and in SQL Server's settings.
HOWEVER: Sometimes the machine's memory usage goes beyond my eyeballed boundaries due to other processes on the machine, typical OS needs fluctuating, etc.
I have noticed that SQL Server will often yield RAM to other processes, such as Chrome and MS Word, but it doesn't seem to do so for my process. I have a gut feeling that my C# app isn't actively using all of the cached data in SQL Server...
So, how do I detect when SQL Server won't yield the RAM to my application and/or how do I detect when my application cannot allocate additional bytes of physical RAM?
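For reference, this is the kind of check I can do from C# today; it is only a sketch, and the 512 MB threshold is an arbitrary example, not a recommendation:

    using System;
    using System.Diagnostics;
    using System.Runtime;

    // Decide whether to cache in RAM based on available physical memory
    // and on whether the CLR could satisfy an allocation of this size.
    static bool CanCacheInRam(int neededMegabytes)
    {
        using (var available = new PerformanceCounter("Memory", "Available MBytes"))
        {
            if (available.NextValue() < 512) // arbitrary example threshold
                return false;
        }
        try
        {
            // MemoryFailPoint throws InsufficientMemoryException up front
            // instead of letting a large allocation fail later.
            using (new MemoryFailPoint(neededMegabytes))
            {
                return true;
            }
        }
        catch (InsufficientMemoryException)
        {
            return false;
        }
    }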
We have a WPF client that communicates with the server through a web service.
In the "thick" installation, the client, the server, and SQL Server are all installed on the same machine. The machine has 500 MB of RAM.
I would be grateful for tips about the following:
It is very difficult to find the bottleneck, because you can't use a profiler on such a machine without a heavy impact on the results. The profiler itself can cause page faults due to the missing memory, so the diagnostics will show irrelevant results.
In Task Manager I see that the ASP.NET process consumes 135 MB, which is too much. I tried to understand why it consumes so much, and I saw an area marked "undefined". Is that the ASP.NET process itself? Does the ASP.NET process have a big memory overhead? In this architecture (thick) I don't have multiple clients against the server. Do you have any memory advice for it?
The benchmark results vary widely; I think it is because of page faults. Do you have any advice?
Thank you very much. I come from the Java world, so I am new to this.
Download Process Explorer and see what needs the memory. Also check what starts with your server using Autoruns.
Now, 500 MB is very little memory for a server.
In my experience SQL Server needs memory, and IIS and ASP.NET also need memory, but SQL Server needs more, especially for its cache, to be fast, to do the indexing and all that. ASP.NET can run without needing too much memory, but it all depends on how you have set up your system: how many pools you have given to ASP.NET (I believe only one, with 500 MB), how many sites, how much memory your sites take, and so on.
If it's possible, move SQL Server to a better machine and have all the other computers connect to it. SQL Server needs memory.
A second possibility: move the asp.net session state from memory to SQL Server.
Limiting the memory use of IIS can be done as follows:
Go to the IIS MMC
-> click Application Pools
-> Right-click the pool
-> select Advanced Settings
-> go to the Recycling section, you will see there are two settings:
Private Memory Limit (KB)
Virtual Memory Limit (KB)
When either of these two settings is set and the worker process exceeds the
private or virtual memory quota, IIS will recycle that pool, which limits
the memory usage.
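If you would rather set those recycling limits from code than click through the MMC, the Microsoft.Web.Administration API can do it. A sketch (the pool name and the 200 MB limit are example values):

    using Microsoft.Web.Administration;

    // Sets the private-memory recycling limit (in KB) for an app pool.
    // "DefaultAppPool" and the 200 MB value are placeholders.
    using (var serverManager = new ServerManager())
    {
        var pool = serverManager.ApplicationPools["DefaultAppPool"];
        pool.Recycling.PeriodicRestart.PrivateMemory = 200 * 1024; // KB
        serverManager.CommitChanges();
    }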
You can limit SQL Server's memory use in the SQL MMC: go to the database server's properties.
Overall I would recommend splitting the database and the IIS server. 0.5 GB is not a lot of memory for a server.
So I am working in an embedded environment and there is a need to transfer all of the data from a SQL Server Compact Edition database (.sdf file) to a SQLite database. This would be a one-time operation, and my concern is: if I connect to the CE database to retrieve the data, will this cause additional memory and CPU resources to be used that stick around after I am done reading the data? My hope is that all the resources used by SQL Server CE will be garbage collected and freed up after I am done reading from the CE database; otherwise I may be forced to write my own .sdf parser/reader.
Does anyone know how the resources for SQL Server CE will be handled after I am done connecting to the database, and whether I can safely assume that they will be freed up after I read the data?
I believe you would have no concerns here. The CE and Express versions of SQL Server have imposed utilization limits (1 GB of memory and 1 physical CPU), and if you are doing a one-time batch process, the memory will be reclaimed after the process is stopped.
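As an aside, if you wrap the CE connection (and reader) in using blocks, the engine's unmanaged resources are released deterministically when Dispose runs, rather than whenever the GC gets to them. A rough sketch of the one-time copy (the file paths, the "Readings" table, and its columns are invented for illustration):

    using System.Data.SqlServerCe;
    using System.Data.SQLite;

    // One-time copy from a .sdf file to a SQLite database.
    // Paths, table name, and columns are placeholders.
    using (var ce = new SqlCeConnection(@"Data Source=C:\data\source.sdf"))
    using (var lite = new SQLiteConnection(@"Data Source=C:\data\target.db"))
    {
        ce.Open();
        lite.Open();

        using (var select = new SqlCeCommand("SELECT Id, Value FROM Readings", ce))
        using (var reader = select.ExecuteReader())
        using (var insert = new SQLiteCommand(
            "INSERT INTO Readings (Id, Value) VALUES (@id, @val)", lite))
        {
            while (reader.Read())
            {
                insert.Parameters.Clear();
                insert.Parameters.AddWithValue("@id", reader.GetInt32(0));
                insert.Parameters.AddWithValue("@val", reader.GetDouble(1));
                insert.ExecuteNonQuery();
            }
        }
    } // Dispose here closes both engines and frees their memory.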