Detecting C# and SQL Server memory conflicts

I have a memory-intensive C# app that resides on the same server as SQL Server.
I have tweaks in my application to limit in-memory caching and I am aware of how to set (and have set) maximum memory limits on my SQL Server.
PROBLEM: When the C# app wants to use more memory than is available because of SQL Server's caching, my app slows greatly, presumably because it ventures into virtual memory (paging). I can prevent this most of the time by eyeballing how much RAM is available and setting the appropriate values in my custom C# code (whether to cache to RAM and/or to SQL Server) and in SQL Server's settings.
HOWEVER: Sometimes the machine's memory usage goes beyond my eyeballed boundaries due to other processes on the machine, typical OS needs fluctuating, etc.
I have noticed that SQL Server will often yield RAM to other processes, such as Chrome or MS Word, but it doesn't seem to do so for my process. I have a gut feeling that my C# app isn't actively using all of the data SQL Server has cached...
So, how do I detect when SQL Server won't yield the RAM to my application and/or how do I detect when my application cannot allocate additional bytes of physical RAM?
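For what it's worth, here is a minimal C# sketch of the kind of check you could run before deciding whether to cache in-process (the helper names, threshold, and connection string are illustrative, not a prescribed API; querying sys.dm_os_process_memory requires VIEW SERVER STATE permission):

    using System.Data.SqlClient;
    using System.Diagnostics;

    static class MemoryProbe
    {
        // Free physical RAM on the box, via the OS performance counter.
        static float AvailableMegabytes()
        {
            using (var counter = new PerformanceCounter("Memory", "Available MBytes"))
                return counter.NextValue();
        }

        // How much physical memory the SQL Server process is actually using.
        static long SqlServerMemoryInUseKb(string connStr)
        {
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "SELECT physical_memory_in_use_kb FROM sys.dm_os_process_memory", conn))
            {
                conn.Open();
                return (long)cmd.ExecuteScalar();
            }
        }

        // Illustrative policy: only cache in-process when enough RAM is free.
        static bool CanCacheInRam(int requiredMb)
        {
            return AvailableMegabytes() > requiredMb;
        }
    }

Polling "Available MBytes" before each large allocation at least tells you when the app is about to be pushed into the page file, which is the slowdown described above.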

Related

Memory is not released after executing queries with SqlBulkCopy

I get that: The process is terminated with all the records inserted into the database, but when I look at Task Manager in Windows, the sqlserver.exe process still has 3.663.263 MB occupying my memory, and it is not released...
You can control the server memory configuration and set max server memory. Here is a good overview document: https://msdn.microsoft.com/en-us/library/ms178067.aspx
The minimum allowable max server memory values are: 32-bit: 64 MB; 64-bit: 128 MB.
Set min server memory and max server memory to span a range of memory values. This method is useful for system or database administrators to configure an instance of SQL Server in conjunction with the memory requirements of other applications that run on the same computer.

Use min server memory to guarantee a minimum amount of memory available to the SQL Server Memory Manager for an instance of SQL Server. SQL Server will not immediately allocate the amount of memory specified in min server memory on startup. However, after memory usage has reached this value due to client load, SQL Server cannot free memory unless the value of min server memory is reduced.
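As a concrete illustration, a sketch of setting max server memory from C# (the 2048 MB value and connection string are placeholders; this just wraps the documented sp_configure procedure, which can equally be run directly in SSMS):

    using System.Data.SqlClient;

    var connStr = "Server=.;Database=master;Integrated Security=true"; // placeholder
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(@"
        EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)', 2048; RECONFIGURE;", conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery(); // requires ALTER SETTINGS permission
    }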

Out of proc SessionState memory management

We're using an out-of-proc session state service / ASP.NET session state. We know we're having problems with this as it's been abused in the past (too much stored in session state, too often), so we're in the process of moving onto a more scalable system.
In the meantime, though, we're trying to get our heads around how a session state service manages its memory and what limits we have, but none of the Microsoft docs seem to go into any detail.
Specifically I want to know:
What is the limit on how much "the standard" out-of-proc session state service can store? (x64)
Is there a per-user limit?
By "the standard" service, I mean the one installed along with IIS in the Windows management console.
There is no limit beyond that of the machine hosting the service. If it has 16 GB of RAM, assuming a few GB are used for other processes / the OS / etc., there would be something like 13 GB of memory available for the session data. The data is not persisted to disk, so it only ever exists in RAM; this is why all sessions are gone when you restart the service. The memory is volatile and works like a RAM disk.
If you're reaching the memory limits of the machine hosting your session state service, you are either storing too much data per user, or have too many users storing a little data. You're already on the right track as the next step is moving to a distributed session state provider to scale correctly. This is often achieved via a distributed caching system which comes with a session state provider, or by writing your own provider against said system.
There is no per-user limit on data, but note that out-of-process communication always happens via serialization. Therefore, there is a practical limit: serializing/deserializing a gig of user data per request is going to be very slow no matter how you approach it.
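One practical consequence worth illustrating: because out-of-proc session state round-trips through serialization, everything you put in session must be serializable. A minimal sketch (UserCart is a made-up example type):

    using System;

    // StateServer/SQLServer session modes serialize stored objects,
    // so non-primitive types must be marked [Serializable].
    [Serializable]
    public class UserCart
    {
        public int UserId { get; set; }
        public string[] ItemCodes { get; set; }
    }

    // Session["cart"] = new UserCart { ... }; // serialized on every write

Keeping these types small is what keeps the per-request serialization cost down.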

Tips for performance on a memory-limited machine? (IIS application)

We have a WPF client that communicates with the server via a web service.
In the thick installation, the client, the server, and SQL Server are all installed on the same machine. The machine has 500 MB of RAM.
I would be grateful for tips on the following:
It's very difficult to find the bottleneck, because you can't use a profiler on such a machine without a heavy impact on the results; the profiler itself can cause page faults due to the missing memory, and the diagnostics will show irrelevant results.
In Task Manager I see that the ASP.NET process consumes 135 MB, which is too much. I tried to understand why it consumes so much and I saw an area marked "undefined". Is that the ASP.NET process itself? Does the ASP.NET process have a big memory overhead? In this (thick) architecture I don't have multiple clients against the server. Do you have any memory advice for it?
The benchmark results vary a lot; I think it is because of page faults. Do you have any advice?
Thank you very much. I come from the Java world, so I am new to this.
Download Process Explorer and see what needs the memory. Also check what starts with your server using Autoruns.
Now, 500 MB is very little memory for a server.
From my experience, SQL Server needs memory, and IIS and ASP.NET also need memory, but SQL Server needs more, especially for its cache, to stay fast, to do the indexing and all that. ASP.NET runs without needing too much memory, but it all depends on how you have set up your system: how many pools you have given to ASP.NET (I believe only one with 500 MB), how many sites, how much memory your sites use, and so on.
More suggestions:
If possible, move SQL Server to a better machine and have all the other computers connect to that one; SQL Server needs memory.
Second, if possible, move the ASP.NET session state from memory to SQL Server.
Limiting the memory use of IIS can be done as follows (a programmatic version follows the quoted settings below):
Go to the IIS MMC
-> click Application Pools
-> right-click the pool
-> select Advanced Settings
-> go to the Recycling section, where you will see two settings:
Private Memory Limit (KB)
Virtual Memory Limit (KB)
When either of these two settings is set, if the worker process exceeds the private or virtual memory quota, IIS will recycle that pool, which limits the memory usage.
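For what it's worth, the same two limits can be set programmatically; a sketch using the Microsoft.Web.Administration API (the pool name and limit values are placeholders, and this needs to run elevated):

    using Microsoft.Web.Administration;

    using (var mgr = new ServerManager())
    {
        var pool = mgr.ApplicationPools["DefaultAppPool"]; // placeholder name
        pool.Recycling.PeriodicRestart.PrivateMemory = 150 * 1024; // private bytes, KB
        pool.Recycling.PeriodicRestart.Memory = 300 * 1024;        // virtual memory, KB
        mgr.CommitChanges();
    }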
You can limit SQL Server's memory use in the SQL management console: go to the database server's properties and set the maximum server memory there.
Overall I would recommend splitting the database and the IIS server onto separate machines; 0.5 GB is not a lot of memory for a server.

Memory/Performance of Using SQL Server CE

So I am working in an embedded environment and there is a need to transfer all of the data from a SQL Server Compact Edition database (.sdf file) to a SQLite database. So this would be a one time operation and my concern is that if I connect to the CE database to retrieve the data, will this cause additional memory and CPU resources to be used and stick around after I am done reading the data? My hope is that all the resources used by SQL Server CE would be garbage collected and freed up after I am done reading from the CE database or else I may be forced to write my own .sdf parser/reader.
Does anyone know how the resources for SQL Server CE will be handled after I am done connecting to the database, and whether I can safely assume that they will be freed up after I read the data?
I believe you would have no concerns here. The CE and Express versions of SQL Server have imposed utilization limits: 1 GB of memory and 1 physical CPU. If you are doing a one-time batch process, the memory will be garbage collected after the process is stopped.
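If it's useful, a sketch of the one-time copy loop with deterministic cleanup (the table and column names are made up): wrapping the SQL CE connection, command, and reader in using blocks disposes their unmanaged resources as soon as the transfer finishes, rather than waiting for the GC.

    using System.Data.SqlServerCe;

    using (var conn = new SqlCeConnection(@"Data Source=|DataDirectory|\data.sdf"))
    using (var cmd = new SqlCeCommand("SELECT Id, Value FROM Readings", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // write each row into the SQLite database here
            }
        }
    } // connection disposed: SQL CE engine resources released here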

C#, SQL Server 2008: Streaming a large result set to the end user only works on some databases

I have a long running query that returns a large data set. This query is called from a web service and the results are converted to a CSV file for the end user. Previous versions would take 10+ minutes to run and would only return results to the end user once the query completes.
I rewrote the query to where it runs in a minute or so in most cases, and rewrote the way it is accessed so the results would be streamed to the client as they came into the asp.net web service from the database server. I tested this using a local instance of SQL Server as well as a remote instance without issue.
Now, on the cusp of production deployment it seems our production SQL server machine does not send any results back to the web service until the query has completed execution. Additionally, I found another machine, that is identical to the remote server that works (clones), is also not streaming results.
The version of SQL Server 2008 is identical on all machines. The production machine has a slightly different version of Windows Server installed (6.0 vs. 6.1). The production server has 4 cores and several times the RAM of the other servers; the other servers are single-core with 1 GB of RAM.
Is there any setting that would be causing this? Or is there any setting I can set that will prevent SQL Server from buffering the results?
Although I know this won't really affect the overall runtime at all, it will change the end-user perception greatly.
tl;dr;
I need the results of a query to stream to the end user as the query runs. It works with some database machines, but not on others. All machines are running the same version of SQL Server.
The gist of what I am doing in C#:
    var reader = cmd.ExecuteReader();
    Response.Write(getHeader());
    while (reader.Read())
    {
        Response.Write(getCSVForRow(reader));
        if (shouldFlush()) Response.Flush();
    }
Clarification based on response below
There are 4 database servers, Local, Prod, QA1, QA2. They are all running SQL Server 2008. They all have identical databases loaded on them (more or less, 1 day lag on non-prod).
The web service is hosted on my machine (though I have tested remotely hosted as well).
The only change between tests is the connection string in the web.config.
QA2 is working (streaming), and it is a clone of QA1 (VMs). The only difference between QA1 and QA2 is an added database on QA2 not related to this query at all.
QA1 is not working.
All tests include the maximum sized dataset in the result (we limit to 5k rows at this time). The browser displays a download dialog once the first flush happens. This is the desired result. We want them to know their download is processing, even if the download speed is low and at times drops to zero (such is the way with databases).
My flushing code is simple at this time. Every k rows we flush, with k currently set to 20.
The most perplexing part of this is the fact that QA1 and QA2 behave differently. I did notice our production server is set to compatibility level 2005 (90), whereas both the QA and local databases are set to 2008 (100); I doubt this matters. When I execute the sprocs through SSMS I see similar behavior across all machines: results stream in immediately.
Is there any connection string setting that could disable the streaming?
Everything I know says that what you're doing should work; both the DataReader and Response.Write()/.Flush() act in a "streaming" fashion and will result in the client getting the data one row at a time as soon as there are rows to get. Response does include a buffer, but you're pushing the buffer to the client after every read/write iteration which minimizes its use.
I'd check that the web service is configured to respond correctly to Flush() commands from the response. Make sure the production environment is not a Win2008 Server Core installation; Windows Server 2008 does not support Response.Flush() in certain Server Core roles. I'd also check that the conditions evaluated in ShouldFlush() will return true when you expect them to in the production environment (You may be checking the app config for a value, or looking at IIS settings; I dunno).
In your test, I'd try a much larger set of sample data; it may be that the production environment is exposing problems that are also present on the test environments, but with a smaller set of test data and a high-speed Ethernet backbone, the problem isn't noticeable compared to returning hundreds of thousands of rows over DSL. You can verify that it is working in a streaming fashion by inserting a Thread.Sleep(250) call after each Flush(); this will slow down execution of the service and let you watch the response get fed to your client at 4 rows per second.
Lastly, make sure that the client you're using in the production environment is set up to display CSV files in a fashion that allows for streaming. This basically means that a web browser acting as the client should not be configured to pass the file off to a third-party app. A web browser can easily display a text stream passed over HTTP; that's what it does, really. However, if it sees the stream as a CSV file, and it's configured to hand CSV files over to Excel to open, the browser will cache the whole file before invoking the third-party app.
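To make that verification step concrete, here is the loop from the question with the suggested sleep added (test instrumentation only, not production code):

    var reader = cmd.ExecuteReader();
    Response.Write(getHeader());
    while (reader.Read())
    {
        Response.Write(getCSVForRow(reader));
        if (shouldFlush())
        {
            Response.Flush();
            System.Threading.Thread.Sleep(250); // watch rows trickle to the client
        }
    }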
Put a new task that builds this huge CSV file in a task table.
Run the procedure to process this task.
Wait for the result to appear in your task table with SqlDependency.
Return the result to the client.
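A rough sketch of the waiting step with SqlDependency (the Tasks table, its columns, and taskId are hypothetical; Service Broker must be enabled on the database, and notification queries need an explicit column list and two-part table names):

    using System.Data.SqlClient;

    SqlDependency.Start(connStr); // once per app domain
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(
        "SELECT Status FROM dbo.Tasks WHERE Id = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", taskId);
        var dependency = new SqlDependency(cmd);
        dependency.OnChange += (s, e) =>
        {
            // task row changed: re-query, then return the CSV to the client
        };
        conn.Open();
        using (cmd.ExecuteReader()) { } // executing once registers the subscription
    }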
