We have a WPF client that communicates with the server through a web service.
In the thick installation, the client, the server, and SQL Server are all installed on the same machine. The machine has 500 MB of RAM.
I would be grateful for tips on the following:
It is very difficult to find the bottleneck, because you can't use a profiler on such a machine without a heavy impact on the results. The profiler itself can cause page faults due to the missing memory, and the diagnostics will then show irrelevant results.
In Task Manager I see that the ASP.NET process consumes 135 MB, which is too much. I tried to understand why it consumes so much, and I saw an area marked as undefined. Is that the ASP.NET process itself? Does the ASP.NET process have a big memory overhead? In this (thick) architecture I don't have multiple clients against the server. Do you have any memory advice for it?
The benchmark results vary widely. I think it is because of page faults. Do you have any advice?
Thank you very much. I come from the Java world, so I am new to this.
Download Process Explorer and see what needs the memory. Also check what starts with your server using Autoruns.
Note that 500 MB is very little memory for a server.
From my experience, SQL Server needs memory, and IIS and ASP.NET need memory too, but SQL Server needs more, especially for its cache, to stay fast, to do the indexing, and all that. ASP.NET runs without needing too much memory, but it all depends on how you have set up your system: how many application pools you have given ASP.NET (I believe only one, with 500 MB), how many sites, how much memory your sites use, and so on.
A few more suggestions:
If possible, move SQL Server to a better machine and have all the other computers connect to that one. SQL Server needs memory.
Second, if possible, move the ASP.NET session state from memory to SQL Server.
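A minimal sketch of that change, assuming the default ASPState database and Windows authentication (the server name is a placeholder): register the session-state schema with the aspnet_regsql tool that ships with the framework, then point web.config at the database:

    aspnet_regsql.exe -S MYSERVER -E -ssadd -sstype p

    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=MYSERVER;Integrated Security=SSPI;"
                  timeout="20" />

This moves session data out of the worker process, at the cost of a serialization round trip per request.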
Limiting the memory use of IIS can be done as follows:
Go to the IIS MMC
-> click Application Pools
-> Right-click the pool
-> select Advanced Settings
-> go to the Recycling section, you will see there are two settings:
Private Memory Limit (KB)
Virtual Memory Limit (KB)
When either of these two settings is set and the worker process exceeds the private or virtual memory quota, IIS will recycle that pool, which limits its memory usage.
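If you prefer scripting to clicking through the MMC, the same limit can be set with appcmd; a sketch, where the pool name and the 200 MB (204800 KB) value are just examples:

    %windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /recycling.periodicRestart.privateMemory:204800

Both limits are specified in KB.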
You can limit SQL Server's memory use in the SQL Server MMC: go to the database server's properties.
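The same cap can be applied with T-SQL instead of the GUI; a sketch, assuming you want to leave SQL Server about 200 MB:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 200;
    RECONFIGURE;

Without a cap, SQL Server will grab as much memory as it can for its buffer pool, which on a 500 MB box starves IIS and everything else.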
Overall I would recommend splitting the database and the IIS server onto separate machines. 0.5 GB is not a lot of memory for a server.
Related
I have a shared hosting account with 128MB of RAM and my site is in its own app pool.
The site is small and gets low traffic, but I keep getting the following error:
A worker process serving application pool 'xxx v4.0 (Classic)' has requested a recycle because it reached its private bytes memory limit.
This is happening frequently, which restarts the app pool. If the app pool restarts too often, eventually it will stop. Then I'll get a 503 error when I go to the site.
The site is written using c#, with data access from ef and ado.net. All my database connections are in using statements and I am confident they are being opened and closed correctly.
I have spoken to the host and I can upgrade the RAM to 256MB which does appear to make the site run nicely. But I am a bit concerned that just upgrading the RAM is only masking the problem temporarily.
Debug is set to false in the web.config, and before I copy the files to the server I build for release.
When I run the solution in Visual Studio, my IIS worker process hovers around 100 MB.
I think my questions are:
Is there any way I can replicate my hosting environment on my local machine?
Is it normal for a fairly small website to exceed 128MB of RAM?
I am at a bit of a loss of what to try. Any help or guidance would be greatly appreciated.
Other potentially important info:
.NET Framework is 4.5
Web Forms
AjaxControlToolkit is used (only the scripts I need are loaded)
I've looked at many blog posts and similar questions but I can't seem to make any progress.
Thanks
Jim
That message is about hitting the configured limit within IIS itself; it does not necessarily have anything to do with the amount of RAM on the host (although the settings you set within IIS should take your aggregate RAM into account, so there is an indirect link).
Open IIS
Left Click on "Application Pools"
Find your dedicated pool and right click on it, selecting "Recycling..."
Check the "Private memory usage (in KB):" value
That is what you are exceeding
[These instructions are based on IIS 7.5 but are similar for other versions]
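If your host gives you any console access, you can also read the configured quota directly with appcmd rather than clicking through the UI (using the pool name from your error message):

    %windir%\system32\inetsrv\appcmd list apppool "xxx v4.0 (Classic)" /text:recycling.periodicRestart.privateMemory

A value of 0 means no limit; anything else is the quota in KB that your worker process keeps hitting.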
I am using a simple MVC application with an image and a button, but on every postback the memory used by the development server increases, reaching about 100 MB after about 50 postbacks.
Is this a known issue with the development server and web applications? Also, MVC is supposed to be stateless, so why is the development server taking up so much space on postbacks? I'm asking because if I host a complex MVC application on my server and a large number of users access my site at the same time, this could crash my server. Does IIS work the same way? Any thoughts on this?
Actually, you don't have to worry about the server's memory usage, but at the same time you must follow best practices when writing your code. The runtime decides by itself the best time to collect garbage and free unused memory.
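If you want to convince yourself the growth is deferred cleanup rather than a leak, a minimal diagnostic sketch (forcing a collection like this is for testing only, never for production code):

    using System;

    // Compare the managed heap size before and after forcing a full collection.
    long before = GC.GetTotalMemory(forceFullCollection: false);
    long after = GC.GetTotalMemory(forceFullCollection: true);
    Console.WriteLine($"Heap before: {before / 1024} KB, after: {after / 1024} KB");

If the heap drops sharply after the forced collection, the memory you saw climbing was simply waiting to be collected.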
I have a memory-intensive C# app that resides on the same server as SQL Server.
I have tweaks in my application to limit in-memory caching and I am aware of how to set (and have set) maximum memory limits on my SQL Server.
PROBLEM: When the C# app wants to use more memory than is available because of SQL Server's caching, my app slows greatly, presumably because it ventures into virtual memory. I can prevent this most of the time by eyeballing how much RAM is available and setting the appropriate values in my custom C# code (whether to cache to RAM and/or to SQL Server) and in SQL Server's settings.
HOWEVER: Sometimes the machine's memory usage goes beyond my eyeballed boundaries due to other processes on the machine, typical OS needs fluctuating, etc.
I have noticed that SQL Server will often yield RAM to other processes, such as Chrome, MS Word... it doesn't seem to do so for my process. I have a gut feeling that my C# app isn't actively using all of the cached data in SQL Server...
So, how do I detect when SQL Server won't yield the RAM to my application and/or how do I detect when my application cannot allocate additional bytes of physical RAM?
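For the second half of the question, .NET has a built-in probe for whether a large allocation is likely to succeed: System.Runtime.MemoryFailPoint. A minimal sketch, where the 64 MB figure is an arbitrary example:

    using System;
    using System.Runtime;

    try
    {
        // Throws InsufficientMemoryException up front if ~64 MB is unlikely to be available.
        using (new MemoryFailPoint(sizeInMegabytes: 64))
        {
            // Safe to build the large in-memory cache entry here.
        }
    }
    catch (InsufficientMemoryException)
    {
        // Fall back: skip the RAM cache and let SQL Server do the work.
    }

Note that MemoryFailPoint checks address space and the commit limit rather than physical RAM specifically, so if paging is your real concern, pair it with the Memory\Available MBytes performance counter.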
(Sorry if this is a really long question, it said to be specific)
The company I work for has a number of sites which have been running for some time with no problems. The applications are a mix of ASP.NET 2.0, 3.5, and 4.0, all using ADO.NET to connect to a SQL Server Standard instance (on the same web server), all hosted with IIS7.
The problem began when we moved to an upgraded webserver. We made every effort to set up the server, db instance and IIS with the exact same settings (except for the different machine name, and the fact that we had upgraded from SQLExpress to Standard), and as far as we could tell, we did. Both servers are running Windows Server 2008 R2 (all current updates applied), and received a default install.
The problem is very apparent when starting up one of these applications. When you reach the login page of our application, the page itself loads extremely fast. This is true even when you load the page from a new machine that could not possibly have the page cached, with IIS caching disabled. The problem actually becomes visible when you enter your login information and click the login button. Because of the (not great) design of our databases, the login process must access a number of databases, theoretically up to 150 separate DBs, but in practice usually 2. The problem occurs even when only 2 databases (the minimum) are opened. Not a great design, but we have to live with it for now.
When trying to initially open a connection to the database, the entire process stops for about 20 seconds every time, regardless of whether you are connecting to 2 DBs or 40. I have run a .NET profiler (JetBrains dotTrace) against the process, and the only information I could take from it was that one or all of the calls to SqlConnection.Open() accounted for 90% of the time. This only happens on first use of the application, but the problem is compounded by the fact that IIS seems to disregard the recycling settings we have set for it and recycles the application after a few minutes of idle, causing the problem to occur again.
I also tried to use the SQL Server Profiler to see which database operations were causing the slowdown, but because of all the other DB activity (and the fact that I had to do this on our production server, because the problem doesn't occur in our test environments) I couldn't pin down the exact operation that was causing the stoppage. I will try coming in late at night and shutting down the production sites to run the SQL profiler, but I might not be able to do this right away.
In the course of researching the problem, I have tried a couple of solutions:
Thinking it might be a name-resolution problem, I tried modifying the hosts file on the web server as well as giving the connection strings an IP address instead of the server name to resolve, with no difference. I have heard of the LLMNR protocol causing problems like this, but I think trying to connect by IP or resolving through the hosts file should have eliminated that possibility, though I admit I never tried actually turning off LLMNR.
I have increased the idle timeouts, recycling intervals, etc. in IIS, but these don't even seem to be respected, much less solve the problem. This leads me to believe there is a setting on the machine overriding the IIS application settings.
Multiple other code fixes, none of which made any difference. Is a SQL Server setting causing the problem?
Other things that I have forgotten by now.
Any ideas, experience or whatevers would be greatly appreciated in helping me solve this problem!
I would advise using a non-TCP connection if you are still running the SQL instance on the local machine. SQL Server supports several protocols; TCP, named pipes, and shared memory are the most common.
Named Pipes
Data Source=np:computer\instance
Shared Memory
Data Source=lpc:computer\instance
Personally I prefer shared memory. Remember that you need to enable these protocols, and to avoid configuration mistakes I suggest you disable any you are not using.
see http://msdn.microsoft.com/en-us/library/ms187892.aspx
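For completeness, a sketch of what this looks like from ADO.NET (server, instance, and database names are placeholders):

    using System.Data.SqlClient;

    // The lpc: prefix forces shared memory; np: would force named pipes.
    var connStr = @"Data Source=lpc:MYSERVER\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=SSPI;";
    using (var conn = new SqlConnection(connStr))
    {
        conn.Open(); // bypasses the TCP stack when client and server share a machine
    }

Shared memory only works when the client and SQL Server are on the same machine, which matches your setup.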
IIS Reset
In IIS7 there are two ways to configure the idle timeout. Both begin by clicking on the "Application Pools" section and right-clicking the appropriate pool. If you click the "Recycling..." option there is one setting. The other is in "Advanced Settings...": under the "Process Model" section you will find "Idle Time-out (minutes)", which, when set to zero, disables the process timeout. This latter option is the one that works for us.
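The same change can be scripted with appcmd, which is also a quick way to verify nothing is silently reverting your setting (the pool name is a placeholder):

    %windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /processModel.idleTimeout:00:00:00

An idle timeout of 00:00:00 disables idle shutdown entirely.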
If I were you I'd solve this problem first, as restarting the app domain and/or worker process is always painful even if you don't have a 20-second lag.
Some ideas:
from the web server, can you ping the db server and get a "normal" response, or are you seeing a similar delay?
if you're seeing a delay, run a tracert to see if you can nail down where the slowness is occurring
try using a tool like QueryExpress (http://www.albahari.com/queryexpress.aspx) which doesn't require an install to run. You can download this EXE and run it from your web server. See if you can connect to your db using this and run queries in a normal fashion.
Try something like SysInternals' TcpView (http://technet.microsoft.com/en-us/sysinternals/bb897437) to take a look at your open connections and see what activity is happening on your server and how much data is being sent to and received from your db server.
Just some initial thoughts on where I'd start to look based upon your problem description. I hope this helps. Good luck with things!
With IIS not respecting recycling settings: did restarting IIS/rebooting change the behavior?
I'm working on a graduation project for one of my university courses, and I need to find somewhere to run several crawlers I wrote in C#. With no web hosting experience, I'm a bit lost. Is this something that any site allows? Do I need a special host that gives more access to the server? The crawler is a simple app that does its work, then periodically writes information to a remote database.
A web crawler simulates a normal user. It accesses sites the way browsers do, getting the HTML (JavaScript, etc.) returned from the server, with no internal access to server code. Given that, any site can be crawled.
Be aware of web crawler ethics guidelines. There are pages you shouldn't index or whose links you shouldn't follow, and web developers publish files and instructions telling crawlers what they may index or follow; see the sketch below.
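A minimal sketch of honoring that convention in C#: fetch the site's robots.txt and collect the paths it disallows before crawling (this is a naive parser for illustration, not a full implementation of the robots.txt spec):

    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Threading.Tasks;

    static async Task<string[]> GetDisallowedPathsAsync(string baseUrl)
    {
        using (var http = new HttpClient())
        {
            string txt = await http.GetStringAsync(new Uri(new Uri(baseUrl), "/robots.txt"));
            return txt.Split('\n')
                      .Where(l => l.StartsWith("Disallow:", StringComparison.OrdinalIgnoreCase))
                      .Select(l => l.Substring("Disallow:".Length).Trim())
                      .Where(p => p.Length > 0)
                      .ToArray();
        }
    }

A real crawler should also respect per-user-agent sections and directives such as Crawl-delay, but this shows the idea.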
If you can't run it off your desktop for some reason, you'll need a host that lets you execute arbitrary C# code. Most cheap web servers don't do this due to the potential security implications, since there will be several other people running on the same server.
This means you'll need to be on a server where you have your own OS. Either a VPS - Virtual Private Server, where virtualization is used to give you your own OS but share the hardware - or your own dedicated server, where you have both the hardware and software to yourself.
Note that if you're running on a server that's shared in any way, you'll need to make sure to throttle yourself so as to not cause problems for your neighbors; your primary issue will be not using too much CPU or bandwidth. This isn't just for politeness - most web hosts will suspend your hosting if you're causing problems on their network, such as denying the other users of the hardware you're on resources by consuming them all yourself. You can usually burst higher usage levels, but they'll cut you off if you sustain them for a significant period of time.
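A sketch of what self-throttling can look like in the crawler itself (the one-request-per-second pace is an arbitrary example; tune it to your host's policy):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    static async Task CrawlPolitelyAsync(string[] urls)
    {
        using (var http = new HttpClient())
        {
            foreach (var url in urls)
            {
                string html = await http.GetStringAsync(url);
                Console.WriteLine($"Fetched {url}: {html.Length} chars");
                await Task.Delay(TimeSpan.FromSeconds(1)); // cap bandwidth and CPU per your host's limits
            }
        }
    }

Sequential fetching with a fixed delay is crude, but it keeps both CPU and bandwidth usage predictably low.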
This doesn't seem to have anything to do with web hosting. You just need a machine with an internet connection and a database server.
I'd check with your university if I were you. At least in my time, a lot was possible to arrange in-house when it came to graduation projects.
Failing that, you could look into a simple VPS (Virtual Private Server) account. Unless you are sure your app runs under Mono, you will need a Windows one. The resource limits are usually a lot lower than you'd get from a dedicated server, but they're relatively affordable. Some will offer an MS SQL Server database you can use next to the VPS account (on another machine); installing SQL Server on the VPS itself can be a problem license-wise.
Make sure you check the terms of usage before you open an account, as well as the (virtual) system specs though. Also check if there is some kind of minimum contract period. Sometimes this can be longer than a single month, especially if there is no setup fee.
If at all possible, find a host that's geographically close to you. A server on the other side of the world can get a little annoying to access remotely using Remote Desktop.
80legs lets you use their crawlers to process millions of web pages with your own program.
The rates are:
$2.00 per million pages
$0.03 per CPU-hour
They claim to crawl 2 billion web pages a day.
You will need a VPS (Virtual Private Server) or a full-on dedicated server. Crawlers are nothing more than applications that "crawl" the internet. While you could set up a web site to be a crawler, it is not practical, because the page would have to be requested for your crawler to run. You will have to read the ToS (Terms of Service) for the host to see what the terms of usage are. Some of the lower-priced hosts will cut your connection, citing "negatively impacting the network", if you try to use too much bandwidth, even though they have given you plenty to use.
VPSes run around $30-80 for a Linux server and $60+ for a Windows server.
Dedicated servers run $100+ for both Linux and Windows.
You don't need any web hosting to run your spider. Just ask for a PC with an internet connection that can act as a dedicated server, configure the database, and run the crawler from there.