Background
I recently came across an out-of-memory exception when users visited a few pages of my Kentico website. Fast forward: I found that the allocated memory (System > General) was over 2 GB! I then went to Debug > Clear cache and noticed the allocated memory sitting at roughly 400 MB (phew..). After that, users could visit the pages without any out-of-memory exception.
Question
Is there a way I could get these memory statistics via code (ideally C#)? I'm thinking of regularly monitoring these memory statistics and triggering an alert (send an email / post to a webhook from my C# code) when the allocated memory gets too high.
Additional information
Kentico version 9.0.42, hosted in Azure, scaled to 2 instances.
The App Service Plan's (in Azure) memory usage was roughly at 50% throughout - this rules out setting an alert at that level.
Thanks!
If you look at the code on the System page, you'll find your answer. Open the /CMSModules/System/Controls/System.ascx.cs file and search for Memory.Text. You'll find several SystemHelper methods that get the values for you.
SystemHelper.GetVirtualMemorySize()
SystemHelper.GetWorkingSetSize()
SystemHelper.GetPeakWorkingSetSize()
You can use the SystemHelper class to get statistics and memory data. It mostly relies on the .NET Process class and its properties. If you are on Azure, you can combine the above with PerformanceCounters to log your own sets of information into Application Insights and create alerts based on these counters.
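Since those SystemHelper methods wrap System.Diagnostics.Process under the hood, a plain .NET sketch of the check could look like this. The 1 GB threshold and the alert action are placeholder assumptions; inside Kentico you would call the SystemHelper methods above instead of Process directly:

```csharp
using System;
using System.Diagnostics;

class MemoryCheck
{
    // Hypothetical threshold - tune it to your own environment.
    const long AlertThresholdBytes = 1024L * 1024 * 1024; // 1 GB

    static void Main()
    {
        using (Process process = Process.GetCurrentProcess())
        {
            long workingSet = process.WorkingSet64;          // ~ SystemHelper.GetWorkingSetSize()
            long peakWorkingSet = process.PeakWorkingSet64;  // ~ SystemHelper.GetPeakWorkingSetSize()
            long virtualSize = process.VirtualMemorySize64;  // ~ SystemHelper.GetVirtualMemorySize()

            Console.WriteLine($"Working set: {workingSet / 1024 / 1024} MB " +
                              $"(peak {peakWorkingSet / 1024 / 1024} MB)");

            if (workingSet > AlertThresholdBytes)
            {
                // Placeholder: send an email or post to a webhook here.
                Console.WriteLine("ALERT: allocated memory is too high");
            }
        }
    }
}
```

In Kentico you could run a check like this from a scheduled task and replace the Console calls with your email/webhook code.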
Kentico recommends restarting your app once every 24 hours. I suggest you schedule restarts of your instances one after another some time during the night.
There doesn't seem to be anything in the Kentico API that would access the memory statistics, but maybe this answer will help you.
Related
I'm working on a web application that uses a number of external data sources for data we need to display on the front end. Some of the external calls are expensive and some also come with a monetary cost, so we need a way to persist the results of these external requests so that they survive e.g. an app restart.
I've started with a proof of concept, and my current solution is a combination of a persistent cache/storage (storing serialized JSON in files on disk) and a runtime cache. When the app starts it populates the runtime cache from the persistent cache; if the persistent cache is empty, it goes ahead and calls the web services. The next time the app restarts we load from the persistent cache, avoiding the calls to the external sources.
After the first population we want the cache to be updated in the background by some kind of update process on a given schedule. We also want this update process to be smart enough to only update the cache if the request to the web service was successful - otherwise keep the old version. There's also a twist: some web services might return a complete collection while others require one call per entity, so the update process might differ depending on the concrete web service.
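Not a library recommendation, but a minimal sketch of the two-tier design described above: an in-memory dictionary backed by one file per key, where a failed fetch leaves the old value untouched. All names here (FileBackedCache, Refresh, etc.) are illustrative, not from any existing package:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Two-tier cache: runtime dictionary + one file per key on disk.
public class FileBackedCache
{
    readonly string _directory;
    readonly Dictionary<string, string> _memory = new Dictionary<string, string>();

    public FileBackedCache(string directory)
    {
        _directory = directory;
        Directory.CreateDirectory(directory);
        // On startup, repopulate the runtime cache from the persistent one.
        foreach (var file in Directory.GetFiles(directory, "*.json"))
            _memory[Path.GetFileNameWithoutExtension(file)] = File.ReadAllText(file);
    }

    public bool TryGet(string key, out string json) => _memory.TryGetValue(key, out json);

    // Called by the scheduled update process. Only overwrites the cached
    // value when the fetch succeeds - otherwise the old version survives.
    public void Refresh(string key, Func<string> fetch)
    {
        try
        {
            string json = fetch();
            _memory[key] = json;
            File.WriteAllText(Path.Combine(_directory, key + ".json"), json);
        }
        catch (Exception)
        {
            // Fetch failed: keep the previously cached version.
        }
    }
}
```

The per-collection vs. per-entity difference can then live entirely in the `fetch` delegate you pass to `Refresh`, so the cache itself stays agnostic about how each web service is called.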
I'm thinking this scenario can't be totally unique, so I've looked around and done a fair bit of Googling, but I haven't found any patterns or libraries that deal with something like this.
So what I'm looking for is any patterns that might be useful for us, and any C# libraries or articles on the subject - I don't want to "reinvent the wheel". If anyone has solved similar problems I would love to hear more about how you approached them.
Thank you so much!
I'm looking for a way to retrieve performance data of an Azure cloud service. Specifically, I need CPU and memory usage statistics of the last 5/30/60 minutes.
Googling around, I found that this can be done by accessing Azure's default performance counters, but the documentation seems scarce and ambiguous as to how to do this programmatically. Also, I need to do this without making any manual configuration changes to the service after deployment.
Anybody got any idea?
Best regards,
Remus
Ideas? Yes. Will it fit your use case? I do not really know. What do you need to do with the data?
Have you thought about integrating Application Insights? https://azure.microsoft.com/en-US/documentation/articles/app-insights-cloudservices/ It allows collecting (custom) performance counter telemetry (https://azure.microsoft.com/en-US/documentation/articles/app-insights-cloudservices/#performance-counters).
If you need to do more than just see/monitor these counters, you can enable continuous export to a SQL database and collect the data in code from there. You can also define alerts based on certain values.
They are also working on a REST API, so you could get the raw data from there for further processing; see https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/4999529-make-data-accessible-via-apis-for-custom-processin.
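To give a feel for the custom-telemetry route: with the Microsoft.ApplicationInsights NuGet package you can push your own metric datapoints and then chart or alert on them in the portal. A minimal sketch, where the metric name "WorkingSetMB" is just an example of my choosing:

```csharp
using System;
using Microsoft.ApplicationInsights;

class MetricReporter
{
    static void Main()
    {
        // Assumes the instrumentation key is configured
        // (ApplicationInsights.config or programmatically).
        var telemetry = new TelemetryClient();

        // Report the current working set as a custom metric.
        double workingSetMb = Environment.WorkingSet / (1024.0 * 1024.0);
        telemetry.TrackMetric("WorkingSetMB", workingSetMb);

        telemetry.Flush(); // push the datapoint before the process exits
    }
}
```

Run something like this on a timer (or report counters per request) and the values show up as a custom metric you can alert on.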
It might be a bit overkill to use Application Insights for your specific scenario, however, since you only need the last hour.
You can use the Kudu API to get the CPU and memory usage of the w3wp processes running in your cloud service.
To access the Kudu service from a browser, go to https://[your-web-site-name].scm.azurewebsites.net.
There, the Process Explorer tab shows the CPU and memory information for the w3wp processes.
If you want to do it programmatically, you can build an HTTP client and access the data, for example:
GET https://[your-web-site-name].scm.azurewebsites.net/api/processes/ - to get all processes.
GET https://[your-web-site-name].scm.azurewebsites.net/api/processes/[process id] - to access each process and get its details.
For credentials, look in your publish profile and get the userName and userPWD values.
A nice example can be found at http://chriskirby.net/blog/running-your-azure-webjobs-with-the-kudu-api
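A rough sketch of such an HTTP client, using basic auth with the publish-profile credentials. The site name and credentials below are placeholders you would fill in from your own publish profile:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class KuduProcessQuery
{
    static async Task Main()
    {
        const string site = "your-web-site-name"; // placeholder
        const string user = "$your-web-site-name"; // userName from the publish profile
        const string pass = "publish-profile-password"; // userPWD from the publish profile

        using (var client = new HttpClient())
        {
            // Kudu accepts basic auth with the deployment credentials.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + pass));
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", token);

            // Returns a JSON array of processes; each entry carries
            // CPU and memory (working set) figures you can parse.
            string json = await client.GetStringAsync(
                $"https://{site}.scm.azurewebsites.net/api/processes/");
            Console.WriteLine(json);
        }
    }
}
```

From there you would deserialize the JSON (e.g. with Json.NET) and pick out the w3wp entries.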
I'm stuck figuring out what the next troubleshooting step should be, or what to research next.
I've got one web application running on one IIS 8 web server. Lately we are struggling with the performance of the web application. This morning it 'crashed' again - by which I mean it didn't respond to any new requests.
Because Perfmon is your friend, I fired off the following counters:
Processor Time (%)
Requests Executing
Requests/Sec
See the image I've included.
What I find interesting is that "Requests Executing" only increased at this point. And the CPU was not running at 100% to work through all these requests.
As I'm the main developer of this web application, I already optimized many high-CPU webpages.
With this SO thread I'm hoping to find the bottleneck - maybe with tips for extra logging, application code analysis, etc. I can provide more information about the setup or application when needed.
Hope you can help. Many thanks in advance.
Some specifications:
Windows Server 2012R2
IIS8
Intel Xeon Quad core CPU
Total of 4 GB memory
ASP.net 4.0 web application
User specs:
Google Analytics Real Time visitors between 100-500
Normal requests/sec average of 30
'Peak' requests/sec of 200-300
EDIT:
I ran a Perfmon data collector set for about an hour, exactly at the moment the website crashed. I used the PAL tool to analyze it. It only had many warnings about available memory being below 5%.
This memory issue is clearly an issue we will be resolving soon.
Another thing I noticed is that the list of "current requests" in IIS 8 was enormous - it contained hundreds of entries. I would expect these requests to reach a timeout value and return a Request Timed Out response to the users.
Here is a screenshot of the current healthy situation:
This list was endlessly long.
And another thing I just noticed: one request was taking more than 10 minutes(!) to deliver a byte[] to a user. Its 'state' was SendResponse. I presume this user was on a low-bandwidth device. As long as this user is downloading, one worker process is occupied. How should we prepare for these long-pending requests?
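For the slow-download case specifically, IIS has a minimum-bandwidth limit: clients that receive data slower than `minBytesPerSecond` are disconnected instead of holding a connection indefinitely. A hedged config sketch - this lives in applicationHost.config, and 240 bytes/sec is the IIS default, shown here only as an example value:

```xml
<!-- applicationHost.config, section system.applicationHost/webLimits:
     disconnect clients whose throughput drops below this rate (bytes/sec),
     so very slow downloads cannot tie up the server forever. -->
<webLimits minBytesPerSecond="240" />
```

For large byte[] downloads it is also worth considering streaming the response (or offloading static files to the kernel/IIS itself) rather than buffering them in managed code.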
Have a look at PAL at https://pal.codeplex.com/ - this will guide you in collecting a number of stats and then analyse them for you.
Along with CPU, other bottlenecks are memory, network, and disk IO. PAL should help you collect the appropriate stats - it has several pre-defined sets, e.g. for Web Server, DB Server, etc.
Technology: Asp.Net 4.5
Type: MasterPage Application (Transaction/Items/Configurator)
We are in Load Test Phase.
The application locks up when there is more than one user using it on the hosting environment (M6.Net). The application uses session variables to store DataTables. Also, almost all the pages have insert, update, and delete statements against SQL tables in the code-behind. I have Option Strict turned on. I'm trying to figure out if it's a SQL Server issue, a memory issue, or something else. Once one user gets locked up, the rest get locked up and eventually go to the custom error page, but there is no exception message. Has anyone experienced this issue, and what is a good way to troubleshoot it?
Would Glimpse be able to detect the issues? I have no experience with it.
Note: I did not post any code because I am at a loss as to what code I should post.
The application was running out of memory. I didn't realize the provider account I was writing the app for only had 32 MB of memory. So when several users were on and the memory peaked, the app pool would recycle, wiping out the session variables, and that was causing the errors.
We tested the application on another web server with a large amount of memory, and it worked as intended without any issues. We will be moving to a cloud environment with significantly more memory, which will allow 500-1000 simultaneous users. During this troubleshooting process I did find some memory leaks and fixed them. I will also be testing the session variables for values on postbacks, to make sure the app pool didn't recycle.
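That postback check can be as simple as a guard in a shared base page. A sketch, assuming a marker value is put into session at login; the class name, session key, and SessionExpired.aspx page are all hypothetical:

```csharp
using System;
using System.Web.UI;

// Hypothetical base class for the app's pages. A marker stored in
// session at login disappears if the app pool recycled mid-session.
public class SessionCheckedPage : Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        if (IsPostBack && Session["SessionMarker"] == null)
        {
            // Session was lost (e.g. an app pool recycle): bail out
            // gracefully instead of failing on the missing DataTables.
            Response.Redirect("~/SessionExpired.aspx"); // hypothetical page
        }
    }
}
```

Pages then inherit from SessionCheckedPage instead of Page, so every postback gets the check for free.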
#MatthewMartin - Thanks for all your input.
Say I created a WinForms app and distributed it to anonymous users, and I want a way to get statistics on how many times users open the app. One way I can think of is opening a lightweight web page on app startup and then analyzing how many times that page is opened.
Any other ways to get the statistics?
The best way is to use the existing tools out there and implement them in your application for usage statistics and analytics - for example, as mentioned before, EQATEC.
You also have PreEmptive Analytics: http://www.preemptive.com/products/runtime-intelligence/overview
It is a commonly used tool, especially across the whole ALM lifecycle, for tracking how applications and companies progress with minimal effort.
You also need to know exactly what kind of statistics you are monitoring here: the number of times your app is run on a particular version of the software? Particular screens being used within your app? Memory/CPU usage? Etc.
You also have Trackerbird: http://www.trackerbird.com/
There are many ways.
The initial idea of having your application "phone home" to count its usage isn't that bad, but it does not have to be an entire web page.
You could just post some data to a webservice.
If it can't connect to your web service, you could store the information locally and send it next time the app starts with a valid web connection.
If you do this asynchronously, it should be hardly noticeable when you start the app.
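A sketch of that phone-home flow: post a tiny usage ping at startup; if the service is unreachable, queue the ping in a local file and retry it on the next start. The endpoint URL, file name, and ping format here are all placeholders:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static class UsagePing
{
    const string Endpoint = "https://example.com/api/usage"; // placeholder URL
    static readonly string QueueFile =
        Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "pending-pings.txt");

    public static async Task SendAsync()
    {
        // Include any pings that failed to send on earlier runs.
        var pings = new List<string>();
        if (File.Exists(QueueFile))
            pings.AddRange(File.ReadAllLines(QueueFile));
        pings.Add($"start|{DateTime.UtcNow:o}");

        var failed = new List<string>();
        using (var client = new HttpClient())
        {
            foreach (var ping in pings)
            {
                try
                {
                    await client.PostAsync(Endpoint, new StringContent(ping));
                }
                catch (HttpRequestException)
                {
                    failed.Add(ping); // no connection: keep it for next startup
                }
            }
        }
        File.WriteAllLines(QueueFile, failed);
    }
}
```

Fire it off without blocking the UI, e.g. `Task.Run(() => UsagePing.SendAsync());` in Main or the main form's Load handler, so startup is hardly affected.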
You can have your own tracking service, which you consume every time the application starts. Alternatively, you can use any third-party application/service that provides this kind of functionality; Telerik EQATEC Analytics is one of that kind.