Performance Counter - System.InvalidOperationException: Category does not exist - c#

I have the following class that returns the number of current Requests per Second of IIS. I call RefreshCounters every minute in order to keep the Requests per Second value fresh (because it is an average, and if I keep the counter too long, old values influence the result too much)... and when I need to display the current RequestsPerSecond I read that property.
public class Counters
{
private static PerformanceCounter pcReqsPerSec;
private const string counterKey = "Requests_Sec";
public static object RequestsPerSecond
{
get
{
lock (counterKey)
{
if (pcReqsPerSec != null)
return pcReqsPerSec.NextValue().ToString("N2"); // EXCEPTION
else
return "0";
}
}
}
internal static string RefreshCounters()
{
lock (counterKey)
{
try
{
if (pcReqsPerSec != null)
{
pcReqsPerSec.Dispose();
pcReqsPerSec = null;
}
pcReqsPerSec = new PerformanceCounter("W3SVC_W3WP", "Requests / Sec", "_Total", true);
pcReqsPerSec.NextValue();
PerformanceCounter.CloseSharedResources();
return null;
}
catch (Exception ex)
{
return ex.ToString();
}
}
}
}
The problem is that the following exception is sometimes thrown:
System.InvalidOperationException: Category does not exist.
at System.Diagnostics.PerformanceCounterLib.GetCategorySample(String machine, String category)
at System.Diagnostics.PerformanceCounter.NextSample()
at System.Diagnostics.PerformanceCounter.NextValue()
at BidBop.Admin.PerfCounter.Counters.get_RequestsPerSecond() in [[[pcReqsPerSec.NextValue().ToString("N2");]]]
Am I not closing previous instances of PerformanceCounter properly? What am I doing wrong so that I end up with that exception sometimes?
EDIT:
And just for the record, I am hosting this class in an IIS website (which is, of course, hosted in an App Pool that has administrative privileges) and invoking the methods from an ASMX service. The site that uses the counter values (displays them) calls RefreshCounters every 1 minute and RequestsPerSecond every 5 seconds; RequestsPerSecond values are cached between calls.
I am calling RefreshCounters every 1 minute because the values tend to become "stale" - too influenced by older values (that were current 1 minute ago, for example).

Antenka has led you in a good direction here. You should not be disposing and re-creating the performance counter on every update/request for a value. There is a cost to instantiating performance counters, and the first read can be inaccurate, as indicated in the quote below. Also, your lock() { ... } statements are very broad (they cover a lot of statements) and will be slow. It's better to keep your locks as small as possible. I'm giving Antenka an upvote for the quality reference and good advice!
However, I think I can provide a better answer for you. I have a fair bit of experience with monitoring server performance and understand exactly what you need. One problem your code doesn't take into account is that whatever code is displaying your performance counter (.aspx, .asmx, console app, winform app, etc.) could be requesting this statistic at any rate; it could be requested once every 10 seconds, maybe 5 times per second - you don't know and shouldn't care. So you need to separate the PerformanceCounter collection code that does the monitoring from the code that actually reports the current Requests / Second value. And for performance reasons, I'm also going to show you how to set up the performance counter on first request and then keep it going until nobody has made any requests for 5 seconds, then close/dispose the PerformanceCounter properly.
public class RequestsPerSecondCollector
{
#region General Declaration
//Static Stuff for the polling timer
private static System.Threading.Timer pollingTimer;
private static int stateCounter = 0;
private static int lockTimerCounter = 0;
//Instance Stuff for our performance counter
private static System.Diagnostics.PerformanceCounter pcReqsPerSec;
private readonly static object threadLock = new object();
private static decimal CurrentRequestsPerSecondValue;
private static int LastRequestTicks;
#endregion
#region Singleton Implementation
/// <summary>
/// Static members are 'eagerly initialized', that is,
/// immediately when class is loaded for the first time.
/// .NET guarantees thread safety for static initialization.
/// </summary>
private static readonly RequestsPerSecondCollector _instance = new RequestsPerSecondCollector();
#endregion
#region Constructor/Finalizer
/// <summary>
/// Private constructor for static singleton instance construction, you won't be able to instantiate this class outside of itself.
/// </summary>
private RequestsPerSecondCollector()
{
LastRequestTicks = System.Environment.TickCount;
// Start things up by making the first request.
GetRequestsPerSecond();
}
#endregion
#region Getter for current requests per second measure
public static decimal GetRequestsPerSecond()
{
if (pollingTimer == null)
{
Console.WriteLine("Starting Poll Timer");
// Let's check the performance counter every 1 second, and don't do the first time until after 1 second.
pollingTimer = new System.Threading.Timer(OnTimerCallback, null, 1000, 1000);
// The first read from a performance counter is notoriously inaccurate, so take one reading now and throw it away.
OnTimerCallback(null);
}
LastRequestTicks = System.Environment.TickCount;
lock (threadLock)
{
return CurrentRequestsPerSecondValue;
}
}
#endregion
#region Polling Timer
static void OnTimerCallback(object state)
{
if (System.Threading.Interlocked.CompareExchange(ref lockTimerCounter, 1, 0) == 0)
{
if (pcReqsPerSec == null)
pcReqsPerSec = new System.Diagnostics.PerformanceCounter("W3SVC_W3WP", "Requests / Sec", "_Total", true);
if (pcReqsPerSec != null)
{
try
{
lock (threadLock)
{
CurrentRequestsPerSecondValue = Convert.ToDecimal(pcReqsPerSec.NextValue().ToString("N2"));
}
}
catch (Exception) {
// We had problem, just get rid of the performance counter and we'll rebuild it next revision
if (pcReqsPerSec != null)
{
pcReqsPerSec.Close();
pcReqsPerSec.Dispose();
pcReqsPerSec = null;
}
}
}
stateCounter++;
//Check every 5 seconds or so if anybody is still monitoring the server PerformanceCounter, if not shut down our PerformanceCounter
if (stateCounter % 5 == 0)
{
if (System.Environment.TickCount - LastRequestTicks > 5000)
{
Console.WriteLine("Stopping Poll Timer");
pollingTimer.Dispose();
pollingTimer = null;
if (pcReqsPerSec != null)
{
pcReqsPerSec.Close();
pcReqsPerSec.Dispose();
pcReqsPerSec = null;
}
}
}
System.Threading.Interlocked.Add(ref lockTimerCounter, -1);
}
}
#endregion
}
Ok now for some explanation.
First you'll notice this class is designed to be a static singleton. You can't load multiple copies of it; it has a private constructor and an eagerly initialized internal instance of itself. This makes sure you don't accidentally create multiple copies of the same PerformanceCounter.
Next you'll notice in the private constructor (this will only run once, when the class is first accessed) we create both the PerformanceCounter and a timer which will be used to poll the PerformanceCounter.
The Timer's callback method will create the PerformanceCounter if needed and get its next value if one is available. Also, every 5 iterations we're going to see how long it's been since your last request for the PerformanceCounter's value. If it's been more than 5 seconds, we'll shut down the polling timer as it's unneeded at the moment. We can always start it up again later if we need it again.
Now we have a static method called GetRequestsPerSecond() for you to call, which will return the current value of the RequestsPerSecond PerformanceCounter.
The benefits of this implementation are that you only create the performance counter once and then keep using it until you are finished with it. It's easy to use because you simply call RequestsPerSecondCollector.GetRequestsPerSecond() from wherever you need it (.aspx, .asmx, console app, winforms app, etc.). There will always be only one PerformanceCounter and it will always be polled exactly once per second, regardless of how quickly you call RequestsPerSecondCollector.GetRequestsPerSecond(). It will also automatically close and dispose of the PerformanceCounter if you haven't requested its value in more than 5 seconds. Of course you can adjust both the timer interval and the timeout milliseconds to suit your needs: you could poll faster and time out in, say, 60 seconds instead of 5. I chose 5 seconds as it proves that it works very quickly while debugging in Visual Studio. Once you test it and know it works, you might want a longer timeout.
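For example, a hypothetical ASMX web method (the method name is illustrative, not from the original post) only needs a one-liner to expose the value:

[WebMethod]
public string GetRequestsPerSecond()
{
    // The collector manages the PerformanceCounter's lifetime itself; we just read the latest cached value.
    return RequestsPerSecondCollector.GetRequestsPerSecond().ToString("N2");
}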
Hopefully this helps you not only better use PerformanceCounters, but also feel safe to reuse this class which is separate from whatever you want to display the statistics in. Reusable code is always a plus!
EDIT: As a follow-up question, what if you want to perform some cleanup or babysitting task every 60 seconds while this performance counter is running? Well, we already have the timer running every 1 second and a variable tracking our loop iterations called stateCounter, which is incremented on each timer callback. So you could add in some code like this:
// Every 60 seconds I want to close/dispose my PerformanceCounter
if (stateCounter % 60 == 0)
{
if (pcReqsPerSec != null)
{
pcReqsPerSec.Close();
pcReqsPerSec.Dispose();
pcReqsPerSec = null;
}
}
I should point out that the performance counter in this example should not "go stale". I believe "Requests / Sec" is an average and not a moving average statistic. But this sample just illustrates a way you could do any type of cleanup or "babysitting" of your PerformanceCounter on a regular time interval. In this case we are closing and disposing the performance counter, which will cause it to be recreated on the next timer callback. You could modify this for your use case and according to the specific PerformanceCounter you are using. Most people reading this question/answer should not need to do this. Check the documentation for your desired PerformanceCounter to see if it is a continuous count, an average, a moving average, etc., and adjust your implementation appropriately.

I don't know if this applies to you... I've read the article for the PerformanceCounter.NextValue Method
And there was a comment:
// If the category does not exist, create the category and exit.
// Performance counters should not be created and immediately used.
// There is a latency time to enable the counters, they should be created
// prior to executing the application that uses the counters.
// Execute this sample a second time to use the category.
So, I have a question which can lead to an answer: isn't the call to the RequestsPerSecond method happening too early?
Also, I would suggest you try checking whether the category exists and log the info somewhere, so we can analyze it and determine under which conditions this happens and how often.
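A hedged sketch of that diagnostic check, which could sit at the top of the RequestsPerSecond getter (the Trace log sink is just an assumption for illustration):

// Guard the read: if the category has not been registered yet, log it and fall back instead of letting NextValue() throw.
if (!PerformanceCounterCategory.Exists("W3SVC_W3WP"))
{
    Trace.TraceWarning("W3SVC_W3WP category missing at {0:o}; the IIS worker process is probably not warmed up yet.", DateTime.Now);
    return "0";
}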

I just solved this type of error or exception with:
Using,
new PerformanceCounter("Processor Information", "% Processor Time", "_Total");
Instead of,
new PerformanceCounter("Processor", "% Processor Time", "_Total");

I had an issue retrieving requests per second on IIS using code similar to the following
var pc = new PerformanceCounter();
pc.CategoryName = @"W3SVC_W3WP";
pc.InstanceName = @"_Total";
pc.CounterName = @"Requests / Sec";
Console.WriteLine(pc.NextValue());
This would sometimes throw InvalidOperationException, and I was able to reproduce the exception by restarting IIS. If I run with a non-warmed-up IIS, e.g. after a laptop reboot or an IIS restart, then I get this exception. Hit the website first, make any HTTP request beforehand, wait a second or two, and I don't get the exception. This smells like the performance counters are cached, and when idle they get dumped and take a while to re-cache? (or similar).
Update1: Initially, when I manually browse to the website and warm it up, it solves the problem. I've tried programmatically warming up the server with new WebClient().DownloadString() and Thread.Sleep() of up to 3000 ms, and this has not worked. So my results from manually warming up the server might somehow be a false positive. I'm leaving my answer here, because it might be the cause (i.e. manual warming up), and maybe someone else can elaborate further?
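One workaround I have not fully verified (a sketch only, not a tested fix) is to tolerate that warm-up window with a small retry loop instead of relying on a fixed Thread.Sleep():

static float? TryReadRequestsPerSec(int attempts = 5)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            using (var pc = new PerformanceCounter("W3SVC_W3WP", "Requests / Sec", "_Total", true))
            {
                pc.NextValue();                 // first sample of a rate counter is unreliable
                Thread.Sleep(1000);             // give the counter a full interval
                return pc.NextValue();
            }
        }
        catch (InvalidOperationException)
        {
            Thread.Sleep(1000);                 // category not registered yet; wait and retry
        }
    }
    return null;                                // still cold after all attempts
}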
Update2: Ah, ok, here are some unit tests that summarise some learning from further experimenting I did yesterday. (There's not a lot on Google on this subject, btw.)
As far as I can reason, the following statements might be true (and I submit the unit tests underneath as evidence). I may have misinterpreted the results, so please double check ;-D
1. Creating a performance counter and calling NextValue before the category exists, e.g. querying an IIS counter while IIS is cold and no process is running, will throw an InvalidOperationException "category does not exist". (I assume this is true for all counters, and not just IIS.)
2. From within a Visual Studio unit test, once your counter throws an exception, if you subsequently warm up the server after the first exception and create a new PerformanceCounter and query again, it will still throw an exception! (This one was a surprise; I assume this is because of some singleton action. My apologies, I have not had enough time to decompile the sources to investigate further before posting this reply.)
3. In 2 above, if you mark the unit test with [STAThread], then I was able to create a new PerformanceCounter after one has failed. (This might have something to do with performance counters possibly being singletons? Needs further testing.)
4. No pause was required for me before creating the counter and using it, despite some warnings in the MSDN sample code documentation, other than the time it takes to create the performance counter itself before calling NextValue(). In my case, the way to warm up the counter and bring the "category" into existence was to fire one shot across the bow of IIS, i.e. make a single GET request, and voila, I no longer get "InvalidOperationException", and this seems to be a reliable fix for me, for now. At least when querying IIS performance counters.
CreatingPerformanceCounterBeforeWarmingUpServerThrowsException
[Test, Ignore("Run manually AFTER restarting IIS with 'iisreset' at cmd prompt.")]
public void CreatingPerformanceCounterBeforeWarmingUpServerThrowsException()
{
Console.WriteLine("Given a webserver that is cold");
Console.WriteLine("When I create a performance counter and read next value");
using (var pc1 = new PerformanceCounter())
{
pc1.CategoryName = @"W3SVC_W3WP";
pc1.InstanceName = @"_Total";
pc1.CounterName = @"Requests / Sec";
Action action1 = () => pc1.NextValue();
Console.WriteLine("Then InvalidOperationException will be thrown");
action1.ShouldThrow<InvalidOperationException>();
}
}
[Test, Ignore("Run manually AFTER restarting IIS with 'iisreset' at cmd prompt.")]
public void CreatingPerformanceCounterAfterWarmingUpServerDoesNotThrowException()
{
Console.WriteLine("Given a webserver that has been Warmed up");
using (var client = new WebClient())
{
client.DownloadString("http://localhost:8082/small1.json");
}
Console.WriteLine("When I create a performance counter and read next value");
using (var pc2 = new PerformanceCounter())
{
pc2.CategoryName = @"W3SVC_W3WP";
pc2.InstanceName = @"_Total";
pc2.CounterName = @"Requests / Sec";
float? result = null;
Action action2 = () => result = pc2.NextValue();
Console.WriteLine("Then InvalidOperationException will not be thrown");
action2.ShouldNotThrow();
Console.WriteLine("And the counter value will be returned");
result.HasValue.Should().BeTrue();
}
}

Just out of curiosity, what do you have set for properties in Visual Studio? In VS go to Project Properties, Build, Platform target and change it to AnyCPU. I have seen it before where performance counters aren't always retrieved when it is set to x86, and changing it to AnyCPU could fix it.


C# Console application works when started from Visual Studio; after publishing it only runs for 3 cycles. Is this normal behaviour?

I made a C# console application (.NET CORE 5.0) to check every minute for changes in a MySQL database and send e-mails with the changes.
If I run this application directly from visual studio 2019, it works fine without any problems.
If I run it after I publish it, it only does 3 cycles and the console window stays open. No errors or anything else.
The first screenshot is from running via Visual Studio 2019.
This screenshot is from running directly from the desktop after publishing.
static void Main(string[] args)
{
TimerCallback callback = new TimerCallback(DoStuff);
Timer stateTimer = new Timer(callback, null, 0, 1000);
for (; ; )
{
Thread.Sleep(100);
}
}
static public void DoStuff(Object stateInfo)
{
DataTable DtblEmployee = DatabaseClass.GetEmployeeList();
foreach (DataRow row in DtblEmployee.Rows)
{
foreach (var item in row.ItemArray)
{
string Str = RandomStringGenerator.GetRandomAlphanumericString(8);
DatabaseClass.EmployeeUpdate(item.ToString(), Str);
EmailClass.SendEmail(item.ToString(), Str);
}
}
Console.WriteLine("Last check was # {0}", DateTime.Now.ToString("h:mm:ss"));
Console.WriteLine("{0} e-mail(s) were send.", DtblEmployee.Rows.Count);
}
The program is supposed to run forever.
Would love to hear from you guys, if more information is needed please ask.
EDIT: Added extra code to show Database
public static void EmployeeUpdate(string Emailadress, string Password)
{
string connectionstring;
connectionstring = "server=1.1.1.1;user id=User;password=Pass;port=3306;persistsecurityinfo=True;database=Test";
connection = new MySqlConnection(connectionstring);
try
{
connection.Open();
var cmd = new MySqlCommand("UPDATE Users SET Password=@param_val_1, GeneratePassword=@param_val_3 where Username=@param_val_2", connection);
cmd.Parameters.AddWithValue("@param_val_1", Password);
cmd.Parameters.AddWithValue("@param_val_2", Emailadress);
cmd.Parameters.AddWithValue("@param_val_3", 0);
cmd.ExecuteScalar();
cmd.Dispose();
}
catch (MySqlException ex)
{
switch (ex.Number)
{
case 0:
Console.WriteLine("Cannot connect to server. Contact administrator");
break;
case 1045:
Console.WriteLine("Invalid username/password, please try again");
break;
}
}
finally
{
connection.Close();
}
}
I strongly expect that the problem is that your Timer is being garbage collected and finalized, and that's stopping the callback from being executed.
When you run your code from Visual Studio in the debugger, the JIT is less aggressive about garbage collection, which is why it's working in that scenario.
The smallest change to fix that would be to add this line at the end of your Main method:
GC.KeepAlive(stateTimer);
Alternatively, as per quetzalcoatl's answer, you could use a using statement for the timer. Either option will have the desired effect of keeping the timer alive for the duration of the method.
An alternative I think you should explore would be to not use a timer at all, instead just loop within the Main method and call your DoStuff method directly. You'd still call Sleep within that loop, which would handle the timing aspect. Obviously that will affect the precise timing of how the code runs, but it's likely to end up being simpler to understand and simpler to debug.
Additionally, I'd suggest being a lot more intentional about your exception handling. Work out whether you want the code to stop looping if any one iteration throws an exception, and make that explicit in the code.
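A minimal sketch of that loop-based alternative, with the exception handling made explicit (the logging choice is just an illustration; it replaces the Timer in the original Main):

static void Main(string[] args)
{
    while (true)
    {
        try
        {
            DoStuff(null);                      // call the worker directly instead of via a Timer
        }
        catch (Exception ex)
        {
            // Decide deliberately: log and continue, or rethrow to stop the loop.
            Console.WriteLine("Iteration failed: {0}", ex);
        }
        Thread.Sleep(1000);                     // same cadence as the original timer
    }
}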
I 100% agree with JonSkeet about the reason.
It's the GC that cleans up the stateTimer variable. Right after the line that creates it, this variable becomes unused, so the compiler is free to drop it from the stack, and then the GC is free to get rid of the timer.
When you are running your application in different environments, the GC may be using a different set of rules. A debugging session, a console app, an IIS application, a module for SQL Server, etc. - they all have different rules for when and how aggressively the GC runs. Under a debugging session it also MAY clean up this variable, but it also MAY do it hours or days later, maybe to give you more time to inspect things. Under a free-running console app, it simply occurred sooner.
The GC also has hard rules that it must always abide by: if the variable is still used, it cannot be purged.
Jon Skeet suggested explicitly keeping the stateTimer alive; I disagree. It's a last-resort option.
Much better: just use a using statement, as the Timer is IDisposable:
TimerCallback callback = new TimerCallback(DoStuff);
using(Timer stateTimer = new Timer(callback, null, 0, 1000))
for (; ; )
{
Thread.Sleep(100);
}
The variable is still unused; you can even get rid of it and write
using(new Timer(callback, null, 0, 1000))
but even now, the using() statement will remember that Timer object and prevent the GC from cleaning it up too soon. (It has to remember that object to be able to call Dispose() when the loop ends.)

Threading Timer doesn't callback

I have several Machine classes which have state for whether they are online/offline and a DateTime EndsAt for when they will turn offline if they are online. They are mapped to the database using EF. When I turn them on I pass an amount of seconds for them to stay online and create a System.Threading.Timer to change their state back to offline when the time comes (EndsAt == DateTime.Now). Turning them on works fine; however, they don't turn off - turnOff() is never called. And on top of that, if it were called and the object changed its own properties, would the changes be saved by Entity Framework?
public class Machine
{
private Timer timer=null;
[Key]
public int MachineId { get; set; }
public bool Online { get; set; }
public DateTime EndsAt { get; set; }
public void TurnOn(TimeSpan amount)
{
Debug.WriteLine("Turn on reached");
if (!Online)
{
EndsAt = DateTime.Today.Add(amount);
Online = true;
setTimer();
}
}
private void turnOff(object state)
{
Online = false;
Occuppied = false;
Debug.WriteLine("Timer ended!");
}
private void setTimer()
{
Debug.WriteLine("Timer being set");
if (EndsAt.CompareTo(DateTime.Now) == 1)
{
timer = new Timer(new TimerCallback(turnOff));
int msUntilTime = (int)((EndsAt - DateTime.Now).TotalMilliseconds);
timer.Change(msUntilTime, Timeout.Infinite);
}
else
{
Debug.WriteLine("EndsAt is smaller than current date");
}
}
}
Controller method where turnOn() is called
[HttpPost]
public ActionResult TurnOn() {
bool isChanged = false;
if (Request["machineId"] != null && Request["amount"] != null)
{
byte machineId = Convert.ToByte(Request["machineId"].ToString());
int amount = Convert.ToInt32(Request["amount"].ToString());
foreach (var machine in db.Machines.ToList())
{
if (machine.MachineId == machineId)
{
machine.TurnOn(TimeSpan.FromSeconds(amount));
db.Entry(machine).State = EntityState.Modified;
db.SaveChanges();
isChanged = true;
}
}
}
if (isChanged)
return new HttpStatusCodeResult(HttpStatusCode.OK);
else
return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
}
The problem comes not from Entity Framework but ASP.NET.
The best way I can describe it is to imagine your page request in ASP.NET is a console application: on every new request the application starts up, handles the request and responds to the user, waits a tiny bit for another request to come in, then exits the Main() function.
If you created a Timer in that kind of application, once the "tiny bit" runs out and Main() returns, your timer will not be running anymore and the thing you were waiting for will never happen. IIS does this exact process, but it does it with AppDomain recycling: if no requests come in, it will shut down the AppDomain and kill your timer.
There are two ways I know of to handle this problem:
The first way is to make a second application that runs as a Windows service outside of IIS and is always running; it is what holds the timer. When you want to run any kind of long-running operation that will outlive a page request, you use WCF or some other technology for your web app to communicate with the service to start up the timer; when the timer is done, the service executes whatever operation you wanted done.
The second way is to save the timer request in a database, then in the background, before every request, check the database of events and see if any need to be executed. There are libraries like Hangfire that make this process easy; they also have tricks to keep the AppDomain alive longer or wake it back up if it shuts down (often they use two websites that talk to each other, each keeping the other one alive).
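As a rough illustration of the first approach, a Windows service (or any always-running process) could own a single polling timer and sweep expired machines itself. This is only a sketch: AppDbContext and the 5-second interval are assumptions, while Machine, Online and EndsAt come from the question.

public class MachineSweeper
{
    private System.Threading.Timer timer;

    public void Start()
    {
        // One timer for all machines; it survives because the service process never recycles.
        timer = new System.Threading.Timer(_ => SweepExpiredMachines(), null,
            TimeSpan.Zero, TimeSpan.FromSeconds(5));
    }

    private void SweepExpiredMachines()
    {
        var now = DateTime.Now;
        using (var db = new AppDbContext())              // hypothetical EF context
        {
            var expired = db.Machines
                .Where(m => m.Online && m.EndsAt <= now)
                .ToList();
            foreach (var machine in expired)
                machine.Online = false;                  // EF change tracking picks this up
            db.SaveChanges();                            // persisted here, so no per-entity timer is needed
        }
    }
}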
Even though this specific question has been answered, here's some related discussion I hope can be helpful in the case of a timer callback not working.
Important considerations when using Threading.Timer
1.) Timer is subject to garbage collection. Even if active, it may be collected as garbage if it does not have a reference.
2.) .NET has many different types of timers, and it's important to use the right kind in the right way because it involves threading. Use Forms.Timer for WinForms, Threading.Timer (or wrap it in Timers.Timer - there is debate on thread safety), or Web.UI.Timer with ASP.NET for web page postbacks.
3.) The Callback method is defined when the timer is instantiated and cannot be changed.
Timer Related Tools
1.) You can use Thread.Sleep to release CPU resources and place your thread in a WaitSleepJoin state, which is essentially stopped.
2.) Sometimes a Task can be used along with or instead of a timer.
3.) Stopwatch can be used in different ways, for example, with an empty loop.

C# PerformanceCounter Class causes Handle Leak

I have an application developed with .NET 4.0. This application keeps track of some custom Performance Counters and displays them to the user. Recently I found that there is a handle leak in the application. The two types of handles are Mutant and PcwObject.
I followed this page (http://blogs.technet.com/b/yongrhee/archive/2011/12/19/how-to-troubleshoot-a-handle-leak.aspx) and got the following stack trace:
Handle = 0x0000000000003760 - OPEN
Thread ID = 0x00000000000073d0, Process ID = 0x0000000000005fdc
0x0000000077c41cea: ntdll!NtCreateMutant+0x000000000000000a
0x000007fefde08bf7: KERNELBASE!CreateMutexExW+0x000000000000004f
0x000007fefde14460: KERNELBASE!CreateMutexExA+0x0000000000000038
0x000007feff6bbcf6: ADVAPI32!PerflibciOpenLocalQueryHandle+0x0000000000000116
0x000007feff6a5a86: ADVAPI32!PerflibciQueryV2Provider+0x000000000000020d
0x000007feff68926d: ADVAPI32!QueryExtensibleData+0x00000000000004a2
0x000007feff6898e4: ADVAPI32!alloca_probe+0x00000000000051b2
0x00000000779d4087: KERNEL32!TlsGetValue+0x000000000000fbb8
0x00000000779e4b52: KERNEL32!RegQueryValueExW+0x00000000000000f2
0x000007feff68c2ed: ADVAPI32!RegQueryValueExWStub+0x000000000000001d
0x000007fef99b17c7: clr!DoNDirectCall__PatchGetThreadCall+0x000000000000007b
0x000007fef8a38422: mscorlib_ni+0x0000000000428422
0x000007fef89948f1: mscorlib_ni+0x00000000003848f1
0x000007fef899392e: mscorlib_ni+0x000000000038392e
and
Handle = 0x0000000000003998 - OPEN
Thread ID = 0x0000000000002808, Process ID = 0x0000000000005fdc
0x0000000077c4138a: ntdll!NtDeviceIoControlFile+0x000000000000000a
0x000007fefce214a3: pcwum!PcwpSendIoctl+0x00000000000000f3
0x000007fefce21962: pcwum!PcwCreateNotifier+0x000000000000003e
0x000007feff6abf53: ADVAPI32!PerfpCreateProvider+0x00000000000000d3
0x000007feff6ddb77: ADVAPI32!PerflibciLocalValidateCounters+0x0000000000000167
0x000007feff6a5ced: ADVAPI32!PerflibciQueryV2Provider+0x0000000000000478
0x000007feff68926d: ADVAPI32!QueryExtensibleData+0x00000000000004a2
0x000007feff6898e4: ADVAPI32!alloca_probe+0x00000000000051b2
0x00000000779d4087: KERNEL32!TlsGetValue+0x000000000000fbb8
I also opened Process Explorer to monitor the handle state. According to my observation, the above 2 handles (3760 and 3998) stayed alive for over half an hour and were not destroyed. The handle count increased by ~1000 within 2 hours; half of them are Mutant and the other half are PcwObject.
I suspect it is related to PerformanceCounter because I know the PerformanceCounter class reads its data from the registry, and I can see PerflibciQuery and RegQueryValue in the stack trace.
I've searched through the Internet but with no luck. Does anyone have any idea about this?
Thanks
Additional Information
I tested those performance counters one by one and found that these handles were leaked when reading this counter: PerformanceCounter("HTTP Service Request Queues", "CurrentQueueSize", "ABC")
My Code is like this :
private PerformanceCounter counter;
private void Detect()
{
/*
Do sth
*/
try
{
if (null == counter) counter = new PerformanceCounter("HTTP Service Request Queues", "CurrentQueueSize", "ABC");
long rawValue = counter.RawValue;
if (0 < rawValue)
WriteLog("ABC CurrentQueueSize: {0}", rawValue);
}
catch(Exception e)
{
WriteLog("Fail to get ABC counter. {0}", e);
}
}
counter is a member variable and I'm very sure that it is disposed when this class is destroyed. So I don't know why it leaks handles.
What I noticed recently was a handle leak if you ask the counter for a value on a different thread. If you create a new thread and ask the performance counter for its next value, it will create a bunch of handles that don't get cleaned up, even if you dispose the counter.
I already found the cause a few days ago but forgot to post it. Sorry about this.
Actually this is a resource leak caused by using a v2 PerformanceCounter (http://support.microsoft.com/kb/2734909). Then I followed the instructions on this page (http://msdn.microsoft.com/en-us/library/aa392740(v=vs.85).aspx) and found that "HTTP Service Request Queues" is a v2 counter provider.
So that's the cause! Done!

How to stop constant memory increase every time segment of code is run?

I have a program that runs some methods every 5 seconds in the background. However, every 5 seconds its physical memory usage jumps by 16-20 KB. Through commenting out segments of code, I've narrowed it down to the specific segment that is causing the issue. What am I missing here to correctly release the allocated memory?
Loop segment from main method:
while (true)
{
listMessages = FetchAllMessages();
//Commented out other segments. Not causing memory increase
System.Threading.Thread.Sleep(5000);
}
Method called:
public static List<Message> FetchAllMessages()
{
try
{
using (Pop3Client client = new Pop3Client())
{
client.Connect("pop.gmail.com", 995, true);
client.Authenticate("removed", "removed");
int messageCount = client.GetMessageCount();
List<Message> allMessages = new List<Message>(messageCount);
for (int i = messageCount; i > 0; i--)
{
if (verifiedEmail.Contains(client.GetMessage(i).Headers.From.Address) || verifiedSms.Contains(client.GetMessage(i).Headers.From.Address))
{
string tempMessage = client.GetMessage(i).ToMailMessage().Body.ToLower();
if (tempMessage.Contains("cmd") && tempMessage.Contains("fin"))
{
allMessages.Add(client.GetMessage(i));
}
}
client.DeleteMessage(i);
}
client.Disconnect();
return allMessages;
}
}
catch (Exception ex)
{
return null;
}
}
One of the things that could be causing a steady increase in memory usage is that you're calling GetMessage so many times. Depending on how your POP client is written, that could be allocating a new buffer every time so that it can download the message from the POP server. That memory will of course be collected eventually, but you're exercising the garbage collector needlessly. And you're also being highly inefficient.
You should consider changing your code to something like this:
for (int i = messageCount; i > 0; i--)
{
var msg = client.GetMessage(i);
if (verifiedEmail.Contains(msg.Headers.From.Address)
|| verifiedSms.Contains(msg.Headers.From.Address))
{
string tempMessage = msg.ToMailMessage().Body.ToLower();
if (tempMessage.Contains("cmd") && tempMessage.Contains("fin"))
{
allMessages.Add(msg);
}
}
client.DeleteMessage(i);
}
So instead of calling client.GetMessage(i) four times, you call it only once.
It also makes the code easier to read.
That said, I think it's likely that your "memory leak" is just the GC taking its own sweet time in collecting memory.
One other thing. You have a sleep loop:
while (true)
{
listMessages = FetchAllMessages();
Thread.Sleep(5000);
}
You're tying up a thread that spends most of its time doing nothing. You'd be better off creating a timer with a 5 second interval, like this:
System.Threading.Timer MailTimer; // declare at class scope
// Do this in your initialization
MailTimer = new Timer(MessageFetcher, null, 5000, -1);
And your MessageFetcher method is:
void MessageFetcher(object state)
{
listMessages = FetchAllMessages();
// do that other stuff that you didn't show
// reset the timer so that it fires 5 seconds from now
MailTimer.Change(5000, -1);
}
The initialization creates a one-shot timer that expires in five seconds and calls MessageFetcher. When MessageFetcher is done, it sets a timer so that mail will be checked in another five seconds. You want to do it this way rather than setting a periodic interval, because you don't want the timer to call MessageFetcher again if the previous tick isn't done processing.
The MessageFetcher method is executed on a pool thread. Using the timer prevents you from having to keep a thread around all the time, occupying memory while it's doing essentially nothing.
As Paddy said, eventually garbage collection will get around to disposing the objects and releasing memory, but you CAN manually force it, although it's generally better to allow it to happen automatically.
But to test that garbage collection will reduce the memory, exit the while loop after several calls and call GC.Collect(). The memory should go down.
Calling GC.Collect() is expensive, and that's why you're better off letting the runtime choose the optimal time to run garbage collection automatically.
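A quick way to run that experiment (a diagnostic sketch only, not production code; the iteration count is arbitrary):

static void Main()
{
    for (int i = 0; i < 10; i++)
    {
        FetchAllMessages();                 // the method from the question
        Thread.Sleep(5000);
    }

    // Force a full collection just to see whether the memory really comes back.
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
    Console.WriteLine("Working set after forced GC: {0:N0} bytes", Environment.WorkingSet);
    Console.ReadLine();
}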
This is a great answer to a similar question regarding your concerns: C# Garbage collection
Who knows? It's not deterministic. Think of it like this: on a system
with infinite memory, the garbage collector doesn't have to do
anything. And you might think that's a bad example, but that's what
the garbage collector is simulating for you: a system with infinite
memory. Because on a system with sufficiently more memory available
than required by your program, the garbage collector never has to run.
Consequently, your program can not make any assumptions about when
memory will (if ever) be collected.
So, the answer to your question is: we don't know.
I would recommend setting up a memory profiler to log the memory consumption of your application and run it for a while to test. You should see that the Garbage Collector will keep things under control automatically without you having to change anything with your code.

Background task in a ASP webapp

I'm fairly new to C#, and recently built a small webapp using .NET 4.0. This app has 2 parts: one is designed to run permanently and will continuously fetch data from given resources on the web. The other one accesses that data upon request to analyze it. I'm struggling with the first part.
My initial approach was to set up a Timer object that would execute a fetch operation (whatever that operation is doesn't really matter here) every, say, 5 minutes. I would define that timer on Application_Start and let it live after that.
However, I recently realized that applications are created / destroyed based on user requests (from my observation they seem to be destroyed after some time of inactivity). As a consequence, my background activity will stop / resume out of my control, whereas I would like it to run continuously, with absolutely no interruption.
So here comes my question: is that achievable in a webapp? Or do I absolutely need a separate Windows service for that kind of things?
Thanks in advance for your precious help!
Guillaume
While doing this in a web app is not ideal... it is achievable, given that the site is always up.
Here's a sample: I'm creating a Cache item in the global.asax with an expiration. When it expires, an event is fired. You can fetch your data or whatever in the OnRemove() event.
Then you can set up a call to a page (preferably a very small one) that will trigger code in Application_BeginRequest that will add back the Cache item with an expiration.
global.asax:
private const string VendorNotificationCacheKey = "VendorNotification";
private const int IntervalInMinutes = 60; //Expires after X minutes & runs tasks
protected void Application_Start(object sender, EventArgs e)
{
//Set value in cache with expiration time
CacheItemRemovedCallback callback = OnRemove;
Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null, DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
CacheItemPriority.Normal, callback);
}
private void OnRemove(string key, object value, CacheItemRemovedReason reason)
{
SendVendorNotification();
//Need Access to HTTPContext so cache can be re-added, so let's call a page. Application_BeginRequest will re-add the cache.
var siteUrl = ConfigurationManager.AppSettings.Get("SiteUrl");
var client = new WebClient();
client.DownloadData(siteUrl + "default.aspx");
client.Dispose();
}
private void SendVendorNotification()
{
//Do Tasks here
}
protected void Application_BeginRequest(object sender, EventArgs e)
{
//Re-add if it doesn't exist
if (HttpContext.Current.Request.Url.ToString().ToLower().Contains("default.aspx") &&
HttpContext.Current.Cache[VendorNotificationCacheKey] == null)
{
//ReAdd
CacheItemRemovedCallback callback = OnRemove;
Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null, DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
CacheItemPriority.Normal, callback);
}
}
This works well if your scheduled task is quick.
If it's a long-running process... you definitely need to keep it out of your web app.
As long as the 1st request has started the application... this will keep firing every 60 minutes, even if the site has no visitors.
I suggest putting it in a windows service. You avoid all the hoops mentioned above, the big one being IIS restarts. A windows service also has the following benefits:
Can automatically start when the server starts. If you are running in IIS and your server reboots, you have to wait until a request is made to start your process.
Can place this data fetching process on another machine if needed
If you end up load-balancing your website on multiple servers, you could accidentally have multiple data fetching processes causing you problems
Easier to maintain the code separately (single responsibility principle). Easier to maintain the code if it's just doing what it needs to do and not also trying to fool IIS.
Create a static class with a static constructor that creates a timer event.
However, like Steve Sloka mentioned, IIS has a timeout that you will have to manipulate to keep the site going.
using System.Runtime.Remoting.Messaging;
public static class Variables
{
static Variables()
{
m_wClass = new WorkerClass();
// creates and registers an event timer
m_flushTimer = new System.Timers.Timer(1000);
m_flushTimer.Elapsed += new System.Timers.ElapsedEventHandler(OnFlushTimer);
m_flushTimer.Start();
}
private static void OnFlushTimer(object o, System.Timers.ElapsedEventArgs args)
{
// determine the frequency of your update
if (System.DateTime.Now - m_timer1LastUpdateTime > new System.TimeSpan(0,1,0))
{
// call your class to do the update
m_wClass.DoMyThing();
m_timer1LastUpdateTime = System.DateTime.Now;
}
}
private static readonly System.Timers.Timer m_flushTimer;
private static System.DateTime m_timer1LastUpdateTime = System.DateTime.MinValue;
private static readonly WorkerClass m_wClass;
}
public class WorkerClass
{
public delegate WorkerClass MyDelegate();
public void DoMyThing()
{
m_test = "Hi";
m_test2 = "Bye";
//create async call to do the work
MyDelegate myDel = new MyDelegate(Execute);
AsyncCallback cb = new AsyncCallback(CommandCallBack);
IAsyncResult ar = myDel.BeginInvoke(cb, null);
}
private WorkerClass Execute()
{
//do my stuff in an async call
m_test2 = "Later";
return this;
}
public void CommandCallBack(IAsyncResult ar)
{
// this is called when your task is complete
AsyncResult asyncResult = (AsyncResult)ar;
MyDelegate myDel = (MyDelegate)asyncResult.AsyncDelegate;
WorkerClass command = myDel.EndInvoke(ar);
// command is a reference to the original class that invoked the async call
// m_test will equal "Hi"
// m_test2 will equal "Later";
}
private string m_test;
private string m_test2;
}
I think you can achieve it by using a BackgroundWorker, but I would rather suggest you go for a service.
Your application context lives as long as your worker process in IIS is functioning. In IIS there are some default timeouts for when the worker process will recycle (e.g. number of idle minutes (20), or a regular interval (1740 minutes)).
That said, if you adjust those settings in IIS, you should be able to keep the application alive; however, the other answers about using a service would work as well - it's just a matter of how you want to implement it.
I recently made a file upload functionality for uploading Access files to the database (not the best way, but just a temporary fix to a long-term issue).
I solved it by creating a background thread that ran through the ProcessAccess function and was deleted when completed.
Unless IIS has a setting in which it kills a thread after a set amount of time regardless of inactivity, you should be able to create a thread that calls a function that never ends. Don't use recursion, because the number of open function calls will eventually blow up in your face; just have a for(;;) loop so it'll keep busy :)
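A rough sketch of that background-thread idea (the work method and the 5-minute interval are placeholders, not part of the original answer); it could be started from Application_Start:

var worker = new Thread(() =>
{
    for (; ; )                                   // loop forever instead of recursing
    {
        FetchDataFromTheWeb();                   // hypothetical periodic work
        Thread.Sleep(TimeSpan.FromMinutes(5));
    }
});
worker.IsBackground = true;                      // background thread: won't keep the worker process alive on its own
worker.Start();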
The Application Initialization Module for IIS 7.5 does precisely this type of init work. More details on the module are available here: Application Initialization Module
