I want to check the amount of time it takes for a specific function to run, so that I can add an accurate time delay to my program...
I tried to write this before and after my function:
public static string ToTimeString(this DateTime value)   // method name assumed; the original snippet omitted it
{
    return value.ToString("mmssffff");
}
Then I calculated the difference between the two results, and I got 350. I do not know whether that is microseconds or milliseconds...
Do you know the meaning of the result (350)?
Do you have another idea of how to do this?
Thanks!
Use Stopwatch for performance profiling.
Stopwatch sw = new Stopwatch();
sw.Start();
SayHello();
sw.Stop();
Console.WriteLine("Total time elapsed {0} millseconds", sw.Elapsed.TotalMilliSeconds);
I normally use something like this:
Stopwatch sw = Stopwatch.StartNew();
// rest of the code
sw.Stop();
Console.WriteLine("Total time (ms): {0}", sw.ElapsedMilliseconds);
You can use profilers for this task. In many cases a single method takes a lot of time because a sub-method that should execute instantly takes most of it. In fact, I was working on a Java project where I observed that working out the solution took an unreasonably long time. I used a profiler there and saw that a simple object instantiation took most of the time. The problem was, of course, that the Java library initialized a whole lot of things in that constructor, so I chose an alternative instead. So, I think you should use a profiler; read more here and here to start working with C# profilers.
My question consists of 2 parts:
Is there any good way in C# to measure computational effort other than using timers such as Stopwatch? Below is what I have been doing, but the granularity is not great, and the result varies every time. I am wondering if there is a more precise measure, such as a CPU operation count, so that the result is consistent.
Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();
//do work
stopWatch.Stop();
TimeSpan ts = stopWatch.Elapsed;
Console.WriteLine(ts);
If the alternative approach in 1 is not possible, how can I make the performance test results vary less? What are some factors that can change the result? Would closing all other running applications help? (I did try it, but there seems to be no significant effect.) How about running the test on a VM, in a sandbox, etc.?
(After typing the preceding text I realized that I have also tried the Performance Analysis feature that comes with Visual Studio. The result seems coarser because of the sampling method it uses, so I want to rule out that option as well.)
You need to get a profiling tool. But you can use Stopwatch more reliably if you run your tests in a loop multiple times and only keep the result of a run if the garbage collection count stays the same.
Like this:
var timespans = new List<TimeSpan>();
while (true)
{
var count = GC.CollectionCount(0);
var sw = Stopwatch.StartNew();
/* run test here */
sw.Stop();
if (count == GC.CollectionCount(0))
{
timespans.Add(sw.Elapsed);
}
if (timespans.Count == 100)
{
break;
}
}
That'll give you 100 tests where garbage collection didn't occur. The average is then pretty good to work from.
If you find that your tests never run without invoking a garbage collection then try working out the minimum number of GC's that get triggered and collect your time spans only when that number occurs.
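For example, a rough sketch of that idea (the warm-up loop and its iteration count are assumptions you would tune for your own test):
int minCollections = int.MaxValue;

// Warm-up pass: find the smallest number of gen-0 collections a single run triggers.
for (int i = 0; i < 10; i++)
{
    int before = GC.CollectionCount(0);
    /* run test here */
    minCollections = Math.Min(minCollections, GC.CollectionCount(0) - before);
}

// Measurement pass: keep only the runs that hit exactly that minimum.
var timespans = new List<TimeSpan>();
while (timespans.Count < 100)
{
    int before = GC.CollectionCount(0);
    var sw = Stopwatch.StartNew();
    /* run test here */
    sw.Stop();
    if (GC.CollectionCount(0) - before == minCollections)
    {
        timespans.Add(sw.Elapsed);
    }
}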
You could query a system performance counter. The MSDN documentation for the System.Diagnostics.PerformanceCounter class has some examples. With this class you could query "\Process(your_process_name)\% Processor Time", for example. It's an alternative to Stopwatch, but tbh I think just using Stopwatch and averaging many runs over time is a perfectly good way to go.
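A minimal sketch of such a query, assuming you want the current process and a one-second sampling window (PerformanceCounter is Windows-only):
using System;
using System.Diagnostics;
using System.Threading;

class CpuSample
{
    static void Main()
    {
        // The instance name is the process name without the .exe extension.
        string instance = Process.GetCurrentProcess().ProcessName;
        var cpu = new PerformanceCounter("Process", "% Processor Time", instance);

        cpu.NextValue();      // the first reading is always 0, so prime the counter
        Thread.Sleep(1000);   // sample over one second
        Console.WriteLine("CPU: {0:F1} %", cpu.NextValue());
    }
}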
If what you need is a higher-resolution stopwatch because you are trying to measure a very small slice of CPU time, then you may be interested in the High-Performance Counter.
I have a pretty big method where I do some C# calculations and also call 3 or 4 stored procedures, construct 3 or 4 objects, and finally add them to a list and return the list.
My target is to improve the performance of this method so that it takes less time to execute.
My question is: is there any way I can check each part of the method and find out which part is taking the longest to execute? Maybe some logging or something!
I am using LINQ to EF.
Invest in a performance profiler, like ANTS from Redgate. Some of the better versions of Visual Studio also come with one.
At the very least, you could try using System.Diagnostics.Stopwatch.
From MSDN:
static void Main(string[] args)
{
Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();
Thread.Sleep(10000);
stopWatch.Stop();
TimeSpan ts = stopWatch.Elapsed;
string elapsedTime = String.Format("{0:00}:{1:00}:{2:00}.{3:00}",
ts.Hours, ts.Minutes, ts.Seconds,
ts.Milliseconds / 10);
Console.WriteLine("RunTime " + elapsedTime);
}
If possible, you can try executing your stored procedures in parallel. I've seen this improve performance quite a bit, especially if your stored procedures just do reads and no writes.
It might look something like this:
ConcurrentBag<Result> results = new ConcurrentBag<Result>();

Parallel.Invoke(
    () => {
        var db = new DatabaseEntities();
        Result result1 = db.StoredProcedure1();
        results.Add(result1);
    },
    () => {
        var db = new DatabaseEntities();
        Result result2 = db.StoredProcedure2();
        results.Add(result2);
    },
    () => {
        var db = new DatabaseEntities();
        Result result3 = db.StoredProcedure3();
        results.Add(result3);
    }
);

return results;
I'm using a ConcurrentBag here instead of a List because it is thread safe.
What you're looking for is a profiler - a profiler runs your program and tells you how much time each line of code took to execute, as well as how long it took to execute as a percentage of the total execution time.
A great C# profiler is the ANTS .NET Profiler; it's rather expensive, but it has a 14-day free trial - I think this would be perfect for your needs.
You have several options. I find myself using stopwatches to test this kind of thing. However, before you do anything, are you sure the code isn't already performing well enough? "If it ain't broke, don't fix it" is often the best advice. If you're still interested, you can do this kind of thing:
Stopwatch sw = Stopwatch.StartNew();
// do some code stuff here
sw.Stop();
Console.WriteLine(sw.ElapsedTicks);
You also have seconds, milliseconds and other measurements in the sw variable.
My advice would be for you to use JetBrains dotTrace. It has some very helpful functionality that points out hotspots and tells you which piece of code has taken how long.
PS: it has saved my neck a few times.
Database accesses are generally orders of magnitude slower than any calculations you might make (unless you are trying to predict tomorrow's weather). So the LINQ-to-EF part is most probably where the time gets lost.
You can use profilers to analyse a program. SQL Server has a profiler that allows you to monitor queries. If you want to analyse the code, google for .NET profilers and you will find quite a few that have a free licence. Or buy one, if you find it useful. The EQATEC profiler was quite useful for me.
If you have a big method, your code is badly structured. Making a big method does not make it faster than splitting it into smaller logical parts. Smaller parts will be easier to maintain, and code profilers will yield more useful information, since they often only report per-method totals and don't show the times for single lines of code.
I am trying to figure out the best way to find out which portions of my application take the longest to run (largest run cost). The application is not overly complex, but I want to make sure that all of the pieces are properly tuned so that I can potentially handle a greater load.
Application: loads/shreds XML documents and dumps the contents into a DB. The application uses LINQ to XML to parse the XML, and SQL Server TVPs to pass the data down to the DB. Because I am using TVPs, I have one round trip to the DB even when there are collections of data, and the data is not big (XML files of at most 1 MB).
Any suggestions on how to isolate the bottlenecks would be greatly appreciated.
As always greatly appreciate the feedback.
You may want to check out the Stopwatch class. You can sprinkle it into your code like this:
// load XML Method
var stopWatch = new Stopwatch();
stopWatch.Start();
// run XML parsing code
stopWatch.Stop();
var xmlTime = stopWatch.Elapsed;
// SQL Server dump Method
var stopWatch = new Stopwatch();
stopWatch.Start();
// dump to SQL Server
stopWatch.Stop();
var sqlTime = stopWatch.Elapsed;
This is a low-tech way to take general measurements. For a simple application this is probably more efficient than a profiler, since your application only has two real points for a bottleneck. That said, learning how to use a profiler may be worth your while.
Based on Nate's answer, you could make things easy and use a small helper method for this purpose.
public static Int64 MeasureTime(Action myAction)
{
var stopWatch = new Stopwatch();
stopWatch.Start();
myAction();
stopWatch.Stop();
return stopWatch.ElapsedMilliseconds;
}
Sample usage:
StringBuilder result;
Console.WriteLine("Parse-Xml: {0}", MeasureTime(() => result = MyAction("Test.xml")));
The common way to do this is to use a profiling tool. I've used RedGate ANTS for profiling C# applications, and it works well, but there are plenty of alternatives.
Is there any way in C# to perform timing operations with sub-millisecond accuracy? I'm putting timing code in my software and everything is being returned as 0 ms. I would like to know if there is a way to get even finer granularity.
Addendum: is this the correct code to get sub-millisecond timing?
timeSpan.TotalMilliseconds / 10
I'm still getting 0 as the elapsed time
You could always try using QueryPerformanceCounter, or the Stopwatch class for a managed approach.
Use System.Diagnostics.Stopwatch
Usually I measure these kinds of scenarios by repeating the operation multiple times (as many as needed to get the milliseconds over a few dozen or a few hundred).
Then you can adjust and measure more easily.
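A sketch of that approach (the iteration count is just an illustrative value; pick one that pushes the total well past a millisecond):
const int iterations = 100000;
var sw = Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    // operation under test
}
sw.Stop();
double microsecondsPerOperation = sw.Elapsed.TotalMilliseconds * 1000.0 / iterations;
Console.WriteLine("Per operation: {0:F3} microseconds", microsecondsPerOperation);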
You could try measuring your performance in ticks instead of milliseconds. It will be much more accurate.
But I agree with Sam and Tyranid: use System.Diagnostics.Stopwatch.
I learned something new. Those Ticks are handy indeed. Complete example:
Stopwatch stopwatch = Stopwatch.StartNew();
// Run code to get duration of
long durationInTicks = stopwatch.ElapsedTicks;
Console.WriteLine($"Duration: {(decimal)durationInTicks / TimeSpan.TicksPerMillisecond:f3}ms");
You can also use stopwatch.ElapsedTicks inside the Console.WriteLine, but I do not know if that introduces any delay, which is why I first retrieve it and put it in a variable.
I need to find a bottleneck and need to measure the time as accurately as possible.
Is the following code snippet the best way to measure the performance?
DateTime startTime = DateTime.Now;
// Some execution process
DateTime endTime = DateTime.Now;
TimeSpan totalTimeTaken = endTime.Subtract(startTime);
No, it's not. Use the Stopwatch (in System.Diagnostics)
Stopwatch sw = Stopwatch.StartNew();
PerformWork();
sw.Stop();
Console.WriteLine("Time taken: {0}ms", sw.Elapsed.TotalMilliseconds);
Stopwatch automatically checks for the existence of high-precision timers.
It is worth mentioning that DateTime.Now often is quite a bit slower than DateTime.UtcNow due to the work that has to be done with timezones, DST and such.
DateTime.UtcNow typically has a resolution of 15 ms. See John Chapman's blog post about DateTime.Now precision for a great summary.
Interesting trivia: The stopwatch falls back on DateTime.UtcNow if your hardware doesn't support a high frequency counter. You can check to see if Stopwatch uses hardware to achieve high precision by looking at the static field Stopwatch.IsHighResolution.
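For example, you can dump those fields directly to see what your machine supports:
Console.WriteLine("IsHighResolution: {0}", Stopwatch.IsHighResolution);
Console.WriteLine("Ticks per second: {0}", Stopwatch.Frequency);
Console.WriteLine("Nanoseconds per tick: {0}", 1000000000L / Stopwatch.Frequency);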
If you want something quick and dirty I would suggest using Stopwatch instead for a greater degree of precision.
Stopwatch sw = new Stopwatch();
sw.Start();
// Do Work
sw.Stop();
Console.WriteLine("Elapsed time: {0}", sw.Elapsed.TotalMilliseconds);
Alternatively, if you need something a little more sophisticated you should probably consider using a 3rd party profiler such as ANTS.
This article says that first of all you need to compare three alternatives: Stopwatch, DateTime.Now, and DateTime.UtcNow.
It also shows that in some cases (when the performance counter doesn't exist) Stopwatch uses DateTime.UtcNow plus some extra processing. Because of that, it's obvious that in that case DateTime.UtcNow is the best option (because the others use it plus some processing).
However, as it turns out, the counter almost always exists - see Explanation about high-resolution performance counter and its existence related to .NET Stopwatch?.
Here is a performance graph. Notice how low the performance cost of UtcNow is compared to the alternatives:
The X axis is the sample data size, and the Y axis is the relative time of the example.
One thing Stopwatch is better at is that it provides higher resolution time measurements. Another is its more OO nature. However, creating an OO wrapper around UtcNow can't be hard.
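For instance, a minimal sketch of such a wrapper (the class name is made up for illustration):
public class UtcTimer
{
    private DateTime _start;

    public void Start()
    {
        _start = DateTime.UtcNow;
    }

    public TimeSpan Elapsed
    {
        get { return DateTime.UtcNow - _start; }
    }
}
Usage is the same shape as Stopwatch: call Start(), run the work, then read Elapsed.TotalMilliseconds.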
It's useful to push your benchmarking code into a utility class/method. The Stopwatch class does not need to be disposed or stopped on error. So, the simplest code to time some action is:
public partial class With
{
public static long Benchmark(Action action)
{
var stopwatch = Stopwatch.StartNew();
action();
stopwatch.Stop();
return stopwatch.ElapsedMilliseconds;
}
}
Sample calling code
public void Execute(Action action)
{
var time = With.Benchmark(action);
log.DebugFormat("Did action in {0} ms.", time);
}
Here is the extension method version
public static class Extensions
{
public static long Benchmark(this Action action)
{
return With.Benchmark(action);
}
}
And sample calling code
public void Execute(Action action)
{
var time = action.Benchmark();
log.DebugFormat("Did action in {0} ms.", time);
}
The stopwatch functionality would be better (higher precision). I'd also recommend just downloading one of the popular profilers, though (DotTrace and ANTS are the ones I've used the most... the free trial for DotTrace is fully functional and doesn't nag like some of the others).
Use the System.Diagnostics.Stopwatch class.
Stopwatch sw = new Stopwatch();
sw.Start();
// Do some code.
sw.Stop();
// sw.ElapsedMilliseconds = the time your "do some code" took.
Ditto Stopwatch, it is way better.
Regarding performance measuring you should also check whether your "// Some Execution Process" is a very short process.
Also bear in mind that the first run of your "// Some Execution Process" might be way slower than subsequent runs.
I typically test a method by running it 1000 times or 1000000 times in a loop and I get much more accurate data than running it once.
These are all great ways to measure time, but that is only a very indirect way to find bottleneck(s).
The most direct way to find a bottleneck in a thread is to get it running, and while it is doing whatever makes you wait, halt it with a pause or break key. Do this several times. If your bottleneck takes X% of the time, X% is the probability that you will catch it in the act on each snapshot.
Here's a more complete explanation of how and why it works
@Sean Chambers
FYI, the .NET Timer class is not for diagnostics; it generates events at a preset interval, like this (from MSDN):
System.Timers.Timer aTimer;
public static void Main()
{
// Create a timer with a ten second interval.
aTimer = new System.Timers.Timer(10000);
// Hook up the Elapsed event for the timer.
aTimer.Elapsed += new ElapsedEventHandler(OnTimedEvent);
// Set the Interval to 2 seconds (2000 milliseconds).
aTimer.Interval = 2000;
aTimer.Enabled = true;
Console.WriteLine("Press the Enter key to exit the program.");
Console.ReadLine();
}
// Specify what you want to happen when the Elapsed event is
// raised.
private static void OnTimedEvent(object source, ElapsedEventArgs e)
{
Console.WriteLine("The Elapsed event was raised at {0}", e.SignalTime);
}
So this really doesn't help you know how long something took, just that a certain amount of time has passed.
The timer is also exposed as a control in System.Windows.Forms... you can find it in your designer toolbox in VS05/VS08.
This is the correct way:
using System;
using System.Diagnostics;
class Program
{
public static void Main()
{
Stopwatch stopWatch = Stopwatch.StartNew();
// some other code
stopWatch.Stop();
// this is not correct for getting the full timer resolution
Console.WriteLine("{0} ms", stopWatch.ElapsedMilliseconds);
// Correct way to get accurate high precision timing
Console.WriteLine("{0} ms", stopWatch.Elapsed.TotalMilliseconds);
}
}
For more information, go through Use Stopwatch instead of DateTime for getting accurate performance counters.
Visual Studio Team System has some features that may help with this problem. Essentially you can write unit tests and mix them in different scenarios to run against your software as part of a stress or load test. This may help to identify the areas of code that impact your application's performance the most.
Microsoft's Patterns and Practices group has some guidance in Visual Studio Team System Performance Testing Guidance.
I just found a post in Vance Morrison's blog about a CodeTimer class he wrote that makes using StopWatch easier and does some neat stuff on the side.
The way I do it within my programs is to use the Stopwatch class, as shown here:
Stopwatch sw = new Stopwatch();
sw.Start();
// Critical lines of code
sw.Stop();
double elapsedMs = sw.Elapsed.TotalMilliseconds;
I've done very little of this sort of performance checking (I tend to just think "this is slow, make it faster") so I have pretty much always gone with this.
A google does reveal a lot of resources/articles for performance checking.
Many mention using P/Invoke to get performance information. A lot of the materials I studied only really mention using perfmon.
Edit:
Seen the talk of Stopwatch... nice! I have learned something :)
This looks like a good article
This is not professional enough:
Stopwatch sw = Stopwatch.StartNew();
PerformWork();
sw.Stop();
Console.WriteLine("Time taken: {0}ms", sw.Elapsed.TotalMilliseconds);
A more reliable version is:
PerformWork();
int repeat = 1000;
Stopwatch sw = Stopwatch.StartNew();
for (int i = 0; i < repeat; i++)
{
PerformWork();
}
sw.Stop();
Console.WriteLine("Time taken: {0}ms", sw.Elapsed.TotalMilliseconds / repeat);
In my real code, I will add a GC.Collect call to put the managed heap into a known state, and add Sleep calls so that different intervals of code can be easily separated in an ETW profile.
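A sketch of what that combination looks like (the sleep length and repeat count are just illustrative values):
PerformWork();                     // warm up

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();                      // put the managed heap into a known state

Thread.Sleep(500);                 // gap so this region stands out in the ETW trace

int repeat = 1000;
Stopwatch sw = Stopwatch.StartNew();
for (int i = 0; i < repeat; i++)
{
    PerformWork();
}
sw.Stop();
Console.WriteLine("Time taken: {0}ms", sw.Elapsed.TotalMilliseconds / repeat);

Thread.Sleep(500);                 // gap before the next measured region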
Since I do not care too much about precision, I ended up comparing them. I am capturing lots of packets on the network and I want to record the time when I receive each packet. Here is code that tests 5 million iterations:
int iterations = 5000000;
// Test using datetime.now
{
var date = DateTime.UtcNow.AddHours(DateTime.UtcNow.Second);
var now = DateTime.UtcNow;
for (int i = 0; i < iterations; i++)
{
if (date == DateTime.Now)
Console.WriteLine("it is!");
}
Console.WriteLine($"Done executing {iterations} iterations using datetime.now. It took {(DateTime.UtcNow - now).TotalSeconds} seconds");
}
// Test using datetime.utcnow
{
var date = DateTime.UtcNow.AddHours(DateTime.UtcNow.Second);
var now = DateTime.UtcNow;
for (int i = 0; i < iterations; i++)
{
if (date == DateTime.UtcNow)
Console.WriteLine("it is!");
}
Console.WriteLine($"Done executing {iterations} iterations using datetime.utcnow. It took {(DateTime.UtcNow - now).TotalSeconds} seconds");
}
// Test using stopwatch
{
Stopwatch sw = new Stopwatch();
sw.Start();
var now = DateTime.UtcNow;
for (int i = 0; i < iterations; i++)
{
if (sw.ElapsedTicks == DateTime.Now.Ticks)
Console.WriteLine("it is!");
}
Console.WriteLine($"Done executing {iterations} iterations using stopwatch. It took {(DateTime.UtcNow - now).TotalSeconds} seconds");
}
The output is:
Done executing 5000000 iterations using datetime.now. It took 0.8685502 seconds
Done executing 5000000 iterations using datetime.utcnow. It took 0.1074324 seconds
Done executing 5000000 iterations using stopwatch. It took 0.9625021 seconds
So, in conclusion, DateTime.UtcNow is the fastest if you do not care too much about precision. This also supports the answer https://stackoverflow.com/a/6986472/637142 from this question.