Is there a good way to find the bottleneck in your app? - c#

I am trying to figure out the best way to find out which portions of my application take the longest time to run (largest run cost). The application is not overly complex, but I want to make sure that all of the pieces are properly tuned so that I could potentially handle a greater load.
Application: Loads / shreds XML documents and dumps the contents into a DB. The application uses LINQ to XML to parse the XML, and SQL Server TVPs to pass the data down to the DB. Because I am using TVPs, I have only one round trip to the DB even when there are collections of data. The data is not big (XML files of at most 1 MB).
Any suggestions on how to isolate the bottlenecks would be greatly appreciated.
As always greatly appreciate the feedback.

You may want to check out the Stopwatch class. You can sprinkle it into your code like this:
// load XML method
var stopWatch = new Stopwatch();
stopWatch.Start();
// run XML parsing code
stopWatch.Stop();
var xmlTime = stopWatch.Elapsed;

// SQL Server dump method
stopWatch.Restart(); // reuse the same instance; Restart resets the elapsed time
// dump to SQL Server
stopWatch.Stop();
var sqlTime = stopWatch.Elapsed;
This is a low-tech way to take general measurements. For a simple application this is probably more efficient than a profiler, since your application only has two real candidates for a bottleneck. That said, learning how to use a profiler may be worth your while.

Based on Nate's answer, you could make things easier and use a small helper method for this purpose.
public static long MeasureTime(Action myAction)
{
    var stopWatch = Stopwatch.StartNew();
    myAction();
    stopWatch.Stop();
    return stopWatch.ElapsedMilliseconds;
}
Sample usage:
StringBuilder result = null;
Console.WriteLine("Parse-Xml: {0} ms", MeasureTime(() => result = MyAction("Test.xml")));

The common way to do this is to use a profiling tool. I've used RedGate ANTS for profiling C# applications, and it works well, but there are plenty of alternatives.

Related

Pass a timestamp and calculate function time from c++ through to C#

I am passing data from C++ to C# via a DLL and a callback. I would like to measure the time that this function takes.
I currently have in C++:
std::chrono::milliseconds ms = std::chrono::duration_cast<std::chrono::milliseconds>(
std::chrono::system_clock::now().time_since_epoch());
I am passing the ms variable through to C#. I then have:
long millisecondsSinceEpoch = DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond;
Console.WriteLine("latency: " + Convert.ToString(millisecondsSinceEpoch - ms));
That prints:
latency: 63802044110874
Am I doing this correctly?
If so, how can I use that value to calculate the milliseconds between the two timestamps in a readable way?
Am I doing this correctly?
No. If the goal is to measure any kind of performance you need to use the high-resolution performance counters, not the system clock. The system clock typically has a resolution of 1-16 ms, far too low for something like a method call.
In C# it is trivial to get an accurate measurement:
var sw = Stopwatch.StartNew();
// Do method call
sw.Stop();
var elapsedTime = sw.Elapsed;
If you absolutely need to start your measuring from the C++ side, you should be able to call QueryPerformanceCounter directly; see the Stopwatch source. That is Windows-specific; the portable standard C++ equivalent is std::chrono::steady_clock, a monotonic clock intended exactly for this kind of interval measurement.
Note that ideally you should be using something like BenchmarkDotNet, which runs the code many times and does the statistics needed to get accurate values.
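If pulling in the library is not an option, a rough home-grown sketch of the same idea is to run the call many times and take the median rather than a single reading (this is only an approximation of what BenchmarkDotNet automates; the workload in Main is a made-up placeholder):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class MedianTimer
{
    // Run the action many times and report the median elapsed time;
    // the median damps JIT, GC and scheduler noise better than one reading.
    public static double MedianMilliseconds(Action action, int iterations = 25)
    {
        action(); // warm-up call so JIT compilation is not part of the measurement
        var samples = new double[iterations];
        var sw = new Stopwatch();
        for (int i = 0; i < iterations; i++)
        {
            sw.Restart();
            action();
            sw.Stop();
            samples[i] = sw.Elapsed.TotalMilliseconds;
        }
        Array.Sort(samples);
        return samples[iterations / 2];
    }

    static void Main()
    {
        // Placeholder workload; substitute the real method call.
        double ms = MedianMilliseconds(() => string.Join(",", Enumerable.Range(0, 1000)));
        Console.WriteLine($"median: {ms:F3} ms");
    }
}
```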

How to reliably measure code efficiency/complexity/performance/expensiveness in C#?

My question consists of 2 parts:
Is there any good way in C# to measure computational effort other than using timers such as Stopwatch? Below is what I have been doing, but the granularity is not great, and the result varies every run. I am wondering if there is a more precise measure, such as a CPU operation count, so that the result is consistent.
Stopwatch stopWatch = new Stopwatch();
stopWatch.Start();
//do work
stopWatch.Stop();
TimeSpan ts = stopWatch.Elapsed;
Console.WriteLine(ts);
If the alternative approach in 1 is not possible, how can I make the performance test results less variable? What are some factors that can make the results change? Would closing all other running applications help? (I did try that, but there seems to be no significant effect.) How about running the test on a VM, in a sandbox, etc.?
(After typing the preceding text I realized that I have also tried the Performance Analysis feature that comes with Visual Studio. The results seem coarser because of the sampling method it uses, so I want to rule out that option.)
You need to get a profiling tool. But you can use Stopwatch more reliably if you run your tests in a loop multiple times and only take the result of a test when the garbage collection count stays the same.
Like this:
var timespans = new List<TimeSpan>();
while (true)
{
    var count = GC.CollectionCount(0);
    var sw = Stopwatch.StartNew();
    /* run test here */
    sw.Stop();
    if (count == GC.CollectionCount(0))
    {
        timespans.Add(sw.Elapsed);
    }
    if (timespans.Count == 100)
    {
        break;
    }
}
That'll give you 100 tests where garbage collection didn't occur. The average is then pretty good to work from.
If you find that your tests never run without invoking a garbage collection then try working out the minimum number of GC's that get triggered and collect your time spans only when that number occurs.
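That variant might look like the sketch below, where minGCs stands for the smallest number of collections you measured the test always triggering (an assumed value, not something the runtime tells you), and the workload in Main is a placeholder:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class GcAwareTiming
{
    // Keep a sample only when the test triggered no more than minGCs
    // gen-0 collections during the timed region.
    public static List<TimeSpan> CollectSamples(Action test, int minGCs, int sampleCount)
    {
        var timespans = new List<TimeSpan>();
        while (timespans.Count < sampleCount)
        {
            int before = GC.CollectionCount(0);
            var sw = Stopwatch.StartNew();
            test();
            sw.Stop();
            if (GC.CollectionCount(0) - before <= minGCs)
                timespans.Add(sw.Elapsed);
        }
        return timespans;
    }

    static void Main()
    {
        // Placeholder workload; substitute the real test here.
        var samples = CollectSamples(() => new byte[1024].AsSpan().Fill(1),
                                     minGCs: 1, sampleCount: 100);
        Console.WriteLine($"kept {samples.Count} samples");
    }
}
```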
You could query a system performance counter. The MSDN doc for the System.Diagnostics.PerformanceCounter class has some examples. With this class you could query "\Process(your_process_name)\% Processor Time", for example. It's an alternative to Stopwatch, but to be honest I think just using Stopwatch and averaging many runs over time is a perfectly good way to go.
If what you need is a higher resolution stopwatch because you are trying to measure a very small slice of cpu time, then you may be interested in the High-Performance Counter.
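Stopwatch already wraps that counter; if you want the raw counter values instead of a Stopwatch instance, a small sketch like this exposes them (the timed region is a placeholder):

```csharp
using System;
using System.Diagnostics;

class RawTimestamps
{
    static void Main()
    {
        // True when an OS high-resolution performance counter backs the
        // timer (QueryPerformanceCounter on Windows).
        Console.WriteLine($"High resolution: {Stopwatch.IsHighResolution}");
        Console.WriteLine($"Counter ticks per second: {Stopwatch.Frequency}");

        long start = Stopwatch.GetTimestamp();
        // ... code under test ...
        long end = Stopwatch.GetTimestamp();

        // Raw counter ticks are converted to time via the counter frequency.
        double ms = (end - start) * 1000.0 / Stopwatch.Frequency;
        Console.WriteLine($"Elapsed: {ms:F4} ms");
    }
}
```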

C# calculate time stamp [duplicate]

This question already has answers here:
Measuring code execution time
(7 answers)
Closed 9 years ago.
I want to check the amount of time it takes for a specific function to run, so that I can add an accurate time delay to my program...
I tried to write this before and after my function:
public static string ToTimeString(this DateTime value)
{
    return value.ToString("mmssffff");
}
Then I calculated the difference between both results, and I got 350. I do not know if that is microseconds or milliseconds...
Do you know the meaning of the result?(350)
Do you have another idea to do this?
Thanks!
Use Stopwatch for performance profiling.
Stopwatch sw = new Stopwatch();
sw.Start();
SayHello();
sw.Stop();
Console.WriteLine("Total time elapsed: {0} milliseconds", sw.Elapsed.TotalMilliseconds);
I normally use something like this:
Stopwatch sw = Stopwatch.StartNew();
// rest of the code
sw.Stop();
Console.WriteLine("Total time (ms): {0}", sw.ElapsedMilliseconds);
You can use profilers for this task. In many cases a single method seems to take a lot of time, but a sub-method that should execute instantly takes most of it. In fact, I was working on a Java project and observed that the solution took an unreasonably long time. I used a profiler there and saw that a simple object instantiation took most of the time. The problem was, of course, that the Java library initialized a whole lot of things in that constructor, so I chose an alternative instead. So, I think you should use a profiler; read more here and here to start working with C# profilers.

How to improve performance of a method

I have a pretty big method
where I have some C# calculation and I am also calling 3-4 stored procedures,
constructing 3-4 objects and finally adding them to a list and returning the list.
My target is to improve the performance of this method so that it takes less time to execute.
My question is: is there any way I can check each part of the method and find out which part is taking the most time to execute?
Maybe some logging or something!
I am using LINQ to EF.
Invest in a performance profiler, like ANTS from Red Gate. Some of the better versions of Visual Studio also come with one.
At the least, you could try using System.Diagnostics.Stopwatch
From msdn:
static void Main(string[] args)
{
    Stopwatch stopWatch = new Stopwatch();
    stopWatch.Start();
    Thread.Sleep(10000);
    stopWatch.Stop();
    TimeSpan ts = stopWatch.Elapsed;
    string elapsedTime = String.Format("{0:00}:{1:00}:{2:00}.{3:00}",
        ts.Hours, ts.Minutes, ts.Seconds,
        ts.Milliseconds / 10);
    Console.WriteLine("RunTime " + elapsedTime);
}
If possible, you can try executing your stored procedures in parallel. I've seen this improve performance quite a bit, especially if your stored procedures just do reads and no writes.
It might look something like this:
ConcurrentBag<Result> results = new ConcurrentBag<Result>();
Parallel.Invoke(
    () => {
        var db = new DatabaseEntities();
        Result result1 = db.StoredProcedure1();
        results.Add(result1);
    },
    () => {
        var db = new DatabaseEntities();
        Result result2 = db.StoredProcedure2();
        results.Add(result2);
    },
    () => {
        var db = new DatabaseEntities();
        Result result3 = db.StoredProcedure3();
        results.Add(result3);
    }
);
return results;
I'm using a ConcurrentBag here instead of a List because it is thread safe.
What you're looking for is a profiler - a profiler runs your program and tells you how much time each line of code took to execute, as well as how long it took to execute as a percentage of the total execution time.
A great C# profiler is the ANTS .Net Profiler, it's rather expensive, but it has a 14 day free trial - I think this would be perfect for your needs.
You have several options. I find myself using stopwatches to test this kind of thing. However, before you do anything, are you sure the code isn't already performing well enough? If it ain't broke, don't fix it is often the best advice. If you're still interested you can do this kind of thing:
Stopwatch sw = Stopwatch.StartNew();
// do some code stuff here
sw.Stop();
Console.WriteLine(sw.ElapsedTicks);
You also have seconds, milliseconds and other measurements available on the sw variable.
My advice would be to use JetBrains dotTrace; it has some very helpful functionality that points out hotspots and tells you which piece of code took how long.
P.S. It has saved my neck a few times.
Database accesses are generally orders of magnitude slower than any calculations that you might make (unless you are trying to predict tomorrow's weather). So the LINQ-to-EF part is most probably where time gets lost.
You can use profilers to analyse a program. SQL Server has a profiler that allows you to monitor queries. If you want to analyse the code, google for .NET profilers and you will find quite a few that have a free licence. Or buy one if you find it useful. The EQATEC profiler was quite useful for me.
If you have a big method, your code is badly structured. Making a big method does not make it faster than splitting it into smaller logical parts. Smaller parts will be easier to maintain, and code profilers will yield more useful information, since they often only report per-method totals and don't show times for individual lines of code.

Testing your code for speed?

I'm a total newbie, but I was writing a little program that worked on strings in C# and I noticed that if I did a few things differently, the code executed significantly faster.
So it had me wondering: how do you go about clocking your code's execution speed? Are there any (free) utilities? Or do you go about it the old-fashioned way with a System.Timer and do it yourself?
What you are describing is known as performance profiling. There are many programs you can get to do this, such as the JetBrains or ANTS profilers, although most will slow down your application while measuring its performance.
To hand-roll your own performance profiling, you can use System.Diagnostics.Stopwatch and a simple Console.WriteLine, like you described.
Also keep in mind that the C# JIT compiler optimizes code depending on the type and frequency it is called, so play around with loops of differing sizes and methods such as recursive calls to get a feel of what works best.
ANTS Profiler from RedGate is a really nice performance profiler. dotTrace Profiler from JetBrains is also great. These tools will allow you to see performance metrics that can be drilled down the each individual line.
Screen shot of ANTS Profiler: http://www.red-gate.com/products/ants_profiler/images/app/timeline_calltree3.gif
If you want to ensure that a specific method stays within a specific performance threshold during unit testing, I would use the Stopwatch class to monitor the execution time of the method, one or many times in a loop, calculate the average, and then Assert against the result.
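That idea might look roughly like the sketch below; LoadDocument and the 200 ms budget are made-up placeholders, and the final check stands in for a test framework's Assert:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class PerfThresholdTest
{
    // Hypothetical method under test; substitute the real call.
    static void LoadDocument() => Thread.Sleep(1);

    // Average the cost over several runs so one slow run doesn't fail the test.
    public static double MeasureAverageMs(Action action, int runs)
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < runs; i++) action();
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds / runs;
    }

    static void Main()
    {
        const double budgetMs = 200.0; // assumed budget; tune for your method
        double averageMs = MeasureAverageMs(LoadDocument, runs: 10);

        // In a real test framework this would be Assert.IsTrue(averageMs < budgetMs).
        if (averageMs >= budgetMs)
            throw new Exception($"Too slow: {averageMs:F1} ms average");
        Console.WriteLine($"Average: {averageMs:F1} ms (budget {budgetMs} ms)");
    }
}
```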
Just a reminder - make sure to compile in Relase, not Debug! (I've seen this mistake made by seasoned developers - it's easy to forget).
What you are describing is 'performance tuning'. When we talk about performance tuning there are two angles to it: (a) response time, how long it takes to execute a particular request/program, and (b) throughput, how many requests it can execute in a second. When we 'optimize' by eliminating unnecessary processing, both response time and throughput improve. However, if you have wait events in your code (like Thread.Sleep() or I/O waits), your response time is affected but throughput is not. By adopting parallel processing (spawning multiple threads) we can improve response time, but throughput will not improve. Typically for server-side applications both response time and throughput are important. For desktop applications (like an IDE) throughput is not important; only response time is.
You can measure response time by 'performance testing': just note down the response time for all key transactions. You can measure throughput by 'load testing': you pump requests continuously from a sufficiently large number of threads/clients such that the CPU usage of the server machine is 80-90%. When we pump requests we need to maintain the ratio between different transactions (called the transaction mix); e.g. in a reservation system there will be 10 bookings for every 100 searches, one cancellation for every 10 bookings, etc.
After identifying the transactions require tuning for response time (performance testing) you can identify the hot spots by using a profiler.
You can identify the hot spots for throughput by comparing response time * fraction of that transaction. Assume in the search/booking/cancellation scenario the ratio is 89:10:1 and the response times are 0.1 s, 10 s and 15 s.
load for search = 0.1 * .89 = 0.089
load for booking = 10 * .10 = 1
load for cancel = 15 * .01 = 0.15
Here, tuning booking will yield the maximum impact on throughput.
You can also identify hot spots for throughput by taking thread dumps (in the case of java based applications) repeatedly.
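For completeness, the weighted-load arithmetic above can be expressed in a few lines (the ratios and response times are taken from the example):

```csharp
using System;

class ThroughputHotspots
{
    static void Main()
    {
        // (transaction, fraction of the mix, response time in seconds)
        var mix = new (string Name, double Fraction, double Seconds)[]
        {
            ("search",  0.89, 0.1),
            ("booking", 0.10, 10.0),
            ("cancel",  0.01, 15.0),
        };

        foreach (var t in mix)
        {
            // Contribution to overall load = fraction of requests * response time.
            Console.WriteLine($"load for {t.Name} = {t.Fraction * t.Seconds:F3}");
        }
        // booking contributes 1.000, the largest share, so tune it first
    }
}
```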
Use a profiler.
Ants (http://www.red-gate.com/Products/ants_profiler/index.htm)
dotTrace (http://www.jetbrains.com/profiler/)
If you need to time one specific method only, the Stopwatch class might be a good choice.
I do the following things:
1) I use ticks (e.g. in VB.Net, Now.Ticks) to capture the current time. I subtract the starting ticks from the finishing ticks and divide by TimeSpan.TicksPerSecond to get how many seconds it took.
2) I avoid UI operations (like console.writeline).
3) I run the code over a substantial loop (like 100,000 iterations) to factor out usage / OS variables as best I can.
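Translated to C#, points 1 and 3 look something like this (DoWork is a placeholder for the code being measured):

```csharp
using System;

class TickTiming
{
    static void DoWork() { /* placeholder for the code being measured */ }

    static void Main()
    {
        const int iterations = 100_000; // substantial loop, per point 3
        long startTicks = DateTime.Now.Ticks;
        for (int i = 0; i < iterations; i++) DoWork();
        long endTicks = DateTime.Now.Ticks;

        // One tick is 100 ns; TimeSpan.TicksPerSecond converts ticks to seconds.
        double seconds = (endTicks - startTicks) / (double)TimeSpan.TicksPerSecond;
        Console.WriteLine($"Elapsed: {seconds:F3} s");
    }
}
```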
You can use the Stopwatch class to time methods. Remember that the first run is often slow because the code has to be JIT-compiled.
There is a native .NET option (Team Edition for Software Developers) that might address some performance analysis needs. From the 2005 .NET IDE menu, select Tools->Performance Tools->Performance Wizard...
[GSS is probably correct that you must have Team Edition]
This is a simple example of testing code speed. I hope it helps.
class Program {
    static void Main(string[] args) {
        const int steps = 10000;
        Stopwatch sw = new Stopwatch();
        ArrayList list1 = new ArrayList();
        sw.Start();
        for (int i = 0; i < steps; i++) {
            list1.Add(i);
        }
        sw.Stop();
        Console.WriteLine("ArrayList:\tMilliseconds = {0},\tTicks = {1}", sw.ElapsedMilliseconds, sw.ElapsedTicks);

        MyList list2 = new MyList();
        sw.Restart(); // Restart, otherwise the second measurement includes the first
        for (int i = 0; i < steps; i++) {
            list2.Add(i);
        }
        sw.Stop();
        Console.WriteLine("MyList: \tMilliseconds = {0},\tTicks = {1}", sw.ElapsedMilliseconds, sw.ElapsedTicks);
    }
}
