I have a lengthy method that writes data into a database, and it is called repeatedly. I also maintain a counter of the records written so far, as well as the total number of records that need to be written, like so:
private int currentCount;
private int totalCount;
private double fAverageTransferRate;

bool processingMethod()
{
    // Processes one record at a time
    DateTime dtNow = DateTime.Now; // time now
    fAverageTransferRate = //?
}
I know that to calculate a transfer rate I need the number of records written per second, but that raises two questions:
How would I time my calculation to hit the 1-second mark exactly?
And, most of all, how do I calculate an average transfer rate?
PS. I need this done on the go, so to speak, while this method is running (and not after it is finished).
You could think about it a different way, since what you're really interested in is the rate of processing records. You don't need the calculation to happen at precisely 1-second intervals; rather, it needs to happen roughly every second, and you need to know exactly how much time has elapsed when it does.
To calculate the average transfer rate, just keep a running count of the number of records you are transferring. If more than 1 second has elapsed since the last time you computed the average, it's time to compute it anew: divide the running count by the elapsed time, then zero out the running count in preparation for the next round.
Pseudo-code follows:
// somewhere outside:
DateTime lastDoneTime = DateTime.MinValue;
int numProcessed = 0;

bool processingMethod()
{
    DateTime dtNow = DateTime.Now; // time now
    if (lastDoneTime == DateTime.MinValue) lastDoneTime = dtNow;

    numProcessed++; // one more record handled by this call

    double elapsedSeconds = (dtNow - lastDoneTime).TotalSeconds;
    if (elapsedSeconds > 1)
    {
        fAverageTransferRate = numProcessed / elapsedSeconds; // records per second
        // Do what you want with fAverageTransferRate
        lastDoneTime = dtNow;
        numProcessed = 0;
    }
    // ... process the record, update currentCount, and return success/failure
}
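If you later find that DateTime.Now is too coarse for this, a Stopwatch-based variant of the same idea works too. This is just a minimal sketch under that assumption; the type and member names are illustrative, not from the original code:

using System;
using System.Diagnostics;

class TransferRateTracker
{
    private readonly Stopwatch interval = Stopwatch.StartNew();
    private int numProcessed;

    public double AverageTransferRate { get; private set; }

    // Call this once per record written.
    public void RecordProcessed()
    {
        numProcessed++;
        double seconds = interval.Elapsed.TotalSeconds;
        if (seconds > 1.0)
        {
            AverageTransferRate = numProcessed / seconds; // records per second
            numProcessed = 0;
            interval.Restart();
        }
    }
}

The rate is only refreshed about once per second, which matches the requirement of computing it on the go while the method keeps running.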
I have tried to write several programs involving date/time (including Random.Next()), but I keep running into the same problem: every time I use a Console.ReadLine() statement (or some other statement that pauses and resumes the code), it seems to refresh the variables set from the date/time, as well as every variable derived from them. Is there a way to store the time the program was run instead of the current time? (I'm using an online editor, dotnetfiddle.net, if that changes anything.)
Here are a few examples:
using System;

namespace timeTest
{
    public class Program
    {
        public static void Main()
        {
            Random random = new Random();               // Random is seeded from the time, which keeps advancing
            Console.WriteLine("Randomizer: " + random.Next(1, 11));   // prints a random number
            long time = DateTime.Now.Ticks;
            Console.WriteLine("Time in ticks: " + time);              // prints the time in ticks
            int hr = DateTime.Now.Hour;
            int min = DateTime.Now.Minute;
            int sec = DateTime.Now.Second;
            Console.WriteLine("Military time: " + hr + ":" + min + ":" + sec);   // prints the time on the clock
            int sec2 = sec * 2;                         // stores twice the value in a different variable
            Console.WriteLine("2x the current second: " + sec2);      // prints twice the second
            while (true) { Console.ReadLine(); }        // every time Console.ReadLine is called, it changes again
        }
    }
}
DateTime.Now is a calculated property: it returns the current time at the exact moment it is called. When you inspect it under the debugger, it shows the value at the moment the debugger evaluates it. Your program won't notice that, but it is something to keep in mind.
However, you have a conceptual error here:
int hr = DateTime.Now.Hour;
int min = DateTime.Now.Minute;
int sec = DateTime.Now.Second;
Because DateTime.Now is evaluated on each access, every one of these lines gets a fresh reading of the time, so the parts may not match each other. For example, if the hour rolls over between the first and second line, the displayed time can be off by an hour.
Evaluate the value once:
DateTime now = DateTime.Now;
int hr = now.Hour;
int min = now.Minute;
int sec = now.Second;
You'll note that in this case, the value of now won't change, even if you stop in the debugger.
Note that there are also standard formatting options that save you from splitting out the components yourself; just do:
Console.WriteLine($"{now:T}"); // standard (localized) time format
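Coming back to the original question: if you want the time the program was started (rather than the current moment), read DateTime.Now once at startup and keep that value. A minimal sketch, with an illustrative field name:

using System;

public class Program
{
    // Captured once when the type is initialized; it never changes afterwards.
    private static readonly DateTime startedAt = DateTime.Now;

    public static void Main()
    {
        Console.WriteLine("Program started at: " + startedAt.ToString("T"));
        Console.ReadLine(); // waiting for input does not change startedAt
        Console.WriteLine("Still started at:   " + startedAt.ToString("T"));
    }
}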
I was comparing lookup times for a specific element in a ConcurrentBag using .ElementAt() and found a strange difference between searching for the element at index 950,000 and the element at index 1,000,000.
Finding the element at index 950,000 took between 62 and 68 milliseconds.
Finding the element at index 1,000,000 took between 20 and 23 milliseconds.
And I'm not sure why that is.
The code looks like this:
ConcurrentBag<int?> concurrentBag = new ConcurrentBag<int?>();
int n = 1000000;
int? n1 = n;
for (int i = 0; i <= n1; i++)
{
    concurrentBag.Add(i);
}

DateTime before = DateTime.Now;
int? a = concurrentBag.ElementAt(n);
DateTime after = DateTime.Now;
TimeSpan time = after - before;
Console.WriteLine(time.TotalMilliseconds);
You should not be using DateTime for accuracy checks and benchmarks over intervals this small. When I run this code I get anywhere from 11 ms to 70 ms each time; it is simply not consistent.
You are also timing one single lookup. Your machine could be doing any number of other things that affect the speed of a single lookup. You should run this code many thousands of times and take the average to have any sort of valid data.
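As a rough sketch of what that could look like, using Stopwatch and averaging over many lookups (the iteration count here is arbitrary):

using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;

class Program
{
    static void Main()
    {
        var concurrentBag = new ConcurrentBag<int?>();
        for (int i = 0; i <= 1000000; i++)
        {
            concurrentBag.Add(i);
        }

        const int iterations = 1000; // arbitrary; more iterations smooth out the noise
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            int? a = concurrentBag.ElementAt(950000);
        }
        sw.Stop();

        Console.WriteLine("Average lookup: {0} ms", sw.Elapsed.TotalMilliseconds / iterations);
    }
}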
I'm trying to do something like https://www.humanbenchmark.com/ with Xamarin on Android. I'm stuck on converting a TimeSpan to an integer. This is my piece of code:
private void AfterClicked()
{
    ScreenClickButton.SetBackgroundColor(Android.Graphics.Color.Red);
    Random rnd = new Random();
    int seconds = rnd.Next(1, 11);
    DateTime startTime1;
    startTime1 = DateTime.Now;
    TimeSpan timeElapsed = DateTime.Now - startTime1;
    if (timeElapsed == seconds)
    {
        ScreenClickButton.SetBackgroundColor(Android.Graphics.Color.Green);
    }
}
If you look at the docs for TimeSpan, you'll see that it has a Seconds property:
if (timeElapsed.Seconds == seconds)
However, this is probably not going to work: you are essentially creating two DateTime objects in a row and comparing them - they will not be identical but will also not differ by any meaningful amount, and certainly not by more than a second.
If you are trying to measure user reaction time, you probably want to establish the benchmark timestamp before they click the button, not after.
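A rough sketch of that ordering, assuming a field that records when the screen turned green (the field and method names here are illustrative, not from the original code):

// Assumed field, set at the moment the screen turns green:
private DateTime greenShownAt;

private void OnScreenTurnedGreen()
{
    ScreenClickButton.SetBackgroundColor(Android.Graphics.Color.Green);
    greenShownAt = DateTime.Now; // benchmark timestamp taken before the user clicks
}

private void AfterClicked()
{
    // Elapsed time between the colour change and the click, i.e. the reaction time
    double reactionMs = (DateTime.Now - greenShownAt).TotalMilliseconds;
    // show reactionMs to the user however you like
}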
I have a text file with X number of records that have 24 pipe delimited fields.
ABCDEFG|123456|BILLING|1234567|12345678|12345678|...
My concern is with the BILLING column. I need to append the current date and a sequential number to this word, e.g. BILLING-20131021-1, but here is the trick: the digit must increment only for each 10% of the records. So, for example, if I have 100 records, the first ten of them will end with 1, the next ten will end with 2, and so on. If the count doesn't divide evenly, the remainder acquires the next sequence number.
I started with two loops, but that didn't produce the right results. The first loop iterates through the record count and the second iterates through the first 10% of the records, but then I can't figure out how to get to the next batch of records.
for (uint recordCount = 0; recordCount < RecordsPerBatch; recordCount++)
{
    for (uint smallCount = 0; smallCount < (RecordsPerBatch / 10); smallCount++)
    {
    }
}
You can simply loop through, keep a counter, and only increment the "small count" when you hit a defined condition, i.e.:
int smallCount = 0;
for (int recordCount = 0; recordCount < totalRecords; ++recordCount)
{
    // start a new batch every totalRecords / 10 records (record 0 starts batch 1)
    if (recordCount % (totalRecords / 10) == 0)
        ++smallCount;

    // smallCount is now the 1-based batch number for this record
}
Maintaining your current logic, you could add another variable that keeps the batch counter and simplify the condition of the inner loop by calculating the batch size (10% of the total records) up front.
It is also necessary to check that the index in the inner loop doesn't exceed the total record count.
uint TotalRecordCounter = 101;
uint currentBatch = 1;
uint batchSize = TotalRecordCounter / 10;

// This will account for totals that are not exactly divisible by 10.
// But if it is allowed to have more than 10 batches then remove it:
// if ((TotalRecordCounter % 10) != 0)
//     batchSize++;

for (uint recordCount = 0; recordCount < TotalRecordCounter; recordCount += batchSize)
{
    for (uint smallCount = 0;
         smallCount < batchSize && (recordCount + smallCount) < TotalRecordCounter;
         smallCount++)
    {
        // e.g. BILLING-20131021-3 for the third batch
        string billing = string.Format("BILLING-{0:yyyyMMdd}-{1}", DateTime.Today, currentBatch);
    }
    currentBatch++;
}
If I'm understanding correctly, the issue is figuring out when you hit that 10% (and 20%, and 30%, etc.) of the file threshold. Giving a good answer depends a lot on what your system is capable of, but there are several ways to do this with a single loop, and even in the worst case you only need simple, non-nested loops.
Can you find the exact number of lines in the file?
If your file isn't gigantic and it is line-delimited (one record per line), this is the easiest solution: just read the file into a string array. Then go through each line and generate the final number each time from the ratio current_record / exact_count, which tells you which 10% bucket you are in (a short sketch of this is given at the end of this answer).
Can you calculate the exact number of lines in the file?
If your records are fixed-length, you can take the file size, divide by the record size, and therefore calculate the exact number of records, and then generate the last number as above.
Is it sufficient to estimate the number of lines in the file?
Same idea as the previous suggestion, only use an estimate of your average record size.
What type of stream are you using to read the file?
If it's one where you can find the total length of the stream and your current position, you can calculate your percentage using that, instead of using row indices.
Final fallback suggestion
If none of the other suggestions work, you can always do a simple two-pass solution. In the first pass, read through and count the number of records; if the file fits into memory, store each record so you can process it in-memory, and if it doesn't, just count the entries and then read the file again for the actual work.
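Here is the promised sketch for the simplest case: the whole file fits in memory and there is one record per line. The file names, the Math.Min clamping of the last batch, and the simple Replace call are all assumptions made to keep the example short:

using System;
using System.IO;

class BillingBatcher
{
    static void Main()
    {
        string[] lines = File.ReadAllLines("records.txt");   // assumed input file name
        int total = lines.Length;
        int batchSize = Math.Max(1, total / 10);             // 10% of the records

        for (int i = 0; i < total; i++)
        {
            // Batch number 1..10; here the remainder of an uneven count is folded
            // into the last batch - adjust if it should get its own number instead.
            int batch = Math.Min(i / batchSize + 1, 10);
            string stamp = string.Format("BILLING-{0:yyyyMMdd}-{1}", DateTime.Today, batch);
            lines[i] = lines[i].Replace("BILLING", stamp);
        }

        File.WriteAllLines("records_out.txt", lines);        // assumed output file name
    }
}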
I have a while loop, and all it does is call a method. I have a timer around the outside of the loop and another timer that incrementally adds up the time the method call takes inside the loop. The outer timer reports about 17 seconds, while the total on the inner timer is 40 ms. The loop executes 50,000 times. Here is an example of the code:
long InnerTime = 0;
long OutterTime = 0;
Stopw1.Start();
int count = 1;
while (count <= TestCollection.Count)
{
    Stopw2.Start();
    Medthod1();
    Stopw2.Stop();
    InnerTime = InnerTime + Stopw2.ElapsedMilliseconds;
    Stopw2.Reset();
    count++;
}
Stopw1.Stop();
OutterTime = Stopw1.ElapsedMilliseconds;
Stopw1.Reset();
Any help would be much appreciated.
Massimo
You are comparing apples and oranges. Your outer timer measures the total time taken. Your inner timer measures the number of whole milliseconds taken by each call to Method1.
The ElapsedMilliseconds property "represents elapsed time rounded down to the nearest whole millisecond value." So you are rounding down to the nearest millisecond about 50,000 times.
If your call to Method1 takes, on average, less than 1 ms, then most of the time the ElapsedMilliseconds property will return 0 and your inner count will be much, much less than the actual time. In fact, your method takes about 0.3 ms on average, so you're lucky it even exceeded 1 ms about 40 times.
Use the Elapsed.TotalMilliseconds or ElapsedTicks property instead of ElapsedMilliseconds. Note that Elapsed.Ticks (TimeSpan ticks) are 100 ns each, i.e. 10,000 ticks per millisecond, whereas Stopwatch.ElapsedTicks counts in units of Stopwatch.Frequency.
What is this doing: TestCollection.Count?
I suspect your 17 seconds are being spent counting your 50,000 items over and over again.
Try changing this:
while (count <= TestCollection.Count) {
...
}
to this:
int total = TestCollection.Count;
while (count <= total) {
...
}
To add to what the others have already said, in general the compiler must re-evaluate any property, including
TestCollection.Count
on every single loop iteration, because the property's value could change from iteration to iteration.
Assigning the value to a local variable removes the need to re-evaluate it on every iteration.
The one exception that I'm aware of is for Array.Length, which benefits from an optimization specifically for arrays. This is referred to as Array Bounds Check Elimination.
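For completeness, a tiny illustration of the array pattern that benefits from that optimization (the array here is just an example):

int[] data = new int[50000];

// When the loop condition is written directly against data.Length like this,
// the JIT can remove the per-iteration array bounds check.
for (int i = 0; i < data.Length; i++)
{
    data[i] = i;
}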
To get a correct measurement of the time that your calls take, you should use ticks rather than whole milliseconds.
Please try the following:
long InnerTime = 0;
long OutterTime = 0;
Stopwatch Stopw1 = new Stopwatch();
Stopwatch Stopw2 = new Stopwatch();

Stopw1.Start();
int count = 1;
int run = TestCollection.Count;
while (count <= run)
{
    Stopw2.Start();
    Medthod1();
    Stopw2.Stop();
    InnerTime = InnerTime + Stopw2.ElapsedTicks;
    Stopw2.Reset();
    count++;
}
Stopw1.Stop();
OutterTime = Stopw1.ElapsedTicks;
Stopw1.Reset();
You should not measure such a tiny method individually. But if you really want to, try this:
long innertime = 0;
while (count <= TestCollection.Count)
{
    // Stopwatch.GetTimestamp() and Stopwatch.Frequency are static members
    innertime -= Stopwatch.GetTimestamp();
    Medthod1();
    innertime += Stopwatch.GetTimestamp();
    count++;
}
Console.WriteLine("{0} ms", innertime * 1000.0 / Stopwatch.Frequency);