How do you check if 2 OffsetDateTimes lie within another 2 OffsetDateTimes? - c#

Given a POCO Event { OffsetDateTime Start, OffsetDateTime End } and a POCO Trial { OffsetDateTime Start, OffsetDateTime End },
where trials typically span hours and events happen over a few seconds,
how can I test whether an Event happened within a Trial?
The naive code that came before used: event.Start > trial.Start && event.Start < trial.End,
but after converting to NodaTime those comparisons are no longer valid.
I suspect I can't without making some assumptions about how they should be converted to instants and intervals, considering both Event and Trial come from a third-party library, which should probably be using zoned types, or instants, rather than OffsetDateTimes.

Note: this answer aims at "trial completely contains event" - for "trial overlaps event", see Matt Johnson's answer.
OffsetDateTime.ToInstant is unambiguous, so you could certainly just convert to Instant values. You might want to create an interval from the trial though:
// (the interval needs a name other than trial, which is already the POCO;
//  @event because "event" is a C# keyword)
Interval trialInterval = new Interval(trial.Start.ToInstant(), trial.End.ToInstant());
if (trialInterval.Contains(@event.Start.ToInstant()) &&
    trialInterval.Contains(@event.End.ToInstant()))
{
    ...
}
One potential wrinkle of this is that the end point of an interval is exclusive... so if event.End and trial.End are the same instant, the above will not enter the if statement body.

I could be wrong, but it seems you were wanting to know if the trial and the event overlapped. Assuming that your ranges are half-open intervals (inclusive start, exclusive end) - then you would test for overlap with:
if (trial.Start.ToInstant() < @event.End.ToInstant() &&
    trial.End.ToInstant() > @event.Start.ToInstant())
{
    ...
}
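Since both checks reduce to plain instant ordering, the half-open overlap and containment logic itself can be illustrated with plain DateTimeOffset values, independent of NodaTime. The helper names below are hypothetical stand-ins, not the third-party POCOs from the question:

```csharp
// Half-open semantics: [start, end), i.e. inclusive start, exclusive end.
// Overlaps is the test from this answer; Contains mirrors applying
// Interval.Contains to both endpoints of the event.
bool Overlaps(DateTimeOffset aStart, DateTimeOffset aEnd,
              DateTimeOffset bStart, DateTimeOffset bEnd) =>
    aStart < bEnd && aEnd > bStart;

bool Contains(DateTimeOffset trialStart, DateTimeOffset trialEnd,
              DateTimeOffset eventStart, DateTimeOffset eventEnd) =>
    trialStart <= eventStart && eventEnd < trialEnd;

var trialStart = new DateTimeOffset(2024, 1, 1, 10, 0, 0, TimeSpan.Zero);
var trialEnd   = new DateTimeOffset(2024, 1, 1, 14, 0, 0, TimeSpan.Zero);
var eventStart = new DateTimeOffset(2024, 1, 1, 11, 0, 0, TimeSpan.Zero);
var eventEnd   = new DateTimeOffset(2024, 1, 1, 11, 0, 5, TimeSpan.Zero);

Console.WriteLine(Overlaps(trialStart, trialEnd, eventStart, eventEnd)); // True
Console.WriteLine(Contains(trialStart, trialEnd, eventStart, eventEnd)); // True
```

Note that an event starting exactly at trial.End does not overlap under these half-open semantics, which is the "wrinkle" mentioned above.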

Returning each value from for loop in c#

EDIT: I realized that I had been going about it completely the wrong way and after an overhaul, got it working. Thanks for the tips guys, I'll keep them in mind for the future.
I've hit an unusual problem in my program. What I need to do is find the difference between two times, divide it by 1.5 hours, then return the starting time followed by each 1.5 hour increment of the starting time. So if the time was 11:45 am - 2:45 pm, the time difference is three hours, 3/1.5 = 2, then return 11:45 am and 1:15 pm. At the moment, I can do everything except return more than one time. Depending on what I've tried, it returns either the initial time (11:45 am), the first increment (1:15 pm) or the end time (2:45 pm). So far I've tried a few different types of for and do/while loops. The closest I've come was simply concatenating the start time and the incremented time, but the start and end times can range anywhere from 3 - 6 hours so that's not a practical way to do it.
The latest thing I tried:
int i = 0;
do
{
    i++;
    //Start is the starting time, say 11:45 am
    start = start.AddMinutes(90);
    return start.ToShortTimeString();
} while (i < totalSessions); //totalSessions is the result of hours / 1.5
and I'm calling the function on a dynamic label (which is also in a for loop):
z[i] = new Label();
z[i].Location = new Point(PointX, PointZ);
z[i].Name = "sessionTime_" + i;
z[i].Text = getPlayTimes(dt.Rows[i][1].ToString());
tabPage1.Controls.Add(z[i]);
z[i].BringToFront();
PointZ += z[i].Height;
I'm pretty new to c# so I guess I've just misunderstood something somewhere.
I think you're trying to solve the problem the wrong way. Instead of returning each value as you come to it, create a collection (e.g. a List<string>) and push each of your results onto the list until your finishing condition is met.
Then return the whole list as your return value. This way you will have a nice self-contained function that doesn't have cross-concerns with other logic: it does only its one little task, but does it well.
Good luck!
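A minimal sketch of that approach (the method signature and the explicit time format are my own; the question's code used ToShortTimeString, which is culture-dependent):

```csharp
// Collect the start time plus each 90-minute increment into a list,
// then return the whole list instead of returning from inside the loop.
List<string> GetPlayTimes(DateTime start, DateTime end)
{
    int totalSessions = (int)((end - start).TotalHours / 1.5);
    var times = new List<string>();
    for (int i = 0; i < totalSessions; i++)
        times.Add(start.AddMinutes(90 * i).ToString("h:mm tt", CultureInfo.InvariantCulture));
    return times;
}

// 11:45 am - 2:45 pm is 3 hours; 3 / 1.5 = 2 sessions.
var result = GetPlayTimes(new DateTime(2024, 1, 1, 11, 45, 0),
                          new DateTime(2024, 1, 1, 14, 45, 0));
Console.WriteLine(string.Join(", ", result)); // 11:45 AM, 1:15 PM
```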
It's a bit difficult to determine your exact use case, as you haven't offered the complete code you are using. However, you can do what you are asking by using the yield return functionality:
public IEnumerable<string> GetPlayTimes()
{
    int i = 0;
    do
    {
        i++;
        //Start is the starting time, say 11:45 am
        start = start.AddMinutes(90);
        yield return start.ToShortTimeString();
    } while (i < totalSessions); //totalSessions is the result of hours / 1.5
}
And then use it like so:
foreach (var time in GetPlayTimes())
{
// Do something with time
}

Quartz.net - Issues with Adjusting and Speeding up SystemTime causing Misfires

For testing reasons I want to be able to adjust what time Quartz.Net currently thinks it is so I do not necessarily have to wait hours, days, or weeks in order to check that my code is working.
For this purpose I created the following simple function (it is in F# but could easily be done in C# or another language):
let SimulateTime = fun () ->
    currentTime <- DateTimeOffset.UtcNow
    timeDifferenceInSeconds <- (currentTime - lastCheckedTime).TotalSeconds
    simulatedTime <- simulatedTime.AddSeconds(timeDifferenceInSeconds * scaleTimeBy)
    lastCheckedTime <- currentTime
    simulatedTime
where currentTime, lastCheckedTime, and simulatedTime are all of type DateTimeOffset, and timeDifferenceInSeconds and scaleTimeBy are both of type float.
I then change SystemTime.Now and SystemTime.UtcNow to use the above function as follows :
SystemTime.Now <-
    Func<DateTimeOffset>(fun () -> SimulateTime())
SystemTime.UtcNow <-
    Func<DateTimeOffset>(fun () -> SimulateTime())
This approach was shown by Mark Seemann in a previous question of mine, which can be found here.
Now this mostly works, except that the longer function seems to cause things to be off by a decently wide margin. What I mean by this is that all of my triggers misfire. For example, if I have a trigger set to occur every hour and set scaleTimeBy to 60.0 so that every second that passes counts as a minute, it never actually triggers on time. If I have a misfire policy, the trigger can then go off, but the time it lists for when it activated can be as late as the half-hour mark (so a full 30 seconds later than it should have been in this example).
However I can do this :
Console.WriteLine(SimulateTime())
Thread.Sleep(TimeSpan.FromSeconds(60.0))
Console.WriteLine(SimulateTime())
And the difference between the two times output to the screen in this example will be exactly an hour, so the call itself doesn't seem like it should be adding as much of a time difference as it does.
Anyone have any advice on how to fix this issue or a better way of handling this problem?
Edit :
So the C# version of the SimulateTime function would be something like this :
public DateTimeOffset SimulateTime()
{
    currentTime = DateTimeOffset.UtcNow;
    double timeDifference = (currentTime - lastCheckedTime).TotalSeconds;
    simulatedTime = simulatedTime.AddSeconds(timeDifference * scaleTimeBy);
    lastCheckedTime = currentTime;
    return simulatedTime;
}
If that helps anyone with solving this problem.
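The scaling arithmetic itself can be checked deterministically by passing the real time in as a parameter instead of reading DateTimeOffset.UtcNow inside the function. This is a sketch with hypothetical names, not the code from the question:

```csharp
// State mirroring the fields in the question.
double scaleTimeBy = 60.0; // every real second counts as a simulated minute
var simulatedTime = new DateTimeOffset(2024, 1, 1, 0, 0, 0, TimeSpan.Zero);
var lastCheckedTime = simulatedTime;

// Same logic as SimulateTime, but `now` is injected so the behavior
// does not depend on the wall clock.
DateTimeOffset SimulateTime(DateTimeOffset now)
{
    double elapsedSeconds = (now - lastCheckedTime).TotalSeconds;
    simulatedTime = simulatedTime.AddSeconds(elapsedSeconds * scaleTimeBy);
    lastCheckedTime = now;
    return simulatedTime;
}

// One real second should advance simulated time by exactly one minute.
var start = simulatedTime;
var t1 = SimulateTime(lastCheckedTime.AddSeconds(1));
Console.WriteLine((t1 - start).TotalSeconds); // 60
```

Checking the function this way confirms the drift described below comes from the scheduler's idle behavior, not from the arithmetic.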
So the issue is misfires caused by the fact that Quartz.Net will idle and wait when it thinks it doesn't have any triggers occurring any time soon, to avoid making too many calls. By default it waits about 30 seconds, give or take, if it doesn't have any triggers occurring in that time span. The idleWaitTime variable is a TimeSpan set in QuartzSchedulerThread. When checking for triggers that might occur soon, it also uses the BatchTimeWindow from QuartzSchedulerResources.
Both idleWaitTime and BatchTimeWindow can be set in configuration/properties files, where they are called "org.quartz.scheduler.idleWaitTime" and "org.quartz.scheduler.batchTriggerAcquisitionFireAheadTimeWindow".
Based on its name, I thought BatchTimeWindow was just a bit of look-ahead when grabbing triggers (which I would want: if I'm speeding things up, I'd want a small idleWaitTime but a further look-ahead, because the few seconds you're waiting are actually minutes, so triggers will fire sooner than the scheduler thinks). However, the description of "org.quartz.scheduler.batchTriggerAcquisitionFireAheadTimeWindow" in pages covering the configuration properties implies that it can cause things to fire early and be less accurate. So to start, here is the code for modifying just idleWaitTime:
let threadpool = Quartz.Simpl.SimpleThreadPool()
let jobstore = Quartz.Simpl.RAMJobStore()
let idleWaitTime = TimeSpan.FromSeconds(30.0/scaleTimeBy)
let dbfailureretryinterval = TimeSpan(int64 15000)
Quartz.Impl.DirectSchedulerFactory.Instance.CreateScheduler("TestScheduler","TestInstance",threadpool,jobstore,idleWaitTime,dbfailureretryinterval)
let scheduler = Quartz.Impl.DirectSchedulerFactory.Instance.GetScheduler("TestScheduler")
You can create a scheduler with the idleWaitTime you want by using the DirectSchedulerFactory, which could probably use slightly better documentation. It also takes a bunch of parameters that you may or may not want to modify depending on what you are working on.
For threadpool I just use Quartz.Net's default SimpleThreadPool, because I do not care about messing with the threading at this time and would not want to explain how to do so unless that was the whole point of the question. Information on job stores is available here; I am using RAMJobStore because it is simpler than AdoJobStore, and it shouldn't matter for this example. The dbfailureretryinterval is another value I don't care about here (its value should matter the least, since I am not connecting to a database), so I just looked up its default.
For idleWaitTime, you might want to run more tests to figure out a good value, but I chose to scale its default of 30 seconds by scaleTimeBy, since that is what I'm using to scale how fast time passes. This should make it so that if the program simulates time at a much faster rate, the scheduler only remains idle for correspondingly smaller periods of time.
One important thing to note: when you create a scheduler this way, it is not returned; you need to make a separate call to get the scheduler you just created. I have no idea why, but I'm guessing that if you are creating several schedulers and not necessarily using all of them, it is better this way.
Now after all that, you are likely to still get a bit of a misfire rate. While it now idles for much smaller units of time (only a few seconds, which is potentially an acceptable margin depending on your use case), it still has the issue that it only checks whether a trigger is coming in the next few fractions of a second.
So let's see if adding time to batchTimeWindow helps matters.
let threadpool = Quartz.Simpl.SimpleThreadPool()
let threadexecutor = Quartz.Impl.DefaultThreadExecutor()
let jobstore = Quartz.Simpl.RAMJobStore()
let schedulepluginmap = System.Collections.Generic.Dictionary<String,Quartz.Spi.ISchedulerPlugin>()
let idleWaitTime = TimeSpan.FromSeconds(30.0/scaleTimeBy)
let maxBatchSize = 1
let batchTimeWindow = TimeSpan.FromSeconds(scaleTimeBy)
let scheduleexporter = Quartz.Simpl.RemotingSchedulerExporter()
Quartz.Impl.DirectSchedulerFactory.Instance.CreateScheduler("TestScheduler","TestInstance",threadpool,threadexecutor,jobstore,schedulepluginmap,idleWaitTime,maxBatchSize,batchTimeWindow,scheduleexporter)
let scheduler = Quartz.Impl.DirectSchedulerFactory.Instance.GetScheduler("TestScheduler")
Now this has even more variables that I don't care about for the purposes of this example, and I won't bother going over them, because adjusting batchTimeWindow actually makes things worse: it gets you back to misfiring by 30 minutes. So no, batchTimeWindow, while it looks like it might be useful, is not. Only modify idleWaitTime.
Ideally for this use case you would want a small wait time and a larger look-ahead time, but that option does not seem to be available.

.NET: How to get time for keypress as precisely as possible?

I am trying to build an application that does not have to be super performant in general, except for the timestamp of a keyboard press (or external controller button press). I would like the timestamp to be as precise as possible, preferably down to +/- 5ms. This is for a scientific application. What is the best programming paradigm to achieve this with minimal latency? Which of the following is preferable?
(a) Create a worker thread that runs at a higher priority and loops to see if a key was pressed, using Sleep(x) where x is less than 5.
(b) Create a keyboard hook, which has an asynchronous callback.
(c) Another option.
In addition to an answer, any code (or link to sample) would be much appreciated since I am a reasonably new dev.
EDIT: Nomenclature. Should have been more careful. By timestamp, I mean time in general, not necessarily the full Day Month Year Hour Minute Second Millisecond. I was planning on using the Stopwatch class from the beginning, because all I need is the time between the start of an event in the program and the time of the button press. Sorry if this was not clear.
From comments:
I am looking for the difference between the start of an event and the key press.
It's important to understand that that's very different to trying to get the actual timestamp of a key press.
This situation is exactly what Stopwatch is for. Yes, use that and be grateful you're not really trying to get a timestamp.
Just use Stopwatch.StartNew when the event starts, and Stopwatch.Stop on the key press event. There's the slight latency of the event handler firing, but that will be tiny compared with anything else - I'd be astonished if it caused you any problems here.
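A minimal sketch of that pattern; Thread.Sleep stands in here for whatever happens between the event starting and the key press arriving:

```csharp
// Start timing when your event starts...
var stopwatch = Stopwatch.StartNew();

// ...the program runs until the key press; a short sleep stands in for that wait...
Thread.Sleep(50);

// ...then stop in the key press handler and read the elapsed time.
stopwatch.Stop();
Console.WriteLine($"Elapsed: {stopwatch.ElapsedMilliseconds} ms");
```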
If you are looking for exact timestamp of keypress, DateTime mightn't be what you want to use:
Get DateTime.Now with milliseconds precision
DateTime has a lot of precision, but is fairly coarse in terms of
accuracy. Generally speaking, you can't. Usually the system clock
(which is where DateTime.Now gets its data from) has a resolution of
around 10-15ms. See Eric Lippert's blog post about precision and
accuracy for more details.
If you need more accurate timing than this, you may want to look into
using an NTP client.
And the referenced link from that post:
http://blogs.msdn.com/b/ericlippert/archive/2010/04/08/precision-and-accuracy-of-datetime.aspx
Now, the question “how much time has elapsed from start to finish?” is
a completely different question than “what time is it right now?” If
the question you want to ask is about how long some operation took,
and you want a high-precision, high-accuracy answer, then use the
StopWatch class. It really does have nanosecond precision and accuracy
that is close to its precision.
Remember, you don’t need to know what time it is to know how much time
has elapsed. Those can be two different things entirely.
DateAndTime.Timer is the most exact. It returns a double-precision float representing the number of seconds that have passed since midnight.
Usage (in VB):
x# = DateAndTime.Timer
'... some code...
MsgBox((DateAndTime.Timer - x).ToString + " seconds elapsed")
In case you end up using a loop, here's one that outperforms all the others that I've tested ->
<DllImport("user32.dll", EntryPoint:="GetInputState")>
Private Shared Function Chk4AnyKey() As Boolean
End Function

Sub HighPerfKybdRead(maxtime#)
    maxtime += DateAndTime.Timer
    Process.GetCurrentProcess.PriorityBoostEnabled = True
    Process.GetCurrentProcess.PriorityClass = 256 ' 256 = RealTime
    While DateAndTime.Timer < maxtime
        If Chk4AnyKey() Then
            t# = DateAndTime.Timer
            ' Do further processing of the pressed key
            ' You might wanna call the PeekMessage function here
            ' with the option to ignore the repeated keystrokes
            ' and use the "t#" variable accordingly
        End If
    End While
    Process.GetCurrentProcess.PriorityBoostEnabled = False
    Process.GetCurrentProcess.PriorityClass = ProcessPriorityClass.Normal
End Sub
But if you're using an external controller instead of keyboard, then I guess you should be calling GetQueueStatus and not GetInputState...

Strange Behavior with Threading and Timer

Let me explain my situation.
I have a 1-producer-to-N-consumers pattern. I'm using blocking collections and everything is working well. Doing some tests I noticed this strange behavior:
I was testing how long my manipulation of data took in my consumers.
Below you'll find the code, cleaned of my manipulation, which produces the strange behavior.
I have 4 consumers for 1 producer.
For most of the data, the Console doesn't print anything, because ts = 0 (it's under a tick), but randomly (anywhere between every 1 and 5 seconds) it prints something like this (not in this very specific order, but of the same kind):
10000
20001
10000
30002
10000
40003
10000
10000
It is on the order of 10,000 ticks, so around 1 ms. Always a number in the format (N)000(N-1).
Note that the BlockingCollection I consume from is filled depending on some network events which occur at completely random times. Nothing regular here.
The timing is almost perfect, always a multiple of 10,000 ticks.
What could be behind this? Thanks!
while (IsAlive)
{
    DataToFieldMapping item;
    try
    {
        _CollectionToConsume.TryTake(out item, -1);
    }
    catch
    {
        item = null;
    }

    if (item != null)
    {
        long ts = (DateTime.Now.Ticks - item.TimeStamp.Ticks);
        if (ts > 10)
            Console.WriteLine(ts);
    }
}
What's going on here is that DateTime.Now has a fairly limited precision. It's not giving you the time to the nearest tick. It is only updated every 10,000 ticks or so, which is why you generally see multiples of 10k ticks in your prints.
If you really want to get a better feel for the duration of those events, use the Stopwatch class, which has a much higher resolution. That said, Stopwatch is simply a diagnostic tool (hence why it's in the Diagnostics namespace). You should only be using it to help you diagnose what's going on, and shouldn't be using it in production code.
On a side note, there really isn't any need to use a timer here at all. It appears that you're creating several consumers that are polling the BlockingCollection for new content. There is no reason to do this. They can simply block until the collection has items. (Hence the name, BlockingCollection.)
The easiest way is for the consumers to simply do this:
foreach(var item in _CollectionToConsume.GetConsumingEnumerable())
ProcessItem(item);
Then just run that code in a background thread.
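A self-contained sketch of that consumer pattern; the element type and names are illustrative:

```csharp
var queue = new BlockingCollection<int>();
var processed = new List<int>();

// Consumer: blocks until items arrive and exits once CompleteAdding is called
// and the collection drains; no polling, no timer.
var consumer = Task.Run(() =>
{
    foreach (var item in queue.GetConsumingEnumerable())
        processed.Add(item);
});

// Producer: add a few items, then signal that no more are coming.
for (int i = 0; i < 5; i++)
    queue.Add(i);
queue.CompleteAdding();

consumer.Wait();
Console.WriteLine(processed.Count); // 5
```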
If you write the following and run it, you'll see that ticks do not advance one by one, but rather in relatively large chunks, because the resolution of DateTime.Now is actually much coarser than a single tick.
for (int i = 0; i < 100; i++)
{
    Console.WriteLine(DateTime.Now.Ticks);
}
Use the Stopwatch class to measure performance, as it uses a high-resolution timer which is much more suitable for the purpose.

Compound interest calculator - Solve for I or is there a better formula

I need a formula for determining a debt payoff plan where the following are known: number of payments, amount per payment, and principal; I need to figure out what the interest rate would be from that. I am re-factoring existing code and the current method uses the following (compounded = 12; interest rate starts at 0.1):
while (counter < 100)
{
    intermediatePayment = (interestRate * (principal / compounded)) /
        (1 - (1 / Math.Pow(interestRate / compounded + 1, compounded * numberOfYears)));
    interestIncrement = Math.Abs(interestRate - previousRate) / 2;
    previousRate = interestRate;

    if (intermediatePayment == payment)
        break;

    if (intermediatePayment > payment)
        interestRate -= interestIncrement;
    else
        interestRate += interestIncrement;

    counter++;
}
Now I understand what this formula does, but I would never be able to arrive at it myself. What's here is actually an equation that is supposed to be used to determine the monthly payment if the interest rate, principal, and number of payments are known. It uses brute force, looping (at most 100 times) until the calculated payment equals the desired payment. It usually arrives at an answer after about 40-50 loops, and that could be optimized by reducing significant digits.
Seems to me if we just solved for interestRate there would be no looping. Try as I might, I can't get the equation to solve for I, so that's my main question.
Now, if you understand the problem well enough and know financial formulas and compounding interest, you might provide me with an even better solution altogether, which would be awesome. I have done significant research myself and found tools but not the raw equation, or more often I find different formulas for determining interest related stuff but am not knowledgeable to retool them for my needs.
Basically I've spent too much time on this and my boss thinks since the loop works I need to leave it be or ask for help. Fair enough, so I am. :)
Here's a more traditional layout of the formula if that helps any: http://i.imgur.com/BCdsV.png
And for test data: if
P=45500
c=12
y=3
m=1400
then
I = .0676
Thanks for the help
If you attempt to solve the formula you linked to for I, the interest rate, you'll find that you get a polynomial of degree cy+1, that is, the total number of payments plus one. It is difficult/impossible to find closed form solutions to high degree polynomials, so an approximation is the best you can do.
The algorithm you've given has some nice properties: it is pretty clear what it is doing, and it gives the right answer in a reasonable amount of time. My attitude would therefore be "if it ain't broke don't try to fix it".
If it turned out that this algorithm was too slow for some reason then there are algorithms which converge to the right answer faster; you could work out what the polynomial you need to find roots of is, work out its derivative using simple calculus, and then use Newton's Method to converge to the roots faster. But for this simple example where the answer only has to be accurate to four decimal places anyway, that seems like overkill.
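For illustration, here is roughly what that would look like: Newton's method on f(I) = payment(I) - m, using a numerical derivative instead of working out the polynomial's derivative by hand. The method names are mine, and this is a sketch rather than production financial code:

```csharp
// Monthly payment for annual rate I, from the formula in the question:
// m = (I * (P / c)) / (1 - (1 + I/c)^(-c*y))
double Payment(double rate, double principal, double compounded, double years) =>
    (rate * (principal / compounded)) /
    (1 - Math.Pow(rate / compounded + 1, -(compounded * years)));

// Newton's method on f(I) = Payment(I) - m, derivative approximated numerically.
double SolveRate(double principal, double compounded, double years, double targetPayment)
{
    double rate = 0.1;   // same starting guess as the original loop
    const double h = 1e-6;
    for (int i = 0; i < 50; i++)
    {
        double f = Payment(rate, principal, compounded, years) - targetPayment;
        double fPrime = (Payment(rate + h, principal, compounded, years)
                         - Payment(rate, principal, compounded, years)) / h;
        double next = rate - f / fPrime;
        if (Math.Abs(next - rate) < 1e-9)
            return next;
        rate = next;
    }
    return rate;
}

// Test data from the question: P=45500, c=12, y=3, m=1400 should give I near 0.0676.
double solved = SolveRate(45500, 12, 3, 1400);
Console.WriteLine(solved);
```

On the question's test data this converges in a handful of iterations rather than 40-50, since each Newton step roughly doubles the number of correct digits.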
This formula cannot be explicitly solved for I, so you can stop trying. On the other hand, the loop goes way beyond common sense in precision. You can surely stop when you are within half a cent of the payment amount, or when the increment in the estimate of I gets below 0.0001, since there was some rounding during the original calculations anyway.
