I have a DateTime object (Updated) which is set to DateTime.Now when I update my application.
I also have a timerTick event, called every 5 seconds, which should check how many minutes ago the update happened.
I've tried with:
if ((DateTime.Now - Updated).Minutes > 0)
{
    updateTextBlock.Text = "updated " + ((DateTime.Now - Updated).Minutes).ToString() + " minutes ago";
}
But it does not seem to work correctly. Isn't there a better way to do this?
/R
I suspect you want TotalMinutes instead of Minutes. Otherwise you'll only ever get a value in the range -59 to 59.
You may also want to consider using UtcNow instead of Now - otherwise you could get odd effects due to time zone changes (either the user changing time zone, or the time zone changing its UTC offset, usually for daylight saving time.)
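For example, a minimal sketch of the tick handler using TotalMinutes (assuming Updated is stored with DateTime.UtcNow rather than DateTime.Now):

// Assumes Updated was assigned with DateTime.UtcNow when the app was updated.
int minutesAgo = (int)(DateTime.UtcNow - Updated).TotalMinutes;
if (minutesAgo > 0)
{
    updateTextBlock.Text = "updated " + minutesAgo + " minutes ago";
}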
You may find it easier to use an instance of System.Diagnostics.Stopwatch to keep track of how much time has elapsed since any particular starting point.
It can be more reliable and accurate than doing math on DateTime objects because it will use the hardware's high-resolution timer if one is available.
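A minimal sketch of that approach (Stopwatch lives in System.Diagnostics, and would be started when the update happens):

Stopwatch sinceUpdate = Stopwatch.StartNew(); // started when the update happens

// In the timer tick handler:
int minutesAgo = (int)sinceUpdate.Elapsed.TotalMinutes;
if (minutesAgo > 0)
{
    updateTextBlock.Text = "updated " + minutesAgo + " minutes ago";
}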
You're probably looking for TotalMinutes, not just Minutes. TotalMinutes will give you the total number of minutes in the interval, whereas Minutes only gives 0-59 (since you also have Hours, etc.)
One way I did something similar was using TimeSpan.
I had two variables, timePassed and tickTime, where I set tickTime to be a 5-second TimeSpan and timePassed to zero:
TimeSpan tickTime = new TimeSpan(0,0,0,5); // 5 seconds
TimeSpan timePassed = new TimeSpan(0,0,0,0); // 0 seconds
Then in the handler for the tick event I would add tickTime to timePassed:
timePassed = timePassed.Add(tickTime); // adds 5 seconds to the timePassed TimeSpan
Then you can use timePassed to get the time since update.
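For example:

int minutesAgo = (int)timePassed.TotalMinutes; // whole minutes since the update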
Hope this helps
Related
I have this simple bit of code:
//LastUpdate = DateTime.Now
//These two lines occur every frame
TimeSpan timeSpan = DateTime.Now - data.LastUpdate;
Debug.Log(timeSpan.Milliseconds);
But the result of this shows the milliseconds not really increasing; it fluctuates between 100 and 900 ms. It should be ever increasing, since the time passed is increasing.
I have checked that LastUpdate does not change, so that isn't the cause.
I guess I am misunderstanding how TimeSpan works. I am trying to get the milliseconds that have passed between LastUpdate and the current frame.
Am I using it wrong? I don't understand the issue.
From TimeSpan docs:
.Milliseconds (emphasis mine):
Gets the milliseconds component of the time interval represented by the current TimeSpan structure.
You want to use .TotalMilliseconds:
Gets the value of the current TimeSpan structure expressed in whole and fractional milliseconds.
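A quick illustration of the difference, using an arbitrary interval:

TimeSpan span = TimeSpan.FromSeconds(75.5);
Debug.Log(span.Milliseconds);      // 500   - just the milliseconds component
Debug.Log(span.TotalMilliseconds); // 75500 - the whole interval in milliseconds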
So I run some type of game, and I want to add a command !uptime that displays how long the server has been running since it was last started.
This code (from the Microsoft website) gets the tick count and displays it correctly:
int result = Environment.TickCount & Int32.MaxValue;
player.SendMessage("Result: " + result);
but I want to be able to display how long it's been up in minutes.
From the MSDN documentation, we can see that Environment.TickCount
Gets the number of milliseconds elapsed since the system started.
You can then convert it to minutes like so:
var minutes = (Environment.TickCount - serverStartTickCount) / 60000; // 1000 ms/s * 60 s/m
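(Here serverStartTickCount is assumed to be the value of Environment.TickCount captured when the server started, for example:)

static int serverStartTickCount = Environment.TickCount; // captured once at server start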
Alternatively, you might want to consider storing DateTime.Now when the server starts. Say your class is called Program, you can add this to it:
public static readonly DateTime ServerStartTime = DateTime.Now;
and then do this when the command is run:
var uptime = DateTime.Now - Program.ServerStartTime;
var minutes = uptime.TotalMinutes;
This would allow you to get an accurate uptime even when Environment.TickCount rolls over every few weeks, as @Carlos pointed out.
From the reference docs:
A 32-bit signed integer containing the amount of time in milliseconds
that has passed since the last time the computer was started.
So divide by 1000 to get seconds, and then 60 to get minutes.
Note the thing is only 32 bit, so it loops back every few weeks.
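For example:

int uptimeMinutes = (Environment.TickCount & int.MaxValue) / 1000 / 60; // minutes since the system started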
Use a TimeSpan.
TimeSpan uptime = TimeSpan.FromMilliseconds(Environment.TickCount);
double totalMinutes = uptime.TotalMinutes;
I'm working in VB.NET, but just looking for a formula that will give me the following:
Say I have a process that needs to be launched every x minutes between start time and end time. Every minute I need to determine if the process needs to be launched.
So if I have the following:
StartTime = 8:00:00 AM
EndTime = 11:00:00 PM
IntervalMinutes = 7
I have a timer set to fire every 1 minute. I need to determine if the current time is time to launch the process.
Currently I just use a loop that adds IntervalMinutes to StartTime and compares it to the current time and EndTime: if StartTime = CurrentTime then launch; if StartTime > EndTime then exit the loop. I know it's clunky, but it works. However, as it gets later in the day, it has to iterate through a lot more minutes. I know there has to be a formula for this, but my brain is dead from searching and thinking.
My pseudo modulus operation:
float tolerance = 0.0001f;
if ((CurTime - StartTime) % IntervalMinutes <= tolerance)
{
    // Do something
}
You could pre-calculate all of the launch times between StartTime and EndTime on app start and put them in a List; then each time the timer fires you only need to check whether the current time == launchList[0]. If it does, launch and remove that entry from the list. A sketch of that approach is below.
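A minimal C# sketch of that idea (the same pattern translates directly to VB.NET; LaunchProcess is a hypothetical stand-in for whatever starts your process):

// Pre-calculate all launch times once at app start.
var launchList = new List<DateTime>();
for (DateTime t = StartTime; t <= EndTime; t = t.AddMinutes(IntervalMinutes))
    launchList.Add(t);

// In the one-minute timer tick:
if (launchList.Count > 0 && DateTime.Now >= launchList[0])
{
    LaunchProcess();        // hypothetical: whatever starts the process
    launchList.RemoveAt(0);
}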
I'm trying to create a clock for my game. My hours and seconds are both float values so I am using Math.Round to round them off to the nearest whole number. The problem is that the Hours and Seconds variables aren't changing at all. Am I using Math.Round wrong?
public void Update()
{
    Hours = (float)Math.Round(Hours, 0);
    ClockTime = Hours + ":" + Seconds;
    if (Hours >= 24)
        Hours = 0;
    if (Seconds >= 60)
        Seconds = 0;
}
In my update method for my day/night class.
float elapsed = (float)gameTime.ElapsedGameTime.TotalSeconds;
clock.Hours += (float)elapsed;
clock.Update();
When I print the numbers on the screen, nothing is changing. If I take away the (float) cast on Math.Round, I get an error: cannot convert double to float.
Don't use floating point in this case, there's absolutely no reason for an hour, minute or second to be non-integral.
What's almost certainly happening is that you're ending up with a float value like 59.9999 despite the fact you think you're rounding it.
There are real dangers in assuming floating point values have more precision than they actually do.
If you hold your number of seconds in an unsigned integral 32-bit type, you can represent elapsed time from now until about the year 2150 AD, should anyone still be playing your game at that point :-)
Then you simply use integer calculations to work out hours and seconds (assuming you're not interested in minutes as seems to be the case), pseudo-code such as:
hours = elapsed_secs / 3600
secs = elapsed_secs % 3600
print hours ":" seconds
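In C#, a minimal sketch of that (with an integral seconds counter):

// elapsedSecs: accumulated whole seconds of game time (a uint covers ~136 years)
static string FormatClock(uint elapsedSecs)
{
    uint hours = elapsedSecs / 3600;   // whole hours
    uint secs  = elapsedSecs % 3600;   // seconds within the current hour
    return hours + ":" + secs;
}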
Beyond that advice, what you're doing seems a tad strange. You are adding an elapsed seconds field (which I assume you've checked isn't always zero) to the hours variable. That's going to make gameplay a little difficult, as time speeds by at roughly 3,600 times its normal rate.
Actually, you should use DateTime to track your time and use the DateTime properties to get the hours and seconds correctly, instead of trying to do it yourself with floats for seconds and hours. DateTime is long-based and supports everything from fractions of a millisecond up to millennia, and of course seconds. It has all the functions built in to correctly add milliseconds or years or seconds or whatever, which is actually rather difficult to do yourself.
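A minimal sketch of that idea, assuming a gameClock field advanced from the elapsed game time each frame:

DateTime gameClock = DateTime.MinValue; // hypothetical field holding the in-game time

// In Update:
gameClock = gameClock.AddSeconds(gameTime.ElapsedGameTime.TotalSeconds);
string clockText = gameClock.Hour + ":" + gameClock.Second;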
I am trying to set the tick alert for the timer based on the time set by the DateTimePicker; however, I don't get the tick alert at all.
if (dateTimePicker1.Value >= DateTime.Now)
{
    sendOrder.Interval = (int)(dateTimePicker1.Value.Ticks - DateTime.Now.Ticks);
    sendOrder.Enabled = true;
}
In the above code, I set the interval based on the difference between the time from the DateTimePicker and the current time. What am I doing wrong here?
Assuming you are using System.Timers.Timer, Interval is in milliseconds, whereas you are specifying ticks. There are 10,000 ticks in 1 millisecond.
Try this instead:
sendOrder.Interval = (int)((dateTimePicker1.Value.Ticks - DateTime.Now.Ticks) / 10000);
DateTime.Ticks Property:
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
Timer.Interval Property:
The time, in milliseconds
So your timer is firing; the interval is just off by a factor of 10,000.
Instead of twiddling with ticks, you can use the TimeSpan.TotalMilliseconds Property to get the difference between the two DateTime values in milliseconds:
sendOrder.Interval = (dateTimePicker1.Value - DateTime.Now).TotalMilliseconds;
Ticks are intervals of 100 nanoseconds; there are 10,000 ticks in a millisecond, while Interval is in milliseconds. So passing ticks straight into Interval makes the timer wait 10,000 times longer than intended.