I'm doing performance tuning on a Silverlight WP7 game and I'm observing a weird behavior when running in the Emulator (I have no real device, so I can't tell if it's an emulator artifact).
I am using a Storyboard as main game loop timer, I update the rendering in the Completed event, then restart the animation. The problem is that the Completed event is triggered at very different intervals.
I isolated the problem in a very simple function, called for ex. on a button click (I put this in a totally empty application, there is nothing else going on in the app).
var sw = new Stopwatch();
var animation = new Storyboard();
animation.Duration = TimeSpan.FromMilliseconds(10);
animation.Completed += (s, e) =>
{
    // Measure how long the "10 ms" animation actually took, then restart it immediately.
    sw.Stop();
    Debug.WriteLine(sw.ElapsedMilliseconds.ToString());
    sw.Reset();
    sw.Start();
    animation.Begin();
};
animation.Begin();
The output I would expect (ideally) is something like
10
10
10
10
...
but instead, this is what I get:
12
15
16
17
17
14
10
10
10
9
11
132
10
20
11
11
10
12
The time between cycles varies quite a bit, but most importantly there are occasional very long delays (like the 132 above). This happens regardless of the animation Duration, i.e. the measured intervals are centered around the set duration but vary a lot, with intermittent long delays.
You can imagine that my game is running quite irregularly, not smooth and uniform at all. Also I noticed that regardless of how simple the render operation is, I cannot go over ~42 FPS.
Question 1: Am I making some obvious and blatant mistake? Is my approach flawed?
Question 2: Why is the Storyboard duration so inconsistent between runs?
I am 90% sure it is caused by the emulator, but be aware that 10 ms is a very short interval. A Silverlight app refreshes 60 times per second, i.e. every 16.6 ms, so I recommend you set the duration to 33 ms or more.
In one of my games I use a DispatcherTimer, which works fine (for falling bombs :) ) but is not as accurate as I thought. Using a Storyboard as a timer is better IMO. Basically, if you do not have any other complex animations, the compositor thread runs smoothly and your timer is pretty accurate.
Ask someone with a device to test your code (or buy one).
Related
I just started learning game programming and chose to use a fixed-timestep game loop, since it's straightforward and easier to understand.
Basically like this:
while (game_is_running) {
input();
update();
render();
}
When VSync is enabled, it runs at the speed of my display, but I want to cap it at 60 FPS even on higher refresh rate monitors, so I added frame skipping and an FPS limiter:
static void Main()
{
long frame_count = 0;
long tick_frame_end;
Stopwatch stopwatch = new Stopwatch();
WaitForVSync();
stopwatch.Start(); // hoping to get in sync with display
while (game_is_running) {
frame_count++;
tick_frame_end = GetEndTick(frame_count);
if (stopwatch.ElapsedTicks > tick_frame_end) { // falling behind
continue; // skip this frame
}
input();
update(tick_frame_end);
// render(); // render here when VSync is enabled
while (stopwatch.ElapsedTicks < tick_frame_end) {
// Busy waiting, in real code I used Spinwait to avoid 100% cpu usage
}
render(); // I moved it here because I turned off VSync and I want render to happen at a constant rate
}
}
static long GetEndTick(long frame_count) {
return (long)((double)frame_count * Stopwatch.Frequency / 60); // Target FPS: 60
}
When VSync is off, it can run at a steady 30, 60, or 120 FPS (or any value in between). I was happy with the result, but as soon as I turned VSync back on (and set FPS to 60), I noticed a lot of frame skipping.
I can run at a steady 120 FPS with VSync off without dropping a single frame, yet when running at 60 FPS with VSync on I get a notable number of dropped frames. It took me quite a while to nail down the cause of this:
My monitor, which I thought was running at 60 Hz, is actually running at about 59.920 Hz (tested with https://www.displayhz.com/).
As time goes on, my internal frame time and the monitor's frame time drift further and further out of sync, causing many unexpected frame drops. This completely breaks my GetEndTick() function, which basically breaks everything: the frame skipping logic, the FPS limiter, and most importantly update(tick_frame_end) is broken too.
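As a rough back-of-the-envelope check (my own numbers, assuming the measured 59.920 Hz is accurate), the per-frame drift is tiny but adds up quickly:

double displayFrame = 1.0 / 59.920;   // ~16.689 ms per display frame
double internalFrame = 1.0 / 60.0;    // ~16.667 ms per internal frame
double driftPerFrame = displayFrame - internalFrame;            // ~22 us per frame
double framesUntilOneFrameOff = internalFrame / driftPerFrame;  // ~750 frames
double secondsUntilOneFrameOff = framesUntilOneFrameOff / 60.0; // ~12.5 seconds

so my internal timeline slips by a whole frame roughly every 12-13 seconds.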
So the questions are:
1. How can I get the end time of the current frame?
In my code above, the end time is calculated on the assumption that there are 60 evenly spaced frames in every second. If that were true, my internal frame time might not align perfectly with the display frame time, but it would still stay in sync, which is useful enough.
(Everything is calculated based on the timestamp of the frame end; the idea is that I generate the image based on the time of the frame end, which is just a moment before the next frame gets displayed on screen.)
I tried recording the time at the beginning of the loop (or right after the last frame's render) and adding one frame's worth of time to it. That doesn't work, because the recorded time isn't reliable due to system load, background processes, and many other things. I'd also need a new way to determine which frame is the current one.
2. If (1.) isn't possible, how can I fix my game loop?
I have read the infamous Fix Your Timestep! by Glenn Fiedler, but it's about game physics, not graphics, and doesn't really provide an answer to the first question.
I'm a beginner programmer just starting out with MonoGame. I was implementing some code for running an animation and ran into a problem I don't know how to solve. This is the code I use in my AnimationManager class:
public void Update(GameTime gameTime)
{
timeElapsed += (float)gameTime.ElapsedGameTime.TotalSeconds;
if (timeElapsed > timeToUpdate)
{
timeElapsed -= timeToUpdate;
//Tried to just reset timeElapsed to zero, still doesn't work
//timeElapsed = 0;
if (frameIndex < rectangles.Count -1)
{
frameIndex++;
}
else if (isLooping)
{
frameIndex = 0;
}
}
}
The problem is that I animate at 24 frames per second while the game runs at 60 or 30 fps, and on each update the code checks whether it is time to draw a new animation frame. Because I'm trying to draw 24 images per second evenly across 30 even game frames per second, there's a remainder of 6 frames. The result is that roughly 6 animation frames get drawn twice, some are skipped, and the whole animation ends up about 25% longer. Is there a way to fix this? Could I maybe move this code to the Draw call and cap just the draw calls at 48 fps for a 60 fps game, so each animation frame would be drawn on every second draw call? I don't know how I would go about doing this, or whether it's a reasonable solution at all.
And how will VSync affect all of this in the end? Will it mess up animation frames even if this problem is solved?
EDIT: I forgot to say it's a 2D game with hand-drawn animation.
Generally when you're doing this kind of stuff you don't need to worry about VSync or the fact that your game runs at 60 or 30 fps. That's all pretty much irrelevant.
The thing that matters is that you know the desired frames per second of your animation and how long it takes for a single game frame.
From that you can calculate the length of time in a single animation frame. For example, you'd end up with something like this:
var animationFrameTime = 1f / 24f; // 24 frames per second
So far so good.
The next thing you need to do is to start accumulating time somewhere. There's a few different ways to think about this. I like to think about it in terms of how much time is left until the next frame.
var gameFrameTime = (float)gameTime.ElapsedGameTime.TotalSeconds;
timeUntilNextFrame -= gameFrameTime;
Then you need to increment the frameIndex when you've run out of frame time.
if(timeUntilNextFrame <= 0)
{
frameIndex++;
timeUntilNextFrame += animationFrameTime;
}
Notice the important bit here. I've added the animationFrameTime to the timeUntilNextFrame. This way, if there's any left over time from the previous frame, it'll get added to the next frame, keeping the animation smooth.
There's really not much more to it than that. I've left out the rectangles.Count and isLooping bits, but it should be easy enough to add those back in. You might also want to initialize timeUntilNextFrame to animationFrameTime for the first frame.
Just a word of warning with this code, if the game is running slowly for some reason (or perhaps it's paused maybe) the gameFrameTime might end up having a very large value causing some strange behavior. I'm not entirely certain if that can ever happen in MonoGame / XNA but it's worth considering at least.
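Putting it all together, a minimal sketch of the complete Update method might look something like this (I'm reusing the field names from the question and answer, and the clamp at the top is just one way of guarding against those very large values):

public void Update(GameTime gameTime)
{
    // Guard against huge deltas (e.g. after a pause or a long hitch).
    var gameFrameTime = Math.Min((float)gameTime.ElapsedGameTime.TotalSeconds, animationFrameTime);

    timeUntilNextFrame -= gameFrameTime;

    if (timeUntilNextFrame <= 0)
    {
        if (frameIndex < rectangles.Count - 1)
        {
            frameIndex++;
        }
        else if (isLooping)
        {
            frameIndex = 0;
        }

        // Carry any leftover time into the next frame to keep the animation smooth.
        timeUntilNextFrame += animationFrameTime;
    }
}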
It turned out this is a well-known problem. Basically, it's mathematically impossible to cram 24 animation fps evenly into 30 game fps, because the animation frames are not independent of the game frames. Interpolation is the only way to fix this problem. It's the same thing that's done whenever you watch a 24 fps movie on a 60 Hz monitor.
https://en.wikipedia.org/wiki/Three-two_pull_down
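In practice, for hand-drawn frames the simplest way to get that same pull-down style cadence is to pick the frame from the total elapsed time instead of accumulating a per-update remainder (a sketch reusing the fields from my question; totalSeconds is a new float field):

totalSeconds += (float)gameTime.ElapsedGameTime.TotalSeconds;
// At 60 game fps each 24 fps animation frame is shown for 2 or 3 game frames,
// which is exactly the 3:2-style cadence described in the link above.
frameIndex = (int)(totalSeconds * 24f) % rectangles.Count;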
I am trying to make a Snake game, and what I want to happen is that each time the snake eats a piece of food it moves faster.
I want to do this by making the timer tick faster.
I start the timer at an interval time of:
gltimer.Interval = new TimeSpan(20000)
And update it with:
public void updateTimer(object sender, EventArgs e)
{
long nrTicks = gltimer.Interval.Ticks;
nrTicks = (long)(nrTicks * 0.95);
gltimer.Interval = new TimeSpan(nrTicks);
}
But when I run my game, the speed stays the same until I reach the 14th snack, and then it suddenly changes. I already figured out that nrTicks drops below 10000 at that point.
My question is: why doesn't the Interval update for the intermediate values?
The DispatcherTimer is not designed to execute an action at precise intervals.
While it dispatches the action at the requested interval, the dispatched actions only get executed when the GUI thread (in your case, the WPF application loop) decides to run them.
Setting the interval to 20,000 ticks results in an interval of 20,000 × 100 ns = 2,000,000 ns = 2 ms. A WPF application has a timer resolution of around 10 ms at best.
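If you want to see this for yourself, a small sketch along these lines (run inside a WPF window; the names are just for illustration) logs the real time between ticks, which stays around the dispatcher resolution no matter how small the requested Interval is:

var sw = System.Diagnostics.Stopwatch.StartNew();
var timer = new System.Windows.Threading.DispatcherTimer();
timer.Interval = new TimeSpan(20000);   // 2 ms requested
timer.Tick += (s, e) =>
{
    // Typically prints values in the 10-16 ms range, not 2 ms.
    System.Diagnostics.Debug.WriteLine(sw.ElapsedMilliseconds);
    sw.Restart();
};
timer.Start();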
Look at those posts:
WPF Timer problem... Cannot get correct millisecond tick
WPF Dispatchertimer delayed reaction / freeze
I have the following issue in my code.
I have this loop running on a timer (this is just a small part of the loops running on the big timer).
Inside that big timer (it ticks every 1 second) I have one method that needs to wait 5 seconds and then continue with the rest of the loop code, but I don't want it to block: the timer should keep running every 1 second and not wait for those 5 seconds.
What I did was add a new timer (timer_deva) that ticks every 5 seconds, do all the checks inside it, and then stop that timer.
So my issue is that I need to wait 5 seconds to retrieve a value to complete my code, but my main timer has to keep running simultaneously, and when it gets its result from the other timer it needs to complete the code it left behind.
Thanks in advance,
else if (mobID.Equals(Convert.ToInt32(txtDeva)))
{
//START CHECK WITH TIMER
timer_deva.Start();
//Methods inside timer_deva update the winnerNation
//END CHECK TIMER - GET RESULT
winner(zoneId, winnerNation, humansKills, orcKills);
}
tl;dr
Conventional Timers are not used in games. Games have a very different mechanism for handling their logic and the time that passed.
Long Version:
I know this may not answer your question directly, but it's way too much text to cram into a comment. Your timers sound very complicated and hierarchical. From your variable names I will assume you are programming a game. Games normally don't work with timers, or at least not in the way you would expect. This game-like behaviour would help you a lot with your timer problem and may even help your design in general.
Games normally have something called a game loop. Generally speaking it's three main functions that are called one after the other in a loop:
while(running)
{
HandleUserInput();
ChangeWorld();
Render();
}
You get user input, you change the game world accordingly, and you draw it to the screen. Now, the faster your computer is, the faster this loop runs. That's good for the graphics (think FPS), but bad for the game. Imagine Tetris where the blocks move every frame: I wouldn't want to buy a faster computer then, because the game would only get more difficult.
So to keep the game speed constant independent of the power of the computer, the loop considers the time passed:
while(running)
{
var timePassedSinceLastLoop = CalculateTimeDelta();
HandleUserInput();
ChangeWorld(timePassedSinceLastLoop);
Render();
}
Now imagine a cooldown for something in the game. The player pressed "a", some cool action happened, and although he may press "a" again, nothing will happen for the next 5 seconds. But the game still runs and does all the other things that may happen in-game. This is not a conventional timer. It's a variable, let's call it ActionCooldown, and once the player triggers the action, it's set to 5 seconds. Every time the world changes, the timePassed is subtracted from that number until it reaches zero. All the while, the game keeps running, handling input and rendering. But only once ActionCooldown hits zero will another press of "a" trigger that action again.
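A minimal sketch of that cooldown idea (APressed and TriggerAction are placeholders for whatever your input handling and action code look like):

float ActionCooldown = 0f;

void HandleUserInput()
{
    if (APressed() && ActionCooldown <= 0f)
    {
        TriggerAction();        // the cool action
        ActionCooldown = 5f;    // ignore further presses for 5 seconds
    }
}

void ChangeWorld(float timePassedSinceLastLoop)
{
    // Count the cooldown down each loop; everything else keeps updating as normal.
    if (ActionCooldown > 0f)
        ActionCooldown -= timePassedSinceLastLoop;

    // ... update the rest of the world here ...
}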
I work on a 2D game engine that has a function called LimitFrameRate to ensure that the game does not run so fast that a user cannot play the game. In this game engine the speed of the game is tied to the frame rate. So generally one wants to limit the frame rate to about 60 fps. The code of this function is relatively simple: calculate the amount of time remaining before we should start work on the next frame, convert that to milliseconds, sleep for that number of milliseconds (which may be 0), repeat until it's exactly the right time, then exit. Here's the code:
public virtual void LimitFrameRate(int fps)
{
long freq;
long frame;
freq = System.Diagnostics.Stopwatch.Frequency;
frame = System.Diagnostics.Stopwatch.GetTimestamp();
while ((frame - previousFrame) * fps < freq)
{
int sleepTime = (int)((previousFrame * fps + freq - frame * fps) * 1000 / (freq * fps));
System.Threading.Thread.Sleep(sleepTime);
frame = System.Diagnostics.Stopwatch.GetTimestamp();
}
previousFrame = frame;
}
Of course I have found that due to the imprecise nature of the sleep function on some systems, the frame rate comes out quite differently than expected. The precision of the sleep function is only about 15 milliseconds, so you can't wait less than that. The strange thing is that some systems achieve a perfect frame rate with this code and can achieve a range of frame rates perfectly. But other systems don't. I can remove the sleep function and then the other systems will achieve the frame rate, but then they hog the CPU.
I have read other articles about the sleep function:
Sleep function in c in windows. Does a function with better precision exist?
Sleep Function Error In C
What's a coder to do? I'm not asking for a guaranteed frame rate (or guaranteed sleep time, in other words), just a general behavior. I would like to be able to sleep (for example) 7 milliseconds to yield some CPU to the OS and have it generally return control in 7 milliseconds or less (so long as it gets some of its CPU time back), and if it takes more sometimes, that's OK. So my questions are as follows:
Why does sleep work perfectly and precisely in some Windows environments and not in others? (Is there some way to get the same behavior in all environments?)
How do I achieve a generally precise frame rate without hogging the CPU from C# code?
You can use timeBeginPeriod to increase the timer/sleep accuracy. Note that this globally affects the system and might increase the power consumption.
You can call timeBeginPeriod(1) at the beginning of your program. On the systems where you observed higher timer accuracy, another running program had probably already done that.
And I wouldn't bother calculating the sleep time; I'd just use Sleep(1) in a loop.
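For reference, a minimal sketch of requesting the higher resolution from C# via P/Invoke (my sketch of the usual winmm.dll pattern, not code from the question):

using System.Runtime.InteropServices;
using System.Threading;

static class HighResSleep
{
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uMilliseconds);

    public static void Demo()
    {
        timeBeginPeriod(1);        // request ~1 ms timer resolution (affects the whole system)
        try
        {
            // Sleep(1) now returns after roughly 1-2 ms instead of ~15 ms.
            for (int i = 0; i < 100; i++)
                Thread.Sleep(1);
        }
        finally
        {
            timeEndPeriod(1);      // always undo the request when done
        }
    }
}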
But even with only 16 ms precision you can write your code so that the error averages out over time. That's what I'd do. It isn't hard to code and should work with few adaptations to your current code.
Or you can switch to code that makes the movement proportional to the elapsed time. But even in this case you should implement a frame-rate limiter so you don't get uselessly high framerates and unnecessarily consume power.
Edit: Based on ideas and comments in this answer, the accepted answer was formulated.
Almost all game engines handle updates by passing in the time since the last frame and having movement etc. behave proportionally to that time; any other implementation than this is faulty.
Although CodeInChaos' suggestion answers your question, and might work partially in some scenarios, it's just plain bad practice.
Limiting the framerate to your desired 60 fps will only work when the computer is running faster. However, the second a background task eats up some processor power (for example, the virus scanner starts) and your game drops below 60 fps, everything will go much slower. Even though your game could be perfectly playable at 35 fps, this makes it impossible to play because everything runs at half speed.
Things like Sleep are not going to help, because they halt your process in favor of another process, and that process must first be halted again; Sleep(1ms) just means that after 1 ms your process is returned to the queue, waiting for permission to run. Sleep(1ms) can therefore easily take 15 ms, depending on the other running processes and their priorities.
So my suggestion is to add, as quickly as possible, some sort of elapsedSeconds variable that you use in all your update methods. The earlier you build it in, the less work it is; this will also ensure compatibility with Mono.
If you have two parts of your engine, say a physics engine and a render engine, and you want to run them at different framerates, just check how much time has passed since the last frame and decide whether to update the physics engine. As long as you incorporate the time since the last update in your calculations, you will be fine.
Also, never draw more often than you move something. It's a waste to draw the exact same screen twice, so if the only way something can change on screen is by updating your physics engine, keep the render engine and physics engine updates in sync.
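A minimal sketch of what I mean (running, UpdatePhysics and Render are placeholders for your own engine's loop flag and calls):

var stopwatch = System.Diagnostics.Stopwatch.StartNew();
double timeSincePhysicsUpdate = 0;
const double physicsInterval = 1.0 / 60.0;   // aim for roughly 60 physics updates per second

while (running)
{
    // Real time that passed since the previous loop iteration.
    double elapsedSeconds = stopwatch.Elapsed.TotalSeconds;
    stopwatch.Restart();

    timeSincePhysicsUpdate += elapsedSeconds;

    if (timeSincePhysicsUpdate >= physicsInterval)
    {
        // Movement is proportional to the time that actually passed,
        // so the game runs at the same speed whether you get 35 or 120 fps.
        UpdatePhysics(timeSincePhysicsUpdate);
        timeSincePhysicsUpdate = 0;

        // Only draw when the world actually changed.
        Render();
    }
}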
Based on ideas and comments on CodeInChaos' answer, this was the final code I arrived at. I originally edited that answer, but CodeInChaos suggested this be a separate answer.
public virtual void LimitFrameRate(int fps)
{
    long freq;
    long frame;
    freq = System.Diagnostics.Stopwatch.Frequency;
    frame = System.Diagnostics.Stopwatch.GetTimestamp();
    // Wait until fpsFrameCount frames' worth of time has elapsed since fpsStartTime;
    // measuring against the start of the window lets individual Sleep() errors average out.
    while ((frame - fpsStartTime) * fps < freq * fpsFrameCount)
    {
        int sleepTime = (int)((fpsStartTime * fps + freq * fpsFrameCount - frame * fps) * 1000 / (freq * fps));
        if (sleepTime > 0) System.Threading.Thread.Sleep(sleepTime);
        frame = System.Diagnostics.Stopwatch.GetTimestamp();
    }
    // Start a new measurement window roughly once per second.
    if (++fpsFrameCount > fps)
    {
        fpsFrameCount = 0;
        fpsStartTime = frame;
    }
}