I have made a very simple hypercasual game. Everything works fine, but after a few minutes of gameplay the FPS drops from 60 to 50 and the phone even heats up. Similar to this question. I tried profiling but just can't see anything off. I tried removing some UI elements, but still no luck. I tried various vsync settings. I had also used this to display the FPS; even without it, the lag can be seen. Even if I just open the game and do nothing, after 5 minutes the FPS drops to 50. If I go back using the home button and re-enter the game, the FPS becomes 60 again. Using Unity 2018.2.6f1. I never experienced this behavior in my other Android games.
Basically it was a faulty custom vertex shader, applied to a plane, that changed the background color over time. I had not used the mobile vertex color shader because I was not getting the desired output, but now I'll stick to the mobile one.
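For reference, if all the effect needs is a background that fades between two colors over time, a minimal C# sketch (assuming the background can be the camera's clear color rather than a textured plane; the names and values here are mine) avoids a custom shader entirely:

    using UnityEngine;

    // Fades the camera's clear color over time, replacing the
    // custom background shader entirely. Attach to the camera.
    public class BackgroundColorCycler : MonoBehaviour
    {
        public Color colorA = Color.blue;   // placeholder colors
        public Color colorB = Color.black;
        public float cycleSeconds = 10f;    // full A -> B -> A cycle time

        private Camera cam;

        void Awake()
        {
            cam = GetComponent<Camera>();
            cam.clearFlags = CameraClearFlags.SolidColor;
        }

        void Update()
        {
            // PingPong bounces t between 0 and 1 over cycleSeconds.
            float t = Mathf.PingPong(Time.time / (cycleSeconds * 0.5f), 1f);
            cam.backgroundColor = Color.Lerp(colorA, colorB, t);
        }
    }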
The two symptoms you observed are very likely connected.
The phone heats up because you are using its full power, which in turn makes thermal throttling kick in, reducing the performance.
I've had the EXACT same problem and was trying to fix it for a very long time. You said something about a faulty shader you used, and this is the key to solving our problem.
I use a 2-color gradient as a background, so I have to use a shader too. Since I'm a total noob at writing shader code, I had to find something on the Internet. And that was my biggest fail)
To fix the problem and remove this FPS drop, you should remove your gradient and the shader attached to it from the scene, and try to find a more optimized shader for a 2D game (or you can always write your own c:)
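If you do need a gradient rather than a flat color, one option (a sketch of my own, not something from the answers above) is to bake the two colors into a tiny texture once at startup, so no custom shader work happens per frame:

    using UnityEngine;

    // Builds a 2-color vertical gradient texture once at startup.
    // Since this runs a single time, the SetPixel cost is irrelevant.
    public class GradientBackground : MonoBehaviour
    {
        public Color top = Color.cyan;     // placeholder colors
        public Color bottom = Color.blue;

        void Start()
        {
            // 1x64 is enough: bilinear filtering stretches it smoothly.
            var tex = new Texture2D(1, 64);
            tex.wrapMode = TextureWrapMode.Clamp;
            for (int y = 0; y < tex.height; y++)
                tex.SetPixel(0, y, Color.Lerp(bottom, top, y / (float)(tex.height - 1)));
            tex.Apply();

            // Assumes this object has a Renderer (e.g. a background quad).
            GetComponent<Renderer>().material.mainTexture = tex;
        }
    }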
I have been thinking about this for a while. I know we can write to our own textures with SetPixels, but I also know this is a really slow method that would drop framerates below 30 just by being there (due to the sync with the video card that happens afterwards).
So either I am using the wrong method, or I am doing it wrong. But I cannot find a proper way to write my own VRAM to a texture, or directly to a camera.
Long story short: if I were to build something like an emulator inside Unity, and I wanted this emulator to render pixel by pixel, either to a camera or to a texture, how would I get this going without slowing my framerate to a crawl on most devices?
Are shaders an option? If so, please point me in a direction, since I have never written one on my own yet.
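For what it's worth, one common mitigation (a sketch under my own assumptions, not a guaranteed fix on every device) is to reuse a single Color32 buffer with SetPixels32 and upload it once per frame, which avoids per-pixel calls and the slower Color-to-byte conversion:

    using UnityEngine;

    // Software "framebuffer": fill a reused Color32 array each frame,
    // then upload it to the texture in one bulk call.
    public class SoftwareFramebuffer : MonoBehaviour
    {
        const int W = 256, H = 224;     // emulator-style resolution (example)
        Texture2D screen;
        Color32[] pixels;

        void Start()
        {
            screen = new Texture2D(W, H, TextureFormat.RGBA32, false);
            screen.filterMode = FilterMode.Point;  // crisp, emulator-like pixels
            pixels = new Color32[W * H];
            GetComponent<Renderer>().material.mainTexture = screen;
        }

        void Update()
        {
            // An emulator would fill 'pixels' here; this is just a test pattern.
            byte shade = (byte)(Time.time * 60f % 256f);
            for (int i = 0; i < pixels.Length; i++)
                pixels[i] = new Color32(shade, (byte)(i % 256), 128, 255);

            screen.SetPixels32(pixels);  // one bulk copy, no per-pixel calls
            screen.Apply(false);         // upload; 'false' skips mipmap rebuild
        }
    }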
I'm currently creating a 2D Android and iOS game using the Unity3D engine. I'm testing the game on a Nexus 5 and an iPhone 5s. Everything has worked fine so far and I am pretty happy with the result, but when I test the application on an iPad or a Samsung tablet, the objects in my game scene are no longer in the correct positions. Is this a common problem in Unity3D?
I know I am missing something. The only fix my research turned up is changing the orthographic camera scale, but that looks like a lot of code to write, as my game has not just one scene but multiple scenes, and every scene has its own game objects.
Is there another good, simple workaround for this problem?
It's all about setting the Anchors right.
If you're using the new UI system, make sure you anchor the objects where you want them to be; that's how you will achieve resolution independence.
For more information about anchoring, see this tutorial.
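For completeness, anchors can also be set from code (a small sketch; the values just pin an element to the bottom-center of its parent as an example):

    using UnityEngine;

    // Pins a UI element to the bottom-center of its parent, so it
    // stays put across resolutions. Attach to an element under a Canvas.
    public class AnchorExample : MonoBehaviour
    {
        void Start()
        {
            var rt = GetComponent<RectTransform>();
            rt.anchorMin = new Vector2(0.5f, 0f);       // bottom-center of parent
            rt.anchorMax = new Vector2(0.5f, 0f);
            rt.pivot = new Vector2(0.5f, 0f);
            rt.anchoredPosition = new Vector2(0f, 20f); // 20 px above the bottom edge
        }
    }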
Don't have separate scenes for separate devices. You can use the Screen object to check the height and width of your display. Then you can use this to set the orthographic size of your camera to something that makes everything visible as expected.
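As a sketch of that idea (assuming you want a fixed world-space width visible on every device; the width value is an example):

    using UnityEngine;

    // Adjusts the orthographic camera so a fixed horizontal slice of
    // the world is always visible, whatever the device aspect ratio.
    public class OrthographicSizer : MonoBehaviour
    {
        public float targetWorldWidth = 10f;  // world units to keep on screen

        void Start()
        {
            float aspect = (float)Screen.width / Screen.height;
            // orthographicSize is half the vertical extent in world units.
            GetComponent<Camera>().orthographicSize = targetWorldWidth / (2f * aspect);
        }
    }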
Update: I misunderstood your question; you said GameObject, I understood UI.
Please check this. I don't have this issue in my game, but when I try it on Mac or Windows machines it is problematic, so maybe this can help.
The other solution is the more common one, which you and Agumander mention: change the orthographic size of the camera.
This is for UI:
You can use Unity UI and you don't need to separate scenes. There are so many devices with different resolutions and densities that you would need to create a huge number of scenes, so separating scenes per device is pointless.
Unity UI has pixel-based solutions that can be very helpful across many densities and resolutions. For example, it has VerticalLayoutGroup and HorizontalLayoutGroup for easy list-like element visualization.
The most important question is: do you want to change the UI for different screen sizes or resolutions? For example, an iPad has a larger screen, so the user can see more content. This changes the UX; you may need to consider it.
So, there are not a lot of Unity rhythm games on Android. I decided to find out why, and to program one as an assignment (the basics of it, anyway). My most important hurdle is user input. As we know, input in Unity is tied to the frame rate, and a music game (I assume) would prefer the smallest possible delay between button press and action.
If we look at music, at around 15 to 20 ms of delay the human ear hears that something is "off beat".
I've heard Android Unity games often run at 30 FPS (since 60 FPS sucks the battery dry). Simple math indicates: 1000/30 = 33 ms per frame. Subtracting the 15 ms we probably cannot notice, we are at 18 ms of possible disaster, and that assumes we always hit 30 FPS at any given moment.
When I get input from a user, I can play a sound in the exact same frame as that input. However, we could still be 18 ms off.
Now, there is a way to get DIRECT input from mouse and keyboard, which uses OnGUI() instead of Update() to get keyboard or mouse click events on the spot. The problem is that Android probably doesn't work with this (it doesn't work for gamepads either), and the method sounds downright strange, especially when we try to play sounds from within OnGUI().
My question:
What would you do, and why? Should we just accept being up to 18 ms off and assume we always reach 30 FPS, or should we look for a reliable way to get input directly, instead of waiting for an Update to come by?
Thanks for any insight you can give me; I have not found any useful articles on this yet.
-Smiley
EDIT
I just did some basic testing with a metronome, running at 100 FPS in the editor (which should be 10 ms per frame), tapping my spacebar along with a metronome inside Unity. The results I got were just horrible.
Tapping rapidly: I get as close as 20 ms to my metronome tick, but nothing closer.
Tapping on the beat: I was at least 200 ms off my target tick. Unless I am confused about the rhythm, this is just wrong.
Currently I use Debug.Log to get my test data into the log. Can anyone confirm whether this may be the cause (does it add a long delay? I know Debug isn't that optimized), or is the timing actually that bad?
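For reference, this is roughly the measurement setup I mean (a sketch; the names are mine): timestamps are taken against the audio clock and only written out at the end, so Debug.Log itself stays out of the hot path.

    using System.Collections.Generic;
    using System.Text;
    using UnityEngine;

    // Measures tap offsets against a metronome using the audio clock
    // (AudioSettings.dspTime) and defers all logging until the end.
    public class TapLatencyProbe : MonoBehaviour
    {
        public double beatInterval = 0.5;   // 120 BPM metronome (example)
        readonly List<double> offsetsMs = new List<double>();
        double startDsp;

        void Start()
        {
            startDsp = AudioSettings.dspTime;
        }

        void Update()
        {
            if (Input.GetKeyDown(KeyCode.Space))
            {
                double t = AudioSettings.dspTime - startDsp;
                // Distance to the nearest beat, in milliseconds.
                double nearest = System.Math.Round(t / beatInterval) * beatInterval;
                offsetsMs.Add((t - nearest) * 1000.0);
            }
        }

        void OnDestroy()
        {
            var sb = new StringBuilder("Tap offsets (ms): ");
            foreach (double o in offsetsMs) sb.Append(o.ToString("F1")).Append(' ');
            Debug.Log(sb.ToString());       // a single log, after the test
        }
    }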
Thanks in advance,
-Smiley
First, I'd like to add a few things to your comments and analysis:
There is a hell of a lot more to measuring the latency between the tactile input and what your eyes perceive.
Probably the biggest one I see missing in your tests is the latency between the graphics card and the PC monitor you're testing on. Many common LCD monitors these days have between 15 and 30 ms of processing lag. I don't know how this translates to mobile screens and hardware, but I would suggest you take the time to perform additional tests on your target hardware before drawing further conclusions.
To more directly answer your question:
I would continue to use Unity, but I would keep researching the best methods of taking the input and feeding it back to the player as fast as possible. In the comments above, rutter has pointed you to what appears to be a pretty good thread on the issue.
Of specific note, I would look at using FixedUpdate() to decouple the input processing from the game's rendering framerate.
I think it is also worth putting time into researching the psychology of the perception of latency. For example, if your game is a Guitar Hero-style game of matching the playing song, you could simply measure the lag you know is there and account for it in your game logic when checking input.
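A sketch of that compensation idea (the offset is something you would calibrate per device, not a known constant; the names are mine):

    using UnityEngine;

    // Compares a tap time against the target beat time after
    // subtracting a calibrated input/display latency offset.
    public class LagCompensatedJudge : MonoBehaviour
    {
        public double latencyOffsetSec = 0.05; // measured via a calibration screen
        public double hitWindowSec = 0.08;     // how far off still counts as a hit

        public bool IsHit(double tapDspTime, double beatDspTime)
        {
            double adjusted = tapDspTime - latencyOffsetSec;
            return System.Math.Abs(adjusted - beatDspTime) <= hitWindowSec;
        }
    }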
I think you are overcomplicating this, and the accuracy issue is nowhere near as bad as you think.
People usually hit the buttons a little early in order to sync what they are seeing and hearing.
It also depends a lot on whether you have some kind of scrolling display that players are trying to match up to... if the display is scrolling smoothly at 30 FPS (without big jumps), they are still able to make their timing presses fairly accurate.
I would surmise that although people can hear when their timing is off, their actual timing of hitting the buttons at exactly the right moment is not that accurate anyway.
Here is one other simple solution, which I think is what Rock Band and Guitar Hero often do...
You start playing the note/sound at the correct time anyway... then change it to a broken sound if you detect that the player missed it or goofed up.
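A rough sketch of that trick (the names are mine; the point is scheduling the "good" sound on the audio clock in advance and only swapping it if a miss is detected):

    using UnityEngine;

    // Schedules the correct note ahead of time on the audio clock,
    // then swaps to a "broken" sound only if the player misses.
    public class ScheduledNote : MonoBehaviour
    {
        public AudioSource source;    // plays the good note
        public AudioClip brokenClip;  // fallback sound on a miss

        public void QueueNote(double dspTimeOfBeat)
        {
            source.PlayScheduled(dspTimeOfBeat); // sample-accurate start
        }

        public void OnMissDetected()
        {
            // Cut the good note and play the broken one instead.
            source.Stop();
            source.PlayOneShot(brokenClip);
        }
    }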
This question is going to sound odd, because I really don't understand how this could be possible, but here goes. I have some collision code for a 2D game of mine that works perfectly fine on Windows, Xbox, WP7, and WP8. But for some odd reason the exact same code does not work when I run my game as a Windows 8 Metro app. What's even weirder is that the code works when I run the same project on my Surface, but when I run it on my PC the bullets just go straight through the enemy. I don't think posting the code would be of any use, since the code is identical everywhere I use it and I KNOW for a fact that it works. If anyone knows how this is even possible, please let me know. If you want me to post the code, let me know.
I'll explain a bit of what the code is doing:
Loops through all the player's bullets
  Loops through all the enemies
    If a rectangle collision takes place
      If a per-pixel collision takes place
        Kill enemy, remove bullet, etc.
The game runs fine as an XNA game on my PC, which is the same PC I use to test it as a Metro app.
This might be a machine-speed problem. Does your game loop run on a fixed timer, or do you run updates/draws as fast as you can? It's possible the bullets aren't colliding because on one update they're in front of the enemy, and on the next they're behind it. Try 'widening' the enemies or bullets as a debug step; that may fix it. If this is the case, you may have to do some bullet sub-stepping within the update to make sure the bullet visits all the locations in between and doesn't teleport through the enemies.
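A sketch of that sub-stepping, using XNA-style types (Bullet and Enemy are stand-ins for your own classes):

    // Moves the bullet in small steps so a fast bullet can't skip over
    // an enemy between two updates. Assumes XNA's Rectangle and Vector2.
    void UpdateBullet(Bullet bullet, Enemy enemy, float dt)
    {
        Vector2 move = bullet.Velocity * dt;
        // Step no further than half the enemy's width at a time.
        int steps = (int)(move.Length() / (enemy.Bounds.Width * 0.5f)) + 1;
        Vector2 stepMove = move / steps;

        for (int i = 0; i < steps; i++)
        {
            bullet.Position += stepMove;
            bullet.UpdateBounds(); // refresh bullet.Bounds from Position (your code)
            if (bullet.Bounds.Intersects(enemy.Bounds))
            {
                // Rectangle hit; the per-pixel check would go here.
                enemy.Kill();
                bullet.Remove();
                return;
            }
        }
    }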
If you are using pixel values to test for collision, you may be using DIPs (device-independent pixels) while thinking they are physical screen pixels (something new with Metro; in fact, it's the default).
Set your app to run in the simulator and set the sim's screen resolution to 1366x768. Does it suddenly work correctly? If so, it's a DIP issue.
Start here: http://msdn.microsoft.com/en-us/library/windows/desktop/ff684173(v=vs.85).aspx
Notice the formula halfway down the page: DIPs = pixels / (DPI / 96.0)
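As a sketch, conversion helpers built from that formula (96 is the baseline DPI; the actual DPI comes from your display):

    // From the MSDN formula: DIPs = pixels / (DPI / 96.0)
    static float PixelsToDips(float pixels, float dpi)
    {
        return pixels / (dpi / 96.0f);
    }

    static float DipsToPixels(float dips, float dpi)
    {
        return dips * (dpi / 96.0f);
    }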
I've created a WPF application using C# that renders about 50 3D elements and animates the camera in order to move around the scene.
While it all works, I've noticed as the camera moves, ugly video tearing occurs.
I'm well aware of the standard cause of tearing, ie the application is updating frames at a different rate from the monitor/adapter vertical sync.
I've also read that the default screen update rate in WPF is 60 frames per second, and that on XP, WPF updates without regard to the monitor's v-sync.
For certain scenes the tearing is less noticeable, but often it is distracting enough to be a real problem.
I'm really hoping there is a resolution; after all, Windows Forms has had double buffering for years. Moving to WPF is actually a backward step for many types of application if there is no resolution.
So, on Windows XP, how can I configure WPF to remove tearing?
I'm looking for more helpful answers than "move to Vista". This is simply out of the question for many of the future users of this application.
No answers yet; surely, with the number of WPF developers out there, someone else has seen this problem?
In the meantime I've investigated/read some more and have come across the Timeline.DesiredFrameRate property.
With my display running at a 50 Hz vertical refresh, I tried setting this to 50 FPS in the hope this would bring things into sync. Unfortunately it had no effect on the tearing.
However, I had to apply it to a single specific animation I had created (in code) using BeginAnimation(), and I'm not sure how that would affect a "global" framerate at all.
Instead, I would have expected to need to set this at a more global level, say on the viewport; however, I couldn't find a DesiredFrameRate property on the viewport object.
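For the single-animation case, the attached-property setter looks like this (a sketch; the animation and its target are placeholders):

    // Request 50 FPS for just this one animation via the attached property.
    var anim = new DoubleAnimation(0, 360, TimeSpan.FromSeconds(5));
    Timeline.SetDesiredFrameRate(anim, 50);
    rotateTransform.BeginAnimation(RotateTransform.AngleProperty, anim);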
Another area to look at could be the RenderCapability.Tier but I haven't had the time to look in detail yet.
[Update]
No solution but a useful tip. To set the DesiredFramerate globally (ie for all animations in the viewport) you can override the PropertyMetadata for the
Timeline.DesiredFrameRateProperty dependency property:
    // Set the global desired framerate to 50 frames per second
    Startup += delegate(object sender, StartupEventArgs e)
    {
        Timeline.DesiredFrameRateProperty.OverrideMetadata(typeof(Timeline),
            new PropertyMetadata(50));
    };
It's best to place this in the constructor of the standard WPF Application class.
Unfortunately this didn't solve my problem, because even when setting it to 50 FPS, it must not be synced accurately enough with the monitor's v-sync.
It is handy to know about, though, particularly if you have slow-moving animations and want to save CPU usage by setting the FPS much lower than the default 60 FPS.
[Update]
Setting DesiredFrameRate to some very high value such as 100+ seems to reduce, but not remove, the tearing. One big drawback, however, is that the CPU usage (on my 4-year-old PC) goes to 35%.