LocalEulerAngle is different from inspector [duplicate] - c#

What I know about Unity rotation is that it shows Euler angles in the Inspector while behind the scenes it uses quaternions. But I am surprised that my local Euler angle and the Inspector's Y rotation are not the same once the Inspector value changes from a positive number to a negative one. Why?
I am using this
transform.localEulerAngles.y
to get the local Euler angles, and I noticed that the value keeps increasing in the positive direction, like:
Inspector : localEulerAngles in code
97.04301 : 97.04301 // matched
158.659 : 158.659 // matched
-179.094 : 180.9064 // not matched
-170.812 : 189.1875 // not matched

Unity, and every popular game engine for that matter, uses quaternions for all rotational representation and computation. With Euler angles you quickly run into issues like gimbal lock, so they are never used internally.
What you see in the Inspector is for your own understanding and visualization. Quaternions are genuinely difficult to grasp, so they are hidden from the average user. The values retrieved in code are quaternions converted to Euler angles, whereas the Inspector keeps its own Euler values, which is why the two can disagree.

localEulerAngles returns values between 0 and 360, as briefly stated in the documentation, while the Inspector keeps showing the signed values you typed. The angles read back in code are mostly informative.
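If you want the value in code to match the signed angle shown in the Inspector, you can remap the 0-360 range yourself. A minimal sketch (the helper name is my own):

// Remap an angle from [0, 360) to (-180, 180] so it matches the Inspector display.
float ToSignedAngle(float angle)
{
    return angle > 180f ? angle - 360f : angle;
}

// Example: ToSignedAngle(180.9064f) returns -179.0936f, matching the table row above.
float inspectorStyleY = ToSignedAngle(transform.localEulerAngles.y);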


How to make the rotation angle smooth in unity? [duplicate]

I compute the angle the drone needs to turn so that it looks at the player. I figured that out, but the drone can change its target, and during the target change it abruptly snaps its rotation. How can this transition be made smooth?
var finalAngle = Vector3.Angle(targetPosition - _droneTransform.localPosition, _droneTransform.forward);
_droneTransform.localEulerAngles = new Vector3(_finalPitch, finalAngle, _finalRoll);
You can use Vector3.RotateTowards()
https://docs.unity3d.com/ScriptReference/Vector3.RotateTowards.html
In this scenario, to prevent gimbal lock, I'd suggest precalculating the quaternion rotations and using Quaternion.RotateTowards()
https://docs.unity3d.com/ScriptReference/Quaternion.RotateTowards.html
Precalculate the rotations using Quaternion.Euler()
https://docs.unity3d.com/ScriptReference/Quaternion.Euler.html
The quickest way to get a rotation that looks at a target is by using Quaternion.LookRotation()
https://docs.unity3d.com/ScriptReference/Quaternion.LookRotation.html
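Putting those pieces together, here is a minimal sketch of the drone's update; the _turnSpeed field and its value are my own assumptions, while _droneTransform and targetPosition come from the question:

// Maximum turn rate in degrees per second; tune to taste.
[SerializeField] private float _turnSpeed = 120f;

void Update()
{
    // World-space rotation that would make the drone look straight at the target.
    Quaternion lookRotation = Quaternion.LookRotation(targetPosition - _droneTransform.position);

    // Step towards it by at most _turnSpeed degrees this frame,
    // so a target change no longer snaps the rotation.
    _droneTransform.rotation = Quaternion.RotateTowards(
        _droneTransform.rotation, lookRotation, _turnSpeed * Time.deltaTime);
}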

How to make a perfect cube

In Unity, I create a cube with scale 1,1,1 at position 0,1,0. Then I place it above a plane with scale 15,1,5000 at position 0,0,0.
I check whether the cube is below 1 on the Y axis; to me that means the cube has fallen off the plane. I can control the cube by moving left or right. If I go left, there's no issue. If I go right, my Y position becomes 0.9999998~. That makes my falling check turn true even though the cube is still on the plane. Somehow the cube seems not to be a perfect cube. I hope someone can enlighten me on why this is happening. Thanks!
This may not be the answer you want, but, in plain words, computer arithmetic is finite (search for floating-point arithmetic). So the "perfect cube" you are looking for simply does not exist in the finite representation a machine can hold.
Moreover, Unity has its own physics engine that (like all physics engines) approximates the calculus of real world during each operation (translation, rotation, scaling).
The only way in which you can overcome the problem is by doing comparisons not with exact values (0, 1) but with ranges.
To keep "order" in your scene's coordinate system you could also, at fixed intervals, "adjust" your values, for example manually snapping a coordinate to 1 when it is between 0.95 and 1.05 (tune those thresholds to your world's scale, of course).
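As an example, a sketch of such a range-based ground check (the 0.05 tolerance is an arbitrary value to tune for your scale):

// Treat the cube as fallen only when Y drops clearly below the resting height.
const float restingY = 1f;
const float tolerance = 0.05f; // tune to your world's scale
bool hasFallen = transform.position.y < restingY - tolerance;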
Related note: in your comment you say "But my point is that why it seems like the cube is not perfect 1x1x1. Somehow it's like 1x1x0.9999998". The fact is that an engine like Unity does not store an object's size in memory but its vertices' coordinates. It feels as if the object's dimensions changed during the translation, but strictly speaking they did not: what you see is just the finite approximation of the vertices' X, Y, Z values.

Unity Coordinate Limitations and its impact

Some objects which I have placed at position (-19026.65, 58.29961, 1157), far from the origin (0,0,0), are rendering with issues; the problem is referred to as spatial jitter (SJ). You can check the details here, or see the image below: the objects render with black spots/lines, or maybe it is mesh flickering. (Actually, I can't quite describe the problem; hopefully the picture helps you understand it.)
I have also tried changing the camera's near and far clipping planes, but it was useless. Why am I getting this? Maybe my objects and camera are too far from the origin.
Remember:
I have a large environment, and some of my game objects (where the problem is) sit at position (-19026.65, 58.29961, 1157). I guess the problem is that the objects and the camera are very far from the origin (0,0,0). I found numerous discussions, given below:
GIS Terrain and Unity
Unity Coordinates bound Question at unity
Unity Coordinate and Scale - Post
Infinite Runner and Unity Coordinates
I could not find the minimum or maximum limits for placing an object in Unity so that it still renders correctly.
Since the world origin is a Vector3(0,0,0) of floats, the theoretical maximum coordinate is 3.402823 × 10^38, the largest value a single-precision float can hold. However, as you are finding, this does not mean that placing something that far out will work properly; your practical limit is set by precision and by the other performance factors in your game. If you need items placed at such points in world space, consider building objects at runtime relative to the camera. That keeps the math near the origin and lets things work at different distances.
Unity itself recommends not going further than 100,000 units from the center; the editor will warn you if you do. Notice that in today's gaming world, many games move the world around the player rather than the player around the world.
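A minimal sketch of that "move the world around the player" idea, often called a floating origin; the field names and the 5000-unit threshold here are my own, hypothetical choices:

// When the player strays too far from the origin, shift all world content
// back by the player's offset so coordinates stay small and precise.
[SerializeField] private Transform _player;
[SerializeField] private Transform[] _worldRoots; // top-level scene content
private const float Threshold = 5000f; // hypothetical recentre distance

void LateUpdate()
{
    if (_player.position.magnitude < Threshold) return;

    Vector3 offset = _player.position;
    foreach (Transform root in _worldRoots)
        root.position -= offset;
    _player.position = Vector3.zero;
}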
To quote Dave Newson's site (Read Here):
Floating Point Accuracy
Unity allows you to place objects anywhere within the limitations of the float-based coordinate system. The limitation for the X, Y and Z Position Transform is 7 significant digits, with a decimal place anywhere within those 7 digits; in effect you could place an object at 12345.67 or 12.34567, for just two examples.
With this system, the further away from the origin (0.000000 - absolute zero) you get, the more floating-point precision you lose. For example, accepting that one unit (1u) equals one meter (1m), an object at 1.234567 has a floating point accuracy to 6 decimal places (a micrometer), while an object at 76543.21 can only have two decimal places (a centimeter), and is thus less accurate.
The degradation of accuracy as you get further away from the origin becomes an obvious problem when you want to work at a small scale. If you wanted to move an object positioned at 765432.1 by 0.01 (one centimeter), you wouldn't be able to, as that level of accuracy doesn't exist that far away from the origin.
This may not seem like a huge problem, but this issue of losing floating point accuracy at greater distances is the reason you start to see things like camera jitter and inaccurate physics when you stray too far from the origin. Most games try to keep things reasonably close to the origin to avoid these problems.
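You can verify the quoted 765432.1 example directly; a one-off check:

// At this magnitude a float's spacing is 0.0625 units, so adding one
// centimeter rounds straight back to the original value.
float far = 765432.1f;
Debug.Log(far + 0.01f == far); // prints True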

Understanding gyrometer readings using the Unity3D API

I am building a game that at one point relies on raw gyroscope readings. I am getting the readings with the following code (I'm using C#):
void Start () {
    // Unity exposes the gyroscope as Input.gyro (there is no Input.gyrometer)
    Input.gyro.enabled = true;
}
void Update() {
    _guiText = Input.gyro.attitude.eulerAngles.ToString();
}
I'm displaying the values on screen in the OnGUI() method. As I'm using Unity I'm in a left-handed coordinate system (the device gyroscope itself is right-handed), and I've noticed that the Y value from the gyro corresponds to the X rotation in Unity.
The problem is that these values jump around massively. For example, I can hold the device at X:90° and rotate slowly around the Y axis; after a certain point (around the 270° mark) X suddenly flips by 180°. It also means that if I start the app while the device is at 270° on Y, the X reading is completely unusable. The same effect happens on the Z axis.
I have managed to get round this on iOS devices by resetting the gyro readings using the following snippet: Input.gyro.enabled = false; followed by Input.gyro.enabled = true; I use this to get an accurate reading of the device's absolute rotation and then I use rotational changes to get any other device movement. This doesn't work on Android however.
Has anyone come across this before? If so how did you get round it or is a bug that cannot be resolved?
This is because you are reading a conversion of a quaternion. Euler angles in Unity are just a friendly way to get a Vector3 representation of something that is hard to grasp. It is recommended not to set eulerAngles directly, nor to rely on them as a reference. Instead, treat the flat device pose as one reference rotation, then measure the angle between that default rotation and the current rotation, and use that value to drive your actions.
You can use Quaternion.Angle for that purpose.
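A minimal sketch of that approach: capture a reference attitude once, then measure the offset from it each frame (in practice you may want to capture the reference a few frames in, after the gyro has warmed up):

private Quaternion _reference;

void Start()
{
    Input.gyro.enabled = true;
    _reference = Input.gyro.attitude; // pose at calibration time
}

void Update()
{
    // Unsigned angle, in degrees, between the calibration pose and now;
    // drive your game logic from this instead of raw Euler angles.
    float delta = Quaternion.Angle(_reference, Input.gyro.attitude);
    _guiText = delta.ToString();
}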

OpenCV - How to detect and measure an angle between two frames?

I'm trying to understand and use OpenCV. I wanted to know if it is possible to find and measure the angle of rotation between two frames.
To explain: the camera is fixed, and the frames rotate around the center without translating. For now I rotate manually, and I would like to be able to compare frames and get the angle back. For instance:
double getRotation(Image img1, Image img2) {
    // Compare the frames
    // Return the rotation angle
}
and then I rotate following that angle.
If you're able to detect static objects (e.g., the background) in the frames, you can find points called good features to track (cvGoodFeaturesToTrack) on the background and follow those points with optical flow (cvCalcOpticalFlowPyrLK).
If the rotation is only in the xy plane, you can recover it with cvGetAffineTransform.
Since only rotation is allowed (no translation or scaling), it is not difficult to determine the angle of rotation from the transformation matrix obtained by cvGetAffineTransform. For a pure rotation that matrix looks like (see Wikipedia):
[ cos θ  -sin θ ]
[ sin θ   cos θ ]
where θ is the rotation angle.
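Once you have that matrix, extracting θ is a single atan2 call. A sketch in C#, assuming m holds the affine matrix entries as a plain 2D array:

// For a pure rotation, m[0,0] = cos(theta) and m[1,0] = sin(theta),
// so Atan2 recovers the angle with the correct sign and quadrant.
double GetRotationAngleDegrees(double[,] m)
{
    double radians = Math.Atan2(m[1, 0], m[0, 0]);
    return radians * 180.0 / Math.PI;
}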
Well, this might be very tricky. A simpler solution might be to find the Hough lines of the frames. Of course you would need to determine which stable lines can be tracked between the two frames; once those are available, you can find the angle between them. What Andrey has suggested for finding the angle should be usable as well.
