AxisAngleRotation3D to Quaternion gives unexpected result - c#

I am developing a controller in WPF 3D (using C#) to be able to easily move and rotate a ProjectionCamera using Move() and Pitch() functions. My controller will become a Behavior<ProjectionCamera> that can be attached to a ProjectionCamera. In order to initialize the controller, I want to calculate the current rotation of the camera by looking at its current Up and Forward vectors and comparing them to the default camera orientation (Up = [0 1 0], Forward = [0 0 -1]). In other words, I want to calculate a rotation that will transform the camera's default orientation into its current one.
Ultimately, I want to express the rotation as a single Quaternion, but as an intermediate step I first calculate the proper Euler angle rotations of the form z-N-Z, expressed as AxisAngleRotation3D values, following the definition on Wikipedia:
var alphaRotation = CalculateRotation(z, x, N);
var betaRotation = CalculateRotation(N, z, Z);
var gammaRotation = CalculateRotation(Z, N, X);
with
AxisAngleRotation3D CalculateRotation(Vector3D axisOfRotation, Vector3D from, Vector3D to)
The Euler Angle Rotations seem to be calculated correctly, based on some unit tests. However, when I convert these rotations to a single Quaternion, the resulting Quaternion represents a rotation that differs from the Euler Angle Rotations, and I don't know why.
This is how I convert the Euler Angles to a single Quaternion:
var rotation =
    new Quaternion(alphaRotation.Axis, alphaRotation.Angle) *
    new Quaternion(betaRotation.Axis, betaRotation.Angle) *
    new Quaternion(gammaRotation.Axis, gammaRotation.Angle);
For example, when I initialize a ProjectionCamera with an UpDirection of [1 0 0], meaning it's been rotated 90 degrees around its LookDirection axis ([0 0 -1]), the calculated Euler Angle Rotations are as follows:
alphaRotation --> 90 deg. around [0 1 0]
betaRotation --> 90 deg. around [0 0 -1]
gammaRotation --> -90 deg. around [1 0 0]
My test verifies that, when applied in order, these rotations transform the default Up vector ([0 1 0]) into the current Up vector ([1 0 0]), effectively rotating it 90 deg. around the [0 0 -1] axis. (It's also reasonably straightforward to verify this by hand.)
However, when I apply the calculated quaternion rotation to the default Up vector, it is transformed into the vector [-1 0 0], which is obviously wrong. I have hard-coded these values in a unit test and got the same results:
[TestMethod]
public void ConversionTest()
{
    var vector = new Vector3D(0, 1, 0);

    var alphaRotation = new AxisAngleRotation3D(new Vector3D(0, 1, 0), 90);
    var betaRotation = new AxisAngleRotation3D(new Vector3D(0, 0, -1), 90);
    var gammaRotation = new AxisAngleRotation3D(new Vector3D(1, 0, 0), -90);

    var a = new Quaternion(alphaRotation.Axis, alphaRotation.Angle);
    var b = new Quaternion(betaRotation.Axis, betaRotation.Angle);
    var c = new Quaternion(gammaRotation.Axis, gammaRotation.Angle);
    var combinedRotation = a * b * c;

    var x = Apply(vector, alphaRotation, betaRotation, gammaRotation);
    var y = Apply(vector, combinedRotation);
}
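(The Apply helpers are not shown above; for reference, a minimal sketch of what they might look like, using RotateTransform3D from System.Windows.Media.Media3D:)
// Hypothetical helpers, assumed to match the ones used in the test above.
private static Vector3D Apply(Vector3D vector, params AxisAngleRotation3D[] rotations)
{
    foreach (var rotation in rotations)
        vector = new RotateTransform3D(rotation).Transform(vector);
    return vector;
}

private static Vector3D Apply(Vector3D vector, Quaternion rotation)
{
    return new RotateTransform3D(new QuaternionRotation3D(rotation)).Transform(vector);
}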
When you run the test above, you will see that x gives you the expected vector ([1 0 0]), but y will be different, even though it should represent exactly the same rotation.
What am I missing?

I solved the issue. Apparently, the order of multiplication of the individual Quaternions should be reversed. So, to convert Euler angles (or any sequence of rotations) into a single Quaternion in .NET, you should do the following:
var rotation = gammaRotation * betaRotation * alphaRotation;
where rotation represents geometrically applying alphaRotation first, then betaRotation, and finally gammaRotation. I just wish this were documented, since the meaning of the multiplication order depends on the specific library you are working with.
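Applied to the example from the test above, the corrected combination looks like this (a small sketch using WPF's RotateTransform3D to apply the quaternion):
// Rotations from the example above.
var a = new Quaternion(new Vector3D(0, 1, 0), 90);   // alpha
var b = new Quaternion(new Vector3D(0, 0, -1), 90);  // beta
var c = new Quaternion(new Vector3D(1, 0, 0), -90);  // gamma

// Reversed multiplication: alpha is applied first, then beta, then gamma.
var combined = c * b * a;

var transform = new RotateTransform3D(new QuaternionRotation3D(combined));
var result = transform.Transform(new Vector3D(0, 1, 0));  // expected: [1 0 0], matching the sequential application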

Related

C# Set Rotation of Object to Face Coordinates

I'm making a game in a custom engine using C#. (Not Unity)
I've got a large grid and the x/y coordinates of two objects. The Player object and the Destination object. Along with the player's current rotation in degrees (0-360).
I've become too reliant on existing game engines and cannot work out how to find the rotation I need to put the player at to face the target.
double playerRotation; // 0 to 360 degrees.
double playerX = 47.43;
double playerY = 43.36;
double targetX = 52.15;
double targetY = 38.67;
My method at the moment is to try and get the distance between the objects by:
double distanceX = Math.Abs(playerX - targetX);
double distanceY = Math.Abs(playerY - targetY);
Which seems to work fine. Then I need to rotate the player to face the destination and have them move towards it until distanceX and distanceY reach 0.
Edit: I've been messing with Atan2 to try and get the answer.
Vector2 playerCoords = new Vector2(playerX, playerY);
Vector2 targetCoords = new Vector2(targetX, targetY);
double theta = Math.Atan2((targetCoords.yValue - playerCoords.yValue), (targetCoords.xValue - playerCoords.xValue));
theta = theta * (180 / Math.PI);//Convert theta to degrees.
double sigma = playerRotation;//Direction in degrees the player is currently facing.
double omega = sigma - theta;
OutputLog("omega: " + omega);
My output log should show the heading in degrees my player needs to face the target, but it's giving me the wrong results.
Player: (4782, 4172) and
Target: (4749, 4157)
Angle should be about 286~.
But Theta = -155 and omega = 229.
Vector math can be very helpful, and it's not that complicated.
The first vectors would be your player's position and the destination's:
Vector2 playerPos = new Vector2(playerX, playerY);
Vector2 destinationPos = new Vector2(destinationX, destinationY);
Now you can just subtract one vector from the other to get a vector which points from one position to the other.
Vector2 delta = destinationPos - playerPos; // Note: it might be the other way around: playerPos - destinationPos
That delta vector has a length, and that is the distance between both points. There is usually a Length and a LengthSquared member available on the Vector class. Be aware, however, that calculating the length is relatively CPU-intensive because it uses a square root. If you want to compare that distance to a fixed distance like 200, just use the length squared and compare it to (200 * 200), which is much faster.
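For example, a sketch of that squared-distance check (reusing playerPos and destinationPos from above; the 200-unit range is just an illustration):
Vector2 delta = destinationPos - playerPos;
const float range = 200f;
if (delta.LengthSquared() <= range * range)
{
    // the player is within 200 units of the destination; no square root needed
}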
You can also use that delta to let a bullet fly from one position to the other. You just need to normalize delta (there's a method for it), which scales it down to length one. You can then use that normalized delta, multiplied by a speed, on each physics cycle to change the bullet's position.
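A sketch of that bullet movement (assuming System.Numerics.Vector2 and made-up speed/deltaTime parameters):
// Advance a bullet one physics step toward a target.
static Vector2 Advance(Vector2 position, Vector2 target, float speed, float deltaTime)
{
    Vector2 delta = target - position;
    if (delta == Vector2.Zero)
        return position;                          // already at the target
    Vector2 direction = Vector2.Normalize(delta); // scaled down to length one
    return position + direction * speed * deltaTime;
}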
Now to get the angle, you can just use:
double angle = Math.Atan2 (delta.Y, delta.X); // Note that y and x are reversed here, and it should be like that.
Note that this angle is in radians, not degrees. Atan2 returns a value between -PI and PI; a full circle is 2 * PI. To convert radians to degrees, you can see this question.
Edit
I always assume that 12 o'clock is 0 degrees, 3 o'clock is 90, 6 o'clock is 180 and 9 o'clock is 270.
But actually, in the Cartesian coordinate system, things are a bit different. I also made this false assumption in the code below.
But it turned out that I was wrong. See this picture.
Now if you look at my source code, all my variables are named incorrectly. If you look at their values, however, you can see that they match up with the picture. Therefore the Atan2 conversion behaves as it's supposed to.
// Using Vector2 from System.Numerics
internal static double RadianToDegree(double rad)
{
    double thetaDegree = rad * (180.0 / Math.PI);
    // Convert negative angles into positive ones
    // https://stackoverflow.com/a/25725005/7671671
    double thetaDegree2 = (thetaDegree + 360) % 360;
    return thetaDegree2;
}

internal void Run()
{
    // Player: (4782, 4172) and Target: (4749, 4157)
    Vector2 player = new Vector2(4782, 4172);
    Vector2 target = new Vector2(4749, 4157);
    Vector2 delta = target - player;

    double theta = Math.Atan2(delta.Y, delta.X);
    double thetaDegree = RadianToDegree(theta);

    // Given cartesian coordinate system
    // positive y is up, negative is down
    // positive x is right, negative is left
    // Falsely assuming up is 0
    // Falsely assuming right is 90
    // Falsely assuming down is 180
    // Falsely assuming left is 270
    Vector2 v0 = new Vector2(0, 1);
    Vector2 v45 = new Vector2(0.5f, 0.5f);
    Vector2 v90 = new Vector2(0.5f, 0);
    Vector2 v180 = new Vector2(0, -1);
    Vector2 v270 = new Vector2(-1, 0);

    double theta0 = Math.Atan2(v0.Y, v0.X);
    double theta45 = Math.Atan2(v45.Y, v45.X);
    double theta90 = Math.Atan2(v90.Y, v90.X);
    double theta180 = Math.Atan2(v180.Y, v180.X);
    double theta270 = Math.Atan2(v270.Y, v270.X);

    double result0 = RadianToDegree(theta0);
    double result45 = RadianToDegree(theta45);
    double result90 = RadianToDegree(theta90);
    double result180 = RadianToDegree(theta180);
    double result270 = RadianToDegree(theta270);

    // result0   --> 90
    // result45  --> 45
    // result90  --> 0
    // result180 --> 270
    // result270 --> 180
}

Combine more than 3 rotations (Quaternions)

I have a 3D point and the x,y,z rotations (qInitial) for that point.
I want to rotate that point further (by some amount that could be 0 up to 360 degrees) around the y axis (qYextra). How can I calculate the final Euler rotation (qResult.eulerAngles) that is a combination of these 4 rotations (x-y-z-y)?
I have tried calculating the initial quaternion rotation and the extra rotation to be applied, and then multiplying these two quaternions. However, I get weird results (probably gimbal lock).
Code in C#. Unity.
Quaternion qX = Quaternion.AngleAxis(rotationFromBvh.x, Vector3.right);
Quaternion qY = Quaternion.AngleAxis(rotationFromBvh.y, Vector3.up);
Quaternion qZ = Quaternion.AngleAxis(rotationFromBvh.z, Vector3.forward);
Quaternion qYextra = Quaternion.AngleAxis(angle, Vector3.up);

Quaternion qInitial = qY * qX * qZ; // Yes. This is the correct order.
// qY * qX * qZ gives exactly the same Euler x, y, z results as Quaternion.Euler(rotationFromBvh).
Quaternion qResult = qInitial * qYextra;
return qResult.eulerAngles;
I can confirm that the code works fine (no gimbal lock) when the 4th rotation is 0 degrees (qYextra = identity), meaning that qInitial is correct. So the error might be due to the combination of those 2 rotations (qInitial and qYextra) OR due to the conversion from Quaternion to Euler.
EXAMPLE: (qYextra angle is 120 degrees)
RESULTS:
qInitial.eulerAngles gives these results: applying_qInitial_rotation
qResult.eulerAngles gives these results: applying_qResult_rotation
EXPECTED RESULTS:
The expected results should be like qInitial but rotated 120 degrees around y.
Any suggestions? I haven't yet found a solution, and probably I won't.
In your question, you write:
How can I calculate the final Euler rotation that is a combination of these 4 rotations (x-y-z-y)?
However, in your code, you write
Quaternion qInitial = qY * qX * qZ; // Multiply them in the correct order.
I don't know Unity, but I would have expected that you would want the order of the rotations to match x, y, z rather than y, x, z.
You stated that it works when the y-rotation is 0, in which case the place of the y-rotation in the order becomes irrelevant.
Do you get the correct result if you instead write the code below?
Quaternion qInitial = qX * qY * qZ; // Multiply them in the correct order.
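If it helps, a quick way to check is to log the Euler angles produced by both orderings against Quaternion.Euler (a sketch assuming a MonoBehaviour and the question's rotationFromBvh):
using UnityEngine;

public class RotationOrderCheck : MonoBehaviour
{
    public Vector3 rotationFromBvh; // the question's input angles, in degrees

    void Start()
    {
        Quaternion qX = Quaternion.AngleAxis(rotationFromBvh.x, Vector3.right);
        Quaternion qY = Quaternion.AngleAxis(rotationFromBvh.y, Vector3.up);
        Quaternion qZ = Quaternion.AngleAxis(rotationFromBvh.z, Vector3.forward);

        Debug.Log(Quaternion.Euler(rotationFromBvh).eulerAngles); // Unity's reference ordering
        Debug.Log((qY * qX * qZ).eulerAngles);                    // the question's ordering
        Debug.Log((qX * qY * qZ).eulerAngles);                    // the ordering suggested above
    }
}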

How to apply transformations to a matrix from arbitrary rotations

I have objects that receive 4 transformations, step by step, at different locations in the software. We use a CAD engine library, so I have custom objects to do the transformations.
The Situation
I need to recover the translation and rotation from the matrix of one of the objects, so that I can later create a new matrix from those rotations.
The Code (sample)
// create a transformation object, very simple 1 translation and 3 rotations
var t = new Translation(10, 15, 20)
* new Rotation(10d.ToRadian(), Vector3D.AxisX)
* new Rotation(10d.ToRadian(), Vector3D.AxisY)
* new Rotation(10d.ToRadian(), Vector3D.AxisZ);
// convert to the windows Matrix3D
var m1 = ToWindowsMatrix(t);
// decompose the matrix to get one possible combination of rotations x, y and z that produces the same result
var rx = Math.Atan2(m1.M32, m1.M33).ToDegree();
var ry = Math.Atan2(-m1.M31, Math.Sqrt(Math.Pow(m1.M32, 2) + Math.Pow(m1.M33, 2))).ToDegree();
var rz = Math.Atan2(m1.M21, m1.M11).ToDegree();
Source for formula to get rotation x,y and z:
// create a transformation object, in the same order, hardcoded translate but use the computed equivalent rotation x,y and z
var t2 = new Translation(10, 15, 20)
* new Rotation(rx.ToRadian(), Vector3D.AxisX)
* new Rotation(ry.ToRadian(), Vector3D.AxisY)
* new Rotation(rz.ToRadian(), Vector3D.AxisZ);
// convert to the windows Matrix3D
var m2 = ToWindowsMatrix(t2);
The Problem
If I take a 3D object and apply the transformation t2, the rotation of the object is completely off compared to what t gives.
I absolutely need the decomposed values, because the whole system displays the objects using 6 parameters (origin x, y, z and rotation x, y, z), which it then uses to create an identity matrix and apply a translation and 3 rotations.
I have been working on this for around 7 hours and figured out that if I ONLY apply a single rotation in t (e.g. "rotation X 90"), and in t2 I still apply ALL rotations with the computed values, the rotations DO match. So I have the feeling that the transformations need to be applied in a specific order, but I cannot change it. So I need to get the rotations x, y and z such that applying Translate * RotX * RotY * RotZ matches.
Anything I might have missed? I have found some MATLAB samples too, but the formulas are all the same so far.
Figured out after a while how to get the XYZ rotation of a matrix. The problem is that the Euler angles come back as a ZYX rotation; more operations need to be performed in order to extract the XYZ rotation from it.
public static double[] GetXYZRotation(this Matrix3D matrix)
{
    // get the euler angles
    var eulerX = Math.Atan2(matrix.M32, matrix.M33).ToDegree();
    var eulerY = Math.Atan2(-matrix.M31, Math.Sqrt(Math.Pow(matrix.M32, 2) + Math.Pow(matrix.M33, 2))).ToDegree();
    var eulerZ = Math.Atan2(matrix.M21, matrix.M11).ToDegree();

    // create a transformation from the euler angles to get a matrix; the euler rotation is ZYX
    var eulerMatrix = (new Translation(matrix.OffsetX, matrix.OffsetY, matrix.OffsetZ)
        * new Rotation(eulerZ.ToRadian(), Vector3D.AxisZ)
        * new Rotation(eulerY.ToRadian(), Vector3D.AxisY)
        * new Rotation(eulerX.ToRadian(), Vector3D.AxisX)).ToWindowsMatrix();

    // get the alpha, beta and gamma of the euler matrix
    var beta = Math.Atan2(eulerMatrix.M13, Math.Sqrt(Math.Pow(eulerMatrix.M11, 2d) + Math.Pow(-eulerMatrix.M12, 2d)));
    var alpha = Math.Atan2(-(eulerMatrix.M23 / Math.Cos(beta)), eulerMatrix.M33 / Math.Cos(beta));
    var gamma = Math.Atan2(-(eulerMatrix.M12 / Math.Cos(beta)), eulerMatrix.M11 / Math.Cos(beta));

    // the XYZ rotation corresponds to the computed alpha, beta and gamma
    var rotationX = alpha.ToDegree();
    var rotationY = beta.ToDegree();
    var rotationZ = gamma.ToDegree();
    return new[] { rotationX, rotationY, rotationZ };
}
So I needed to compute the matrix by applying the Euler angles in the reverse order (Z, then Y, then X) and from there calculate alpha, beta and gamma, which yield the XYZ rotation values.
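For reference, a possible usage sketch (assuming the CAD engine's Translation/Rotation types and the ToRadian/ToWindowsMatrix helpers from the question):
// Decompose an existing transformation and rebuild it in Translate * RotX * RotY * RotZ order.
var m = ToWindowsMatrix(t);           // the original transformation as a Matrix3D
double[] xyz = m.GetXYZRotation();    // X, Y, Z rotations in degrees

var rebuilt = new Translation(m.OffsetX, m.OffsetY, m.OffsetZ)
    * new Rotation(xyz[0].ToRadian(), Vector3D.AxisX)
    * new Rotation(xyz[1].ToRadian(), Vector3D.AxisY)
    * new Rotation(xyz[2].ToRadian(), Vector3D.AxisZ);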

Offsetting a 3D object in the camera's view

I'm new here. I'm having a bit of trouble with the last part of rendering my model in accordance (is that a word?) with my camera's view angle.
I have succeeded in creating my 3D camera, being able to rotate it in 3D space, and move/rotate my gun model to correspond with the camera's movements/rotations.
The final part however, I cannot figure out. I'm trying to offset the gun model's position on the camera's rotated axes, but can't find anything about how to do so.
This is the code I have so far, without the offset I would like.
GunWorldMatrix = Matrix.CreateScale(0.5f) *
CameraRotation *
Matrix.CreateTranslation(CameraPosition);
It ends up placing the gun "in my head", so to speak. It perfectly positions and rotates the model, but I don't know how to add any offset based on either its own axes or the camera's (so you can have it in a hip-fire position, etc.).
Although, this:
Matrix view = Matrix.CreateLookAt(new Vector3(0, 2, 0), new Vector3(0, 0, 0), Vector3.UnitY);
as the view, with this:
GunWorldMatrix = Matrix.CreateScale(0.5f) *
Matrix.CreateRotationX(-0.15f) *
Matrix.CreateTranslation(new Vector3(0.2f, 1.65f, 0f));
as the gun's world matrix, positioned it perfectly as if the camera were fixed to the axes with no rotation... which may or may not help. Any ideas?
The view matrix (modeling the camera) consists of three orthogonal vectors,
X = [xx, xy, xz], Y = [yx, yy, yz], Z = [zx, zy, zz], plus the camera's position T = [tx, ty, tz]:
[xx yx zx tx]                    [ 0  4  8 12]
[xy yy zy ty]   stored in order  [ 1  5  9 13]
[xz yz zz tz]                    [ 2  6 10 14]
[ 0  0  0  1]                    [ 3  7 11 15]

                 [ 0  1  2]
with R^-1 = R' = [ 4  5  6]   (the upper-left 3x3 rotation block, transposed)
                 [ 8  9 10]
Here the Y vector typically represents Up, X is right and Z is where to lookAt; all are normalized to length 1.
You only need to be concerned with the upper-left 3x3 rotation matrix R and its inverse R^-1, which for normalized rotation matrices is simply the transpose: R^-1 = R'.
To move something toward "screen" right, you add the X vector of the inverted view matrix to the object's translation: object.t_vector += camera^(-1).x_vector. This can also be scaled by distance, FoV and aspect ratio to move an object exactly 1 pixel right, or proportionally to the screen dimensions.
Note: This is the OpenGL convention of the view matrix, where the vectors are column vectors and the matrices are stored in column-major order. Other systems may represent or store the matrices in row-major order.
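In XNA terms (a sketch; view and objectPosition are assumed variables), that screen-right offset could look like:
Matrix cameraWorld = Matrix.Invert(view); // inverting the view matrix gives the camera's world matrix
objectPosition += cameraWorld.Right;      // Right is the camera's world-space X axis (screen right)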
Thanks to a guy over on another forum, who came up with this:
float verticalOffset = 1.65f;
float longitudinalOffset = 0f;
float lateralOffset = .2f;

// GunWorldMatrix is now a world space version of the camera's matrix
// with the position & orientation of the camera.
GunWorldMatrix = Matrix.Invert(view);

// Note the position of the camera so when setting the gun position, we have a starting point.
Vector3 camPosition = GunWorldMatrix.Translation;

// Gives the gun that slight angle to point towards the target while laterally offset a bit.
GunWorldMatrix *= Matrix.CreateFromAxisAngle(GunWorldMatrix.Up, 0.15f);

Vector3 gunPositionOffsets = (GunWorldMatrix.Up * verticalOffset) +
                             (GunWorldMatrix.Forward * longitudinalOffset) +
                             (GunWorldMatrix.Right * lateralOffset);
GunWorldMatrix.Translation = camPosition + gunPositionOffsets;
Which gives me my answer. Thanks for your answer though, Aki. It led me down the right track.

Calculate top-left and bottom-right of a Rectangle Given Origin, Direction & Size

I did a search on this, but didn't find a question that quite matched what I was after. I want the user to be able to define a textured plane. The parameters I have are:
Size (A Vector2)
Direction (A Vector3)
Origin (A Vector3)
So, I want to be able to calculate the 4 vertices of the rectangle given the above information. So, if I wanted a plane facing up, with a width and height of 1000:
Size = (1000, 1000)
Direction = 0, 1, 0 (Up)
Origin = 0, 0, 0
So this would define a plane on the X & Z axis, facing upwards. What I do not understand is how to calculate the 4 corners in 3D space given this information. Do I need extra information, or is there a better way to arbitrarily specify a plane?
Edit : Current Code
In the following code:
Size = 10000, 10000
Center = 0, 0, 0
Normal = 0, 1, 0
Vector3 arb = new Vector3(1, 1, 1);
Vector3 planeY = Vector3.Normalize(Vector3.Cross(Normal, arb));
Vector3 planeX = Vector3.Normalize(Vector3.Cross(Normal, planeY));
planeX *= Size.X / 2;
planeY *= Size.Y / 2;
Vector3[] ret = new Vector3[4]
{
    (Center - planeX - planeY),
    (Center - planeX + planeY),
    (Center + planeX - planeY),
    (Center + planeX + planeY)
};
Your plane isn't fully defined yet. You need another vector going along the plane, the so-called 'tangent' vector. In your above example, where should the Y-axis of the texture be pointing? Along the X-axis, along the Z-axis? Or maybe a completely different, user-defined axis? Your tangent vector is a vector that should point in the general direction where the X-axis of the plane should go.
Let's say we have a tangent vector as well; it doesn't necessarily need to point exactly along the plane. You can construct the plane as follows:
static Vector3[] Vertices(Vector2 size, Vector3 center, Vector3 normal, Vector3 tangent)
{
    Vector3 planeY = Vector3.Normalize(Vector3.Cross(normal, tangent));
    Vector3 planeX = Vector3.Normalize(Vector3.Cross(normal, planeY));

    planeX *= size.X / 2;
    planeY *= size.Y / 2;

    var vertices = new Vector3[]
    {
        (center - planeX - planeY),
        (center - planeX + planeY),
        (center + planeX - planeY),
        (center + planeX + planeY),
    };
    return vertices;
}
planeX and planeY are normalized vectors which point along X and Y axes of the plane itself. By multiplying these by size / 2, we get 2 vectors that span from the center to the edge of the plane in both X and Y directions. By adding these two together in different ways, we get the four corners.
Here's a diagram so you can get a better picture in your head. The tangent vector T is "flattened" onto the X-axis.
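A possible usage with the question's values (the tangent along +X is an assumption):
Vector3[] corners = Vertices(
    new Vector2(1000, 1000),  // size
    new Vector3(0, 0, 0),     // origin / center
    new Vector3(0, 1, 0),     // direction (normal), facing up
    new Vector3(1, 0, 0));    // tangent
// corners now holds the four vertices of a 1000 x 1000 plane lying in the XZ plane.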
This is fine as a definition of a plane: you have a point and the normal vector. You need two vectors (A & B) on the plane: add one of them (A * one of the size values) to the origin to get the second corner, add the second vector (B * the other size value) to get the third corner, and add both vectors * their corresponding size values to the origin to get the fourth corner.
To get the first vector, calculate the cross product of the normal vector (Direction) with an arbitrary vector (not equal to the direction). This will give you vector A. To get vector B, calculate the cross product of A and the Direction.
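A sketch of that construction (assuming System.Numerics, and treating Origin as the first corner as described here):
using System.Numerics;

static Vector3[] CornersFrom(Vector3 origin, Vector3 direction, Vector2 size)
{
    // Pick an arbitrary vector that is not parallel to the direction.
    Vector3 arbitrary = new Vector3(1, 0, 0);
    if (Vector3.Cross(direction, arbitrary) == Vector3.Zero)
        arbitrary = new Vector3(0, 0, 1);

    Vector3 a = Vector3.Normalize(Vector3.Cross(direction, arbitrary)); // vector A on the plane
    Vector3 b = Vector3.Normalize(Vector3.Cross(a, direction));         // vector B on the plane

    return new[]
    {
        origin,                           // first corner
        origin + a * size.X,              // second corner
        origin + b * size.Y,              // third corner
        origin + a * size.X + b * size.Y  // fourth corner
    };
}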
