I am building a game that at one point relies on raw gyroscope readings. I am getting the readings using the following code (I'm using C#):
void Start () {
    Input.gyro.enabled = true;
}

void Update() {
    _guiText = Input.gyro.attitude.eulerAngles.ToString();
}
I'm displaying the values on screen in the OnGUI() method. Unity uses a left-handed coordinate system (while the device's gyroscope reports in a right-handed one), and I've noticed that the Y value from the gyroscope corresponds to the X rotation in Unity.
The problem is that these values jump around massively. For example, I can hold the device at X:90° and rotate slowly around the Y axis; after a certain point (around the 270° mark) the X value suddenly flips by 180°. It also means that if I start the app while the device is at 270° in Y, the X reading is completely unusable. The same effect happens in the Z axis.
I have managed to get around this on iOS devices by resetting the gyro readings using the following snippet: Input.gyro.enabled = false; followed by Input.gyro.enabled = true;. I use this to get an accurate reading of the device's absolute rotation, and then use rotational changes to track any further device movement. This doesn't work on Android, however.
Has anyone come across this before? If so, how did you get around it, or is it a bug that cannot be resolved?
This is because you are reading a conversion of a Quaternion. Euler angles in Unity are just a friendly way to get a Vector3 representation of something that is hard to grasp, and the same orientation can be represented by more than one Euler triple, which is why the readings flip. It is recommended not to set eulerAngles directly, nor to rely on them for reference. Instead, treat the flat device as a reference rotation, then measure the angle between that default rotation and the current rotation, and use that value to drive your actions.
You can use Quaternion.Angle for that purpose.
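For instance, a minimal sketch of that approach (the class name, field name and the 45° threshold are illustrative, not from the question):

using UnityEngine;

public class GyroTilt : MonoBehaviour {
    private Quaternion _reference;

    void Start() {
        Input.gyro.enabled = true;
        // Capture the "default" pose; in practice you may want to wait a
        // frame or two after enabling the gyro before sampling it.
        _reference = Input.gyro.attitude;
    }

    void Update() {
        // Angle in degrees between the reference pose and the current pose.
        // Unlike raw Euler angles, this value changes continuously and never
        // exhibits the sudden 180° flips described in the question.
        float delta = Quaternion.Angle(_reference, Input.gyro.attitude);
        if (delta > 45f) {
            // React to the tilt here.
        }
    }
}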
The problem:
I've been working on trying to create a 3D position in worldspace based on a 2D RGB face detection, similar to this Microsoft example. I am using Unity 2020.3.16f1, MRTK 2.8.2 and C# for the Unity scripts. I have been able to convert the C++ code shown in the link to C# with a lot of success. One final issue is accessing the HoloLens 2 origin SpatialCoordinateSystem to be used in the transform between the camera's 2D coordinate system and the 3D worldspace system.
The SO question at this link asks a very similar question, and I have tried to use SpatialLocator.GetDefault().CreateStationaryFrameOfReferenceAtCurrentLocation().CoordinateSystem as the answers suggest. I call this in Unity's Awake method, to ensure it is set as early as possible, as shown below.
private void Awake()
{
    worldSpatialCoordinateSystem = SpatialLocator.GetDefault()
        .CreateStationaryFrameOfReferenceAtCurrentLocation().CoordinateSystem;
}
The problem is that, if the user's headset is moving while the application starts, I notice an offset in the 3D locations commensurate with the direction/position of the head when the application was starting. I have narrowed the problem down to the fact that the HL2 and Unity set an origin SpatialCoordinateSystem just before the code in Awake is called, which accounts for the offset between what I expect and what I see.
What I've tried:
I have tried using some of the other solutions listed here as well. I cannot use UnityEngine.Windows.WebCam.PhotoCapture because of the way I create still-image captures, and (SpatialCoordinateSystem)Marshal.GetObjectForIUnknown(WorldManager.GetNativeISpatialCoordinateSystemPtr()) appears to be deprecated and unusable. Finally, I tried CreateStationaryFrameOfReferenceAtCurrentLocation(Vector3, Quaternion), using the inverse of the current Camera.main position and rotation in the hope of compensating for the offset, but it did not appear to work (NumericsConversionExtensions is the UnityEngine-to-System.Numerics converter found here). That code is below.
worldSpatialCoordinateSystem = SpatialLocator.GetDefault()
    .CreateStationaryFrameOfReferenceAtCurrentLocation(
        NumericsConversionExtensions.ToSystem(Camera.main.transform.position * -1),
        NumericsConversionExtensions.ToSystem(Quaternion.Inverse(Camera.main.transform.rotation)))
    .CoordinateSystem;
My question:
Is there either another way to access the origin spatial coordinates or possibly to compensate for the offset when the user is moving their head before Awake is called?
I spent three days working on a solution, and found one an hour after asking on SO. For those who come here, use the code below, originally found here.
using Microsoft.MixedReality.OpenXR;
worldSpatialCoordinateSystem = PerceptionInterop.GetSceneCoordinateSystem(Pose.identity) as SpatialCoordinateSystem;
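For context, a minimal sketch of where that call might live (the class and field names are illustrative, and it assumes the Microsoft.MixedReality.OpenXR package is referenced):

using Microsoft.MixedReality.OpenXR;
using UnityEngine;
using Windows.Perception.Spatial;

public class OriginLocator : MonoBehaviour
{
    private SpatialCoordinateSystem worldSpatialCoordinateSystem;

    private void Awake()
    {
        // Unlike CreateStationaryFrameOfReferenceAtCurrentLocation, this
        // returns the coordinate system of Unity's own scene origin, so it
        // is not affected by head movement while the application starts.
        worldSpatialCoordinateSystem =
            PerceptionInterop.GetSceneCoordinateSystem(Pose.identity)
            as SpatialCoordinateSystem;
    }
}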
I have a camera that needs to orbit locally around an object. This object has an arbitrary rotation, described by a normal vector. Imagine a spherical planet, with a camera looking down at a certain triangle on that planet.
My current implementation is to use the classic vector crossing method to generate a rotation matrix from the triangle's normal, then use that matrix as the basis for the standard orbit camera. This works fine near the equator of the planet, but once it gets near the poles, it starts blowing up, with the camera behaving increasingly erratically the closer that it gets to the very center of the pole.
I've determined that this is due to the first vector cross: the two vectors being crossed are nearly parallel in that case, so the cross product degenerates toward the zero vector, and normalizing it amplifies the noise. If the first vector is (0, 1, 0), the erratic behavior happens when the normal is close to (0, 1, 0) or (0, -1, 0).
I've found quite a few descriptions of this problem, but no working solutions. The closest I've come is here: http://xboxforums.create.msdn.com/forums/p/13278/13278.aspx It suggests handling the 'singularity' by switching to a different vector when it is detected. I can easily determine when the camera is on a planet face that will cause this to happen (as my planet sphere is generated from 6 quadtrees projected to spherical coordinates), but there is a very noticeable snap when I switch to a new vector.
Here's the current code:
Vector3 triNormal; // the current normal of the target vertex
Vector3 origin = Vector3.Forward;
Matrix orientation = Matrix.Identity;
orientation.Forward = origin;
orientation.Up = triNormal;
orientation.Right = Vector3.Cross(orientation.Up, orientation.Forward);
orientation.Right.Normalize();
orientation.Forward = Vector3.Cross(orientation.Right, orientation.Up);
orientation.Forward.Normalize();
I've experimented with detecting when triNormal is on one of the pole faces and setting 'origin' to something else, such as Right. The camera then behaves properly once it is on that face, but immediately snaps to a new rotation as it crosses over. This makes sense, as its reference vector has just changed, but it needs to be eliminated for a smooth user experience. I tried to offset the orbit camera's yaw to counteract the new coordinate system, but the required offset doesn't seem to be a constant value; it depends on where on the sphere the camera is currently aiming, and I'm not sure how to calculate the difference.
Also note that as it's in XNA and C#, I'm using a right-hand coordinate system.
I don't understand why you do this:
orientation.Forward = Vector3.Cross(orientation.Right, orientation.Up);
orientation.Forward.Normalize();
when you've already used previous orientation.Forward to get orientation.Right.
(If you are "crossing" normal vector I don't think you'll need to normalize them.)
Anyway, if triNormal is the current normal of the target vertex, and your camera is looking down to it, I think you should have:
orientation.Forward = -triNormal
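Building on that, a hedged sketch of constructing the full basis while avoiding the degenerate cross product near the poles; the helper-axis fallback and the 0.99 threshold are my own additions, not from this answer:

static Matrix OrientationFromNormal(Vector3 triNormal)
{
    Vector3 forward = -triNormal;
    forward.Normalize();

    // Pick a helper axis that is never parallel to forward, so the first
    // cross product cannot degenerate to the zero vector near the poles.
    Vector3 helper = Math.Abs(Vector3.Dot(forward, Vector3.Up)) > 0.99f
        ? Vector3.Right
        : Vector3.Up;

    Vector3 right = Vector3.Cross(forward, helper);
    right.Normalize();
    Vector3 up = Vector3.Cross(right, forward);

    Matrix orientation = Matrix.Identity;
    orientation.Forward = forward;
    orientation.Right = right;
    orientation.Up = up;
    return orientation;
}

Note that the helper switch still introduces a small jump at the threshold; to remove the snap entirely you would need to derive each frame's basis incrementally from the previous frame's basis rather than from a fixed world axis.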
I'm trying to understand and use OpenCV. I wanted to know if it is possible to find and measure the angle of rotation between two frames.
Let me explain: the camera is fixed, and the scene rotates around the center without translating. For now I manage to rotate it manually, and I would like to be able to compare two frames and get back the angle. For instance:
double getRotation(Image img1, Image img2) {
    // Compare the frames
    // Return the value
}
and then I rotate following that angle.
If you're able to detect static objects, e.g. the background, in the frames, then you can find points called "good features to track" (cvGoodFeaturesToTrack) on the background and track those points using optical flow (cvCalcOpticalFlowPyrLK).
If the rotation is only in the 'xy' plane, you can detect it using cvGetAffineTransform.
Since only rotation is allowed (no translation or scaling), it's not difficult to determine the angle of rotation from the transformation matrix obtained by cvGetAffineTransform. That matrix looks like (see Wikipedia):

\begin{bmatrix} \cos\theta & -\sin\theta & t_x \\ \sin\theta & \cos\theta & t_y \end{bmatrix}

where \theta is the rotation angle and t_x, t_y absorb the fixed center of rotation.
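In other words, once the 2x3 affine matrix is in hand, the angle falls out of atan2. A minimal sketch, assuming the matrix has been copied (row-major) into a plain array:

static double AngleFromAffine(double[,] m)
{
    // For a pure rotation, m[0,0] = cos(theta) and m[1,0] = sin(theta),
    // so atan2 recovers theta in radians (counter-clockwise positive).
    return Math.Atan2(m[1, 0], m[0, 0]);
}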
This might be very tricky; a simpler approach could be to find the Hough lines of the frames. You would need to determine which lines are strong and stable enough to track between the two frames; once you have those, you can find the angle between the frames. What Andrey has suggested for finding the angle should be usable as well.
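To illustrate just the final step (the line extraction itself is the Hough transform's job): if each frame yields the angle of the same dominant line, the frame-to-frame rotation is the wrapped difference. A hedged sketch:

static double RotationFromLineAngles(double theta1, double theta2)
{
    // theta1/theta2: angles of the same line in frame 1 and frame 2 (radians).
    double d = theta2 - theta1;
    // Wrap into (-pi, pi] so a small physical rotation never reads as ~2*pi.
    while (d > Math.PI) d -= 2 * Math.PI;
    while (d <= -Math.PI) d += 2 * Math.PI;
    return d;
}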
I'm a real noob who just started learning 3D programming, and I am having a really hard time learning about rotation in 3D space. My problem is that I can't seem to figure out how to rotate an object around its local axes.
I have a basic class for 3D objects and, for starters, I want to implement functions that will rotate the object around a given axis by x degrees. So far I have the following:
public void RollDeg(float angle)
{
    this.rotation = Matrix4.Mult(rotation,
        Matrix4.CreateRotationX(MyMath.Conversions.DegToRad(angle)));
}

public void PitchDeg(float angle)
{
    this.rotation = Matrix4.Mult(rotation,
        Matrix4.CreateRotationY(MyMath.Conversions.DegToRad(angle)));
}

public void YawDeg(float angle)
{
    this.rotation = Matrix4.Mult(rotation,
        Matrix4.CreateRotationZ(MyMath.Conversions.DegToRad(angle)));
}
'rotation' is a 4x4 matrix which starts as the identity matrix. Each time I want to roll/pitch/yaw the object, I call one of the functions above.
For drawing, I use another function that pushes a matrix onto the ModelView stack, multiplies it by the translation, rotation and scale matrices of the object (in that order) and begins drawing the vertices. Of course, I finally pop the matrix off the stack.
The problem is that the functions above rotate the object around the GLOBAL axes, not the LOCAL ones, even though, to my understanding, every time you rotate an object its local axes change, and any new rotation applied on top of the others should then use those local axes.
I have read different tutorials about the math behind it and how to rotate objects, but I couldn't find one that could help me.
If anyone has the time, I would really appreciate help understanding HOW to rotate around local axes and, maybe even more importantly, what I did wrong in my current implementation.
If you want your transformations to apply in the order translation -> rotation -> scale (which makes perfect sense; it's what's usually wanted), you have to multiply your matrices in the reverse order.
With the column-vector convention (the one OpenGL uses), matrix multiplication must be read from right to left. This is why:
ModelViewTransform = Transform * View * Model // the Model is applied first, so it sits rightmost
Note that DirectX traditionally uses row vectors and a left-handed coordinate system, so the order is reversed there. That has its shortcomings, but it's more intuitive.
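Applied to the asker's OpenTK code, a hedged sketch of the fix. It assumes OpenTK's row-vector convention (v' = v * M, so the leftmost matrix is applied first): putting the new rotation on the left of the accumulated matrix applies it around the object's LOCAL axes, while the original order (new rotation on the right) applies it around the GLOBAL axes:

public void PitchLocalDeg(float angle)
{
    Matrix4 delta = Matrix4.CreateRotationY(MyMath.Conversions.DegToRad(angle));
    // delta * rotation: the new rotation acts in the object's local frame.
    this.rotation = Matrix4.Mult(delta, rotation);
}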
I am currently experimenting with some physics toys in XNA using the Farseer Physics library, however my question isn't specific to XNA or Farseer - but to any 2D physics library.
I would like to add "rocket"-like movement (I say rocket-like in the sense that it doesn't have to be a rocket - it could be a plane or a boat on the water or any number of similar situations) for certain objects in my 2D scene. I know how to implement this using a kinematic simulation, but I want to implement it using a dynamic simulation (i.e. applying forces over time). I'm sort of lost on how to implement this.
To simplify things, I don't need the dynamics to rotate the geometry, just to affect the velocity of the body. I'm using a circle geometry that is set to not rotate in Farseer, so I am only concerned with the velocity of the object.
I'm not even sure what the best abstraction should be. Conceptually, I have the direction the body is currently moving (unit vector), a direction I want it to go, and a value representing how fast I want it to change direction, while keeping speed relatively constant (small variations are acceptable).
I could use this abstraction directly, or use something like a "rudder" value which controls how fast the object changes directions (either clockwise or counter clockwise).
What kind of forces should I apply to the body to simulate the movement I'm looking for? Keep in mind that I would also like to be able to adjust the "thrust" of the rocket on the fly.
Edit:
The way I see it, and correct me if I'm wrong, you have two forces (ignoring the main thrust force for now):
1) You have a static "fin" that is always pointed in the same direction as the body. If the body rotates so that the fin is not aligned with the direction of movement, air resistance applies forces along the length of the fin, proportional to the angle between the direction of movement and the fin.
2) You have a "rudder", which can rotate freely within a specified range, which is attached some distance from the body's center of mass (in this case we have a circle). Again, when this plane is not parallel to the direction of movement, air resistance causes proportional forces along the length of the rudder.
My question is, differently stated, how do I calculate these proportional forces from air resistance against the fin and rudder?
Edit:
For reference, here is some code I wrote to test the accepted answer:
/// <summary>
/// The main entry point for the application.
/// </summary>
static void Main(string[] args)
{
    float dc = 0.001f;  // drag coefficient
    float lc = 0.025f;  // lift coefficient
    float angle = MathHelper.ToRadians(45);
    Vector2 vel = new Vector2(1, 0);
    Vector2 pos = new Vector2(0, 0);

    for (int i = 0; i < 200; i++)
    {
        Vector2 drag = vel * angle * dc;
        Vector2 sideForce = angle * lc * vel;
        //sideForce = new Vector2(sideForce.Y, -sideForce.X); // rotate 90 degrees CW
        sideForce = new Vector2(-sideForce.Y, sideForce.X); // rotate 90 degrees CCW
        vel = vel + (-drag) + sideForce;
        pos = pos + vel;

        if (i % 10 == 0)
            System.Console.WriteLine("{0}\t{1}\t{2}", pos.X, pos.Y, vel.Length());
    }
}
When you graph the output of this program, you'll see a nice smooth circular curve, which is exactly what I was looking for!
If you already have code to integrate force and mass into acceleration and velocity, then you just need to calculate the individual contribution of each of the two elements you're talking about.
Keeping it simple, I'd forget about the fin for a moment and just say that any time the body of your rocket is at an angle to its velocity, it will generate a linearly increasing side force and drag. Just play around with the coefficients until it looks and feels how you want.
Drag = angle * drag_coefficient * velocity + base_drag
SideForce = angle * lift_coefficient * velocity
For the rudder, the effect generated is a moment, but unless your game absolutely needs full angular dynamics, the simpler thing to do is let the rudder control apply a fixed amount of change to your rocket body's angle per time tick in your game.
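Per tick, that might look like the following sketch (rudderInput, turnRate and the method name are illustrative, not part of the answer):

static float SteerAndGetAngleOfAttack(
    ref float bodyAngle, float rudderInput, float turnRate, Vector2 vel)
{
    // rudderInput in [-1, 1]; turnRate = max heading change per tick (radians).
    bodyAngle += rudderInput * turnRate;

    // Angle between where the body points and where it actually moves;
    // this is the "angle" used in the Drag/SideForce formulas above.
    float heading = (float)Math.Atan2(vel.Y, vel.X);
    return MathHelper.WrapAngle(bodyAngle - heading);
}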
I suddenly "get" it.
You want to simulate a rocket-powered missile flying in air. OK, that's a different problem from the one I have detailed below, and it imposes different limits. You need an aerospace geek. Or you could just punt.
To do it "right" (for space):
The simulated body should be given a moment of inertia around its center of mass (CoM), and must also have a pointing direction and an angular velocity. You then compute the angular acceleration from the applied impulse and its distance from the CoM, and add that to the angular velocity. This lets you compute the current "pointing" of the craft. (If you don't use gyros or paired attitude jets, you also get a, typically very small, linear acceleration.)
To generate a turn, you point the craft off the current direction of movement and apply the main drive.
And if you are serious about this you also need to subtract the mass of burned fuel from the total mass and make the appropriate corrections to the moment of inertia at each time increment.
BTW, this may be more trouble than it is worth: maneuvering a rocket in free fall is tricky. (You may recall that the Russians bungled a docking maneuver at the ISS a few years ago; that's not because they are stupid.) Unless you tell us your use case, we can't really advise you on that.
A little pseudocode to hint at what you're getting into here:
rocket {
    float structuralMass;
    float fuelMass;
    point position;
    point velocity;
    float heading;
    float omega;        // angular velocity
    float structuralI;  // moment of inertia from the craft
    float fuelI;        // moment of inertia from the fuel load
    float Mass() { return structuralMass + fuelMass; }
    float I() { return structuralI + fuelI; }
    float Thrust(float t);
    float AdjustAttitude(float a);
}
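Sticking with that pseudocode, the per-tick attitude update it implies might look like this sketch (dt, sideThrust, leverArm, burnRate and FuelInertia are illustrative names, not from the original):

void Tick(float dt, float sideThrust, float leverArm, float burnRate)
{
    float torque = sideThrust * leverArm; // 2D: side force applied off the CoM
    omega += (torque / I()) * dt;         // angular acceleration, integrated
    heading += omega * dt;                // update the craft's pointing
    fuelMass -= burnRate * dt;            // burned fuel leaves the craft...
    fuelI = FuelInertia(fuelMass);        // ...shrinking the moment of inertia (hypothetical helper)
}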
The upshot is: maybe you want a "game physics" version.
For reasons I won't bother to go into here, the most efficient way to fly a "real" rocket is generally not to make gradual turns under gentle acceleration, but to push hard whenever you want to change direction. In this case you get the angle to thrust at by subtracting the desired velocity vector (the full vector, not the unit vector) from the current one. Then you point in that direction and thrust all-out until the desired course is reached.
Imagine you're floating in empty space... and you have a big rock in your hand... If you throw the rock, a small impulse is applied to you in the exact opposite direction to the throw. You can model your rocket as something that rapidly converts quanta of fuel into some amount of force (a vector quantity) that you can add to your velocity vector.