Trigonometry issue - Create a PointF using an angle - C#

I have been scratching my head over this for quite a while now. I guess I should have paid more attention in the trigonometry classes when I was younger, but here we go:
I have an angle and a point. I want to place a second point in the direction of the angle, 200 units away from the first point. I use Atan2 to get the angle, then Cos and Sin to get the new point, but I think something goes wrong when calculating the Sin for p3.Y.
EDIT: To clarify, I removed p2 and used the angle directly:
PointF p1 = new PointF(20, 20);
double angle = 1.3034851559678624;
//Create a new PointF in the same direction, 200 pixels away from p1
//{ X = 199.9482, Y = 4.549629 }
PointF p3 = new PointF
{
X = (float)(Math.Cos(Math.PI * angle / 180.0) * 200),
Y = (float)(Math.Sin(Math.PI * angle / 180.0) * 200)
};
//This is where I would expect 1.3034851559678624 as the first angle
//but I get -4.9073849244837184
double angle2 = Math.Atan2(p3.Y - p1.Y, p3.X - p1.X) * 180 / Math.PI;
Here is a visual representation of the values above. The green line is the first angle.

The problem with your current code is that you don't calculate p3 relative to p1. You need to add p1.X and p1.Y to the coordinates of p3:
PointF p3 = new PointF
{
X = p1.X + (float)(Math.Cos(Math.PI * angle / 180.0) * 200),
Y = p1.Y + (float)(Math.Sin(Math.PI * angle / 180.0) * 200)
};
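As a quick sanity check, here is a minimal sketch using the values from the question; with the corrected p3, recomputing the angle should give back roughly the original 1.3035 degrees:
PointF p1 = new PointF(20, 20);
double angle = 1.3034851559678624;
PointF p3 = new PointF
{
    X = p1.X + (float)(Math.Cos(Math.PI * angle / 180.0) * 200),
    Y = p1.Y + (float)(Math.Sin(Math.PI * angle / 180.0) * 200)
};
// With p3 offset from p1, Atan2 recovers the original angle (up to float rounding).
double angle2 = Math.Atan2(p3.Y - p1.Y, p3.X - p1.X) * 180 / Math.PI;
Console.WriteLine(angle2); // ~1.3035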

Well, if p3 is supposed to be on the line p1-p2, you don't need any trigonometry at all. You can find the coordinates of p3 directly from those of p1 and p2. Suppose the distance between p1 and p2 is 1000 units. Then the location of p3, 200 units from p1, is:
{p1.x + (p2.x - p1.x) * (200/1000), p1.y + (p2.y - p1.y) * (200/1000)}
If the distance between the two points is not 1000 (which of course it won't be), replace that value with the Euclidean distance between them.
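For instance, a minimal C# sketch of this approach (the method name PointTowards and the use of PointF are my assumptions, not from the original answer):
// Point a given distance from p1 in the direction of p2, no trigonometry needed.
static PointF PointTowards(PointF p1, PointF p2, float distance)
{
    float dx = p2.X - p1.X;
    float dy = p2.Y - p1.Y;
    float length = (float)Math.Sqrt(dx * dx + dy * dy); // Euclidean distance p1-p2
    float t = distance / length;                        // fraction of the way from p1 to p2
    return new PointF(p1.X + dx * t, p1.Y + dy * t);
}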
If you still want the slope of the line, calculate the ratio
(p2.y - p1.y) / (p2.x - p1.x)
which gives you the slope as a single number, where 1 represents a line passing through the origin at a +45 degree angle. A little fiddling will turn the slope into the angle; be careful when the line is vertical, i.e. where the denominator of the ratio is 0.

You don't have to use atan to get the angle and then calculate sin/cos.
You can get sin and cos directly from the coordinates of the points:
cos = (p2.x - p1.x)/length
sin = (p2.y - p1.y)/length
http://en.wikipedia.org/wiki/Line_%28geometry%29
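In code that looks roughly like this (a sketch; length is the Euclidean distance between the two points):
float dx = p2.X - p1.X;
float dy = p2.Y - p1.Y;
float length = (float)Math.Sqrt(dx * dx + dy * dy);
float cos = dx / length; // cosine of the direction angle
float sin = dy / length; // sine of the direction angle
// 200 units from p1 along the p1 -> p2 direction:
PointF p3 = new PointF(p1.X + cos * 200, p1.Y + sin * 200);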

Related

Find the required angle of rotation to move from point A to point B

I want to make a bot that walks between points given by two coordinates (X and Y). I have the character's coordinates, his rotation angle (1 to 180 / -1 to -180), and the target point he should walk to. How do I find the rotation angle the character needs so that he looks directly at the point?
I have been sitting with this for a long time and my head refuses to think at all. I tried to solve it with the angle between radius vectors, but nothing came of it.
public static double GetRotation(Point Destination)
{
double cos = Destination.Y / Math.Sqrt(Destination.X * Destination.X + Destination.Y * Destination.Y);
double angle = Math.Acos(cos);
return angle;
}
With regard to this problem, I would suggest using tan rather than cos, because the differences between the two points give you the opposite and adjacent sides of the triangle, rather than the adjacent side and the hypotenuse.
If the player is at position X,Y and the new position to walk to is X2,Y2, the relative angle between them with respect to the X axis is tan-1((Y2-Y)/(X2-X)).
I think the implementation below should work for your example (though you may need to subtract the angle the player is currently facing to get the difference):
public static double GetRotation(Point source, Point Destination)
{
double tan = (double)(Destination.Y - source.Y) / (Destination.X - source.X);
double angle = Math.Atan(tan) * (180 / Math.PI); // converts radians to degrees
return angle;
}
Apologies for the formatting, am currently on mobile.
double dx = Destination.X - Person.X;
double dy = Destination.Y - Person.Y;
double directionAngle = Math.Atan2(dy, dx); // in radians!
double degrees = directionAngle * 180 / Math.PI;
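Putting it together, a hedged sketch of a GetRotation that also subtracts the character's current facing angle (the normalization step is my addition, assuming the -180..180 degree range described in the question):
public static double GetRotation(Point person, Point destination, double currentAngleDegrees)
{
    double dx = destination.X - person.X;
    double dy = destination.Y - person.Y;

    // Absolute direction from the person to the destination, in degrees.
    double directionDegrees = Math.Atan2(dy, dx) * 180.0 / Math.PI;

    // Rotation still needed relative to where the person is currently facing.
    double delta = directionDegrees - currentAngleDegrees;

    // Normalize to -180..180 so the bot turns the short way round.
    while (delta > 180) delta -= 360;
    while (delta <= -180) delta += 360;
    return delta;
}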

Coordinates of perpendicular points

I would like to draw a rectangle made of Mesh.
I enter the starting point A and the ending point B.
The width of the rectangle is known in advance and is equal to H.
How do I correctly determine the coordinates of the corner points? (Everything happens in 3D.)
There are a lot of theoretical write-ups on the net (but mostly for 2D), and I tried something like this:
var vAB = B - A;
var P1 = new Vector3(-vAB.z, vAB.y, vAB.x) / Mathf.Sqrt(Mathf.Pow(vAB.x, 2) + Mathf.Pow(vAB.y, 2) + Mathf.Pow(vAB.z, 2)) * 0.5 * H;
But I can't find the correct solution
The simple way should be to use the cross product. The cross product of two vectors is perpendicular to both input vectors. You will need to define the normal of your rectangle; in this example I use Vector3.up. A-B cannot be parallel to the normal vector, or you will get an invalid result.
var l = B - A;
var s = Vector3.Normalize(Vector3.Cross(l, Vector3.up));
var p1 = A + s * H / 2;
var p2 = A - s * H / 2;
var p3 = B - s * H / 2;
var p4 = B + s * H / 2;
Here's a quick explanation of the trig involved. There are other tools which will reduce the boilerplate a bit, but this should give you an understanding of the underlying maths.
I've tweaked your problem statement slightly: I'm just showing the XY plane (there's no Z involved), and I've rotated it so that the line AB forms a positive angle with the horizontal (as it makes explaining the maths a bit easier). A is at (x1, y1), B is at (x2, y2).
The first step is to find the angle θ that the line AB makes with the horizontal. Draw a right-angled triangle, where AB is the hypotenuse, and the other two sides are parallel to the X and Y axes:
You can see that the horizontal side has length (x2 - x1), and the vertical side has length (y2 - y1). The angle between the base and the hypotenuse (the line AB) is given by trig, where tan θ = (y2 - y1) / (x2 - x1), so θ = arctan((y2 - y1) / (x2 - x1)).
(Make sure you use the Math.Atan2 method to calculate this angle, as it makes sure the sign of θ is correct).
Next we'll look at the corner P1, which is connected to A. As before, draw a triangle with the two shorter sides parallel to the X and Y axes:
This again forms a right-angled triangle with hypotenuse H/2. To find the base of the triangle, which is the X-distance between P1 and A, again use trig: H/2 * sin θ. Similarly, the Y-distance between P1 and A is H/2 * cos θ. Therefore P1 = (x1 + H/2 sin θ, y1 - H/2 cos θ).
Repeat the same trick for the other 3 corners, and you'll find the same result, but with differing signs.
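A rough C# translation of that derivation (just a sketch in the XY plane; it assumes x1, y1, x2, y2 and H are already defined as doubles, and the corner names follow the figure):
// Angle of the line AB with the horizontal.
double theta = Math.Atan2(y2 - y1, x2 - x1);

// X and Y offsets of each corner from A or B.
double dx = H / 2 * Math.Sin(theta);
double dy = H / 2 * Math.Cos(theta);

var P1 = (X: x1 + dx, Y: y1 - dy);
var P2 = (X: x1 - dx, Y: y1 + dy);
var P3 = (X: x2 - dx, Y: y2 + dy);
var P4 = (X: x2 + dx, Y: y2 - dy);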
My approach requires you to use a Transform.
public Transform ht; // assign in inspector or create new
void calculatePoints(Vector3 A, Vector3 B, float H)
{
Vector3 direction = B - A;
ht.position = A;
ht.rotation = Quaternion.LookRotation(direction, Vector3.up);
Vector3 P1 = A + ht.right * H / 2f;
Vector3 P2 = A - ht.right * H / 2f; // -ht.right is "left"
Vector3 P3 = B + ht.right * H / 2f;
Vector3 P4 = B - ht.right * H / 2f;
}
I think it's intuitive to think with "left" and "right" which is why I used the Transform. If you wanted to define a point along the way, you'd be using ht.forward * value added to A, where value would be something between 0 and direction.magnitude.
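For example, a point partway along the line would look roughly like this (a sketch reusing the same ht):
// ht.forward points from A toward B, so values between 0 and direction.magnitude
// give points on the segment; here, the halfway point.
float value = direction.magnitude * 0.5f;
Vector3 onLine = A + ht.forward * value;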

How to get the cursor position relative to ANY form of UIElement

I'm trying to find the cursor position relative to the red line in the image below.
I tried these topics:
Using atan2 to find angle between two vectors
and How to calculate the angle between a line and the horizontal axis?, both using
Math.Atan2.
But there is a problem: P1 and P2 don't give the same values when I use those methods.
Is there a method to get the position of any point on a UIElement (e.g. an Ellipse) relative to the red vector, such that every point at the same angle (here P1 and P2) gives the same value?
Yes, and atan2 is exactly the method you need. We can use the cross and dot products to get the result:
bx = redline_end.x - center.x
by = redline_end.y - center.y
// here bx=0 and by=75
px = p1.x - center.x
py = p1.y - center.y
angle = atan2(px * by - py * bx, px * bx + py * by) //and similar for P2
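In C# that pseudocode might look like this (a sketch; the variable names mirror the ones above and are otherwise my own):
// Signed angle (in degrees) of P1 around the center, measured from the red line.
double bx = redlineEnd.X - center.X;
double by = redlineEnd.Y - center.Y;
double px = p1.X - center.X;
double py = p1.Y - center.Y;

double cross = px * by - py * bx; // 2D cross product
double dot = px * bx + py * by;   // dot product

double angleDegrees = Math.Atan2(cross, dot) * 180.0 / Math.PI;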

Find the point on a circle with given center point, radius, and degree

It's been 10 years since I did any math like this... I am programming a game in 2D and moving a player around. As I move the player around, I am trying to calculate the point on a circle 200 pixels away from the player's position, given a positive OR negative angle (degree) between -360 and 360. The screen is 1280x720, with 0,0 being the center point of the screen. The player moves around this entire Cartesian coordinate system. The point I am trying to find can be off screen.
I tried the formulas in the article Find the point with radius and angle, but I don't believe I understand what "Angle" is, because I get weird results when I pass Angle as -360 to 360 into Cos(angle) or Sin(angle).
So for example I have...
1280x720 on a Cartesian plane
Center Point (the position of player):
let x = a number between minimum -640 to maximum 640
let y = a number between minimum -360 to maximum 360
Radius of Circle around the player: let r always = 200
Angle: let a = a number given between -360 to 360 (allow negative to point downward or positive to point upward so -10 and 350 would give same answer)
What is the formula to return X on the circle?
What is the formula to return Y on the circle?
The simple equations from your link give the X and Y coordinates of the point on the circle relative to the center of the circle.
X = r * cosine(angle)
Y = r * sine(angle)
This tells you how far the point is offset from the center of the circle. Since you have the coordinates of the center (Cx, Cy), simply add the calculated offset.
The coordinates of the point on the circle are:
X = Cx + (r * cosine(angle))
Y = Cy + (r * sine(angle))
You should post the code you are using. That would help identify the problem exactly.
However, since you mentioned measuring your angle in terms of -360 to 360, you are probably using the incorrect units for your math library. Most implementations of trigonometry functions use radians for their input. And if you use degrees instead...your answers will be weirdly wrong.
x_oncircle = x_origin + 200 * cos (degrees * pi / 180)
y_oncircle = y_origin + 200 * sin (degrees * pi / 180)
Note that you might also run into a circumstance where the quadrant is not what you'd expect. This can be fixed by carefully selecting where angle zero is, or by manually checking the quadrant you expect and applying your own signs to the result values.
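Combining the degree-to-radian conversion with the offset from the center, a minimal C# sketch (the method name PointOnCircle is my own):
// Point on a circle of a given radius around the player, for an angle in degrees.
static PointF PointOnCircle(PointF player, float radius, float angleDegrees)
{
    double radians = angleDegrees * Math.PI / 180.0; // Math.Cos/Math.Sin expect radians
    float x = player.X + (float)(radius * Math.Cos(radians));
    float y = player.Y + (float)(radius * Math.Sin(radians));
    return new PointF(x, y);
}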
I highly suggest using matrices for this type of manipulation. It is the most generic approach; see the example below:
// The center point of rotation
var centerPoint = new Point(0, 0);
// Factory method creating the matrix
var matrix = new RotateTransform(angleInDegrees, centerPoint.X, centerPoint.Y).Value;
// The point to rotate
var point = new Point(100, 0);
// Applying the transform that results in a rotated point
Point rotated = Point.Multiply(point, matrix);
Side note: the convention is to measure the angle counter-clockwise, starting from the (positive) X-axis.
I am getting weird results when I pass Angle as -360 to 360 into a Cos(angle) or Sin(angle).
I think the reason your attempt did not work is that you were passing angles in degrees. The sin and cos trigonometric functions expect angles expressed in radians, so the values should range from 0 to 2*PI. For d degrees you pass PI*d/180.0 (in C# the constant is Math.PI; M_PI is its equivalent in C's math.h header).
I also needed this for the movement of the hands of a clock in code. I tried several formulas but they didn't work, so this is what I came up with:
motion - clockwise
points - every 6 degrees (because 360 degrees divided by 60 minutes is 6 degrees)
hand length - 65 pixels
center - x=75, y=75
So the formula would be
x = Cx + r * cos(d * PI / 180)
y = Cy + r * sin(d * PI / 180)
where x and y are the points on the circumference of the circle, Cx and Cy are the x,y coordinates of the center, r is the radius, and d is the angle in degrees.
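As a sketch, the minute-hand example with those numbers (the angle here follows the formula's convention, i.e. 0 degrees at the 3 o'clock position):
// End point of a 65-pixel minute hand, centre at (75, 75), 6 degrees per minute.
int minute = 15;
double d = minute * 6;
double radians = d * Math.PI / 180.0;
double x = 75 + 65 * Math.Cos(radians);
double y = 75 + 65 * Math.Sin(radians);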
Here is a C# implementation. The method takes the radius, center, and angle interval as parameters and returns the points on the circle. The angle interval is passed in radians.
public static List<PointF> getCircularPoints(double radius, PointF center, double angleInterval)
{
List<PointF> points = new List<PointF>();
for (double interval = angleInterval; interval < 2 * Math.PI; interval += angleInterval)
{
double X = center.X + (radius * Math.Cos(interval));
double Y = center.Y + (radius * Math.Sin(interval));
points.Add(new PointF((float)X, (float)Y));
}
return points;
}
and the calling example:
List<PointF> LEPoints = getCircularPoints(10.0f, new PointF(100.0f, 100.0f), Math.PI / 6.0f);
The answer should be exactly the opposite:
X = Xc + r * Sin(angle)
Y = Yc + r * Cos(angle)
where Xc and Yc are the circle's center coordinates and r is the radius. (This swapped form applies when the angle is measured from the positive Y axis rather than from the X axis.)
I recommend:
public static Vector3 RotatePointAroundPivot(Vector3 point, Vector3 pivot, Vector3 angles)
{
return Quaternion.Euler(angles) * (point - pivot) + pivot;
}
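For example (a small Unity-style sketch; the values are mine): rotating the point (1, 0, 0) by 45 degrees around the Y axis about the origin:
Vector3 rotated = RotatePointAroundPivot(new Vector3(1, 0, 0), Vector3.zero, new Vector3(0, 45, 0));
// rotated is roughly (0.707, 0, -0.707)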
You can use this:
Equation of circle
where
(x-k)2+(y-v)2=R2
where k and v is constant and R is radius

Find coordinate by angle

I am developing an application in XNA which draws random paths. Unfortunately, I'm out of touch with graphing, so I'm a bit stuck. My application needs to do the following:
Pick a random angle from my origin (0,0), which is simple.
Draw a circle in relation to that origin, 16px away (or any distance I specify), at the angle found above.
(Excuse my horrible photoshopping)
(image: http://www.refuctored.com/coor.png)
The second circle at (16,16) would represent a 45 degree angle 16 pixels away from my origin.
I would like to have a method in which I pass in my distance and angle that returns a point to graph at. i.e.
private Point GetCoordinate(float angle, int distance)
{
// Do something.
return new Point(x,y);
}
I know this is simple, but again, I'm pretty out of touch with graphing. Any help?
Thanks,
George
If the angle is in degrees, first do:
angle *= Math.PI / 180;
Then:
return new Point((int)(distance * Math.Cos(angle)), (int)(distance * Math.Sin(angle)));
By the way, the point at (16, 16) is not 16 pixels away from the origin, but sqrt(16^2 + 16^2) = sqrt(512) =~ 22.63 pixels.
private Point GetCoordinate(float angle, int distance)
{
float x = (float)Math.Cos(angle) * distance;
float y = (float)Math.Sin(angle) * distance;
return new Point((int)x, (int)y);
}
Note that the trigonometric functions take radians. If your angle is in degrees, convert it first by multiplying by Pi/180 (i.e. dividing by 180/Pi).
in general:
x = d * cos(theta)
y = d * sin(theta)
Where d is the distance from the origin and theta is the angle.
Learn the Pythagorean Theorem. Then this thread should have more specific details for you.
