I have a line segment AB defined by two 2D points A and B.
What I am trying to do is find a point C at distance d from B.
The two constraints are that BC has to be perpendicular to AB, and that BC is always 90 degrees anti-clockwise relative to AB.
So far I have the following:
double d = .01;
Coordinate C = new Coordinate(A.Lat - B.Lat, A.Long - B.Long);
C.Long = C.Long * A.Long * Math.Cos(90);
C.Lat = C.Lat * A.Lat * Math.Cos(90);
C.Long = A.Long + C.Long * d;
C.Lat = A.Lat + C.Lat * d;
Essentially what I am asking is: where am I going wrong with this? Is it the C# code? Is it the logic? What are the steps to solve for C using those two constraints?
Normalize the AB vector and rotate it by 90 degrees:
ABY = B.Y - A.Y
ABX = B.X - A.X
Len = Sqrt(ABY*ABY + ABX*ABX)
C.X = B.X - d * ABY / Len
C.Y = B.Y + d * ABX / Len
Note that for geographic coordinates (Lat/Long) and large distances the result is not exact.
Link for further reference (sections Bearing then Destination point given distance and bearing from start point)
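For reference, here is a minimal C# sketch of the same steps (assuming a simple point type with X and Y doubles; the names are illustrative, not taken from the answer above):
double abX = B.X - A.X;
double abY = B.Y - A.Y;
double len = Math.Sqrt(abX * abX + abY * abY);
// rotate AB by 90 degrees anti-clockwise, normalise, scale by d, and add to B
double cX = B.X - d * abY / len;
double cY = B.Y + d * abX / len;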
MBo has the correct answer for your task (since you have a 90 degree turn). I just wanted to show you how to repair your own code (which I deduced you wanted to do), since that approach works for any angular turn (but it is slower, as it requires trigonometric functions):
d = .01;
a = atan2(B.y - A.y,B.x - A.x) (+/-) 90.0; // sign depends on your coordinate system
C.x = B.x + d*cos(a)
C.y = B.y + d*sin(a)
So you obtain the directional angle a of AB and shift it by 90 degrees. Then you add d, rotated by a, to B to get C, which can be done with the parametric circle equation.
Beware that all angles must be in the units your trigonometric (goniometric) functions accept (either degrees or radians). In C#, Math.Atan2, Math.Sin and Math.Cos work in radians, in which case the line:
a = atan2(B.y - A.y,B.x - A.x) (+/-) 90.0; // sign depends on your
Would change to:
a = atan2(B.y - A.y,B.x - A.x) (+/-) 90.0*Pi/180.0; // sign depends on your
Where Pi=3.1415926535897932384626433832795.
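Put together in C# (where, as noted, the angle is in radians), a sketch of this approach would look like the following (assuming A and B expose X and Y doubles):
double d = 0.01;
double a = Math.Atan2(B.Y - A.Y, B.X - A.X) + 90.0 * Math.PI / 180.0; // or -90, depending on your coordinate system
double cX = B.X + d * Math.Cos(a);
double cY = B.Y + d * Math.Sin(a);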
I would like to draw a rectangle made of Mesh.
I enter the starting point A and the ending point B.
The width of the rectangle is known in advance and is equal to H.
How to correctly determine the coordinates of corner points? (Everything happens in 3D)
There are a lot of theoretical write-ups on the net (but mostly for 2D), and I am trying something like this:
var vAB = B - A;
var P1 = new Vector3(-vAB.z, vAB.y, vAB.x) / Mathf.Sqrt(Mathf.Pow(vAB.x, 2) + Mathf.Pow(vAB.y, 2) + Mathf.Pow(vAB.z, 2)) * 0.5 * H;
But I can't find the correct solution
The simple way should be to use the cross product. The cross product of two vectors is perpendicular to both input vectors. You will need to define the normal of your rectangle; in this example I use Vector3.up. A-B cannot be parallel to the normal vector, or you will get an invalid result.
var l = B - A;
var s = Vector3.Normalize(Vector3.Cross(l, Vector3.up));
var p1 = A + s * h/2;
var p2 = A - s * h/2;
var p3 = B - s * h/2;
var p4 = B + s * h/2;
Here's a quick explanation of the trig involved. There are other tools which will reduce the boilerplate a bit, but this should give you an understanding of the underlying maths.
I've tweaked your problem statement slightly: I'm just showing the XY plane (there's no Z involved), and I've rotated it so that the line AB forms a positive angle with the horizontal (as it makes explaining the maths a bit easier). A is at (x1, y1), B is at (x2, y2).
The first step is to find the angle θ that the line AB makes with the horizontal. Draw a right-angled triangle, where AB is the hypotenuse, and the other two sides are parallel to the X and Y axes:
You can see that the horizontal side has length (x2 - x1), and the vertical side has length (y2 - y1). The angle between the base and the hypotenuse (the line AB) is given by trig, where tan θ = (y2 - y1) / (x2 - x1), so θ = arctan((y2 - y1) / (x2 - x1)).
(Make sure you use the Math.Atan2 method to calculate this angle, as it makes sure the sign of θ is correct).
Next we'll look at the corner P1, which is connected to A. As before, draw a triangle with the two shorter sides being parallel to the X and Y axes:
This again forms a right-angled triangle with hypotenuse H/2. To find the base of the triangle, which is the X-distance between P1 and A, again use trig: H/2 * sin θ. Similarly, the Y-distance between P1 and A is H/2 * cos θ. Therefore P1 = (x1 + H/2 sin θ, y1 - H/2 cos θ).
Repeat the same trick for the other 3 corners, and you'll find the same result, but with differing signs.
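Put into code, a small sketch of the trig above might look like this (plain 2D C#; the method name and tuple return are my own choices, not part of the original answer):
static (double x, double y)[] RectangleCorners(double x1, double y1, double x2, double y2, double H)
{
    double theta = Math.Atan2(y2 - y1, x2 - x1); // angle of AB with the horizontal
    double dx = H / 2 * Math.Sin(theta);         // X offset of each corner from A or B
    double dy = H / 2 * Math.Cos(theta);         // Y offset of each corner from A or B

    return new[]
    {
        (x1 + dx, y1 - dy), // P1
        (x1 - dx, y1 + dy), // P2
        (x2 - dx, y2 + dy), // P3
        (x2 + dx, y2 - dy)  // P4
    };
}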
My approach requires you to use a Transform.
public Transform ht; // helper transform; assign in the inspector or create one at runtime

void calculatePoints(Vector3 A, Vector3 B, float H)
{
    Vector3 direction = B - A;
    ht.position = A;
    ht.rotation = Quaternion.LookRotation(direction, Vector3.up);
    Vector3 P1 = A + ht.right * H / 2f;
    Vector3 P2 = A - ht.right * H / 2f; // "left" is just -ht.right
    Vector3 P3 = B + ht.right * H / 2f;
    Vector3 P4 = B - ht.right * H / 2f;
}
I think it's intuitive to think with "left" and "right" which is why I used the Transform. If you wanted to define a point along the way, you'd be using ht.forward * value added to A, where value would be something between 0 and direction.magnitude.
I've already figured out how to make a 3D trajectory using a start point and an angle.
However, I am trying to make a trajectory from a start point, an end point, and a height.
I tried taking the approach of a parabola on a 2D plane in a 3D space. I calculated the parabola's A, B, and C values as well as the plane it's on, given 3 points on the parabola.
However, I've had a few complications with this sort of calculation; I assume it has to do with the inability to properly calculate a Z-axis without a plane, but I cannot tell.
Other than a 2D parabola on a plane, Google did not provide another possible answer, and searching for a 3D trajectory yields a formula using a start point, an angle, and a power multiplier.
Is there any way to calculate a 3D trajectory given the start point, end point, and height?
I appreciate your help.
Edit:
My code to calculate a parabola using 3 points (in case someone would like to know how I've done that and perhaps fix what I've done wrong)
public Parabola(Vector3 pa, Vector3 pb, Vector3 pc)
{
this.pa = pa;
this.pc = pc;
float a1 = -pa.x * pa.x + pb.x * pb.x, b1 = -pa.x + pb.x, c1 = -pa.y + pb.y;
float a2 = -pb.x * pb.x + pc.x * pc.x, b2 = -pb.x + pc.x, c2 = -pb.y + pc.y;
float bm = -(b2 / b1), a3 = bm * a1 + a2, c3 = bm * c1 + c2;
float a = c3 / a3, b = (c1 - a1 * a) / b1, c = pa.y - a * pa.x * pa.x - b * pa.x;
this.a = a; this.b = b; this.c = c;
plane = Vector3.Cross(pb - pa, pc - pa);
}
public Vector3 GetPoint(float x)
{
float angle = Mathf.Atan2(pc.z - pa.z, pc.x - pa.x) * Mathf.Rad2Deg;
float xs = Mathf.Cos(angle * Mathf.Deg2Rad) * x, zs = Mathf.Sin(angle * Mathf.Deg2Rad) * x;
return new Vector3(xs, a * x * x + b * x + c, zs);
}
public Vector3 ProjectOn(float x) => Vector3.ProjectOnPlane(GetPoint(x), plane);
The result looks OK when it's only on 2 axes, but not 3.
Here are 2 images for demonstration:
Looking at the second image, the parabola seems to be correct aside from being scaled incorrectly. Let's take a look at your code.
public Vector3 GetPoint(float x)
{
float angle = Mathf.Atan2(pc.z - pa.z, pc.x - pa.x) * Mathf.Rad2Deg;
float xs = Mathf.Cos(angle * Mathf.Deg2Rad) * x, zs = Mathf.Sin(angle * Mathf.Deg2Rad) * x;
return new Vector3(xs, a * x * x + b * x + c, zs);
}
I'm making a lot of assumptions here, but it seems like x is meant to be a value that goes from 0.0 to 1.0, representing the start and end of the parabola respectively. If so, you are determining the X and Z coordinates of this point based exclusively on the sine/cosine of the angle and x. This means that xs and zs can only ever be between -1 and 1, limiting you to the confines of the unit circle.
The values xs and zs look like they need to be scaled by a factor s calculated by measuring the 2D distance of the start and end points when projected onto the XZ plane. This should stretch the parabola just enough to reach the end point.
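A sketch of that scaling, under the same assumption that x runs from 0 to 1 (this is my adaptation, not the original poster's code):
public Vector3 GetPoint(float x)
{
    float angle = Mathf.Atan2(pc.z - pa.z, pc.x - pa.x);
    // scale factor: the 2D distance between start and end projected onto the XZ plane
    float s = new Vector2(pc.x - pa.x, pc.z - pa.z).magnitude;
    float xs = Mathf.Cos(angle) * x * s;
    float zs = Mathf.Sin(angle) * x * s;
    return new Vector3(xs, a * x * x + b * x + c, zs);
}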
I found an answer, but it's kinda a workaround.
Before messing around with Parabolas in 3D, I messed around with linear equations in 3D.
Unlike parabolas, lines have a defined equation even in 3D: Pn = P0 + t * V (Pn is a vector containing XYZ, P0 is the initial point containing XYZ, t is a float, V is a Vector3).
In addition, there's only ONE line that goes through 2 points, even in 3D.
I used that to make a trajectory that's made out of 2 points and a height.
I make a new point in the center of those two points and add the height value to the highest Y value of the points, thus creating an Apex.
Then I use the same calculations as before to calculate the A, B, and C values that a parabola through those 3 points would have had.
I made a method that takes in an X value and returns a Vector3 containing the point at that X on the linear equation, but with the vector's Y value replaced using the parabola's equation.
Practically creating an elevated line, I made something that looks and behaves perfectly like a 3D parabola.
If you're in C#, here is the code (images):
FIX!!
In the Linear's GetX(float x) method, it should be:
public Vector3 GetX(float x) => r0 + (x - r0.x) / v.x * v;
I made a slight mistake in the calculations, which I noticed immediately and changed.
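For illustration, here is a self-contained sketch of the approach described above (a hypothetical helper, not the code from the images), parameterizing the line from A to B by t in [0, 1]:
using UnityEngine;

public static class ElevatedLine
{
    // Point on the straight line A -> B, with Y replaced by a parabola
    // through (0, A.y), (0.5, apexY) and (1, B.y).
    public static Vector3 GetPoint(Vector3 A, Vector3 B, float height, float t)
    {
        Vector3 p = Vector3.Lerp(A, B, t);          // point on the 3D line
        float apexY = Mathf.Max(A.y, B.y) + height; // apex above the midpoint

        // fit y(t) = a*t^2 + b*t + c through the three points
        float c = A.y;
        float a = 2f * A.y + 2f * B.y - 4f * apexY;
        float b = B.y - c - a;

        p.y = a * t * t + b * t + c;
        return p;
    }
}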
I am trying to develop an algorithm that involves normalizing GPS coordinates (latitude/longitude). That means that, given two points A (lat1, lon1) and B (lat2, lon2), I would like to insert a point C that lies on the same arc as AB and is placed at a specific distance from A and B (e.g. if the distance from A to B is 0.5 km, I want point C to be 0.1 km from A, on the AB arc). How can I calculate the coordinates of point C?
For the purpose given, it is enough to approximate Earth as a perfect spherical object.
I have found this article, but it gives the formula for the midpoint only (and I don't understand it well enough to adapt it).
midpoint between two latitude and longitude
Thank you.
Edit: I tried this but it gives wrong answers
public static void normalizedPoint(double lat1, double lon1, double lat2, double lon2, double dist){
double constant=Math.PI/180;
double angular = dist/6371;
double a = Math.Sin( 0* angular )/Math.Sin(angular);
double b = Math.Sin(1*angular)/Math.Sin(angular);
double x = a * Math.Cos(lat1) * Math.Cos(lon1) + b * Math.Cos(lat2) * Math.Cos(lon2);
double y = a * Math.Cos(lat1) * Math.Sin(lon1) + b * Math.Cos(lat2) * Math.Sin(lon2);
double z = a * Math.Sin(lat1) + b * Math.Sin (lon2);
double lat3 = Math.Atan2(z, Math.Sqrt( x*x + y*y ));
double lon3 = Math.Atan2(y, x);
Console.WriteLine(lat3/constant + " " + lon3/constant );
}
As far as I understood the original formulas, this should return one of the 2 original points (because the fraction used is 1), but it does not. Also, the variable dist is the distance between the 2 points and is properly calculated (checked with the same website).
Edit 2: I am providing as inputs the coordinates of 2 geographic points (lat1, lon1, lat2, lon2) and the distance between them. I'm trying to get an intermediate point (lat3, lon3).
As I point out in an answer on the linked-to question, you need to change all of your inputs to use radians rather than degrees.
I believe you also had an error for z where you used lon2 rather than lat2.
With those corrections, I get the answer you're seeking:
public static void normalizedPoint(double lat1, double lon1,
                                   double lat2, double lon2,
                                   double dist)
{
    double constant = Math.PI / 180;
    double angular = dist / 6371;
    double a = Math.Sin(0 * angular) / Math.Sin(angular);
    double b = Math.Sin(1 * angular) / Math.Sin(angular);
    double x = a * Math.Cos(lat1 * constant) * Math.Cos(lon1 * constant) +
               b * Math.Cos(lat2 * constant) * Math.Cos(lon2 * constant);
    double y = a * Math.Cos(lat1 * constant) * Math.Sin(lon1 * constant) +
               b * Math.Cos(lat2 * constant) * Math.Sin(lon2 * constant);
    double z = a * Math.Sin(lat1 * constant) + b * Math.Sin(lat2 * constant);
    double lat3 = Math.Atan2(z, Math.Sqrt(x * x + y * y));
    double lon3 = Math.Atan2(y, x);
    Console.WriteLine(lat3 / constant + " " + lon3 / constant);
}
Of course, the above can be vastly simplified by only converting angles once, avoiding repeated calculations of the same Sin/Cos values, etc.
Calling:
normalizedPoint(47.20761, 27.02185, 47.20754, 27.02177, 1);
I get the output:
47.20754 27.02177
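For illustration, a simplified variant along those lines (the names and the fraction parameter f are mine, not the original answer's); f = 0 returns A and f = 1 returns B:
public static (double lat, double lon) IntermediatePoint(
    double lat1, double lon1, double lat2, double lon2, double dist, double f)
{
    const double DegToRad = Math.PI / 180;
    double delta = dist / 6371; // angular distance in radians (mean Earth radius ~6371 km)

    double phi1 = lat1 * DegToRad, lam1 = lon1 * DegToRad;
    double phi2 = lat2 * DegToRad, lam2 = lon2 * DegToRad;

    double a = Math.Sin((1 - f) * delta) / Math.Sin(delta);
    double b = Math.Sin(f * delta) / Math.Sin(delta);

    double x = a * Math.Cos(phi1) * Math.Cos(lam1) + b * Math.Cos(phi2) * Math.Cos(lam2);
    double y = a * Math.Cos(phi1) * Math.Sin(lam1) + b * Math.Cos(phi2) * Math.Sin(lam2);
    double z = a * Math.Sin(phi1) + b * Math.Sin(phi2);

    double lat3 = Math.Atan2(z, Math.Sqrt(x * x + y * y));
    double lon3 = Math.Atan2(y, x);
    return (lat3 / DegToRad, lon3 / DegToRad);
}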
Not sure if the original author found an answer, but since I had a similar problem and developed a working solution, I think it would be good to post it here.
The Problem
Having two geographical points, A and B, find an intermediate point C which lies exactly on the direct way from A to B and is N kilometers away from A (where N is less than the distance between A and B, otherwise C = B).
My Context
I was developing a small pet project based on a microservices architecture. The idea was to launch a missile from a given deployment platform (point A) to a chosen target location (point B). I had to create some kind of simulator that sends messages about the current missile geolocation after it is launched, so I had to find those intermediate points between A and B somehow.
Solution Context
Eventually, I developed a C# solution based on this great web page - https://www.movable-type.co.uk/scripts/latlong.html.
That web page has all the explanations, formulas and JavaScript code at the bottom. If you are not familiar with C#, you can use their JavaScript implementation.
My C# Implementation
Your input is Location A, Location B and the distance.
You need to find the bearing from A to B (see the 'Bearing' section on that site)
You need to find position C from A using that bearing (see 'Destination point given distance and bearing from start point' on that site)
The Code
I have a working solution as part of my pet project, and it can be found here - https://github.com/kakarotto67/mlmc/blob/master/src/Services/MGCC.Api/ChipSimulation/CoordinatesHelper.cs.
(Since the original class is subject to change in the future, you might need to refer to this gist - https://gist.github.com/kakarotto67/ef682bb5b3c8bd822c7f3cbce86ff372)
Usage
// 1. Find bearing between A and B
var bearing = CoordinatesHelper.FindInitialBearing(pointA, pointB);
// 2. Find intermediate point C having bearing (above) and any distance in km
var pointC = CoordinatesHelper.GetIntermediateLocation(pointA, bearing, distance);
I hope somebody will find this helpful.
import numpy as np

def get_intermediate_point(lat1, lon1, lat2, lon2, d):
    constant = np.pi / 180
    R = 6371  # mean Earth radius in km
    φ1 = lat1 * constant
    λ1 = lon1 * constant
    φ2 = lat2 * constant
    λ2 = lon2 * constant
    # initial bearing from point 1 to point 2
    y = np.sin(λ2 - λ1) * np.cos(φ2)
    x = np.cos(φ1) * np.sin(φ2) - np.sin(φ1) * np.cos(φ2) * np.cos(λ2 - λ1)
    θ = np.arctan2(y, x)
    brng = (θ * 180 / np.pi + 360) % 360  # in degrees
    brng = brng * constant
    # destination point at distance d along that bearing
    φ3 = np.arcsin(np.sin(φ1) * np.cos(d / R) + np.cos(φ1) * np.sin(d / R) * np.cos(brng))
    λ3 = λ1 + np.arctan2(np.sin(brng) * np.sin(d / R) * np.cos(φ1),
                         np.cos(d / R) - np.sin(φ1) * np.sin(φ3))
    return φ3 / constant, λ3 / constant
Hi, I am writing a small script that analyzes point data. Everything is almost done, but I am stuck on finding the intersection between two lines (edges) with a given length between those points. The image below illustrates the problem better:
Edit: it's only a two-dimensional problem, and the distances DB and BE should be equal.
Say you want the distance DE to be a given L. Your points {D} and {E} are
{D} = {B} + x * {a}
{E} = {B} + x * {c}
where {a} is the normalised vector BA and {c} is the normalised vector BC. (These vectors have to be of the same length so that the same factor x can be used for both. Normalisation is the easiest way to enforce this.)
Now you have the equation:
L = |{D} - {E}|
= |x*{a} - x*{c}|
Broken down to the vector components:
L = sqrt((x*ax - x*cx)² + (x*ay - x*cy)²)
= x * sqrt((ax - cx)² + (ay - cy)²)
Solve for x:
x = L / sqrt((ax - cx)² + (ay - cy)²)
and use the found x in the first equations above.
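A minimal C# sketch of those steps (using System.Numerics.Vector2; the method name is mine):
static (Vector2 D, Vector2 E) SplitPoints(Vector2 A, Vector2 B, Vector2 C, float L)
{
    Vector2 a = Vector2.Normalize(A - B); // unit vector from B towards A
    Vector2 c = Vector2.Normalize(C - B); // unit vector from B towards C
    float x = L / (a - c).Length();       // distance from B along both rays
    return (B + x * a, B + x * c);
}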
I have 2 points on a 2D plane. One already has a vector that determines in which direction it will move.
Now I want to add a vector to this existing vector, so that it accelerates in the direction of the other point.
To be a bit more clear: it is about 2 asteroids flying in space (only 2D), and gravitation should move them a bit closer to each other.
What I have built so far is this:
c = body.position - body2.position;
dist = c.Length();
acc = (body.masse * body2.masse) / (dist * dist);
xDist = body2.position.X - body.position.X;
yDist = body2.position.Y - body.position.Y;
direction = MathHelper.ToDegrees((float)(Math.Atan2((double)yDist, (double)xDist)));
body.velocity.Y = body.velocity.Y + (float)(Math.Sin(direction) * acc);
body.velocity.X = body.velocity.X + (float)(Math.Cos(direction) * acc);
At the moment the direction calculated is completely off. Surely I am just making a stupid mistake, but I have no idea what it is.
You need to pass your direction angle in radians to Math.Sin and Math.Cos (rather than in degrees, as you do in your sample code).
see also:
http://msdn.microsoft.com/en-us/library/system.math.sin.aspx
The angle, a, must be in radians. Multiply by Math.PI/180 to convert degrees to radians.
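Concretely, the fix amounts to keeping the angle in radians (a sketch based on the code in the question, not a drop-in replacement):
float xDist = body2.position.X - body.position.X;
float yDist = body2.position.Y - body.position.Y;
float direction = (float)Math.Atan2(yDist, xDist); // radians, no ToDegrees conversion

body.velocity.X += (float)(Math.Cos(direction) * acc);
body.velocity.Y += (float)(Math.Sin(direction) * acc);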
My mechanics and linear algebra are a bit rusty, but I think you should be able to do it without resorting to trigonometry. These formulae probably need tweaking; I'm not sure if I got u and -u mixed up.
Here it is in pseudo code
T is whatever time period you're iterating over
G is the gravitational constant
body1 starts with a velocity of v1
body2 starts with a velocity of v2
c = body.position - body2.position
use the vector c to get a vector of length 1 in the direction of the force
u = c / c.Length()
body1 should have an acceleration vector of a1 = G * body2mass/c.Length()^2 * (-u)
body2 should have an acceleration vector of a2 = G * body1mass/c.Length()^2 * (u)
body1 has a new velocity vector of v1 + a1*T
body2 has a new velocity vector of v2 + a2*T
rinse and repeat
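In C#/XNA terms, that pseudocode might look roughly like this (a sketch; G and T are the constants described above, and masse follows the field name in the question's code):
Vector2 c = body.position - body2.position;   // from body2 towards body1
Vector2 u = Vector2.Normalize(c);             // unit vector along c
float distSq = c.LengthSquared();

Vector2 a1 = G * body2.masse / distSq * (-u); // acceleration of body1, towards body2
Vector2 a2 = G * body.masse / distSq * u;     // acceleration of body2, towards body1

body.velocity += a1 * T;   // T is the time step
body2.velocity += a2 * T;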
I'm not completely sure what you're trying to do. Why can't you just use Vector2.Add(v1, v2)?