Drawing a line parallel to a polyline - C#

I'm creating a program in WPF that draws a polyline, plus a second line a fixed offset away that should stay parallel to it. It works perfectly for the first pair of segments, but on each following segment the right-hand line is off at an angle (shown in red).
Code so far:
private void DrawingCanvas_MouseLeftButtonUp(object sender, MouseButtonEventArgs e) {
    if (polylineLeft != null) {
        var canvas = (Canvas)sender;
        leftSegment.Points[1] = e.GetPosition(canvas);
        var distance = (leftSegment.Points[0] - leftSegment.Points[1]).Length;
        if (distance >= 20) {
            polylineLeft.Points.Add(leftSegment.Points[1]);
            // calculate second, parallel line
            var L = Math.Sqrt((leftSegment.Points[0].X - leftSegment.Points[1].X) *
                              (leftSegment.Points[0].X - leftSegment.Points[1].X) +
                              (leftSegment.Points[0].Y - leftSegment.Points[1].Y) *
                              (leftSegment.Points[0].Y - leftSegment.Points[1].Y));
            var x1p = leftSegment.Points[0].X + width * (leftSegment.Points[1].Y - leftSegment.Points[0].Y) / L;
            var x2p = leftSegment.Points[1].X + width * (leftSegment.Points[1].Y - leftSegment.Points[0].Y) / L;
            var y1p = leftSegment.Points[0].Y + width * (leftSegment.Points[0].X - leftSegment.Points[1].X) / L;
            var y2p = leftSegment.Points[1].Y + width * (leftSegment.Points[0].X - leftSegment.Points[1].X) / L;
            if (!initialLeftPoint) {
                polylineRight.Points.Clear();
                polylineRight.Points.Add(new Point(x1p, y1p));
                initialLeftPoint = true;
            }
            polylineRight.Points.Add(new Point(x2p, y2p));
            leftSegment.Points[0] = leftSegment.Points[1];
            rightSegment.Points[0] = rightSegment.Points[1];
        } else {
            if (polylineLeft.Points.Count < 2) {
                canvas.Children.Remove(polylineLeft);
            }
            polylineLeft = null;
            polylineRight = null;
            leftSegment.Points.Clear();
            rightSegment.Points.Clear();
            canvas.Children.Remove(leftSegment);
            canvas.Children.Remove(rightSegment);
        }
    }
}
How do I ensure that my second line (red) stays parallel to the main line (green)?

One part of the problem is quite easy to solve with the help of the WPF Vector structure. Given a line segment between the two Points p1 and p2, you could calculate the normal vector like this:
Point p1 = ...
Point p2 = ...
var v = p2 - p1;
var n = new Vector(v.Y, -v.X);
n.Normalize();
// now n is a Vector of length 1, perpendicular to the line p1-p2
You could now create a parallel line segment (given by Points p3 and p4) like this:
var distance = 20d;
var p3 = p1 + n * distance;
var p4 = p3 + v;
However, the above code creates a parallel line segment of the same length as the original one. This may not be exactly what you want, as I guess you want to create a "parallel polyline". If that is the case, things get a bit more complicated because you would also have to calculate the intersections between adjacent segments of the parallel polyline. It may even happen that some of these segments disappear during these calculations.
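To illustrate that last point, here is a minimal sketch (my own, not part of the answer, and using plain value tuples instead of the WPF Point/Vector types so it runs anywhere): each segment is shifted along its normal exactly as described above, and adjacent offset segments are then joined at the intersection of their supporting lines. It deliberately ignores the degenerate cases mentioned above (segments that vanish or self-intersect at sharp angles).

```csharp
using System;
using System.Collections.Generic;

static class PolylineOffset
{
    // Offset every segment of a polyline by 'distance' along its
    // normal, then join consecutive offset segments at the
    // intersection of their supporting (infinite) lines.
    public static List<(double X, double Y)> Offset(
        IReadOnlyList<(double X, double Y)> pts, double distance)
    {
        var segs = new List<((double X, double Y) A, (double X, double Y) B)>();
        for (int i = 0; i < pts.Count - 1; i++)
        {
            double vx = pts[i + 1].X - pts[i].X;
            double vy = pts[i + 1].Y - pts[i].Y;
            double len = Math.Sqrt(vx * vx + vy * vy);
            double nx = vy / len, ny = -vx / len; // unit normal, as in the answer
            segs.Add(((pts[i].X + nx * distance, pts[i].Y + ny * distance),
                      (pts[i + 1].X + nx * distance, pts[i + 1].Y + ny * distance)));
        }
        var result = new List<(double X, double Y)> { segs[0].A };
        for (int i = 0; i < segs.Count - 1; i++)
            result.Add(Intersect(segs[i], segs[i + 1]));
        result.Add(segs[segs.Count - 1].B);
        return result;
    }

    static (double X, double Y) Intersect(
        ((double X, double Y) A, (double X, double Y) B) s1,
        ((double X, double Y) A, (double X, double Y) B) s2)
    {
        double d1x = s1.B.X - s1.A.X, d1y = s1.B.Y - s1.A.Y;
        double d2x = s2.B.X - s2.A.X, d2y = s2.B.Y - s2.A.Y;
        double denom = d1x * d2y - d1y * d2x;
        if (Math.Abs(denom) < 1e-9) return s1.B; // (nearly) collinear: keep the endpoint
        double t = ((s2.A.X - s1.A.X) * d2y - (s2.A.Y - s1.A.Y) * d2x) / denom;
        return (s1.A.X + t * d1x, s1.A.Y + t * d1y);
    }

    static void Main()
    {
        // An L-shaped polyline offset by 2: the joint lands at (12, -2).
        var offset = Offset(new[] { (0.0, 0.0), (10.0, 0.0), (10.0, 10.0) }, 2.0);
        foreach (var p in offset)
            Console.WriteLine(p); // (0, -2), (12, -2), (12, 10)
    }
}
```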

Related

Why is my radial tree layout drawing algorithm producing crossed edges?

I am implementing a radial layout drawing algorithm according to the publication of Mr. Andy Pavlo link [page 18].
The problem is that my result contains crossed edges, which is unacceptable. I found a solution to a similar problem link, but I was not able to apply it to this algorithm (I would have had to change the whole approach). In addition, the algorithm by Mr. Andy Pavlo should already avoid this problem: when we look at the result of his algorithm, there are no crossed edges. What am I doing wrong? Am I missing something? Thank you in advance.
Mr. Pavlo's pseudocode of the algorithm
My implementation of the algorithm
public void RadialPositions(Tree<string> rootedTree, Node<string> vertex, double alfa, double beta,
    List<RadialPoint<string>> outputGraph)
{
    // check if vertex is the root of rootedTree
    if (vertex.IsRoot)
    {
        vertex.Point.X = 0;
        vertex.Point.Y = 0;
        outputGraph.Add(new RadialPoint<string>
        {
            Node = vertex,
            Point = new Point { X = 0, Y = 0 },
            ParentPoint = null
        });
    }
    // depth of vertex, starting from 0
    int depthOfVertex = vertex.Depth;
    double theta = alfa;
    double radius = Constants.CircleRadius + (Constants.Delta * depthOfVertex);
    // number of leaves in the subtree rooted at vertex
    int leavesNumber = BFS.BreatFirstSearch(vertex);
    foreach (var child in vertex.Children)
    {
        // number of leaves in the subtree rooted at child
        int lambda = BFS.BreatFirstSearch(child);
        double mi = theta + ((double)lambda / leavesNumber * (beta - alfa));
        double x = radius * Math.Cos((theta + mi) / 2.0);
        double y = radius * Math.Sin((theta + mi) / 2.0);
        // set x and y
        child.Point.X = x;
        child.Point.Y = y;
        outputGraph.Add(new RadialPoint<string>
        {
            Node = child,
            Point = new Point { X = x, Y = y, Radius = radius },
            ParentPoint = vertex.Point
        });
        if (child.Children.Count > 0)
        {
            child.Point.Y = y;
            child.Point.X = x;
            RadialPositions(rootedTree, child, theta, mi, outputGraph);
        }
        theta = mi;
    }
}
BFS algorithm for counting leaves
public static int BreatFirstSearch<T>(Node<T> root)
{
    var visited = new List<Node<T>>();
    var queue = new Queue<Node<T>>();
    int leaves = 0;
    visited.Add(root);
    queue.Enqueue(root);
    while (queue.Count != 0)
    {
        var current = queue.Dequeue();
        if (current.Children.Count == 0)
            leaves++;
        foreach (var node in current.Children)
        {
            if (!visited.Contains(node))
            {
                visited.Add(node);
                queue.Enqueue(node);
            }
        }
    }
    return leaves;
}
Initial call
var outputPoints = new List<RadialPoint<string>>();
alg.RadialPositions(tree, tree.Root, 0, 360, outputPoints);
Mr. Pavlo's result
My result on a simple sample
Math.Cos and Sin expect the input angle to be in radians, not degrees. In your initial method call, your upper angle limit (beta) should be 2 * Math.PI, not 360. This will ensure that all the angles you calculate will be in radians and not degrees.
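A quick way to see the difference, and the corrected initial call:

```csharp
using System;

class RadiansDemo
{
    static void Main()
    {
        // Math.Cos / Math.Sin interpret their argument as radians,
        // so passing 360 does NOT mean "a full turn":
        Console.WriteLine(Math.Cos(360));         // about -0.284 (cosine of 360 radians)
        Console.WriteLine(Math.Cos(2 * Math.PI)); // 1 (a full turn)

        // Hence the initial call should be:
        // alg.RadialPositions(tree, tree.Root, 0, 2 * Math.PI, outputPoints);
    }
}
```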

Windows Phone 8.1 - Issue with calculating covered distance

I'm writing an app that calculates the distance covered in real time. I do this by comparing my current position to the previous one, computing the distance between them (using the haversine formula), and summing everything up. This gives me a result, the distance grows, and everything seems to be working. BUT the problem is the accuracy. I tested the app many times on a route that is ~15 km long, and the covered distance is always bigger than my car's odometer says. The discrepancy is different every time, anywhere from 500 m to 2 km.
This is my Geolocator object:
Geolocator gl = new Geolocator() {
    DesiredAccuracy = PositionAccuracy.High,
    MovementThreshold = 20,
    ReportInterval = 50
};
In the constructor I subscribe the OnPositionChanged handler for position changes and call the method that finds my current location:
gl.PositionChanged += OnPositionChanged;
setMyLocation();
This is the "OnPositionChanged()" method:
private async void OnPositionChanged(Geolocator sender, PositionChangedEventArgs e)
{
    await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
    {
        setMyLocation();
    });
}
This is the setMyLocation() method:
private async void setMyLocation()
{
    try
    {
        p = new Position();
        location = await gl.GetGeopositionAsync(TimeSpan.FromMinutes(5), TimeSpan.FromSeconds(5));
        p.Latitude = location.Coordinate.Point.Position.Latitude;
        p.Longitude = location.Coordinate.Point.Position.Longitude;
        obla = location.Coordinate.Point.Position.Latitude;
        oblo = location.Coordinate.Point.Position.Longitude;
        if (prev_location.Latitude == 0)
        {
            prev_location = p;
        }
        positions.Add(new BasicGeoposition() {
            Latitude = location.Coordinate.Latitude,
            Longitude = location.Coordinate.Longitude
        });
        myMap.Children.Remove(myCircle);
        myCircle = new Ellipse();
        myCircle.Fill = new SolidColorBrush(Colors.Green);
        myCircle.Height = 17;
        myCircle.Width = 17;
        myCircle.Opacity = 50;
        myMap.Children.Add(myCircle);
        MapControl.SetLocation(myCircle, location.Coordinate.Point);
        MapControl.SetNormalizedAnchorPoint(myCircle, new Point(0, 0));
        routeKM += Distance(p, prev_location);
        Distance_txt.Text = routeKM.ToString("f2") + " km";
        prev_location = p;
    }
    catch
    {
    }
}
The distance added to routeKM is computed with the haversine formula:
public double Distance(Position pos1, Position pos2)
{
    var R = 6371d; // radius of the earth in km
    var dLat = Deg2Rad(pos2.Latitude - pos1.Latitude);
    var dLon = Deg2Rad(pos2.Longitude - pos1.Longitude);
    var a = Math.Sin(dLat / 2d) * Math.Sin(dLat / 2d) +
            Math.Cos(Deg2Rad(pos1.Latitude)) * Math.Cos(Deg2Rad(pos2.Latitude)) *
            Math.Sin(dLon / 2d) * Math.Sin(dLon / 2d);
    var c = 2d * Math.Atan2(Math.Sqrt(a), Math.Sqrt(1d - a));
    var d = R * c; // distance in km
    return d;
}
double Deg2Rad(double deg)
{
    return deg * (Math.PI / 180d);
}
So my question is: how can I improve the accuracy? I'm not happy that my car's odometer shows as much as 2 km less than my app.
By setting DesiredAccuracy = PositionAccuracy.High you have already asked the Windows Phone OS for the highest accuracy.
I think you should change your distance-finding logic and see if that helps.
Use GeoCoordinate.GetDistanceTo as suggested in the following answer:
https://stackoverflow.com/a/6366657/744616
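Whichever distance formula is used, a large part of the overshoot comes from summing many small, noisy hops: each fix carries a random error, and adding up consecutive hop lengths accumulates that error. The following toy simulation (my own sketch, not part of the answer above) shows the effect on a straight route with a simulated zig-zag GPS error, and why a minimum-movement threshold helps:

```csharp
using System;

class GpsNoiseDemo
{
    // Sum hop lengths along a track, optionally ignoring hops shorter
    // than minHop (this is what Geolocator.MovementThreshold is for).
    public static double PathLength(double[] xs, double[] ys, double minHop)
    {
        double total = 0, px = xs[0], py = ys[0];
        for (int i = 1; i < xs.Length; i++)
        {
            double d = Math.Sqrt((xs[i] - px) * (xs[i] - px) +
                                 (ys[i] - py) * (ys[i] - py));
            if (d >= minHop)
            {
                total += d;
                px = xs[i];
                py = ys[i];
            }
        }
        return total;
    }

    static void Main()
    {
        // A straight 100 m route sampled every metre, with a simulated
        // 3 m zig-zag error perpendicular to the direction of travel.
        int n = 101;
        var xs = new double[n];
        var ys = new double[n];
        for (int i = 0; i < n; i++)
        {
            xs[i] = i;
            ys[i] = (i % 2 == 0) ? 0.0 : 3.0;
        }
        Console.WriteLine(PathLength(xs, ys, 0));  // about 316: badly inflated
        Console.WriteLine(PathLength(xs, ys, 10)); // 100: close to the truth
    }
}
```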

Calculated center point from a list of latitudes and longitudes is slightly different from the actual one

I am calculating a center lat/lng from a list of available lat/lng pairs in C# and rendering it on an OpenLayers map.
I observed that the calculation gives me a slightly different center lat/lng. I am referring to this link for the calculation:
Calculate the center point of multiple latitude/longitude coordinate pairs.
C# Code:
static void Main(string[] args)
{
    List<GeoCoordinate> listCoordinate = new List<GeoCoordinate>();
    listCoordinate.Add(new GeoCoordinate() { Latitude = 22.9833, Longitude = 72.5000 }); // Sarkhej
    listCoordinate.Add(new GeoCoordinate() { Latitude = 18.9750, Longitude = 72.8258 }); // Mumbai
    listCoordinate.Add(new GeoCoordinate() { Latitude = 22.3000, Longitude = 73.2003 }); // Vadodara
    listCoordinate.Add(new GeoCoordinate() { Latitude = 26.9260, Longitude = 75.8235 }); // Jaipur
    listCoordinate.Add(new GeoCoordinate() { Latitude = 28.6100, Longitude = 77.2300 }); // Delhi
    listCoordinate.Add(new GeoCoordinate() { Latitude = 22.3000, Longitude = 70.7833 }); // Rajkot
    // Output: Latitude 23.696708071960074, Longitude 73.681549202080149
    GeoCoordinate centerCoordinate = GetCentralGeoCoordinate(listCoordinate);
    Console.WriteLine("Lat:" + centerCoordinate.Latitude + ",Lon:" + centerCoordinate.Longitude);
    Console.ReadKey();
}
public static GeoCoordinate GetCentralGeoCoordinate(List<GeoCoordinate> geoCoordinates)
{
    if (geoCoordinates.Count == 1)
    {
        return geoCoordinates.Single();
    }
    double x = 0, y = 0, z = 0;
    foreach (var geoCoordinate in geoCoordinates)
    {
        var latitude = geoCoordinate.Latitude * Math.PI / 180;
        var longitude = geoCoordinate.Longitude * Math.PI / 180;
        x += Math.Cos(latitude) * Math.Cos(longitude);
        y += Math.Cos(latitude) * Math.Sin(longitude);
        z += Math.Sin(latitude);
    }
    var total = geoCoordinates.Count;
    x = x / total;
    y = y / total;
    z = z / total;
    var centralLongitude = Math.Atan2(y, x);
    var centralSquareRoot = Math.Sqrt(x * x + y * y);
    var centralLatitude = Math.Atan2(z, centralSquareRoot);
    return new GeoCoordinate(centralLatitude * 180 / Math.PI, centralLongitude * 180 / Math.PI);
}
JavaScript Code:
var arrLonLat = [
{'Lon' : 72.5000, 'Lat' : 22.9833},
{'Lon' : 72.8258, 'Lat' : 18.9750},
{'Lon' : 73.2003, 'Lat' : 22.3000},
{'Lon' : 75.8235, 'Lat' : 26.9260},
{'Lon' : 77.2300, 'Lat' : 28.6100},
{'Lon' : 70.7833, 'Lat' : 22.3000}];
var centerLonLat = getCenterLonLat(arrLonLat);
var lonLatSarkhej = new OpenLayers.LonLat(arrLonLat[0].Lon,arrLonLat[0].Lat).transform(epsg4326,projectTo);
var lonLatMumbai = new OpenLayers.LonLat(arrLonLat[1].Lon,arrLonLat[1].Lat).transform(epsg4326,projectTo);
var lonLatVadodara = new OpenLayers.LonLat(arrLonLat[2].Lon,arrLonLat[2].Lat).transform(epsg4326,projectTo);
var lonLatJaipur = new OpenLayers.LonLat(arrLonLat[3].Lon,arrLonLat[3].Lat).transform(epsg4326,projectTo);
var lonLatDelhi = new OpenLayers.LonLat(arrLonLat[4].Lon,arrLonLat[4].Lat).transform(epsg4326,projectTo);
var lonLatRajkot = new OpenLayers.LonLat(arrLonLat[5].Lon,arrLonLat[5].Lat).transform(epsg4326,projectTo);
//Center Point of Average Markers
var lonLatCenter = new OpenLayers.LonLat(73.681549202080149,23.696708071960074).transform(epsg4326,projectTo);
var markers = new OpenLayers.Layer.Markers("Markers");
map.addLayer(markers);
var size = new OpenLayers.Size(24,24);
var offset = new OpenLayers.Pixel(-(size.w/2), -size.h);
var icon = new OpenLayers.Icon('icon/Marker-Pink.png', size, offset);
var iconCenter = new OpenLayers.Icon('icon/Marker-Green.png', size, offset);
markers.addMarker(new OpenLayers.Marker(lonLatSarkhej,icon)); //Sarkhej
markers.addMarker(new OpenLayers.Marker(lonLatMumbai,icon.clone())); //Mumbai
markers.addMarker(new OpenLayers.Marker(lonLatVadodara,icon.clone())); //Vadodara
markers.addMarker(new OpenLayers.Marker(lonLatJaipur,icon.clone())); //Jaipur
markers.addMarker(new OpenLayers.Marker(lonLatDelhi,icon.clone())); //Delhi
markers.addMarker(new OpenLayers.Marker(lonLatRajkot,icon.clone())); //Rajkot
I render the 6 locations with pink markers and the center with a green marker.
Please see the image below for more clarification.
I have drawn a box to judge the center marker (green), which is not actually in the center. I think it should be positioned on the green dot where the horizontal and vertical green lines cross.
Can anybody tell me whether my center point is calculated correctly or not? Why is it not displayed at the center of the box?
I have also added a ruler for checking the center point.
Please help me find the actual solution, and do let me know if you need more details.
The correct center actually depends on your definition of "center", and there are several possibilities:
1. One might calculate the smallest rectangle containing all points and use the center of this rectangle, as you did with your green lines and green point. (This is unusual.)
2. One might ask which point is closest to all other points, i.e. find a center point such that the sum of distances between this center and all other points is smallest (abs-norm, used for practical problems).
3. One might ask which center point results in the least error, i.e. the smallest (sum of) squared distances to all other points (used in math, statistics, etc.).
You see, depending on the definition you'll have to use a different algorithm, and you will arrive at a different center point.
The algorithm you show in your question seems to calculate (2.).
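To make the difference concrete, here is a self-contained sketch (my own, using the question's six coordinates) that computes both the 3D-average center from the question's code and the bounding-box center that the green lines suggest. The two are legitimately different points, which is why the green marker does not sit in the middle of the box:

```csharp
using System;
using System.Linq;

class CenterDemo
{
    // The same 3D averaging on the unit sphere as the question's
    // GetCentralGeoCoordinate, without the GeoCoordinate dependency.
    public static (double Lat, double Lon) SphericalAverage(double[] lats, double[] lons)
    {
        double x = 0, y = 0, z = 0;
        for (int i = 0; i < lats.Length; i++)
        {
            double la = lats[i] * Math.PI / 180;
            double lo = lons[i] * Math.PI / 180;
            x += Math.Cos(la) * Math.Cos(lo);
            y += Math.Cos(la) * Math.Sin(lo);
            z += Math.Sin(la);
        }
        x /= lats.Length; y /= lats.Length; z /= lats.Length;
        double lat = Math.Atan2(z, Math.Sqrt(x * x + y * y)) * 180 / Math.PI;
        double lon = Math.Atan2(y, x) * 180 / Math.PI;
        return (lat, lon);
    }

    // The "green lines" definition: center of the smallest enclosing box.
    public static (double Lat, double Lon) BoundingBoxCenter(double[] lats, double[] lons)
    {
        return ((lats.Min() + lats.Max()) / 2, (lons.Min() + lons.Max()) / 2);
    }

    static void Main()
    {
        // The six cities from the question.
        double[] lats = { 22.9833, 18.9750, 22.3000, 26.9260, 28.6100, 22.3000 };
        double[] lons = { 72.5000, 72.8258, 73.2003, 75.8235, 77.2300, 70.7833 };
        Console.WriteLine(SphericalAverage(lats, lons));  // about (23.6967, 73.6815)
        Console.WriteLine(BoundingBoxCenter(lats, lons)); // (23.7925, 74.00665)
    }
}
```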

Mandelbrot Fractal not working

I tried making this Mandelbrot fractal generator, but when I run it, I get an output that looks like a circle.
I'm not sure exactly why this happens. I think something may be wrong with my coloring, but even then the shape itself would still be incorrect.
public static Bitmap Generate(
    int width,
    int height,
    double realMin,
    double realMax,
    double imaginaryMin,
    double imaginaryMax,
    int maxIterations,
    int bound)
{
    var bitmap = new FastBitmap(width, height);
    var planeWidth = Math.Abs(realMin) + Math.Abs(realMax);            // total width of the plane
    var planeHeight = Math.Abs(imaginaryMin) + Math.Abs(imaginaryMax); // total height of the plane
    var realStep = planeWidth / width;        // amount to step by on the real axis per pixel
    var imaginaryStep = planeHeight / height; // amount to step by on the imaginary axis per pixel
    var realScaling = width / planeWidth;
    var imaginaryScaling = height / planeHeight;
    var boundSquared = bound ^ 2;
    for (var real = realMin; real <= realMax; real += realStep) // loop over the real axis
    {
        for (var imaginary = imaginaryMin; imaginary <= imaginaryMax; imaginary += imaginaryStep) // loop over the imaginary axis
        {
            var z = Complex.Zero;
            var c = new Complex(real, imaginary);
            var iterations = 0;
            for (; iterations < maxIterations; iterations++)
            {
                z = z * z + c;
                if (z.Real * z.Real + z.Imaginary * z.Imaginary > boundSquared)
                {
                    break;
                }
            }
            if (iterations == maxIterations)
            {
                bitmap.SetPixel(
                    (int)((real + Math.Abs(realMin)) * realScaling),
                    (int)((imaginary + Math.Abs(imaginaryMin)) * imaginaryScaling),
                    Color.Black);
            }
            else
            {
                var nsmooth = iterations + 1 - Math.Log(Math.Log(Complex.Abs(z))) / Math.Log(2);
                var color = MathHelper.HsvToRgb(0.95f + 10 * nsmooth, 0.6, 1.0);
                bitmap.SetPixel(
                    (int)((real + Math.Abs(realMin)) * realScaling),
                    (int)((imaginary + Math.Abs(imaginaryMin)) * imaginaryScaling),
                    color);
            }
        }
    }
    return bitmap.Bitmap;
}
Here's one error:
var boundSquared = bound ^ 2;
This should be:
var boundSquared = bound * bound;
The ^ operator means xor.
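A quick demonstration of what `^` actually computes:

```csharp
using System;

class XorDemo
{
    static void Main()
    {
        // ^ is bitwise XOR in C#, not exponentiation:
        Console.WriteLine(10 ^ 2);          // 8, because 1010 XOR 0010 == 1000
        Console.WriteLine(10 * 10);         // 100
        Console.WriteLine(Math.Pow(10, 2)); // 100
    }
}
```

With `bound ^ 2` the escape radius is far too small, so almost every point "escapes" after one iteration and the visible shape collapses toward a circle.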

Hough with AForge on Kinect. How can I get the right lines?

I want to use my Kinect to recognise certain objects. One of the methods I want to use is the Hough transformation, for which I am using the AForge library.
But the lines it finds are totally wrong: important ones are missing, and there are many useless lines in the picture.
To take care of this problem, my plan was:
First, detect the 100 most intense lines via Hough, to be sure that all the correct lines are among them.
Then write all points from the edge detection into a list and check, for each line, whether it is supported by at least X of those points (I used 15, 50 and 150); nonetheless, my results are bad.
Do you have any idea how I can find the right lines before drawing them into the picture and analysing the geometry any further? Or maybe there is just a grave mistake in my code that I don't see.
SobelEdgeDetector Kante = new SobelEdgeDetector();
System.Drawing.Bitmap neu2 = Kante.Apply(neu1);
neu2.Save("test.png");
// collect all possible edge points into a list
for (int a = 0; a < 320; a++)
{
    for (int b = 0; b < 240; b++)
    {
        color = neu2.GetPixel(a, b);
        if ((color.R + color.G + color.B) / 3 >= 50)
        {
            Kantenpunkte.Add(new System.Drawing.Point(a, b));
        }
    }
}
Bitmap Hough = new Bitmap(320, 240);
Hough.Save("C:\\Users\\Nikolas Rieble\\Desktop\\Hough.png");
// Hough transform
HoughLineTransformation lineTransform = new HoughLineTransformation();
// apply the Hough line transform
lineTransform.ProcessImage(neu2);
Bitmap houghLineImage = lineTransform.ToBitmap();
houghLineImage.Save("1.png");
// get the most intensive lines
HoughLine[] lines = lineTransform.GetMostIntensiveLines(100);
UnmanagedImage fertig = UnmanagedImage.FromManagedImage(neu2);
foreach (HoughLine line in lines)
{
    // get the line's radius and theta values
    int r = line.Radius;
    double t = line.Theta;
    // check if the line is in the lower part of the image
    if (r < 0)
    {
        t += 180;
        r = -r;
    }
    // convert degrees to radians
    t = (t / 180) * Math.PI;
    // get image centers (all coordinates are measured relative to the center)
    int w2 = neu2.Width / 2;
    int h2 = neu2.Height / 2;
    double x0 = 0, x1 = 0, y0 = 0, y1 = 0;
    if (line.Theta != 0)
    {
        // non-vertical line
        x0 = -w2; // leftmost point
        x1 = w2;  // rightmost point
        // calculate the corresponding y values
        y0 = (-Math.Cos(t) * x0 + r) / Math.Sin(t);
        y1 = (-Math.Cos(t) * x1 + r) / Math.Sin(t);
    }
    else
    {
        // vertical line
        x0 = line.Radius;
        x1 = line.Radius;
        y0 = h2;
        y1 = -h2;
    }
    // count the number of detected edge points that lie on this line
    int a = 0;
    foreach (System.Drawing.Point p in Kantenpunkte)
    {
        double m1 = ((double)p.Y - y0) / ((double)p.X - x0);
        double m2 = (y0 - y1) / (x0 - x1);
        if (m1 - m2 < 0.0001)
        {
            a = a + 1;
        }
    }
    if (a > 150) // only draw lines that cover at least this many points
    {
        AForge.Imaging.Drawing.Line(fertig,
            new IntPoint((int)x0 + w2, h2 - (int)y0),
            new IntPoint((int)x1 + w2, h2 - (int)y1),
            System.Drawing.Color.Red);
    }
}
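One likely culprit is the point-counting test: `m1 - m2 < 0.0001` has no Math.Abs (so any line whose slope exceeds the point's slope counts every point), it divides by zero for vertical lines, and a small slope difference far from the reference point can still mean a large pixel distance. A sketch of a perpendicular-distance filter instead (my own suggestion, not from the post, using plain tuples so it runs standalone):

```csharp
using System;

static class LineFilter
{
    // Count the points whose perpendicular distance to the infinite
    // line through a and b is at most maxDist pixels. This works for
    // vertical lines and measures actual pixel distance rather than
    // a slope difference.
    public static int CountNearLine(
        (double X, double Y)[] points,
        (double X, double Y) a,
        (double X, double Y) b,
        double maxDist)
    {
        double dx = b.X - a.X, dy = b.Y - a.Y;
        double len = Math.Sqrt(dx * dx + dy * dy);
        int count = 0;
        foreach (var p in points)
        {
            // distance from p to the line through a and b (cross-product form)
            double d = Math.Abs(dy * (p.X - a.X) - dx * (p.Y - a.Y)) / len;
            if (d <= maxDist) count++;
        }
        return count;
    }

    static void Main()
    {
        var edgePoints = new[] { (0.0, 0.0), (5.0, 5.0), (5.0, 0.0) };
        // two of the three points lie within 1 px of the line y = x
        Console.WriteLine(CountNearLine(edgePoints, (0.0, 0.0), (10.0, 10.0), 1.0)); // 2
    }
}
```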
