Procedural texture UV error - C#

I am creating a game with a procedurally generated plane and UVs. It works mostly fine, but there is one place where the UV mapping looks distorted in a very confusing way.
Intro: the UVs are mapped in "global" UV coordinates instead of local ones, because the texture has to be aligned with adjacent planes. Each texture has a tiling of 2*2, so I introduce a uvScale of 2, which means each plane's "v" coordinates will be mapped into the range [0, 0.5] of the texture vertically, to make sure the "u" coordinates do not go out of the range [0, 1] horizontally (horizontally the noise is unpredictable to some extent).
code:
Vector2[] CalculateU(float[] leftXMap, float[] rightXMap) {
    int length = leftXMap.Length;
    float totalHeight = (length - 1) * 1f;   // distance between two vertices vertically is 1
    float uvheight = totalHeight * uvScale;  // uvScale is 2: maps a plane's v range into [0, 0.5]
    Vector2[] UMap = new Vector2[length * 2];

    for (int i = 0; i < length; i++) {
        float left = leftXMap[i];
        float right = rightXMap[i];

        // make left and right positive
        while (left < 0) {
            left += uvheight;
        }
        while (right < 0) {
            right += uvheight;
        }

        float leftu = (left % uvheight) / uvheight;
        float leftv = (i / (float)(length - 1)) / uvScale;
        float rightu = (right % uvheight) / uvheight;
        float rightv = leftv;

        UMap[i * 2] = new Vector2(leftu, leftv);
        UMap[i * 2 + 1] = new Vector2(rightu, rightv);
    }

    return UMap;
}
Explanation:
The parameters of the function are the noise maps for the generated plane; the noise values are the x coordinates, while the y coordinates are simply 0, 1, 2, ..., (length - 1).
(Screenshots omitted: the distorted texture, and a zoomed-in view.)

I managed to solve the problem. The UV map gets messed up wherever the left noise map values are negative and the right noise map values are positive.
Because I always remapped negative values into the positive range, the left noise map (which is also my left vertex x coordinate) got shifted positive and could then end up larger than the right noise map (the right vertex x coordinate). That is what broke the UVs.
I solved it by not remapping at all. I suddenly realised that UV values can be negative, so I don't have to remap at all!
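For reference, here is a minimal sketch of what that fix can look like under one reading of "not mapping at all": the negative-value while loops are dropped and the texture's Repeat wrap mode is left to handle negative or out-of-range UVs (variable names follow the snippet above):
Vector2[] CalculateU(float[] leftXMap, float[] rightXMap) {
    int length = leftXMap.Length;
    float uvheight = (length - 1) * uvScale; // vertex spacing of 1, as above
    Vector2[] UMap = new Vector2[length * 2];
    for (int i = 0; i < length; i++) {
        // u may be negative or greater than 1; a repeating texture samples it correctly
        float leftu = leftXMap[i] / uvheight;
        float rightu = rightXMap[i] / uvheight;
        float v = (i / (float)(length - 1)) / uvScale;
        UMap[i * 2] = new Vector2(leftu, v);
        UMap[i * 2 + 1] = new Vector2(rightu, v);
    }
    return UMap;
}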

Related

Get a UnityEngine.UI.Image's position in screen space and calculate a normalised offset (inside an overlay canvas)

The problem at hand, simplified:
Given a UnityEngine.UI.Image, how does one find the X, Y position of a normalised offset (like 0.4, 0.3 from the top left) inside that image, in screen-space units like 400, 300?
I guess I need to find the top-left screen-space value, and then, knowing the rendered total size of the image, scale the normalised offsets by the actual size ratio expressed in pixels.
Figure 1: (image omitted)
Figure 2: (image omitted; it shows the normalisedOffsets that are to be used)
So, in précis, I need to find the offset in screen-space pixels of the top left of the Rect I have stored against the image.
I recognise it is probably a combination of Camera.main.ViewportToWorldPoint() and some reference to the bounds, possibly scaling that by backgroundImage.sprite.pixelsPerUnit? I'm struggling to visualise how exactly to get this done.
Thanks.
I'm assuming you don't have any scale or rotation on the image's parents, and that the Y position is 0.
First you can get the position of the upper left corner of your image with rectTransform.GetWorldCorners():
//Upper left corner
Vector3[] v = new Vector3[4];
image.rectTransform.GetWorldCorners(v);
var recPos = v[1];
Then you have to transform your normalised offset into a world-space offset using the ratio between your rect size and your image size, and add the position of the top-left corner:
var recWidth = image.rectTransform.sizeDelta.x;
var imgWidth = image.sprite.texture.width;
var realOffsetX = offsetX * (recWidth / imgWidth);
var realPosX = recPos.x + realOffsetX;
(It is the same formula for the Y coordinate, except that you subtract realOffsetY, because the offset is measured from the top-left corner.)
Here is the full method:
private Vector3 GetPositionOffset(Image image, float offsetX, float offsetY)
{
    // Upper left corner
    Vector3[] v = new Vector3[4];
    image.rectTransform.GetWorldCorners(v);
    var recPos = v[1];

    // X coordinate
    var recWidth = image.rectTransform.sizeDelta.x;
    var imgWidth = image.sprite.texture.width;
    var realOffsetX = offsetX * (recWidth / imgWidth);
    var realPosX = recPos.x + realOffsetX;

    // Y coordinate
    var recHeight = image.rectTransform.sizeDelta.y;
    var imgHeight = image.sprite.texture.height;
    var realOffsetY = offsetY * (recHeight / imgHeight);
    var realPosY = recPos.y - realOffsetY;

    // Position
    return new Vector3(realPosX, realPosY, image.transform.position.z);
}
Then, if you want to convert this world-space position to screen space, just use the camera method:
camera.WorldToScreenPoint(positionOffset);
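A quick usage sketch under the same assumptions (backgroundImage is the Image from the question and uiCamera is whatever camera renders the canvas; both names are placeholders). The normalised 0.4, 0.3 offsets are converted to sprite pixels first, since GetPositionOffset scales by (rect size / sprite size):
// hypothetical usage, not part of the original answer
float offsetXPixels = 0.4f * backgroundImage.sprite.texture.width;
float offsetYPixels = 0.3f * backgroundImage.sprite.texture.height;
Vector3 worldPos = GetPositionOffset(backgroundImage, offsetXPixels, offsetYPixels);
Vector3 screenPos = uiCamera.WorldToScreenPoint(worldPos);
Debug.Log(screenPos);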

Math: Mercator Projection Convert 360 video pixel coordinate to sphere surface coordinate

I was trying to map the 360 video pixel coordinates to sphere surface coordinates, but I couldn't get the right result; everything just maps to the wrong position. I already have the XY pixel data points for the 360 video.
how map 2d grid points (x,y) onto sphere as 3d points (x,y,z)
I checked this link and copied the method from it, but what I'm getting is not mapped to the desired position.
How can I get the radius from the pixels?
I am not sure if I'm passing the right radius for imageRadius, but I thought it would be circumference / PI, and the video resolution is 4096x2048. I also tried passing the number 1, because UV is 0-1, but that was not right either.
Is the method wrong?
Maybe the method is wrong. I passed random numbers into imageRadius but couldn't get the right position. If I make X a negative number, the result seems a little bit closer...?
Current Result
https://youtu.be/t0I7Hlb-tbk
It maps to an up-and-right position with the method that I found online.
Project File
https://drive.google.com/a/swordfish-sf.com/file/d/0B45RYzVs0t0_VVdaaHdmNHRWTk0/view?usp=sharing
If somebody could check the Unity project file, that would be great.
Current Code
public class mapScript : MonoBehaviour {

    public int input = 4098;
    float imageRadius = 4098f / Mathf.PI; //2098? 3072? 4098?
    float radius;
    public GameObject testSphere;

    void Start () {
        radius = this.transform.localScale.x;
    }

    void Update () {
        imageRadius = input / Mathf.PI;
        int currentFrame = (int)this.GetComponent<VideoPlayer>().frame;
        testSphere.transform.position = MercatorProjection(mapVals[currentFrame,0], mapVals[currentFrame,1]);
    }

    Vector3 MercatorProjection(float xVal, float yVal)
    {
        float lon = (xVal / imageRadius);
        float lat = (2 * Mathf.Atan(Mathf.Exp(yVal / imageRadius)) - Mathf.PI / 2);
        float calcX = radius * Mathf.Cos(lat) * Mathf.Cos(lon);
        float calcY = radius * Mathf.Cos(lat) * Mathf.Sin(lon);
        float calcZ = radius * Mathf.Sin(lat);
        Vector3 result = new Vector3(calcX, calcY, calcZ);
        Debug.Log(result);
        return result;
    }

    float[,] mapVals = new float[,] {
        {1969.21f, 928.625f},
        {1969.6f, 928.533f},
        {1968.92f, 928.825f},
        {1968.68f, 929f},
        {1968.47f, 929.067f},
        {1968.41f, 929.025f},
        {1968.48f, 928.992f},
        ....
    };
}
Thank you.
As a side note, the radius is arbitrary. The pixel coordinates only map to the directional coordinates (polar [θ] and azimuthal [ϕ] angles).
We can do this by mapping each pixel to equal θ and ϕ intervals. The diagram below illustrates a low-resolution setup:
Let us adopt the convention that, for an image of width W, ϕ = 0 corresponds to:
Even W: halfway between X = floor((W - 1) / 2) and X = ceil((W - 1) / 2)
Odd W: the middle of the pixel column at X = floor((W - 1) / 2)
The pixel row at Y maps to the equilatitudinal line at θ = (Y + 0.5) / H * π.
To map all pixels in their entirety, let X start at -0.5 instead of 0, and end at W - 0.5; likewise for Y. Since integer coordinates map to the centers of the pixel regions shown above, this allows the whole area of any particular pixel to be addressed. You may need this later on if you plan on doing multi-sampling filtering for e.g. anti-aliasing.
Code:
Vector3 Mercator(float x, float y, int w, int h)
{
    // outside of valid pixel region
    if (x < -0.5f || x >= w - 0.5f || y < -0.5f || y >= h - 0.5f)
        return new Vector3();

    // theta: polar angle from the pole, in [0, PI]; phi: azimuth, in [-PI, PI]
    float theta = (y + 0.5f) / h * Mathf.PI;
    float phi = ((x + 0.5f) / w - 0.5f) * 2.0f * Mathf.PI;

    float s_t = Mathf.Sin(theta);
    return new Vector3(s_t * Mathf.Cos(phi), s_t * Mathf.Sin(phi), Mathf.Cos(theta));
}
... and multiply the resulting direction vector by any "radius" you like, since it has (basically) nothing to do with the mapping anyway.
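For example, a rough sketch of how the asker's Update loop could use this, assuming the question's mapVals table, testSphere and radius fields, the 4096x2048 video resolution, and that the script sits on the sphere object itself (this is an illustration, not part of the original answer):
void Update()
{
    int currentFrame = (int)GetComponent<VideoPlayer>().frame;
    if (currentFrame < 0 || currentFrame >= mapVals.GetLength(0))
        return; // no tracking data for this frame

    // unit direction on the sphere for the tracked pixel, scaled by the sphere radius
    Vector3 dir = Mercator(mapVals[currentFrame, 0], mapVals[currentFrame, 1], 4096, 2048);
    testSphere.transform.position = transform.position + dir * radius;
}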

rotation matrix to facing direction vector

I have the position of a character as xyz coordinates, e.g. x = 102, y = 0.75, z = -105.7, and I have the rotation matrix for the character as:
M11 = -0.14   M12 = 0   M13 = -0.99
M21 = 0       M22 = 1   M23 = 0
M31 = 0.99    M32 = 0   M33 = 0.14
I don't have much understanding of graphics or how this data correlates to the character's facing direction. I want to find a vector that I can use to aim in the direction the character is facing. How do I do that?
The direction your character is facing is the 3rd row of the rotation matrix. So that would be:
Vector3 facingDirection = new Vector3(0.99f, 0f, 0.14f);//(m31,m32,m33)
this vector appears to be normalized, but if it weren't, you would do:
Vector3 facingDirection = Vector3.Normalize(new Vector3(0.99f, 0f, 0.14f));
If the matrix is an XNA Matrix, then you would simply use the Matrix.Forward property of the Matrix structure and get the same result.
I was finally able to resolve it. It's actually pretty simple vector mathematics. The rotation matrix was already giving me the direction vector, as Steve suggests above, but I had to fire along that line towards some particular point to make my animation work. So I actually needed a point along the line given by the direction vector. I just calculated a point 10000 units away from the character's current position along the direction vector and fired at it, and it worked!
Vector3 facingDirection = new Vector3(RotMatrix[2, 0], RotMatrix[2, 1], RotMatrix[2, 2]); // direction vector
Vector3 currentPos = new Vector3(character.PosX, character.PosY, character.PosZ); // I also have position of the character
// Now calculate a point 10000 units away along the vector line
var px = currentPos.X + facingDirection.X * 10000;
var py = currentPos.Y + facingDirection.Y * 10000;
var pz = currentPos.Z + facingDirection.Z * 10000;
return new Vector3(px, py, pz);

Calculating the Area of a Closed Polygon on a Plane

I'm attempting to calculate the area of a polygon that lies on a plane (a collection of co-planar points forming a non-intersecting closed shape). I know a method that can calculate the area of an irregular (or any) polygon in two dimensions, but not in three. My solution is to rotate the plane so that its normal points along the Z direction (so I can treat it as 2D) and then run the 2D area function.
The problem is that I have NO idea how to actually determine the rotation axes and amounts needed to lay the plane flat along the Z axis. I do my rotation through the easiest method I could find for three-dimensional rotation: rotation matrices. So, given that I'm trying to use rotation matrices, how do I figure out the angles to rotate my plane by so that it is oriented in the same direction as another vector? I don't actually know much calculus or Euclidean geometry, so whichever solution requires me to teach myself the least of both is the ideal solution. Is there a better way?
Here's my attempt below, which doesn't even come close to getting the plane flat on the Z axis. This is an instance method of my Surface class, which derives from my Plane class and has an array of co-planar points (IntersectPoints) forming a closed polygon.
public virtual double GetArea()
{
    Vector zUnit = new Vector(0, 0, 1); // unit vector along the Z axis
    Vector nUnit = _normal.AsUnitVector();
    Surface tempSurface = null;
    double result = 0;

    if (nUnit != zUnit && zUnit.Dot(nUnit) != 0) // dot product of 0 = perpendicular to z
    {
        tempSurface = (Surface)Clone();
        double xAxisAngle = Vector.GetAxisAngle(nUnit, zUnit, Physics.Formulae.Axes.X);
        double yAxisAngle = Vector.GetAxisAngle(nUnit, zUnit, Physics.Formulae.Axes.Y);
        double rotationAngle = Vector.GetAxisAngle(nUnit, zUnit, Physics.Formulae.Axes.Z);
        tempSurface.Rotate(xAxisAngle, yAxisAngle, rotationAngle); // rotating the plane so that it lies flat in the XY plane
    }
    else
    {
        tempSurface = this;
    }

    for (int x = 0; x < tempSurface.IntersectPoints.Count; x++) // doing a cross sum of each point
    {
        Point curPoint = tempSurface.IntersectPoints[x];
        Point nextPoint;
        if (x == tempSurface.IntersectPoints.Count - 1)
        {
            nextPoint = tempSurface.IntersectPoints[0];
        }
        else
        {
            nextPoint = tempSurface.IntersectPoints[x + 1];
        }

        double cross1 = curPoint.X * nextPoint.Y;
        double cross2 = curPoint.Y * nextPoint.X;
        result += (cross1 - cross2); // add the cross sum of each pair of points to the result
    }

    return Math.Abs(result / 2); // divide the cross sum by 2 and take its absolute value to get the area
}
And here are my core rotation and get axis angle methods:
private Vector Rotate(double degrees, int axis)
{
    if (degrees <= 0) return this;
    if (axis < 0 || axis > 2) return this;

    degrees = degrees * (Math.PI / 180); // convert to radians
    double sin = Math.Sin(degrees);
    double cos = Math.Cos(degrees);
    double[][] matrix = new double[3][];

    // normalizing really small numbers to actually be zero
    if (Math.Abs(sin) < 0.00000001)
    {
        sin = 0;
    }
    if (Math.Abs(cos) < 0.0000001)
    {
        cos = 0;
    }

    // getting our rotation matrix
    switch (axis)
    {
        case 0: // x axis
            matrix = new double[][]
            {
                new double[] {1, 0, 0},
                new double[] {0, cos, sin * -1},
                new double[] {0, sin, cos}
            };
            break;
        case 1: // y axis
            matrix = new double[][]
            {
                new double[] {cos, 0, sin},
                new double[] {0, 1, 0},
                new double[] {sin * -1, 0, cos}
            };
            break;
        case 2: // z axis
            matrix = new double[][]
            {
                new double[] {cos, sin * -1, 0},
                new double[] {sin, cos, 0},
                new double[] {0, 0, 1}
            };
            break;
        default:
            return this;
    }

    return Physics.Formulae.Matrix.MatrixByVector(this, matrix);
}
public static double GetAxisAngle(Point a, Point b, Axes axis, bool inDegrees = true)
{ // pretty sure this doesn't actually work
    double distance = GetDistance(a, b);
    double difference;

    switch (axis)
    {
        case Axes.X:
            difference = b.X - a.X;
            break;
        case Axes.Y:
            difference = b.Y - a.Y;
            break;
        case Axes.Z:
            difference = b.Z - a.Z;
            break;
        default:
            difference = 0;
            break;
    }

    double result = Math.Acos(difference / distance);
    if (inDegrees == true)
    {
        return result * 57.2957; // 57.2957 degrees = 1 radian
    }
    else
    {
        return result;
    }
}
A robust way to do this is to do a sum of the cross-products of the vertices of each edge. If your vertices are co-planar, this will produce a normal to the plane, whose length is 2 times the area of the closed polygon.
Note that this method is very similar to the 2D method linked in your question, which actually calculates a 2D equivalent of the 3D cross-product, summed for all edges, then divides by 2.
Vector normal = points[count-1].cross(points[0]);
for (int i = 1; i < count; ++i) {
    normal += points[i-1].cross(points[i]);
}
double area = normal.length() * 0.5;
Advantages of this method:
If your vertices are only approximately planar, it still gives the right answer
It doesn't depend on the angle of the plane.
In fact you don't need to deal with the angle at all.
If you want to know the plane orientation, you've got the normal already.
One possible difficulty: if your polygon is very small, and a long way away from the origin, you can get floating point precision problems. If that case is likely to arise, you should first translate all of your vertices so that one is at the origin, like so:
Vector normal(0,0,0);
Vector origin = points[count-1];
for (int i = 1; i < count-1; ++i) {
    normal += (points[i-1]-origin).cross(points[i]-origin);
}
double area = normal.length() * 0.5;
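For anyone wanting this in C#: here is a minimal self-contained sketch of the same technique using System.Numerics.Vector3 rather than the asker's Point/Vector classes (which are not shown in full), so treat it as an illustration to adapt:
using System;
using System.Numerics;

static class PolygonArea
{
    // Sum the cross products of consecutive vertices (taken relative to one vertex
    // to reduce precision loss), then take half the length of the resulting normal.
    public static double Area(Vector3[] points)
    {
        if (points == null || points.Length < 3)
            return 0.0;

        Vector3 origin = points[points.Length - 1];
        Vector3 normal = Vector3.Zero;
        for (int i = 1; i < points.Length - 1; i++)
        {
            normal += Vector3.Cross(points[i - 1] - origin, points[i] - origin);
        }
        return normal.Length() * 0.5;
    }
}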
You don't need to rotate the plane (or all of the points). Just calculate the area of the polygon's projection onto the Z plane (provided the polygon's plane is not perpendicular to it), for example with your GetArea function, and divide the result by the cosine of the angle between the polygon's plane and the Z plane; that cosine is the scalar product of zUnit and nUnit (where nUnit is the unit normal vector of the polygon's plane):
TrueArea = GetArea() / zUnit.Dot(nUnit)
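A sketch of how that could look as a method on the asker's Surface class (IntersectPoints, _normal and the Point/Vector types are the ones from the code above; the Math.Abs guards against a downward-pointing normal):
public double GetAreaByProjection()
{
    // shoelace sum over the XY projection of the polygon
    double projected = 0;
    int n = IntersectPoints.Count;
    for (int i = 0; i < n; i++)
    {
        Point cur = IntersectPoints[i];
        Point next = IntersectPoints[(i + 1) % n];
        projected += cur.X * next.Y - cur.Y * next.X;
    }
    projected = Math.Abs(projected / 2);

    // undo the foreshortening: divide by |cos| of the angle between the
    // polygon's plane and the XY plane (the Z component of the unit normal)
    double cos = Math.Abs(new Vector(0, 0, 1).Dot(_normal.AsUnitVector()));
    return projected / cos; // undefined when cos == 0 (polygon edge-on to the XY plane)
}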

Find vertices in equilateral triangle mesh originating from a center vertex

I'd like to ask whether there is code out there or if you can give me some help in writing some (C#, but I guess the maths is the same everywhere).
I'd like to specify a center point from which an equilateral triangle mesh is created and get the vertex points of these triangles. The center point should not be a face center, but a vertex itself.
A further input would be the size of the triangles (i.e. side length) and a radius within which triangle vertices are generated.
The reason behind it is that I want to create a mesh which is centered nicely on the screen/window center with as little code as possible. I just find mesh generation code, but not a "radial outward propagation" example.
In the end, I'd like to have the subsequently farther away vertices being displaced in a logarithmic fashion, but I guess that's just an easy addition once the mesh code is there.
Can anybody help me with that? Thanks!
You need to specify two things, a radius and the direction that the first triangle points.
The radius will be the distance from the initial point to the vertices of the first triangle. All triangles will have the same radius.
The direction is specified in radians. I will assume that 0 means pointing to the right (PI would be pointing to the left).
Finding the vertices of the first triangle can be done like this (pseudo-code, not language specific):
float theta = 0;           // The direction, 0 means pointing to the right
float thetaInc = TWO_PI/3; // 3 because you want a triangle
for (int i = 0; i < 3; i++) {
    vertX[i] = initialPointX + cos(theta)*radius;
    vertY[i] = initialPointY + sin(theta)*radius;
    theta += thetaInc;
}
There are many ways to find the center points of the neighboring triangles. One way would be to use the same code but initialize theta = TWO_PI/6, replace radius with foo (see math below), assign new center points of neighboring triangles in the for loop, and then use the same code with an appropriately rotated direction (theta += PI) to find the vertices of those triangles.
Distance from one triangle center to another only knowing radius:
hypotenuse = sqrt(sq(radius)+sq(radius));
halfHypotenuse = hypotenuse/2.0;
Pythagorean theorem to find distance from center of triangle to center of an edge: foo = sqrt(sq(radius)-sq(halfHypotenuse));
Final distance = foo*2.0;
Code to find the center points of the neighboring triangles:
float[] nx = new float[3];
float[] ny = new float[3];
float theta = TWO_PI/6;
float hyp = sqrt(sq(radius)+sq(radius));
float halfHyp = hyp/2.0;
float foo = sqrt((sq(radius)-sq(halfHyp)))*2.0;
for (int i = 0; i < 3; i++) {
nx[i] = initialPointX+cos(theta)*foo;
ny[i] = initialPointY+sin(theta)*foo;
theta += thetaInc;
}
Thank you very much for your answer. I will play around with your code; the propagation part will come in handy for sure.
In the meantime I have played around with hexagons instead of triangles, and this code works fairly well for the same purpose:
// populate the array from the centre hex, going outwards the desired number of hex rings
for (int i = 0; i < numberOfHexagonRings; i++)
{
    for (double j = -i; j <= i; j++)
        for (double k = -i; k <= i; k++)
            for (double l = -i; l <= i; l++)
                if ((Math.Abs(j) + Math.Abs(k) + Math.Abs(l) == i * 2) && (j + k + l == 0))
                {
                    positionX = (int)(screenCenterX + ((double)sideLength * (l / 2 + j)));
                    positionY = (int)(screenCenterY + (3.0 / 2.0 * ((double)sideLength / Math.Sqrt(3)) * l));
                    // ...
                }
}
