Why does my 3D texture based volume rendering have no 3D effect? - c#

I implemented a 3D texture based volume renderer using OpenTK. The code is based on this project, but the result shows no 3D effect, just a flat 2D image.
The pipeline is simple enough: (1) load the 3D texture; (2) draw a series of quads (rectangles) and specify the corresponding texture coordinates.
The vertices of each quad are generated with:
private void generateVertices(int n)
{
    vertices = new float[n * 3 * 4];
    int cur;
    for (int i = 0; i < n; i++)
    {
        cur = 3 * 4 * i;
        vertices[cur] = -.5f;
        vertices[cur + 1] = -.5f;
        vertices[cur + 2] = -0.5f + i / n;
        vertices[cur + 3] = -.5f;
        vertices[cur + 4] = .5f;
        vertices[cur + 5] = -0.5f + i / n;
        vertices[cur + 6] = .5f;
        vertices[cur + 7] = .5f;
        vertices[cur + 8] = -0.5f + i / n;
        vertices[cur + 9] = .5f;
        vertices[cur + 10] = -.5f;
        vertices[cur + 11] = -0.5f + i / n;
    }
}
To draw 256 quads, I just call generateVertices(256) and the OpenGL routine
GLDrawElements(GL_QUADS,...)
The texture coordinates are calculated from the vertex positions in the vertex shader like this:
texCoord = aPosition+vec3(0.5f,0.5f,0.5f);
Any ideas are welcome.

With i an integer in the range [0, n-1], the expression i / n is integer division and is always 0.
So you are in fact specifying n overlapping quads at the same depth.
The solution is simple: explicitly cast i to float, i.e. (float)i / n.
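For illustration, a minimal sketch of the fix inside the loop above (only the z component changes; the cast makes the division floating-point):

    // Depth of slice i, now computed with floating-point division.
    float z = -0.5f + (float)i / n;   // or i / (float)(n - 1) if the last slice should land exactly at +0.5
    vertices[cur + 2] = z;
    vertices[cur + 5] = z;
    vertices[cur + 8] = z;
    vertices[cur + 11] = z;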

Related

Why does my procedural grid mesh not triangulate properly for grids bigger than 256x256?

I've been making procedural terrain height maps with the diamond-square algorithm, and building the mesh with the triangulation method below:
public Map GenerateMap()
{
    Mesh mapMesh = new();
    vertices = new Vector3[(Resolution + 1) * (Resolution + 1)];
    Vector2[] uv1 = new Vector2[vertices.Length];
    Vector2[] uv2 = new Vector2[vertices.Length];
    Vector2[] uv3 = new Vector2[vertices.Length];
    DiamondSquare diamondSquare = new(Resolution, Roughness, Seed, HeightLevels);
    float[,] heightFloatMap = diamondSquare.DoDiamondSquare();
    tex = new Texture2D(Resolution, Resolution);
    for (int y = 0, i = 0; y <= Resolution; y++)
    {
        for (int x = 0; x <= Resolution; x++, i++)
        {
            //float height = heightMap.GetPixel(x,y).r;
            float height = heightFloatMap[x, y];
            vertices[i] = new Vector3(x * CellSize.x, height * CellSize.y, y * CellSize.z);
            tex.SetPixel(x, y, new Color(height, height, height, 1));
            if (height == 0)
                uv1[i] = new Vector2(vertices[i].x, vertices[i].z);
            else if (height < 0.4)
                uv2[i] = new Vector2(vertices[i].x, vertices[i].z);
            else if (height < 0.4)
                uv3[i] = new Vector2(vertices[i].x, vertices[i].z);
        }
    }
    mapMesh.vertices = vertices;
    mapMesh.uv = uv1;
    mapMesh.uv2 = uv2;
    int[] triangles = new int[Resolution * Resolution * 6];
    Cell[,] cellMap = new Cell[Resolution / 4, Resolution / 4];
    for (int ti = 0, vi = 0, y = 0; y < Resolution; y++, vi++)
    {
        for (int x = 0; x < Resolution; x++, ti += 6, vi++)
        {
            triangles[ti] = vi;
            triangles[ti + 3] = triangles[ti + 2] = vi + 1;
            triangles[ti + 4] = triangles[ti + 1] = vi + Resolution + 1;
            triangles[ti + 5] = vi + Resolution + 2;
            Vector3[] cellVerts = new Vector3[]
            {
                vertices[vi], vertices[vi + 1], vertices[vi + Resolution + 1], vertices[vi + Resolution + 2]
            };
            Cell cell = new(new Vector2Int(x, y), cellVerts, CalculateCellGeometry(cellVerts));
            cellMap[x / 4, y / 4] = cell;
        }
    }
    mapMesh.triangles = triangles;
    mapMesh.RecalculateNormals();
    mapMesh.RecalculateTangents();
    mapMesh.RecalculateBounds();
    Map map = new(mapMesh, cellMap, heightFloatMap, vertices);
    return map;
}
This works fine with grid sizes 16x16, 32x32 ... 256x256, but breaks when I try it on 512x512 or above.
At 256x256 the mesh is perfect.
At 512x512 it triangulates successfully up until the rows starting at y=128, and on the underside of the terrain there are these bars.
I've mapped out the vertices generated from 512x512 and above resolutions and they are all good, so I'm 99% sure it's down to the triangulation.
I'm new to procedural meshes and am stumped by this issue; any help would be greatly appreciated.
Turns out it wasn't the triangulation: the vertex limit was being reached because my mesh was using a 16-bit index buffer.
I added this line
mapMesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
and the issue is fixed. An annoying oversight on my part, but that's part of the learning process!
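For context, the numbers are consistent with the default 16-bit index buffer, which can only address about 65,535 vertices: a 512x512 grid has 513 vertices per row, and 65,536 / 513 ≈ 127.8, so geometry that references vertices beyond roughly row 127 can no longer be indexed correctly, which is where the mesh breaks down. A minimal sketch of where the fix goes (field names taken from the code above):

    Mesh mapMesh = new();
    // 513 * 513 = 263,169 vertices at 512x512, far beyond the 16-bit limit,
    // so switch to 32-bit indices before assigning the large buffers.
    mapMesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
    // ... vertex and triangle generation as above ...
    mapMesh.vertices = vertices;
    mapMesh.triangles = triangles;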

Why do my triangles start overlapping when terrainSize is too high? [duplicate]

This question already has an answer here:
Is there maximum number of meshes in Unity? [duplicate]
Closed last year.
I have a weird problem. At the moment I am writing my own terrain generator; for now I am only generating a flat "plane" for the bottom area. My problem is that when I set terrainSize too high, the triangles start overlapping.
Here is a picture when I set terrainSize to 120:
Here is a picture when I set terrainSize to 200:
At size 200 it looks like it's overlapping twice. I found out that the max terrainSize for me is 114; at 115 it starts overlapping.
Here is my code; maybe you can spot something and help me and others on this platform:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class MeshGeneratorSecond : MonoBehaviour
{
    [SerializeField] private int terrainSize;
    private Mesh myMesh;
    private Vector3[] vertices;
    private int[] triangles;
    private int verticesInVertex = 5;
    private int trianglesInVertex = 4;

    void Start()
    {
        myMesh = new Mesh();
        GetComponent<MeshFilter>().mesh = myMesh;
        vertices = new Vector3[terrainSize * terrainSize * 5];
        triangles = new int[terrainSize * terrainSize * 12];
        StartCoroutine(CreateShape());
    }

    IEnumerator CreateShape()
    {
        int vertex = 0;
        int triangle = 0;
        for (int x = 0; x < terrainSize; x++)
        {
            for (int z = 0; z < terrainSize; z++)
            {
                vertices[vertex] = new Vector3(x, 0, z);
                vertices[vertex + 1] = new Vector3(x, 0, z + 1);
                vertices[vertex + 2] = new Vector3(x + 1, 0, z + 1);
                vertices[vertex + 3] = new Vector3(x + 1, 0, z);
                vertices[vertex + 4] = new Vector3(x + 0.5f, 0, z + 0.5f);
                //First Triangle
                triangles[triangle] = vertex;
                triangles[triangle + 1] = vertex + 1;
                triangles[triangle + 2] = vertex + 4;
                //Second Triangle
                triangles[triangle + 3] = vertex + 1;
                triangles[triangle + 4] = vertex + 2;
                triangles[triangle + 5] = vertex + 4;
                //Third Triangle
                triangles[triangle + 6] = vertex + 2;
                triangles[triangle + 7] = vertex + 3;
                triangles[triangle + 8] = vertex + 4;
                //Fourth Triangle
                triangles[triangle + 9] = vertex + 3;
                triangles[triangle + 10] = vertex;
                triangles[triangle + 11] = vertex + 4;
                vertex += verticesInVertex;
                triangle += trianglesInVertex * 3;
            }
            UpdateMesh();
            yield return new WaitForSeconds(.1f);
        }
    }

    private void UpdateMesh()
    {
        myMesh.Clear();
        myMesh.vertices = vertices;
        myMesh.triangles = triangles;
        myMesh.RecalculateNormals();
    }

    public void OnDrawGizmos()
    {
        Gizmos.color = Color.red;
        for (int i = 0; i < vertices.Length; i++)
        {
            Gizmos.DrawSphere(vertices[i], .1f);
        }
    }
}
Mesh index buffers are 16 bit by default. See Mesh.indexFormat:
"Index buffer can either be 16 bit (supports up to 65535 vertices in a mesh), or 32 bit (supports up to 4 billion vertices). Default index format is 16 bit, since that takes less memory and bandwidth."
Note: when you want to change the 16-bit buffer to a 32-bit buffer, you can do it with this line of code:
mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
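This also matches the observed threshold, since the generator allocates 5 vertices per cell (terrainSize * terrainSize * 5 in total). A minimal sketch, assuming the same fields as the code above:

    // 114 * 114 * 5 = 64,980 vertices -> still fits a 16-bit index buffer (max 65,535)
    // 115 * 115 * 5 = 66,125 vertices -> exceeds the limit, so the higher indices can no longer
    //                                    be addressed, which shows up as overlapping triangles
    myMesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32; // set before assigning vertices/triangles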

How do I reverse an SVG curve built using an array in C#?

I'm new to C#/SVG and am attempting to reverse a relative (<90°) curve whose points exist in an array, float arcArray[2,4], while trying to keep everything in my ArcPlot class using System only and putting the actual SVG functions in a separate class.
This will produce the correct curve visually, but I need it to go in the opposite direction to append to an existing SVG string:
float [,] arcPoint = ArcPlot.arcPointsArray(StartAngle, SweepAngle, Radius, -RadiusOffset, Clockwise);
svgOut += " m " + arcPoint[0, 0] + " " + arcPoint[1, 0] + " c " + arcPoint[0, 1] + " " + arcPoint[1, 1] + " " + arcPoint[0, 2] + " " + arcPoint[1, 2] + " " + arcPoint[0, 3] + " " + arcPoint[1, 3];
This:
float [,] arcPoint = ArcPlot.reverseArcArray(ArcPlot.arcPointsArray(StartAngle, SweepAngle, Radius, -RadiusOffset, Clockwise));
svgOut += " m " + arcPoint[0, 0] + " " + arcPoint[1, 0] + " c " + arcPoint[0, 1] + " " + arcPoint[1, 1] + " " + arcPoint[0, 2] + " " + arcPoint[1, 2] + " " + arcPoint[0, 3] + " " + arcPoint[1, 3];
using this function:
public static float[,] reverseArcArray(float[,] ArcArray)
{
    float[,] arcArray = ArcArray;
    float[,] swapArray = new float[2, 4];
    swapArray[0, 0] = arcArray[0, 3] - arcArray[0, 3];
    swapArray[1, 0] = arcArray[1, 3] - arcArray[1, 3];
    swapArray[0, 1] = arcArray[0, 2] - arcArray[0, 3];
    swapArray[1, 1] = arcArray[1, 2] - arcArray[1, 3];
    swapArray[0, 2] = arcArray[0, 1] - arcArray[0, 3];
    swapArray[1, 2] = arcArray[1, 1] - arcArray[1, 3];
    swapArray[0, 3] = arcArray[0, 0] - arcArray[0, 3];
    swapArray[1, 3] = arcArray[1, 0] - arcArray[1, 3];
    return swapArray;
}
starts the curve in the right place (0,0), and the remaining three control points are close, but they are offset by something I'm overlooking. I'm assuming it's the difference between absolute and relative arcs and that I'm missing something simple like a deduction on the actual curve coordinates.
Brute forcing / trial and error is not working for me.
I first attempted this with ArcPlot.arcPointsArray(StartAngle + SweepAngle, SweepAngle, Radius, -RadiusOffset, !Clockwise) without luck either, and that would be the preferred method since it avoids reversing altogether, but again I'm obviously missing something. I would still like to figure out the reverse function as well, to better my understanding of relative SVG.
If it helps, this is the actual function I use to create the arc:
public static float[,] arcPointsArray(double StartAngle, double SweepAngle, double Radius, double RadiusOffset = 0d,
    bool Clockwise = false, float XCenter = 0f, float YCenter = 0f)
{
    double radius = Radius, startAngle = StartAngle, sweepAngle = SweepAngle, radiusOffset = RadiusOffset;
    bool arcClockwise = Clockwise;
    float xCenter = XCenter, yCenter = YCenter;
    double startRadiusAngle = arcClockwise ? startAngle - (pi / 2) : startAngle + (pi / 2);
    startRadiusAngle -= Convert.ToInt32(startRadiusAngle / (pi * 2)) * (pi * 2); // mathematical overcircle check
    sweepAngle -= Convert.ToInt32(sweepAngle / (pi * 2)) * (pi * 2);
    double toCenterAngle = arcClockwise ? startAngle + (pi / 2) : startAngle - (pi / 2);
    if (toCenterAngle > (pi * 2)) toCenterAngle -= pi * 2; // functional overcircle check
    if (toCenterAngle < 0) toCenterAngle += pi * 2;
    if (XCenter == 0f) xCenter = Convert.ToSingle(Math.Cos(toCenterAngle) * radius);
    if (YCenter == 0f) yCenter = Convert.ToSingle(Math.Sin(toCenterAngle) * radius);
    radius += radiusOffset;
    float[,] arcArray = new float[2, 4];
    arcArray[0, 0] = Convert.ToSingle(xCenter + (Math.Cos(startRadiusAngle) * radius)); // relocate start point
    arcArray[1, 0] = Convert.ToSingle(yCenter + (Math.Sin(startRadiusAngle) * radius));
    double circleFraction = pi * 2 / sweepAngle;
    double bezierLength = radius * 4 / 3 * Math.Tan(pi / (2 * circleFraction));
    arcArray[0, 1] = Convert.ToSingle(arcArray[0, 0] + (Math.Cos(startAngle) * bezierLength)) - arcArray[0, 0];
    arcArray[1, 1] = Convert.ToSingle(arcArray[1, 0] + (Math.Sin(startAngle) * bezierLength)) - arcArray[1, 0];
    double endRadiusAngle = arcClockwise ? startRadiusAngle + sweepAngle : startRadiusAngle - sweepAngle;
    if (endRadiusAngle > (pi * 2)) endRadiusAngle -= pi * 2;
    if (endRadiusAngle < 0) endRadiusAngle += pi * 2;
    arcArray[0, 3] = Convert.ToSingle(xCenter + (Math.Cos(endRadiusAngle) * radius)) - arcArray[0, 0];
    arcArray[1, 3] = Convert.ToSingle(yCenter + (Math.Sin(endRadiusAngle) * radius)) - arcArray[1, 0];
    double endAngle = arcClockwise ? endRadiusAngle - (pi / 2) : endRadiusAngle + (pi / 2);
    if (endAngle > (pi * 2d)) endAngle -= pi * 2;
    if (endAngle < 0d) endAngle += pi * 2;
    arcArray[0, 2] = Convert.ToSingle(arcArray[0, 3] + (Math.Cos(endAngle) * bezierLength));
    arcArray[1, 2] = Convert.ToSingle(arcArray[1, 3] + (Math.Sin(endAngle) * bezierLength));
    return arcArray;
}
I've seen similar questions in Python and JavaScript but don't understand the syntax or structure well enough to translate them.
I'm assuming the answer is simply a transposition, an incorrect assumption, or a math error, but if not, pseudocode would be preferred so that I can grasp the concept rather than cut/paste a solution.
The following gif shows a rotation issue I'm having because the inside relative arc is not being translated properly. I will deal with that separately; a previous attempt at rendering everything with absolute positioning (which no longer exists, since I didn't start using git until afterwards) didn't present this issue. The actual problem is that the inside arc can be rendered properly, but only in the wrong direction. When reversing it, either with the reversing method shown above or by using arcPointsArray to draw it backwards, those sections have to be identified and concatenated separately rather than in a loop, since they require slightly different handling. The idea is to eventually wrap the green line in a red line at a uniform distance, regardless of the starting angle, direction and scale.
https://imgur.com/a/6SiItuv
Why not just modify your call to arcPointsArray()? Does something like this work?
float[,] arcPoint = ArcPlot.arcPointsArray(StartAngle + SweepAngle,
                                           -SweepAngle,
                                           Radius,
                                           -RadiusOffset,
                                           !Clockwise);
This is the code I eventually used for reversing the relative cubic svg curve:
public static float[,] reverseArcArray(float[,] ArcArray)
{
    float[,] arcArray = ArcArray;
    float[,] swapArray = new float[2, 4];
    swapArray[0, 0] = 0f;
    swapArray[1, 0] = 0f;
    swapArray[0, 1] = arcArray[0, 2] - arcArray[0, 3];
    swapArray[1, 1] = arcArray[1, 2] - arcArray[1, 3];
    swapArray[0, 2] = arcArray[0, 1] - arcArray[0, 3];
    swapArray[1, 2] = arcArray[1, 1] - arcArray[1, 3];
    swapArray[0, 3] = -arcArray[0, 3];
    swapArray[1, 3] = -arcArray[1, 3];
    return swapArray;
}
My issue was a misunderstanding of the relationship between the first and last coordinates. The function, as described in the question, does the job correctly: it both reverses a relative curve and converts an absolute curve to relative while reversing.
Since I'm only dealing with relative curves, I can discard the first coordinate pair, since it will always be 0,0, and it can be overwritten with a starting location as needed.
The solution reached in Paul's answer reveals this to be an XY problem: re-evaluating how I use the arcPointsArray method eliminates the need for the reverseArcArray method.
I've left this answer so that anyone actually searching for the Y problem doesn't get stuck with just the X solution.
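For illustration, a minimal usage sketch (it simply mirrors the call pattern from the question; the variable names are the ones used above):

    // For a relative cubic with control points c1, c2 and endpoint e (all relative to the
    // segment start), the reversed segment is c (c2 - e) (c1 - e) (-e), which is exactly
    // what reverseArcArray returns in columns 1..3.
    float[,] fwd = ArcPlot.arcPointsArray(StartAngle, SweepAngle, Radius, -RadiusOffset, Clockwise);
    float[,] rev = ArcPlot.reverseArcArray(fwd);
    // The leading " m " move-to is omitted here; the reversed segment starts wherever the
    // previous path command left off.
    svgOut += " c " + rev[0, 1] + " " + rev[1, 1] + " "
                    + rev[0, 2] + " " + rev[1, 2] + " "
                    + rev[0, 3] + " " + rev[1, 3];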

Fixing extruded mesh normals in Unity C#?

In Unity C#, I'm using procedural mesh extrusion based on a flat polygon's 2D vector points. This works great with the code below, except for one detail: seemingly every second triangle connecting the front to the back side of the extruded mesh is flipped, as the image shows (using a double-sided shader, I can see that all the triangles do exist, though). How would I fix the flip of these bridging normals? Thanks!
// Extrusion functionality via
// https://forum.unity.com/threads/trying-extrude-a-2d-polygon-to-create-a-mesh.102629/
// with Triangulator based on
// http://wiki.unity3d.com/index.php?title=Triangulator
public static Mesh GetExtrudedMeshFromPoints(Vector2[] points, float depth)
{
    const float frontVertex = 0f;
    Triangulator triangulator = new Triangulator(points);
    int[] tris = triangulator.Triangulate();
    Mesh m = new Mesh();
    Vector3[] vertices = new Vector3[points.Length * 2];
    for (int i = 0; i < points.Length; i++)
    {
        vertices[i].x = points[i].x;
        vertices[i].y = points[i].y;
        vertices[i].z = frontVertex;
        vertices[i + points.Length].x = points[i].x;
        vertices[i + points.Length].y = points[i].y;
        vertices[i + points.Length].z = depth;
    }
    int[] triangles = new int[tris.Length * 2 + points.Length * 6];
    int count_tris = 0;
    // Front vertices
    for (int i = 0; i < tris.Length; i += 3)
    {
        triangles[i] = tris[i];
        triangles[i + 1] = tris[i + 1];
        triangles[i + 2] = tris[i + 2];
    }
    count_tris += tris.Length;
    // Back vertices
    for (int i = 0; i < tris.Length; i += 3)
    {
        triangles[count_tris + i] = tris[i + 2] + points.Length;
        triangles[count_tris + i + 1] = tris[i + 1] + points.Length;
        triangles[count_tris + i + 2] = tris[i] + points.Length;
    }
    count_tris += tris.Length;
    // Triangles around the perimeter of the object
    for (int i = 0; i < points.Length; i++)
    {
        int n = (i + 1) % points.Length;
        triangles[count_tris] = i;
        triangles[count_tris + 1] = i + points.Length;
        triangles[count_tris + 2] = n;
        triangles[count_tris + 3] = n;
        triangles[count_tris + 4] = n + points.Length;
        triangles[count_tris + 5] = i + points.Length;
        count_tris += 6;
    }
    m.vertices = vertices;
    m.triangles = triangles;
    m.RecalculateNormals();
    m.RecalculateBounds();
    m.Optimize();
    return m;
}
If a face isn't shown, it is, as you already noticed, flipped. This is due to the winding order, which in Unity is clockwise. The link to your Triangulator already states under Troubleshooting:
"If you can't see a polygon created with this utility, remember to check if the polygon is facing the opposite direction. If it is, you can change that by constructing your mesh with the vertex indices in reverse order."
EDIT:
For further clarification: in your code this would mean you have to change
// Triangles around the perimeter of the object
for (int i = 0; i < points.Length; i++)
{
    int n = (i + 1) % points.Length;
    triangles[count_tris] = i;
    triangles[count_tris + 1] = i + points.Length;
    triangles[count_tris + 2] = n;
    triangles[count_tris + 3] = n;
    triangles[count_tris + 4] = n + points.Length;
    triangles[count_tris + 5] = i + points.Length;
    count_tris += 6;
}
to
// Triangles around the perimeter of the object
for (int i = 0; i < points.Length; i++)
{
    int n = (i + 1) % points.Length;
    triangles[count_tris] = n;
    triangles[count_tris + 1] = i + points.Length;
    triangles[count_tris + 2] = i;
    triangles[count_tris + 3] = n;
    triangles[count_tris + 4] = n + points.Length;
    triangles[count_tris + 5] = i + points.Length;
    count_tris += 6;
}
But be careful, because the correct order depends on your depth (whether it is higher or lower than your frontVertex).
2nd EDIT:
The normal of a triangle depends on the winding order, which means the order of the indices matters.
An example:
1: Vector2(1f, 1f);
2: Vector2(1f, 0f);
3: Vector2(0f, 0f);
the triangles
Triangle 1,2,3
and
Triangle 1,3,2
have different normals.
You have to make sure that the winding order is consistent for every triangle you draw.
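To make that concrete, here is a minimal sketch that checks the facing of the example triangle with Unity's Vector3.Cross (the points are the Vector2 values above, lifted to z = 0):

    // The cross product of two edges gives the face normal; swapping two indices flips it.
    Vector3 v1 = new Vector3(1f, 1f, 0f);
    Vector3 v2 = new Vector3(1f, 0f, 0f);
    Vector3 v3 = new Vector3(0f, 0f, 0f);

    Vector3 n123 = Vector3.Cross(v2 - v1, v3 - v1); // (0, 0, -1) for order 1,2,3
    Vector3 n132 = Vector3.Cross(v3 - v1, v2 - v1); // (0, 0, 1) for order 1,3,2 -- opposite direction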
Alternatively you could tell your shader to disable culling (with Cull Off), as you already said above. But this comes at a cost in computation time, which in most cases shouldn't matter that much, although it always depends on your purpose. Besides, most of the time you don't want the side effects of disabling culling.

How to fix wrong procedural grid with normalized values in Unity

I have a problem with normalized values when creating a procedural grid in Unity. I have been following the great tutorial from catlikecoding and I bumped into weird behavior when I tried to use normalized values for my vertices. For some xSize and ySize grid combinations everything works, but for other combinations the mesh gets deformed. Let me give you a couple of examples:
xSize = 35; ySize = 25; // OK
xSize = 350; ySize = 250; // NOT OK
xSize = 150; ySize = 250; // OK
xSize = 350; ySize = 200; // NOT OK
xSize = 1000; ySize = 750; // NOT OK
I illustrated the first two cases with a sphere placed at every 10th vertex.
35x25 case
350x250 case
I am using Unity3d 2018.3
private void Generate()
{
    GetComponent<MeshFilter>().mesh = mesh = new Mesh();
    mesh.name = "Procedural Grid";
    vertices = new Vector3[(xSize + 1) * (ySize + 1)];
    Vector2[] uv = new Vector2[vertices.Length];
    float multX = 1 / (float)xSize;
    float multY = 1 / (float)ySize;
    for (int i = 0, y = 0; y <= ySize; y++)
    {
        for (int x = 0; x <= xSize; x++, i++)
        {
            //vertices[i] = new Vector3(x, y);
            var xNormalized = x * multX;
            var yNormalized = y * multY;
            vertices[i] = new Vector3(xNormalized, yNormalized);
            uv[i] = new Vector2(xNormalized, yNormalized);
        }
    }
    mesh.vertices = vertices;
    mesh.uv = uv;
    var triangles = new int[xSize * ySize * 6];
    for (int ti = 0, vi = 0, y = 0; y < ySize; y++, vi++)
    {
        for (int x = 0; x < xSize; x++, ti += 6, vi++)
        {
            triangles[ti] = vi;
            triangles[ti + 3] = triangles[ti + 2] = vi + 1;
            triangles[ti + 4] = triangles[ti + 1] = vi + xSize + 1;
            triangles[ti + 5] = vi + xSize + 2;
        }
    }
    mesh.triangles = triangles;
    mesh.RecalculateNormals();
}
I expect the mesh to be 1x1 in every case, no matter which xSize or ySize I use for the grid. Can anybody advise how to achieve that?
So my friend explained to me that by default meshes have a 65535 vertex limit in Unity, and I have to ask explicitly if I want more.
I had to add
mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
after
mesh.name = "Procedural Grid";
(see the Mesh.indexFormat documentation for more).
Suddenly everything works as expected. Thank you all for the support.
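The failing cases line up with that limit, since the grid allocates (xSize + 1) * (ySize + 1) vertices and the default 16-bit index buffer can only address 65535 of them:

35 x 25   ->   36 *  26 =     936 vertices // OK
150 x 250 ->  151 * 251 =  37,901 vertices // OK
350 x 200 ->  351 * 201 =  70,551 vertices // NOT OK (over 65,535)
350 x 250 ->  351 * 251 =  88,101 vertices // NOT OK
1000 x 750 -> 1001 * 751 = 751,751 vertices // NOT OK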
