Two different Vector3.zero's, but no parent? - C#

UPDATE
I found out that the center of the mesh on the mesh object is not at (0, 0, 0). Does that matter?
I have the following problem. I am generating a terrain from Perlin noise, and that works fine. However, as soon as I try to instantiate any objects on it, some are spawned in the terrain area and some completely outside it. When I reset an instantiated object's transform, it teleports to (0,0,0) as expected, but when I reset another object that was not instantiated at runtime, its (0,0,0) is at a completely different location! I have no parent set to these objects and no parent set to the other object as well. Below is my code for generating the objects:
private void AddRocks(Terrain terrain, int count)
{
    for (int i = 0; i < count; i++)
    {
        float randX = Random.Range(0, 256); // 256 is my terrain size; the terrain's transform position is all zeros and its scale is 1
        float randZ = Random.Range(0, 256);
        GameObject newGameObject = Instantiate(rockPrefab,
            new Vector3(randX, terrain.terrainData.GetHeight((int)randX, (int)randZ), randZ),
            Quaternion.identity);
    }
}
This is my code for generating the Perlin noise terrain:
TerrainData GenerateTerrain(TerrainData terrainData)
{
    terrainData.heightmapResolution = width + 1;
    terrainData.size = new Vector3(width, depth, height);
    terrainData.SetHeights(0, 0, GenerateHeights());
    return terrainData;
}

float[,] GenerateHeights()
{
    float[,] heights = new float[width, height];
    for (int x = 0; x < width; x++)
    {
        for (int y = 0; y < height; y++)
        {
            heights[x, y] = CalculateHeight(x, y);
        }
    }
    return heights;
}

float CalculateHeight(int x, int y)
{
    float xCoord = (float)x / width * scale + offsetX;
    float yCoord = (float)y / height * scale + offsetY;
    return Mathf.PerlinNoise(xCoord, yCoord);
}
This is how I call them in Start:
terrain.terrainData = GenerateTerrain(terrain.terrainData);
AddRocks(terrain: terrain, count: 20);
This is how it looks after generating:
This is how the rocks look:
The rocks are generated from a script that sits on the main terrain itself.

I have no parent set to these objects and no parent set to the other object as well.
Actually, you do set a parent:
GameObject newGameObject = Instantiate(rockPrefab,
    new Vector3(randX, terrain.terrainData.GetHeight((int)randX, (int)randZ), randZ),
    Quaternion.identity, rockHolder.transform);
The last parameter (rockHolder.transform) is the transform to which the instantiated object will be attached, and the position you pass becomes the localPosition of the instantiated object relative to that parent (rockHolder).
But I don't see the rockHolder object in the hierarchy view screenshot. It seems like rockHolder.transform is null; in other words, it's not initialized. So when you call Instantiate(...) and pass rockHolder.transform as the desired parent for the rocks, it is null, so Unity spawns the objects and assigns them no parent.
I can't tell if this is the root of the problem, but it's certainly not okay anyway.
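A minimal sketch of one way to clean this up, assuming a rockHolder field like the one in the quoted call (creating the holder here, and its name, are my assumptions, not the asker's code): make sure the holder exists and keeps an identity transform before it is used as a parent, so the rocks' local and world positions coincide.
private GameObject rockHolder; // assumed field, matching the name in the quoted call

private void AddRocks(Terrain terrain, int count)
{
    if (rockHolder == null)
    {
        // A freshly created GameObject sits at (0, 0, 0) with scale 1, so parenting to it is harmless.
        rockHolder = new GameObject("RockHolder");
    }

    for (int i = 0; i < count; i++)
    {
        float randX = Random.Range(0, 256);
        float randZ = Random.Range(0, 256);
        float y = terrain.terrainData.GetHeight((int)randX, (int)randZ);
        Instantiate(rockPrefab, new Vector3(randX, y, randZ), Quaternion.identity, rockHolder.transform);
    }
}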

Related

Raycast not capturing all Vector coordinates

I have a GameObject that occupies the whole screen, just for testing purposes; I'm drawing a line on it. What I'm trying to achieve is that if the mouse position hits a GameObject, its Vector2 coordinates are stored in a list. But the raycast is not storing all the coordinates. Below is my code:
private void Update()
{
    if (Input.GetMouseButton(0))
    {
        Vector2 mousePos = Input.mousePosition;
        Vector2 Pos = _camera.ScreenToWorldPoint(mousePos);
        if (!mousePositions.Contains(Pos))
            mousePositions.Add(Pos);

        if (Physics.Raycast(Camera.main.ScreenPointToRay(mousePos), out RaycastHit hit))
        {
            Vector2 textureCoord = hit.textureCoord;
            int pixelX = (int)(textureCoord.x * _templateDirtMask.width);
            int pixelY = (int)(textureCoord.y * _templateDirtMask.height);
            Vector2Int paintPixelPosition = new Vector2Int(pixelX, pixelY);
            if (!linePositions.Contains(paintPixelPosition))
                linePositions.Add(paintPixelPosition);

            foreach (Vector2Int pos in linePositions)
            {
                int pixelXOffset = pos.x - (_brush.width / 2);
                int pixelYOffset = pos.y - (_brush.height / 2);
                for (int x = 0; x < _brush.width; x++)
                {
                    for (int y = 0; y < _brush.height; y++)
                    {
                        _templateDirtMask.SetPixel(
                            pixelXOffset + x,
                            pixelYOffset + y,
                            Color.black
                        );
                    }
                }
            }
            _templateDirtMask.Apply();
        }
    }
}
Every time I check the element counts, mousePositions always has more elements than linePositions. I don't know what's causing this.
mousePositions always has more elements than linePositions
Well, it is quite simple: in
int pixelX = (int)(textureCoord.x * _templateDirtMask.width);
int pixelY = (int)(textureCoord.y * _templateDirtMask.height);
you are casting to int, which cuts off the decimals (basically like doing Mathf.FloorToInt).
So you can easily have multiple mouse positions that result in float pixel positions like, e.g.,
1.2, 1.2
1.4, 1.7
1.02, 1.93
...
all these will map to
Vector2Int paintPixelPosition = new Vector2Int(1, 1);
Besides, you might want to look at better line-drawing algorithms, e.g. this simple one (a sketch of the idea follows below).
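For reference, a minimal sketch of such a line walk (my own illustration, not code from the question or the linked page; it assumes the usual UnityEngine and System.Collections.Generic usings): it yields every integer pixel between two sampled positions, so the painted stroke has no gaps even when the mouse moves quickly.
// Bresenham-style walk over all integer pixels between two points.
private static IEnumerable<Vector2Int> LinePixels(Vector2Int from, Vector2Int to)
{
    int dx = Mathf.Abs(to.x - from.x), sx = from.x < to.x ? 1 : -1;
    int dy = -Mathf.Abs(to.y - from.y), sy = from.y < to.y ? 1 : -1;
    int err = dx + dy;
    Vector2Int current = from;
    while (true)
    {
        yield return current;
        if (current == to) yield break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; current.x += sx; }
        if (e2 <= dx) { err += dx; current.y += sy; }
    }
}
You would call this between the previously painted pixel and the new paintPixelPosition and paint every pixel it returns.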
And then note that calling SetPixel repeatedly is quite expensive. You want to do a single SetPixels call, e.g.:
var pixels = _templateDirtMask.GetPixels();
foreach (Vector2Int pos in linePositions)
{
    int pixelXOffset = pos.x - (_brush.width / 2);
    int pixelYOffset = pos.y - (_brush.height / 2);
    for (int x = 0; x < _brush.width; x++)
    {
        for (int y = 0; y < _brush.height; y++)
        {
            pixels[(pixelXOffset + x) + (pixelYOffset + y) * _templateDirtMask.width] = Color.black;
        }
    }
}
_templateDirtMask.SetPixels(pixels);
_templateDirtMask.Apply();
It happens because there really can be a case where several elements from mousePositions are associated with one element in linePositions.
Rough example: your texture resolution is only 1x1 px. In this case your linePositions will contain only one element, and this element will be associated with all elements from mousePositions.
So, the ratio of the number of elements in these lists depends on the ratio between your texture and screen resolutions.

Getting weird results from terrain generation system using 2D Perlin noise

I am trying to make a terrain generation system in Unity, similar to Minecraft's, but using Unity's Perlin noise function (so only 2D noise).
So I have a 16x16x16 chunk with a Vector2Int that holds its position (so if x and z are 0, the blocks inside span 0 to 16 in world coordinates).
This is how I'm trying to generate the height map of a chunk:
public void generate(float scale) {
    GameObject root = new GameObject("Root");
    // this.z & this.x are the chunk coordinates, size is 16
    for (int z = this.z * size; z < (this.z + size); ++z) {
        for (int x = this.x * size; x < (this.x + size); ++x) {
            float[] coord = new float[2] { (float)x / size * scale, (float)z / size * scale };
            Debug.LogFormat("<color='blue'>Perlin coords |</color> x: {0}; y: {1}", coord[0], coord[1]);
            float value = Mathf.PerlinNoise(coord[0], coord[1]);

            // temporary
            GameObject Cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            Cube.transform.position = new Vector3(x, value, z);
            Cube.transform.parent = root.transform;
        }
    }
    return;
}
The results are... bad. See for yourself:
What can I do?
It looks good; it just looks scrunched on the Y axis.
float value = Mathf.PerlinNoise(coord[0], coord[1]);
This is going to give you problems. Mathf.PerlinNoise samples a continuous noise field at (coord[0], coord[1]) and returns a float between 0.0 and 1.0 (it can overshoot that range slightly), so using that value directly as a cube height will never produce well-aligned, grid-snapped cubes.
You're better off doing something like:
int numTilesHigh = Random.Range(0, 15);
for (int i = 0; i < numTilesHigh; i++) {
    GameObject Cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
    Cube.transform.position = new Vector3(x, <cube height> * i, z);
    Cube.transform.parent = root.transform;
}
PS: I kind of like your screenshot, not in a Minecraft way, but it does look cool :-)
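If the goal is to keep the Perlin-based shape but still get grid-aligned cubes, another option (my own sketch, not part of the answer above; maxHeight is an assumed constant, and x, z, root and coord come from the generate method in the question) is to quantize the noise into an integer column height and stack unit cubes up to it:
const int maxHeight = 16; // assumed maximum column height
float noise = Mathf.PerlinNoise(coord[0], coord[1]);     // roughly 0..1
int columnHeight = Mathf.FloorToInt(noise * maxHeight);  // 0..15, snapped to the grid
for (int yCube = 0; yCube <= columnHeight; yCube++)
{
    GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
    cube.transform.position = new Vector3(x, yCube, z); // default cubes are 1 unit tall
    cube.transform.parent = root.transform;
}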

Using Perlin Noise across multiple Unity Terrain objects

I have a class project in which we are supposed to use Unity's 3D Terrain objects to create a 3x3 grid of smoothly generated terrain. For this, we have been told to create a central Terrain with adjacent terrains in the 8 surrounding directions. I have gotten the Perlin noise to work through this method:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TerrainNoiseGeneration : MonoBehaviour
{
    private TerrainData myTerrainData;
    public Vector3 worldSize;
    public int resolution = 129;
    private float userInput = (float)4.2;
    public float offsetX;
    public float offsetZ;

    // Start is called before the first frame update
    void Start()
    {
        myTerrainData = gameObject.GetComponent<TerrainCollider>().terrainData;
        worldSize = new Vector3(100, 50, 100);
        myTerrainData.size = worldSize;
        myTerrainData.heightmapResolution = resolution;
        float[,] heightArray = new float[resolution, resolution];
        heightArray = PerlinNoise(userInput, offsetX, offsetZ);
        myTerrainData.SetHeights(0, 0, heightArray);
    }

    // Update is called once per frame
    void Update()
    {
        float[,] heightArray = new float[resolution, resolution];
        heightArray = PerlinNoise(userInput, offsetX, offsetZ);
        myTerrainData.SetHeights(0, 0, heightArray);
    }

    float[,] PerlinNoise(float userInput, float offsetX, float offsetZ)
    {
        float[,] heights = new float[resolution, resolution];
        for (int z = 0; z < resolution; z++)
        {
            for (int x = 0; x < resolution; x++)
            {
                float nx = (x + offsetX) / resolution * userInput;
                float ny = (z + offsetZ) / resolution * userInput;
                heights[z, x] = Mathf.PerlinNoise(nx, ny);
            }
        }
        return heights;
    }
}
This code lets me generate a smooth terrain in the first Terrain object, but when I enter offset values so that the edges line up, the edges do not end up with the same heights.
I would appreciate any assistance on this issue, as I have tried a lot of different solutions, none of which have worked.
Update: I was able to solve the problem with a rather simple fix: I needed to use my heightmap resolution as the offset, not the world-space distance between the terrains.
I needed to set offsetX and offsetZ to each terrain's position in heightmap-resolution units instead of its Unity world position.
For example, my terrains are 100x100, so I was setting the offset to 100 or -100 depending on the terrain's location, but instead I needed to use 128 or -128 to keep it in line with the resolution (see the sketch below).
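A minimal sketch of how that can be wired up, assuming each terrain knows its integer (gridX, gridZ) position in the 3x3 layout (those two fields are my assumption, not part of the original script):
// Offsets are expressed in heightmap samples, not world units.
public int gridX; // -1, 0 or 1 in the 3x3 layout (assumed field)
public int gridZ; // -1, 0 or 1 in the 3x3 layout (assumed field)

void Start()
{
    // Neighbouring heightmaps share an edge every (resolution - 1) samples: 128 for resolution = 129.
    offsetX = gridX * (resolution - 1);
    offsetZ = gridZ * (resolution - 1);

    myTerrainData = GetComponent<TerrainCollider>().terrainData;
    myTerrainData.heightmapResolution = resolution;
    myTerrainData.size = new Vector3(100, 50, 100);
    myTerrainData.SetHeights(0, 0, PerlinNoise(userInput, offsetX, offsetZ));
}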

How can I create a plane out of many meshes?

What I want to do is to extrude a mesh plane.
The plane is shown in red in the scene view. Each grid cell is made of two triangles.
First, I don't understand what Res X and Res Z are for.
What I want to create first is a plane built from vertices and triangles, of size 16x16 or any other size given by height (length should be the height) and width.
But after I set all the properties to 16, the plane is built from 15x15 cells, not 16x16.
And my main goal now is to extrude the plane. I mean to use OnMouseDown, and on a click on the plane find the closest vertex (and its neighbours) to where I clicked, then extrude that vertex or vertices. By extrude I mean, for example, changing the clicked vertices' position along the Z axis only.
Something like the idea in this image, marked with a red circle:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class meshPlane : MonoBehaviour
{
    public int length;
    public int width;
    public int resX;
    public int resZ;

    private MeshFilter meshf;
    private Mesh mesh;
    private Vector3[] vertices;

    private void Start()
    {
        GenerateOrigin();
    }

    private void GenerateOrigin()
    {
        // You can change that line to provide another MeshFilter
        meshf = GetComponent<MeshFilter>();
        mesh = new Mesh();
        meshf.mesh = mesh;
        mesh.Clear();

        #region Vertices
        vertices = new Vector3[resX * resZ];
        for (int z = 0; z < resZ; z++)
        {
            // [ -length / 2, length / 2 ]
            float zPos = ((float)z / (resZ - 1) - .5f) * length;
            for (int x = 0; x < resX; x++)
            {
                // [ -width / 2, width / 2 ]
                float xPos = ((float)x / (resX - 1) - .5f) * width;
                vertices[x + z * resX] = new Vector3(xPos, 0f, zPos);
            }
        }
        #endregion

        #region Normales
        Vector3[] normales = new Vector3[vertices.Length];
        for (int n = 0; n < normales.Length; n++)
            normales[n] = Vector3.up;
        #endregion

        #region UVs
        Vector2[] uvs = new Vector2[vertices.Length];
        for (int v = 0; v < resZ; v++)
        {
            for (int u = 0; u < resX; u++)
            {
                uvs[u + v * resX] = new Vector2((float)u / (resX - 1), (float)v / (resZ - 1));
            }
        }
        #endregion

        #region Triangles
        int nbFaces = (resX - 1) * (resZ - 1);
        int[] triangles = new int[nbFaces * 6];
        int t = 0;
        for (int face = 0; face < nbFaces; face++)
        {
            // Retrieve lower left corner from face ind
            int i = face % (resX - 1) + (face / (resZ - 1) * resX);

            triangles[t++] = i + resX;
            triangles[t++] = i + 1;
            triangles[t++] = i;

            triangles[t++] = i + resX;
            triangles[t++] = i + resX + 1;
            triangles[t++] = i + 1;
        }
        #endregion

        mesh.vertices = vertices;
        mesh.normals = normales;
        mesh.uv = uvs;
        mesh.triangles = triangles;
        mesh.RecalculateBounds();
    }
}
When you say "the plane is built from 15x15 meshes" you mean the plane is built from 15x15 squares. That whole plane is the mesh.
ResX and ResZ are how many vertices there are in each direction. You get one less square than vertices because the first square needs two edges; each square you add would also need two, but it can share an edge with the previous one, so it only needs one more.
To make your mesh clickable, you need to add a MeshCollider to your GameObject and assign the mesh you generate to it. Then you can use the Camera class to get a ray, pass that to a raycast, and if the raycast hits anything you can use the triangle index together with the triangles array you created to get the three vertices of the triangle that was hit. In addition, you can check which weight in the barycentric coordinates is largest to know which exact vertex your click was closest to. And finally, now that you have the exact vertex, you can modify its height (see the sketch below).
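A hedged sketch of that approach, assuming it lives in the meshPlane class above, that a MeshCollider with the generated mesh assigned has been added to the GameObject, and that the extrusion amount of 1 unit along Z is an arbitrary assumed value:
private void OnMouseDown()
{
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    if (!Physics.Raycast(ray, out RaycastHit hit)) return;

    MeshCollider meshCollider = hit.collider as MeshCollider;
    if (meshCollider == null || meshCollider.sharedMesh != mesh) return;

    // The largest barycentric weight tells us which corner of the hit triangle the click was closest to.
    Vector3 bary = hit.barycentricCoordinate;
    int corner = bary.x >= bary.y && bary.x >= bary.z ? 0 : (bary.y >= bary.z ? 1 : 2);
    int vertexIndex = mesh.triangles[hit.triangleIndex * 3 + corner];

    vertices[vertexIndex].z += 1f; // assumed extrusion amount
    mesh.vertices = vertices;
    mesh.RecalculateNormals();
    mesh.RecalculateBounds();

    // Refresh the collider so later clicks see the updated shape.
    meshCollider.sharedMesh = null;
    meshCollider.sharedMesh = mesh;
}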

XNA is pulling the wrong texture?

I am making (another) Minecraft clone, and I've run into an interesting problem. I have a public enum that lists all the cube types a particular cube can be, and I have a 3D array that holds cubes. Each cube has a specific type, and I iterate through this array to get the vertices for each cube, then pass those vertices to a vertex buffer designated for that particular cube type. When I create a random array of cubes, or a single cube, and tell it what texture it should be, everything draws as expected. I'm now trying to figure out how to draw a random "surface" of grass cubes and fill everything below them on the y-axis with dirt cubes. The strangest thing is happening, though: the topmost cube is dirt, and all the cubes below it are filled with grass! When I disable the loop that fills the underground with dirt, the topmost cube displays grass as intended.
Here are what I believe to be the relevant parts of the code. This is where the cube type is set:
// Create a random surface level
Perlin perlin = new Perlin();
for (int x = 0; x < Game.ChunkWidth_X; x++)
{
    for (int z = 0; z < Game.ChunkDepth_Z; z++)
    {
        double XVal = Convert.ToDouble(x) * 1.1;
        double ZVal = Convert.ToDouble(z) * 1.1;
        double YVal = Game.ChunkHeight_Y / 2 * 1.1;
        double PerlinValue = perlin.GetValue(XVal, YVal, ZVal);
        int YVal_new = Convert.ToInt32(YVal + (PerlinValue * 10));
        if (YVal_new > Game.ChunkHeight_Y - 1) { YVal_new = Game.ChunkHeight_Y - 1; }
        if (YVal_new < 0) { YVal_new = 0; }

        // Set the grass cube
        Cube NewCube = new Cube(new Vector3(0.5f, 0.5f, 0.5f), new Vector3(x, YVal_new, z));
        NewCube.cubeType = CubeType.Grass;
        CubeGrid[x, YVal_new, z] = NewCube;

        // Fill below it with dirt
        for (int y = YVal_new - 1; y >= 0; y--)
        {
            Cube NewCube2 = new Cube(new Vector3(0.5f, 0.5f, 0.5f), new Vector3(x, y, z));
            NewCube2.cubeType = CubeType.Dirt;
            CubeGrid[x, y, z] = NewCube2;
        }

        // Fill above it with air
        for (int y = YVal_new + 1; y < Game.ChunkHeight_Y; y++)
        {
            Cube NewCube2 = new Cube(new Vector3(0.5f, 0.5f, 0.5f), new Vector3(x, y, z));
            NewCube2.cubeType = CubeType.Air;
            CubeGrid[x, y, z] = NewCube2;
        }
    }
}
This is where I pull the vertices to put into the appropriate buffer:
Dictionary<CubeType, List<VertexPositionNormalTexture>> DrawableVertices = new Dictionary<CubeType, List<VertexPositionNormalTexture>>();

// Get the proper vertices for each cube type and put in the appropriate dictionary
for (int x = 0; x < Game.ChunkWidth_X; x++)
{
    for (int z = 0; z < Game.ChunkDepth_Z; z++)
    {
        for (int y = 0; y < Game.ChunkHeight_Y; y++)
        {
            CubeGrid[x, y, z].CreateVertices();
            string test = CubeGrid[x, y, z].cubeType.ToString();
            foreach (VertexPositionNormalTexture TargetVertex in CubeGrid[x, y, z].DisplayableVertices)
            {
                if (!DrawableVertices.ContainsKey(CubeGrid[x, y, z].cubeType))
                {
                    List<VertexPositionNormalTexture> NewList = new List<VertexPositionNormalTexture>();
                    NewList.Add(TargetVertex);
                    DrawableVertices.Add(CubeGrid[x, y, z].cubeType, NewList);
                }
                else
                {
                    DrawableVertices[CubeGrid[x, y, z].cubeType].Add(TargetVertex);
                }
            }
        }
    }
}
Here is the second part of it:
foreach (KeyValuePair<CubeType, List<VertexPositionNormalTexture>> KVP in DrawableVertices)
{
    VertexBuffer cubeBuffer = new VertexBuffer(device, typeof(VertexPositionNormalTexture), KVP.Value.Count, BufferUsage.WriteOnly);
    cubeBuffer.SetData(KVP.Value.ToArray());

    // Update our collection of vertex buffers
    CubeType_VertexBuffers[KVP.Key] = cubeBuffer;

    // Get the triangle count for the buffer
    CubeType_TriangleCount[KVP.Key] = KVP.Value.Count / 3;
}
Lastly, here is my draw:
// Go through each vertex buffer we have created, and draw it.
foreach (KeyValuePair<CubeType, VertexBuffer> KVP in CubeType_VertexBuffers)
{
    foreach (EffectPass pass in testEffect.CurrentTechnique.Passes)
    {
        if (CubeType_TriangleCount[KVP.Key] > 0) // if this buffer has triangles, draw it.
        {
            pass.Apply();
            testEffect.View = camera.ViewMatrix;
            testEffect.TextureEnabled = true;
            testEffect.Projection = camera.ProjectionMatrix;
            testEffect.World = worldMatrix;
            testEffect.Texture = CubeType_Texture[KVP.Key];
            device.SetVertexBuffer(KVP.Value);
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, CubeType_TriangleCount[KVP.Key]);
        }
    }
}
base.Draw(gameTime);
The weirdest thing is that when I manually set cube types everything draws with the proper texture as expected. What other things should I try to troubleshoot? I tried making a specific effect for each cube type to no avail.
After trying a bunch of random things in desperation, I found a fix for this. It turns out that if you use the same BasicEffect for different textures, it only uses the last texture assigned to it. I was iterating through a list of VertexBuffers and assigning a different texture for each one. By the time everything made it over to the video card, only the last texture used was rendered, or so it appears.
The solution was to create a separate BasicEffect for each texture I needed and to draw each VertexBuffer with the BasicEffect that owns its texture (see the sketch below).
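A hedged sketch of that fix (my own illustration, not the original code; it assumes CubeType_Texture is a Dictionary<CubeType, Texture2D> like the other lookups): build one BasicEffect per cube type up front, then draw each vertex buffer with the effect that owns its texture, setting parameters before pass.Apply().
// One BasicEffect per cube type, each with its texture assigned once (e.g. at load time).
Dictionary<CubeType, BasicEffect> cubeTypeEffects = new Dictionary<CubeType, BasicEffect>();
foreach (KeyValuePair<CubeType, Texture2D> kvp in CubeType_Texture)
{
    BasicEffect effect = new BasicEffect(device);
    effect.TextureEnabled = true;
    effect.Texture = kvp.Value;
    cubeTypeEffects[kvp.Key] = effect;
}

// In Draw: update the matrices on this type's effect, then apply and draw.
foreach (KeyValuePair<CubeType, VertexBuffer> kvp in CubeType_VertexBuffers)
{
    if (CubeType_TriangleCount[kvp.Key] <= 0) continue;

    BasicEffect effect = cubeTypeEffects[kvp.Key];
    effect.View = camera.ViewMatrix;
    effect.Projection = camera.ProjectionMatrix;
    effect.World = worldMatrix;

    device.SetVertexBuffer(kvp.Value);
    foreach (EffectPass pass in effect.CurrentTechnique.Passes)
    {
        pass.Apply(); // parameters set above take effect here, before the draw call
        device.DrawPrimitives(PrimitiveType.TriangleList, 0, CubeType_TriangleCount[kvp.Key]);
    }
}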
