I've been working on a game for the Team Seas game jam, and I'm trying to make a boat that actually floats, but I've run into an issue. The way the current script works: it checks the position of some points on the boat, and if a point is below the y level of the water plane, it adds an upward force at that point. Here's that code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Floater : MonoBehaviour
{
    public float depthBeforeSubmerged = 1f;
    public float displacementAmount = 3f;
    public Transform[] floatPoints;
    public Transform Water;

    private Rigidbody rb;

    private void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        foreach (Transform t in floatPoints)
        {
            if (t.position.y < Water.position.y)
            {
                // if the point is below the water, add an upward force at that point
                float displacementMultiplier = Mathf.Clamp01(-t.position.y / depthBeforeSubmerged)
                    * displacementAmount;
                rb.AddForceAtPosition(
                    new Vector3(0f, Mathf.Abs(Physics.gravity.y) * displacementMultiplier, 0f),
                    t.position, ForceMode.Acceleration);
            }
        }
    }
}
This code works fine, but the issue is that when I add waves to the water plane, the boat starts floating under the water. The wave shader displaces the mesh vertices but not the plane's transform y position, so the script thinks the water is sitting still with no elevation changes. What I'm trying to do is find the vertices at the same x and z positions as the float points and read their y positions. Here's a diagram:
I'd do a raycast from the center of the cube straight up and add an upward force depending on the distance of that raycast, so the lower it gets underwater, the harder it gets pushed up. I don't know if there is any specific way to do this in the basic physics engine.
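If the waves come from a known formula (for example a sine wave the shader evaluates), one common fix is to evaluate that same formula on the CPU and compare each float point against the local wave height instead of the flat plane's y. A minimal sketch of the changed parts of the Floater class above, assuming hypothetical waveAmplitude / waveFrequency / waveSpeed fields that mirror whatever the shader uses:

```csharp
// Hypothetical parameters: these must match the shader's wave function exactly.
public float waveAmplitude = 0.5f;
public float waveFrequency = 1f;
public float waveSpeed = 1f;

// Sample the displaced water surface at an XZ position (same math as the shader).
float WaterHeightAt(Vector3 p)
{
    return Water.position.y
         + Mathf.Sin((p.x + p.z) * waveFrequency + Time.time * waveSpeed) * waveAmplitude;
}

void FixedUpdate()
{
    foreach (Transform t in floatPoints)
    {
        float waterY = WaterHeightAt(t.position);
        if (t.position.y < waterY)
        {
            // Measure submersion relative to the displaced surface, not y = 0.
            float depth = waterY - t.position.y;
            float multiplier = Mathf.Clamp01(depth / depthBeforeSubmerged) * displacementAmount;
            rb.AddForceAtPosition(Vector3.up * Mathf.Abs(Physics.gravity.y) * multiplier,
                                  t.position, ForceMode.Acceleration);
        }
    }
}
```

If the waves are computed entirely on the GPU with no closed-form function available, the alternative is to mirror the vertex displacement on the CPU and interpolate the y of the triangle under each float point.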
Problem
I have created a navigation system using Rigidbody and NavMesh. There is an agent that tries to follow me around. It has a capsule collider, a script (see below), a Rigidbody, and a NavMeshAgent component. It works by running the NavMesh calculations, getting the coordinates for the next place to move, disabling the NavMesh, and moving there using MovePosition(). This system works pretty well usually, but lately I have been experiencing problems. When the Agent moves around an obstacle, it pauses on the corners. I believe the reason for this is my code, especially my MovePosition() function, but I don't know how to fix it.
https://vimeo.com/731216013
Code
using UnityEngine;
using UnityEngine.AI;

public class PlayerController : MonoBehaviour
{
    public Rigidbody rb;
    public float m_Speed = 3f;
    public Transform goal;
    public NavMeshAgent navMesh;
    public float maxSpeed = 5f;

    void FixedUpdate()
    {
        MoveToNextPosition();
    }

    void MoveToNextPosition()
    {
        navMesh.enabled = true;
        if (navMesh.isOnNavMesh)
        {
            NavMeshPath path = new NavMeshPath();
            if (navMesh.CalculatePath(goal.position, path))
            {
                navMesh.path = path;
            }
        }
        Vector3 pos = navMesh.steeringTarget;
        navMesh.enabled = false;

        // Turn the next path corner into a movement direction
        Vector3 distance = pos - transform.position;
        Vector3 direction = distance.normalized;
        float speed = rb.velocity.magnitude;
        Debug.Log(direction);
        rb.MovePosition(transform.position + direction * Time.deltaTime * m_Speed);
    }
}
There is a long list of things that can go wrong with a NavMesh, hence I am posting this as an answer:
Open your navigation panel, if you can't find one or accidentally closed the tab, go to Window > AI > Navigation. The Navigation tab looks like this:
Open the Bake tab and set your agent's radius and height according to its dimensions. At the bottom, click to expand the Advanced options. There you'll find a field called Min Region Area; increasing its value increases the area around static obstacles where the NavMesh agent cannot move.
I'm trying to make an object on the ground follow a flying object, for example a drone leading a human (for now I'm just using shapes: a cube and a capsule). My cube follows the capsule as desired, but I want the cube to follow along on the ground only rather than go up the y-axis with the capsule. Right now it follows the capsule everywhere; I want the capsule to lead while the cube follows along on the ground.
I have done some research on Google and YouTube but have not found any results. Please let me know how I can achieve this.
This is the script attached to the cube (the ground object):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class follow_target : MonoBehaviour
{
    public Transform mTarget;

    float mSpeed = 10.0f;
    const float EPSILON = 0.1f;
    Vector3 mLookDirection;

    // Update is called once per frame
    void Update()
    {
        mLookDirection = (mTarget.position - transform.position).normalized;
        if ((transform.position - mTarget.position).magnitude > EPSILON)
            transform.Translate(mLookDirection * Time.deltaTime * mSpeed);
    }
}
If the ground is planar, you can just set the y component to 0 (or whatever the ground's y value is).
If the ground changes in topology, you can raycast down from the capsule to get the hit point (a Vector3) and use the hit point's y component for the height. After that you will need to set the cube's rotation so that it is aligned to the ground. You could do that with a raycast as well; there are a number of examples of that online.
I hope that helps get you in the right direction.
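A minimal sketch of that raycast idea, run after moving the cube on XZ, assuming the ground colliders live on a layer named "Ground" (hypothetical name):

```csharp
// Cast down from above the cube and snap it onto whatever ground is below.
Vector3 rayOrigin = transform.position + Vector3.up * 50f;
if (Physics.Raycast(rayOrigin, Vector3.down, out RaycastHit hit, 100f, LayerMask.GetMask("Ground")))
{
    Vector3 pos = transform.position;
    pos.y = hit.point.y;       // use the hit point's y for the height
    transform.position = pos;
    transform.up = hit.normal; // roughly align the cube to the ground slope
}
```

Setting transform.up directly is the crudest alignment and can spin the cube's yaw; Quaternion.FromToRotation applied on top of the current rotation gives smoother results.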
Assuming a flat ground at Y = 0:
Either make sure your object sticks to the ground by resetting its Y after each move:
private const float EPSILONSQR = EPSILON * EPSILON;

void Update()
{
    var difference = mTarget.position - transform.position;
    mLookDirection = difference.normalized;
    if (difference.sqrMagnitude > EPSILONSQR)
    {
        // In general be aware that Translate by default moves in the
        // object's own local space coordinates, so you probably
        // want to use Space.World
        transform.Translate(mLookDirection * Time.deltaTime * mSpeed, Space.World);

        var pos = transform.position;
        // reset the Y back to the ground
        pos.y = 0;
        transform.position = pos;
    }
}
Or simply map the direction onto the XZ plane up front (ignoring any difference in Y), like:
private const float EPSILONSQR = EPSILON * EPSILON;

void Update()
{
    var difference = mTarget.position - transform.position;
    mLookDirection = difference.normalized;
    // simply ignore the difference in Y
    // (up to you whether you normalize the vector before or after doing that)
    mLookDirection.y = 0;
    if (difference.sqrMagnitude > EPSILONSQR)
    {
        transform.Translate(mLookDirection * Time.deltaTime * mSpeed, Space.World);
    }
}
The cubes are arranged in a big square to create a kind of border so the player can't move further.
transform is the player; the script is attached to the player.
When I move the player right next to one of the cubes, the distance is about 95, but it should be 0.1f or 0.
using System.Collections;
using System.Collections.Generic;
using TMPro;
using UnityEngine;
using UnityStandardAssets.Characters.ThirdPerson;

public class DistanceCheck : MonoBehaviour
{
    public Transform[] cubes;
    public GameObject descriptionTextImage;
    public TextMeshProUGUI text;

    // Update is called once per frame
    void Update()
    {
        for (int i = 0; i < cubes.Length; i++)
        {
            var dist = Vector3.Distance(transform.position, cubes[i].transform.position);
            if (dist <= 0.1f)
            {
                descriptionTextImage.SetActive(true);
                text.text = "I can't move.";
            }
            else
            {
                text.text = "";
                descriptionTextImage.SetActive(false);
            }
        }
    }
}
When I set a breakpoint on the line:
if (dist <= 0.1f)
dist is a bit more than 95. Maybe the distance is being checked against another cube, or against a different point on the cube?
You are checking all cubes every time and reusing the same dist variable for each one.
So it detects the distance being less than 0.1f, but then goes on and overwrites dist (and the UI state) while testing the next cube. This is why you see higher values despite being right next to a cube.
You need to track the distance to each specific cube separately (or just track the minimum) and react if any of them gets too short.
Another flaw is that you are measuring the distance to a point inside the box, so depending on which side of the box you are on, it is going to show different values.
Instead of using the cubes' Transforms, I would suggest using their Colliders, and then looking for the closest point on the collider bounds. As it currently stands, you're checking the distance from this object to the origin of the cube, and it sounds like you simply want the distance to the cube/collider in general.
First, change the field type for your cubes:
public BoxCollider[] cubes;
Then find the distance like this:
var closestPoint = cubes[i].ClosestPointOnBounds(transform.position);
var distance = Vector3.Distance(transform.position, closestPoint);
Now you should be getting the distance to the closest point on the cube's collider.
More information on ClosestPointOnBounds can be found here at the Unity docs.
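Putting that together with the original loop, and toggling the UI once per frame based on the nearest cube rather than once per cube, a sketch could look like this (cubes is now the BoxCollider[] field suggested above):

```csharp
void Update()
{
    // Track the minimum distance over all cubes instead of
    // overwriting the UI state on every iteration.
    float minDist = float.MaxValue;
    foreach (BoxCollider cube in cubes)
    {
        Vector3 closest = cube.ClosestPointOnBounds(transform.position);
        minDist = Mathf.Min(minDist, Vector3.Distance(transform.position, closest));
    }

    bool blocked = minDist <= 0.1f;
    descriptionTextImage.SetActive(blocked);
    text.text = blocked ? "I can't move." : "";
}
```

This also fixes the overwriting problem from the other answer, since the UI is only set after all cubes have been checked.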
I am creating a first-person controller in Unity. Very basic stuff: the camera is a child of the Player capsule. The code works, but I need help understanding what's going on.
*camera is child of Player in hierarchy
These are my questions:
In PlayerMovement, why are we translating along the Z-axis to achieve "vertical" movement when Unity uses a Y-up axis?
In CamRotation, I have no idea what is going on in Update(). Why are we applying horizontal movement to our Player, but then vertical movement to our Camera? Why is it not applied on the same GameObject?
What is mouseMove trying to achieve? Why do we use var?
I think we are getting a value for how much the mouse has moved, but what does applying Vector2.Scale then do to it?
Code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PlayerMovement : MonoBehaviour
{
    public float speed = 5.0f;

    // Update is called once per frame
    void Update()
    {
        float mvX = Input.GetAxis("Horizontal") * Time.deltaTime * speed;
        float mvZ = Input.GetAxis("Vertical") * Time.deltaTime * speed;
        transform.Translate(mvX, 0, mvZ);
    }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CamRotation : MonoBehaviour
{
    public float horizontal_speed = 3.0F;
    public float vertical_speed = 2.0F;

    GameObject character; // the parent object the camera is attached to (our Player capsule)

    // initialization
    void Start()
    {
        character = this.transform.parent.gameObject;
    }

    // Update is called once per frame
    void Update()
    {
        var mouseMove = new Vector2(Input.GetAxisRaw("Mouse X"), Input.GetAxisRaw("Mouse Y"));
        mouseMove = Vector2.Scale(mouseMove, new Vector2(horizontal_speed, vertical_speed));
        character.transform.Rotate(0, mouseMove.x, 0); // rotate our character horizontally
        transform.Rotate(-mouseMove.y, 0, 0);          // rotate the camera vertically
    }
}
XY is the plane for 2D Unity games. In 3D, Unity is Y-up: Y is height and the XZ plane is the ground. The "Vertical" input axis here is mapped to Z, the forward/backward direction, so translating along Z moves the player forward rather than up.
Note that different components of mouseMove are applied to different objects: .x rotates the character around its Y-axis (turning the whole body left/right), while .y pitches only the camera around its X-axis. This way, looking up and down doesn't tilt the capsule itself.
var is a predefined C# keyword that lets the compiler figure out an appropriate type. In this case, it's the same as if you had written Vector2 as in Vector2 mouseMove = new Vector2(...);.
You are scaling the value from mouseMove, by multiplying its components by predefined values in your code. That's just it.
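For example, Vector2.Scale just multiplies the two vectors component-wise:

```csharp
var mouseMove = new Vector2(0.5f, -0.2f);                  // raw mouse deltas this frame
mouseMove = Vector2.Scale(mouseMove, new Vector2(3f, 2f)); // (0.5 * 3, -0.2 * 2) = (1.5, -0.4)
```

So horizontal_speed and vertical_speed act as independent sensitivity multipliers for the two mouse axes.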
EDIT
You apply .x to your character because, as the comment after that line says, you want to rotate it horizontally. As for the camera, .y is applied because you want to rotate it vertically.
The value is negated because the axis would otherwise be inverted; negating it gives the camera a natural movement. It's the same principle some games expose in their settings, where they let you invert the Y-axis.
Since the android should fly or move in the air too, I guess I should also use physics? Not sure.
I added some spheres as waypoints, and I want the android to move between them randomly, with a random range of movement and rotation speed, for example a random speed in the 1-10 range for both movement and rotation.
Right now, when I run the game, the NAVI (1) just moves in circles non-stop over a small area, and I'm not sure why; it's not moving smoothly between the spheres.
The idea is to make the NAVI droid act like it's out of order.
This is a screenshot of the NAVI (1) and its inspector:
This screenshot shows the waypoints assigned to the Waypoints script:
This is the Waypoints script:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Waypoints : MonoBehaviour
{
    public Transform objectToMove;
    public float moveSpeed = 3.0f;
    public float rotationSpeed = 3.0f;
    public GameObject[] waypoints;
    public int currentWP;

    private void Start()
    {
        currentWP = 0;
    }

    private void Update()
    {
        if (waypoints.Length == 0) return;

        var lookPos = waypoints[currentWP].transform.position - objectToMove.position;
        lookPos.y = 0;
        var rotation = Quaternion.LookRotation(lookPos);
        objectToMove.rotation = Quaternion.Slerp(objectToMove.rotation, rotation, Time.deltaTime * rotationSpeed);
        objectToMove.position += objectToMove.forward * moveSpeed * Time.deltaTime;
    }
}
You are never changing currentWP. That means every frame you rotate toward the same waypoint and keep steering the droid back to it.
You should change currentWP only when the droid reaches the waypoint, for example with an OnTriggerEnter between the target waypoint and the droid.
You can pick the next waypoint like this:
// for example, you've got 5 waypoints (indices 0 through 4)
currentWaypoint = Random.Range(0, 5); // the int overload's upper bound is exclusive
Hope it helps.
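If you'd rather not set up triggers, a distance check works too. A minimal sketch of the Update from the question, assuming a hypothetical arrival radius of 0.5 units:

```csharp
private void Update()
{
    if (waypoints.Length == 0) return;

    Vector3 target = waypoints[currentWP].transform.position;

    // Pick a new random waypoint once the droid gets close enough to this one.
    if (Vector3.Distance(objectToMove.position, target) < 0.5f)
    {
        currentWP = Random.Range(0, waypoints.Length); // indices 0..Length-1
    }

    var lookPos = target - objectToMove.position;
    lookPos.y = 0;
    var rotation = Quaternion.LookRotation(lookPos);
    objectToMove.rotation = Quaternion.Slerp(objectToMove.rotation, rotation, Time.deltaTime * rotationSpeed);
    objectToMove.position += objectToMove.forward * moveSpeed * Time.deltaTime;
}
```

For the "out of order" feel, you could also randomize moveSpeed and rotationSpeed (e.g. Random.Range(1f, 10f)) each time a new waypoint is chosen.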