I always want to keep an object centered in a camera's view.
The object cannot change its position.
The camera can be rotated.
The camera can move up and down.
The object should always be centered in the camera's view.
So when I rotate the camera by -45°, I would like to know the Y position at which the rotated camera would still face the object directly.
I know the "horizontal" distance between the camera and the object (this never changes), and I know the camera's angle.
How can the required camera Y position be calculated?
Thank you.
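For the record, the required Y follows from basic trigonometry: with a fixed horizontal distance R and a downward pitch of θ degrees, the camera must sit at object_y + R · tan(θ) to look straight at the object. A minimal sketch of that relationship in plain Python, outside Unity (the function and parameter names are placeholders):

```python
import math

def required_camera_y(object_y, horizontal_distance, pitch_deg):
    """Y position at which a camera pitched down by pitch_deg
    still looks straight at the object."""
    return object_y + horizontal_distance * math.tan(math.radians(pitch_deg))

# A camera pitched 45 degrees down at horizontal distance 10
# must sit 10 units above the object (tan 45° = 1).
print(required_camera_y(0.0, 10.0, 45.0))  # ≈ 10.0
```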
So this code works for me:
// Horizontal distance between object and camera. The Y axis points up,
// so we only use the x and z coordinates.
float R = Vector2.Distance(new Vector2(obj.position.x, obj.position.z),
                           new Vector2(camera.transform.position.x, camera.transform.position.z));
// Find the rotation from zero to the target angle.
float rAngle = Mathf.Deg2Rad * (camera.transform.rotation.eulerAngles.y + angle);
// Negated because rotating the camera to the "right" means moving it to the "left".
float x = -R * Mathf.Sin(rAngle);
float z = -R * Mathf.Cos(rAngle);
// Apply the changes.
camera.transform.position = new Vector3(x, camera.transform.position.y, z);
Vector3 cameraRotation = new Vector3(0, angle, 0);
camera.transform.Rotate(cameraRotation);
And as I said, you can easily use https://docs.unity3d.com/ScriptReference/Transform.RotateAround.html instead.
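The essence of the snippet is that (x, z) = (−R·sin θ, −R·cos θ) always lies on a circle of radius R around the object, so the horizontal distance is preserved for any angle. The same math in plain Python (illustrative names, not part of the Unity code above):

```python
import math

def orbit_position(radius, angle_deg):
    """Point on a circle of the given radius, using the
    x = -R*sin(a), z = -R*cos(a) convention from the snippet."""
    a = math.radians(angle_deg)
    return (-radius * math.sin(a), -radius * math.cos(a))

# Whatever the angle, the distance to the center stays R.
x, z = orbit_position(5.0, 123.0)
print(math.hypot(x, z))  # ≈ 5.0
```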
In the meantime I found a solution, but I'd really like a code review. Perhaps there is an easier solution. I updated the script at the end of my question.
I have a camera in my scene with a specific position and rotation. I want to move the camera on the x- and z-axis so that it targets (0,0,0).
I have the position vector posVector of the camera object and the direction vector dirVector, which is the viewing direction. At some point, posVector + dirVector * scalar will intersect the ground plane. Currently that intersection is some (x,0,z), but I'd like to move the camera on the x- and z-axis so that it becomes (0,0,0). So I really need to know x and z.
I tried multiple things, but I still have trouble wrapping my head around vectors and intersections.
using UnityEngine;

[ExecuteInEditMode]
public class DebugHelper : MonoBehaviour
{
    // This DebugHelper script runs in edit mode and helps me visualize
    // vectors and move the camera to where I want it to be.
    public Camera mainCamera;

    private void Update()
    {
        Vector3 posVector = mainCamera.transform.position;
        Vector3 dirVector = mainCamera.transform.rotation * Vector3.forward;
        // Find the scalar for dirVector that brings posVector.y to 0:
        // posVector + dirVector * scalar = (x, 0, z).
        // The negated quotient also handles a camera looking upward,
        // where Mathf.Abs would give the wrong sign.
        float scalar = -posVector.y / dirVector.y;
        // Position vector of the current camera target at y = 0.
        Vector3 y0TargetVector = posVector + dirVector * scalar;
        // Subtract y0TargetVector from the position to move the target back
        // to zero. This works because y0TargetVector.y is 0, so the camera
        // height stays the same.
        mainCamera.transform.position -= y0TargetVector;
        Debug.DrawRay(posVector, dirVector * scalar, new Color(0, 1, 0));
    }
}
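The core of the script is a ray-plane intersection with the y = 0 plane: the scalar is how many direction-vector lengths it takes to bring the camera height to zero. A plain-Python sketch of that single step (illustrative names; note the sign −pos_y / dir_y, which works for both look-down and look-up directions):

```python
def ground_hit(pos, direction):
    """Intersect the ray pos + t*direction with the plane y = 0.
    pos and direction are (x, y, z) tuples; direction[1] must be nonzero."""
    t = -pos[1] / direction[1]
    return tuple(p + t * d for p, d in zip(pos, direction))

# Camera at height 4 looking down at 45 degrees along +z
# hits the ground 4 units ahead.
print(ground_hit((0.0, 4.0, 0.0), (0.0, -1.0, 1.0)))  # (0.0, 0.0, 4.0)
```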
How do I make a plane "look" at the camera by rotating on only one axis?
For example, I have a plane with a texture of smoke coming from a pipe. If I walk around the pipe, the plane should always face the camera, rotating around the y axis. But the direction of the smoke should not change, so the plane should not rotate around the x and z axes.
Here is a code example which rotates the plane on all axes:
void Update()
{
    transform.LookAt(Camera.main.transform.position, -Vector3.up);
}
How can I make it rotate on only one axis?
One approach is to store the object's original rotation in a Vector3 using transform.eulerAngles, then create another Vector3 to store the object's rotation after the LookAt call has completed. You can then set the object's rotation to a new Vector3 using only the y value from the second variable and the original x and z values. It would look something like this:
void Update()
{
    Vector3 originalRotation = transform.eulerAngles;
    transform.LookAt(Camera.main.transform.position, -Vector3.up);
    Vector3 newRotation = transform.eulerAngles;
    transform.eulerAngles = new Vector3(originalRotation.x, newRotation.y, originalRotation.z);
}
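An equivalent way to think about it: project the direction to the camera onto the xz-plane and take only the yaw angle from it, leaving pitch and roll untouched. A small Python sketch of that projection (illustrative names, not Unity API):

```python
import math

def yaw_towards(from_pos, to_pos):
    """Yaw (degrees around the y axis) that faces from_pos toward
    to_pos, ignoring any height difference."""
    dx = to_pos[0] - from_pos[0]
    dz = to_pos[2] - from_pos[2]
    return math.degrees(math.atan2(dx, dz))

# A camera directly along +x from the plane gives a 90-degree yaw;
# its height does not matter.
print(yaw_towards((0, 0, 0), (10, 5, 0)))  # 90.0
```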
I'm trying to get my GameObject to point towards the mouse. It has a child object with a sprite. The child's rotation is set to 0 on all axes. The sprite points upwards (positive X) at start. The GameObject rotates with the mouse pointer, but it always turns its right side towards the mouse pointer. Also, when I add forward force it accelerates in the direction the sprite is pointing, which, as stated before, is not the direction of the mouse. What is my code missing?
var cam = Camera.main;
// Distance from camera to object; needed for a correct depth value below.
float camDistance = cam.transform.position.y - transform.position.y;
// Get the mouse position in world space, using camDistance for the z axis.
Vector3 mouse = cam.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y, camDistance));
float angleRad = Mathf.Atan2(mouse.y - transform.position.y, mouse.x - transform.position.x);
float angle = Mathf.Rad2Deg * angleRad;
rb2d.rotation = angle;
Your shape is rotated by 90 degrees because Atan2 measures the angle from the positive x axis, while your sprite's artwork points up. You can account for this by using:
rb2d.rotation = angle - 90;
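The −90 offset is easiest to see with the target straight above the object: Atan2 reports 90 degrees (measured from the +x axis), so a sprite whose artwork already points up needs zero extra rotation. In plain Python (illustrative names):

```python
import math

def aim_angle(obj, target):
    """Angle in degrees from obj to target, measured from the +x axis,
    as Atan2 does."""
    return math.degrees(math.atan2(target[1] - obj[1], target[0] - obj[0]))

# Target straight above: Atan2 gives 90, so an up-pointing sprite
# needs angle - 90 = 0 extra rotation.
a = aim_angle((0, 0), (0, 5))
print(a, a - 90)  # 90.0 0.0
```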
How can I position a GameObject in the center of the screen at runtime? My current GameObject is positioned on the left side of the screen and I want it in the middle of the screen.
Get the center position of the screen with Camera.ViewportToWorldPoint, passing 0.5 for the x and y axes. For the z axis, use Camera.nearClipPlane + yourCustomOffset to make sure that the GameObject will be positioned where it can actually be seen.
void centerGameObject(GameObject gameOBJToCenter, Camera cameraToCenterOBjectTo, float zOffset = 2.6f)
{
    gameOBJToCenter.transform.position = cameraToCenterOBjectTo.ViewportToWorldPoint(
        new Vector3(0.5f, 0.5f, cameraToCenterOBjectTo.nearClipPlane + zOffset));
}
Then you can call it with:
centerGameObject(yourGameObject, Camera.main);
The default zOffset (2.6f) should work, but you can change it by supplying the third parameter:
centerGameObject(yourGameObject, Camera.main, 5f);
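Behind ViewportToWorldPoint is simple frustum geometry: at distance d in front of the camera, the visible half-height is d·tan(fov/2), and viewport (0.5, 0.5) is the center of that rectangle. A rough Python reconstruction for the special case of an axis-aligned camera looking along +z (an illustration of the math, not Unity's actual implementation):

```python
import math

def viewport_to_world(cam_pos, fov_deg, aspect, u, v, depth):
    """Map viewport coords (u, v) in [0, 1] at the given depth to a
    world point, for a camera at cam_pos looking along +z."""
    half_h = depth * math.tan(math.radians(fov_deg) / 2)
    half_w = half_h * aspect
    x = cam_pos[0] + (u - 0.5) * 2 * half_w
    y = cam_pos[1] + (v - 0.5) * 2 * half_h
    return (x, y, cam_pos[2] + depth)

# The viewport center lands on the camera axis regardless of fov.
print(viewport_to_world((0, 0, 0), 60, 16 / 9, 0.5, 0.5, 2.6))  # (0.0, 0.0, 2.6)
```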
In Unity I have a class that creates a new Bounds; for those of you who may not know, a Unity Bounds is really just a box. I then have an object such as a Camera, and I want to attach this bounds to the camera and scale it to fit around the view frustum. That would be easy if the camera never rotated, but it does, and at the moment I am drawing a blank.
This is what I have right now, which works when the camera does not rotate:
// Height of the frustum at the far clip plane.
float h = Mathf.Tan(cam.fov * Mathf.Deg2Rad * .5f) * cam.farClipPlane * 2;
Vector3 scale = new Vector3(h * cam.aspect, h, cam.farClipPlane);
newBounds.center = cam.transform.position + cam.transform.forward * (cam.farClipPlane / 2);
newBounds.size = scale;
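The sizing above follows from the frustum's vertical field of view: the far plane is 2·far·tan(fov/2) tall and aspect times as wide, and a box of depth far centered far/2 along the forward axis spans from the camera to the far plane. The same numbers in plain Python (illustrative names, rotation ignored just as in the snippet):

```python
import math

def frustum_bounds_size(fov_deg, aspect, far):
    """Width, height, and depth of a box enclosing a camera frustum
    whose apex sits on one face (ignores camera rotation)."""
    h = math.tan(math.radians(fov_deg) * 0.5) * far * 2
    return (h * aspect, h, far)

# A 90-degree square frustum with a far plane at 100
# fits in a 200 x 200 x 100 box.
w, h, d = frustum_bounds_size(90.0, 1.0, 100.0)
print(round(w), round(h), d)  # 200 200 100.0
```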
Whenever we set bounds on a BoxCollider, we only set the center and size, and these are expressed relative to the GameObject's local position and rotation.
So you don't need to write any code for the rotation; just set
transform.GetComponent<BoxCollider>().center and
transform.GetComponent<BoxCollider>().size
and the bounds will automatically rotate along with the GameObject's Transform.