I'm having trouble getting a mouse event on a 3D object in Unity. MonoBehaviour callbacks aren't responding, and ScreenPointToRay produces very strange results.
I'm using Unity 2021.3.12 and simply want to know when a 3D object is clicked. I tried using EventTrigger to no avail. I added this script, which also did nothing:
void OnMouseDown()
{
    Clicked.Invoke();
}
So I added this bit to see what ScreenPointToRay thinks it's doing:
private void Update()
{
    Vector2 mousePosition = Mouse.current.position.ReadValue();
    Ray ray = Camera.main.ScreenPointToRay(mousePosition);
    Debug.DrawRay(ray.origin, ray.direction * 100, Color.yellow);

    if (Mouse.current.leftButton.wasPressedThisFrame)
    {
        print("mouse is down");
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 10000f))
        {
            print("hit it");
        }
    }
}
And it shows the ray projecting way off into -X land. In this screenshot I have a fairly empty 3D scene: one cube (with a Box Collider) at (0, 0, 0), and the camera at (0, 0, -10). The ray should start at (0, 0, -10) with a direction of (0, 0, 1), but look at it! It only gets close to (0, 0, 1) if I move the pointer all the way to the top right of my screen (and even then it still points down a little).
I must be missing something really basic.
The problem was in the Camera settings: 'Target Eye' set to anything but 'None' will cause this problem. Set it to 'None' and it works as expected.
I am trying to get an object to follow my mouse along the surfaces in my game. There is a floor and other objects in the game. The object should always be on the floor or on one of the other objects. What I want is for the object's position to change to where the mouse is moved. I am doing this by casting a ray from the mouse. When the ray hits a collider the object's position should change to where it hit.
When I hit Play, the object moves along the ray being cast: it starts at the collision point (hit.point) and travels towards the camera, and once it reaches the camera it starts over at hit.point and repeats. When I move the mouse, the object's position changes to the new hit.point and again moves toward the camera. The value of hit.point changes in the log when I move the mouse, as it should, and when the mouse stays still the value does not change, even though the object is still moving towards the camera. Why is this happening? Since transform.position is set to hit.point and never changed anywhere else in the code, it seems like the position should always be hit.point.
This is the code attached to the object that is moving:
public class FollowCursor : MonoBehaviour
{
    void Update()
    {
        RaycastHit hit;
        Ray ray;

        int layerMask = 1 << 8;
        layerMask = ~layerMask;

        ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out hit))
            transform.position = hit.point;
        Debug.Log(hit.point);
    }
}
This is a screenshot of my hierarchy and scene. The white sphere is the object that is moving. The floor I referred to is the sand at the bottom of the tank. I want the object to be on the sand or one of the other objects. Layer 8 is the tank itself, which is being excluded from the raycast because I do not want the object to be on the tank.
Because your GameObject has a collider too:
The ray hits the ground. The object goes there.
The next frame, the ray hits the object itself. The object moves there (the ray hit the object's surface, and the pivot is at its center, so the object moves towards the camera by an amount equal to the collider's radius; it looks like 0.5 units based on your screenshot).
Repeat.
The best way to fix this is to put your ground on its own layer and raycast only against the layers you want the ray to hit using a LayerMask.
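A minimal sketch of that fix, assuming the sand/floor has been moved to a layer named "Ground" (the layer name and script shape here are illustrative, not from the question):

```csharp
using UnityEngine;

public class FollowCursor : MonoBehaviour
{
    // Raycast ONLY against the assumed "Ground" layer. Note that the
    // original code built a mask but never passed it to Physics.Raycast.
    private int groundMask;

    void Start()
    {
        groundMask = LayerMask.GetMask("Ground");
    }

    void Update()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

        // Because the follower's own collider is not on the Ground layer,
        // the ray can never hit it, so the position no longer creeps
        // toward the camera frame after frame.
        if (Physics.Raycast(ray, out RaycastHit hit, Mathf.Infinity, groundMask))
        {
            transform.position = hit.point;
        }
    }
}
```

Alternatively, you could remove or disable the collider on the following object itself, but a layer mask keeps the collider usable for other purposes.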
I'm trying my first test-game with Unity, and struggling to get my bullets to move.
I have a prefab called "Bullet", with a RigidBody component, with these properties:
Mass: 1
Drag: 0
Angular Drag: 0.1
Use Gravity: unchecked
Is Kinematic: unchecked
Interpolate: None
Collision Detection: Discrete
On the bullet prefab, I have this script:
public float thrust = 10;
private Rigidbody rb;

void Start()
{
    rb = GetComponent<Rigidbody>();
    rb.AddForce(transform.forward * 100, ForceMode.Impulse);
}
On my playerController script (not the best place for this, I know):
if (Input.GetAxisRaw("Fire1") > 0)
{
    var proj = Instantiate(projectile, transform.position, Quaternion.identity);
}
When I click my mouse, the bullet gets created but doesn't move. I've set the rigidbody's velocity directly, which works, but I can't get it to move in the right direction. After googling around, it seems I need to be using rigidbody.AddForce(), which I did, but I still can't get my bullet to move.
I've seen the other solution, but this did not work for me either.
Any advice would be appreciated.
Screenshot:
When you're working with 2D in Unity, the main camera basically becomes an orthographic camera (no perspective effects) looking sideways at the game scene along the z-axis. Incidentally, the "forward" direction of a non-rotated object is also parallel to the z-axis.
This means that when you apply a force along transform.forward, you're sending the object down the z-axis - which will not be visible to an orthographic camera looking in the same direction. The quick fix here would be to use a direction that translates to an x/y-axis movement, like transform.up or transform.right.
As derHugo also mentioned in the comments, you might want to look into using Rigidbody2D. There are some optimizations that will allow it to behave better for a 2D game (though you may not notice them until you scale up the number of objects).
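Putting both suggestions together, a minimal sketch of the 2D version might look like this (assuming the bullet should travel along its local up axis; the thrust field name comes from the question):

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody2D))]
public class Bullet2D : MonoBehaviour
{
    public float thrust = 10f;

    void Start()
    {
        // In 2D, transform.up and transform.right lie in the visible
        // x/y plane, unlike transform.forward, which points along the
        // z-axis straight away from an orthographic camera.
        var rb = GetComponent<Rigidbody2D>();
        rb.AddForce(transform.up * thrust, ForceMode2D.Impulse);
    }
}
```

If the bullet should fly in the direction the player is facing, rotate the bullet to match the player when instantiating it instead of passing Quaternion.identity.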
I created a Raycast that shoots from an object (not the main camera), and I want it to hit UI elements. Example below.
Example
The UI is outlined in orange, the big grey plane is a test object, the red head is placed at the user's head, and the green line is the DrawRay of my raycast attached to an eye. I am trying to make it so the user looks at a UI button and the raycast can hit that button.
I've been playing around with GraphicRaycaster, but nothing seems to work: the grey plane is still being hit by the raycast. I tried playing with OnPointerEnter, but my eye raycast is not a mouse/finger touch; it would have to be a second pointer.
Any ideas how to make this work with IsPointerOverGameObject, GraphicRaycaster, or any other method? Or how to create a second pointer driven by the eye raycast?
Current code below.
private void FixedUpdate()
{
    GraphicRaycaster gr = this.GetComponent<GraphicRaycaster>();
    PointerEventData pointerData = new PointerEventData(EventSystem.current);
    pointerData.position = Input.mousePosition;

    List<RaycastResult> results = new List<RaycastResult>();
    gr.Raycast(pointerData, results);
    //EventSystem.current.RaycastAll(pointerData, results);

    if (results.Count > 0)
    {
        if (results[0].gameObject.tag == "Test tag")
        {
            Debug.Log("test");
        }
    }

    //----- Failed Graphic raycast experiments above; working RaycastHit below --------

    RaycastHit hit;
    Debug.DrawRay(transform.position, transform.forward * -10000f, Color.green);
    if (Physics.Raycast(transform.position, transform.forward * -1, out hit))
    {
        Debug.Log(hit.transform.name);
    }
}
This is actually much more complicated than it looks if you want proper interaction with canvas elements.
I suggest you do not try to roll out the solution yourself, but look at some already-made solutions. Most VR toolkits have their own implementations; you could look into VRTK's example scene 007 - Interactions. You don't have to use VR to use it; just use the fallback SDK.
For more details on how this works, I can point you to the article from Oculus: Unity Sample Framework (a bit out of date).
EDIT2:
Thanks to Draco in the comments for pointing me to the LineRenderer class.
Here is a link to a script that helped me draw ray lines in game world space: http://answers.unity3d.com/questions/1142318/making-raycast-line-visibld.html
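For reference, a minimal sketch of the LineRenderer approach (assuming a LineRenderer component on the same GameObject and a current Unity version with the positionCount property; older versions such as the 5.6 used in this post had SetVertexCount instead, and the 10-unit length is illustrative):

```csharp
using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class RayLine : MonoBehaviour
{
    private LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
    }

    void Update()
    {
        // Draw the ray as real world-space geometry, so it is visible in
        // the Game view (and in a VR headset), unlike Debug.DrawRay,
        // which by default only shows in the Scene view.
        line.SetPosition(0, transform.position);
        line.SetPosition(1, transform.position + transform.forward * 10f);
    }
}
```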
EDIT:
I changed the last argument of the Debug.DrawRay call from 1 to 100,000 (that argument is actually the duration in seconds the line stays visible, not its length) and the color from black to blue, and now I am able to see the rays drawn, in the Scene view only.
Debug.DrawRay(transform.position, transform.forward, Color.blue, 100000);
Also, I am now able to track the touch controller's ray by applying the below script to the game object LocalAvatar > controller_right.
ORIGINAL POST:
I am attempting to do something seemingly simple using the Oculus VR packages OVRAvatar, OVRCameraRig, and OVRInput in Unity 5.6.1 and Visual Studio C#.
Goal: I want to get the raycast from my Oculus Touch controller's pointer finger and intersect a plane, in order to apply pixels to the plane's texture.
I was able to get the camera's raycast to intersect the plane using OVRCameraRig like so:
void Update()
{
    RaycastHit hit;
    Vector3 fwd = ovrCamera.transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(ovrCamera.transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit the plane!");
            // apply some pixels to plane texture
            // setPixels(plane.texture)
        }
    }

    // EDIT: I increased the last argument (the duration) from 1 to 100,000 and now I am able to see the rays drawn, in the Scene view only.
    // NOTE: this line below does not work in Unity's Scene view or Game view
    Debug.DrawRay(ovrCamera.transform.position, ovrCamera.transform.forward, Color.black, 1);
}
Aside from the Debug.DrawRay function not working, the code which checks for the OVRcamera's raycast hitting the plane does in fact work.
However, now that I am trying to test the raycast hit from OVRInput.Controller.RTouch, I am unable to see the drawn ray and the hit collider check is not triggered.
I have added this script below as a component to OVRCameraRig's RightHandAnchor:
void Update()
{
    RaycastHit hit;
    Vector3 fwd = transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit plane with touch controller's raycast!");
        }
    }

    // EDIT: I increased the last argument (the duration) from 1 to 100,000 and now I am able to see the rays drawn, in the Scene view only.
    // NOTE: this line below does not work in Unity's Scene view or Game view
    Debug.DrawRay(transform.position, transform.forward, Color.black, 1);
}
Needless to say, the Debug.DrawRay call does not work here either, and I can't tell whether the rays from the touch controllers are actually being tracked.
Any help or insight is appreciated, thanks!
I'm busy working on something that uses an RTS-style camera, and I want an object to follow the mouse cursor but stay at the same Y position at all times. The camera is located at (0, 15, -15).
I've been playing with it for a while now and this is the best that I can come up with:
Ray ray = Camera.mainCamera.ScreenPointToRay(Input.mousePosition);
Vector3 point = ray.GetPoint(5);
transform.position = point;
print(point);
Any help would would be appreciated.
You are on the right track, but you need to use that ray to raycast against something to get a world position.
Since it's an RTS, I assume the terrain is somewhat level, in which case it would be easy to place a plane at the desired height. If that's not the case, I recommend following #The Ryan's advice and storing the previous Y value.
In either case, you need to put the thing you raycast against in a separate layer from your other stuff, so you don't move things on top of other units:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
if (Physics.Raycast(ray, out hit, maxDistance, layerMask))
{
    // transform.position returns a struct copy, so calling .Set() on it
    // changes nothing; assign a new Vector3 instead.
    float oldY = transform.position.y;
    transform.position = new Vector3(hit.point.x, oldY, hit.point.z);
}