Texture Annotation using OVRInput Touch Controller Raycast in Unity 5 - C#

EDIT2:
Thanks to Draco in the comments for referring me to the LineRenderer class.
Here is a link to a script that helped me draw ray lines in the game world space: http://answers.unity3d.com/questions/1142318/making-raycast-line-visibld.html
EDIT:
I changed the duration argument in the Debug.DrawRay method from 1 to 100,000 and the color from black to blue, and now I am able to see the rays drawn, though only in the scene view.
Debug.DrawRay(transform.position, transform.forward, Color.blue, 100000);
Also, I am now able to track the touch controller's ray by applying the below script to the game object LocalAvatar > controller_right.
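A minimal LineRenderer sketch along those lines (an illustration only: the component name is made up, and it assumes you assign a material in the Inspector and attach this to the controller anchor):

```csharp
using UnityEngine;

// Draws the controller's forward ray so it is visible in the Game view,
// which Debug.DrawRay cannot do.
public class RayVisualizer : MonoBehaviour
{
    public Material lineMaterial;   // assign in the Inspector
    private LineRenderer line;

    void Start()
    {
        line = gameObject.AddComponent<LineRenderer>();
        line.material = lineMaterial;
        line.startWidth = 0.01f;
        line.endWidth = 0.01f;
    }

    void Update()
    {
        // Stretch the line from the controller along its forward direction.
        line.SetPosition(0, transform.position);
        line.SetPosition(1, transform.position + transform.forward * 100f);
    }
}
```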
ORIGINAL POST:
I am attempting to do something seemingly simple using the Oculus VR packages OVRAvatar, OVRCameraRig, and OVRInput in Unity 5.6.1 and Visual Studio C#.
Goal: I want to get the raycast from my Oculus touch controller's pointer finger, and intercept a plane to apply pixels to the plane's texture.
I was able to get the camera's raycast to intercept the plane using OVRCameraRig like so:
void Update() {
    RaycastHit hit;
    Vector3 fwd = ovrCamera.transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(ovrCamera.transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit the plane!");
            // apply some pixels to plane texture
            // setPixels(plane.texture)
        }
    }
    // EDIT: I changed the duration from 1 to 100,000 and now I am able to see the rays drawn in the scene view only.
    // NOTE: this line below does not work in Unity's scene view or game view
    Debug.DrawRay(ovrCamera.transform.position, ovrCamera.transform.forward, Color.black, 1);
}
Aside from the Debug.DrawRay call not working, the code that checks whether the OVRCamera's raycast hits the plane does in fact work.
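The "apply some pixels" step hinted at in the comments could be sketched like this (an assumption-laden sketch: PaintAtHit is a hypothetical helper, the plane needs a MeshCollider for hit.textureCoord to be meaningful, and its texture must be a readable Texture2D):

```csharp
// Paint a single pixel on the hit object's texture at the hit location.
void PaintAtHit(RaycastHit hit)
{
    Renderer rend = hit.collider.GetComponent<Renderer>();
    Texture2D tex = rend.material.mainTexture as Texture2D;
    if (tex == null) return;

    // Convert the UV coordinate of the hit into pixel coordinates.
    Vector2 uv = hit.textureCoord;
    int x = (int)(uv.x * tex.width);
    int y = (int)(uv.y * tex.height);

    tex.SetPixel(x, y, Color.red);
    tex.Apply();   // upload the modified pixels to the GPU
}
```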
However, now that I am trying to test the raycast hit from OVRInput.Controller.RTouch, I am unable to see the drawn ray and the hit collider check is not triggered.
I have added this script below as a component to OVRCameraRig's RightHandAnchor:
void Update() {
    RaycastHit hit;
    Vector3 fwd = transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit plane with touch controller's raycast!");
        }
    }
    // EDIT: I changed the duration from 1 to 100,000 and now I am able to see the rays drawn in the scene view only.
    // NOTE: this line below does not work in Unity's scene view or game view
    Debug.DrawRay(transform.position, transform.forward, Color.black, 1);
}
Needless to say, the Debug.DrawRay call does not work here either, and I can't tell whether the rays from the touch controllers are actually being tracked.
Any help or insight is appreciated, thanks!


Related

Problems trying to rotate my ship to be parallel to the mesh normal its floating on

So I'm making a game where you sail in a pirate ship. I animated a sea in Blender and made a script to update the mesh collider so raycasts trigger properly on the animated mesh; so far so good.
The problem I'm facing now is with the boat. So far I have been able to make it ride the waves:
// This is the script that handles the riding of the waves.
[SerializeField] private LayerMask m_whatIsSea;
[SerializeField] private float m_offsetFromWaterSurface;

private void Update()
{
    // Shoot a raycast from above the water down to detect the height.
    RaycastHit _hit;
    Vector3 _origin = gameObject.transform.position + (Vector3.up * Mathf.Abs(m_offsetFromWaterSurface * 4));
    // NOTE: the layer mask must be passed after an explicit maxDistance;
    // otherwise the overload treats the mask as the distance.
    if (Physics.Raycast(_origin, Vector3.down, out _hit, Mathf.Infinity, m_whatIsSea))
    {
        Float(_hit);
    }
}

private void Float(RaycastHit hit)
{
    // Then move the boat to the detected height.
    Vector3 _newPosition = new Vector3(gameObject.transform.position.x, hit.point.y + m_offsetFromWaterSurface, gameObject.transform.position.z);
    transform.position = _newPosition;
}
When I move the ship with a ship controller it still correctly changes its height, but all my attempts at making the ship rotate according to the raycast mesh normal (the hit.normal in the above code) haven't worked out.
Some attempts did somewhat work, but the biggest issue is that the mesh consists of big triangles, so instead of rotating smoothly the ship constantly snaps to the normals, which of course isn't good.
So the final result I'm trying to achieve is that my ship rotates so that it's visually going up and down the waves instead of having the front clipping through.
I appreciate any help! I've been trying to come up with a solution for longer than I can admit. :-)
I would try Vector3.RotateTowards.
The fourth argument, maxRadiansDelta, in public static Vector3 RotateTowards(Vector3 current, Vector3 target, float maxRadiansDelta, float maxMagnitudeDelta) caps how far the vector rotates per call, so you can tune it to rotate the ship toward the hit triangle normal's direction smoothly enough.
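A rough sketch of that idea applied to the Float setup above (m_rotationSpeed is a made-up tuning field; this eases the ship's up vector toward the wave normal while keeping its current heading):

```csharp
[SerializeField] private float m_rotationSpeed = 1f;   // radians per second, tune to taste

private void AlignToWave(RaycastHit hit)
{
    // Rotate the current up vector a limited number of radians this frame,
    // which smooths out the snapping between large triangle normals.
    Vector3 smoothedUp = Vector3.RotateTowards(
        transform.up, hit.normal, m_rotationSpeed * Time.deltaTime, 0f);

    // Rebuild the rotation so "up" follows the smoothed normal while the
    // forward direction stays orthogonal to it (Cross(right, up) = forward).
    transform.rotation = Quaternion.LookRotation(
        Vector3.Cross(transform.right, smoothedUp), smoothedUp);
}
```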

Why is my GameObject traveling along the ray I am casting when I set transform.position = hit.point? Unity 3D

I am trying to get an object to follow my mouse along the surfaces in my game. There is a floor and other objects in the game. The object should always be on the floor or on one of the other objects. What I want is for the object's position to change to where the mouse is moved. I am doing this by casting a ray from the mouse. When the ray hits a collider the object's position should change to where it hit.
When I hit play the object moves along the ray that is being cast, starting at the collision (hit.point) and traveling towards the camera. Once it reaches the camera it starts over at hit.point and repeats. When I move the mouse the object's position changes to the new hit.point and moves toward the camera. The value of hit.point changes in the dev log when I move the mouse as it should and when the mouse stays still the value does not change, even though the object is still moving towards the camera. Why is this happening? It seems like because the value of transform.position is set to hit.point and never changed elsewhere in the code that the position should always be hit.point.
This is the code attached to the object that is moving:
public class FollowCursor : MonoBehaviour
{
    void Update()
    {
        RaycastHit hit;
        Ray ray;
        int layerMask = 1 << 8;
        layerMask = ~layerMask;
        ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out hit))
            transform.position = hit.point;
        Debug.Log(hit.point);
    }
}
This is a screenshot of my hierarchy and scene. The white sphere is the object that is moving. The floor I referred to is the sand at the bottom of the tank. I want the object to be on the sand or one of the other objects. Layer 8 is the tank itself, which is being excluded from the raycast because I do not want the object to be on the tank.
Because your GameObject has a collider too:
The ray hits the ground, and the object goes there.
The next frame, the ray hits the object itself, so the object moves there. The ray hit the object's surface, and since the pivot is in its center, the object moves toward the camera by an amount equal to the radius of its collider (it looks like 0.5 units based on your screenshot).
Repeat.
The best way to fix this is to put your ground on its own layer and raycast only against the layers you want the ray to hit, using a LayerMask.
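A minimal sketch of that fix, assuming the floor objects have been assigned to a layer named "Ground" (the layer name is an assumption; use whatever your project defines):

```csharp
void Update()
{
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    RaycastHit hit;
    int groundMask = LayerMask.GetMask("Ground");

    // Only colliders on the Ground layer can be hit, so the follower's
    // own collider no longer intercepts the ray on the next frame.
    if (Physics.Raycast(ray, out hit, Mathf.Infinity, groundMask))
        transform.position = hit.point;
}
```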

How can Raycast interact with Unity's Canvas UI elements?

I created a raycast that shoots from an object (not the main camera), and I want it to hit UI elements. Example below.
Example
The UI is outlined in orange, the big grey plane is a test object, the red head is placed at the user's head, and the green line is the DrawRay of my raycast attached to an eye. I am trying to make it so the user looks at a UI button and the raycast can hit that button.
I've been playing around with Graphic Raycaster but nothing seems to work; the grey plane is still being hit by the raycast. I tried playing with OnPointerEnter, but my eye raycast is not a mouse/finger touch, it's a second pointer.
Any ideas how to make it work with either isPointerOverGameObject, Graphic Raycaster, or any other method? Or how to create a second pointer that will be the eye raycast?
Current code below.
private void FixedUpdate()
{
    GraphicRaycaster gr = this.GetComponent<GraphicRaycaster>();
    PointerEventData pointerData = new PointerEventData(EventSystem.current);
    pointerData.position = Input.mousePosition;
    List<RaycastResult> results = new List<RaycastResult>();
    gr.Raycast(pointerData, results);
    //EventSystem.current.RaycastAll(pointerData, results);
    if (results.Count > 0)
    {
        if (results[0].gameObject.tag == "Test tag")
        {
            Debug.Log("test");
        }
    }

    //----- Failed Graphic raycast experiments above; working RaycastHit below --------
    RaycastHit hit;
    Debug.DrawRay(transform.position, transform.forward * -10000f, Color.green);
    if (Physics.Raycast(transform.position, transform.forward * -1, out hit))
    {
        Debug.Log(hit.transform.name);
    }
}
This is actually much more complicated if you want proper interaction with canvas elements.
I suggest you do not try to roll your own solution, but look at some already-made ones. Most VR toolkits have their own implementations; you could look into VRTK's example scene 007 - Interactions. You don't have to use VR to use it - just use the fallback SDK.
For more details on how this works, I might point you to the article from Oculus: Unity Sample Framework (a bit out of date).

Problems with Physics2D.Raycast

In a basic tutorial to make a 2D platformer I was given this bit of code:
if (Physics2D.Raycast(ray, out hit, Mathf.Infinity, collisionMask))
But Unity says you need Vector2s instead of Ray and out hit. Is there a way I could substitute Vectors for the same value and produce the same effect?
Here's the ray code if you want it:
ray = new Ray2D(new Vector2(x,y), new Vector2(x,dir));
The code compiles now, but it doesn't actually work: it's supposed to emulate gravity so the character falls until they hit the ground, but the character just falls through the ground.
PlayerPhysics.cs
PlayerController.cs
In Unity the character has a Box Collider 2D which is correctly setup and the scripts are both attached and the platform has a Box Collider 2D correctly setup.
Your ray has origin and direction components (both of which are vectors), so try this:
var hit = Physics2D.Raycast(ray.origin, ray.direction, Mathf.Infinity, collisionMask);

Object to follow mouse pointer unity and C#

I'm busy working on something that uses an RTS-style camera, and I want an object to follow the mouse cursor but stay at the same Y position at all times. The camera is located at (0, 15, -15).
I've been playing with it for a while now and this is the best I can come up with:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
Vector3 point = ray.GetPoint(5);
transform.position = point;
print(point);
Any help would would be appreciated.
You are on the right track, but you need to use that ray to raycast against something to get the world position.
Since it's an RTS, I assume the terrain is somewhat level, in which case it would be easy to place a plane at the desired height. If that's not the case, I recommend following #The Ryan's advice and storing the previous y value.
In either case, you need to put the thing you raycast against in a separate layer from your other stuff so you don't move things on top of other units:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
if (Physics.Raycast(ray, out hit, maxDistance, layerMask))
{
    float oldY = transform.position.y;
    // transform.position.Set(...) would only modify a copy of the struct,
    // so assign a new Vector3 instead.
    transform.position = new Vector3(hit.point.x, oldY, hit.point.z);
}
