Object to follow mouse pointer in Unity and C#

I'm busy working on something that uses an RTS-style camera, and I want an object to follow the mouse cursor but stay at the same Y position at all times. The camera is located at (0, 15, -15).
I've been playing with it for a while now and this is the best that I can come up with:
Ray ray = Camera.mainCamera.ScreenPointToRay(Input.mousePosition);
Vector3 point = ray.GetPoint(5);
transform.position = point;
print (point);
Any help would be appreciated.

You are on the right track, but you need to use that ray to raycast against something to get a world position.
Since it's an RTS, I assume the terrain is somewhat level, in which case it would be easy to place a plane at the desired height. If that's not the case, I recommend following The Ryan's advice and storing the previous Y value.
In either case, put the thing you raycast against in a separate layer from your other objects so you don't end up moving things on top of other units.
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
// maxDistance and layerMask are fields you set elsewhere
if (Physics.Raycast(ray, out hit, maxDistance, layerMask))
{
    float oldY = transform.position.y;
    // transform.position.Set() only modifies a temporary copy of the struct,
    // so assign a new Vector3 instead
    transform.position = new Vector3(hit.point.x, oldY, hit.point.z);
}
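If the terrain really is flat, an alternative sketch that needs no collider at all is to intersect the ray with a mathematical Plane at the desired height (the height field and class name here are just placeholders for this example):

using UnityEngine;

public class FollowMouseAtHeight : MonoBehaviour
{
    public float height = 0f; // the fixed Y level the object should stay on

    void Update()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        // A horizontal plane facing up, passing through (0, height, 0)
        Plane ground = new Plane(Vector3.up, new Vector3(0f, height, 0f));
        float enter;
        if (ground.Raycast(ray, out enter))
        {
            transform.position = ray.GetPoint(enter);
        }
    }
}

Because nothing physical is hit, this variant also sidesteps the layer problem entirely.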

Related

How to move an object on an inclined surface with mouse? UNITY3D

This is what I want: I have a surface that is at an angle (for example, the roof of a house), on which I need to move an object with the mouse.
Here is my code. It works, but a bit differently from what I want:
Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hit;
if (Physics.Raycast(ray, out hit, 50f, _layer))
{
    _objectToPlace.position = hit.point;
    _objectToPlace.rotation =
        Quaternion.FromToRotation(Vector3.up, hit.normal);
}
1) The object teleports to the clicked location; instead, I want it to move there.
2) I want to be able to set the object's movement speed.
Basically, this is what I want to solve! I will be grateful for any help!
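A possible sketch for both points: keep the same raycast, but step toward the hit point with Vector3.MoveTowards instead of assigning the position directly. moveSpeed is an assumed new field; _objectToPlace and _layer are the fields from the snippet above.

[SerializeField] private float moveSpeed = 3f; // units per second, tweak in the Inspector

void Update()
{
    Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    RaycastHit hit;
    if (Physics.Raycast(ray, out hit, 50f, _layer))
    {
        // Move a limited distance per frame instead of teleporting
        _objectToPlace.position = Vector3.MoveTowards(
            _objectToPlace.position, hit.point, moveSpeed * Time.deltaTime);

        // Ease the rotation toward the surface normal as well
        Quaternion targetRotation = Quaternion.FromToRotation(Vector3.up, hit.normal);
        _objectToPlace.rotation = Quaternion.RotateTowards(
            _objectToPlace.rotation, targetRotation, 360f * Time.deltaTime);
    }
}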

Why is my GameObject traveling along the ray I am casting when I set transform.position = hit.point? Unity 3D

I am trying to get an object to follow my mouse along the surfaces in my game. There is a floor and other objects in the game. The object should always be on the floor or on one of the other objects. What I want is for the object's position to change to where the mouse is moved. I am doing this by casting a ray from the mouse. When the ray hits a collider the object's position should change to where it hit.
When I hit play, the object moves along the ray that is being cast, starting at the collision (hit.point) and traveling towards the camera. Once it reaches the camera it starts over at hit.point and repeats. When I move the mouse, the object's position changes to the new hit.point and again moves toward the camera. The value of hit.point changes in the dev log when I move the mouse, as it should, and when the mouse stays still the value does not change, even though the object is still moving towards the camera. Why is this happening? It seems like, because transform.position is set to hit.point and never changed anywhere else in the code, the position should always be hit.point.
This is the code attached to the object that is moving:
using UnityEngine;

public class FollowCursor : MonoBehaviour
{
    void Update()
    {
        RaycastHit hit;
        Ray ray;
        int layerMask = 1 << 8;
        layerMask = ~layerMask;
        ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out hit))
            transform.position = hit.point;
        Debug.Log(hit.point);
    }
}
This is a screenshot of my hierarchy and scene. The white sphere is the object that is moving. The floor I referred to is the sand at the bottom of the tank. I want the object to be on the sand or one of the other objects. Layer 8 is the tank itself, which is being excluded from the raycast because I do not want the object to be on the tank.
Because your GameObject has a collider too.
The ray hits the ground. The object goes there.
The next frame, the ray hits the object itself. The object moves there (the ray hits the object's surface, but its pivot is in the center, so it moves towards the camera by an amount equal to the radius of the collider, which looks like 0.5 units based on your screenshot).
Repeat.
The best way to fix this is to put your ground on its own layer and raycast only against the layers you want the ray to hit using a LayerMask.
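A rough sketch of that fix, assuming you create a layer called "Ground" and put the sand/floor objects on it (the layer name is just an example):

using UnityEngine;

public class FollowCursor : MonoBehaviour
{
    void Update()
    {
        // Only consider colliders on the "Ground" layer, so the ray can never
        // hit this object's own collider and walk it toward the camera.
        int groundMask = LayerMask.GetMask("Ground");
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, Mathf.Infinity, groundMask))
        {
            transform.position = hit.point;
        }
    }
}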

Raycast from one object to side of the other object

I have an AI (the red circle) which should shoot at my player (the blue circle).
Currently I'm using a normal raycast:
Ray ray = new Ray(transform.position, transform.forward);
which gives me the purple line.
Now, when there is a corner (like in the image), the ray doesn't hit the player, but it could if it were aimed a bit more to the side.
I've seen a solution that adds a fixed angle to the raycast, but that doesn't work for me, because the required angle changes depending on whether the player is near or far from the AI.
What I need is for the AI to cast rays from itself to the sides (left and right) of the player, but I don't know how to do that.
Your player certainly has some kind of radius, no?
Using the distance between the player and the shooting AI, the player's radius, and some basic trigonometry, you can generate two rays to test whether the shooting AI sees the player's "right side" as well as his "left side".
Vector3 direction = player.transform.position - shootingAI.transform.position;
float distance = direction.magnitude;
float playerRadius = player.radius; // this is a variable you need to set
// trigonometry: tan(angle) = oppositeSideLength / adjacentSideLength
// Mathf.Atan returns radians, so convert to degrees for Quaternion.Euler
float angleToRotate = Mathf.Atan(playerRadius / distance) * Mathf.Rad2Deg;
Vector3 rayDirection1 = Quaternion.Euler(0f, -angleToRotate, 0f) * direction;
Vector3 rayDirection2 = Quaternion.Euler(0f, angleToRotate, 0f) * direction;
Using these two directions, you can cast one ray for the "right side" and one for the "left side".
You can find a good answer HERE.
Or you could use a SphereCast and shoot at its contact point.
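For the SphereCast idea, a sketch might look like the following; playerRadius, maxShootDistance, and Shoot() are assumed names for this example:

// "Thicken" the ray by the player's radius so it can clip the player's side
// even when a straight center-to-center ray is blocked by the corner.
Vector3 direction = (player.transform.position - shootingAI.transform.position).normalized;
RaycastHit hit;
if (Physics.SphereCast(shootingAI.transform.position, playerRadius, direction, out hit, maxShootDistance))
{
    if (hit.transform == player.transform)
    {
        // hit.point lies on the player's visible surface, so aim there
        Shoot(hit.point); // Shoot() is a hypothetical helper on the AI
    }
}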

Unity3D - Move an object to the opposite direction by clicking on it

I'm new in Unity and I'm working on a simple project but I'm stuck.
So my idea is that when the player touches the bottom side of a ball, it goes in the opposite direction. Pretty much like in real life: if you kick a ball on the bottom side it will go up, but in a 2D version.
I tried finding the opposite point of where the player touched the ball, making a vector by subtracting the original point from the opposite point, and then applying force to move the ball, but it doesn't work.
void MoveBall()
{
    x = mouseClickPosition.x;
    y = mouseClickPosition.y;
    oppositeClickPosition.x = -x;
    oppositeClickPosition.y = -y;
    Vector2 direction = oppositeClickPosition - mouseClickPosition;
    rb.AddForce(direction * force, ForceMode.Impulse);
}
You have to subtract the clicked position from the center of the ball.
Vector3 clickedPosition = Camera.main.ScreenToWorldPoint (Input.mousePosition);
Vector3 ballCenter = ballTransform.position;
// If this goes the opposite direction just swap the values.
Vector3 forceDirection = (ballCenter - clickedPosition).normalized;
ballRigidbody.AddForce (forceDirection * strength, ForceMode.Impulse);
This way you find the direction and add whatever force you want; skip the normalization if you also want the distance to factor into the force.
You can catch that event properly in OnMouseDown if the ball has a collider:
https://docs.unity3d.com/ScriptReference/MonoBehaviour.OnMouseDown.html
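A small sketch of how those two pieces could fit together in 2D, assuming the ball has a Rigidbody2D and a collider (strength and the class name are placeholders):

using UnityEngine;

public class KickBall : MonoBehaviour
{
    [SerializeField] private float strength = 5f;
    private Rigidbody2D rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody2D>();
    }

    // Called by Unity when the ball's collider is clicked
    void OnMouseDown()
    {
        Vector2 clickedPosition = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        // Push the ball away from the clicked point, i.e. toward its own center
        Vector2 forceDirection = ((Vector2)transform.position - clickedPosition).normalized;
        rb.AddForce(forceDirection * strength, ForceMode2D.Impulse);
    }
}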
This may be helpful:
https://answers.unity.com/questions/681698/vector-between-two-objects.html
It explains how to do it in 3D, but the idea applies to both.
In your case, instead of oppositeClickPosition you should use the position of the ball:
Vector2 direction = transform.position - mouseClickPosition;
(replace transform.position with the position of the ball if this script is not attached to the ball)
Hope that helps!

Texture Annotation using OVRInput Touch Controller Raycast in Unity5

I am attempting to do something seemingly simple using the Oculus VR packages OVRAvatar, OVRCameraRig, and OVRInput in Unity 5.6.1 and Visual Studio C#.
Goal: I want to get the raycast from my Oculus touch controller's pointer finger, and intercept a plane to apply pixels to the plane's texture.
I was able to get the camera's raycast to intercept the plane using OVRCameraRig like so:
void Update() {
    RaycastHit hit;
    Vector3 fwd = ovrCamera.transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(ovrCamera.transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit the plane!");
            // apply some pixels to plane texture
            // setPixels(plane.texture)
        }
    }
    // EDIT: I changed the last argument from 1 to 100,000 and now I am able to see the rays drawn in the scene view only.
    // NOTE: this line below does not work in Unity's scene view or game view
    Debug.DrawRay(ovrCamera.transform.position, ovrCamera.transform.forward, Color.black, 1);
}
Aside from the Debug.DrawRay call not working, the code that checks the OVR camera's raycast against the plane does in fact work.
However, now that I am trying to test the raycast hit from OVRInput.Controller.RTouch, I am unable to see the drawn ray, and the hit collider check is never triggered.
I have added this script below as a component to OVRCameraRig's RightHandAnchor:
void Update() {
    RaycastHit hit;
    Vector3 fwd = transform.TransformDirection(Vector3.forward);
    if (Physics.Raycast(transform.position, fwd, out hit, 10000))
    {
        if (hit.collider.tag == "Plane")
        {
            print("Hit plane with touch controller's raycast!");
        }
    }
    // EDIT: I changed the last argument from 1 to 100,000 and now I am able to see the rays drawn in the scene view only.
    // NOTE: this line below does not work in Unity's scene view or game view
    Debug.DrawRay(transform.position, transform.forward, Color.black, 1);
}
Needless to say, the Debug.DrawRay call does not work here either, and I can't tell whether the raycasts from the touch controllers are actually being cast.
Any help or insight is appreciated, thanks!
Thanks to Draco in the comments for referring me to the LineRenderer class.
Here is a link to a script that helped me draw ray lines in world space in the game view: http://answers.unity3d.com/questions/1142318/making-raycast-line-visibld.html
I also changed the last Debug.DrawRay argument from 1 to 100,000 and the color from black to blue, and now I can see the rays drawn in the scene view (only):
Debug.DrawRay(transform.position, transform.forward, Color.blue, 100000);
Also, I am now able to track the touch controller's ray by applying the second script above to the game object LocalAvatar > controller_right.
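For reference, a minimal sketch of the LineRenderer approach the linked script uses, so the ray shows up in the game view too (the class name and rayLength are placeholders; attach it next to the raycasting script and add a LineRenderer component):

using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class RayVisualizer : MonoBehaviour
{
    public float rayLength = 10000f;
    private LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2; // two points: ray start and ray end
    }

    void Update()
    {
        Vector3 end = transform.position + transform.forward * rayLength;
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, rayLength))
        {
            end = hit.point; // clamp the line to whatever the ray hits
        }
        line.SetPosition(0, transform.position);
        line.SetPosition(1, end);
    }
}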
