Unity3D Digital Joystick for cursor movement - C#

Hey, I'm pretty new to C#.
I'm building a plane sim, and on PC you can control the plane with the cursor through mouse movement.
Now I want a digital joystick on mobile devices so I can control the plane there.
Does anybody have an idea how to code that?
Thanks

You can search for a tutorial online; a quick Google search leads to this video and hundreds more. On the Asset Store there are multiple controllers already made, for example this one made by Unity, which includes everything you need.
Next time, before asking a question, try googling what you need: 99 times out of 100, the thing you're trying to achieve has already been made over and over again ;)
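If you do want to roll your own, a minimal on-screen joystick only needs Unity's UI event interfaces. Here is a rough sketch (the class and field names are mine, not from any package): it drags a handle image around inside a background image and exposes a normalized direction you can feed into the same code that currently reads the mouse.

using UnityEngine;
using UnityEngine.EventSystems;

// Minimal virtual joystick sketch: attach to a UI Image used as the joystick
// background, with a child Image assigned as the movable handle.
public class VirtualJoystick : MonoBehaviour, IPointerDownHandler, IDragHandler, IPointerUpHandler
{
    [SerializeField] private RectTransform handle; // child image that follows the finger
    [SerializeField] private float radius = 75f;   // max handle travel in pixels

    // Normalized (-1..1) input, analogous to Input.GetAxis("Horizontal"/"Vertical").
    public Vector2 InputDirection { get; private set; }

    public void OnDrag(PointerEventData eventData)
    {
        var background = (RectTransform)transform;
        Vector2 localPoint;
        if (RectTransformUtility.ScreenPointToLocalPointInRectangle(
                background, eventData.position, eventData.pressEventCamera, out localPoint))
        {
            // Clamp the drag to the joystick radius and store the normalized direction.
            Vector2 clamped = Vector2.ClampMagnitude(localPoint, radius);
            InputDirection = clamped / radius;
            handle.anchoredPosition = clamped;
        }
    }

    public void OnPointerDown(PointerEventData eventData) { OnDrag(eventData); }

    public void OnPointerUp(PointerEventData eventData)
    {
        // Snap back to center when the finger lifts.
        InputDirection = Vector2.zero;
        handle.anchoredPosition = Vector2.zero;
    }
}

Then, in your plane controller, read InputDirection on mobile in the same place you currently read the mouse delta on PC.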

Related

Gesture recognition with Unity, Oculus Quest and the Oculus Quest Integration package

I'm trying to implement hand gesture recognition for Oculus Quest with Unity and the Unity Oculus integration package.
I've read the "Hand Tracking in Unity" documentation on the Oculus developer website, but they only talk about getting the current pinch of the fingers, which is not what I want:
https://developer.oculus.com/documentation/unity/unity-handtracking/
I thought about getting the flexion of each finger (as a value between 0 and 1, for example) and then training a k-NN model on those 5 features to recognize the nearest gesture. But I've been searching for hours and haven't found anything about getting finger positions; the only thing I found is getting the pinch.
By looking in the OVRSkeleton.cs file (from the Oculus Integration package), I've been able to get the current Transform for each bone (so the position as a vector and the rotation as a quaternion), but I don't really know how to calculate or estimate the finger flexion from that (or anything else useful for gesture recognition):
OVRSkeleton skeleton = GetComponent<OVRSkeleton>();
Vector3 bonePosition = skeleton.Bones[(int)OVRPlugin.BoneId.Hand_Index1].Transform.position;
Quaternion boneRotation = skeleton.Bones[(int)OVRPlugin.BoneId.Hand_Index1].Transform.rotation;
The list of bone IDs is on the "Hand Tracking in Unity" documentation page.
In fact, what I want to implement seems to look exactly like this package:
https://assetstore.unity.com/packages/tools/integration/vr-hand-gesture-recognizer-oculus-quest-hand-tracking-168685
Any help, ideas or comments about how to calculate fingers flexion, or any other solution to implement gesture recognition would be greatly appreciated!
Thanks
A few things/links I've explored so far:
https://www.reddit.com/r/OculusQuest/comments/elrn7a/unity_hand_tracking_and_different_gestures/
https://forums.oculusvr.com/developer/discussion/89615/detect-custom-hand-gestures
https://github.com/jorgejgnz/HandTrackingGestureRecorder is something I'm trying currently; the docs don't say it has a dependency, but apparently it does, and I don't have it working yet.
You can access the bone rotations at runtime and compare those to a recorded gesture. I think it's closer to "template matching" than machine learning: measuring the error between two poses.
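To make that concrete, here is a rough sketch of the template-matching idea (the class name, field names, and the error threshold are mine and would need tuning). It records the hand's local bone rotations once, then sums the angular error between the recorded pose and the live skeleton:

using System.Collections.Generic;
using UnityEngine;

// Sketch of "template matching" gesture recognition: store one reference pose,
// then accept the gesture whenever the total rotation error is small enough.
public class GestureMatcher : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton;
    [SerializeField] private float maxTotalAngleError = 90f; // degrees, tune per gesture

    private List<Quaternion> recordedPose;

    // Call this once while holding the pose you want to recognize.
    public void RecordPose()
    {
        recordedPose = new List<Quaternion>();
        foreach (OVRBone bone in skeleton.Bones)
            recordedPose.Add(bone.Transform.localRotation); // local = independent of hand position
    }

    // Call this every frame (or on demand) to test the live hand against the template.
    public bool MatchesRecordedPose()
    {
        if (recordedPose == null || skeleton.Bones.Count != recordedPose.Count)
            return false;

        float totalError = 0f;
        for (int i = 0; i < recordedPose.Count; i++)
            totalError += Quaternion.Angle(recordedPose[i], skeleton.Bones[i].Transform.localRotation);

        return totalError < maxTotalAngleError;
    }
}

If you'd rather pursue the k-NN idea from the question, summing only one finger's bone angles against an open-hand reference would give you a crude per-finger flexion feature in much the same way.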

How do you code Unity UI button touch controls to receive an input?

I have right, left, and up buttons on the screen (it's a 2D game). I am trying to figure out how to write a script so that when the user clicks the buttons, the game receives the input and moves the player right, left, or up. I have watched some tutorials on YouTube, but they are from a couple of years ago, are out of date, and don't work for me. Could anyone send me a script or tell me how to code it (C#) so that it can recognise the touch input?
The functions OnMouseDown, OnMouseUp, OnMouseDrag, OnMouseUpAsButton, etc. work well not only on PC but on Android, too.
But don't forget that you must add a BoxCollider2D to the object to make these functions work :)
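A minimal sketch of that approach (the script and field names are mine): give each button sprite a BoxCollider2D and let OnMouseDown/OnMouseUp toggle a flag that moves the player while the button is held.

using UnityEngine;

// Attach to each on-screen button sprite (it needs a BoxCollider2D, as noted
// above). OnMouseDown/OnMouseUp fire for touches on Android as well as clicks.
public class HoldButton : MonoBehaviour
{
    [SerializeField] private Transform player;                  // the object to move
    [SerializeField] private Vector2 direction = Vector2.right; // set per button: right, left, or up
    [SerializeField] private float speed = 5f;

    private bool held;

    private void OnMouseDown() { held = true; }
    private void OnMouseUp()   { held = false; }

    private void Update()
    {
        if (held)
            player.Translate(direction * speed * Time.deltaTime);
    }
}

If you use Unity UI Buttons instead of sprites, note that Button.onClick only fires on release; for press-and-hold movement, add an EventTrigger with PointerDown and PointerUp entries instead.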

Different images from different points of view

I want different images to be displayed from different points of view. For the whole concept explanation, please look at the images; they explain my idea/query!
As in the first image, you can see three people looking at the monitor from different angles. Now I want the webcam to track their eyes and show a particular, defined image to each user. For example: if the user is at a 45-degree angle, show image1.png.
Depending on the user's viewing perspective, the computer should show the corresponding image.
(the lady is the game character, for representation purposes)
Can you please guide me on what steps I can take to accomplish this? Is there any plugin available for Unity that tracks faces? Please guide me.
Also thanks for the compliments on my sketching skills xD
Stack Overflow is not really meant for plugin recommendations, since the choice is usually opinion-based, so there is no exact answer.
That being said, one of the most commonly used APIs for computer vision (meaning interpreting images, including face recognition) is OpenCV, so that could be a good starting point.
And fortunately for you, there is a Unity plugin for OpenCV.
It is too broad to explain here how it all works. You should try to get it running, and if you have a problem with your code, open a new question with the portion you're struggling with.
PS: nice sketching skills
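Whichever tracker you end up with, the Unity side boils down to mapping the viewer's head position to a horizontal angle and swapping the sprite. A sketch, assuming some tracker (OpenCV, Kinect, ...) already gives you the head position in world space (all names here are mine):

using UnityEngine;

// Sketch: pick an image based on the viewer's horizontal angle to the screen.
// Assumes an external tracker supplies the viewer's head position.
public class ViewDependentImage : MonoBehaviour
{
    [SerializeField] private SpriteRenderer display;
    [SerializeField] private Sprite leftImage, centerImage, rightImage;
    [SerializeField] private Transform screenCenter; // the monitor's position/orientation

    public void UpdateView(Vector3 viewerHeadPosition)
    {
        Vector3 toViewer = viewerHeadPosition - screenCenter.position;
        toViewer.y = 0f; // ignore height; we only care about the horizontal angle

        // Signed angle between the screen's forward axis and the viewer, around the up axis.
        float angle = Vector3.SignedAngle(screenCenter.forward, toViewer, Vector3.up);

        if (angle < -30f)     display.sprite = leftImage;   // e.g. viewer at -45 degrees
        else if (angle > 30f) display.sprite = rightImage;  // e.g. viewer at +45 degrees
        else                  display.sprite = centerImage;
    }
}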
Perhaps an easier option would be to use a Kinect (trying to detect faces or eyes from that far might be shaky?). With Kinect you can get skeletons for multiple people, and getting the angle between the target and those Kinect avatars would be easy. If there is no space to put the Kinect in a good position, you could consider placing it on the ceiling above (and then use only the depth data to detect people in its view). The only issue is that Microsoft has apparently stopped Windows Kinect support, so you would need to find second-hand units (the Unity Asset Store still has some Kinect plugins and examples available):
https://www.polygon.com/2018/1/2/16842072/xbox-one-kinect-adapter-out-of-stock-production-ended
Or look for Kinect alternatives that work with Unity; for example, try RealSense cameras:
https://www.intel.sg/content/www/xa/en/architecture-and-technology/realsense-overview.html

Get user inputs from the webcam for the game

I'm creating a simple game in Unity which uses the arrow keys to move the player. Now what I want to do is use the webcam as a movement-detecting device, track the user's movements, and move the player accordingly. (For example, when the user moves his hand to the right, the webcam tracks it and moves the player to the right...)
So, is this possible? If so, what are the techniques and APIs I should use for this?
Thanks!
Have a look at OpenCV; it is used a lot in the field of body and head tracking, and there's a Unity plugin which implements it that might be useful.
Video Demo
Out of the box, it can't. But there is a lot of stuff out there on the internet.
This one has some interesting-looking links.
Emgu CV looks interesting too.
There is a JavaScript hand-tracking tool too.
And of course there's Kinect, but you need the 3D sensor.
You could also use Leap Motion.
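If you just want to experiment without any plugin, Unity's built-in WebCamTexture is enough for very crude motion detection: compare successive frames and move toward whichever half of the image changed more. A rough sketch (the resolution and thresholds are guesses to tune):

using UnityEngine;

// Very crude webcam motion detection with Unity's built-in WebCamTexture:
// whichever half of the frame changed more since the last frame "wins".
public class WebcamMotionInput : MonoBehaviour
{
    private WebCamTexture cam;
    private Color32[] previous;

    public float Horizontal { get; private set; } // -1 (left), 0, or 1 (right)

    private void Start()
    {
        cam = new WebCamTexture(160, 120); // low resolution keeps GetPixels32 cheap
        cam.Play();
    }

    private void Update()
    {
        if (!cam.didUpdateThisFrame) return;

        Color32[] current = cam.GetPixels32();
        if (previous != null && previous.Length == current.Length)
        {
            long left = 0, right = 0;
            int width = cam.width;
            for (int i = 0; i < current.Length; i++)
            {
                int diff = Mathf.Abs(current[i].r - previous[i].r); // one channel is enough here
                if (i % width < width / 2) left += diff; else right += diff;
            }
            // Depending on the camera, you may need to flip left/right (mirroring).
            Horizontal = left > right * 2 ? -1f : right > left * 2 ? 1f : 0f;
        }
        previous = current;
    }
}

Real hand tracking needs one of the libraries above; this only tells you that something moved on one side of the frame.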

GameObject's position is different between Unity and iPhone (5)

I'm making a simple 2D game in Unity. A gameobject's position is different in the Unity editor than on my iPhone. For example, if I want to place a gameobject in the center, I need to place it a little further up before it appears centered on my iPhone. And I'm using the iPhone 5 resolution in the Unity editor. Hope you guys can help or explain why this happens.
Check the offsets of the game object's parents. It's possible that the outermost object isn't at (0, 0).
Otherwise, we're probably going to need more information to help you: screenshots, code snippets, or something.
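If the offset turns out to come from aspect-ratio differences rather than a parent transform, one workaround is to place the object relative to the camera's viewport instead of hard-coding world coordinates. A sketch using ViewportToWorldPoint (the script name is mine):

using UnityEngine;

// Sketch: keep an object at a fixed viewport position, e.g. (0.5, 0.5) for the
// screen center, regardless of the device's resolution or aspect ratio.
public class ViewportAnchor : MonoBehaviour
{
    [SerializeField] private Vector2 viewportPosition = new Vector2(0.5f, 0.5f);
    [SerializeField] private float distanceFromCamera = 10f;

    private void Start()
    {
        transform.position = Camera.main.ViewportToWorldPoint(
            new Vector3(viewportPosition.x, viewportPosition.y, distanceFromCamera));
    }
}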
