Is it possible to detect finger movements with the Kinect? I am able to detect the skeleton, move the mouse, and perform a click based on the other hand's location. I would like to implement the 'mouse click' using finger movements.
Is this possible with the Microsoft Kinect SDK or with other, similar open-source projects?
Thanks.
Currently this is only possible with a hack; there is no official setting or API for it, but you can analyze the image data and find the fingers yourself.
Have a look here
Related
I'm trying to implement hand gesture recognition for the Oculus Quest with Unity and the Unity Oculus Integration package.
I've read the "Hand Tracking in Unity" documentation on the Oculus developer website, but it only covers getting the current pinch state of the fingers, which is not what I want:
https://developer.oculus.com/documentation/unity/unity-handtracking/
I thought about computing a flexion value for each finger (between 0 and 1, for example) and then training a k-NN model on those 5 features to recognize the nearest gesture. But I've been searching for hours and haven't found anything about getting finger positions; the only thing I found is the pinch.
By looking in the OVRSkeleton.cs file (from the Oculus Integration package), I've been able to get the current Transform for each bone (the position as a vector and the rotation as a quaternion), but I don't really know how to calculate or estimate the finger flexion from that (or anything else useful for gesture recognition):
// Transform of the index finger's proximal bone (position and rotation)
OVRSkeleton skeleton = GetComponent<OVRSkeleton>();
Vector3 bonePosition = skeleton.Bones[(int) OVRPlugin.BoneId.Hand_Index1].Transform.position;
Quaternion boneRotation = skeleton.Bones[(int) OVRPlugin.BoneId.Hand_Index1].Transform.rotation;
The list of bone IDs is on the "Hand Tracking in Unity" documentation page.
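Here is a rough, untested sketch of what I mean by flexion: sum the bend angles between consecutive bone rotations of one finger and normalize. The 180-degree full-curl total is just a guess and would need calibration:

// Estimate index-finger flexion from the angles between adjacent bone rotations.
float GetIndexFlexion(OVRSkeleton skeleton)
{
    // Proximal, middle and distal bones of the index finger
    OVRPlugin.BoneId[] chain =
    {
        OVRPlugin.BoneId.Hand_Index1,
        OVRPlugin.BoneId.Hand_Index2,
        OVRPlugin.BoneId.Hand_Index3,
    };

    float totalAngle = 0f;
    for (int i = 0; i < chain.Length - 1; i++)
    {
        Quaternion a = skeleton.Bones[(int)chain[i]].Transform.rotation;
        Quaternion b = skeleton.Bones[(int)chain[i + 1]].Transform.rotation;
        totalAngle += Quaternion.Angle(a, b); // bend between adjacent bones
    }
    return Mathf.Clamp01(totalAngle / 180f); // 0 = straight, 1 = fully curled
}

Would something like this be a sensible basis for the k-NN features?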
In fact, what I want to implement seems to look exactly like this package:
https://assetstore.unity.com/packages/tools/integration/vr-hand-gesture-recognizer-oculus-quest-hand-tracking-168685
Any help, ideas or comments about how to calculate fingers flexion, or any other solution to implement gesture recognition would be greatly appreciated!
Thanks
A few things/links I've explored so far:
https://www.reddit.com/r/OculusQuest/comments/elrn7a/unity_hand_tracking_and_different_gestures/
https://forums.oculusvr.com/developer/discussion/89615/detect-custom-hand-gestures
https://github.com/jorgejgnz/HandTrackingGestureRecorder is something I'm currently trying. The docs don't mention a dependency, but apparently it has one, and I don't have it working yet.
You can access the bone rotations at runtime and compare them to a recorded gesture. I think it's closer to "template matching" than machine learning: you measure the error between the two poses.
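A minimal sketch of that idea (untested; it assumes you have already recorded the reference pose as an array of bone rotations, and the 15-degree average-error threshold is a guess):

using UnityEngine;

public static class PoseMatcher
{
    // Returns true when the current hand pose is close enough to the recorded one.
    public static bool Matches(OVRSkeleton skeleton, Quaternion[] recordedPose,
                               float maxAverageErrorDegrees = 15f)
    {
        float totalError = 0f;
        for (int i = 0; i < recordedPose.Length; i++)
        {
            Quaternion current = skeleton.Bones[i].Transform.rotation;
            totalError += Quaternion.Angle(current, recordedPose[i]); // per-bone angular error
        }
        return totalError / recordedPose.Length <= maxAverageErrorDegrees;
    }
}

Comparing rotations relative to the wrist, rather than the world rotations used here, would make the match independent of where the hand is held.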
I'm creating a simple game in Unity that uses the arrow keys to move the player. What I want to do is use the webcam as a motion-detection device: track the user's movements and move the player accordingly. (For example, when the user moves his hand to the right, the webcam tracks it and the player moves to the right...)
So, is this possible? If so, what techniques and APIs should I use?
Thanks!
Have a look at OpenCV; it is used a lot in the field of body and head tracking, and there's a Unity plugin that implements it which might be useful.
Video Demo
Unity can't do this by itself, but there is a lot of stuff out there on the internet.
This one has some interesting-looking links.
Emgu CV looks interesting too.
There is also a JavaScript hand-tracking tool.
And of course there's the Kinect, but you need its 3D sensor.
You could also use Leap Motion.
I'm currently detecting the whole skeleton in a WPF application, and I want to know how to detect the fingers so they appear with the skeleton. I'm using the Microsoft Kinect for Windows SDK v1.5.
Many thanks
Unfortunately the Kinect is not sensitive enough to recognize fingers, so the library does not provide them as part of the skeleton. Maybe the Kinect 2.0, rumored to come out with the Xbox 720, will be able to provide that level of detail.
Candescent NUI might be what you're looking for. As OpenUserX03 said, however, the Kinect isn't ideal for this task. Perhaps you should have a look at the upcoming Leap Motion technology, which specializes in finger detection.
The cameras on the Kinect are not meant to do joint tracking for the hands at that level of detail. Tracking the individual fingers is possible but won't be very reliable. To represent a player's hand in the skeleton, you can instead check whether the hand is open or closed. A possible way to do this is with pixel checks in an area surrounding the hand: with some tuning you can calculate how much of that area is the hand (using the depth and color streams) and how much is not. For example: if 40% of that area is at the same depth as the hand joint, the hand is closed in a fist; if 70% is at the same depth, the hand is open. Then you could use the angles of the elbow and wrist joints to render a closed or open hand at that angle on the skeleton. A rough sketch of the depth check follows below.
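Something like this, using the SDK v1.5 depth stream (untested; the box radius, depth tolerance, and the 40%/70% thresholds are illustrative and need tuning per setup). depthPixels is the raw short[] copied from a DepthImageFrame, and handPoint comes from sensor.CoordinateMapper.MapSkeletonPointToDepthPoint(handJoint.Position, depthFormat):

using System;
using Microsoft.Kinect;

// Percentage of a box around the hand joint that sits at roughly the hand's depth.
static int HandDepthCoverage(short[] depthPixels, int width, int height,
                             DepthImagePoint handPoint,
                             int boxRadius, int toleranceMm)
{
    int inBand = 0, total = 0;
    for (int y = Math.Max(0, handPoint.Y - boxRadius); y < Math.Min(height, handPoint.Y + boxRadius); y++)
    {
        for (int x = Math.Max(0, handPoint.X - boxRadius); x < Math.Min(width, handPoint.X + boxRadius); x++)
        {
            // The low bits hold the player index; shift to get depth in millimeters.
            int depthMm = depthPixels[y * width + x] >> DepthImageFrame.PlayerIndexBitmaskWidth;
            if (Math.Abs(depthMm - handPoint.Depth) <= toleranceMm)
                inBand++;
            total++;
        }
    }
    return (100 * inBand) / total;
}

Per the numbers above, a result near 40% would suggest a fist and a result near 70% an open hand.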
I want to move an avatar based on the player's movements using Kinect and Unity. Are there any good tutorials?
We are using Unity and the Kinect interface to create a simple application: based on the movement of the player, we need to move the avatar.
We are supposed to use Unity with GAKUNITY, and no OpenNI or other third-party tools.
Are there any good tutorials for GAKUnity with Kinect?
GAK means Gadget Accelerator Kit.
We just want to move an avatar with the player's movements in front of the Kinect. Even help with hand movement would be highly appreciated.
You can also share useful links or books regarding Unity and Kinect programming.
Custom Kinect Gesture Recognition using OpenNI and Unity3D (kinect.dashhacks.com)
A tipster has tipped a lil tipple my way in the form of custom gesture recognition for the Kinect using OpenNI and Unity 3D. What this allows you to do is create your own custom gestures that are recognized by the Kinect through the software interface.
Unity and Kinect tutorial (nightmarekitty.com)
For this chapter, we are going to be using a very popular game engine called Unity. By integrating OpenNI, NITE and Sensor Kinect into Unity we will control a 3D character and multiple user interfaces. After we cover the main components, we will build an example of each from the bottom up.
Kinect Wrapper Example Project (wiki.etc.cmu.edu)
This scene shows you how a skeleton is generated/tracked by placing spheres at each of the bones tracked by the Kinect, and how to use the Kinect to control your model. Use it to get a feel for what the Kinect is capable of. It also shows you how to prepare your GameObjects.
I am not familiar with Unity 3D, but you can try using the Kinect with XNA. In the latest Kinect for Windows SDK 1.5 and Developer Toolkit 1.5.1 there is a sample demonstrating how to interact with a 3D avatar using Kinect and XNA; you can find more information at http://msdn.microsoft.com/en-us/library/jj131041
I have no idea about GAKUNITY, but you can use the zigfu plugin for Unity3D for your project. It comes with sample code and characters you can use for reference, and the sample code is nicely commented.
You can just import your bones and drag them onto the ZigSkeleton script. Read through the ZigSkeleton script and you will understand the functions and structures it uses.
You can also refer to the book Meet the Kinect, which explains how to use zigfu in Unity.
I am developing a 3D project and would like to include the following feature:
As my webcam watches my face, if I move to the left or the right, the project's camera moves to the left or the right to create a "look around the corner" effect.
Does anyone know of a face detection project in .NET/C#?
You can use OpenCV for .NET: there is a wrapper for .NET which also comes with a sample application doing face detection on images, which is easy to adapt to the camera if you can grab frames from it.
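A minimal sketch with Emgu CV, one such .NET wrapper (this assumes Emgu CV 3 or later, and that the Haar cascade XML file is copied next to the executable as in the Emgu samples):

using System;
using System.Drawing;
using Emgu.CV;
using Emgu.CV.CvEnum;

class FaceDetectSketch
{
    static void Main()
    {
        // Haar cascade file ships with OpenCV/Emgu; the path here is an assumption.
        CascadeClassifier cascade = new CascadeClassifier("haarcascade_frontalface_default.xml");
        using (VideoCapture capture = new VideoCapture(0)) // default webcam
        using (Mat frame = capture.QueryFrame())           // grab a single frame
        using (Mat gray = new Mat())
        {
            CvInvoke.CvtColor(frame, gray, ColorConversion.Bgr2Gray);
            Rectangle[] faces = cascade.DetectMultiScale(gray, 1.1, 4);
            foreach (Rectangle face in faces)
                Console.WriteLine("Face at {0},{1} size {2}x{3}",
                                  face.X, face.Y, face.Width, face.Height);
        }
    }
}

The horizontal offset of the face rectangle's center from the frame center is exactly the signal you need to slide your project's camera left or right.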
Are you familiar with web services? Depending on how often you need to scan for faces in your webcam stream, you could grab a frame, send it to a face-detection web service, and have it return the coordinates of the faces in that frame.
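If you go that route, the round trip could look something like this; the endpoint URL and the JSON response format are made up for illustration, since each service defines its own API:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class FaceServiceSketch
{
    static async Task Main()
    {
        // A captured webcam frame, already encoded as JPEG.
        byte[] jpegFrame = System.IO.File.ReadAllBytes("frame.jpg");
        using (HttpClient http = new HttpClient())
        using (ByteArrayContent content = new ByteArrayContent(jpegFrame))
        {
            content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
            // Hypothetical endpoint; a real service documents its own URL and format.
            HttpResponseMessage response = await http.PostAsync("https://example.com/detect-faces", content);
            string json = await response.Content.ReadAsStringAsync();
            Console.WriteLine(json); // e.g. [{"x":120,"y":80,"width":64,"height":64}]
        }
    }
}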
You could use http://detection.myvhost.de/ because it's free!
Good luck ;)