How to receive input from an HTC Vive tracker in Unity? - c#

I am attaching an HTC Vive tracker to a real-life object so I can use that object in game. The tracker is found in the game itself and its movement and rotation update perfectly fine, but getting the input to work is the problem. The tracker's pins are connected correctly and the input can be seen working in the Input Debugger of the SteamVR Input Binding tool.
I have tried to find help on the internet but everything seems to be outdated. The controllers themselves do work with the custom input function I have added, but the tracker refuses to work, and there are no errors at all. The code simply calls a Shoot function to fire a bullet out of a gun. The input is received by both controllers, but the tracker, which has the exact same settings as the controllers, doesn't seem to work.
[SerializeField] private GunScript gunScript;
[SerializeField] private SteamVR_Action_Boolean input;

void Update()
{
    if (input.stateDown)
    {
        gunScript.Shoot(gunScript.ShotTransform.rotation);
    }
}
The current output does shoot the gun when using the trigger set up in the Input Binding tool with a normal controller, but when pressing the trigger attached to the tracker nothing happens, and there are no errors either.

// on Start: find the tracker among the connected XR devices
// (needs: using System.Collections.Generic; using System.Linq; using UnityEngine.XR;)
var allDevices = new List<InputDevice>();
InputDevices.GetDevices(allDevices);
InputDevice tracker = allDevices.FirstOrDefault(d => d.role == InputDeviceRole.HardwareTracker);

// on Update: read the tracker pose
tracker.TryGetFeatureValue(CommonUsages.devicePosition, out var pos);
tracker.TryGetFeatureValue(CommonUsages.deviceRotation, out var rot);
// NOTE!!! pos and rot are in tracking-space coordinates, so you have to translate them relative to your floor/rig origin
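To place the in-game object with that pose, a minimal sketch could look like this - assuming trackedObject is the Transform of the in-game prop and rigRoot is the Transform of your rig/tracking origin (both names are illustrative, not part of the snippet above):

if (tracker.TryGetFeatureValue(CommonUsages.devicePosition, out var pos) &&
    tracker.TryGetFeatureValue(CommonUsages.deviceRotation, out var rot))
{
    // Convert the tracking-space pose into scene space via the rig root
    trackedObject.position = rigRoot.TransformPoint(pos);
    trackedObject.rotation = rigRoot.rotation * rot;
}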

SteamVR_Action_Boolean.stateDown is a shortcut that queries SteamVR_Input_Sources.Any,
but as you can see in SteamVR_Input_Sources there is no entry for trackers at all, so even Any will not pick up tracker input.
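For clarity, the explicit form takes an input source, so the check from the question is equivalent to the sketch below (same input action as in the question; it still won't see the tracker, since the enum has no tracker entry):

// Equivalent explicit-source call; SteamVR_Input_Sources offers Any, LeftHand, RightHand, etc.,
// but no dedicated tracker entry, so the tracker's trigger never maps to this action.
if (input.GetStateDown(SteamVR_Input_Sources.Any))
{
    gunScript.Shoot(gunScript.ShotTransform.rotation);
}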
There are a ton of tutorials on how to get controller input, but I also couldn't find anything about getting button input from Vive Trackers. Apparently this is not intended ...
It makes sense, though, since the button is also the tracker's connection/power button ... it would probably not be a good idea to make the user press that button in order to trigger actions.
In their HTC Vive Tracker guidelines they point to:
If you encounter problems in enabling VIVE Tracker (2018) on Unity or Unreal, refer to the following links:
For Unity 3D developers:
Download link for the Vive Input Utility package: Asset Store or GitHub
Vive Input Utility source code repository: https://github.com/ViveSoftware/ViveInputUtility-Unity
Maybe there is something helpful there, though it seems this is also geared more towards controllers etc., not towards getting the tracker's button input.

Related

Azure Kinect DK, Body tracking: Mapping separate avatars on separate skeletons

To recreate my issue, I've set up a project in Unity3D 2020, using this sample project.
Here I can successfully map an avatar to a single skeleton. In my project, however, I want to map multiple skeletons - not only the closest, as the sample provides.
So far I've successfully rendered multiple skeletons, but when I try to map separate avatars onto each of them, the two avatars follow the motion of only one of the skeletons in the scene.
The following code is used to setup the avatars:
// Instantiate one avatar instance from the prefab and hook it up to this Kinect device
var avatar = Instantiate(avatarPrefab, new Vector3(0, 0, 0), Quaternion.identity);
avatar.GetComponent<PuppetAvatar>().KinectDevice = this;
avatar.GetComponent<PuppetAvatar>().CharacterRootTransform = avatar.transform;
avatar.GetComponent<PuppetAvatar>().RootPosition = this.transform.GetChild(i).transform.GetChild(0).gameObject; // gets the pelvis of each "rootBody" prefab.
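For context, here is roughly how that setup sits in a loop over the rendered skeletons - a hedged sketch only, assuming this.transform has one "rootBody" child per tracked body, as the snippet above implies:

// One avatar per skeleton; the PuppetAvatar fields are the ones used in the snippet above.
for (int i = 0; i < transform.childCount; i++)
{
    var avatar = Instantiate(avatarPrefab, Vector3.zero, Quaternion.identity);
    var puppet = avatar.GetComponent<PuppetAvatar>();
    puppet.KinectDevice = this;
    puppet.CharacterRootTransform = avatar.transform;
    puppet.RootPosition = transform.GetChild(i).GetChild(0).gameObject; // pelvis of each "rootBody"
}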
The creator of the PuppetAvatar.cs script has yet to release any updates to support multi-body tracking, but I posed a similar question in this thread.
Take a look at MoveBox: https://github.com/microsoft/MoveBox-for-Microsoft-Rocketbox
MoveBox can parse the SMPL body models extracted by an external tool for 3D multi-person human pose estimation from RGB videos.
And it supports the Azure Kinect DK.

The new Input System doesn't trigger anything anymore

This post is shamelessly copied and pasted from my post on the Unity Forums: https://forum.unity.com/threads/input-system-doesnt-trigger-anything-anymore.717386/, but Stack Overflow seems more active.
TL;DR: the Input System worked a few days ago, doesn't trigger anything anymore, halp.
I tried the new Input System a few days ago, and it's really neat! I did a lot of stuff trying to understand the best way to use it, and in the end I had a character jumping and moving everywhere, which was cool! Then I merged my code into our develop branch and went to bed.
Today I want to continue my code, but my character doesn't move anymore. Actions are not triggered (even though inputs are detected in the debugger) and I really don't know why. Either the code merge overwrote some important settings (I know what you're thinking, and yes, "Active Input Handling" is set to "Both", and I also tried running only the preview) or I did something important during my little tests that I didn't realize.
So I decided to try to reproduce my steps in a fresh new project; maybe you guys can help me figure out what I'm doing wrong?
1/ Create a new 2D project (via the Hub)
2/ Install the latest Package (version 0.9.0)
3/ Click Yes on that message prompt to activate the new Input management in the settings
4/ Restart the Unity Editor, since it didn't restart even though the message said it would, and check the project settings (yes, it's on "Both", and yes, my Scripting Runtime Version is 4.0)
5/ Create a new GameObject and add a PlayerInput on it
6/ Click on "Open Input Settings" and create an "InputSettings" asset
7/ Click on "Create Actions..." to create my ActionMap asset
8/ Create a "TestAction" on my "Player" ActionMap and set it to the key "t"
9/ Create a new script "TestScript" that contains an OnTestAction() method (that only logs "test") and enables the test map/action (just to be sure):
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.PlayerInput;

public class TestScript : MonoBehaviour
{
    void Start()
    {
        InputActionMap playerActionMap = GetComponent<PlayerInput>().actions.GetActionMap("Player");
        playerActionMap.Enable();
        playerActionMap.GetAction("TestAction").Enable(); // Just to be sure
    }

    public void OnTestAction()
    {
        Debug.Log("test");
    }
}
10/ Pressing "Play" and spamming "T" like a madman to try to display a debug (note that, in the debugger, a User is created, my "t" presses are detected, my TestAction exists and is mapped on the "t" key but no debug is displayed
It's probably a silly problem, but it's driving me crazy, what do I do wrong ? It's even more infuriating that it worked some days ago !
Additional information :
- Switching the Input Management from "Both" to "New Input System (preview)" does nothing
- Checking in Update() whether my action is enabled returns "True" every frame
- Checking in Update() whether my action is triggered returns "False" every frame
- Using action.started/triggered/performed does nothing (I also tried switching to UnityEvent or C# events for this):
public class TestScript : MonoBehaviour
{
    InputAction a;

    void Start()
    {
        InputActionMap playerActionMap = GetComponent<PlayerInput>().actions.GetActionMap("Player");
        playerActionMap.Enable();
        a = playerActionMap.GetAction("TestAction");
        a.Enable(); // Just to be sure
        a.started += OnTriggeredTestAction;
        a.performed += OnTriggeredTestAction;
        a.canceled += OnTriggeredTestAction;
    }

    public void OnTestAction()
    {
        Debug.Log("test");
    }

    public void OnTriggeredTestAction(InputAction.CallbackContext ctx)
    {
        Debug.Log("test triggered");
    }
}
- Injecting the InputActionReference of my TestAction directly and using it does nothing
- Forcing "Default Control Scheme" and "Default Action Map" does nothing
- Using BroadcastMessage or UnityEvents doesn't work
You probably tried to import a new input system package for compatibility with multiple input devices. These types of errors are due to a conflict between the old and new input system packages and are probably resolved in the latest updates.
To resolve this issue, go to Edit -> Project Settings -> Player -> Other Settings -> Configuration and find the option Active Input Handling. Select Both. Unity will restart. Now your problem should be solved, and you will be able to use the old input system packages and the new ones simultaneously.
Check for rogue users in the input debugger
I was having very similar symptoms (Input System would randomly just stop sending callbacks). When I opened up the input debugger, it was registering the key presses, but the callbacks were never being called in my script.
Restarting Unity didn't help.
Rebooting didn't help.
I also discovered in the Input Debugger that there were 2 "users" in the input system, and (by disabling GameObjects in the scene one at a time) I found that I had accidentally attached another copy of my Input Action Asset to a different GameObject in the scene, and that Unity was registering this other object as a second player or "user", which was assigned all the input action bindings I was trying to capture.
The rogue Action Asset was essentially intercepting the actions, preventing the callbacks from being called on the intended script. I don't know if that's your particular issue, but maybe it will help someone else who (like me) has spent hours poring over forums looking for a solution to this elusive problem.
An easy way to tell if you have the same problem is to open the input debugger and see if the desired actions are actually mapped to the user of interest.
Screen clip of input debugger:
For me, there was an unexpected User #1, and only one of the users (not the intended one) actually had keys bound to the desired actions.
Posting just in case others run into this issue, as this solved my problem. Make sure to call Enable() on the action asset for it to start routing events.
// Create and set the reference
private InputControls _inputMapping;
private void Awake() => _inputMapping = new InputControls();

// Route and un-route events
private void OnEnable() => _inputMapping.Enable();
private void OnDisable() => _inputMapping.Disable();
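As a follow-up, a typical pattern with the generated class also subscribes to the action callbacks in Awake(). A minimal sketch, assuming the asset has a "Player" map with a "TestAction" action (illustrative names, not taken from the answer above):

private InputControls _inputMapping;

private void Awake()
{
    _inputMapping = new InputControls();
    // The callback only fires while the asset is enabled (see OnEnable/OnDisable above)
    _inputMapping.Player.TestAction.performed += ctx => Debug.Log("TestAction performed");
}

private void OnEnable() => _inputMapping.Enable();
private void OnDisable() => _inputMapping.Disable();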
I don't know if this will work for you but it worked for me and I was having the same issue.
I had created 2 control schemes: Mobile and PC. Mobile required Touchscreen and PC required Keyboard and Mouse. Doing this made my mobile input events stop firing, so adding Gamepad to my Mobile control scheme allowed the events to fire again.
TL;DR: check your control scheme and make sure it allows for the inputs you're binding to.
I had a similar problem, reproduced with exactly the steps described in the question.
In my case, I forgot to set control schemes.
The problem was fixed after adding them.
To do so:
Open your Input Action Asset.
Select a control scheme in the upper left corner (say, Keyboard). (If you haven't added a control scheme to begin with, your problem may be different from mine.)
Right Click > Edit Control Scheme.
Edit Control Scheme Screen Img
Click on the plus sign to add a control scheme to the list.
Add control scheme to the list Screen Img
Select the control scheme you want to add. (in this case, Keyboard)
Select control scheme Screen Img
Should look like this:
Added control scheme Screen Img
You're all set. Save everything and the problem should be fixed.
Play your game and it should work.
As of at least Unity 2020.1.2 and Input System 1.0.0 the input system will randomly stop working correctly. The only fix I'm aware of is restarting Unity.

Google Play Games Leaderboard post score reports success but doesn't update

I am using the Google Play Games Unity3D API on Unity 2018.2.15f1. It should be noted that my Google Play Games project isn't published; however, I don't want to publish it until I have all my achievements in there.
The leaderboard posting code is:
public void PostScore(long score)
{
    Social.ReportScore(score, GPGSIds.leaderboard_high_scores, (bool success) => {
        // handle success or failure
        if (success)
        {
            Debug.Log("Posted Score of " + score);
        }
        else
        {
            Debug.Log("Failed to post score");
        }
    });
}
and logcat consistently shows that this returns a success, as so:
11-16 02:45:05.041: I/Unity(5503): (Filename: ./Runtime/Export/Debug.bindings.h Line: 43)
11-16 02:45:05.042: I/Unity(5503): Posted Score of 19
However, no matter what, the leaderboard doesn't update; it consistently shows a score of 8. I don't understand where that score comes from, as it shows up exclusively on my device, and even if I wipe the leaderboard it persists.
Is the issue the fact that the leaderboard isn't published yet? Or could there be something else at play?
In my case the problem was solved when I edited my GameServices profile and enabled the checkbox "Let others see your game activity".
Before that, no matter what I did, all reported scores were ignored.
The issue was that GPGS was caching the leaderboard scores and refusing to update them from the remote copy. The solution was to modify the IPlayGamesPlatform interface to support passing in a custom Types.DataSource parameter, which allows choosing between the cached-or-networked version and the network-only version.
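For reference, a rough sketch of loading scores through the plugin rather than through Social - treat the exact overload as an assumption based on the plugin's documented API; the stock call below takes no data-source argument, which is what the modification described above adds:

using GooglePlayGames;
using GooglePlayGames.BasicApi;

// Sketch only: request the leaderboard via the plugin so the returned data
// (data.Valid, data.Scores) can be inspected after posting a score.
PlayGamesPlatform.Instance.LoadScores(
    GPGSIds.leaderboard_high_scores,
    LeaderboardStart.PlayerCentered,
    10,                               // number of rows to fetch
    LeaderboardCollection.Public,
    LeaderboardTimeSpan.AllTime,
    (LeaderboardScoreData data) =>
    {
        Debug.Log("Valid: " + data.Valid + ", scores returned: " + data.Scores.Length);
    });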

Disable/enable ARKit during runtime in Unity3d - C#

I am working with Unity3D, using C# and the ARKit plugin (2.0 from GitHub).
In my current application I am using ARKit for measuring distances. The tool I am creating needs this functionality only for that reason, so I was wondering how I could enable ARKit when the user needs the ruler and disable it when they don't.
I want to avoid losing performance while the user is using a non-ARKit tool. Am I right in saying that ARKit keeps working in the background once you have initialized it? I am new to ARKit, so I don't have a perfect overview of how to handle it.
Dropping some code lines makes no sense; it's basically the plugin imported into the project, plus my own script which depends on some of its functions - I didn't change anything in the source code of the plugin. The measuring tool I programmed works pretty well, but I could not determine how to activate and deactivate ARKit itself.
Can someone help me out with this? When I disable the GameObjects, the scripts keep running, and it seems to be a "dirty" way to avoid that functionality, but I have to make it clean (for example, the video feed in the background also needs to be disabled - and I guess those ARKit functions will not be paused or disabled just because some scripts are disabled; it seems the API still runs in the background, because it lags when I do so).
If you need more information, please let me know. Any help or suggestion would be very welcome.
Thanks a lot!
The current ARKit API doesn't have a method to disable or enable it in Unity during run-time at this point.
With that being said, Unity has its own function to enable and disable VR, AR, or XR plugins. If ARKit is built correctly, this method should work. So, you might be able to disable ARKit by setting XRSettings.enabled to false and enable it by setting it to true.
It's also a good idea to call XRSettings.LoadDeviceByName with an empty string and wait a frame before setting XRSettings.enabled to false to disable it:
IEnumerator DisableAR()
{
    XRSettings.LoadDeviceByName("");
    yield return null;
    XRSettings.enabled = false;
}
then call to disable:
StartCoroutine(DisableAR());
I guess I am answering a pretty old post. I found a way, but I don't know if it is what you are expecting.
As Programmer said:
The current ARKit API doesn't have a method to disable or enable it in Unity during run-time at this point.
So the way I used incorporates Programmer's code, along with the following: if you need the camera to render a skybox or solid color in non-AR mode, save the current textures before changing them (the live video is supplied as textures on the clear material), then set them to black; when you want to re-enable AR, set the textures back to the saved values and the feed loads correctly.
bool ARMode;
bool isSupported;
Camera cam;
UnityARCameraManager ARCameraManager;
private Texture2D _videoTextureY;
private Texture2D _videoTextureCbCr;

private void Awake()
{
    cam = Camera.main;
    isSupported = FindObjectOfType<UnityARCameraManager>().sessionConfiguration.IsSupported;
    ARMode = isSupported;
    ARCameraManager = FindObjectOfType<UnityARCameraManager>();
}

void DisableAR()
{
    XRSettings.enabled = false;
    ARCameraManager.enabled = false;

    // Save the current video textures so they can be restored later
    _videoTextureY = (Texture2D)cam.GetComponent<UnityARVideo>().m_ClearMaterial.GetTexture("_textureY");
    _videoTextureCbCr = (Texture2D)cam.GetComponent<UnityARVideo>().m_ClearMaterial.GetTexture("_textureCbCr");

    cam.GetComponent<UnityARVideo>().m_ClearMaterial.SetTexture("_textureY", Texture2D.blackTexture);
    cam.GetComponent<UnityARVideo>().m_ClearMaterial.SetTexture("_textureCbCr", Texture2D.blackTexture);
    cam.clearFlags = CameraClearFlags.SolidColor;
    cam.backgroundColor = Color.black;
    cam.GetComponent<UnityARVideo>().enabled = false;
}

void EnableAR()
{
    ARCameraManager.enabled = true;
    XRSettings.enabled = true;
    cam.clearFlags = CameraClearFlags.Depth;

    // Restore the saved video textures
    cam.GetComponent<UnityARVideo>().m_ClearMaterial.SetTexture("_textureY", _videoTextureY);
    cam.GetComponent<UnityARVideo>().m_ClearMaterial.SetTexture("_textureCbCr", _videoTextureCbCr);
    cam.GetComponent<UnityARVideo>().enabled = true;
}
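A small toggle wrapper around those two methods might look like the sketch below (field names are taken from the snippet above, and the isSupported flag set in Awake() guards unsupported devices):

public void ToggleAR()
{
    // Do nothing on devices where ARKit isn't supported
    if (!isSupported) return;

    if (ARMode) DisableAR();
    else EnableAR();

    ARMode = !ARMode;
}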

Canon EDSDK Command TakePicture blocks everything after focus error

I'm using the Canon SDK 2.1 and I am trying to take a picture with the camera from C# code.
I started a session (EdsOpenSession) and everything works fine with this line of code:
EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_TakePicture, 0);
The camera takes a picture and stores it on the memory card.
The problem is this: if there is an AF error (e.g. the lens cap is on), the camera goes 'BUSY' and never comes back.
Also, if I try to shut down the EDSDK with EdsCloseSession or EdsTerminateSDK, they block. The only way to get it working again is to restart the application and the camera.
I'm using an EOS 100D.
What can I do to ignore this AF error and try to take another picture?
I have also just had this issue.
I solved it by sending a half button press to focus, followed by a full button press to take the photo if the focus succeeds.
try
{
    EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 1); // Halfway
    EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 3); // Completely
}
finally
{
    EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 0); // Off
}
I have the same problem with a Canon EOS 1100D, but I've found http://digicamcontrol.com, which is open source. They've managed to make autofocus work, but I haven't found what exactly they did. Maybe you can find it; if you do, please share the solution.
