---
title: "Using Mercury hand tracking"
layout: main
---
# Monado's hand tracking
Monado comes with its own optical hand tracking pipeline, codenamed Mercury. The tracking quality is generally good on Index and Luxonis cameras, but it's still in development!
## Compatible hardware
Currently, Monado's hand tracking works on:
- Valve Index
- Most Luxonis stereo cameras (via the North Star builder or "Hand-Tracking Demo" scene in monado-gui)
- Windows Mixed Reality headsets*
- Rift S*
\* Tracking quality is degraded on these headsets currently due to autoexposure issues.
This will hopefully be fixed soon.
Getting our hand tracking working on most Intel RealSense stereo cameras, and possibly the Vive Pro, should only take a small amount of plumbing work.
Basically any calibrated stereo camera should work.
Patches welcome!
## Building
### Get OpenCV
Monado's hand tracking depends on OpenCV for a few things. Find it in your distribution packages.
To get it on Ubuntu/Debian:
```sh
sudo apt install libopencv-dev libopencv-contrib-dev
```
To get it on Arch Linux:
```sh
sudo pacman -Syu opencv
```
### Get ONNX Runtime
[ONNX Runtime](https://onnxruntime.ai/about.html) is an open-source ML inference library that's used to run the neural nets that track your hands.
There aren't official packages for ONNX Runtime yet, so there are a few different ways you can install it:
Option 1 (easiest):
- Download a release from [ONNX Runtime's official releases page](https://github.com/microsoft/onnxruntime/releases) - most likely `onnxruntime-linux-x64-<version>.tgz` (you don't need the CUDA/GPU/TensorRT variant - we run our models on the CPU)
- Extract the release archive somewhere
- Move the files in `include/` to your `/usr/include` or `/usr/local/include`, and move the files in `lib/` to your `/usr/lib/` or your `/usr/local/lib` depending on your preference.
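The steps above might look like this as a shell session. The version number below is a placeholder, not a recommendation - check the releases page for the current one:

```sh
# Placeholder version - substitute the latest release.
VERSION=1.14.1
TARBALL=onnxruntime-linux-x64-${VERSION}.tgz

# Download and extract the prebuilt release archive.
curl -LO "https://github.com/microsoft/onnxruntime/releases/download/v${VERSION}/${TARBALL}"
tar xf "${TARBALL}"

# Install headers and libraries system-wide (here: /usr/local).
sudo cp -r "onnxruntime-linux-x64-${VERSION}/include/." /usr/local/include/
sudo cp -r "onnxruntime-linux-x64-${VERSION}/lib/." /usr/local/lib/
sudo ldconfig
```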
Option 2 (easiest on Arch):
```sh
yay -S onnxruntime-git
```
Option 3 (if you want really high performance):
Build and install ONNX Runtime from source, from [its official repository](https://github.com/microsoft/onnxruntime).
This is left as an exercise to the reader.
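As a rough starting point (the exact flags can change between releases, so check the repository's build documentation first):

```sh
# Clone the repository with its submodules.
git clone --recursive https://github.com/microsoft/onnxruntime
cd onnxruntime

# Build a shared library in Release mode; run ./build.sh --help
# for the full set of options.
./build.sh --config Release --build_shared_lib --parallel
```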
### (Optional, only for DepthAI cameras) Get depthai-core
```sh
git clone --recursive https://github.com/luxonis/depthai-core
cd depthai-core
mkdir build && cd build
cmake .. -DDEPTHAI_ENABLE_BACKWARD=0 -DBUILD_SHARED_LIBS=1 -DCMAKE_INSTALL_PREFIX=/usr/local -DCMAKE_BUILD_TYPE=RelWithDebInfo -GNinja
sudo ninja install
```
Sometimes you'll get strange errors about code not being compiled with `-fPIC`.
These seem to be caused by Hunter, the CMake package manager that depthai-core uses.
If you hit them, try removing `~/.hunter` and rebuilding.
### Check if CMake has found ONNX Runtime
Go to your Monado build directory, and try running `cmake ..`.
If you successfully installed ONNX Runtime and OpenCV, you'll get
```plaintext
<...>
-- # ONNXRUNTIME: ON
-- # OPENCV: ON
<...>
-- # DRIVER_HANDTRACKING: ON
<...>
```
in the output.
If any of these are set to OFF, go back and make sure you installed the relevant libraries correctly.
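A quick sanity check (assuming you installed the libraries system-wide, as above) is whether the dynamic linker can see them:

```sh
# List the libraries the dynamic linker knows about and filter
# for the two we need; no output means the install didn't land
# in a directory the linker searches.
ldconfig -p | grep -i onnxruntime
ldconfig -p | grep -i opencv_core
```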
Once you have these in the CMake output, you should just be able to rebuild+reinstall Monado and have hand tracking work by default!
## Running
If Monado is built correctly with hand tracking enabled, you should be able to just run Monado with no controllers connected, and it'll track your hands through the headset's cameras!
## Tips and tricks
### Get lots of light
Our optical hand tracking is *optical!*
For tracking to work well, you need to have plenty of light wherever you are so that the cameras can see your hands easily.
Open your windows, turn all your lights on, get some lamps if you need them.
Also, front-lighting your hands helps a lot. You can put some bright lights on the front of your HMD that illuminate your hands, and then you can also turn any other lights off. The [OAK-D Pro W Dev](https://shop.luxonis.com/products/oak-d-pro-w-dev) has this built-in, but you can get very creative here.
If you have the choice, *red* light is the most useful here. Out of the visible spectrum, human skin reflects red light the best.
### Use the debug UI
Monado's hand tracking has a debug UI that'll show you what the cameras see and what the hand tracking is doing, and this can help you debug many issues. To get the debug UI, run Monado with the environment variable `OXR_DEBUG_GUI=1`.
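For example, assuming you run Monado's out-of-process service binary, `monado-service` (the same env-var-prefix pattern applies to the other variables mentioned below):

```sh
# Launch the Monado service with the debug UI enabled.
OXR_DEBUG_GUI=1 monado-service
```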
### Index
Libsurvive head tracking is quite imperfect.
However, quite a lot of apps do well with 3dof head tracking and 6dof hand tracking, and it's generally the easiest path for testing *just* hand tracking. To use 3dof tracking instead of libsurvive, run Monado with the environment variable `VIVE_OVER_SURVIVE=1`.
### WMR/Rift S
Currently, tracking is not amazing on these headsets because of an issue with autoexposure and dynamic range.
This should be completely fixed soon and not require any other setup, but for now you can run Monado with the environment variable `AEG_USE_DYNAMIC_RANGE=true` and that should improve things.
In general if things are performing badly, open up the debug UI and look for issues in what the cameras can visually see.
### Rift S
If you're using Rift S tracking for SteamVR/OpenComposite, set the environment variable `RIFT_S_HAND_TRACKING_AS_CONTROLLERS` to `true` to expose the tracked hands as emulated controllers.
## Demos/apps/frameworks that support hand tracking
- [LÖVR](https://lovr.org/)
- The [toymaker demo](https://jmiskovic.itch.io/toymaker-vr) is especially fun
- [StereoKit](https://stereokit.net/)
- [OpenXR Playground](https://gitlab.freedesktop.org/monado/demos/openxr-simple-playground)
- [godot_openxr](https://github.com/GodotVR/godot_openxr)
- Most controller-only apps will somewhat work through hand tracking controller emulation. People have tried Beat Saber and VRChat through OpenComposite with some success, for example.
## Development
Most development happens [here.](https://gitlab.freedesktop.org/monado/utilities/hand-tracking-playground/mercury_train/)