Commit 43432d9c authored by Jakob Bornecrantz

gsoc: Make youtube link work

parent b3b76f2f
@@ -22,7 +22,7 @@ For all projects possible mentors are Jakob Bornecrantz (en/sv), Mateo de Mayo (
## Head Tracked Fish Tank Display
-The core OpenXR 1.0 specification supports stereo VR headsets and monoscopic handheld displays like smartphones or tablets, which are meant to be used as a "magic window" into a VR world or for AR purposes; for this, the device's orientation and position are tracked so users can move the "window" view by moving the display device. A further use of monoscopic displays is the "fish tank" configuration: the display is fixed, like a TV, and instead the head position of the user is tracked, to render the content behind the magic window from the right perspective. (Example: https://www.youtube.com/watch?v=Jd3-eiid-Uw). For this project, the student will add support in Monado for tracking a face, figure out the relation of the face/eyes to the monitor, and calculate FOV values. The focus is not on producing code that works in 100% of cases, but on integrating the code into Monado. The small test application hello_xr will need changes to better support monoscopic fish tank views, such as improving the scene setup. Depending on progress, the student can modify one or all of Godot, Blender and Unreal to support a monoscopic fish tank mode.
+The core OpenXR 1.0 specification supports stereo VR headsets and monoscopic handheld displays like smartphones or tablets, which are meant to be used as a "magic window" into a VR world or for AR purposes; for this, the device's orientation and position are tracked so users can move the "window" view by moving the display device. A further use of monoscopic displays is the "fish tank" configuration: the display is fixed, like a TV, and instead the head position of the user is tracked, to render the content behind the magic window from the right perspective. (Example: <https://www.youtube.com/watch?v=Jd3-eiid-Uw>). For this project, the student will add support in Monado for tracking a face, figure out the relation of the face/eyes to the monitor, and calculate FOV values. The focus is not on producing code that works in 100% of cases, but on integrating the code into Monado. The small test application hello_xr will need changes to better support monoscopic fish tank views, such as improving the scene setup. Depending on progress, the student can modify one or all of Godot, Blender and Unreal to support a monoscopic fish tank mode.
Deliverables:
* Integrated face-tracking code in Monado that transforms the data and pipes it up to the application.
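
As background for the FOV calculation the project description mentions, the sketch below shows one way the off-axis ("fish tank") view angles could be derived from a tracked eye position and a fixed screen. This is a minimal illustration, not Monado code: the `screen_t` and `fishtank_fov` names are hypothetical, the screen is assumed to lie in a plane facing the viewer along +Z, and the angles follow the OpenXR `XrFovf` convention of radians with negative left/down angles.

```c
/*
 * Minimal sketch (not Monado code): given a tracked eye position and a
 * fixed screen described in the same tracking space, compute the
 * asymmetric field-of-view angles a runtime could report for a
 * fish tank view. All type and function names here are hypothetical.
 */
#include <math.h>

struct vec3 { float x, y, z; };

/* Screen centre and half-extents, assuming the screen plane faces +Z. */
struct screen_t {
	struct vec3 center;
	float half_width;
	float half_height;
};

/* Angles in radians; left and down are negative for a centred eye,
 * matching the OpenXR XrFovf convention. */
struct fov_t { float angle_left, angle_right, angle_up, angle_down; };

/* Compute off-axis FOV angles for one eye. */
static struct fov_t fishtank_fov(const struct screen_t *s, struct vec3 eye)
{
	/* Eye position relative to the screen centre. */
	float dx = eye.x - s->center.x;
	float dy = eye.y - s->center.y;
	float dz = eye.z - s->center.z; /* distance in front of the screen */

	struct fov_t fov;
	/* Project each screen edge, as seen from the eye, into an angle. */
	fov.angle_left  = atan2f(-s->half_width  - dx, dz);
	fov.angle_right = atan2f( s->half_width  - dx, dz);
	fov.angle_up    = atan2f( s->half_height - dy, dz);
	fov.angle_down  = atan2f(-s->half_height - dy, dz);
	return fov;
}
```

In an actual integration, values like these would be recomputed every frame from the face-tracking data and fed into the per-view FOV the runtime hands to applications such as hello_xr, so the rendered frustum keeps lining up with the physical screen as the viewer moves.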