---
title: "GSoC ideas 2021"
layout: main
---
# Monado - GSoC Ideas 2021
{:.no_toc}
* TOC
{:toc}
# Head Tracked Fish Tank Display
The core OpenXR 1.0 specification supports stereo VR headsets as well as monoscopic handheld displays like smartphones or tablets, which are meant to be used as a "magic window" into a VR world or for AR purposes; the device's orientation and position are tracked so users can move the "window" view by moving the display device. A further use of monoscopic displays is the "fish tank" configuration: the display is fixed (e.g. a TV) and instead the user's head position is tracked, so the content behind the magic window is rendered from the correct perspective (example: https://www.youtube.com/watch?v=Jd3-eiid-Uw).

For this project, the student will add face tracking support to Monado, determine the relation of the face/eyes to a monitor, and calculate FOV values. The focus is not on producing production-ready code that works in 100% of cases, but on integrating the code into Monado. The small test application hello_xr will need changes to better support monoscopic fish tank views, such as an improved scene setup. Depending on progress, the student can modify one or more of Godot, Blender and Unreal to support monoscopic fish tank mode.
Deliverables:
* Integrated face-tracking code in Monado that transforms the data and pipes it up to the application.
Requirements:
* Advanced C
* basic linear algebra
* Some computer vision experience would be helpful. OpenCV provides some required functionality.
Difficulty: medium
# SLAM-Jam
The WMR and Oculus Rift S PC VR headsets only support camera-based inside-out tracking as their positional tracking system. We want to support these headsets in our Monado runtime, but to do so we need an implementation of simultaneous localization and mapping (SLAM) tracking. Several open source implementations exist, but they are mostly optimized for robotics/research, not for real-time VR headset tracking. The task would be to design and run a comparison of MIT-licensed SLAM implementations such as Maplab, OpenVSLAM or Kimera, and perhaps others, and later to use an HMD with built-in cameras, or mount a camera on a headset, and integrate one implementation into a complete tracking solution.
Deliverables:
* A report detailing the performance characteristics of the various SLAM implementations: accuracy, latency, CPU usage, ease of portability and integrability.
* A driver integrating one of the SLAM trackers into Monado.
Requirements:
* Advanced C & C++
* Experience with multimedia on Linux (UVC, V4L2)
* Experience in DSP/signal filtering (e.g. Kalman filters) for the complete solution would be a bonus
Difficulty: medium to hard
# Frame Timing Tools
In VR, stutters, judders and jitters in rendering are a much bigger deal than on desktop monitors. Instead of being just annoying, they can cause disorientation, and even motion sickness.
Most importantly, the VR runtime must push frames to the VR hardware at the native hardware refresh rate.
To that end it must keep track of the frames submitted by client applications, reproject old frames if the client application missed an interval, and push the frames to the VR hardware while sharing GPU hardware resources with the client application. Such a system benefits greatly from good visual frame timing tools.
A good frame timing tool would gather timing information from both the frames submitted by the client application and the frames submitted by the compositor to the hardware, and relate them in a visual way. The visualization could either be done with an existing tool like Perfetto, or be a custom-built UI for Monado.
Deliverables:
* A system gathering frame timing information from client applications and the runtime compositor and relating them to each other
Requirements:
* Basic C
* Experience with graphics programming (Vulkan) would be helpful
Difficulty: medium