Commit d6b28b65 authored by Jakob Bornecrantz's avatar Jakob Bornecrantz

gsoc: Add 2022 page

parent 2ee929e8
Pipeline #514865 passed
---
title: "GSoC ideas 2021"
layout: main
---
# Monado - GSoC Ideas 2021
{:.no_toc}
* TOC
{:toc}
# Head Tracked Fish Tank Display
The core OpenXR 1.0 specification supports stereo VR headsets and monoscopic handheld displays such as smartphones or tablets, which are used as a "magic window" into a VR world or for AR purposes; the device's orientation and position are tracked so that users can move the "window" view by moving the display device. A further use of monoscopic displays is the "fish tank" configuration: the display is fixed, like a TV, and instead the head position of the user is tracked so that the content behind the magic window can be rendered from the correct perspective (example: https://www.youtube.com/watch?v=Jd3-eiid-Uw).

For this project, the student will add support in Monado for tracking a face, figure out the relation of the face/eyes to the monitor, and calculate field-of-view values. The focus is not on creating production-ready code that works in 100% of cases, but on the integration of the code into Monado. The small test application hello_xr will need changes to better support monoscopic fish tank views, such as improving the scene setup. Depending on progress, the student can modify one or all of Godot, Blender and Unreal to support monoscopic fish tank mode.
Deliverables:
* Integrated face-tracking code in Monado that transforms the data and pipes it up to the application.
Requirements:
* Camera: any webcam or laptop camera will suffice.
* Advanced C
* Basic linear algebra
* Some computer vision experience would be helpful. OpenCV provides some required functionality.
Difficulty: medium
# Frame Timing Tools
In VR, stutters, judders and jitters in rendering are a much bigger deal than on desktop monitors. Instead of being just annoying, they can cause disorientation, and even motion sickness.
Most importantly the VR runtime must push frames to the VR hardware at the native hardware refresh rate.
To that end it must keep track of the frames submitted by client applications, reproject old frames if the client application missed an interval, and push the frames to the VR hardware while sharing GPU hardware resources with the client application. Such a system benefits greatly from good visual frame timing tools.
A good frame timing tool would gather timing information both from the frames submitted by the client application and from the frames the compositor submits to the hardware, and relate them in a visual way. The visualization could be done either with an existing tool such as Perfetto or with a custom-built UI for Monado.
Deliverables:
* A system gathering frame timing information from client applications and the runtime compositor and relating them to each other
Requirements:
* Basic C
* Experience with graphics programming (Vulkan) would be helpful
Difficulty: medium