Commit 292339e4 authored by 157.24.24.192

Xinerama intro

parent db66bf4d
@@ -8,6 +8,15 @@ SCREENs are independent. They can have, not only different video modes, but also
When a graphical application starts, it connects to an X server. Usually the environment variable {{{DISPLAY}}} specifies which X server to connect to. The value of {{{DISPLAY}}} is of the form {{{host:displaynumber.screen}}}, and the default value set by practically all graphical environments is {{{:0.0}}}, which means the local computer, X server instance 0, and SCREEN 0. The host part specifies the computer, as the X11 protocol can work over a network. The display number specifies the X server instance, as one computer may have several X servers running. The screen number specifies the SCREEN the application has access to. In other words, the SCREEN where the application's windows will appear must be selected before the application starts (or connects to the X server). The SCREEN cannot be changed without opening a new connection to the X server, which in practice means you have to quit and restart the application.
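To illustrate the connection step, here is a minimal sketch in C with Xlib: it connects to the server named by {{{DISPLAY}}} and reports how many SCREENs that server has and which one was selected at connection time. The file name and build command are only examples.
{{{
/* Minimal sketch: connect to an X server and report the selected SCREEN.
 * Build with: cc screens.c -o screens -lX11
 */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>

int main(void)
{
    /* NULL means "use the DISPLAY environment variable", e.g. :0.0.
     * An explicit string such as ":0.1" would select SCREEN 1 instead. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return EXIT_FAILURE;
    }

    /* The SCREEN was fixed when the connection was made; changing it
     * would require closing this connection and opening a new one. */
    printf("connected to %s: %d SCREEN(s), using SCREEN %d\n",
           DisplayString(dpy), ScreenCount(dpy), DefaultScreen(dpy));

    XCloseDisplay(dpy);
    return EXIT_SUCCESS;
}
}}}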
== Multiple Graphics Cards, or the story of Xinerama ==
The above limitation with SCREENs and applications is very inconvenient when one has several graphics cards driving several monitors. The Xinerama desktop feature was developed to combine the graphics cards into a single SCREEN. This allows windows to span more than one monitor, and windows can be moved from monitor to monitor.
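For reference, Xinerama is typically enabled in the X server configuration. A rough sketch of an {{{xorg.conf}}} fragment combining two SCREENs into one could look like the following; the {{{Screen0}}}/{{{Screen1}}} identifiers are hypothetical and would have to match Screen sections defined elsewhere in the file.
{{{
# Sketch of a ServerLayout combining two cards' SCREENs with Xinerama.
Section "ServerLayout"
    Identifier  "Layout0"
    Screen   0  "Screen0"
    Screen   1  "Screen1" RightOf "Screen0"
    Option      "Xinerama" "on"
EndSection
}}}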
Along with the Xinerama feature came the Xinerama extension, which allows applications to query the physical monitor configuration. For instance, a Xinerama-aware window manager can maximize a window to fit one monitor instead of covering all monitors. For applications (such as window managers), the Xinerama library, {{{libXinerama}}}, provides access to the Xinerama extension.
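As a sketch of what such a query looks like, the following C program uses {{{libXinerama}}} to list the monitors making up the combined SCREEN (assuming a build along the lines of {{{cc monitors.c -lX11 -lXinerama}}}; the file name is made up).
{{{
/* Sketch: list the physical monitors via the Xinerama extension. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xinerama.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    if (XineramaIsActive(dpy)) {
        int n = 0;
        XineramaScreenInfo *info = XineramaQueryScreens(dpy, &n);
        /* Each entry describes one physical monitor within the SCREEN. */
        for (int i = 0; i < n; i++)
            printf("monitor %d: %dx%d at +%d+%d\n",
                   info[i].screen_number,
                   info[i].width, info[i].height,
                   info[i].x_org, info[i].y_org);
        XFree(info);
    } else {
        printf("Xinerama is not active\n");
    }

    XCloseDisplay(dpy);
    return 0;
}
}}}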
The drawbacks of Xinerama are noticeable. Xinerama is implemented as a big shadow framebuffer that covers all monitors. The framebuffer is in system RAM, not in the graphics cards' VRAM. This means that practically all acceleration done on graphics cards (2D drawing, 3D rendering, video overlays, Xv) must be disabled, since it does not work with system RAM. Originally this was not a big issue, since GPU acceleration was not as important then as it is now. Another minor annoyance is that the DPI (dots-per-inch) resolution is fixed to the same value across all monitors.
== Dual-head Graphics Cards ==
Add a dual-head graphics card, and suddenly there is a single card with two physical monitors; things get complicated. One solution for a driver (DDX) is to create one SCREEN per head, which is called Zaphod mode (after Zaphod Beeblebrox from The Hitchhiker's Guide to the Galaxy). Another solution is to pretend that there is only one monitor and use just one SCREEN, which is what the Nvidia TwinView mode does. The third, and the only proper, way to deal with it is the RandR extension, which is a relatively new invention. RandR exposes the dual-head card as a single SCREEN, while providing a standard way of managing the multiple monitors.
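To illustrate the difference from Xinerama, a RandR-aware application still sees a single SCREEN, but it can enumerate the individual outputs and their geometry through {{{libXrandr}}}. The following is only a sketch (assumed build: {{{cc outputs.c -lX11 -lXrandr}}}, hypothetical file name) and needs RandR 1.2 or later.
{{{
/* Sketch: enumerate connected RandR outputs on a single SCREEN. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    Window root = DefaultRootWindow(dpy);
    XRRScreenResources *res = XRRGetScreenResources(dpy, root);

    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        /* Only outputs with a monitor attached and a CRTC driving them. */
        if (out->connection == RR_Connected && out->crtc) {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            printf("%s: %ux%u at +%d+%d\n",
                   out->name, crtc->width, crtc->height, crtc->x, crtc->y);
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
    }

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}
}}}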