# xserver issues
https://gitlab.freedesktop.org/xorg/xserver/-/issues

# Xwayland failing to set viewport and _XWAYLAND_RANDR_EMU_MONITOR_RECTS
https://gitlab.freedesktop.org/xorg/xserver/-/issues/1595 (2023-11-10, neofeo)

Hi, I am trying to find a solution to mutter not being able to scale 640x480 Wine games with Xwayland properly. I am on mantic, with fairly recent packages, though certainly not the absolute latest Xwayland release. This is what the GNOME mutter devs were able to find out; they basically claim that Xwayland is the culprit here.
Link to the issue: https://gitlab.gnome.org/GNOME/mutter/-/issues/3131
What the dev said there:
Sebastian Keller
@skeller
The way this is supposed to work is that XWayland detects when an X11 application tries to change the resolution of the monitor and then sets up a Wayland viewport that scales the window accordingly. This does not seem to happen for those applications.
Running env WINEDEBUG=+x11drv wine ./CLAW.EXE 2>&1 | grep apply_display_settings gives:
0114:trace:x11drv:apply_display_settings handler:XRandR 1.4 changing L"\\\\.\\DISPLAY1" to position:(0,0) resolution:640x480 frequency:59Hz depth:8bits orientation:0.
So wine attempts to change the resolution, but XWayland does not set up a viewport. It does not set the _XWAYLAND_RANDR_EMU_MONITOR_RECTS property on the window either. That might mean that some of the heuristics used to detect these fullscreen + resolution changes are no longer working, possibly due to changes on the mutter side (maybe the frames client). Now it would be good to know if this is an issue on the XWayland side or on the mutter side. Does this work on other Wayland compositors?

# --scale scales a rasterized frame-buffer-bitmap to the display-resolution, leading to massive quality losses
https://gitlab.freedesktop.org/xorg/xserver/-/issues/1205 (2021-07-29, Johannes Kalliauer)

## Problem
`xrandr --scale` leads to quality loss (generally blurry output).
(a) `xrandr --output eDP-1 --mode 1920x1080 --scale .5` and (b) `xrandr --output eDP-1 --mode 960x540 --scale 1` result in the same screenshot; however, (a) should be a 1920x1080 image and (b) should only use a quarter of the screen (to avoid blurriness).
It is correct that (a) and (b) show the same amount of content and the same relative window sizes/font sizes; however, (a) should have more pixels for the same content, yet they end up as the same screenshot.
## Actual result
xrandr --scale scales a frame-buffer bitmap to the display resolution.
## Expected result
xrandr --scale should scale first and then create a raster image identical to the physical display resolution, as done by `gnome-control-center>Settings>Display>Scale` or by `gnome-tweaks>Fonts>Scaling Factor`.
However, gnome-control-center and gnome-tweaks are global for all monitors and cannot compensate for different DPIs across monitors.
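The difference can be sketched numerically. This is an illustrative helper, not xrandr code; `logical_size` is a hypothetical name. Under the current behaviour the rendered raster has mode-times-scale pixels, which is then stretched over the physical mode, so (a) and (b) render the same number of pixels:

```python
# Hypothetical helper illustrating the report: with the current behaviour,
# the rendered (logical) resolution is mode * scale, and that raster is then
# stretched to the physical mode, losing quality when scale < 1.

def logical_size(mode_w, mode_h, scale):
    """Size of the raster that xrandr --scale renders before stretching."""
    return round(mode_w * scale), round(mode_h * scale)

# (a) --mode 1920x1080 --scale .5: content rendered at 960x540, then
#     stretched over 1920x1080 physical pixels (hence blurry).
assert logical_size(1920, 1080, 0.5) == (960, 540)

# (b) --mode 960x540 --scale 1: content rendered at 960x540, shown 1:1.
#     Same logical size as (a), hence the identical screenshots.
assert logical_size(960, 540, 1) == (960, 540)
```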
## Examples
`xrandr --output eDP-1 --mode 1920x1080 --scale 1 --filter nearest`
![Screenshot_from_2021-07-27_16-55-21](/uploads/70ff4e22a03c83448c052f3e962fe16c/Screenshot_from_2021-07-27_16-55-21.png)
----
`xrandr --output eDP-1 --mode 1920x1080 --scale .5 --filter nearest`
![Screenshot_from_2021-07-27_16-57-53](/uploads/3508ea29126da198c6d3b853e85b4063/Screenshot_from_2021-07-27_16-57-53.png)
----
`xrandr --output eDP-1 --mode 960x540 --scale 1 --filter nearest`
![Screenshot_from_2021-07-27_16-57-19](/uploads/5f0c2b53fc760b7f81c2bee70d579ee5/Screenshot_from_2021-07-27_16-57-19.png)
----
`xrandr --output eDP-1 --mode 960x540 --scale 0.5 --filter nearest`
![Screenshot_from_2021-07-27_16-54-19](/uploads/af0cf287d8d78b35aa6cb2d07bace3d8/Screenshot_from_2021-07-27_16-54-19.png)

# hw/xfree86/modes: Use per-screen monitor for all card outputs by default
https://gitlab.freedesktop.org/xorg/xserver/-/issues/1140 (2021-04-08, Olivier Certner)

Instead of just the first one.
An explicit DisplaySize directive in a Monitor section referenced by a Screen section in `xorg.conf` is not taken into account when the monitor is not connected to the Screen's card's first output.
Desired outcome: Apply the directives of the monitor section to all of the card's outputs that do not have an explicit monitor linked to them. This would allow specifying a monitor section for the screen without worrying about which actual card output the monitor is connected to (in a single-monitor setup).
Rationale:
1. Desktop/server graphic cards commonly have several outputs, even
for a single output type (DP, HDMI, etc.). This would ease configuring
for single-monitor cases. Multi-monitor setups require per-output
monitor sections anyway (or autoconfiguration, see next point).
2. The change preserves compatibility with single-output setups and
autoconfiguration.
3. It seems a priori that applying the per-screen monitor to the first
   output is rather arbitrary.

# RRAddOutputMode only generates root ConfigureNotify once
https://gitlab.freedesktop.org/xorg/xserver/-/issues/1002 (2020-03-30, yshui)

It seems that only the first ever RRAddOutputMode request causes the server to generate a ConfigureNotify event for the root window. Subsequent requests, even with new modes, do not generate such an event.
I am not sure if RRAddOutputMode is supposed to trigger ConfigureNotify, but this inconsistency seems odd to me.

# Screen size cannot rely on wl_output scale and geometry
https://gitlab.freedesktop.org/xorg/xserver/-/issues/704 (2021-02-15, Bugzilla Migration User)

## Submitted by Jonas Ådahl `@jadahl`
Assigned to **Wayland bug list**
**[Link to original bug (#101436)](https://bugs.freedesktop.org/show_bug.cgi?id=101436)**
## Description
Currently, Xwayland will configure its screen and monitors given the wl_outputs it sees being advertised.
It uses the dimensions of the current mode, together with its x/y coordinates, where each wl_output is treated as a separate monitor. The wl_output.scale event is completely ignored.
In practice, the actual screen size and monitor sizes that Xwayland should have may thus be something else.
Some examples:
* A compositor may advertise a wl_output with scale 2, and have a logical pixel coordinate space it places windows on where the content wl_output region is also scaled by 2. Here Xwayland should treat a 1024x768 wl_output with scale 2 as 512x384 large internal monitor.
* A compositor may advertise a wl_output with scale 2 in the same way as above, but in fact its logical representation of the output scaled with a fractional scale, which is not advertised at all. In this case, Xwayland has no way to know the expected screen and monitor size.
* A compositor may advertise a wl_output with scale 2, but its logical coordinate space is always identical to the physical pixel coordinate space, meaning Xwayland should, as it does now, completely ignore the wl_output scale.
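The bookkeeping in the first case can be sketched as follows (illustrative names, not Xwayland internals); note that in the fractional-scale case no such computation is possible, because the fraction is never advertised:

```python
# Sketch of the logical-size computation described in the first bullet:
# divide the advertised mode by the integer wl_output scale to get the
# logical monitor size Xwayland should expose to X11 clients.

def logical_monitor_size(mode_w, mode_h, output_scale):
    """Logical monitor size for a wl_output with an integer scale."""
    return mode_w // output_scale, mode_h // output_scale

# A 1024x768 wl_output advertised with scale 2 should appear to X11
# clients as a 512x384 monitor.
assert logical_monitor_size(1024, 768, 2) == (512, 384)

# With scale 1 (or a compositor whose logical space equals the pixel
# space), the mode size is used unchanged.
assert logical_monitor_size(1024, 768, 1) == (1024, 768)
```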
To solve this, we need to introduce a protocol (Xwayland specific or not) that communicates the logical geometry of each wl_output.

# xrandr --above does not take --scale into account
https://gitlab.freedesktop.org/xorg/xserver/-/issues/234 (2018-12-13, Bugzilla Migration User)

## Submitted by Emmanuel Beffara
Assigned to **Xorg Project Team**
**[Link to original bug (#107721)](https://bugs.freedesktop.org/show_bug.cgi?id=107721)**
## Description
(This does not feel like a server side bug, I wanted to file this bug in App/xrandr but somehow the bugzilla does not offer this option)
It seems that relative placement options --above, --below etc do not take into account the scaling specified by --scale.
My precise setup is as follows:
- Dell XPS 9370 laptop (13" with Intel UHD Graphics 620)
- internal HiDPI LCD display "eDP1" with native resolution 3840x2160
- external LCD screen "DP1" with native resolution 1280x1024
In order to get a satisfactory dual-screen setup, I use 2x2 scaling on the external screen to make it appear logically as 2560x2048, so that things can be moved smoothly between the two screens. If I want the external screen to appear to be on the right of the internal one, the following works fine:
xrandr --output DP1 --scale 2x2 --right-of eDP1
However, if I want the external display to appear to be above the internal one, the natural variation does not work:
xrandr --output DP1 --scale 2x2 --above eDP1
This actually sets eDP1 at position 0x1024 so that the two screens overlap and I see the lower half of DP1 at the top of eDP1. I tried to specify DP1 first as
xrandr --output DP1 --scale 2x2 --output eDP1 --primary --below DP1
but the result is the same. I get a working setup if I place the internal display explicitly:
xrandr --output eDP1 --pos 0x2048 --output DP1 --scale 2x2
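The offset arithmetic being done by hand here can be sketched as follows (illustrative helper, not xrandr code): the y position of the lower output should be the *scaled* height of the output placed above it.

```python
# Illustrative arithmetic for the layout above: the logical height of
# the external output is its mode height times the --scale factor, and
# that is the y offset at which the internal panel must be placed.

def pos_below(external_h, scale_y):
    """y position for the lower output when the external output,
    scaled vertically by scale_y, sits above it."""
    return int(external_h * scale_y)

# 1280x1024 external screen with --scale 2x2: eDP1 belongs at y=2048,
# which matches the working invocation --pos 0x2048.
assert pos_below(1024, 2) == 2048

# The buggy placement apparently uses the unscaled height instead,
# which matches the observed overlapping offset of 1024.
assert pos_below(1024, 1) == 1024
```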
This does work, but if I plug in a screen with a different native resolution, I cannot use the same script to just "put the external screen above the internal one"; instead I have to come up with a new invocation each time.

# Add a way to set and get the color temperature.
https://gitlab.freedesktop.org/xorg/xserver/-/issues/231 (2018-12-13, Bugzilla Migration User)

## Submitted by Navid Zamani
Assigned to **Xorg Project Team**
**[Link to original bug (#103985)](https://bugs.freedesktop.org/show_bug.cgi?id=103985)**
## Description
Currently, people need external tools to set the temperature of their screens, e.g. to avoid blue light at night.
Usually they abuse gamma for this, which of course doesn’t help with 100% white.
The tool sct allows setting the temperature by altering the gamma color mapping tables. A fork of it allows setting the brightness too, which would otherwise conflict: https://github.com/mgudemann/sct/
Additionally, reading the temperature back would be necessary as well, so that a slow fade can be implemented via e.g. a script that runs once every minute, and for other calculations.
I tried to start writing a patch for sct, to read the temperature too:
https://github.com/mgudemann/sct/issues/1
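As a rough illustration of the gamma-ramp approach such tools use, here is a minimal sketch. The whitepoint multipliers are illustrative approximations, not sct's actual tables, and the function names are hypothetical:

```python
# Sketch of the gamma-table approach: pick per-channel multipliers for a
# colour temperature and scale the CRTC gamma ramp by them.

# Rough whitepoint multipliers (R, G, B) at a few temperatures in kelvin
# (illustrative values only).
WHITEPOINTS = {
    3000: (1.00, 0.82, 0.63),
    4500: (1.00, 0.92, 0.83),
    6500: (1.00, 1.00, 1.00),   # neutral: no tinting
}

def gamma_ramp(temp_k, size=256, brightness=1.0):
    """Per-channel ramps of `size` entries in [0.0, 1.0]."""
    r_m, g_m, b_m = WHITEPOINTS[temp_k]
    ramp = [i / (size - 1) for i in range(size)]
    return ([v * r_m * brightness for v in ramp],
            [v * g_m * brightness for v in ramp],
            [v * b_m * brightness for v in ramp])

r, g, b = gamma_ramp(6500)
assert r[-1] == 1.0 and b[-1] == 1.0   # 6500K leaves white untouched
r, g, b = gamma_ramp(3000)
assert b[-1] < g[-1] < r[-1]           # warmer: blue dimmed the most
```

Reading the temperature back, as the request asks, would amount to recovering the multipliers from the installed ramps and inverting this mapping.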
But ideally, this should be implemented right inside of xrandr, which would make tools like sct, redshift and f.lux unnecessary, and enable many other useful uses.

# mouse confined with prime
https://gitlab.freedesktop.org/xorg/xserver/-/issues/220 (2018-12-13, Bugzilla Migration User)

## Submitted by bit..@..il.com
Assigned to **Xorg Project Team**
**[Link to original bug (#103374)](https://bugs.freedesktop.org/show_bug.cgi?id=103374)**
## Description
Created attachment 134939
xorg.conf
# Overview
Mouse cannot get into VGA-1-1.
```
$ xrandr | grep " connec"
VGA-0 connected 1280x1024+3320+0 (normal left inverted right x axis y axis) 337mm x 270mm
DVI-I-1 connected 1920x1080+1400+0 (normal left inverted right x axis y axis) 531mm x 299mm
VGA-1-1 connected 1400x1050+0+0 (normal left inverted right x axis y axis) 408mm x 306mm
```
# Setup
Intel integrated graphics card connected to VGA-1-1. Other monitors connected to nvidia graphics card.
See attachments.
**Attachment 134939**, "xorg.conf":
[xorg.conf](/uploads/f9dbd17c2314dabe2813b4148225f2e8/xorg.conf)

# PRIME output slaves should be set up automatically
https://gitlab.freedesktop.org/xorg/xserver/-/issues/225 (2018-12-13, Bugzilla Migration User)

## Submitted by pos..@..lfj.de
Assigned to **Xorg Project Team**
**[Link to original bug (#103213)](https://bugs.freedesktop.org/show_bug.cgi?id=103213)**
## Description
## Steps to reproduce
* On a laptop with an integrated and a dedicated GPU and some outputs connected to the dedicated GPU, start X11.
* Plug a screen into the dedicated GPU's output.
* Run xrandr -q
## Expected behavior
The external screen should be listed and configurable.
## Actual behavior
No external screen is listed. I have to first run "xrandr --setprovideroutputsource 1 0". That is a problem in particular for graphical display managers: getting a login on all connected screens requires somehow running this command in the DM session, which I have not yet managed to do reliably for gdm3.
Xorg clearly already knows which card is the primary card, because it can talk to my internal screen, and "glxinfo" shows the Intel card while "DRI_PRIME=1 glxinfo" shows the NVidia card. Given that it knows which card is the primary, Xorg should just add all other cards (that support it and that have outputs) as output slaves. That would significantly improve the out-of-the-box experience.
## Further information
I have a Lenovo P50 and I am using Debian testing with Xorg 7.7 and Linux 4.12. The machine has its HDMI connector wired to an NVidia GM107GLM (Quadro M2000M), while the internal display is driven by an Intel HD Graphics P530 (Skylake GT2). On the Xorg side, both cards are using the modesetting driver. My DE is Gnome 3.26.
Version: 7.7 (2012.06)

# Qt rendering problems with Samsung SyncMaster P2470HD screen
https://gitlab.freedesktop.org/xorg/xserver/-/issues/235 (2018-12-13, Bugzilla Migration User)

## Submitted by Deposite Pirate
Assigned to **Xorg Project Team**
**[Link to original bug (#103013)](https://bugs.freedesktop.org/show_bug.cgi?id=103013)**
## Description
I've had these rendering glitches (wrong rendering of the xfce start menu icon and wrong scaling of some Qt 5 apps, see attached screenshots) for years on an AMD Athlon 64 X2 + Radeon HD 6670 PC. I have two other newer AMD CPU+GPU PCs that don't have this problem. I have no special system wide config on any of my machines and I've tried a fresh config with another user account. So I'm thinking this must be a GPU rendering issue. On the affected system there is also a motherboard integrated radeon which I disabled in the BIOS.
The strange thing is that KeepassXC used to have this scaling problem, but it no longer does. The scaling problem never disappeared with qTox, however.

# xrandr --delmode fails
https://gitlab.freedesktop.org/xorg/xserver/-/issues/665 (2019-03-20, Bugzilla Migration User)

## Submitted by Jeffrey
Assigned to **Keith Packard `@keithp`**
**[Link to original bug (#93479)](https://bugs.freedesktop.org/show_bug.cgi?id=93479)**
## Description
```
Theater:/lib/firmware # xrandr --verbose --delmode HDMI1 "1920x1200"
X Error of failed request: BadAccess (attempt to access private resource denied)
Major opcode of failed request: 139 (RANDR)
Minor opcode of failed request: 19 (RRDeleteOutputMode)
Serial number of failed request: 35
Current serial number in output stream: 36
Theater:/lib/firmware # xrandr --version
xrandr program version 1.4.3
Server reports RandR version 1.4
```
Running 4.4.0-rc6 kernel with a special EGL openelec build.
I am trying to remove an unsupported resolution that my projector seems to be announcing, but I get the above message when I try.

# Slow desktop environment performance with multi-head configuration
https://gitlab.freedesktop.org/xorg/xserver/-/issues/233 (2018-12-13, Bugzilla Migration User)

## Submitted by Alen Skondro
Assigned to **Xorg Project Team**
**[Link to original bug (#93413)](https://bugs.freedesktop.org/show_bug.cgi?id=93413)**
## Description
Hello,
I already opened an issue on the arch linux forums:
https://bbs.archlinux.org/viewtopic.php?id=206297
But now I know that it has something to do with my dual-head configuration.
The main issue is that almost all DEs (gnome, gnome-wayland, enlightenment, cinnamon) show a noticeable drop in visual performance when dual-head is enabled. Disabling the secondary head results in normal and fluid animation, whereas the dual-head configuration results in degraded animations.
I'm using the nouveau (gtx770) driver on arch linux (4.4-rc5).
What could be the problem? How can I help to resolve this issue?

# when rotating and reflecting screen mouse trapped in half screen
https://gitlab.freedesktop.org/xorg/xserver/-/issues/230 (2018-12-13, Bugzilla Migration User)

## Submitted by Aivars
Assigned to **Xorg Project Team**
**[Link to original bug (#92528)](https://bugs.freedesktop.org/show_bug.cgi?id=92528)**
## Description
When rotating and reflecting the screen, the mouse gets trapped in half of the screen after the reflect.
We have a widescreen monitor which is vertically positioned and has a mirror in front of it, so we need to rotate the screen 270 degrees and then reflect it.
Commands run:
xrandr --display :0 --output HDMI1 --rotate left --reflect x
After reflecting the screen, the mouse gets trapped in the top half of the screen and cannot be moved lower than half the screen.
xorg logfile before xrandr: https://gist.github.com/Atoms/99c459b841de2dfa5744
xorg logfile after xrandr: https://gist.github.com/Atoms/27efa6c739c4a9d223d5
Video card is intel:
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

# Invalid DPI reported with screen spanning several displays
https://gitlab.freedesktop.org/xorg/xserver/-/issues/226 (2018-12-13, Bugzilla Migration User)

## Submitted by Yaroslav
Assigned to **Xorg Project Team**
**[Link to original bug (#86967)](https://bugs.freedesktop.org/show_bug.cgi?id=86967)**
## Description
Long story short: while using several displays as a single screen, xrandr reports a wrong (very small or very large) DPI, e.g. something like 130 DPI for three 23" 1920x1080 LCD displays set in a vertical array. It looks like xrandr calculates the pixel diagonal for the entire screen, sqrt(1920^2 + 3240^2) = 3766, and divides it by the diagonal in inches of a single LCD (obtained from EDID?). In some cases the DPI is very small, 20-30 DPI, and I haven't figured out what causes that, but I experienced it while using the same display setup with the NVIDIA driver. The result is a huge or very tiny font, or line spacing, in most applications that size fonts relying on DPI, first of all Qt-based GUIs, text editors, etc.
The only workaround I found is to manually set the DPI equal to the native DPI of the panel in the display section of the configuration file (maybe there is also an xrandr command which can do that?), but I wonder if this behavior is known and acceptable.
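The suspected calculation can be sketched as follows (illustrative code reproducing the reporter's arithmetic, not the actual xrandr source):

```python
# DPI computed from the whole-screen pixel diagonal, divided by the
# physical diagonal of a single panel, versus the true per-panel DPI.
from math import hypot

def dpi(px_w, px_h, diag_inches):
    """Dots per inch from a pixel size and a physical diagonal."""
    return hypot(px_w, px_h) / diag_inches

# Three 1920x1080 23" panels stacked vertically: the screen is
# 1920x3240 pixels, but the EDID diagonal is still that of one panel.
screen_dpi = dpi(1920, 3 * 1080, 23)   # inflated mixed-up value
panel_dpi  = dpi(1920, 1080, 23)       # true native DPI, ~96

assert round(panel_dpi) == 96
assert screen_dpi > 1.5 * panel_dpi    # the mismatch the bug describes
```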
Version: 7.7 (2012.06)

# xrandr --setprovideroffloadsink with 0x0 as sink crashes the server
https://gitlab.freedesktop.org/xorg/xserver/-/issues/227 (2018-12-13, Bugzilla Migration User)

## Submitted by Yuxuan Shui
Assigned to **Xorg Project Team**
**[Link to original bug (#83303)](https://bugs.freedesktop.org/show_bug.cgi?id=83303)**
## Description
First I use

```
xrandr --setprovideroffloadsink radeon Intel
```

then I try to remove the offload sink with

```
xrandr --setprovideroffloadsink radeon 0x0
```

which results in an X server crash.
```
[ 300.912] (EE) Backtrace:
[ 300.912] (EE) 0: /usr/bin/Xorg.bin (xorg_backtrace+0x56) [0x593966]
[ 300.912] (EE) 1: /usr/bin/Xorg.bin (0x400000+0x197b69) [0x597b69]
[ 300.912] (EE) 2: /usr/lib/libc.so.6 (0x7fda7edd1000+0x33df0) [0x7fda7ee04df0]
[ 300.912] (EE) 3: /usr/bin/Xorg.bin (DetachOutputGPU+0x1f) [0x437e9f]
[ 300.912] (EE) 4: /usr/bin/Xorg.bin (0x400000+0xb9e33) [0x4b9e33]
[ 300.912] (EE) 5: /usr/bin/Xorg.bin (ProcRRSetProviderOffloadSink+0x103) [0x4ff043]
[ 300.912] (EE) 6: /usr/bin/Xorg.bin (0x400000+0x376d7) [0x4376d7]
[ 300.912] (EE) 7: /usr/bin/Xorg.bin (0x400000+0x3b866) [0x43b866]
[ 300.912] (EE) 8: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7fda7edf1000]
[ 300.912] (EE) 9: /usr/bin/Xorg.bin (0x400000+0x25d0e) [0x425d0e]
```
Version: 7.7 (2012.06)

# X crashes when rotating or enabling a 3rd screen when using dual graphics cards.
https://gitlab.freedesktop.org/xorg/xserver/-/issues/217 (2018-12-13, Bugzilla Migration User)

## Submitted by met..@..oo.com
Assigned to **Xorg Project Team**
**[Link to original bug (#77075)](https://bugs.freedesktop.org/show_bug.cgi?id=77075)**
## Description
Created attachment 96930
Xorg.0.log after attempting to enable 2 monitors on card B.
I'm not sure if this bug belongs under "xrandr" really, but I'll try to provide a good description of what happens.
My system has 2 ATI Radeon HD 4850s and some integrated adapter by Aspeed, which is unused.
When I start X, one card (referred to as card A from now on) is visible in xrandr. To get the second card (card B), I use "xrandr --setprovideroutputsource 1 0". If I use "0 1" instead of "1 0", X will crash.
I have 3 identical monitors and 4 DVI outputs (2 per card). If I have 2 outputs on card B (thus one on card A) and try to enable both, X will crash.
If I have 2 outputs on card B, enable one, and try to rotate it, X will crash.
If I have 2 outputs on card A (and thus 1 on card B), I can enable all 3 outputs. I can even rotate both of the outputs on card A. However, when I have one of the outputs on Card A rotated, and I enable card B's one output, the rotated monitor on card A will be roughly half blacked out. From this point, trying to move it may change where the blacked out region is, but will also provide the following error:
```
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 7 (RRSetScreenSize)
Value in failed request: 0x0
Serial Number of failed request: 44
Current serial number in output stream: 45
```
The cursor cannot be moved into this black region.
I haven't written a bug report before, so if I've left any ambiguities I'll attempt to respond to comments.
I've attached a log which resulted from an attempt to enable 3 monitors when 2 would be on card B.
**Attachment 96930**, "Xorg.0.log after attempting to enable 2 monitors on card B.":
[file_77075.txt](/uploads/72e878ee03e562c1f8f6d4f4bf9d077e/file_77075.txt)

# randr getscreeninfo malformed reply
https://gitlab.freedesktop.org/xorg/xserver/-/issues/666 (2019-03-16, Bugzilla Migration User)

## Submitted by sny..@..yh.org
Assigned to **Keith Packard `@keithp`**
**[Link to original bug (#73180)](https://bugs.freedesktop.org/show_bug.cgi?id=73180)**
## Description
Created attachment 91355
the wireshark-captured packet when running the Python code below
```
import xcb
from xcb import randr
con = xcb.connect()
ran = con(randr.key)
ver = ran.QueryVersion(1, 3).reply()
root = con.get_setup().roots[0].root
ran.GetScreenInfo(root).reply()
```
this will panic:
```
Traceback (most recent call last):
File "a.py", line 11, in `<module>`
ran.GetScreenInfo(root).reply()
File "/usr/lib/python2.7/dist-packages/xcb/randr.py", line 101, in __init__
self.rates = xcb.List(parent, offset, (self.nInfo - self.nSizes), RefreshRates, -1)
File "/usr/lib/python2.7/dist-packages/xcb/randr.py", line 56, in __init__
(self.nRates,) = unpack_from('H', parent, offset)
struct.error: unpack_from requires a buffer of at least 2 bytes
```
because the X server replies with a malformed packet, apparently caused by this commit: https://bugs.freedesktop.org/show_bug.cgi?id=21861#c4
**Attachment 91355**, "the wireshark captured packet when run above python code":
[randr.pcapng](/uploads/0bd142dfe2558daf0ff84e8432730d65/randr.pcapng)

# `xrandr --setprovideroutputsource 0 0` causes an assert in the server
https://gitlab.freedesktop.org/xorg/xserver/-/issues/221 (2018-12-13, Bugzilla Migration User)

## Submitted by Daniel Martin
Assigned to **Xorg Project Team**
**[Link to original bug (#65096)](https://bugs.freedesktop.org/show_bug.cgi?id=65096)**
## Description
With these (obviously wrong) values the server asserts at
dix/dispatch.c:3938 in DetachUnboundGPU().
It happens with other values, too, e.g. by passing the ID of an existing provider twice.
Version: git

# Backlight is getting set to maximum when reading it
https://gitlab.freedesktop.org/xorg/xserver/-/issues/236 (2018-12-13, Bugzilla Migration User)

## Submitted by tom..@..il.com
Assigned to **Xorg Project Team**
**[Link to original bug (#59942)](https://bugs.freedesktop.org/show_bug.cgi?id=59942)**
## Description
When I try to get the current backlight setting with xbacklight -get or xrandr --prop, it is getting set to maximum, and is reported as being set to maximum.
How to reproduce:
xbacklight -set 40 # backlight is actually getting dimmer
xbacklight -get # backlight is getting brighter, and is reported as 100.000000
Distribution: Ubuntu 12.10
Xorg server version: xorg-server 2:1.13.0-0ubuntu6.1
Laptop model: Samsung NP300V5 (with i7-2670QM CPU)
GPU:
- Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
- NVIDIA Corporation GF119 [GeForce GT 520MX] (Optimus, this one is actually disabled with bumblebee)
Version: 7.7 (2012.06)

# less xrandr modes for LVDS1 with current kernel/driver than with the older ones
https://gitlab.freedesktop.org/xorg/xserver/-/issues/223 (2018-12-13, Bugzilla Migration User)

## Submitted by Toralf Förster
Assigned to **Xorg Project Team**
**[Link to original bug (#54571)](https://bugs.freedesktop.org/show_bug.cgi?id=54571)**
## Description
With a new ThinkPad T420 (4180f65) with integrated Intel graphics, I'm wondering why, on a RedHat system (kernel 2.6.32, Xorg 7.3, intel driver 2.16.0), xrandr gives:
```
$ cat /mnt/E/xrandr.2.6.32
Screen 0: minimum 320 x 200, current 2960 x 900, maximum 8192 x 8192
LVDS1 connected 1600x900+0+0 (normal left inverted right x axis y axis) 309mm x 174mm
   1600x900       60.0*+   40.0
   1440x900       59.9     59.9
   1360x768       60.0
   1280x800       59.8     59.9
   1280x768       59.9     60.0
   1024x768       60.0
   800x600        60.3     56.2
   848x480        60.0
   640x480        59.9
VGA1 connected 1360x768+1600+0 (normal left inverted right x axis y axis) 700mm x 390mm
   1360x768       49.9*+
   1024x768       60.0
   800x600        60.3     56.2
   640x480        60.0
```
while with Gentoo Linux (kernel 3.5.3, xorg-server 1.12.4, intel driver 2.20.5) I get fewer LVDS1 modes (with a different VGA monitor):
```
$ xrandr
Screen 0: minimum 320 x 200, current 1680 x 1050, maximum 8192 x 8192
LVDS1 connected (normal left inverted right x axis y axis)
   1600x900       60.0  +  40.0
   1024x768       60.0
   800x600        60.3     56.2
   640x480        59.9
VGA1 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 430mm x 270mm
   1680x1050      59.9*+
   1600x1200      60.0
```
:-(
Version: 7.7 (2012.06)