VRR: Allow VRR enable/disable without requiring a full modeset
VRR/Adaptive Sync does not appear to be working for me. Interestingly enough, if "Adaptive Sync" is enabled in the monitor settings, the monitor reports the refresh rate as 145Hz, instead of the 144Hz that it is set to according to the display configuration. If "Adaptive Sync" is disabled then the refresh rate is correctly reported as 144Hz. Relevant part of the log seems to be:
Nov 15 20:52:36 andrew-gentoo-pc kernel: i915 0000:03:00.0: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr.enable (expected no, found yes)
Nov 15 20:52:36 andrew-gentoo-pc kernel: i915 0000:03:00.0: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr.vmin (expected 0, found 1480)
Nov 15 20:52:36 andrew-gentoo-pc kernel: i915 0000:03:00.0: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr.vmax (expected 0, found 4442)
Nov 15 20:52:36 andrew-gentoo-pc kernel: i915 0000:03:00.0: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr.flipline (expected 0, found 1481)
Nov 15 20:52:36 andrew-gentoo-pc kernel: i915 0000:03:00.0: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr.guardband (expected 0, found 40)
The full dmesg with debug info is attached here: dmesg
I have tried setting the refresh rate of the monitor to 100Hz instead, tried different games, and tried toggling the "Adaptive Sync" setting in the KDE Plasma Display Configuration from "Automatic" to "Always". None of this changes anything.
System Information:
Gentoo Linux x86_64
Kernel Version 6.1.0-rc5+ (latest drm-tip, also occurs with the latest kernel in Gentoo: 6.0.8)
Mesa Version 22.2.3
Motherboard: MSI Z370-A PRO (upgraded to latest firmware, Resizable BAR enabled)
Display: MSI Optix MAG342CQR connected using a DisplayPort 1.4 cable
GPU: Intel ARC A770
CPU: Intel i9-9900KS with Intel UHD Graphics 630 (rev 02) (iGPU is the boot GPU, dGPU is primary in X/wayland)
DE: KDE Plasma 5.26.3 with KDE Frameworks 5.100.0 on Qt 5.15.5 (wayland session)
That shows the driver is at least OK with enabling VRR in the end, but there is no actual modeset after that, so this looks more like a TEST_ONLY atomic commit, and then afterwards userspace decided not to enable VRR anyway. So at first glance this looks more likely to be a userspace problem.
What happens if you disable all the other displays except the VRR capable one?
I suppose it could also be due to the need for a full modeset. Maybe kwin doesn't want to do one. In that case you should try to enable VRR before enabling the display. Hopefully kwin would enable VRR already at the same time as it lights up the display.
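For reference, the kind of sequence I mean looks roughly like this from the userspace side (a minimal libdrm sketch, not kwin's actual code; the CRTC ID and the VRR_ENABLED property ID are placeholders assumed to have been looked up already):

#include <errno.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/*
 * Minimal sketch (placeholder IDs, not kwin's actual code): probe a VRR
 * enable with a TEST_ONLY commit first, and only if the kernel accepts
 * it, commit the same state for real. Passing allow_modeset permits the
 * driver to do a full modeset for this change if it needs one; a real
 * compositor would also be setting MODE_ID/ACTIVE and plane properties
 * in the same request when it lights up the display.
 */
static int enable_vrr(int fd, uint32_t crtc_id, uint32_t vrr_enabled_prop,
                      int allow_modeset)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    uint32_t flags = DRM_MODE_ATOMIC_TEST_ONLY;
    int ret;

    if (!req)
        return -ENOMEM;
    if (allow_modeset)
        flags |= DRM_MODE_ATOMIC_ALLOW_MODESET;

    drmModeAtomicAddProperty(req, crtc_id, vrr_enabled_prop, 1);

    /* TEST_ONLY: the kernel validates the state but changes nothing. */
    ret = drmModeAtomicCommit(fd, req, flags, NULL);
    if (ret == 0) {
        /* The check passed; now apply the same state for real. */
        ret = drmModeAtomicCommit(fd, req,
                                  flags & ~DRM_MODE_ATOMIC_TEST_ONLY, NULL);
    }

    drmModeAtomicFree(req);
    return ret;
}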
What happens if you disable all the other displays except the VRR capable one?
No difference :(
I suppose it could also be due to the need for a full modeset. Maybe kwin doesn't want to do one. In that case you should try to enable VRR before enabling the display. Hopefully kwin would enable VRR already at the same time as it lights up the display.
How can I ensure that VRR is enabled before the display is enabled? SDDM runs before kwin does and I'm pretty sure SDDM does not enable VRR since it uses X and I never configured this in X. I have tried to configure SDDM to not use the VRR capable monitor, so kwin is the first thing to display something on it, but this also made no difference.
The reason I thought this was a kernel level bug is because this exact same setup worked before I replaced the GPU. I did not have to do anything special apart from enabling it in the KDE Display Settings. I can report this issue to the kwin bug tracker if you really think it is a userspace problem.
I'll see if I can get a kwin debug log, maybe this will show us what is going wrong.
Also, while investigating I discovered something odd in lspci:
lspci reports the PCI-e link to the GPU as PCI-e version 1 with width 1, while it is actually in a PCI-e version 3 slot with width 16. I double-checked that it is inserted properly, and it is. The slot should be fine (with the old GPU it was detected properly as 3.0 x16). The GPU is brand new, so I think it is safe to assume that its PCI-e connector is also fine. Furthermore, the performance is about what I would expect; it does not at all feel like it is being bottlenecked by a PCI-e 1.0 x1 link. This leaves a detection error in the GPU or motherboard driver/firmware as an explanation for the lspci output. Do you think this is an i915 bug? Should I report it here in a new bug report?
drm.debug=0x1e should show something along the lines of "[CRTC:id:name] requires full modeset" when that is the cause of the failure.
It looks like this is indeed the root cause of my problem. I added this boot parameter and found a bunch of "requires full modeset" in the log: dmesg-full-2
I disabled and re-enabled the monitor, and also tried switching the "Adaptive Sync" setting from Automatic to Always. My understanding is that "Automatic" will only enable VRR when a full screen application is running. Neither setting makes VRR work, no matter how often I cycle the monitor. I'll report this over at kwin since it seems that the problem is as you say, kwin does not enable VRR directly when it enables the monitor.
If I understand you correctly, then the kwin VRR "Automatic" setting is fundamentally incompatible with i915 currently, at least without forcing a full modeset each time a full screen application is started, which does not sound desirable. But things should be able to work with the "Always" setting, provided that kwin enables VRR immediately when it enables the monitor.
Presumably amdgpu allows vrr enable/disable w/o a full modeset. i915 does not, at least not yet.
Are there any plans to change this at some point? It would be great if kwin's VRR "Automatic" setting could also work on Intel platforms, as I understand it "Automatic" is the default value.
Requiring a modeset for changing VRR_ENABLED is not acceptable for userspace; it goes against the established uAPI and is simply not practical. The drm API provides no usable way to simulate a non-VRR mode from userspace while leaving VRR_ENABLED set to 1, and neither allowing the refresh rate to constantly fluctuate (at least while that can still cause brightness flicker) nor doing modesets during normal usage is a good idea.
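For concreteness, the flip path in question looks roughly like this (a hedged sketch with placeholder IDs, not KWin's actual code): the VRR toggle rides along with the new framebuffer in a nonblocking commit that deliberately omits ALLOW_MODESET, so a driver that needs a full modeset for that change can only fail the commit.

#include <errno.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/*
 * Sketch of a compositor's per-frame flip (placeholder IDs, not KWin's
 * actual code): the new framebuffer and the VRR_ENABLED value go into a
 * single nonblocking commit without DRM_MODE_ATOMIC_ALLOW_MODESET.
 */
static int flip_with_vrr(int fd,
                         uint32_t plane_id, uint32_t fb_id_prop, uint32_t fb_id,
                         uint32_t crtc_id, uint32_t vrr_enabled_prop,
                         int want_vrr)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    int ret;

    if (!req)
        return -ENOMEM;

    drmModeAtomicAddProperty(req, plane_id, fb_id_prop, fb_id);
    drmModeAtomicAddProperty(req, crtc_id, vrr_enabled_prop,
                             want_vrr ? 1 : 0);

    ret = drmModeAtomicCommit(fd, req,
                              DRM_MODE_ATOMIC_NONBLOCK |
                              DRM_MODE_PAGE_FLIP_EVENT,
                              NULL);

    drmModeAtomicFree(req);
    return ret;
}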
Are there any estimates for how long it will take to fix this in i915?
Are there any estimates for how long it will take to fix this in i915?
Is there anything I can do as a user to help here? I tried to enable VRR in Xorg.conf with Option "VariableRefresh" "true" in the hope that it might somehow work if SDDM already enables it before kwin wayland starts, but sadly this did not help. Writing the code is beyond my skill, but I'd be more than happy to test and debug any patches in this area.
Nowa Ammerlaan changed title from "VRR not working: [drm:pipe_config_mismatch] [CRTC:131:pipe B] fastset mismatch in vrr" to "VRR: Allow VRR enable/disable without requiring a full modeset"
This should eliminate the need for a full modeset to toggle VRR:
https://github.com/vsyrjala/linux.git vrr_fastset
Quickly smoke tested here on TGL/ADL + ASUS MG278Q (there's also a horrendous hack in that branch for the MG278Q). And turns out this ADL T14 gen3 laptop also has a VRR eDP panel (only 40-60Hz though) so tested on that one as well.
Amazing 🥳! With your branch VRR works exactly as it did with my old AMD card. Kwin-wayland enables it automatically when a full screen game starts, and then disables it again when the game stops. Thank You!
I did notice that during boot, when the kernel loads, the screen blanks significantly longer, but I guess this is related to your MG278Q workaround?
The workaround should only kick in the first time VRR actually gets enabled. Presumably nothing should enable VRR during boot, so not sure what's going on there.
You could try a build from the baseline commit of that branch, commit 022aa65c6694 ("drm-tip: 2023y-03m-07d-20h-59m-24s UTC integration manifest"). If that still exhibits the long blank, then we know it's caused by something other than the VRR stuff. But if the blank is gone, then there might be something a bit fishy going on with my VRR changes.
There should also be a debug message Waiting up to 800 ms before enabling VRR when the workaround does happen. So if you boot with drm.debug=0xe but never actually start anything that should enable VRR, then you could check the dmesg for that message.
Actually, never mind. This happened twice yesterday, but today no matter what I do I can't seem to reproduce it. Probably this was related to something else. I have a Bluetooth adapter that is very flaky and sometimes causes boot to hang for a bit, so it was probably just the Bluetooth adapter bugging out again. Dmesg outputs "usb 1-12: can't read configurations, error -110" when this happens, and this is in yesterday's log as well.
There should also be a debug message Waiting up to 800 ms before enabling VRR when the workaround does happen. So if you boot with drm.debug=0xe but never actually start anything that should enable VRR, then you could check the dmesg for that message.
I don't see any of these messages either before starting VRR.
I've noticed that sometimes when VRR is on the display starts flickering. This appears to only happen when the FPS is lower than the lowest refresh rate the monitor supports (48Hz). Normally in this case the display would run at some integer multiple of the FPS, e.g. for 30 it would run at 60, 90 or 120. If I turn on the refresh rate indicator of the monitor, it shows that the refresh rate is rapidly switching: it appears unable to decide whether the refresh rate should be double or triple the FPS and instead alternates between these two very frequently, causing this flickering effect. So far I have only observed this in the menus of Hitman III, where the FPS is throttled to 30. Once the game loads the FPS increases and the problem disappears.
Possibly this could also be a bug somewhere else and unrelated to your vrr_fastset branch.
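For illustration, the behaviour I'd expect is something like this (a rough sketch of the idea only, not the actual low framerate compensation code in the driver or in kwin):

#include <stdio.h>

/*
 * Rough sketch of the "integer multiple" idea, not the actual low
 * framerate compensation code: for content below the monitor's minimum
 * refresh rate, pick the smallest multiple of the content rate that
 * falls inside the VRR range. The flicker looks like the hardware
 * alternating between two valid multiples instead of sticking to one.
 */
static int pick_refresh(int fps, int vmin, int vmax)
{
    for (int m = 1; fps * m <= vmax; m++) {
        if (fps * m >= vmin)
            return fps * m;
    }
    return vmin; /* nothing fits, fall back to the minimum */
}

int main(void)
{
    /* 30 fps content on a 48-144 Hz panel should stay at 60 Hz (2x). */
    printf("%d Hz\n", pick_refresh(30, 48, 144));
    return 0;
}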
The flickering is a common issue with many monitors.
Not sure if there could be something done in Mesa/etc. about the throttling issue. I also noticed wild frame rate changes with dxvk when I tried to use DXVK_FRAME_RATE=45 on my laptop. ~45 being the best my GPU could maintain with that game, and so I thought I could use VRR to get a steady 45. Didn't work that way. So I went back to using DXVK_FRAME_RATE=30 and no VRR.
@vsyrjala you're a god. Your patches allow me to play with VRR, at least in Wayland on my setup.
I'm on a desktop using PRIME render offload with the NVIDIA proprietary drivers on an RTX 3090, with the 13700KF's Intel iGPU driving the desktop and the display. All tests were on Plasma.
In vrrtest there's the hardware cursor bug (when moving the mouse, VRR wrongly stops syncing the fullscreen window), BUT in gaming it worked flawlessly (most games override the cursor, and it doesn't affect camera movement).
In heavy games like Cyberpunk 2077 (on VKD3D) I like to lock fps a bit under my refresh rate for optimal latency+vrr smoothness so 72fps @ 75hz (range 40-75), did it with the in-game frame limiter and vsync off. Runs flawlessly, latency and smoothness on par with Windows or NVIDIA-only Xorg with everything on Ultra+Ray-tracing.
Multiversus is a competitive fighting game that runs locked at 60fps, so VRR is a must on a 75hz monitor. Also ran flawlessly.
Using mangohud instead of the in-game limiter kinda worked, but latency was higher and it stuttered, so that's something to keep in mind. Haven't tried DXVK_FRAME_RATE or libstrangle yet though.
On Xorg (modesetting) it unfortunately only works for the Intel card, and prime render offload seems to ignore it. Any ideas why @vsyrjala ?
It does, but still only with drm-tip. The relevant patches still haven't made it into a kernel release.
To enable it, there should be a drop-down menu in the KDE Display Settings for that monitor; if that is not there, then there is some other issue.
Maybe @vsyrjala knows when we can expect VRR to work with a released kernel. At some point I'd like to go back to the releases again instead of riding the drm-tip.
The flickering is a common issue with many monitors.
With the latest mesa and proton experimental the flickering issue is also gone. The in-menu refresh rate of Hitman 3 is now exactly the minimum rate my monitor supports (around 53 instead of 30). Whether this is something intentional or coincidental I don't know.
Hi, bringing this up again to clarify some of the discussion here and in drm/amd#2200. There seems to be both a push towards not requiring a modeset for switching VRR on/off [1,2,3], and towards allowing it given ALLOW_MODESET is specified in the request [4].
For the former, the suggestion was to always leave the display in VRR and handle VRR toggles by controlling the frame timings purely from the driver. The issue is that certain features on many displays (such as Ultra Low Motion Blur) would be permanently locked out, since we never issue a modeset to take the display out of VRR. This is not a problem on DP AdaptiveSync, since we can set the IGNORE_MSA bit as discussed by @aaurabin, but I'm assuming it is for other displays?
For the latter, the problem is that if we use ALLOW_MODESET to take the display out of VRR via a modeset and then try to enable VRR without ALLOW_MODESET, this will not work. What should the behavior be in this case? The current API contract seems ambiguous here.
Should there be an additional VRR property value besides VRR_ENABLED = {true/false}? Say, for example, VRR_ENABLED = ADAPTIVE? With {TRUE, ADAPTIVE, FALSE}, VRR_ENABLED = FALSE would shut off VRR on the display (potentially with a modeset), and TRUE/ADAPTIVE would turn it on (again, potentially with a modeset). ADAPTIVE would enable driver-side control over the VRR timing while keeping VRR enabled on the display. With this design, users would still be able to turn VRR off on the display if desired. Of course, this would involve an upstream proposal to the kernel and adoption by compositors, which could be a lengthy process.
I would appreciate some discussion/follow-up on these points. If my understanding has gaps, please let me know.
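To make the proposal a bit more concrete, such a tri-state property could be created along these lines (a hypothetical sketch with made-up names, not an existing property or an actual kernel patch):

#include <linux/kernel.h>
#include <drm/drm_device.h>
#include <drm/drm_property.h>

/*
 * Hypothetical sketch of the proposed tri-state property (names are made
 * up for illustration; this is not an existing property or a real patch):
 *   False    - VRR fully off on the display (may require a modeset)
 *   True     - VRR on, refresh rate follows page flips as it does today
 *   Adaptive - VRR kept enabled on the display, with the driver managing
 *              the frame timing so the effective refresh rate stays fixed
 */
enum hypothetical_vrr_mode {
    HYPOTHETICAL_VRR_FALSE,
    HYPOTHETICAL_VRR_TRUE,
    HYPOTHETICAL_VRR_ADAPTIVE,
};

static const struct drm_prop_enum_list hypothetical_vrr_mode_list[] = {
    { HYPOTHETICAL_VRR_FALSE, "False" },
    { HYPOTHETICAL_VRR_TRUE, "True" },
    { HYPOTHETICAL_VRR_ADAPTIVE, "Adaptive" },
};

static struct drm_property *
create_hypothetical_vrr_mode_property(struct drm_device *dev)
{
    return drm_property_create_enum(dev, 0, "VRR_MODE",
                                    hypothetical_vrr_mode_list,
                                    ARRAY_SIZE(hypothetical_vrr_mode_list));
}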
Making a new property that userspace can use to control the actual VRR state for the display sounds like a good idea to me, but I wonder if it's necessary - at the Display Next Hackfest we discussed an API to allow compositors to set a refresh rate min+max with VRR, which should cover this as well by allowing the compositor to just set the refresh rate to max at all times.
@Zamundaaa I'm not sure if that would address the problem of not issuing a modeset for applicable displays. For example, true G-SYNC displays with an FPGA require a modeset to disable G-SYNC. Without it, various display features may be unavailable as VRR is never disabled fully.
Current status is that a VRR transition does not require a full modeset on amdgpu. There is no ambiguity IMO - DP must support seamless VRR transitions, whereas HDMI does not mandate them in the spec. Despite HDMI not mandating it, amdgpu does not issue a full modeset for a VRR transition on HDMI. That is the point of contention AFAIU.
The correct thing to do here is to reject seamless VRR transitions on HDMI only, provided the compositors are ready to accept that (which is why we decided to wait until Plasma was ready to handle it). Though if we implement the correct behavior that is compliant with the HDMI spec in the driver, we lose the ability to provide a seamless transition for displays that support it even on HDMI. One option is to let the compositor decide whether a monitor is known to require a full modeset for the transition on HDMI. The other option is to let amdgpu decide whether to require a full modeset for the particular monitor. Since it's a hardware-specific anomaly, I'm inclined towards putting it in the driver.