I wanted to work on implementing HDR support in GTK using gnome-shell's new HDR support and got this shiny new laptop, but it seems HDR support doesn't work. We (read: me with the help of the gnome-shell developers) made sure that the kernel calls are actually succeeding, but I could not see any visual difference from switching HDR on and off.
Steps to reproduce:
Get a ThinkPad X1 Carbon Gen 9 20XXS95H00 containing "PCI 8086:9a49 Intel Corporation TigerLake-LP GT2 [Iris Xe Graphics]".
Make sure it contains the fancy HDR monitor with this edid:
Block 0, Base EDID:
  EDID Structure Version & Revision: 1.4
  Vendor & Product Identification:
    Manufacturer: CSO
    Model: 5123
    Made in: 2020
  Basic Display Parameters & Features:
    Digital display
    Bits per primary color channel: 10
    DisplayPort interface
    Maximum image size: 30 cm x 19 cm
    Gamma: 2.20
    Supported color formats: RGB 4:4:4
    First detailed timing includes the native pixel format and preferred refresh rate
    Display is continuous frequency
  Color Characteristics:
    Red  : 0.6650, 0.3359
    Green: 0.2539, 0.6875
    Blue : 0.1386, 0.0527
    White: 0.3066, 0.3408
  Established Timings I & II: none
  Standard Timings: none
  Detailed Timing Descriptors:
    DTD 1: 3840x2400 60.000000 Hz 16:10 148.800 kHz 595.200000 MHz (302 mm x 189 mm)
      Hfront 48 Hsync 32 Hback 80 Hpol N
      Vfront  3 Vsync  6 Vback 71 Vpol N
    Display Range Limits:
      Monitor ranges (Bare Limits): 48-60 Hz V, 149-149 kHz H, max dotclock 600 MHz
    Alphanumeric Data String: 'CSOT T3'
    Alphanumeric Data String: 'MNE007ZA1-2'
  Extension blocks: 1
Checksum: 0x71
----------------
Block 1, CTA-861 Extension Block:
  Revision: 3
  Native detailed modes: 0
  Colorimetry Data Block:
    BT2020RGB
  HDR Static Metadata Data Block:
    Electro optical transfer functions:
      Traditional gamma - SDR luminance range
      SMPTE ST2084
    Supported static metadata descriptors:
      Static metadata type 1
    Desired content max luminance: 106 (496.743 cd/m^2)
    Desired content max frame-average luminance: 106 (496.743 cd/m^2)
    Desired content min luminance: 36 (0.099 cd/m^2)
Checksum: 0x9a
Unused space in Extension Block: 112 bytes
Hi @company, sure, we will look into this issue. At first glance, your drm_info results suggest the monitor is being put into HDR mode correctly: the Colorspace property is set to BT2020_RGB and the HDR_OUTPUT_METADATA blob property is set.
Usually, external HDR monitors will show a pop-up saying "HDR" or "HDR vivid" when they are switched to HDR mode.
The merge request adding HDR modes only forces the monitor into HDR mode; it does not increase or decrease the luminance value or any other such metadata. So it may be expected that you do not see any major visual differences.
We have tried HDR video playback with the mpv player on the latest gnome-shell/mutter 44.1 and we do see a visual/HDR enhancement on screen on Intel platforms, see here: play HDR video using mpv player.
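For anyone who wants to double-check the same two properties without drm_info, here is a minimal sketch using libdrm; the card node and connector id are placeholders, not values from this report.

/* Sketch: dump the Colorspace and HDR_OUTPUT_METADATA connector properties,
 * roughly the information drm_info shows. */
#include <fcntl.h>
#include <inttypes.h>
#include <stdio.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static void dump_hdr_props(int fd, uint32_t connector_id)
{
	drmModeObjectProperties *props =
		drmModeObjectGetProperties(fd, connector_id, DRM_MODE_OBJECT_CONNECTOR);

	if (!props)
		return;

	for (uint32_t i = 0; i < props->count_props; i++) {
		drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);

		if (!prop)
			continue;
		if (!strcmp(prop->name, "Colorspace") ||
		    !strcmp(prop->name, "HDR_OUTPUT_METADATA"))
			/* Colorspace is an enum index; HDR_OUTPUT_METADATA is a blob id,
			 * non-zero meaning a metadata blob is attached. */
			printf("%s = %" PRIu64 "\n", prop->name, props->prop_values[i]);
		drmModeFreeProperty(prop);
	}
	drmModeFreeObjectProperties(props);
}

int main(void)
{
	int fd = open("/dev/dri/card0", O_RDWR); /* placeholder card node */

	if (fd < 0)
		return 1;
	dump_hdr_props(fd, 95); /* placeholder connector object id */
	return 0;
}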
Usually, external HDR monitors will show a pop-up saying "HDR" or "HDR vivid" when they are switched to HDR mode.
Note that this is about an internal laptop panel. I unfortunately do not know of any way to validate that those screens successfully turned on HDR mode other than observing the changed behavior.
IME, if you put an external monitor into HDR mode and do not alter the pixel values accordingly, the resulting image is very clearly different from SDR mode: it is far too contrasty. Colors may or may not be too saturated as well, depending on how the monitor behaved in SDR.
But internal panels are probably very different from external monitors, and I believe drivers carry a great deal of responsibility for handling them appropriately, since things like the HDR_OUTPUT_METADATA or Colorspace properties may not actually be hooked up to anything at all at the panel end. Or at least, that is my guess: why would they be for built-in displays, when there is no need for color-standardised signalling like BT.2100 if you simply cannot unplug the panel and plug in something else instead?
I recently started asking these questions on dri-devel.
Another anecdote, from a sample set of one external monitor: the max luminance in HDR_OUTPUT_METADATA needs to be greater than 100 cd/m² for that particular monitor to engage HDR mode. Setting the EOTF alone is not enough. Beyond that, it does not matter what the max luminance value is; for that one monitor it acts like a boolean.
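For context, this is the kind of blob that gets attached; a minimal sketch using the hdr_output_metadata UAPI struct, where the numeric values are only illustrative (loosely based on the EDID above) and the include path depends on how the libdrm/kernel headers are installed.

/* Illustrative only: an HDR_OUTPUT_METADATA payload with the PQ EOTF and a
 * max luminance above 100 cd/m^2, the apparent threshold for that monitor. */
#include <string.h>
#include <drm/drm_mode.h> /* struct hdr_output_metadata (kernel UAPI) */

static struct hdr_output_metadata make_pq_metadata(void)
{
	struct hdr_output_metadata meta;

	memset(&meta, 0, sizeof(meta));
	meta.metadata_type = 0;            /* Static Metadata Type 1 */
	meta.hdmi_metadata_type1.eotf = 2; /* SMPTE ST 2084 (PQ) */
	meta.hdmi_metadata_type1.metadata_type = 0;
	/* CTA-861 units: max values in cd/m^2, min in 0.0001 cd/m^2 */
	meta.hdmi_metadata_type1.max_display_mastering_luminance = 496;
	meta.hdmi_metadata_type1.min_display_mastering_luminance = 990;
	meta.hdmi_metadata_type1.max_cll = 496;
	meta.hdmi_metadata_type1.max_fall = 400;
	return meta;
}

The blob would then be created with drmModeCreatePropertyBlob() and assigned to the connector's HDR_OUTPUT_METADATA property in the same atomic commit that sets Colorspace.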
I see absolutely no visual difference, no matter if I watch HDR videos, look at images or if I just set the whole screen to #FF0000, #00FF00 or #0000FF.
Well, that is strange. Why does it happen with the internal panel only?
Not sure if we have any eDP panel with HDR support to replicate this issue. We will check and let you know. Thanks.
As reported by @company, we can reproduce this issue on our local setup, TigerLake-LP GT2 [Iris Xe Graphics] with an HDR-capable 4K eDP panel. It correctly sets the eDP panel to HDR mode; at least we can confirm that from the drm_info & mutter logs:
@mpearson Do you have any input or specs on the ThinkPad X1 Carbon HDR eDP panels? We seem to think we're enabling HDR but can't see any visual difference.
I'll forward this on to the graphics team for their input - I have no experience with HDR at all. Internal ticket is LO-2452 (for my benefit and tracking)
Just a thought (and I hate to ask it), but on that particular system do you see a difference with HDR on Windows? That might help rule out panel capabilities.
Just the fact that the PQ and 2.2 gamma curves are very different means that the same pixel values produce different shades of gray in SDR than in HDR. It would be nice to test on Windows though.
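To put a rough number on that, here is the arithmetic for the same 50% signal under the two transfer functions; the 200 cd/m² SDR reference white is an arbitrary assumption.

/* Compare what a 50% code value means under gamma 2.2 vs. PQ (ST 2084). */
#include <math.h>
#include <stdio.h>

static double pq_eotf(double e) /* e in [0,1] -> absolute cd/m^2 */
{
	const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 4096.0 * 128.0;
	const double c1 = 3424.0 / 4096.0;
	const double c2 = 2413.0 / 4096.0 * 32.0, c3 = 2392.0 / 4096.0 * 32.0;
	double p = pow(e, 1.0 / m2);

	return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

int main(void)
{
	const double e = 0.5, sdr_white = 200.0; /* assumed SDR peak, cd/m^2 */

	printf("gamma 2.2: %.1f cd/m^2\n", pow(e, 2.2) * sdr_white); /* ~43.5 */
	printf("PQ:        %.1f cd/m^2\n", pq_eotf(e));              /* ~92.2 */
	return 0;
}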
Please let me know if there are any specifics that would help - I have (I think) the right contacts internally now for any questions. They confirmed that this panel supports HDR and recommended testing HDR content under Windows to make sure it is working correctly.
I wasn't sure what follow up questions I should ask I'm afraid.
Let's see if I can formulate the questions for @mpearson to ask.
From eDP signalling point of view, what must the source do to have the sink/panel enter "HDR mode"? HDR mode means access to the full color gamut and the full dynamic range of the panel.
Do you also need to touch backlight controls to enter HDR mode, and how?
Does a HDR mode even exist separately, or should one just turn the backlight to max, use high bits-per-channel pixel format, and tone map in a HDR fashion at the source?
Once HDR mode is on, what is the expected pixel color encoding? Is it panel native color gamut, BT.2020, or something else? What is the expected transfer characteristic? How does the emitted light luminance range map to pixel values?
What fields of the panel EDID can we trust? Does the fact that it is an internal panel or eDP change the meaning of any EDID fields in practice?
Does the eDP sink/panel understand colorimetry in infoframes? E.g. telling the sink that the source is encoding pixels with BT.2020 primaries and white point. Does the sink understand and process HDR Static Metadata Type 1 messages? Are there requirements in the static metadata values to enable HDR, e.g. require setting max luminance > 100 cd/m²?
Is there something completely outside of the eDP signalling that would be relevant? Some platform ACPI call to enable HDR or whatever?
Some discussion and trying to track down someone who can help with the questions.
Thinking aloud (and hopefully this turns out to be an unnecessary question) - but if needed do we have someone from Intel/Red Hat/other who is under NDA with Lenovo already for discussion and would be OK for an 'internal' discussion first?
The reason is that it's always difficult to translate internal proprietary and protected information (schematics or internal design documents) into just the bits that are needed for a driver implementation and can be open-sourced. We used this approach on another project and it worked pretty well.
Note - I don't know if the team would accept it here, but floating it as an idea to maybe accelerate things and save on the levels of translation between me and the team in Japan. Let me know if there is anybody and I'll float the idea internally.
Hi @company, @mpearson,
We tried installing Windows on our local setup, TigerLake-LP GT2 [Iris Xe Graphics] with an HDR-capable 4K eDP panel, and we were able to see visual differences when toggling HDR. Attaching screen captures taken while toggling the "Use HDR" option:
HDR - OFF
HDR - ON
However, as reported earlier, we couldn't see visual differences with Linux on this setup.
Thanks for testing, but I think it still does not explain who is changing their behavior: is it only the Windows compositor and panel driver software, or is it also the panel and/or backlight hardware?
If the only difference intended by the hardware manufacturer is in the software tone-mapping, meaning that the panel hardware is intentionally ignoring most HDR signalling parameters, then Linux/Mutter changing only the display signalling is expected to make no difference.
I guess the essence of the question here is: is HDR for laptop panels created purely by the operating system and drivers, or is there some actual hardware toggle too?
If a hardware toggle exists, is a Linux DRM driver perhaps hardcoding that toggle to "HDR" already? I forget if it was @vsyrjala or @jani mentioning something about a panel behavior switch in IRC.
Hi @pq, @uma_mnnit,
As suggested, we tried enabling the DPCD backlight option on the HDR-capable eDP panel as follows:
i915.enable_dpcd_backlight=1
However, we still don't see any visual difference when we turn the HDR option ON or OFF. The backlight brightness level always remains at 600, see here.
For comparison, the max/peak brightness level of the panel when HDR is enabled in Windows is also 600 nits, see here. So it looks like HDR on the eDP panel does not change the brightness level, even when the HDR & Colorspace properties are set; it simply ignores them.
Note: prior to enabling the backlight option via the i915 boot parameter, the default brightness level was 184. We also tried changing the brightness level to 255, and enabling HDR does not change the brightness; it remains at 255 whether HDR is ON or OFF.
As per our initial understanding, it looks like we never change the brightness level on eDP panels except for suspend/resume. @jani, please correct us if our understanding is wrong: as soon as we turn on HDR mode, the brightness level is expected to increase to its max/peak value, but that is not happening here.
Also, when enabling HDR mode, the brightness level increases to peak/max on an external HDR monitor, and the same is observed with Windows. Along with the brightness level, the color bit depth should also increase from the default to 10 or 12 bit.
If a hardware toggle exists, is a Linux DRM driver perhaps hardcoding that toggle to "HDR" already? I forget if it was @vsyrjala or @jani mentioning something about a panel behavior switch in IRC.
For userspace and end users, it would be best if the KMS driver does not hammer the backlight to max on enabling HDR. If people want to improve battery life, they might choose to lower the backlight level even in HDR mode, and have their display server tone-map according to the reduced peak luminance. Diffuse white level will probably be another end user control or userspace automatic control.
Of course, that requires knowing how the backlight control behaves. I have doubts that /sys/class/backlight/intel_backlight/brightness would actually be in nits.
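For what it's worth, the raw numbers being compared come straight from sysfs; a trivial sketch of reading them is below, deliberately treating the values as unitless, since whether they are PWM duty cycles, nits, or something else is exactly what is in doubt.

/* Read the current and maximum backlight values from the intel_backlight
 * device; the unit of these numbers is intentionally left unspecified. */
#include <stdio.h>

int main(void)
{
	FILE *b = fopen("/sys/class/backlight/intel_backlight/brightness", "r");
	FILE *m = fopen("/sys/class/backlight/intel_backlight/max_brightness", "r");
	long cur = 0, max = 1;

	if (!b || !m)
		return 1;
	fscanf(b, "%ld", &cur);
	fscanf(m, "%ld", &max);
	fclose(b);
	fclose(m);
	printf("backlight: %ld / %ld (%.0f%%)\n", cur, max, 100.0 * cur / max);
	return 0;
}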
I have an AMD Lenovo Legion 7 Gen 7 and the experimental_hdr toggle does nothing for me. There's no modeset or perceivable change in colors or any output in the journal.
EDID:
Block 0, Base EDID:
  EDID Structure Version & Revision: 1.4
  Vendor & Product Identification:
    Manufacturer: BOE
    Model: 2715
    Made in: week 28 of 2020
  Basic Display Parameters & Features:
    Digital display
    Bits per primary color channel: 10
    DisplayPort interface
    Maximum image size: 34 cm x 21 cm
    Gamma: 2.20
    Supported color formats: RGB 4:4:4
    First detailed timing includes the native pixel format and preferred refresh rate
    Display is continuous frequency
  Color Characteristics:
    Red  : 0.6416, 0.3349
    Green: 0.2998, 0.6210
    Blue : 0.1513, 0.0605
    White: 0.3134, 0.3291
  Established Timings I & II: none
  Standard Timings: none
  Detailed Timing Descriptors:
    DTD 1: 2560x1600 60.001613 Hz 16:10 103.923 kHz 282.670000 MHz (344 mm x 215 mm)
      Hfront 48 Hsync 32 Hback  80 Hpol P
      Vfront  3 Vsync  6 Vback 123 Vpol N
    Display Range Limits:
      Monitor ranges (Bare Limits): 60-165 Hz V, 286-286 kHz H, max dotclock 780 MHz
    Alphanumeric Data String: 'BOE CQ'
    Alphanumeric Data String: 'NE160QDM-NY1'
  Extension blocks: 2
Checksum: 0xd5
----------------
Block 1, CTA-861 Extension Block:
  Revision: 3
  Native detailed modes: 0
  Colorimetry Data Block:
    BT2020RGB
  HDR Static Metadata Data Block:
    Electro optical transfer functions:
      Traditional gamma - SDR luminance range
      SMPTE ST2084
    Supported static metadata descriptors:
      Static metadata type 1
    Desired content max luminance: 106 (496.743 cd/m^2)
    Desired content max frame-average luminance: 106 (496.743 cd/m^2)
    Desired content min luminance: 36 (0.099 cd/m^2)
Checksum: 0xff
----------------
Block 2, DisplayID Extension Block:
  Version: 1.3
  Display Product Type: Extension Section
  Video Timing Modes Type 1 - Detailed Timings Data Block:
    DTD: 2560x1600 165.003906 Hz 16:10 285.787 kHz 777.340000 MHz (aspect 16:10, no 3D stereo, preferred)
      Hfront 48 Hsync 32 Hback  80 Hpol N
      Vfront  3 Vsync  6 Vback 123 Vpol N
Checksum: 0x90
----------------
EDID conformity: PASS
However, I fail to notice any visual difference (the modeset doesn't help).
I tried toggling HDR on Windows and honestly I couldn't see a visual difference there either, perhaps also due to the modeset, and due to this panel only being DisplayHDR 400, which as I understand it stands for "basically useless for HDR".
So I'm not sure this warrants a drm/amd issue from me until there's a more clear cut visual test case.
If you don't change the pixels too when switching between SDR and HDR, then you should be able to see a difference (HDR mode looks somewhat "bad"). I'm guessing the Linux test case is like this, and either there really is no difference (e.g. because SDR mode was driven wrongly as if it was HDR already, meaning SDR is the one that looks "bad" but you're just accustomed to it), or something is missing.
If you do change the pixels in a perfect way to match the mode, then switching between SDR and HDR may not show any difference - that's the theoretical ideal. So in the Windows test case, you should try showing some actual HDR content, and compare how that looks between SDR and HDR modes.
In the end, the difference between SDR and HDR on a lower-end display might be just whatever the source (the Windows system) does with tone mapping and not about the display at all. Especially if SDR mode was already using the full capabilities of the display instead of being artificially limited.
It's really hard to say until we know more about how the panel is supposed to behave.
The proprietary Intel eDP backlight control has a bunch of TCON controls which sound very suspicious and are not hooked up to anything. Namely INTEL_EDP_HDR_TCON_2084_DECODE_ENABLE, INTEL_EDP_HDR_TCON_2020_GAMUT_ENABLE, INTEL_EDP_HDR_TCON_TONE_MAPPING_ENABLE, INTEL_EDP_HDR_TCON_SEGMENTED_BACKLIGHT_ENABLE, INTEL_EDP_HDR_CONTENT_LUMINANCE and INTEL_EDP_SDR_LUMINANCE_LEVEL.
Would be nice to get more documentation about those controls.
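For reference, toggling those bits is just a read-modify-write of one DPCD control byte over the AUX channel. A rough, schematic sketch follows; it reuses the register and bit names that are local defines in intel_dp_aux_backlight.c and is not the actual driver change.

/* Schematic only (kernel context): flip the Intel TCON HDR bits in the
 * DPCD control byte via the AUX channel. */
#include <drm/display/drm_dp_helper.h>

static int tcon_set_hdr_bits(struct drm_dp_aux *aux, bool enable_pq, bool enable_bt2020)
{
	u8 ctrl;
	int ret;

	ret = drm_dp_dpcd_readb(aux, INTEL_EDP_HDR_GETSET_CTRL_PARAMS, &ctrl);
	if (ret != 1)
		return ret < 0 ? ret : -EIO;

	if (enable_pq)
		ctrl |= INTEL_EDP_HDR_TCON_2084_DECODE_ENABLE;
	else
		ctrl &= ~INTEL_EDP_HDR_TCON_2084_DECODE_ENABLE;

	if (enable_bt2020)
		ctrl |= INTEL_EDP_HDR_TCON_2020_GAMUT_ENABLE;
	else
		ctrl &= ~INTEL_EDP_HDR_TCON_2020_GAMUT_ENABLE;

	ret = drm_dp_dpcd_writeb(aux, INTEL_EDP_HDR_GETSET_CTRL_PARAMS, ctrl);
	return ret == 1 ? 0 : (ret < 0 ? ret : -EIO);
}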
The situation is unfortunate. There's an old slide deck that's really not a proper specification, but might offer some clues. But even that I can't share. :(
If HDR doesn't work on laptop displays, or doesn't work like userspace expects, please at least remove the HDR_OUTPUT_METADATA and Colorspace properties for these cases until that's resolved. Users are being presented with an "Enable HDR" checkbox in Plasma 6, which makes the output incorrect because KWin assumes the properties do what they're supposed to...
For the record, because this hasn't been addressed and the Colorspace property is handled wrong on external monitors too, I hardcoded KWin to not expose support for Colorspace by default on i915 (which also disables the HDR checkbox in the settings).
If you want to do testing for this with KWin, you can get the option back with the KWIN_DRM_ALLOW_INTEL_COLORSPACE=1 env var.
Hi, I'm having the same issue on a Lenovo Slim Pro 9i. I tried with Plasma 6 (thanks to @Zamundaaa for helping me find this issue) and GNOME's implementation and neither worked (washed-out colors on Plasma and no change on GNOME). Not sure if there is anything that can be done here, but I will be happy to try new solutions if needed.
Do INTEL_EDP_HDR_TCON_2084_DECODE_ENABLE and INTEL_EDP_HDR_TCON_2020_GAMUT_ENABLE mean that the sink is allowed to honour infoframe metadata, or do they actually set the display to BT.2020/PQ mode regardless of infoframe metadata?
Does INTEL_EDP_HDR_TCON_SEGMENTED_BACKLIGHT_ENABLE make sense when HDR_OUTPUT_METADATA exists and explicitly indicates SDR?
It indeed works! I haven't tested it rigorously yet, but SDR content looks very similar to how Windows displays it, and black has its pixels turned off, which only happens in HDR mode on my laptop.
Yes @pq, ideally this should be done only if the metadata has HDR enabled, so it would be good to have an extra check for the EOTF as well. Same for segmented_backlight. It seems that in the current test the metadata always has HDR enabled, hence it works, but we should handle the SDR case as well. @surajk8 Can you check this once and update the series?
Do I understand correctly that those "hardware flags" will set the panel to BT.2020/PQ HDR mode, and that HDR_OUTPUT_METADATA is not actually used outside of these conditions in the driver code? I mean, the hardware does not itself use the metadata even if it was sent via an infoframe or such?
In that case, yes, you do need to verify that HDR_OUTPUT_METADATA sets the EOTF to PQ before you program the hardware to PQ, and you need to check that Colorspace sets the colorimetry to BT.2020 before you set the hardware to BT.2020.
You also need to ensure that the EDID delivered to userspace is truthful: if you can do PQ, then EDID must say so. If you cannot do HLG, then EDID must say so.
The kernel driver cannot reject HDR_OUTPUT_METADATA based on its contents, so delivering correct EDID to userspace is important.
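On the kernel side, the parsed EDID information needed for such a cross-check is already available on the connector; a minimal sketch (not part of the patch, assuming the usual DRM core EDID parsing) could look like this:

/* Sketch: check whether the sink's HDR Static Metadata Data Block, as parsed
 * from the EDID by the DRM core, actually advertises the PQ EOTF. */
#include <linux/bits.h>
#include <linux/hdmi.h>
#include <drm/drm_connector.h>

static bool sink_claims_pq(const struct drm_connector *connector)
{
	/* hdmi_type1.eotf is a bitmask of EOTFs advertised in the EDID */
	return connector->hdr_sink_metadata.hdmi_type1.eotf &
	       BIT(HDMI_EOTF_SMPTE_ST2084);
}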
You also need to ensure that the EDID delivered to userspace is truthful: if you can do PQ, then EDID must say so. If you cannot do HLG, then EDID must say so.
This is for internal panels only so I suspect that the EDID was crafted for this specific panel and TCON combination. Definitely something we should check though.
I agree that this has to be hooked up to the right HDR_OUTPUT_METADATA and Colorspace values though.
@wangwillian0 thanks for the confirmation; do tell if you see any issues after more testing.
@pq thanks for the feedback; you are correct, there needs to be a check whether hdr_output_metadata indicates SDR or HDR mode. Shared the patch here for testing; it will for sure require some more refinement before it is ready for merge.
@uma_mnnit Sure, I'll get that done and update the series.
Did a quick test of the patch (on top of drm-tip) using Mutter's experimental switch. It blanks the screen for 1-2 seconds and then the screen comes back but brightness is set to minimum and the brightness controls in Mutter stop working.
Unsetting the HDR mode in Mutter again reverts everything fine, brightness works as before.
I looked at this in a bit more detail and there are numerous issues with the patch. I don't have the hardware at hand so this patch I've come up with (on top of v4) is completely untested but I believe the logic is more accurate than what we have here.
diff --git a/drivers/gpu/drm/i915/display/intel_display_types.h b/drivers/gpu/drm/i915/display/intel_display_types.h
index e67cd5b02e84..090d93119e71 100644
--- a/drivers/gpu/drm/i915/display/intel_display_types.h
+++ b/drivers/gpu/drm/i915/display/intel_display_types.h
@@ -401,6 +401,11 @@ struct intel_panel {
 		} vesa;
 		struct {
 			bool sdr_uses_aux;
+			bool supports_2084_decode;
+			bool supports_2020_gamut;
+			bool supports_segmented_backlight;
+			bool supports_sdp_colorimetry;
+			bool supports_tone_mapping;
 		} intel;
 	} edp;
diff --git a/drivers/gpu/drm/i915/display/intel_dp_aux_backlight.c b/drivers/gpu/drm/i915/display/intel_dp_aux_backlight.c
index 4f58efdc688a..730b7c7c6f90 100644
--- a/drivers/gpu/drm/i915/display/intel_dp_aux_backlight.c
+++ b/drivers/gpu/drm/i915/display/intel_dp_aux_backlight.c
@@ -40,11 +40,6 @@
 #include "intel_dp.h"
 #include "intel_dp_aux_backlight.h"
 
-/* TODO:
- * Implement HDR, right now we just implement the bare minimum to bring us back into SDR mode so we
- * can make people's backlights work in the mean time
- */
-
 /*
  * DP AUX registers for Intel's proprietary HDR backlight interface. We define
  * them here since we'll likely be the only driver to ever use these.
@@ -127,9 +122,6 @@ intel_dp_aux_supports_hdr_backlight(struct intel_connector *connector)
 	if (ret != sizeof(tcon_cap))
 		return false;
 
-	if (!(tcon_cap[1] & INTEL_EDP_HDR_TCON_BRIGHTNESS_NITS_CAP))
-		return false;
-
 	drm_dbg_kms(&i915->drm,
 		    "[CONNECTOR:%d:%s] Detected %s HDR backlight interface version %d\n",
 		    connector->base.base.id, connector->base.name,
 		    is_intel_tcon_cap(tcon_cap) ? "Intel" : "unsupported", tcon_cap[0]);
@@ -137,6 +129,9 @@ intel_dp_aux_supports_hdr_backlight(struct intel_connector *connector)
 	if (!is_intel_tcon_cap(tcon_cap))
 		return false;
 
+	if (!(tcon_cap[1] & INTEL_EDP_HDR_TCON_BRIGHTNESS_NITS_CAP))
+		return false;
+
 	/*
 	 * If we don't have HDR static metadata there is no way to
 	 * runtime detect used range for nits based control. For now
@@ -158,6 +153,18 @@ intel_dp_aux_supports_hdr_backlight(struct intel_connector *connector)
 	panel->backlight.edp.intel.sdr_uses_aux =
 		tcon_cap[2] & INTEL_EDP_SDR_TCON_BRIGHTNESS_AUX_CAP;
 
+	panel->backlight.edp.intel.supports_2084_decode =
+		tcon_cap[1] & INTEL_EDP_HDR_TCON_2084_DECODE_CAP;
+	panel->backlight.edp.intel.supports_2020_gamut =
+		tcon_cap[1] & INTEL_EDP_HDR_TCON_2020_GAMUT_CAP;
+	panel->backlight.edp.intel.supports_segmented_backlight =
+		tcon_cap[1] & INTEL_EDP_HDR_TCON_SEGMENTED_BACKLIGHT_CAP;
+	panel->backlight.edp.intel.supports_sdp_colorimetry =
+		tcon_cap[1] & INTEL_EDP_HDR_TCON_SDP_COLORIMETRY_CAP;
+	panel->backlight.edp.intel.supports_tone_mapping =
+		tcon_cap[1] & INTEL_EDP_HDR_TCON_TONE_MAPPING_CAP;
+
+	/* FIXME the caps should be validated against the EDID */
 	return true;
 }
@@ -215,13 +222,30 @@ intel_dp_aux_hdr_set_aux_backlight(const struct drm_connector_state *conn_state,
 			connector->base.base.id, connector->base.name);
 }
 
+static bool
+intel_dp_aux_is_supported_hdr_mode (const struct drm_connector_state *conn_state)
+{
+	struct intel_connector *connector = to_intel_connector(conn_state->connector);
+	struct intel_panel *panel = &connector->panel;
+	struct hdr_output_metadata *hdr_metadata;
+
+	if (!conn_state->hdr_output_metadata)
+		return false;
+
+	hdr_metadata = conn_state->hdr_output_metadata->data;
+
+	return hdr_metadata->hdmi_metadata_type1.eotf == HDMI_EOTF_SMPTE_ST2084 &&
+		panel->backlight.edp.intel.supports_2084_decode;
+}
+
 static void
 intel_dp_aux_hdr_set_backlight(const struct drm_connector_state *conn_state, u32 level)
 {
 	struct intel_connector *connector = to_intel_connector(conn_state->connector);
 	struct intel_panel *panel = &connector->panel;
 
-	if (panel->backlight.edp.intel.sdr_uses_aux) {
+	if (intel_dp_aux_is_supported_hdr_mode (conn_state) ||
+	    panel->backlight.edp.intel.sdr_uses_aux) {
 		intel_dp_aux_hdr_set_aux_backlight(conn_state, level);
 	} else {
 		const u32 pwm_level = intel_backlight_level_to_pwm(connector, level);
@@ -251,8 +275,10 @@ intel_dp_aux_hdr_enable_backlight(const struct intel_crtc_state *crtc_state,
 	}
 
 	ctrl = old_ctrl;
-	if (panel->backlight.edp.intel.sdr_uses_aux) {
+	if (intel_dp_aux_is_supported_hdr_mode (conn_state) ||
+	    panel->backlight.edp.intel.sdr_uses_aux) {
 		ctrl |= INTEL_EDP_HDR_TCON_BRIGHTNESS_AUX_ENABLE;
+		intel_dp_aux_hdr_set_aux_backlight(conn_state, level);
 	} else {
 		u32 pwm_level = intel_backlight_level_to_pwm(connector, level);
@@ -262,6 +288,64 @@ intel_dp_aux_hdr_enable_backlight(const struct intel_crtc_state *crtc_state,
 		ctrl &= ~INTEL_EDP_HDR_TCON_BRIGHTNESS_AUX_ENABLE;
 	}
 
+	/* FIXME: All of the HDR and Colorimetry below doesn't really belong to
+	 * the backlight! It's not even clear to me that this gets called
+	 * whenever the connector state changes. */
+	if (!panel->backlight.edp.intel.supports_sdp_colorimetry &&
+	    intel_dp_aux_is_supported_hdr_mode (conn_state)) {
+		struct hdr_output_metadata *hdr_metadata =
+			conn_state->hdr_output_metadata->data;
+
+		if (hdr_metadata->hdmi_metadata_type1.eotf ==
+		    HDMI_EOTF_SMPTE_ST2084) {
+			ctrl |= INTEL_EDP_HDR_TCON_2084_DECODE_ENABLE;
+		} else {
+			drm_dbg_kms(&i915->drm, "[CONNECTOR:%d:%s] TCON: Cannot decode requested EOTF\n",
+				    connector->base.base.id, connector->base.name);
+		}
+
+		/* FIXME: Enabling segmented backlight should be
+		 * controlled via a new KMS prop */
+		if (panel->backlight.edp.intel.supports_segmented_backlight)
+			ctrl |= INTEL_EDP_HDR_TCON_SEGMENTED_BACKLIGHT_ENABLE;
+
+		/* FIXME: Why guarded by DISPLAY_VER(i915) < 11 ? */
+		if (panel->backlight.edp.intel.supports_tone_mapping) {
+			u8 buf[4];
+
+			buf[0] = hdr_metadata->hdmi_metadata_type1.max_cll & 0xFF;
+			buf[1] = (hdr_metadata->hdmi_metadata_type1.max_cll & 0xFF00) >> 8;
+			buf[2] = hdr_metadata->hdmi_metadata_type1.max_fall & 0xFF;
+			buf[3] = (hdr_metadata->hdmi_metadata_type1.max_fall & 0xFF00) >> 8;
+
+			ret = drm_dp_dpcd_write(&intel_dp->aux,
+						INTEL_EDP_HDR_CONTENT_LUMINANCE,
+						buf, sizeof(buf));
+			if (ret < 0) {
+				drm_dbg_kms(&i915->drm,
+					    "Content Luminance DPCD reg write failed, err:-%d\n",
+					    ret);
+			}
+
+			ctrl |= INTEL_EDP_HDR_TCON_TONE_MAPPING_ENABLE;
+		}
+
+		ctrl &= ~INTEL_EDP_HDR_TCON_SDP_COLORIMETRY_ENABLE;
+	} else if (intel_dp_aux_is_supported_hdr_mode (conn_state)) {
+		/* FIXME: does the Intel driver already send SDP for Colorimetry and HDR
+		 * Metadata? */
+		ctrl |= INTEL_EDP_HDR_TCON_SDP_COLORIMETRY_ENABLE;
+	}
+
+	/* FIXME: should we enable sRGB colorimetry in all the other cases? */
+	if (!panel->backlight.edp.intel.supports_sdp_colorimetry &&
+	    panel->backlight.edp.intel.supports_2020_gamut &&
+	    (conn_state->colorspace == DRM_MODE_COLORIMETRY_BT2020_RGB ||
+	     conn_state->colorspace == DRM_MODE_COLORIMETRY_BT2020_YCC ||
+	     conn_state->colorspace == DRM_MODE_COLORIMETRY_BT2020_CYCC)) {
+		ctrl |= INTEL_EDP_HDR_TCON_2020_GAMUT_ENABLE;
+	}
+
 	if (ctrl != old_ctrl &&
 	    drm_dp_dpcd_writeb(&intel_dp->aux, INTEL_EDP_HDR_GETSET_CTRL_PARAMS, ctrl) != 1)
 		drm_err(&i915->drm, "[CONNECTOR:%d:%s] Failed to configure DPCD brightness controls\n",
@@ -322,7 +406,6 @@ intel_dp_aux_hdr_setup_backlight(struct intel_connector *connector, enum pipe pi
 		    connector->base.base.id, connector->base.name,
 		    panel->backlight.min, panel->backlight.max);
 
-	panel->backlight.level = intel_dp_aux_hdr_get_backlight(connector, pipe);
 	panel->backlight.enabled = panel->backlight.level != 0;
The problems it addresses:
Use the SDP path if available
Enable BT2020 gamut based on the Colorspace prop
Enable gamut mapping and HDR metadata based on the HDR_OUTPUT_METADATA prop
Only enable HDR at all when the requested TF is PQ/ST2084
There are a number of FIXMEs in the code which need further investigation.
@surajk8 I can control the backlight perfectly in normal/HDR mode with the rev3 in Plasma 6.
@swick I just tried rev4 and now enabling HDR turns off the display completely; attempts to change the brightness during the black screen don't work. I attached dmesg logs with level 10e (I hope I did it correctly) of me enabling HDR, the display going completely black, and HDR mode being disabled after 15 s (because KDE requires confirmation after display configuration changes, otherwise it reverts to the previous config automatically).
I verified with @company that enabling HDR in mutter while using INTEL_EDP_HDR_TCON_SDP_COLORIMETRY_ENABLE (i.e. using the SDP packets for Colorimetry and HDR metadata) does also work.