radeon: allow the user to set a maximum HDMI pixel clock (in MHz) by a kernel parameter
As far as I have seen, radeon only sets the pixel clock for the HDMI/DVI output to the value recommended by the firmware. Since kernel 4.5.0 the nouveau driver has had a kernel parameter called hdmimhz to override this limit (in MHz) by hand; as far as I could test it, the results I get that way are just wonderful (see Bug 93405).
The situation with my '4K ready' XFX Radeon R5 230 card is similar to what it was with the nouveau driver before kernel 4.5.0: the promised 3840x2160@30 mode is not offered by default, and when I try to set it manually the screen remains black (no signal). It was exactly like this with my Nvidia GeForce 9600M GT until the hdmimhz kernel parameter appeared; since then nouveau works even better than officially specified by Nvidia (nobody would have believed that).
Why isn't there a similar parameter for radeon?
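Just to illustrate what I am asking for: such a knob is an ordinary module parameter. A minimal sketch of how one might be declared (illustrative names, not the actual nouveau code):

  /* Minimal sketch of an hdmimhz-style knob (illustrative, not the
   * actual nouveau code); it would be passed on the kernel command
   * line as <driver>.hdmimhz=<MHz>. */
  #include <linux/module.h>
  #include <linux/moduleparam.h>

  static int hdmimhz; /* 0 = keep the firmware-reported TMDS limit */
  module_param(hdmimhz, int, 0400);
  MODULE_PARM_DESC(hdmimhz, "Maximum HDMI pixel clock to allow (in MHz)");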
Please attach your xorg log and dmesg output. The driver will properly filter modes in conjunction with the hw limits of the PLL. Overriding this is not recommended. If the clock is within the hw limits and it's not working, it's just a bug that needs to be fixed.
Only with DP? How can that be? I have just had a look at various Radeon R5 230 cards on geizhals.at, and all manufacturers (MSI, Asus, XFX) advertise at least 625 MHz for the clock and just VGA, DVI and HDMI as outputs. I have never heard of a DisplayPort connector on that type of card.
The chip certainly would support DP, but unfortunately graphics card manufacturers usually do not include it on low-end cards, so that's why (even the next cards up typically do not have it; on the nvidia side it's even worse, fwiw).
And unfortunately your card is listed as HDMI 1.4, but (as is very often the case, since this is allowed per the HDMI spec) it is one of those old pseudo-HDMI 1.4 implementations which support some HDMI 1.4 features almost no one ever cares about, but not the higher link rates introduced as early as HDMI 1.3, so the 2560x1600@60Hz and 3840x2160@30Hz modes are not possible.
> Only with DP? How can that be? I have just had a look at various Radeon R5
> 230 cards on geizhals.at, and all manufacturers (MSI, Asus, XFX) advertise
> at least 625 MHz for the clock and just VGA, DVI and HDMI as outputs. I
> have never heard of a DisplayPort connector on that type of card.
The hw supports DP; it's up to the AIB vendors to decide what configurations they want to produce. The 625 MHz is the 3D engine clock; it has nothing to do with the displays.
Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that this card could yield 3840x2160, be it at 23Hz or 46Hz interlaced. I am going to provide the logs tomorrow, when the computer with the XFX radeon card is free for testing. I just want to say that I still hope for a radeon tuning parameter similar to hdmimhz. The fact that the card was sold as '4K ready' over HDMI should be a strong indication that 3840x2160@30/24/23 is possible. If I remember correctly, 3840x2160@30 was initially stated to be officially supported by ATI for the XFX card (though that claim has been withdrawn now). I would even take the risk of testing it if the card should not work like this for some reason (an old HDMI 1.4 incompatibility or the like).
The card has a dual-link DVI port - it should, in theory, be possible to support 3840x2160@30Hz through that (as dual-link DVI has twice the bandwidth of that HDMI port). For that to work you of course need a monitor which has a dual-link DVI port (obviously, passive DVI->HDMI adapters won't help).
That is however from a theoretical point of view (that is, such a resolution should not exceed bandwidth limits) - YMMV.
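For the record, the arithmetic behind that (assuming the standard CEA-861 timing of 4400x2250 total pixels for 3840x2160@30, and the usual 165 MHz single-link TMDS ceiling, roughly doubled for dual link) - a trivial check:

  /* Back-of-the-envelope dotclock check; the totals are the standard
   * CEA-861 figures for 3840x2160@30, the limits the usual TMDS
   * ceilings (165 MHz single link, roughly double for dual link). */
  #include <stdio.h>

  int main(void)
  {
          const long htotal = 4400, vtotal = 2250, refresh = 30;
          const long need_khz = htotal * vtotal * refresh / 1000;

          printf("3840x2160@30 needs %ld kHz; single link tops out at "
                 "165000 kHz, dual link at ~330000 kHz\n", need_khz);
          return 0;
  }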
> Likewise, for the G96M [GeForce 9600M GT] nobody would have believed that
> this card could yield 3840x2160, be it at 23Hz or 46Hz interlaced. I am
> going to provide the logs tomorrow, when the computer with the XFX radeon
> card is free for testing. I just want to say that I still hope for a radeon
> tuning parameter similar to hdmimhz.
That hw was not designed to support 4k over hdmi. If you want to hack the driver, you are welcome to (take a look at radeon_dvi_mode_valid()), but it's not something I want to enable or support out of the box. If you break your card or monitor, you get to keep the pieces.
Hmm; the U2868PQU is not said to have a dual-link DVI port. However, I have a DeLock DVI-HDMI adapter here and have already tested it with the DVI output of a Fujitsu Celsius Mobile H270 equipped with an Nvidia G96GLM [Quadro FX 770M]. It did deliver 3840x2160@30 even under Windows 7 without a problem. So if that works beyond the specs, the adapter must be a pretty intelligent one.
Roland: the nouveau developers have told me that hdmimhz=297 would be sufficient for 3840x2160@30; there should be a report online about it. Maybe you err because dual-link DVI should be equivalent to hdmimhz=330. What's more, hdmimhz=225 is sufficient for 3840x2160@23, as far as I have tested it.
Ahh the joys of TMDS overclocking.
Yeah, I suppose overclocking to ~225 MHz (which should be sufficient for 24Hz with the right modeline) or so has some chance of working (it seems extremely unlikely it would damage something, although running stably is an entirely different question; this is 40% higher than the official limit, after all).
In any case, AMD doesn't want to support it, so you'd have to hack the driver yourself.
Dear radeon developers,
Today I tried to give the radeon kernel module a hack: by changing radeon_dvi_mode_valid to return MODE_OK whenever the clock is below what I state in a new kernel parameter called radeon.hdmimhz, it was possible to make new graphics modes appear when hdmimhz is set high enough. Beyond that, the new parameter has no effect; i.e. my monitor still stays black on the higher modes, while it boots with any hdmimhz setting. This is contrary to the behaviour of nouveau.hdmimhz, where the screen stays black from boot when the setting is wrong.
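Roughly, my local hack looks like this (a sketch, not an upstream patch; radeon.hdmimhz is my own parameter name, and the stock checks that normally follow are elided):

  /* Sketch of my local hack in radeon_connectors.c (kernel 4.x era
   * signatures); radeon.hdmimhz is my own, non-upstream parameter.
   * mode->clock is in kHz, the parameter in MHz. */
  static int radeon_hdmimhz; /* 0 = stock behaviour */
  module_param_named(hdmimhz, radeon_hdmimhz, int, 0400);
  MODULE_PARM_DESC(hdmimhz, "Maximum HDMI pixel clock to accept (MHz)");

  static int radeon_dvi_mode_valid(struct drm_connector *connector,
                                   struct drm_display_mode *mode)
  {
          if (radeon_hdmimhz && mode->clock <= radeon_hdmimhz * 1000)
                  return MODE_OK;  /* accept anything under the override */

          /* ... stock clock filtering, elided in this sketch ... */
          return MODE_CLOCK_HIGH;
  }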
Next I tried to override max_tmds_clock and mode_clock in radeon_get_monitor_bpc, without success. Finally, I tried to extend radeon_best_single_encoder to look through all available encoders and pick the one with the highest pixel_clock. Unfortunately there was only one encoder to choose from.
When I look at the output of journalctl after enabling drm.debug=1, the driver seems to honor the settings at first, but then always drops down to a dotclock of 165MHz or something very similar (165 MHz is exactly the single-link TMDS limit, so something still appears to clamp the link to single-link speed). I have no idea what to do; further guessing is unlikely to succeed either.
Does anyone here have an idea what should or could be done, or how to resolve this issue? It needs to be possible somehow to actually set the TMDS timing for the card, and not just to simulate a higher TMDS limit for graphics mode detection. Thanks.
While the patch in radeon_dvi_mode_valid seems to work (new modes are offered at a high enough radeon.hdmimhz), the patch around radeon_get_monitor_bpc/radeon.hdmikhz seems insufficient to keep the TMDS frequency high enough.
... and here comes a corresponding journal.log. If you grep the output of drm.debug=1, you see that drm_calc_timestamping_constants seems to reduce the frequency to 164250 kHz, which is not sufficient for 3840x2160. In spite of debugging and backtracing both function calls (drm_calc_timestamping_constants and radeon_get_monitor_bpc), I have not found out by which function and in what way they are called. Please help.
Attachment 121572, "journal.log for patch001 radeon.hdmikhz=297000 radeon.hdmimhz=300": journal.log
I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB graphics card. It is clearly marketed as 4K ready there, although the card only has a DVI, an HDMI and a VGA port. Any claim that 4K would only be possible over a non-existent DisplayPort thus needs to be clearly stated as wrong.
Once again I want to ask whether anyone here would be ready to help me with a patch for the driver, or whether anyone could attempt such a patch, including a radeon.hdmimhz parameter, without me. Why should it not work? It has been proven in practice with the nouveau driver.
> I have now had a look at the manual of my XFX Radeon R5 230 Core 2GB
> graphics card. It is clearly marketed as 4K ready there, although the card
> only has a DVI, an HDMI and a VGA port. Any claim that 4K would only be
> possible over a non-existent DisplayPort thus needs to be clearly stated
> as wrong.
As Roland mentioned, you could run 4k over dual link DVI.
> Once again I want to ask whether anyone here would be ready to help me
> with a patch for the driver, or whether anyone could attempt such a patch,
> including a radeon.hdmimhz parameter, without me. Why should it not work?
> It has been proven in practice with the nouveau driver.
As I said before, the hw was not designed to support it. You'll need to hack radeon_dvi_mode_valid() to return MODE_OK and radeon_dig_monitor_is_duallink() to return false.
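In rough outline, the hand-hack would look something like this (a sketch against the functions in drivers/gpu/drm/radeon/, signatures as in that era's tree; untested and unsupported):

  /* radeon_connectors.c: stop rejecting the out-of-spec dotclock.
   * (Hack: this skips the stock TMDS clock filtering entirely.) */
  static int radeon_dvi_mode_valid(struct drm_connector *connector,
                                   struct drm_display_mode *mode)
  {
          return MODE_OK;
  }

  /* radeon_encoders.c: HDMI is always single link, so report that,
   * making the driver set up the link as single rather than dual. */
  bool radeon_dig_monitor_is_duallink(struct drm_encoder *encoder,
                                      u32 pixel_clock)
  {
          return false;
  }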
Unfortunately, making radeon_dig_monitor_is_duallink return true under all or certain circumstances produces distortions of otherwise working modes, while it cannot enable a single mode that would not have worked before. Besides this, I strongly doubt that returning true for dual-link would be necessary at all for an hdmimhz of 225 and modes like 3840x2160_23.00. Is there anything else I could try? Or has neXus mobile betrayed me by claiming a card 4K ready which definitely isn't?
> Unfortunately, making radeon_dig_monitor_is_duallink return true under all
> or certain circumstances produces distortions of otherwise working modes,
> while it cannot enable a single mode that would not have worked before.
> Besides this, I strongly doubt that returning true for dual-link would be
> necessary at all for an hdmimhz of 225 and modes like 3840x2160_23.00. Is
> there anything else I could try? Or has neXus mobile betrayed me by
> claiming a card 4K ready which definitely isn't?
Read my comment again. You need to return FALSE in radeon_dig_monitor_is_duallink(). HDMI is always single link. You need to return false so the driver attempts to set up the link as single rather than dual.