Power consumption for HW-accelerated video decoding on Radeon iGPUs is simply outrageous
I'm using the Ryzen 7 7840HS CPU along with fully updated Fedora 39, running Linux 6.7.5 with mesa-va-drivers-freeworld, i.e. with full HW acceleration for H.264/H.265.
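As a sanity check that decode support is really there, vainfo (from the libva-utils package) lists the profiles the Mesa driver exposes; the exact output will vary by driver and hardware:

# Confirm the VA-API driver advertises the relevant decode profiles
vainfo | grep -iE 'h264|hevc|vp9|av1'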
Here's a simple use case I don't quite understand.
Open any 1080p 60fps H.264 stream on Twitch, e.g. https://player.twitch.tv/?autoplay=true&channel=asot&parent=twitch.com (that's a Twitch embed, i.e. just the video stream with little to no additional JavaScript), and observe the following in Mozilla Firefox 123.0:
- The CPU is running around 4700MHz (amd-pstate-epp driver running by default)
- Power consumption is around 13W
These are truly insane numbers for a 5nm CPU.
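For anyone who wants to reproduce the numbers, something along these lines is enough; turbostat ships in the kernel-tools package on Fedora, and PkgWatt comes from RAPL counters, so the exact columns may differ between versions:

# Confirm which frequency scaling driver is active
cat /sys/devices/system/cpu/cpufreq/policy0/scaling_driver
# Sample average core clocks and package power every 5 seconds
sudo turbostat --Summary --quiet --interval 5 --show Avg_MHz,Busy%,PkgWatt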
Here's a comparison for the same video on the eight-year-old Intel Core i5-6200U CPU, built on the very first iteration of Intel's 14nm node:
- Average CPU frequency is 700MHz
- Average power consumption is 3.5W
Why is the new, super-advanced AMD part so terribly inefficient? That doesn't seem right.
I would expect it to consume at most 2-3W given all the advances in manufacturing, not four times more.
Addendum 1:
This
echo power | tee /sys/devices/system/cpu/cpufreq/*/energy_performance_preference
makes the CPU run at around 2300MHz and cuts power consumption roughly in half (8W), but that's still hugely inefficient. Unfortunately this workaround is no good for 4K 60fps VP9/AV1 videos (e.g. on YouTube), which start to occasionally stutter and skip frames.
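For completeness, here's how to inspect the accepted values and revert the change; these are the standard cpufreq sysfs paths exposed by amd-pstate-epp, though the preference names may vary by kernel version:

# List the EPP values the driver accepts on this kernel
cat /sys/devices/system/cpu/cpufreq/policy0/energy_performance_available_preferences
# Revert to the driver's default preference
echo default | tee /sys/devices/system/cpu/cpufreq/*/energy_performance_preference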
Addendum 2:
Firefox needs media.ffmpeg.vaapi.enabled set to true in about:config in order to use HW video acceleration.
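If you prefer not to flip it by hand, the same pref can be set from the profile's user.js, and MOZ_LOG can confirm the VA-API decoder is actually being selected (PlatformDecoderModule is the usual log module for this; the profile path below is a placeholder):

# Persist the pref in the Firefox profile (replace <profile> with your profile directory)
echo 'user_pref("media.ffmpeg.vaapi.enabled", true);' >> ~/.mozilla/firefox/<profile>/user.js
# Launch with media decoder logging and look for VA-API mentions
MOZ_LOG="PlatformDecoderModule:5" firefox 2>&1 | grep -i vaapi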
Addendum 3:
Under Windows 10 LTSC 2021 (21H2) with the VP9 Video Extensions installed and active:
For a 1440p 60fps VP9 stream on YouTube under Firefox 123:
Best performance:
- 11W
- 3.9GHz
Default power scheme:
- 8.5W
- 2.8GHz
Better battery life:
- 8W
- 1.8GHz
8.5W on the default scheme beats the ~13W I see under Linux, but it's still far from optimal.