Background story: I recently bought a computer with an AMD 7000-series CPU and GPU.
amdgpu_top reports 15~20 W in normal desktop usage, but as soon as a video is playing in VLC it jumps to a constant 45 W, which is undesirable behavior, especially in summer. (I hope it's just a reporting issue… but my computer runs hot.)
When I run
DRI_PRIME=1 vlc
and then play videos, amdgpu_top doesn't report the power surge. (I have the iGPU enabled.)
Is there anything more convenient than modifying individual .desktop files? KDE malfunctions when I put export DRI_PRIME=1
in .xprofile, so that's a no-go.
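For what it's worth, the .desktop route can be scripted so it's less tedious. A minimal sketch of a per-app override: the system path and the fallback entry below are assumptions (adjust for your distro), but the pattern — shadowing the system .desktop with a copy in ~/.local/share/applications whose Exec= is prefixed with `env DRI_PRIME=1` — is standard.

```shell
# Copy VLC's desktop entry into the per-user applications dir, which
# shadows the system copy, then prefix its Exec= lines with DRI_PRIME=1.
# /usr/share/applications/vlc.desktop is an assumed path; the heredoc
# fallback is only a stand-in so the example runs anywhere.
dest="${XDG_DATA_HOME:-$HOME/.local/share}/applications"
mkdir -p "$dest"
cp /usr/share/applications/vlc.desktop "$dest/" 2>/dev/null ||
cat > "$dest/vlc.desktop" <<'EOF'
[Desktop Entry]
Type=Application
Name=VLC media player
Exec=/usr/bin/vlc --started-from-file %U
EOF
# `env VAR=val cmd` is valid in a desktop entry's Exec key.
sed -i 's|^Exec=|Exec=env DRI_PRIME=1 |' "$dest/vlc.desktop"
grep '^Exec=' "$dest/vlc.desktop"
```

Only VLC gets DRI_PRIME=1 this way, so KDE itself is unaffected.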
Solved: removing the Mesa hardware-acceleration package makes VLC fall back to libplacebo, which doesn't do these weird things.
I am assuming you have the monitor connected directly to the 7800 XT, which is why it is the default GPU.
Is decoding actually happening on the GPU while you watch the video? amdgpu_top shows whether an application (VLC in this case) is using the decode hardware (the column named DEC).
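A couple of read-only diagnostics for this, assuming libva-utils and amdgpu_top are installed (exact flags may vary by version):

```shell
# Confirm which VA-API driver and profiles are available (vainfo ships
# with libva-utils); the fallback echo just keeps the sketch runnable
# on machines without a GPU or without vainfo.
vainfo 2>/dev/null | grep -i -e driver -e VAProfile || echo "vainfo unavailable here"

# While the video plays, watch decode usage (commented out because
# these need a running amdgpu device):
# amdgpu_top       # interactive TUI; check the DEC column per process
# amdgpu_top -d    # one-shot dump, handy for logging
```

If DEC/VCN stays at 0% while vainfo reports a working radeonsi driver, VLC is likely decoding in software instead.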
Also, using the iGPU for video decoding should be more efficient: the massive number of cores in the dGPU isn't needed for decoding, yet they are kept active simply because the dGPU is in use.
The problem has been solved (it was caused by Mesa's video decoding package), but I'll answer anyway.
Yes, the VCN (Video Core Next) column stays at a constant value while playing video (3% for VA-API with Mesa, 5% for VDPAU with Mesa, 0% for libplacebo), and GFX fluctuates between 0% and 1%.
Just playing a 1080p video (not even a high-bitrate one) is enough to make the GPU fan spin up. Disappointing.
Hmm, must be some bug in Mesa or the way it interacts with VLC. I use VA-API with Mesa for decoding on a laptop with a Vega iGPU and an RDNA1 dGPU, and I don't see high energy usage; in fact, I get much better battery life with VA-API hardware decoding.