Thanks for the detailed report. I can confirm this issue on my desktop PC with an Intel iGPU and an AMD dGPU when using KMS capture. The adapter setting is ignored, which is unexpected, since the documentation states: "Linux + VA-API: Unlike with amdvce and nvenc, it doesn't matter if video encoding is done on a different GPU."
There is, however, a PR to implement cross-encoding on a different GPU efficiently: #2053.
If I switch to x11grab as the capture method, cross-encoding on the iGPU (while rendering on the dGPU) works, but those code paths are probably quite inefficient.
As for seeing which card libva actually uses, I submitted a minor PR (#2502) to get slightly more informative output at the info level.
Is there an existing issue for this?
Is your issue described in the documentation?
Is your issue present in the nightly release?
Describe the Bug
Please do not dismiss this issue as a misconfiguration or similar; I have spent countless hours debugging and researching it.
As of the current nightly (0.23.1.dirty), the config option for VA-API card selection is ignored: even with a card specified, Sunshine always uses the card that vainfo reports when run with no switches.
It appears the code in video.cpp does not select the card pulled from the config.
This means that on multi-GPU systems, no card other than the one libva selects as its default can be used.
This bug cannot be worked around with environment variables (DRI_PRIME) or suggested wrappers such as switcherooctl, because libva implements no mechanism for choosing a card other than the one passed to vaGetDisplay().
It seems the card set in the config is not picked up and passed to that function, so the wrong card ends up being selected as the encoder.
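To illustrate where the selection has to happen: the only device-level control libva offers is which DRM file descriptor the display is created from. A minimal sketch (assuming the libva and libva-drm development headers are available; the device path is just an example from my system, not something Sunshine exposes this way):

```cpp
// Sketch: pin libva to a specific render node by opening it explicitly
// and handing the fd to vaGetDisplayDRM(). This is the only selection
// mechanism libva provides; no env var changes which node it opens.
#include <fcntl.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>
#include <cstdio>

int main() {
  const char *node = "/dev/dri/renderD128";  // example: the RX6800M on my system
  int fd = open(node, O_RDWR);
  if (fd < 0) { perror("open"); return 1; }

  // The display is bound to this fd; whatever GPU it refers to is the encoder.
  VADisplay dpy = vaGetDisplayDRM(fd);
  int major = 0, minor = 0;
  if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
    std::fprintf(stderr, "vaInitialize failed on %s\n", node);
    close(fd);
    return 1;
  }
  std::printf("libva %d.%d on %s: %s\n", major, minor, node,
              vaQueryVendorString(dpy));
  vaTerminate(dpy);
  close(fd);
  return 0;
}
```

If the fd passed here comes from libva's default lookup instead of the configured path, you get exactly the behavior described above.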
Expected Behavior
The card specified in the config should be the one used for libva encoding.
Additional Context
Here are the relevant discord threads:
https://discord.com/channels/804382334370578482/1211017600344137789
https://discord.com/channels/804382334370578482/1233274689846247537
Related libva thread:
intel/libva#221
The log below was captured with min_log_level=verbose.
Note that I am certain the card I selected in the config is the RX6800M, which on my system is renderD128; I double-checked this against the symlinks in /dev/dri/by-path. When that card is selected explicitly with
vainfo --display drm --device /dev/dri/renderD128
it is reported as navi22. Sunshine instead tries to use renoir, which is renderD129 and the card vainfo picks when no options are specified. Every time the card capabilities are read, they refer to the wrong card.
I have also looked through the code myself to try to debug this.
Host Operating System
Linux
Operating System Version
Arch Linux, 6.8.9-arch1-2
Architecture
64 bit
Sunshine commit or version
g26e0ff8
Package
Linux - AUR (Third Party)
GPU Type
AMD
GPU Model
AMD Radeon RX6800M, AMD Radeon Renoir
GPU Driver/Mesa Version
24.0.6
Capture Method (Linux Only)
KMS
Config
Apps
No response
Relevant log output