I'm using two cards in my system: RTX 4080 (connected to the main monitor) + Radeon W600 (6 outputs).
In Advanced Output, when I add outputs from the RTX 4080, everything is fine and the FPS stays high.
But if I add a screen using even a single output from the W600, the FPS drops a lot and the W600's GPU usage jumps to 99%.
Isn't processing supposed to stay on the RTX 4080?
Why does the FPS drop so much when I use a W600 output if it is not connected to the main monitor?
Thanks in advance for your help
RTX 4080 + Radeon W600
Re: RTX 4080 + Radeon W600
The processing stays on the renderer GPU, but that means the texture it rendered has to be transferred to the GPU with the output. That will come with a performance hit. So the best performance would come from a single GPU and external display splitters like Datapath FX4s.
A lot depends on the drivers, and mixing two different GPU manufacturers is also probably not the safest way to go about it.
Does the CPU have enough PCI-e lanes to handle the GPUs?
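To get a feel for why those lanes matter, here is a back-of-the-envelope estimate of the PCIe bandwidth needed to ship rendered frames from one GPU to the other. This is a sketch under assumptions not stated in the thread: 1920x1080 outputs, 8-bit RGBA textures, 60 fps, and all six W600 outputs fed this way.

```python
# Rough estimate of the cross-GPU copy bandwidth per second.
# Assumptions (not from the thread): 1080p outputs, RGBA8 textures, 60 fps.
width, height = 1920, 1080
bytes_per_pixel = 4          # RGBA, 8 bits per channel
fps = 60
outputs = 6                  # all six W600 outputs

per_output = width * height * bytes_per_pixel * fps   # bytes/s for one output
total = per_output * outputs                          # bytes/s for all six

print(f"one output:  {per_output / 1e9:.2f} GB/s")    # ~0.50 GB/s
print(f"six outputs: {total / 1e9:.2f} GB/s")         # ~2.99 GB/s
```

Roughly 3 GB/s for six 1080p60 outputs is already close to what a PCIe 3.0 x4 slot can sustain in practice, so if the W600 sits in a lane-starved slot, the frame copies alone can cap the FPS before either GPU runs out of compute.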
Software developer, Sound Engineer,
Control Your show with ”Enter” - multiple Resolume servers at once - SMPTE/MTC column launch
try for free: http://programs.palffyzoltan.hu