I recently upgraded my aging hardware, an i5 4460 + GTX 1060, to an R5 5600 + RX 9060 XT 16GB, and I ran into a problem: my old monitor only accepts 144 Hz over dual-link DVI, and a DP-to-dual-link-DVI adapter costs as much as a new monitor...
So, the GTX 1060 comes out of retirement to act as a DVI adaptor...
But I also stumbled upon the dual-GPU setup in Lossless Scaling.
So I set the physics rate to 60 Hz, capped the framerate at 30 fps, and raised the physics cap to 2, settings that even an entry-level card can handle.
In Lossless Scaling, I set the 9060 XT as the render GPU and let the 1060 handle 4x frame generation. Boom, 120 fps!
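For anyone curious how the numbers work out, here's a quick sketch of the frame-rate math. The function name is just mine for illustration; this doesn't call the actual app, it only shows why a 30 fps render cap plus 4x frame generation lands on 120 fps.

```python
# Sketch of the dual-GPU frame-gen arithmetic, not actual Lossless Scaling code.
def output_fps(base_fps: int, fg_multiplier: int) -> int:
    """Display rate after frame generation: base render rate times the multiplier."""
    return base_fps * fg_multiplier

base = 30        # render cap on the RX 9060 XT
multiplier = 4   # 4x frame generation handled by the GTX 1060
print(output_fps(base, multiplier))  # 120
```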
And the best part is that both cards combined draw only about 90 W, less than a single 1060 running at full speed.
I mean, in desktop mode it works great; there's some ghosting in the menus, but the in-game animation looks clean.