supersampling vs high native resolution
I’m thinking of building a new gaming PC. I already have an old 1440x900 monitor, but I prefer 1080p resolution. My question is: should I just stick with my current monitor and supersample up to 1080p (set the resolution to 720p and use a 150% render scale)?

Does supersampling use more resources than actually running at the resolution I’m “simulating”, or is the resource usage the same? Or could the resource usage somehow be lower?

(Basically, is there a difference in resource usage between actual 1080p and 720p at 150% render scale?)
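For reference, render scale typically multiplies each axis of the output resolution, so the arithmetic works out to an identical internal pixel count (a quick sketch; exact scaling behavior can vary by game):

```python
def render_resolution(width, height, scale_percent):
    """Internal render resolution for a given render scale (per-axis multiplier)."""
    factor = scale_percent / 100
    return round(width * factor), round(height * factor)

base = (1280, 720)                        # 720p output resolution
scaled = render_resolution(*base, 150)    # 150% render scale

print(scaled)                             # (1920, 1080) -- same as native 1080p
print(scaled[0] * scaled[1])              # 2,073,600 pixels shaded either way
```

Since the GPU shades the same number of pixels in both cases, the performance cost is essentially identical; the difference is only in how the result reaches the screen.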
It's the same performance impact, but you may end up with a worse picture depending on how it looks on your particular monitor. And using AMD VSR or NVIDIA DSR for the OS desktop in everyday use is not ideal at all.

I would just get a 1080p monitor that is at least 144 Hz and has DisplayPort. Dump monitors that have only VGA and/or DVI on them.

At least if the display has HDMI only, and you run out of HDMI ports on your GPU, you can always use an active DisplayPort adapter, as most modern GPUs tend to have 1x HDMI and 3x DP on them.

Date posted: 25 Jun 2020, 11:29
Posts: 1