Lossless Scaling

Kitty Skin 30 Aug 2021 @ 3:32pm
Feature Suggestion: Use FSR as antialiasing
As the title says, is it possible to let us use FSR as anti-aliasing? Let's say the game runs at native screen resolution, we use FSR factor 2, and the resulting output is resized to fit the screen.
While it won't of course be as good as native AA, some games like NFS Rivals lack AA entirely or have terrible AA implementations (e.g. only SSAA), so even if FSR used as anti-aliasing isn't the best, it will be way better than nothing.
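
To make the suggestion concrete, here is a minimal sketch of that chain in Python, assuming a frame saved to disk. Pillow's Lanczos resampling is only a stand-in for the real FSR passes (EASU edge reconstruction plus RCAS sharpening), and the file name, factor, and resolutions are placeholders.

```python
from PIL import Image

FSR_FACTOR = 2  # "FSR factor 2" from the suggestion above

# "native_frame.png" is a placeholder for a frame captured at native resolution.
frame = Image.open("native_frame.png")
native_w, native_h = frame.size

# Step 1: upscale the native frame by the FSR factor.
# (Stand-in only: real FSR is an EASU edge-reconstruction pass plus RCAS sharpening.)
upscaled = frame.resize((native_w * FSR_FACTOR, native_h * FSR_FACTOR),
                        Image.Resampling.LANCZOS)

# Step 2: resize the result back down so it fits the screen again.
# The hope is that the reconstructed edges survive the downscale and
# behave like a crude form of supersampled anti-aliasing.
output = upscaled.resize((native_w, native_h), Image.Resampling.LANCZOS)
output.save("fsr_as_aa_frame.png")
```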
Showing 16-21 of 21 comments
SteveZee 4 Sep 2021 @ 3:56pm
Originally posted by Nill:
Originally posted by SteveZee:
You could upscale to a higher target resolution like 4K from a lower one, run the FSR filter, then downscale back to a lower one, but in doing so it might not produce what you think. What you'd be doing is upscaling, say, a 1440p image, or a 1.3x FSR image, applying FSR, then downscaling to your native resolution (you wouldn't apply FSR, then upscale to 4K and back again, as that would effectively do nothing). FSR works by adding "high resolution edges", which would add the higher-resolution edges to the 4K output, and that might help with some AA; the downscale would then, in theory, capture that to some degree. But it might also capture other unwanted aspects and make the final image worse than intended (blur, for example). It's really tricky because FSR is "fake 4K". It doesn't add quality to an image, it just tries to sharpen things that already exist and improve edge quality.

An interesting idea though.

I used Nvidia DSR to run the game between 1200p and 2160p, occupying most of my two screens, then enabled LS with FSR, and the result was disappointing. It just looks the same :(

It really only works with resolutions lower than native, not higher. FSR wasn't meant to be used with supersampled images. Although it would be interesting to see what it does with them.

One issue with that is that DSR and VSR both use a Gaussian filter, amongst other things, which adds blur and slight distortions. DSR apparently has an option that allows a "sharper" image versus a more accurate one. Did you try enabling that, or looking for such an option in the control panel? Running it with the sharper option might make a difference.

HOWEVER....

In any event, enabling DSR and then using FSR isn't the same as using FSR on a native 4K monitor. So effectively what you're doing is supersampling the image, then (probably) letting the driver downscale it, grabbing the result, then using FSR on that (pointless, as the image is already supersampled). It won't really produce much difference unless you can supersample the image, run FSR on it, and then downscale it. And I'm not sure Lossless grabs the frame in the right spot to do this.
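
To make the ordering concrete, here is a rough sketch of the two pipelines being discussed, tracked purely as stage and resolution. The numbers and stage names are illustrative, not measured from the app.

```python
# Rough sketch of the two orderings; resolutions are illustrative.
NATIVE = (2560, 1440)
DSR = (3840, 2160)

def show(title, chain):
    print(title)
    for stage, (w, h) in chain:
        print(f"  {stage:<32} {w}x{h}")
    print()

# What likely happens today with DSR + Lossless Scaling: the driver has
# already downscaled the frame before LS captures it, so FSR receives an
# image that is already at output size and has nothing to reconstruct.
show("DSR then LS (probable current behaviour)", [
    ("game renders (DSR)", DSR),
    ("driver downscales to desktop", NATIVE),
    ("LS captures the window", NATIVE),
    ("FSR pass (no size change)", NATIVE),
])

# The ordering that could act as anti-aliasing, if the frame could be
# grabbed at the supersampled size first.
show("supersample -> FSR -> downscale (the useful order)", [
    ("game renders (supersampled)", DSR),
    ("FSR edge reconstruction", DSR),
    ("downscale to native", NATIVE),
])
```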
SteveZee 4 Sep 2021 @ 4:28pm
Originally posted by Nill:
I used Nvidia DSR to run the game between 1200p and 2160p, occupying most of my two screens, then enabled LS with FSR, and the result was disappointing. It just looks the same :(

I think I know why it won't work for you.

For me, VSR also only works in exclusive fullscreen mode (I have AMD). If I set it to 4K, then run in windowed mode, it plays the game at native 1440p. That's why nothing looks different. For me the FPS at 1440p in fullscreen is the same as at 4K in windowed, which means VSR isn't activated at all. DSR might do the same, which may be why you see no difference.

Lossless only works in windowed mode.
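
As a rough sanity check of the windowed-mode point, here is a Windows-only snippet that reports the desktop resolution, which is what a windowed or borderless game, and therefore LS, actually works with. It assumes nothing about the driver beyond that.

```python
# Windows-only sketch: a windowed or borderless game renders at the
# desktop resolution, which is also what Lossless Scaling captures.
# Querying it while the game is running shows whether a DSR/VSR
# resolution is actually in effect or you are still at native.
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()          # avoid DPI-virtualized values
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN
print(f"Desktop (and windowed-game) resolution: {width}x{height}")
```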
BetterWarrior 5 Sep 2021 @ 2:39am
Some games work with VSR and DSR resolutions; it's all about whether the game accepts a resolution different from the desktop's while windowed.
If it doesn't, you can try fullscreen (not exclusive fullscreen, but the fullscreen that is basically borderless windowed). I tested it, and it works with LS and accepts VSR resolutions in some games.
Also, the reason we want this in FSR is that supersampling will always beat the best AA there is. It's heavy on performance, but if your rig can do it, I don't see why not.
Max7 25 Aug 2023 @ 10:59am
After 2 years, we now have a Radeon version of DLAA, called "FSR Native", which uses an upscaled image of your natively rendered game and then downscales it to your native resolution, getting rid of jagged edges and shimmering, much like a mix of SSAA and TAA, but way less heavy on the GPU than SSAA.
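
For reference, here is a small sketch of the per-axis scale factors AMD documents for the FSR 2/3 presets (quoted from memory, so treat the exact numbers as approximate). Native AA keeps the render resolution equal to the display resolution and uses the temporal pass purely as anti-aliasing.

```python
# FSR 2/3 preset scale factors as recalled from AMD's documentation
# (worth double-checking). "Native AA" renders at full resolution and
# uses the temporal pass only for anti-aliasing.
FSR_MODES = {
    "Native AA":         1.0,
    "Quality":           1.5,
    "Balanced":          1.7,
    "Performance":       2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(display_w, display_h, mode):
    """Internal render resolution for a given display size and FSR preset."""
    scale = FSR_MODES[mode]
    return round(display_w / scale), round(display_h / scale)

for mode in FSR_MODES:
    print(f"{mode:<18} renders at {render_resolution(3840, 2160, mode)}")
```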
Velovar 27 Feb 2024 @ 10:49am
Originally posted by Max7:
After 2 years, we now have a Radeon version of DLAA, called "FSR Native", which uses an upscaled image of your natively rendered game and then downscales it to your native resolution, getting rid of jagged edges and shimmering, much like a mix of SSAA and TAA, but way less heavy on the GPU than SSAA.

Is it possible for this to be implemented into Lossless Scaling?
ogioto 27 Feb 2024 @ 12:34pm
Originally posted by Velovar:

Is it possible for this to be implemented into Lossless Scaling?
No, this is a temporal solution, and the app doesn't have access to the data it would need (the per-frame motion vectors and depth that only the game engine can provide).
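
A short illustration of that point (the field names below are descriptive labels, not an actual API):

```python
# FSR 2-style temporal AA consumes per-frame data from inside the engine;
# a screen-capture tool like Lossless Scaling only ever sees the final image.
FSR2_TEMPORAL_INPUTS = {"color", "depth", "motion_vectors", "camera_jitter", "exposure"}
CAPTURE_TOOL_INPUTS = {"color"}

missing = sorted(FSR2_TEMPORAL_INPUTS - CAPTURE_TOOL_INPUTS)
print("Engine data a capture-based tool never receives:", missing)
```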
Posted on: 30 Aug 2021 @ 3:32pm
Posts: 21