Cyberpunk 2077

Like Button Feb 21, 2022 @ 1:14am
AMD FSR looks blurry
I'm on Quality mode. Compared to CoD Vanguard, this game looks blurrier. Or is my vision just getting bad?
harlekinrains Feb 21, 2022 @ 1:50am 
FSR does nothing to make a game look blurry. Except lowering render resolution.

If we now have to explain to everyone that complexity reduction to the level of "I'm on Quality mode" doesn't work, we'll all hang ourselves within two weeks, or leave this forum never to come back. Probably the second one.

Why doesn't it work?

Because you are announcing "I AM LOWERING RESOLUTION BY DIVIDING X BY 1.5", but you are not telling us what X is. So whatever you wrote doesn't mean much.

Here, look up the table of scale factors:
https://www.tomshardware.com/news/amd-fidelityfx-super-resolution-fsr-performance-tested

You take that value and divide the horizontal and vertical resolution figures by it. Because I see the puzzled look in your eyes already: just divide the vertical resolution number by it, then you'll get it and have to do less math! Isn't that something?

So:

You are on 1440p, you use Quality mode:

1440 / 1.5 = 960. Hooray! You are now rendering at 960p.

Now set the game to 4k and use FSR Performance mode:
2160 / 2 = 1080. Hooray! You are now rendering at 1080p.

♥♥♥♥, why does Performance mode look better than Quality mode, yet run at almost the same speed/performance envelope?! Because output resolution matters with FSR.

Pro tip: use 4k output resolution, FSR Performance, volumetric cloud quality on Medium, everything else maxed (but never use Psycho-level quality settings), ray tracing off, and anisotropic filtering set to 4x, and you get a very playable experience on a 1070, 1660 Ti, 1070 Ti, or equivalent (45-55 fps, but pretty constant).

Or do the easier thing that people will recommend to you and just use FSR Ultra Quality when targeting 1440p.

Ultra Quality mode:
1440 / 1.3 ≈ 1108. Hooray! You now have 28 more vertical pixels than 1080p.

4k at Performance (2160 / 2 = 1080) still looks better, but you have reduced complexity and didn't have to learn anything. Isn't that something...
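
If you'd rather let a computer do the division, here's a minimal Python sketch (my own illustration, nothing AMD ships) using the per-axis scale factors from the Tom's Hardware table linked above:

    # FSR 1.0 per-axis scale factors (from the table linked above).
    FSR_SCALE = {
        "ultra_quality": 1.3,
        "quality": 1.5,
        "balanced": 1.7,
        "performance": 2.0,
    }

    def fsr_render_resolution(out_w, out_h, mode):
        # Divide both axes by the mode's scale factor, exactly as described above.
        factor = FSR_SCALE[mode]
        return round(out_w / factor), round(out_h / factor)

    # The worked examples from this post:
    print(fsr_render_resolution(2560, 1440, "quality"))        # (1707, 960)  -> 960p
    print(fsr_render_resolution(3840, 2160, "performance"))    # (1920, 1080) -> 1080p
    print(fsr_render_resolution(2560, 1440, "ultra_quality"))  # (1969, 1108) -> 1108p
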
-

The next step is to explain what FSR does.

So it lowers input resolution. Super, you've got that: lower resolution, less power needed to render the game world. It then takes the output image (as in, a 2d image), before stuff like film grain gets applied, does a sharpening pass, then does artefact reduction on the sharpened image. Then it does edge detection, followed by a sharpening pass and/or a smudge filter on the edges (also taking into account whether the edge is in the sky or on a box in a dark alley, so contrast dependent, with a ringing filter on top of that). To condense that down: it artificially sharpens the 2d image, but intelligently so, and with filter passes to remove sharpening artefacts, so you don't see an oversharpened, artefact-ridden mess, but something that seems like it is higher detail. But it does this at the image level. Not in the game world.
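
To make "sharpening at the image level" concrete, here is a toy unsharp-mask sketch in Python/NumPy. This is NOT AMD's actual EASU/RCAS code (those are far more sophisticated); it only illustrates the general idea: isolate high-frequency detail, add a fraction of it back, and clamp so you don't get an oversharpened mess:

    import numpy as np

    def toy_sharpen(image, amount=0.5):
        # image: 2d grayscale array with values in [0, 1].
        # Cheap 3x3 box blur built from shifted copies of the image.
        padded = np.pad(image, 1, mode="edge")
        h, w = image.shape
        blur = sum(padded[dy:dy + h, dx:dx + w]
                   for dy in range(3) for dx in range(3)) / 9.0
        detail = image - blur                 # high-frequency detail (edges)
        sharpened = image + amount * detail   # add a fraction of it back
        return np.clip(sharpened, 0.0, 1.0)   # clamp to limit overshoot/ringing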

This is very beneficial: at the same output resolution, Ultra Quality has to calculate fewer pixels but gets you roughly the same quality. But if you are targeting 1440p and set it to Quality, your base render resolution is lower than 1080p.
When you are targeting 4k with FSR Performance, your base resolution is exactly 1080p.

When upscaling 1080p to 4k, the image is less blurry than when upscaling 960p to 1440p, because the input image had more pixels. Do you get that? Better input == better output.
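
Putting numbers on "better input == better output" (plain arithmetic, same Python as above):

    print(1920 * 1080)  # 2,073,600 pixels in a 1080p input image
    print(1707 * 960)   # 1,638,720 pixels in a 960p input, roughly 21% fewer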

Hooray.

Other than lowering input resolution, FSR does NOTHING to blur an image.

Last two points.

What FSR is good at is simulating "higher resolutions than it is rendering". It can't invent lighting effects, shadow detail, diffuse fog, or reflections. That is why with FSR you are usually instructed to max your game settings and then use FSR on top (don't use Psycho settings; don't use ray tracing, except when you have a 3080 Ti and want to run the game with ray tracing maxed at 1080p render resolution close to 60 fps, but then you'd use DLSS, not FSR). FSR is good at simulating higher resolution, not at simulating higher effect passes. (As in, it can't do that. It looks at a 2d image; how should it know how to render shadows or lighting better?)

But it needs a somewhat high input resolution to achieve good results. So going below 1080p is stretching it... (Why does my game look blurry? That's why.)

Last point. For some reason, 4k with FSR Performance looks bad with anisotropic filtering set to 16x, but good when you set it to 4x.

If that's too much for you, stick to 1440p with FSR set to Ultra Quality or Quality; then you can leave that setting at 16x.

But then you make threads stating that it looks blurry to your eyes, because 960p is probably not enough input resolution for your desired detail level. (It's just image enhancement (sharpening, then removing artefacts; making edges smoother, then removing artefacts), not magic. It can't invent detail that isn't in the 960p image.)
--

That's all crystal ball stuff in terms of analysis, because you haven't told us what output resolution you are targeting.

If you are targeting 1080p and use FSR Quality, here is your calculation:

1080 / 1.5 = 720. Congratulations, your render resolution is now 720p.

Do you have any questions?

Why blurry? Just because of lowered input resolution. Not because of anything the FSR algorithm does. Inherently. Probably. (To a large extent.)

I can assure you that 2160 / 2 = 1080p with AF at 4x doesn't look blurry. So, OMG, not even Performance looks blurry!

But 1080 / 2 = 540, so Performance looks blurry! Because you are now rendering at SD resolutions before moving the images through FSR's image processing.
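
Plugging a 1080p target into the fsr_render_resolution sketch from earlier shows how low the input actually gets:

    print(fsr_render_resolution(1920, 1080, "quality"))      # (1280, 720) -> 720p
    print(fsr_render_resolution(1920, 1080, "performance"))  # (960, 540)  -> 540p, SD territory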

Got it? Good.

Now imagine us explaining that to everyone who comes up with this question while misrepresenting what FSR does entirely. Because it doesn't blur a thing.

edit: Also, with FSR, set motion blur to Low or Off; otherwise it gets really blurry in motion. But that's not necessarily FSR's fault.
Last edited by harlekinrains; Feb 21, 2022 @ 2:08am
Bishop-Six Feb 21, 2022 @ 2:21am 
Originally posted by harlekinrains:
FSR does nothing to make a game look blurry. Except lowering render resolution. [...]

That's very useful, thank you!
Aldaris Feb 21, 2022 @ 4:01am 
Originally posted by O.Gefr.Löring.363VD:
That's very useful, thank you!
One thing they didn't explain, though, is scaling of the output image to the monitor's native resolution. Using 4k as an example, a 1080p end render multiplies into 4k by exactly 2x in each direction. This means a 2x2 grid of pixels in the output neatly occupies a 4x4 pixel grid on the monitor when it's scaled up to the monitor's resolution. This is known as integer scaling, because the factor is a nice round whole number, and it keeps the image on the monitor nice and crisp.

Something that isn't integer scaled means your GPU (or your monitor, if it does its own scaling) has to guess which pixels should go where: there's always a remainder of pixels, so some information gets lost (or made up) when you run it full screen, and this leads to blur. The alternative is that the image is centered rather than full screen, resulting in black bars at the sides and top.

For this reason, if you have a 4k monitor, Performance mode can look crisp rather than blurry even though it renders fewer pixels than Quality mode.
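
If you want to check a combination yourself, here is a small Python sketch (my own, just illustrating the whole-number test described above):

    def is_integer_scale(render_w, render_h, native_w, native_h):
        # True when both axes scale up by the same whole number.
        if native_w % render_w or native_h % render_h:
            return False
        return native_w // render_w == native_h // render_h

    print(is_integer_scale(1920, 1080, 3840, 2160))  # True: exact 2x, stays crisp
    print(is_integer_scale(1707, 960, 2560, 1440))   # False: ~1.5x, needs interpolation
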
Last edited by Aldaris; Feb 21, 2022 @ 4:03am