CP77, for some reason, starts showing artifacts and aliasing even when the render resolution is turned down by only 10%. It just does not work out at resolutions below 4K. There are artifacts on the wire fences, on the roads, everywhere.
From what I've seen so far, any game that has a lot of distant vegetation (KCD, The Witcher 3, Horizon Zero Dawn) will definitely show noise in the distance and sometimes closer, as will games whose meshes are not complex (Fortnite, which I've never played, Paladins, Valorant-type graphics).
I tried it on Control, Hellblade: Senua's Sacrifice, RE3 and similar games and never had any artifacts.
One thing, though: make sure you set the scaling to "integer" in the Nvidia control panel and use 1.3x, 1.5x, 1.7x or 2x scaling ONLY, because FSR is designed to work only with these scaling factors (as far as I know). Turn off "Force resize", as it negates every other setting you made in LS.
Things will get better, and we might see fewer artifacts in the future.
1. I suppose you have the "Force resize" option checked in both cases. Then, with scale factor 1.0, it'll simply resize the game window to native resolution, and FSR will do only the sharpening pass because the input and output image sizes are the same. With 1.3, it will first resize the game window to (native res)/1.3 and then scale with FSR.
2. These programs are not injecting anything when it comes to upscaling. They capture the game window with WGC or GDI and then process the image with shaders. But they are created by different people and use different APIs (Direct2D and Direct3D). There are so many factors that can affect performance even within the same program that it's hard for me to say what the reason is for the difference in performance between them.
3. You mean performance? FSR isn't free upscaling; it eats GPU resources. Plus, there is additional overhead to capture and render a frame (compared to a native in-game implementation). Performance depends on the GPU, the game, the game's framerate before scaling, the input resolution, the target resolution and the Windows version. So you have to choose when to use it and when not, depending on the results.
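A minimal sketch of the resize logic in answer 1, assuming a 2560x1440 display (the resolution and the function name are my own illustrative choices, not from the post):

```python
# Assumed logic of the "Force resize" option described above:
# with scale factor 1.0 the window stays at native size, so FSR only sharpens;
# with a larger factor the window is shrunk first, then upscaled back by FSR.

def force_resize(native_w, native_h, scale_factor):
    """Return (input_size, output_size) for a given scale factor."""
    input_size = (round(native_w / scale_factor), round(native_h / scale_factor))
    output_size = (native_w, native_h)  # FSR always targets native resolution
    return input_size, output_size

print(force_resize(2560, 1440, 1.0))  # input == output: sharpening-only pass
print(force_resize(2560, 1440, 1.3))  # window shrunk first, then upscaled by FSR
```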
Do you have an email or some way I can talk to you about something non-publicly?
(PC specs: Radeon 5700XT, Ryzen 3800X and 16GB DDR4 3600MHz RAM)
TL;DR here are two videos of what is written below:
(a video I made for Magpie before I discovered Lossless Scaling)
https://www.youtube.com/watch?v=ywTYSsHKtS0&t=799s
(and LS edition)
https://www.youtube.com/watch?v=RIKf8V3n8Sw
First, here is the software I use:
- Process Lasso, to adjust process priorities so Lossless Scaling and my game have priority over other Windows or background processes. I give LS the "high" priority tag and "above normal" to the game. So, LS is superior.
- RTSS. I have a 1440p 144Hz screen. I use RTSS to cap both the game and LS to 143.973 FPS. "OnScreen Display Support" is disabled for LS.
Once I'm satisfied with the performance results, I change the FPS caps to what I want to play my game at. Mostly 60, so I set 59.973 in RTSS for both LS and the game. I also set my screen to 120Hz, so it aligns perfectly with 60 FPS.
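The arithmetic behind matching a 60 FPS cap to a 120Hz screen can be sketched like this (numbers from the setup above; the tiny 59.973 offset is ignored for clarity):

```python
# Why a 60 FPS cap pairs cleanly with a 120 Hz refresh rate:
# every game frame spans an exact whole number of refresh cycles.
refresh_hz = 120
fps_cap = 60  # the post actually uses 59.973, just under the target

refresh_interval_ms = 1000 / refresh_hz  # ~8.33 ms per refresh
frame_interval_ms = 1000 / fps_cap       # ~16.67 ms per game frame

refreshes_per_frame = frame_interval_ms / refresh_interval_ms
print(refreshes_per_frame)  # each frame is shown for exactly 2 refreshes
```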
I don't use AMD's recommended resolution scaling values, BTW. I use:
1/1.25 (2048x1152) (64% of native pixel count)
1/1.45 (1760x990) (47% of native pixel count)
1/1.6 (1600x900) (39% of native pixel count)
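The pixel-count percentages in that list can be checked with a quick sketch, assuming a 2560x1440 native resolution. (Exact division gives 1766x993 for the 1.45 factor; the post rounds it to 1760x990, and truncates the percentages downward.)

```python
# Verify the listed scaling factors against a native 2560x1440 screen:
# pixel fraction = (1/factor)^2 of the native pixel count.
native_w, native_h = 2560, 1440

for factor in (1.25, 1.45, 1.6):
    w, h = native_w / factor, native_h / factor
    pct = 100 * (w * h) / (native_w * native_h)
    print(f"1/{factor}: {w:.0f}x{h:.0f} ({pct:.1f}% of native pixels)")
```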
Going as low as 1/2 scale without an in-game integration gives very low quality results and is mostly unnecessary unless you have a 4K, 5K or 8K screen, because natively implemented FSR also adjusts the LOD bias for better texture preservation. We don't have that luxury.
As for performance, FSR already eats as much as 1ms per frame at 4K resolution. In the 1440p case, it is roughly 0.50ms.
For example, I set AC Valhalla to 2048x1152 in fullscreen exclusive mode and ran a benchmark. The result was 67 FPS. Then I switched to windowed mode, set Windows visual effects to "performance" and used Lossless Scaling's FSR to upscale it to 1440p. I ran the benchmark again, and this time the result was 64 FPS. This means LS with FSR eats up roughly 0.70ms per frame.
This is why going too close to native resolution can actually cause a performance loss compared to native. You have to reduce enough pixels to compensate for FSR's processing time, and only then can you hope for a performance gain.
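The 0.70ms figure falls straight out of the frame times; here is the arithmetic from the AC Valhalla numbers above:

```python
# Frame-time arithmetic behind the ~0.70 ms overhead figure.
fps_without_ls = 67  # 2048x1152 fullscreen exclusive, benchmark result
fps_with_ls = 64     # same internal resolution, LS + FSR upscaling to 1440p

frame_time_without = 1000 / fps_without_ls  # ~14.93 ms per frame
frame_time_with = 1000 / fps_with_ls        # ~15.63 ms per frame
overhead_ms = frame_time_with - frame_time_without

print(f"LS + FSR cost: {overhead_ms:.2f} ms per frame")  # ~0.70 ms
```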
For lower resolutions, like 1600x900 or 1280x720, coupled with older/slower cards, it is better to use classical resolution scaling with Radeon Image Sharpening.
One more thing: your game needs sample-based anti-aliasing to be properly upscaled. These are:
SSAA
MSAA
SMAA 2Tx
SMAA 4x
TAA
These AA solutions render their images with more samples than their output resolution, so they still look good even after being upscaled to a higher resolution. For example, MSAA 4x renders the geometry at 4 times the selected resolution: a 1080p scene with MSAA 4x has its geometry rendered at 4K. Even though the textures, lighting, shadows and other effects are still at 1080p, the overall image quality is greatly increased and pixel shimmering during movement is greatly reduced. Add a good FXAA to that frame and it is ready for FSR.
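The MSAA 4x claim above comes down to simple sample counting; a sketch:

```python
# MSAA 4x at 1080p: four geometry samples per pixel, which matches the
# pixel (and thus geometry sample) count of a native 4K (3840x2160) image.
w, h, samples_per_pixel = 1920, 1080, 4

geometry_samples = w * h * samples_per_pixel
print(geometry_samples)                 # 8294400 samples
print(geometry_samples == 3840 * 2160)  # True: same count as native 4K
```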
For the above reasons, FSR won't give clean results in games like The Witcher 3 or Batman: Arkham Knight, as they don't have sample-based AA solutions.
Speaking of adding FXAA: I generally use Reshade to add FXAA to the image. The default FXAA in Reshade comes at PS3 quality (15), but you can set it to full quality (39) in the settings. If the game's textures are too blurry (like in RDR 2), I also add slight sharpening with Reshade.
Both are optional, though. If the game doesn't have enough AA, like Crysis 3, I add FXAA, and if the game has strong but blurry AA, like RDR 2, I add sharpening. If the game has both issues, like GTA V, I add both FXAA and sharpening.
About the numbers:
With LS, I always start with scaling + sharpening = 2.
For example, if the scaling is 1.25, then the sharpening is 0.75; if the scaling is 1.45, then the sharpening is 0.55.
For the in-game Reshade numbers, I use the fractional part of the scaling factor.
For 1.25 scaling, 0.25 is used for both FXAA and CAS sharpening.
Of course, each game is different, so these numbers are just starting points. When I have applied all the numbers, I sit back and ask myself "is FXAA needed?" or "is the sharpening too high or too low?" and then make the final adjustments.
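The starting-value rules above reduce to two tiny formulas; a sketch (the helper name is my own):

```python
# Starting values as described above:
#   LS sharpening      = 2 - scaling factor
#   Reshade FXAA/CAS   = fractional part of the scaling factor
def starting_values(scaling):
    ls_sharpening = round(2 - scaling, 2)
    reshade_strength = round(scaling % 1, 2)  # fractional part
    return ls_sharpening, reshade_strength

print(starting_values(1.25))  # (0.75, 0.25)
print(starting_values(1.45))  # (0.55, 0.45)
```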
Finally, FreeSync works on my monitor, but it tries to catch up with both LS's FPS and the game's at the same time. If you cap both of their FPS values to the same number, like I did, FreeSync works perfectly.
Good job! Maybe you can find the reason why I can't get VRR to work.
Laptop: Intel Core i5 3210M, Nvidia GT 650M, Windows 8.1.
Windows gives me an error message and closes Lossless Scaling.
I used Lossless Scaling for older games, and it worked great before the new update.
Hoping for help! THANKS!
https://atyuwen.github.io/posts/optimizing-fsr/
BUT
This time the game stutters like anything, at around 15 FPS, when until now it used to run fine (with LS) at 50-55 FPS.