I recommend using DLAA. The game will use DLSS's AI technology to make 1080p look better than plain 1920x1080 rendering, and it uses far less processing power than running at 4K.
(It's not upscaling, and it's not just plain 1920x1080 rendering either: it uses DLSS's AI anti-aliasing tuned for your monitor's native resolution.)
So the highest FPS and best quality on a 1920x1080 screen is almost certainly just using DLAA.
I'm assuming you have an NVIDIA RTX card. If you didn't, you wouldn't be able to handle 4K in this game anyway (4K without DLSS is too demanding even for an RTX 4080).
Well, if you can't use 4K->1080p via DSR then you can't really compare it to DLAA. I get that you're comparing to OTHER games, but this game's art assets might not work great on a 1080p monitor.
Also, I don't recommend using Frame Generation if you are, as it adds a lot of artifacts like smearing between moving objects, and the input lag corresponds to roughly half the displayed FPS (so at "80FPS" the lag is just below what it would be at 40FPS).
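The latency point boils down to simple frame-time arithmetic: with frame generation, only every other displayed frame is rendered from your input. A rough sketch (illustrative only; real input latency also depends on the render queue, driver, and display):

```python
# Rough frame-time arithmetic for frame generation (illustrative only;
# real input latency also depends on render queue, driver, and display).

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# With frame generation, interpolated frames are inserted between real
# rendered frames, so input latency tracks the BASE frame rate.
displayed_fps = 80
base_fps = displayed_fps / 2              # 40 real frames per second

print(frame_time_ms(displayed_fps))       # 12.5 ms between displayed frames
print(frame_time_ms(base_fps))            # 25.0 ms between input-sampled frames
```

So an "80FPS" counter with frame generation on still has the roughly 25 ms per-frame input cadence of 40FPS, not the 12.5 ms of a real 80FPS.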
If you "trick" the computer via the Windows display settings as mentioned, I'm not sure what happens. I'd assume the game thinks you have a 4K screen, which might cause problems with how the HUD displays. I just don't know.
Honestly, I think getting a half-decent G-Sync Compatible 1440p monitor is the way to go. $300 USD or less?
Monitors Unboxed has some good videos:
https://www.youtube.com/@monitorsunboxed/videos
I have a nice 2K HDR display. Nvidia DSR works just fine in 90% of games, and 2K is usually a better compromise for frame rates.
Most people seem to mean 1080p so I assumed that's what you meant.
Anyway, 1440p DLAA looks really good AFAIK (I tested Horizon Forbidden West at 1440p DLAA before moving to 4K), and I doubt 4K->1440p via DSR looks better.
Either way, I don't think there's an easy way to do what you want. So..
a) what's your HARDWARE (GPU, CPU), and
b) what EXACTLY does (1440p?) DLAA do that you don't like?
2560x1440 is 2K for me (it used to be the norm on PC; I don't get calling 1080p that, lol).
My system is Win11, 3080 Ti OC, 5800X3D, latest drivers, 32GB RAM.
I usually play at DSR 4K in single-player games when I can (Ghost of Tsushima, Uncharted 4, etc.). It's definitely doable with my card at 60fps without ray tracing / ultra features.
It's not that something is especially wrong with my game and DLAA; I'm just surprised it doesn't work out of the box like in other Sony games.
For reference, I'm getting around 90fps at 2K maxed out with DLAA right now.
105 with FSR 3.1, but it flickers a lot as of now, so it's really not usable.
I'm still confused. WHAT exactly is going on that you think 4K rendering would improve on? Are you getting too much aliasing (jagged edges)?
(The reason "2K" = 1440p was confusing, and I thought terribly stupid, is that "4K" screens were called that because the horizontal pixel count was roughly 4000, i.e. 3840 pixels across. For 1080p that number is 1920, so "2K" = 1080p would make sense. Monitor manufacturers were using "2K" to mean different things; eventually they threw their hands up and now either use the full numbers (i.e. 2560x1440) or, if they do say "2K" monitor, they're likely to write "2K (1440p)" to avoid confusion.
I'm not sure where 2K = 1440p came from. Possibly because people decided the pixel count was roughly HALF of 4K, but again, the "K" means thousands of pixels across the horizontal. Sigh.
Hey, I'm still annoyed about "IDE vs SATA" and "x86 vs x64", which make no sense...
It SHOULD have been 2K = 1080p.)
Digital cinema (DCI) projectors use horizontal resolutions of 2048, 4096 or 8192 pixels, with a variable vertical resolution that depends on the aspect ratio of the material.
When "4K" TVs first appeared on the market, there was no official home video standard like FullHD (1920x1080), so many people simply started calling them "4K". Today the standard is officially called UltraHD (3840x2160).
Following that naming, a "2K" monitor would be FullHD, i.e. 1080p. I've never heard anyone call a 1440p display "2K" before.
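The whole naming argument comes down to horizontal pixel counts rounded to the nearest thousand. A quick sketch (DCI projector widths are the standard cinema values; the consumer widths are the usual panel resolutions):

```python
# Horizontal pixel counts behind the "2K"/"4K" naming debate.
# DCI cinema projector widths are exact powers-of-two-ish standards;
# consumer panel widths are the familiar 1920/2560/3840.
resolutions = {
    "DCI 2K projector": 2048,
    "DCI 4K projector": 4096,
    "FullHD (1080p)":   1920,
    "QHD (1440p)":      2560,
    "UltraHD (2160p)":  3840,
}

for name, width in resolutions.items():
    # "K" naming = horizontal width rounded to the nearest thousand
    print(f"{name}: {width} wide -> {round(width / 1000)}K")
```

By this rule 1920 rounds to "2K" and 3840 to "4K", while 2560 actually rounds to "3K" — which is exactly the earlier point that calling 1440p "2K" doesn't follow the convention.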