Btw: https://steamcommunity.com/app/236870/discussions/0/143388250766965858/
"nothing out right now will play this game at 1080p with 2x supersampling with a single card. you are basicly playing the game at 3840 x 2160 (4k)"
No NVIDIA DSR, no overrides in the NVIDIA Control Panel whatsoever.
But yes, Hitman 2016 is MUCH more demanding overall.
But it also still performs worse under DX12 than DX11, so keep that in mind too.
Your linked post is also from January; a lot has changed since then regarding game performance, both through game updates and GPU drivers.
https://www.howtogeek.com/316682/how-to-run-pc-games-at-resolutions-higher-than-your-monitor-with-supersampling/
"Here’s an Overwatch character being rendered with standard, screen-matching resolution on the left and a 200% super sampling technique on the right. Both are displaying at 1080p, the maximum resolution of many standard monitors. But the image on the left is being rendered in the game’s engine at 1080p, while the image on the right is rendering at 4K (3840×2160)"
To "mimic" 4k, you would need 4x SSAA, but it all depends heavily on implementation again.