Now... as for the input lag issues. When we talk about 4K on a monitor, we usually mean 3840x2160. The problem is that 3840x2160 is an interlaced resolution under digital film standards, so your GPU outputs it as such while your screen is trying to display it in progressive scan mode. You can easily get the progressive-scan version of that resolution with 4096x2304. You will have to add it as a custom resolution in NVIDIA Control Panel, but using 4096x2304 may reduce input lag. So what's the big deal? Interlaced resolutions already run at half the monitor's refresh rate, because the scan lines are displayed one at a time.
What in the world are you going on about? Where do you get your information?
4K UHD is not interlaced, and changing the pixel output doesn't change that fact. The numbers you grabbed for that second resolution come out of nowhere and aren't a native resolution. Perhaps you are confusing it with cinema 4K (4096x2160)? That is a change in aspect ratio, not a change in scanning method.
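For reference, here's how the numbers actually stack up (a quick Python sketch; the UHD and DCI figures are the published specs, the 4096x2304 value is just the one quoted above):

```python
# Compare the resolutions being discussed: UHD 4K, cinema (DCI) 4K,
# and the 4096x2304 figure from the earlier post.
resolutions = {
    "UHD 4K (3840x2160)": (3840, 2160),
    "DCI 4K (4096x2160)": (4096, 2160),
    "Quoted (4096x2304)": (4096, 2304),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MP, aspect {w / h:.2f}:1")

# UHD 4K (3840x2160): 8.29 MP, aspect 1.78:1  <- 16:9
# DCI 4K (4096x2160): 8.85 MP, aspect 1.90:1  <- cinema container
# Quoted (4096x2304): 9.44 MP, aspect 1.78:1  <- 16:9, not a display standard
```

Note that both 3840x2160 and 4096x2304 are 16:9, so switching between them changes the pixel count, not the scanning method.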
It's usually proportional to the resolution percentage, like Monk put it, but it also depends on many factors like VRAM: the higher the resolution, the higher the potential VRAM usage. The easiest example is Wolfenstein II: using the ultra (not max) preset at 1080p, the GTX 1060 6GB is 71% faster than the GTX 970 3.5GB. Keep the same settings and step up to 1440p or 4K, and the GTX 1060 6GB is up to 100% faster than the GTX 970 3.5GB. The 3.5GB of VRAM is likely one of the factors here.
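To put rough numbers on the resolution-to-VRAM relationship, here's a back-of-the-envelope sketch (the buffer count and bytes-per-pixel are assumptions for illustration; real engines vary a lot, and textures usually dominate):

```python
# Rough render-target cost per resolution. Render targets are only a slice
# of total VRAM use (textures, geometry, and driver overhead dominate),
# but they show how the cost scales linearly with pixel count.
def render_target_mb(width, height, buffers=5, bytes_per_pixel=4):
    # buffers=5 loosely models color + depth + a few G-buffer targets (an assumption)
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for label, (w, h) in [("1080p", (1920, 1080)),
                      ("1440p", (2560, 1440)),
                      ("4K UHD", (3840, 2160))]:
    print(f"{label}: ~{render_target_mb(w, h):.0f} MB of render targets")

# 1080p:  ~40 MB
# 1440p:  ~70 MB
# 4K UHD: ~158 MB
```

The absolute numbers are small, but every full-resolution buffer a game allocates scales the same way, which is how a card already sitting near its limit (like the 970's 3.5GB fast segment) falls off a cliff at higher resolutions.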
And you could always use virtual super resolution and play your games at that super resolution to see what FPS you get.
You can set up a virtual super resolution in your GPU control panel.
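Just remember the GPU renders at the virtual resolution, so the pixel load scales accordingly (a quick sketch; the resolutions shown are typical DSR/VSR-style steps, purely illustrative):

```python
# Extra pixel load when rendering at a virtual super resolution on a 1080p panel.
native_w, native_h = 1920, 1080
native_px = native_w * native_h

for w, h in [(2560, 1440), (3840, 2160)]:
    factor = w * h / native_px
    print(f"{w}x{h} on a {native_w}x{native_h} panel = {factor:.2f}x the pixels to render")

# 2560x1440 on a 1920x1080 panel = 1.78x the pixels to render
# 3840x2160 on a 1920x1080 panel = 4.00x the pixels to render
```

So the FPS you see at a 4K virtual resolution is a decent preview of what a real 4K monitor would cost you.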
This is not 1440p. It is ultrawide 1080p (2560x1080).
1440p would be roughly 1.8x the pixels of standard 1080p.
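The actual pixel ratios, for anyone checking the math (a one-liner sketch):

```python
# Pixel counts relative to standard 1080p.
px = lambda w, h: w * h
base = px(1920, 1080)                   # 2,073,600 pixels
print(f"{px(2560, 1080) / base:.2f}x")  # 1.33x -> ultrawide 1080p
print(f"{px(2560, 1440) / base:.2f}x")  # 1.78x -> 1440p
print(f"{px(3840, 2160) / base:.2f}x")  # 4.00x -> 4K UHD
```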