I was playing Monster Hunter World recently on a 9th Gen i9 + RTX 2080 Ti.
A) 1440p runs at about 120-140fps.
B) 4K runs at about 80fps.
In this case, I much prefer A).
When I play SoulCalibur 6 (capped at 60fps, as a fighting game), both 4K and 1440p run at 60fps.
So I just use 4K for video; my audience prefers 4K over 1440p.
My 4K setup is a used TV (Sony X85K, 42-inch).
The TV does not support RGB (4:4:4) at 120Hz, only 4:2:2, so the chroma resolution is effectively halved.
(It does support RGB at 60Hz, though.)
Ref.
https://www.pengohome.com/Learn_Detail.asp?LiD=256CFF928D8943134016E659D7F6946B
So if you are OK with 60Hz, even a low-end 4K TV would be fine.
But if you want 4K at 120Hz, it's better to check the detailed specs and reviews.
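For context on why a display might drop to 4:2:2 at 120Hz, here is a rough sketch of the raw data rates involved. This ignores blanking intervals and link-encoding overhead, so real HDMI bandwidth requirements are higher; treat the numbers as lower bounds:

```python
# Rough uncompressed video data rates for 4K at 120Hz.
# Ignores blanking intervals and link-encoding overhead, so these
# are lower bounds on what the cable/port actually has to carry.

def data_rate_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

# RGB / 4:4:4 at 8-bit: 24 bits per pixel.
rgb = data_rate_gbps(3840, 2160, 120, 24)

# 4:2:2 subsampling keeps full luma but halves chroma horizontally:
# per 2 pixels you send 2 Y + 1 Cb + 1 Cr samples, i.e. 2/3 the data,
# or 16 bits per pixel at 8-bit depth.
yuv422 = data_rate_gbps(3840, 2160, 120, 16)

print(f"4K120 RGB 8-bit:   {rgb:.1f} Gbit/s")   # ~23.9
print(f"4K120 4:2:2 8-bit: {yuv422:.1f} Gbit/s")  # ~15.9
```

That gap is why a TV (or an HDMI port/cable) that can't sustain full 4K120 bandwidth may fall back to 4:2:2, which mainly hurts fine colored text and sharp UI edges rather than overall detail.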
4K 27 inch: 163ppi
1440p 27 inch: 109ppi
Do the maths and you see the gains. Going from 1080p to 1440p is about a 33% increase in ppi, and 1440p to 4K is about 50%. So why do people say it's not as big a jump as 1080p to 1440p, when in raw pixel count the 1440p-to-4K jump actually adds nearly three times as many pixels?
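The ppi figures above are easy to verify yourself; a quick sketch of the calculation, assuming a 27-inch diagonal for all three resolutions:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in [("1080p", 1920, 1080),
                   ("1440p", 2560, 1440),
                   ("4K",    3840, 2160)]:
    print(f'{name}: {ppi(w, h, 27):.0f} ppi on a 27" panel')
# 1080p: 82 ppi, 1440p: 109 ppi, 4K: 163 ppi
```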
Because 99% of the people who chime in have never actually used a 4K monitor and are just repeating something they read on a forum or saw on YouTube. You can really spot the clueless ones when they start telling you that you need some specific monitor size-to-resolution ratio.
I game on two different rigs, hence two different monitors. One's a 3840x1080 49" 32:9 144Hz display, and the other a 3440x1440 34" 21:9 75Hz display. I think as long as the monitor is no bigger than 32" to 34", 1440p is perfectly fine. It's only when you go higher, like >40", that I think 4K becomes necessary.
Also, bear in mind that going up to 4K only makes sense IF you have, or intend to get, a powerful GPU to go with it: at minimum an RX 6950 XT/RTX 3080 Ti, or an RTX 4080/4090 or the incoming RX 7900 series.
Basically, unless you're using a magnifying device, you won't notice anything past 300ppi.
Same reason humans can't see microscopic things with the naked eye.
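The 300ppi figure depends entirely on viewing distance. A common rule of thumb is that 20/20 vision resolves roughly 60 pixels per degree; taking that assumed value (it's an approximation, not exact optics), the limit works out as:

```python
import math

def ppi_limit(distance_in, pixels_per_degree=60):
    """Approximate ppi beyond which a viewer at distance_in inches can
    no longer resolve individual pixels (20/20 acuity rule of thumb)."""
    # One pixel subtends 1/pixels_per_degree degrees; its physical size
    # at distance d is about d * (pi/180) / ppd inches, so ppi = 1/size.
    return pixels_per_degree * 180 / (math.pi * distance_in)

print(f'{ppi_limit(12):.0f} ppi')  # phone held at ~12 inches
print(f'{ppi_limit(24):.0f} ppi')  # desktop monitor at ~24 inches
```

By this rule, ~300ppi only matters at phone distances; at a typical two-foot desk distance the limit is closer to ~140ppi, so a 163ppi 4K 27" panel is already near the point of diminishing returns.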
Build quality matters as well, though, which is why rtings.com shows the subpixel layout for each monitor it tests.
Like so:
https://i.rtings.com/assets/products/tPauqa85/dell-alienware-aw3423dw/pixels-large.jpg
I currently use 4K and I am downgrading to 3440x1440.
There are far more important things than resolution, imo. In particular, be prepared to spend a lot more on your graphics card, because 4K is still difficult to drive if you want good frame rates and don't want to start messing with things like DLSS.
Also why I mentioned build quality matters.
And even I can easily tell the visual difference between two monitors of the same size but different resolutions. The only time I've struggled was with a game/image displayed on an 8K TV: was the on-screen content rendered at 4K or 8K? I couldn't tell. But I could tell it was an 8K screen.
Simply because I like higher fps and higher monitor refresh rates, I see no need for 4K. It's just going to slow down my games.
But if you're planning to get a 32" monitor, then you need a 4K version.
If bad game coding and lack of foresight is how we move with tech, then we will end up with mega-size 8K displays. I really get irked by their silly argument: "let's all stay at 90ppi so the UI is just right."
I'd rather move on and play something that lets me enjoy high PPI and has scaling that stays consistent no matter the PPI. If a program's UI is too small, then that software is not modern and should be patched.