A 4090 on Full HD with medium settings. The difference is absolutely meaningless, because no one will play this game at those settings with such a GPU. Those settings are chosen for the single purpose of making the graph show any difference at all.
Because at realistic settings (like 4K max details) almost none of the CPUs make any meaningful difference (besides the absolute low end).
And even a Ryzen 7 7700X can produce a constant 60+ FPS at 4K max settings, and could produce even more if the limiting factor weren't the GPU...
Any gamer who thinks such a CPU will make a meaningful difference in an actual game has absolutely no clue about computers and games...
But: the CPU is still astonishing (not so much for gaming, more for raw power in general; for applications, even the non-X3D performs a bit better here).
Lower graphical settings are chosen because they remove the chance of a GPU bottleneck and give a more accurate representation of relative CPU performance. It's the industry standard for benchmarking. The 1% and 0.1% lows also matter for reducing noticeable micro-stutter.
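For anyone unfamiliar with how those lows are derived, here is a minimal sketch of one common approach (my own illustration with made-up frame times, not taken from any specific benchmark tool; some tools instead report the FPS at the 99th/99.9th percentile frame time rather than averaging the slowest frames):

```python
# Sketch: compute average FPS and 1% / 0.1% low FPS from per-frame times.
# Frame times are in milliseconds; the sample data below is invented for illustration.

def low_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 for the 1% lows)."""
    slowest = sorted(frame_times_ms, reverse=True)       # longest frame times first
    count = max(1, int(len(slowest) * fraction))         # how many frames fall in that slice
    worst = slowest[:count]
    return 1000.0 / (sum(worst) / len(worst))            # ms per frame -> frames per second

# Mostly smooth 60 FPS frames with a few spikes mixed in (hypothetical capture).
frame_times = [16.7] * 980 + [33.3] * 15 + [50.0] * 5

avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"average FPS:  {avg_fps:.1f}")
print(f"1% low FPS:   {low_fps(frame_times, 0.01):.1f}")
print(f"0.1% low FPS: {low_fps(frame_times, 0.001):.1f}")
```

With data like this the average looks close to 60 FPS while the 1% and 0.1% lows drop far below it, which is exactly the micro-stutter those metrics are meant to expose.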
It still has no real relevance for actual games and is absolutely misleading...
If I remember correctly (I could be mixing them up with someone else), Gamers Nexus even admitted this a while ago in a video...