Heard Microsoft Flight Simulator is demanding on all its "pro" settings, so perhaps, nowadays, that game is the "ultimate test". Someone correct me if I'm wrong.
The one melting machines like Crysis, though, seems to be The Last of Us.
Tragic.
Games aren't something that parallelize well linearly or by default. People have been saying this for ages, yet now that CPUs have the core and thread counts they do today, whenever a game is limited by a single thread (which is almost always the case when it's not GPU limited...) and only uses a single-digit or barely double-digit percentage of the CPU, people think that justifies labeling it "poor optimization"? What?
Cores aren't something that just help by default, but people think that if they don't, the coders don't know what they're doing? When per-core speed has only roughly doubled over a decade, yes, something that was performance-heavy back then might still be today (especially with higher-refresh-rate gaming being more common).
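Since the point about single-thread limits and low reported CPU usage comes up repeatedly in this thread, here is a minimal back-of-the-envelope sketch (my own illustration, not from any poster; the core counts and serial fraction below are made-up example numbers): a game pegging one thread of a 16-thread CPU reads as only about 6% total usage even though it is completely CPU-bound, and Amdahl's law shows why extra cores give diminishing returns when part of each frame is inherently serial.

# Illustrative sketch with hypothetical numbers; not taken from any game.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    # Amdahl's law: speedup from `cores` when `serial_fraction` of the
    # per-frame work cannot be parallelized.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

def reported_utilization(busy_threads: float, hardware_threads: int) -> float:
    # Task-manager-style figure: fully busy threads / total hardware threads.
    return 100.0 * busy_threads / hardware_threads

# One fully loaded thread on a 16-thread CPU shows ~6% "CPU usage",
# yet the game is still limited by that one thread.
print(f"{reported_utilization(1, 16):.1f}% total CPU usage")  # 6.2%

# Even if 70% of the frame parallelized perfectly, 16 cores would only
# cut frame time by about 2.9x, not 16x.
print(f"{amdahl_speedup(0.3, 16):.2f}x frame-time speedup")  # ~2.91x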
Hardware being new or "high end" are BOTH arbitrary labels and nothing more. A game does NOT care about either. So just because something is new or high end does not mean it SHOULD perform at whatever imagined level of desired performance in a given scenario.
Apparently you have no idea of the game's history with hardware and performance, do you? It wasn't optimized back when it was a hit, it's not optimized now, the engine was garbage when it was new and still is, nothing's changed....
Can't really have more resolution on a flatscreen than flatscreens can display.
I was just saying that, in general, it's annoying when people go straight for the "it's unoptimized" excuse, either when something runs poorly and continues to years later, or when it only uses a small portion of a CPU's total resources on today's multi-core CPUs. I explained why either can happen, and why neither necessarily means something isn't optimized. If you weren't doing that, sorry, but a LOT of people reach for that excuse for either of the reasons I mentioned. "I have a high-end this or that, so why isn't it using much of this part, or why is performance only this high" is a really common complaint from people who don't understand.
I can't speak for Grand Theft Auto IV as I'm not too familiar with it, no.
I CAN speak for Crysis, as I ran it back then more than fine on a Core 2 Duo E8400, so if people are having issues with it on anything faster, I don't know what to tell you. It was graphics cards that struggled with that game, and that was because it was visually heavy (and coincided with the time frame when widescreen and higher resolutions were being adopted), but by the time Fermi came around, it was far from unplayable.