device=c:\dos\himem.sys
device=c:\dos\emm386.exe 2048
dos=high
files=30
It was an art...
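For anyone who never had to do this: himem.sys provides extended (XMS) memory, emm386.exe here emulates 2048 KB of expanded memory, and dos=high moves most of DOS itself out of the 640K of conventional memory. The other half of the ritual lived in AUTOEXEC.BAT, pushing drivers and TSRs out of the way with LOADHIGH. A rough sketch from memory, with the driver names and paths being assumptions rather than anything from the post above:

@echo off
prompt $p$g
path c:\dos
rem push the disk cache and mouse driver into upper memory
lh c:\dos\smartdrv.exe
lh c:\dos\mouse.com

(For LOADHIGH to actually have anywhere to put things you'd also want dos=high,umb and the ram switch on emm386 in CONFIG.SYS; squeezing a game's sound and CD-ROM drivers into whatever conventional memory was left was the real art.)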
Pentium 200MHz
64MB RAM
6GB WD HDD
S3 2MB video card
Back then, hardware was advancing so fast, and as a result, so was gaming. If you wanted to play a new game, there was a legitimate chance that your merely three-year-old PC might not play it too well. Hardware advancement has since slowed a lot in some areas, especially CPUs, but it's also happening with GPUs (relative to price point, anyway). It got to the point where I didn't even have to look at requirements with my prior Core i5 2500K and GTX 1060, because that combination would just run about everything. Obviously it might not handle a few of the more demanding games at a high level, but it was a decade old, so that's understandable; there wasn't much it wouldn't do acceptably given how ancient much of it really was. That leads me right into my next point.
Years and years ago, the norm was also much different than it is today. It was more normal for frame rates to slip into the 40s or even 30s on newer titles, and most people were fine playing that way. People didn't always get 60 FPS, whereas today some people act like they'll melt if they fall below that. So what you really need to game enjoyably depends on whether you're okay with, say, 1080p, 60 Hz, and tuning the settings, versus "I've got to have 120 FPS at ultra settings at 1440p". As long as you're not too stressed about having it all, there's typically more GPU out there than most people need, and paired with the recent supply-and-demand situation, it's easy to see why people are sticking with aged hardware. There's little value in upgrading from something that still largely works fine.
This sort of belongs with the above, since it's still part of the second overall difference ("the norms of the gaming landscape have changed"), but in addition to standards being higher, gaming as a WHOLE is broader than it has ever been in what it needs. There are still demanding triple-A titles, but they aren't the norm like they used to be; Crysis was one of the last of its kind, and even many of them run fine on older hardware.
Between the recent consoles being a real uplift (whereas the last generation sort of wasn't), CPUs with more than 4 cores being readily available, and Windows 11 likely cutting off older hardware and establishing a new floor (due to TPM 2.0), my guess is that in two or three years (presuming demand doesn't keep outstripping supply that long) you'll start to see things change a bit, to where really old hardware will begin to struggle. But I don't foresee it going all the way back to the days when two-to-three-year upgrade cycles were common and necessary rather than a luxury, because hardware just isn't advancing like that anymore.
Minimum requirements
Recommended requirements
Hardcore gamer requirements
Welcome to godhood requirements
Tbh it's questionable how much further things can progress in a short time. I doubt 8K will become a mainstream thing for a while, as nothing really runs at 8K yet. There's a new lighting system coming soon that may make DLSS ray tracing look old, but how much more can they really do? The only way to go is to expand rather than upgrade, such as Xbox and PlayStation expanding into specialist PCIe cards to run console games natively on a PC. DDR5 is just on the horizon, but it isn't a massive jump in terms of tech.
Lots of the very early 3D games dropped below 30 and even 20 FPS even on the hardware they were made for, and people still played them.
As for tiers of requirements, it's a good idea, but in practice even minimum and recommended are often just "this is what we the developers sanction" and are all over the place as far as accuracy goes. I've played games below minimum (typically on the CPU side) and not only played them, but played them well, yet played some near recommended and still had issues at times (think heavier games). That's just how it goes: if you want to PC game, you're going to have to learn and adjust what you need, especially if you know you're targeting 4K, 120+ FPS, etc.
Yep, it was just a different time. I understand the whole "it's hard to go back" once you've had 120 FPS (I had a CRT, and moving to LCD was a bit of an adjustment since even the desktop felt sluggish), but part of that IMO comes from the display itself (as in, even at 60 FPS you get a nicer experience on a panel with better response). Maybe it's just my "low standards" and the fact that I don't have the funds to throw the better part of four figures at a GPU every 2 or 3 years, but 60 FPS is absolutely okay with me. I get the impression some people are simply unable to live with less; I see people talking about "drops to 80" or "drops to 60" and it's an eye-opener to me (though regardless of the frame rate, frequent shifts up and down can be more distracting than a stable experience).
Not quite AS far back as you're referencing, but one of my favorite hardware reviews ever is the one Tom's Hardware did of the GeForce 6800 series.
https://www.tomshardware.com/reviews/performance-leap,789.html
The opening paragraph is enlightening; they talk about how reaching 50 FPS at high settings in the most demanding titles was something to get excited about, and that was at 1024 x 768, while the same paragraph mentions 1600 x 1200 as something only a top-end gamer might have had in those days (CRT times, so you could just drop the resolution without losing as much image quality as you would today).
Also notice how generations were more regularly a year apart in those days; these days they're more like two years, with refreshes in between. GPUs are stagnating now like CPUs, and you have to pay up to get the improvements. I can't justify that for 120 FPS when I'm okay with 60 FPS even if there are some drops.
A 49-page GPU review... Back when tech news/media took their jobs seriously and put out some super detailed write-ups.
Also back when there were 20+ worthwhile, benchmark-worthy games to list that people would actually have an interest in...