Win7 came out in 2009 so yes, by 2010-2011, 8GB was the norm for a gaming PC. 4GB RAM was fine for more basic usage needs.
Heck in 2005 I was already using 4GB RAM with XP on my Athlon64-939 system.
Anyways no point in us going on about it.
Anyway it sounds like you were way ahead of everyone in 2005 then.
My own memory of the mid 2000s is that the big 1 GB was the desirable amount to have around then, along with the Athlon 64 and the GeForce 6800. I remember discussions on whether 512 MB was still enough a bit after games like Far Cry and Doom 3 were out (and the consensus I remember from then was that yes, it was as far as actual game performance went, but it might be a bit constrained, like having some longer load times, or you'd be pushing it if you were multi-tasking). Back in the mid 2000s, many of us may have still been scraping along with just 256 MB or 512 MB of RAM (I specifically remember having 384 MB for a long while).
It was a bit before Windows Vista (and partly because of it) that 2 GB started becoming common, in my mind. The point where 4 GB was actually standard was closer to the release of Windows 7. Then DDR2 got super cheap in the very late 2000s (which is why I got another 4 GB to add to my existing 4 GB), right before DDR3 came out. DDR3 was super expensive at first and then got cheap after a single generation (which made Sandy Bridge super popular), which in turn made some people go with 16 GB (like me). So yeah, you're probably rather accurate on 8 GB being the early 2010s, at least if you're talking about the go-to amount to buy for gamers/enthusiasts. But there's a difference between that and what is "common". People may have started buying 8 GB in those times, but it took a bit longer to become more common. Likewise, people may have shifted more towards buying 32 GB these days, but Steam shows the average as 16 GB, and likely will for a while (as I think a couple/few short years ago it still showed 8 GB?).
Ah, this post. I don't think it's a hardware issue.
https://steamcommunity.com/discussions/forum/11/3820780544825870131/
Yes, everyone in the industry was just laughing when folks like Dell were offering supposedly good-spec PCs with Vista and 2 or 4 GB of RAM. And then, on top of that, many of those systems shipped with 32-bit Vista.
Vista was just terrible all around. Win7 ran much better overall with the same amount of RAM compared to Vista. Win7 64-bit SP1 was very doable with 4 GB of RAM. I remember a hand-me-down PC I had redone for my sister to use: an AM2+ motherboard, a Phenom II X4 CPU, 4x 1 GB DDR2-800, and a GTX 550 Ti. It wasn't great, but it could run Dying Light at 1080p without much issue.
32-bit has a 4 GB limit (see the quick arithmetic below).
Windows 10 64-bit is a much heavier OS.
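For anyone curious where that 4 GB number comes from, it falls straight out of 32-bit addressing. Here's a minimal back-of-the-envelope check (just plain Python arithmetic, nothing Windows-specific; the ~3-3.5 GB figure in the comment is only the typical usable amount left after device/MMIO reservations and varies by system):

# A 32-bit pointer can address 2^32 distinct byte locations.
addressable_bytes = 2 ** 32
print(addressable_bytes)              # 4294967296 bytes
print(addressable_bytes / 1024 ** 3)  # 4.0 -> exactly 4 GiB, the ceiling for a 32-bit OS

# In practice, 32-bit Windows tends to report roughly 3-3.5 GB of usable RAM,
# since part of that 4 GiB address space is reserved for devices (MMIO),
# not for system memory.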
During the late '90s and '00s, PCs evolved at what can only be considered breakneck speed. Like a "big bang" start to the industry: four years was a revolution.
Windows XP arrived in 2001, and 1 GB machines were prohibitively expensive for most consumers in the USA, but Windows XP brought existing PCs to their knees and forced upgrades. It was the end of the 32-bit era, and since the consumer 32-bit x86 platform didn't support more than 4 GB, there was almost zero appetite for more than that throughout the rest of the decade.
Then Windows 7 arrived as a primarily 64-bit OS and, in practice, demanded 4 GB minimum. I don't know how many people got around to running the Athlon 64 with Windows 7, as its single-core performance was better suited to the 32-bit era.
Vista's launch was terrible. Vista itself was AWESOME. I 100% preferred it over Windows 7.
Windows Vista was much more aesthetically pleasing and had all the necessary bells and whistles for as long as video games were still designed to be backwards compatible with Windows XP. It also maintained feature parity with Windows 7 throughout the rest of its life cycle.
Likewise, Windows 7 really didn't "need" less hardware for the most part, and it didn't even inherently "fix" all of the things that were wrong with Windows Vista (not to say it didn't fix any). Instead, the mere passing of time meant the barriers to entry, like high hardware demands and poor driver support, were just less of an issue.
I think the long lifespan of Windows XP just made people lash out at what came next.
Almost like we're seeing the same thing again. Windows 10 enjoys a lifespan twice as long, and Windows 11 is blasted, and people are ALREADY praising Windows 12 without knowing what it will even be! I'm anticipating it will be Windows 11 2.0, just like Windows 7 was Windows Vista 2.0. And likewise, people will likely embrace Windows 12 and not realize the hypocrisy.
Though I personally prefer Windows 7 to Windows Vista, it's refreshing to see others can distinguish between "had a bad launch and life cycle" and "was inherently bad".