The moral of the story is, console gaming has extremely little impact on how PC gaming hardware is developed. You can't really directly compare them either, because they're fairly different: console games are made for a specific hardware set, while PCs can have anything. The fact of the matter is, most users still have ~4 cores. 6+ cores are only becoming more common now because people are now deciding to upgrade.
Ahh, I see. That makes sense, yes. I never looked up the PS4 specs; somehow I assumed its core count would be lower. No need to hold my breath over speculation about PS5 ports coming to PC then either, I guess.
Yes, that is what everyone believed a while back, but it is no longer true.
Personally, I never heard this "eight cores will be needed soon" seven years ago. The only 8 core CPUs the mainstream even had were, what, some AMD CPUs which were loosely defined as eight cores but weren't eight full, traditional cores? All I heard seven years ago, and even this wasn't common, was that four core, eight THREAD CPUs (a la the Core i7s) had some justification over four core, four thread CPUs in some situations (such as Battlefield multiplayer), but not that they were by and large needed.
The "PlayStation 3/4 had eight cores and we didn't need it then" line is a false dismissal IMO. The PlayStation 3 had it, but it was so far ahead of its time, and the Xbox 360 only had a tri core or quad core if I'm remembering right. As for both consoles having it last generation, yes, that is true, but those were low clocked AND low IPC (Jaguar) cores, compared to the rather speedy quad cores or better on the PC side. Both consoles now have what are pretty much nearly fully fledged Ryzen 7 3700Xs. This doesn't mean I'm saying you'll need 8 cores and 16 threads in X amount of years, just that "they existed X years ago and it didn't cause Y change then" isn't necessarily DIRECTLY applicable.
That's your opinion.
But will you say the same for everyone who bought an RTX 2080 Ti to "future proof" themselves, only for Nvidia to release the 3000 series 2 years later with way better performance, considering how much money they paid to "future proof" themselves? Nobody could predict whether Nvidia would release, sooner or later, a new product substantially better than its previous generation, considering their still-large share of the GPU market. Don't come and say that every RTX 2080 Ti owner is stupid for spending that extra money if you support the idea of future proofing with a GTX 1080 Ti.
And as I mentioned, the same money spent to "future proof" can instead be saved for the future. That extra money can go toward later upgrades, which will likely end up costing about the same as a "future proof" system bought up front, except you get brand new components with a renewed warranty and better efficiency, versus buying "future proof" components and keeping them.
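A rough back-of-the-envelope sketch of that trade-off, with entirely made-up prices and upgrade timing (the figures are illustrative assumptions, not market data):

```python
# Rough sketch of "spend big up front" vs "spend mid-range now, upgrade later".
# All prices and timings are made-up placeholder assumptions, not real quotes.

def total_cost(initial_price, later_upgrades):
    """Total spend over the ownership period: initial part plus later upgrades."""
    return initial_price + sum(price for _, price in later_upgrades)

# Path A: buy a top-end "future proof" GPU now and keep it the whole time.
future_proof = total_cost(1200, later_upgrades=[])

# Path B: buy a mid-range GPU now, replace it with a newer mid-range card later.
staged = total_cost(500, later_upgrades=[("year-4 mid-range GPU", 500)])

print(f"Future-proof path:   ${future_proof}")  # $1200
print(f"Staged-upgrade path: ${staged}")        # $1000, and the second card is newer, with a fresh warranty
```

Under these assumptions the staged path costs no more overall, and the second card arrives with current-generation efficiency and a fresh warranty, which is the point being made above.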
There is no such thing as future proofing, because technology moves so fast and any tech giant can choose to release a product that alters the spec requirements or sets a new standard for gaming performance, or for whatever industry they are in.
Funny too, because you said you had an unlimited supply of arguments to back up what you say, but I have seen none of them.
"Future-proof" means something that lets you play games at the resolution/settings/frame rates you are fine with.
The more you spend now, the more years you will be fine in terms of gaming performance.
Anyway, the components I would consider most important are the CPU and PSU, while RAM can be added later.
The component to save a bit on might be the GPU, since after 4-5 years you could replace it with a better one.
If you are fine with 1080p gaming at 60 fps, you might want to buy a good non-K i7 (an F model, for example) and a mid-range GPU like the RTX 2060.
While a mid-range GPU is easily replaceable after 5 years, a CPU upgrade will be more expensive, since you will need to replace the CPU, motherboard, and RAM together.
I myself bought a strong CPU 7 years ago (an i7-4771) and I'm still able to play any game at 1080p high, 60 FPS. I could upgrade to a better mid-range GPU, but since there are no new-gen games I like, I keep my RX 580 4GB, which still runs all my games the way I want.
Then answer me this: how come those who bought GPUs like the GTX 1080 Ti (as commented above) 5 years ago are still gaming totally fine at 1440p, 100+ fps, still totally satisfied with the performance? I can see it easily lasting 7+ years (5 years so far plus 2 more in the future), if not more.
How is that happening?
Thanks.
Same goes for people who bought an i7 8700K 3 years ago. I can see it easily lasting at least 6+ years (3 years so far plus 3 more in the future), if not more.
How is that happening?
These would be the latest ATX power standard,
the latest USB standard,
and a CPU socket that seldom changes (Intel changes these often because they can't be bothered to build a long-term socket design with unused pins, unlike AMD has for a long time).
Realistically, it's better to buy mid-to-high rather than ultra-high or high end, because of electronics' rapid depreciation rate.
Yes, I agree that there is no guarantee. But there is a possibility that it will happen again, and the chance of it happening again is greater than the chance of it not happening, because I see no reason why game developers would raise their games' system requirements all of a sudden. It happens slowly and gradually over time at a steady pace, like we have seen over the past 20+ years of gaming. So, if that holds, then, for example, the flagship RTX 3080 will get the same lifespan as the GTX 1080 Ti (or close to it). The same formula applies to the CPU side of things.
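If you take the "requirements grow gradually at a steady pace" idea literally, you can sketch the lifespan estimate it implies. The growth rate and headroom numbers below are made-up assumptions for illustration, not measured data:

```python
import math

# Toy model of the "requirements rise gradually at a steady pace" argument above.
# The headroom and growth figures are illustrative assumptions, not measurements.

def years_of_headroom(performance_headroom, annual_requirement_growth):
    """Years until steadily growing requirements eat a card's initial headroom.

    performance_headroom: how much faster the card is today than the minimum
    needed for your target settings (e.g. 2.0 = twice as fast as needed).
    annual_requirement_growth: fractional increase in demand per year (e.g. 0.12).
    """
    return math.log(performance_headroom) / math.log(1 + annual_requirement_growth)

# Example: a flagship that launches 2.2x faster than needed, with demands
# growing ~12% per year, stays above the target for roughly 7 years.
print(f"{years_of_headroom(2.2, 0.12):.1f} years")  # ~7.0
```

The toy model only shows that, under a steady-growth assumption, two cards launched with similar relative headroom would see similar usable lifespans, which is the argument being made here.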
NVIDIA literally aims to replace rasterisation (the current rendering standard) with real-time raytracing; that's their end goal with RTX. You can't say it isn't, because it is. They can keep AMD at arm's length because AMD will likely never catch up in RT performance at this rate, since their first attempt ended up being worse than Turing's RT performance.
As GPUs become more powerful, you need a faster CPU to keep up with them. The 3080 and 3090 will get bottlenecked to hell by the majority of processors; you literally already need an overclocked Intel K chip (8th gen or newer) or a Ryzen 5000 to make proper use of those two GPUs. Anything less and you're not getting the full performance you paid for.
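A minimal sketch of the bottleneck argument, assuming the simplified model that a frame is only as fast as the slower of the CPU and GPU stages; the per-frame timings are invented for illustration, not benchmarks:

```python
# Minimal sketch of a CPU bottleneck, under the simplifying assumption that a
# frame can't be presented until both the CPU (game logic, draw submission)
# and the GPU (rendering) have finished their share of the work.
# The millisecond figures below are made-up illustrations, not benchmarks.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frame rate when the slower of the two stages sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

slow_cpu = 10.0   # older or low-clocked CPU: ~100 fps ceiling
fast_cpu = 5.0    # modern overclocked CPU: ~200 fps ceiling

mid_gpu = 8.0     # mid-range card: ~125 fps of raw GPU throughput
big_gpu = 4.0     # 3080/3090-class card: ~250 fps of raw GPU throughput

print(effective_fps(slow_cpu, mid_gpu))  # 100.0 -> already CPU-limited
print(effective_fps(slow_cpu, big_gpu))  # 100.0 -> faster GPU changes nothing
print(effective_fps(fast_cpu, big_gpu))  # 200.0 -> faster CPU unlocks the GPU
```

In this toy model, swapping the mid-range GPU for a 3080/3090-class card changes nothing while the slow CPU sets a 100 fps ceiling; only the faster CPU lets the bigger GPU deliver the performance you paid for.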