I'll list them out, for the millionth time:
1. Motherboards change and lose compatibility within 2 to 3 generations, depending on brand and circumstances; sockets can change, chipsets have limited microcode space, which makes BIOS compatibility a bit tricky (nobody can complain except the people who actually have to write that code; it's not simple to do), and other tech standards can and will change. DDR5 is expected within the next few years, as is a new socket for AMD.
2. Processors are changing much faster than in previous years, when it was only Intel selling to the masses because AMD's FX chips and APUs were a total joke in comparison. There was no competition before; now there is.
Zen came out in 2017 and it's already pretty much obsolete for high-performance gaming; similar story for Zen+ (a.k.a. Zen 1.5).
Zen 2 is already proving problematic for high-performance gaming above the 2080 Ti/3070 level, as there's very little gain from a 3080 or 3090 with those CPUs, and pre-Zen 3 CPUs do not support SAM due to an architectural limitation that, ironically, Intel CPUs haven't had since Haswell.
The ONLY case for futureproofing CPUs is that Intel barely makes any progress at all, so it's not even worth upgrading from a 9900K.
3. DDR5 is coming, so DDR4 will inevitably be phased out, and DDR5 will not fit on DDR4 motherboards.
4. The 2080 Ti went from top end to upper mid-range overnight because of Ampere. By the time the RTX 40 series (or whatever they call it) arrives, the 2080 Ti will basically be the bottom of the RTX lineup, if not weaker than anything RTX offers by then, if NVIDIA has to push that far due to Radeon's head-turning comeback.
The only parts of the system that are truly futureproof are the things that can actually last 10+ years because they don't contribute directly to the system's performance. But keep telling yourself that it's not true. It doesn't change the simple fact that futureproofing is BS.
Can you name several games in particular that benefit from having an 8 core CPU instead of a 6 core CPU?
I ask because a few days ago I looked at some benchmarks for a game that I typically think of as processor-intensive (Civilization 5 and/or 6), and noticed that the AI turn time was pretty much the same with the 6-core Ryzen 5600X as it was with the 8-core Ryzen 5800X.
So far in this thread, only two games were mentioned that are claimed to actually use the additional cores (Death Stranding and some Assassin's Creed game).
And those are all false claims. There are ABSOLUTELY no games out there that scale above 6 cores. If anyone continues to argue that point, I can produce UNLIMITED proof from dependable sources.
And, talking about future-proofing proof? I can see at least two GTX 1080 Ti owners who commented in this thread above^, saying that they are still TOTALLY happy with their cards, which satisfy their needs for 1440p high-fps gaming even 5 years after purchase. That's the PROOF of future-proofing in real time, in front of your eyes^. And those cards will continue to serve for many more years. The same applies to the i7 8700K.
Those are the facts & proof that future-proofing exists. In real time, happening now, in front of your eyes. In people's lives.
Just because someone is happy with their purchase doesn't mean that futureproofing exists. Nor does building a PC to last.
I've explained before: the premise of futureproofing is making something last forever (or until it breaks because of some external force).
You physically cannot do that with computer hardware; it will constantly progress and evolve, sometimes catching snags (like Intel's 2nd-7th gen CPUs, essentially rehashes of the previous generation), but that hardware won't last you forever, and you shouldn't expect it to.
If you think that at some point your hardware is going to be unusable, or going to do less than what you expect of it, then you haven't got a 'futureproof' computer.
I'll refer to my previous questions.
If you want to argue that you can build a computer to last, I'll back that; I agree, you absolutely can. However, futureproofing isn't the same as, or synonymous with, building to last.
I'd like to be clear: I understand what they mean to say, and I agree it's possible; it's just that their terminology is wrong, which leads to misunderstandings among people with less knowledge of the subject. Misinformation isn't the way to get people to do things.
You CAN build a computer that will last you a reasonable amount of time, half a decade is possible.
Well, it would depend on the architecture of the CPU, and the clockspeed of the CPU(s) in question.
As someone who's used an i5 for 5-6 years now, I can say I wish I'd gotten an i7; it would have lasted me better and would be giving me better performance in games now.
So I recommend more cores/threads, but clockspeed (if the architecture is the same) is just as important.
You would have to look at which games you want to play and whether they favour cores or clocks more.
And more threads always help with making it last; if a CPU doesn't have Hyper-Threading (or SMT), I wouldn't ever recommend it, because it will have a shorter useful life than one that does.
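If you're unsure whether a CPU you already own has SMT/HT, here's a minimal sketch in Python, assuming the third-party psutil package is installed (pip install psutil):

# Quick check of physical cores vs hardware threads on your own machine.
# Requires the third-party psutil package (pip install psutil).
import psutil

physical = psutil.cpu_count(logical=False)  # real cores
logical = psutil.cpu_count(logical=True)    # hardware threads (doubled with SMT/HT)
print(f"{physical} cores / {logical} threads")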
Remember, you run more programs than just a game (and its launcher), and Windows does stuff in the background. So I would advise getting more cores than the games want.
That also leads to more consistent frametimes (the smoothness of the game), because nothing is fighting over resources.
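To make "consistent frametimes" concrete, here's a rough sketch of how you could compute the usual smoothness numbers from a frametime log. The sample data is fabricated; real frametimes would come from a capture tool such as PresentMon or CapFrameX, and "1% low" here means the average FPS over the slowest 1% of frames, which is one of several definitions reviewers use.

# Rough sketch: turning a frametime log (milliseconds per frame, e.g. a
# column exported from PresentMon or CapFrameX) into smoothness metrics.
# The sample data below is fabricated for illustration.

def fps_metrics(frametimes_ms):
    """Average FPS plus 1% and 0.1% lows from per-frame times in ms."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s

    # "1% low" here = average FPS over the slowest 1% of frames.
    worst_first = sorted(frametimes_ms, reverse=True)

    def low(fraction):
        n = max(1, int(len(worst_first) * fraction))
        return 1000.0 * n / sum(worst_first[:n])

    return avg_fps, low(0.01), low(0.001)

if __name__ == "__main__":
    # 990 smooth ~7 ms frames plus 10 stutters of 25 ms each:
    sample = [7.0] * 990 + [25.0] * 10
    avg, low1, low01 = fps_metrics(sample)
    print(f"avg {avg:.0f} fps, 1% low {low1:.0f} fps, 0.1% low {low01:.0f} fps")

With that sample the average is ~139 fps but the 1% low is only 40 fps, which is exactly the "high average, stuttery feel" that core contention produces.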
There is also the trend of games starting to use more and more cores/threads.
I would say that if you want a computer that lasts and has good performance, the Ryzen 5800X would be a fine choice.
The 3600X is a good choice too, but it won't last quite as long.
As for the Civ benchmarks, if I'm not mistaken, Civ likes clocks more than cores (though cores can have an impact), and the 5600X and 5800X have similar clocks, so the performance should be about the same.
https://www.overclock3d.net/reviews/software/death_stranding_pc_performance_review_and_optimisation_guide/4
You did not mention which Assassin's Creed game you want to see. If it's Assassin's Creed Origins, then it's a 3-year-old game and not CPU-heavy at all; it scales the same across all CPUs.
For old games, it's hard to find benchmarks with newer hardware.
https://www.techspot.com/article/1525-assassins-creed-origins-cpu-test/
Future proofing is a pretty speculative concept. The only known constant the future will hold is that it'll cost more than it does now to do the same things.
Repeatedly explaining the meaning of the word "future-proof" in your comments to defend your own position (and to prove the other person wrong) is not a great tactic at all. Everybody here understands very well what "future-proof" means according to the dictionary. But when we talk about "future-proofing" in computing hardware, it means the SAME as LONG-LASTING (or very close to it).
When people come here to seek advice about buying hardware and use the word "future-proof", it does not mean that they want their PC to LAST FOREVER, which is what the word literally means.
So I suggest you keep your English dictionary at home and try to understand the actual point being raised by the OP and the other users. Thanks.
you're not wrong
but I don't think autumn and some of the others mean any ill intent. see my dewlap in the picture?
i chose this photo for a reason: all my friends in IT do the same thing when it comes to being factually correct, even if the user's computer or the server or the network is now down and everything is still broken in the end, which isn't the outcome anyone wanted.
It bears repeating that sometimes what someone asked for isn't what they wanted but they pretty much knew what they intended when they said it.
but they get support for what they asked for, even if there are differences in that interpretation.
so hey, my advice is: get a workstation-grade motherboard and upgrade the cpu in four years. 3 years after that, everything that was too risky to upgrade to will be available cheap, and you can probably look at upgrading then, or put it off for another year or two.
unless your OS arbitrarily pulls support for it overnight.
It is not just about "someone is happy". Quoting only a portion of a sentence to gain an advantage in YOUR favour is not a sign of honesty, Mr Autumn. You can't be a HERO that way. Btw, if you read the full paragraph (comment #64), you will understand why it is called future-proof.
It is called future-proof because those GTX 1080 Ti owners who commented in this thread are happy: the cards are still successfully satisfying their need for 1440p high-fps gaming, even 5 years after purchase. And they will continue to serve for many UPCOMING years as well, because there is still plenty of life remaining in that hardware. The same applies to the i7 8700K. That's why they are called future-proof.
That's the PROOF of future-proofing in real time, in front of your eyes^, happening now, in people's lives.
It has NOTHING to do with happiness; it's called comparing hardware then and now. Simple.
In the past, when AMD wasn't a strong competitor, things moved slowly, but not anymore. It's a simple concept that people like you just refuse to accept so you can feel better about having older hardware and calling something high end when it's not anymore.
https://www.youtube.com/watch?v=vVjdhXAdKE0
I can also produce UNLIMITED proof that some games do scale above 6 cores; it's just that there are diminishing returns in price to performance. Also, some people do more in the background than just gaming on its own.
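Those diminishing returns fall straight out of Amdahl's law: if only part of each frame's work can run in parallel, extra cores add less and less. A toy illustration in Python, where the 60% parallel fraction is an assumption made up for the example, not a measurement of any real game:

# Toy illustration of diminishing returns from extra cores: Amdahl's law.
# speedup(N) = 1 / ((1 - p) + p / N), where p is the fraction of each
# frame's work that can run in parallel. p = 0.6 is an assumed figure
# for the sake of the example, not a measurement of any real game.

def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (2, 4, 6, 8, 12, 16):
    print(f"{cores:>2} cores: {amdahl_speedup(cores, 0.6):.2f}x speedup")
# With p = 0.6, going 2 -> 4 cores gains ~27%, but 6 -> 8 gains only ~5%:
# the curve flattens, so games can scale above 6 cores yet barely show it.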
If you are so focused on "future proofing", you might as well tell every user to buy an 8-, 12-, 16- or even a 32-core now, since, you know, a game is eventually going to "use more cores" anyway. For instance, anyone who bought an X99 board and an 8-core 5960X in 2014-2015 loses to a Z390 and 9900K in gaming performance and efficiency (the only benefit is more PCIe lanes); the same amount of money spent to "future proof" would have been better saved to sell that old PC and get a new one with better IPC, efficiency, node process, etc.
Plus, you ought to know that the 8700K was released more to compete with AMD than to actually stay ahead of the curve the way Nvidia is doing, considering Intel's 14nm was clearly built for 4 cores, not 6 and above.
That's a no-name fake video. There is no description of which CPUs were used. Are they a mixture of Intel and AMD? Which generations? The only thing it gives is the core counts. What a joke.
And of course you're going to call it fake, because it's pretty obvious that you're in denial and won't accept visual evidence that refutes your claim.
And nice job ignoring the last part, where he suggested that if you were in favour of futureproofing, you'd recommend more cores. Your argument about cores contradicts your own opinion on futureproofing.
Are you that deep in denial?
Why not provide real evidence, then, that's 100% yours, with your own benchmarks?
More importantly, man up and report my post as fake information, or go to that YouTube video and accuse it of spreading fake information.
And even more importantly, that video has 1.2 million views, so I guess they must have received fake information too.
I don't need to tell anyone anything. Anyone with minimal knowledge of computer hardware can tell you that this video is fake.
Any hardware reviewer who is even remotely legit would put their test rig's specifications up front, before conducting the benchmark tests. And here, in this video, he is testing CPUs.
And the CPU's name is absent. LoL.
Says it all.