Yet the core scaling is consistent with other benchmarks. Quite a few newer games do show minimal increases above 6 cores, but increases nonetheless.
Are you going to address the fact that you're in favour of the idea of "futureproofing", yet you argue against having more cores because it's "pointless"? Or are you just going to ignore that?
So, in other words, you're saying we don't know what the heck we're talking about, and whatever we say or even share is false information.
If your information is so accurate, as if you're a tech god, why not provide the UNLIMITED information you claim to have?
https://www.youtube.com/watch?v=iZBIeM2zE-I
Oh no! Look, LinusTechTips posted a video showing the performance of multi-core CPUs, specifically comparing the Ryzen 5000 series and Intel's 10th-gen Core CPUs. Look at the scaling performance of the i5-10600K, which has 6 cores, and the i7-10700K, which has 8 cores. Must be fake, to you. Lmao
He must be wrong too and we are all stupid too.
At this point I'm just assuming you're a troll, or if not, a techie in serious denial.
That's because of clock speeds, not core counts. The i7 has a 400 MHz higher clock speed than the i5 out of the box. That's the reason the i7 is faster, not its core count.
If you overclock the i5-10600K, it will perform the same as the i7-10700K.
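As a back-of-the-envelope check on the clock-speed argument, here is a sketch of how much FPS a frequency gap alone could explain, under the naive assumption that a fully CPU-bound game scales linearly with clock speed. The clock and FPS figures below are illustrative placeholders, not measured boost behaviour of either chip:

```python
# Naive linear-scaling estimate: FPS gain attributable to clock speed alone.
# Assumes a fully CPU-bound workload that scales 1:1 with frequency
# (real games rarely do; treat this as an upper bound on the clock effect).

def fps_from_clock(base_fps: float, base_ghz: float, new_ghz: float) -> float:
    """Projected FPS if performance scaled linearly with core clock."""
    return base_fps * (new_ghz / base_ghz)

base_fps = 144.0            # hypothetical CPU-bound FPS on the slower chip
i5_ghz, i7_ghz = 4.5, 4.9   # assumed all-core clocks, 400 MHz apart

projected = fps_from_clock(base_fps, i5_ghz, i7_ghz)
print(f"Clock delta alone projects {projected:.1f} FPS "
      f"(+{projected - base_fps:.1f}, ~{100 * (i7_ghz / i5_ghz - 1):.0f}%)")
```

Under those assumed clocks, a 400 MHz gap works out to roughly a 9% ceiling, which is the kind of number worth comparing against the differences seen in the videos.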
The only way I can see L3 cache being the culprit in every single instance this has been tested is with a Ryzen 9 processor, for example the 3900X. In the 3900X, each CCX has 16MB of L3 cache, for 64MB in total. If you disable CCXs, or only allow a program to use a certain number of cores, some of the total L3 cache is left out entirely... but there's still some gain between 8 and 12 cores, which implies there are games out there that can use at least enough cores to reach that last CCX and benefit from its final 16MB of L3 cache.
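The CCX arithmetic above can be sketched out directly. This assumes the topology described in that post: a 3900X with 4 CCXs, 3 cores and a 16 MB slice of L3 per CCX, where disabling a CCX removes its cache slice entirely:

```python
# Hedged sketch: accessible L3 cache on a Ryzen 9 3900X as CCXs are disabled.
# Assumes 4 CCXs, each with 3 cores and its own 16 MB L3 slice; a disabled
# CCX takes its slice with it, since L3 is local to the CCX.

CORES_PER_CCX = 3
L3_PER_CCX_MB = 16

def accessible_l3(active_ccxs: int) -> int:
    """Total L3 (MB) reachable when only `active_ccxs` CCXs are enabled."""
    return active_ccxs * L3_PER_CCX_MB

for ccxs in range(1, 5):
    cores = ccxs * CORES_PER_CCX
    print(f"{ccxs} CCX(s) -> {cores:2d} cores, {accessible_l3(ccxs)} MB L3")
```

So a game that spreads onto the fourth CCX (cores 10-12) also picks up the last 16 MB of L3, which is why core-scaling and cache-scaling effects are hard to separate on this chip.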
No matter what you say, there is still clearly a benefit. Extremely small, but performance-seeking users don't care, as long as they get every ounce they can.
https://www.youtube.com/watch?v=W8xC2VellUg
Wrong.
Look at Fortnite, CS:GO, Battlefield 5 at 1080p.
Also, despite the i5-10600K clocking 0.1 GHz lower than the i7-10700K, that's not going to make a 10-20 FPS difference in Fortnite or Battlefield 5.
And again, as I mentioned, some people do more than just game only.
And also, where is your unlimited supply of arguments, considering you're calling us fakers?
And another also: This video is fake, and anyone who believes this is stupid, to you, right?
Where did you find a 20 FPS difference in this video? I see the exact same FPS between the i5 and i7.
The video doesn't give you any results; it just shows the FPS changing every second in real time. Sometimes the i5 is faster, and sometimes the i7 is. Try pausing at several different moments and you'll see it too.
*facepalm*
And fine, I can drop CS:GO, since Valve's Source games aren't optimized for multithreading.
Sure, the difference between a 4-core and a 6-core isn't massive. But you need to stop going around calling people stupid and fakers.
And again, some people may run more things in the background, though realistically they probably aren't going to run Valorant and Warzone at the same time, since the GPU is the bottleneck by then (tested on my own 9900K system).
I'd still like to see more related material. (Because, honestly, I've only seen that one slide that scales to 24 threads.)
The second benchmark you've provided is GPU-bottlenecked (evident from the lack of difference between the CPUs), with settings at max.
I have agreed with the point you're trying to make, that you can build a computer to last.
I'm saying that you're using the word wrong, and it's not what you think it means.
There is no futureproofing.
He's not wrong about what he's actually saying; it's just his terminology that's wrong. It leads to misinformation and misconceptions, something we shouldn't be spreading.
I may come off like a bit of an ass, but I don't mean anything bad by it. I just don't like people spreading misinformation, which is something we can all get behind, right?
I only quote portions when I want to draw exact focus to them; in this case the quote was from my own post, and those were unwanted questions.
You say it's not about them being happy, but then you say it is. You contradict your own statement.
And you have gotten your facts wrong, one person said they game at 1080p, so both cannot be at 1440p.
It will be fine for a few more years, I agree, but it's still only a xx60-class card (performance-wise) now, so realistically it'll last less time than the 3060, because the 3060 will have more support and more features.
The 1080 Ti was a great card, and it has lasted a long time, I agree. But it's by no means futureproof, because at some point it will become useless, or thereabouts. You cannot have something be futureproof and have it become useless; it's a contradiction.
The CPU used in this benchmark/test was a 3900X with SMT disabled.
I'm pretty sure they disabled cores in the BIOS too, so they could produce realistic performance for each configuration. But that's speculative.
How exactly is it a fake video? It's a recording of the same scene with different core counts, showing clock speed, cores, usage, framerate (average, 1%, and 0.1% lows), and frametimes.
Games do scale; there are just heavily diminishing returns. Now, you can claim 1-5% scaling isn't much, and I'd personally agree, but that doesn't mean there's no scaling.
Sure, you can argue about testing methodology and whatnot, but it's still evidence; it can't be dismissed out of hand.
It's strange, you're the only one here that seems to say the video is fake.
They did put the specs in the description, you have to click more and scroll to the bottom.
He's testing core scaling, not a CPU. The CPU, provided it has enough cores, isn't as important. But I understand your reluctance; this is information I'd expect to be in the title, or on screen.
Average FPS isn't very telling, but in a couple of the games (Fortnite and PUBG) the FPS was higher by about 25.
Now, that can be attributed to things like cache, like Vadim said.
Also, worth noting, there is a large difference in the 1% and 0.1% lows, which matters more than any average FPS does, since it directly relates to how stuttery the game will be. With more cores there are better lows, so the game will play smoother. (Evident in both this video and the one you called fake.)
There is a large difference between 4 and 6 cores, not so much between 6 and 8. (With, or without SMT/HT.)
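Since the 1% and 0.1% lows keep coming up, here is a sketch of how these numbers are typically derived from a frametime log: sort the frametimes, take the slowest 1% (or 0.1%) of frames, and report the average FPS over that slice. Exact methods vary between capture tools, and the frametime log below is made up purely to illustrate why averages hide stutter:

```python
# One common way to compute "1% low" / "0.1% low" FPS from frametimes.
# Methods differ between tools (some use a single percentile frame instead
# of an averaged slice); this is an illustrative sketch, not any tool's spec.

def lows(frametimes_ms: list[float], percent: float) -> float:
    """Average FPS over the slowest `percent` of frames."""
    worst = sorted(frametimes_ms, reverse=True)  # largest frametime = slowest
    n = max(1, round(len(worst) * percent / 100))
    slice_avg_ms = sum(worst[:n]) / n
    return 1000.0 / slice_avg_ms

# Made-up log: mostly ~7 ms frames (~143 FPS) with a handful of stutters.
frames = [7.0] * 990 + [40.0] * 2 + [20.0] * 8

avg_fps = 1000.0 / (sum(frames) / len(frames))
print(f"average : {avg_fps:.0f} FPS")
print(f"1% low  : {lows(frames, 1.0):.0f} FPS")
print(f"0.1% low: {lows(frames, 0.1):.0f} FPS")
```

With that log the average barely moves from 143 FPS, while the lows collapse to around 42 and 25 FPS, which is exactly why the lows track perceived smoothness better than the average does.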
I have an i5-6600K, and its performance is pretty poor in a lot of games, because games just like to use a lot more than they used to.
People do run programs in the background. I personally don't, but apparently people run browsers in the background (on a second monitor?), plus Discord/TS, antivirus, and whatever else Windows wants to do.
This is reason enough to get a CPU with 2 more cores than you need for your game; it gives you some headroom for other programs.
I'd bet the same now goes for 8c/16t CPUs. 6 cores is utterly fine for now, but let's see how the situation looks after 2-3 years. 8 cores could be beneficial even sooner, when PS5 ports start to hit the shelves.
You can build to make something last, but it takes a bit of foresight into what you know your uses are, some risk on gambling how things go in the future, and trading off performance now (in short, getting less of a GPU to get more of a CPU/platform). This is literally more likely than ever, though, as CPUs last far longer than they used to. The big thing is how well things parallelize, so getting more cores helps (but again, requires more budget invested into CPU and less towards other things). IPC gains have slowed and I don't expect rapid breakthroughs like in the past here (at least not as consistently, although hopefully for the sake of progress it does occur more than it has these last number of years).
That's really all there is to it; it's a balancing game. I upgraded from a Core 2 Duo (E8600) to a Core i5 (2500K) around 9 years ago, probably a bit sooner than I "needed" to. Just so happened that core count AND IPC scaling slowed a lot after that. Combined with the fact I doubled (quadrupled?) down on RAM at 16 GB in 2011, it let my platform last, just doing GPU (and storage) upgrades. Something like a Zen 2 or better 8 or 12 core CPU likely has a good chance of lasting a really long time, but whether that's the best route to go is subjective; some would rather get a faster and/or cheaper lesser core CPU, and upgrade between that time for a faster CPU yet later. It's rather subjective and you have to ask yourself what your own needs and long term budget are and go from there.
That's exactly what we are seeing here, happening right now. Those who bought a GPU or CPU like the GTX 1080 Ti or i7-8700K are still totally satisfied with the performance, still fulfilling their needs, many years after purchase. That's why they are future-proof. That's what the OP (or other users who come seeking advice) are looking for, or mean by "future-proofing". It exists, and it's happening in front of your eyes.
Are you sure owners of MacBook Pros with i7/i9 CPUs are still satisfied?
Maybe I'm missing your point, but 7 years ago there was no need for 8 cores in gaming, nor is it really beneficial at the moment. What I predict is that 8 cores will be good for gaming within a few years, as games start to utilize 8 cores. The PS5 was mentioned because it has 8 cores, and it's possible that PC ports of PS5 games could benefit from an 8-core CPU as well. Of course, that's something we'll only see later on.
But that's just my bet, based on experience with previous generations. I've always bought a bit extra in terms of CPU, and it has lasted way longer than buying just what I'd currently need. The i7-6700K was a good bet in the past and is still okay. Now my bet is on the i7-10700K, and hopefully that will last too.