There might be some titles that would benefit from more cores/threads, but the extent and number of them wouldn't be major. This doesn't need upgrading unless you already know it does (the symptom would be stuttering in games due to near-100% CPU use, so if that isn't happening, the games you play don't need more). The newer generations aren't so much faster in individual core speed that a replacement is warranted, IMO.
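If you want to check whether you're actually CPU-bound before spending anything, one quick way is to log per-core usage while a game runs. Here's a minimal sketch using Python's psutil library; the sample count and the 90% threshold are arbitrary illustrative choices, not anything from this thread:

    # Rough CPU-bottleneck check: run this in the background while gaming.
    # Requires: pip install psutil
    import psutil

    SAMPLES = 30        # about a minute of monitoring at 2 s per sample
    THRESHOLD = 90.0    # percent; "near 100%" is the symptom to look for

    pegged = 0
    for _ in range(SAMPLES):
        per_core = psutil.cpu_percent(interval=2, percpu=True)
        # If even the least-busy core is near its limit, the CPU is
        # likely holding the GPU back.
        if min(per_core) >= THRESHOLD:
            pegged += 1

    print(f"{pegged}/{SAMPLES} samples had every core above {THRESHOLD}%")

If most samples come back pegged, more cores/threads would likely help; if not, the CPU isn't your problem.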
The GPU would be more impactful to upgrade, but also much, much more expensive: given your starting point is a GTX 1070 Ti, I'd consider an RTX 3060 Ti or RTX 3070 the minimum worthwhile target to upgrade to.
Wanted 8 cores because Warzone and some other newer titles I've been playing have maxed out my 9600K; I was running at like 100% CPU usage, and it sucked hard.
Just did a 9700K + Z390 + 16 GB 3600 CL18 kit upgrade for $370 after tax at Micro Center.
Having good RAM, like 32 GB @ 3200 or 3600, helps too, depending on motherboard support. Maybe a couple of SSDs, and a clean install of the latest version of 64-bit Win10.
Some folks are saying 12th gen i5's might get 10 cores, but I think that's a little optimistic.
If you're having trouble understanding cores and threads, this video does a good job of explaining:
https://www.youtube.com/watch?v=EjIjvwLt76Y
I wouldn't recommend upgrading to the 3060 or below, as the price-to-performance is poor compared to the 3060 Ti and above.
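To make the price-to-performance point concrete, this is the comparison being implied. The prices and performance index below are hypothetical placeholders, not real quotes, so plug in what your local market actually charges:

    # Hypothetical numbers purely for illustration; substitute real
    # local prices and benchmark indices before drawing conclusions.
    cards = {
        "RTX 3060":    {"price": 400, "perf_index": 100},
        "RTX 3060 Ti": {"price": 480, "perf_index": 125},
    }

    for name, c in cards.items():
        ratio = c["perf_index"] / c["price"]
        print(f"{name}: {ratio:.3f} performance per dollar")

With numbers like these, the 3060 Ti wins on performance per dollar despite the higher sticker price, which is the shape of the argument above.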
As for your CPU: wait for 12th/13th gen to come out and see what they have to offer. I have a feeling these LGA1200 mobos are going to get dropped the moment Intel releases 10 nm CPUs, which may be soon.
It's near impossible for a 1070 Ti to run today's AAA games at 60 FPS and 1440p, even with medium settings.
Yup, you need a 1080 Ti for that.
A 2080 Super or 3060 would be a worthy upgrade.
For the CPU, you might as well just keep the one you have and OC it. Get better cooling, etc.
When that CPU holds you back, then upgrade motherboard + CPU + DDR5 RAM.
+1
Frankly, I'm a little surprised that some users agree the 1070 Ti would be enough for 1440p with a minimum of 60 FPS.
I mean, even Watch Dogs 2, which was released back in 2016, doesn't run at more than 50-60 FPS at 1440p with ultra settings.
So I would say that AAA games from 2013-2018 (generously) would run fine on the OP's setup, but everything after 2018 is too much for a 1070 Ti to handle.
CPU is fine though.
These days, the minimum you'd want when buying is a hex-core with SMT, but if you already have one without it, it's not so bad that you necessarily need to replace it either.
And what people are probably referring to with the next Core i5 having 10 cores is the extra efficiency cores being added. For those unaware, Intel is adding small efficiency cores alongside the traditional performance cores, in what some have unofficially dubbed the "big.LITTLE" approach (the small cores are still x86, by the way, not ARM). If the i5 remains a hex-core as it is now and gets 4 efficiency cores added (that seems to be the standard amount from what I've heard, but I haven't kept up with the rumors), that would make it a 10-core CPU.
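Spelling out the core/thread arithmetic for such a hybrid design (the 6+4 split below is the rumored configuration mentioned above, not a confirmed spec):

    # Hybrid CPU math, assuming 6 performance cores with SMT (2 threads
    # each) and 4 efficiency cores without it. Rumored, not confirmed.
    p_cores, e_cores = 6, 4

    total_cores = p_cores + e_cores          # 6 + 4 = 10 cores
    total_threads = p_cores * 2 + e_cores    # 12 + 4 = 16 threads

    print(f"{total_cores} cores, {total_threads} threads")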
Huh? I only saw one user specifically mention anything about 60 FPS at 1440p, and depending on settings, that may be possible.
That's sort of beside the point anyway. Aside from that one comment, neither the OP nor anyone else implied a goal of never dropping below 60 FPS. What people are saying is that the card still CAN BE solid overall, and that to get a meaningful increase you'd need to move to a rather higher-end card, which in today's market would be expensive. So the advice is: either pay up for that increase, or stay with what you have for a while. That is what most people are saying, and it is absolutely true.
For the time being, I'm getting wildly varying FPS in Kingdom Come: Deliverance on the ultra preset, but this is understandable given the experimental nature of ultra in KCD. It even says 'for future hardware', which mine clearly isn't from the perspective of 2018. So, for example, I get 55 FPS out in the woods or in villages, but as low as 30 in cities, sometimes 26-28 when that's combined with significant weather effects at night, generally involving light (which is consistent with how this particular game tends to gain many frames when you reduce shadows in the settings).
My backlog includes games like Grid 1, Eisenwald, and Ember, some newer but still older titles like Grid 2, Dirt 4, and Pathfinder: Kingmaker, and perhaps eventually something like Dirt 5, Jedi: Fallen Order, or Outer Worlds. Truly new releases would be Mass Effect 4/5 and Dragon Age 5, which are probably not going to release before Q3 2022, possibly 2023. And then I'm also hoping to play some Neverwinter Nights: Enhanced Edition modules.
As a result, I'm not likely to be truly hard-pressed to upgrade, especially if I overcome the urge to play the ME and DA sequels as soon as they release. And I might, because waiting for patches and DLC is the reasonable thing to do. Playing GOTY editions and the like has the advantage of not needing to restart or backtrack significantly, not missing out on DLC anchored to earlier parts of the storyline, not having to replay the final battle, etc. So this perhaps mostly comes down to defeating the urge.
On the one hand, while 40 FPS doesn't feel the same as the 75 my monitor can do, I can mostly live with the 40. On the other hand, there's clearly much room to upgrade from 40 to 75, or even just 60, let alone for games newer and more demanding than KCD. Or think of the TW series.
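For a sense of why that gap is perceptible, the frame-time arithmetic (1000 ms divided by FPS) is worth spelling out; this is plain math, not data from any particular game:

    # Frame time shrinks non-linearly with FPS, which is why 40 -> 60
    # feels like a bigger jump than 60 -> 75.
    for fps in (40, 60, 75):
        print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")

    # 40 FPS = 25.0 ms, 60 FPS = 16.7 ms, 75 FPS = 13.3 ms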
So in a normal situation, with prices close to MSRP, I would probably grab a 3070 Ti or 3080 near the manufacturer price and call it done. But then I thought the CPU could be the wiser path this once, because CPU prices are less inflated than GPU prices. However, both Z490/Z590 and X570 mobos are off-puttingly expensive right now. More and more people are saying bad times lie ahead for CPU availability and pricing, so perhaps nailing a 12700K and a decent mobo at release (before the prices manage to rise) could be the solution. Not the ideal one, but a viable one.
However, I cringe at the elevated mobo cost and the cost of going DDR5.
So maybe the better way is in fact to go for a GPU upgrade and forgo a CPU upgrade for the time being, easily saving some money for the GPU by delaying the platform change.
And yes, I'm close to 5 GHz on this one. On a lower-end (Strix-H) Z370 mobo, auto settings give me 4.8 GHz sustained and 4.9 GHz max boost at around 1.5 V, which my cooler keeps manageable. The cooler is a TC14PE, a double tower just marginally better than Noctua's D15, fitted with P14 fans.
Right now, my normal mobo (a Z390 Aorus Pro) has just come back from repairs, so I'm going to do the overclocking the right way, with proper optimization rather than rushing through the process and sticking with auto results. I don't know how far the chip can go, but temperature won't be a problem. I could buy a Liquid Freezer II 420, but that doesn't seem necessary. Maybe I'll just grab an AIO when I actually change platforms.
I know the 9600K tends to bridge the gap toward the 9900K and toward 10th gen when overclocked, though I'm also aware it doesn't catch up fully. Still, if people get good results combining an 11400 with a strong GPU like a 3080, then obviously I can do almost as well with a 9600KF.
So maybe, yeah, I'll economize by leaving the platform alone and put the money in the GPU jar. And sell the 1070 Ti.
I'm sure the non-Ti 3060 or the 6600 XT would be a nice upgrade, but yes, far from ideal for modern 1440p. It probably has to be at least a 3070 or a 6700 XT, preferably Nvidia due to DLSS and RT.
The one exception is that I'm subscribed to a store that sometimes sells the 6600 XT astonishingly close to MSRP, to the point that, after selling my 1070 Ti, I'd be out just $100, maybe $150, while getting a warranty back in case the card dies on me while I wait for prices to come down.
As far as I can tell, the 3080 is the best bang for the buck right now, but I've never spent the kind of money an --80 card costs at MSRP, and certainly not something to the tune of fifteen hundred.
I get the idea of wanting to change the CPU given GPU prices, but changing the CPU alone won't make as large a difference for gaming in broad terms. I'm not saying it's a bad consideration, mind you, but be aware you'll still also want a GPU upgrade later. I went from a 2500K to a 3700X, and even in a CPU-heavy game like Minecraft it didn't really let me do much that I couldn't before (though given the random nature of that game it's hard to get exact like-for-like results to compare, so this is just going off blind playability).
As for games that say they're aimed at future hardware, or honestly even ones that don't: a lot of the time, things like "ultra" settings or very-high-resolution textures are there partly for 4K, or for those with performance to spare, and they tend to buy relatively small image-quality gains at a disproportionate increase in required performance. So they're only worth it if you have the hardware to spare, not something you should chase as the default goal, IMO. This is why I loathe today's mindset of "it can't do ultra in certain titles, so it's not good enough": it misses the forest as a whole for a few standout trees, which themselves work just fine if you realize you can (the horror!) turn settings from ultra to very high and not really miss anything. Even medium often changes little, but people act like it's the equivalent of putting everything on low.
It sort of sounds like you have your mind made up, though. You feel the best idea is waiting a year or two and hoping new generations and changes in market factors lead to better gains from a future GPU purchase, but you seem hung up on whether a CPU upgrade is wise, since you have the funds and the itch to do something now. My opinion: if you have the money but don't want to waste it on a new platform because a GPU will do more, either swap in a better CPU on your existing platform rather than replacing the platform outright (it'll cost less and waste less), or just go for the GPU upgrade now despite it being an extra money sink, but only if you're willing to part with what today's prices demand. If you go that route, I'd avoid the RTX 3060: not only is it a relatively poor value (this is coming from a firm mid-range buyer who prioritizes value), but the RTX 3060 Ti, while more expensive, is a bit better of a buy, especially given your starting point. If the RTX 3080 is too expensive, I'd go with the 3060 Ti instead.