I did the exact same upgrade last March (from an i5-4590 to a Ryzen 7 2700). Kept the same gpu for the time being (an RX 470). Saw a big fps improvement in Just Cause 3, Batman Arkham Knight, and Assassin's Creed Unity / Syndicate / Origins / Odyssey. In heavily populated areas in Odyssey, the 2700 routinely passes 50% utilization.
Next-gen consoles are going to be based on 8-core, 16-thread Ryzen CPUs, so I expect games' CPU demands to go up, not down. Unless there are new DirectX optimizations on the way, or developers jump en masse to Vulkan (and properly implement it, like the Rainbow Six Siege dev team seems to be doing).
Overall, buying that i5 was a poor choice from the very start. If you'd had, say, a 4790K and a half-decent motherboard, you wouldn't have had such issues.
Overall, not a bad upgrade choice for you. And yes, modern CPUs, even budget-friendly Ryzen chips, allow for more throughput, which lets GPUs perform better than on older platforms. With the extra cores and RAM, you won't struggle with multitasking while gaming at the same time.
I still use my own simple trick of flushing my RAM after playing games, or after closing demanding apps, if I don't see the RAM come back as free like it should.
IObit Smart RAM
and Intel Burn Test are all that's needed to do that easily and quickly.
The advice on the i5 wasn't bad advice when it was a new chip. Most felt the extra threads on the i7 weren't needed for gaming, and in some titles back then HT could actually hurt performance.
That said, many others and I called it back then when we said the i5s would die off long before an i7 became irrelevant, mainly due to the extra threads, and now we are seeing that happen. It was easy to see even back then, as game devs were *just* starting to push hard into 6-8-threaded territory after having spent the previous few years on the then-new consoles with 8-core AMD CPUs that had poor IPC, forcing the devs to thread out for any compute ability.
Now, with the Ryzen chips, multi-core is the next big deal.
We are probably only 3-5 years from affordable consumer-level (not HEDT) CPUs in the 64C/128T range, unless something happens to stop the progress. AMD is pushing such a chip into their HEDT platform via Threadripper this year. Consider for a second that chips comparable to their first-gen Threadripper are now on their consumer X570 platform...
Worst case, we get a lull and an 8-10 year span out of 16-32C chips before they jump it up at the consumer level, but I don't see why they would lose the momentum they have going.
I'm making a similar upgrade shortly; the parts are in, but I still have to assemble it: 4790k > 3900x.
My view was that no matter how well the i5s OC, you can't magically get the extra threads when the time comes that HT plays a bigger role. I put my money where my mouth was by getting an i7-3960X (6C/12T), and later on an i7-4770K as a backup unit. I figured that when game devs started optimizing games for more threads, HT/SMT would prove to be useful.
Even now, my 3960X is able to handle all the games I've thrown at it with ease, despite getting long in the tooth architecture-wise. I have it combined with a GTX 1080 and 16GB of DDR3 2133 RAM, and all modern games run pretty well at 3840x1080.
I also have a 3900X + Aorus Xtreme + 16GB DDR4 3733 RAM (plus 32GB DDR4 3200 CL16 on standby should 16GB become insufficient) and a Vega 64 Red Devil (yes, I know the Vega 64 is the bottleneck, but I'm waiting for Ampere or Big Navi before I upgrade my GPU later this year). Right now I can play games at medium-high settings at 3440x1440, and I'm happy enough with the performance I can eke out of the Vega 64.
As you can see, I prefer to stay a little ahead of the curve: when most are going for 8C/16T, I hope the 12C/24T of my 3900X can tide me over for a longer time.
Back then, the i5 had better value. Now, after 8 years, the i7 has aged slightly better, but does it really matter? As an example, I moved from an i7-3770K with a decent board, and I've had an experience similar to the OP's. Extra threads are useful, but they only provide ~25% more performance at best in an ideally multi-threaded app.
Now, something like an 8C/16T Ivy Bridge CPU would have been entirely different (about the same as an 8C/16T Ryzen 2***), but those only existed for the server market and were impractically expensive. To the point where getting a cheap 4C/4T i5 and then upgrading to something new when it became limiting was the far better alternative.
I also chose... let's say a different option. Specifically, an old Ivy Bridge server chip. It was cheaper than Ryzen back when I bought it, I also wanted the "server" features it provides (ECC, 40 PCIe lanes) as my PC is used for some work apart from games, and the performance is kind of surprising[www.cpubenchmark.net]. So yes, for someone who bought a high-end 2011 i7 (did 8C models exist?) back then, all that "Ryzen is great" hype must have looked funny.
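That "~25% at best" figure lines up with the usual back-of-envelope model for SMT: the sibling logical core shares execution units with the physical core, so it adds only a fraction of a core's throughput. Here's a minimal sketch of that arithmetic; the 0.25 yield is an illustrative assumption of mine, not a measurement from this thread.

```python
# Back-of-envelope model for the "~25% at best" SMT claim.
# Assumption (mine): each SMT sibling adds about 25% of a physical
# core's throughput, since both logical cores share execution units.

def effective_cores(physical: int, smt: bool, smt_yield: float = 0.25) -> float:
    """Approximate throughput in 'physical core equivalents'."""
    return physical * (1 + smt_yield) if smt else float(physical)

i5 = effective_cores(4, smt=False)  # 4C/4T, i5-3570K style
i7 = effective_cores(4, smt=True)   # 4C/8T, i7-3770K style

print(f"i5 ~ {i5:.1f} core-equivalents, i7 ~ {i7:.1f}")
print(f"i7 advantage in a perfectly threaded app: {100 * (i7 / i5 - 1):.0f}%")
```

Under that assumption the i7's ceiling over the i5 is exactly the SMT yield, which is why the gap only shows up once games actually load all the threads.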
Threadripper is a dual CPU for all intents and purposes. And only the 32C model practically exists right now.
18-core server chips existed back in 2014 (=> 36 cores in a dual-CPU setup).
The reason we still do not have as many cores on desktop is that it is pointless. Very few tasks exist that can use more than the basic 2-4 cores for general desktop usage, and for games it is 6-8 right now. Even the modern 16-core AM4 Ryzens are kind of pointless IMO, because for desktop/games they are not needed, and for professional use they lack other HEDT features like connectivity, more RAM support, etc. A very niche product at best.
As for how many cores are needed, it is game-dependent.
Will upcoming game releases be well optimised and scale well, taking advantage of more cores when available?
It seems there are games still being released on old game engines that cannot. That means single-core performance is the most important factor,
although a few games seem to do it well.
I hope that with the demise of Windows 7, DirectX 11 will be forgotten and games will only use DirectX 12 and Vulkan. That is no guarantee a game will perform well, but it will give game developers the opportunity to gradually improve performance by scaling core use in their game engines.
Whether that happens is a different story.
I hope so.
My setup is an MSI B450 Tomahawk, 3600X, and 32GB 3200 DDR4 CL4.
When the next Ryzen processors come out, if I buy a 16-core CPU, how long will it last as a well-performing gaming machine?
3 years?
5 years?
A decade?
I upgraded from an i7-3770K (4 cores, 8 threads). Some games perform the same, it seems, but massively multiplayer performance was boosted.
Sooner? Unlikely. Too many systems are still out there that do not support it, and too many existing tools/game engines. Jumping to new tech with no backwards compatibility does not make sense for game devs from a commercial perspective...
Also, DX12 or not, there are some fundamental issues with tasks like games and multithreading. There is always going to be a single "main" thread with pretty high requirements, which limits performance, especially for certain types of games.
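The single-main-thread point is basically Amdahl's law: if some fraction of each frame's work must run serially, extra cores stop helping well before core counts do. A quick sketch of the formula; the 30% serial fraction below is an illustrative guess, not a measurement of any real engine.

```python
# Amdahl's-law sketch of the "single main thread" bottleneck.
# serial_fraction is the share of frame work stuck on one thread;
# the 0.30 used below is an illustrative assumption, not a measurement.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Maximum speedup over 1 core when serial_fraction can't be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (4, 8, 16, 64):
    print(f"{cores:>2} cores, 30% serial frame logic: "
          f"{amdahl_speedup(0.30, cores):.2f}x")
```

With 30% of the frame serialized, going from 16 to 64 cores barely moves the needle, which is why engine design (how much work the main thread keeps) matters more than raw core count past a point.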
Not even. The 3950X is the only right answer.
Any normal power user will get just as much benefit from the 3900X for normal enthusiast-level compute/render workloads, versus the increased price of the 3950X.
Likewise, most enthusiast end users will benefit from the 3800X for a small increase in price, though not because of the increased stock speed but because of the better binning and better top-end speeds of the 3800X vs. the 3700X.
Right now the 3800X is on sale for about $10 more than the 3700X, for better stock speeds and better-binned silicon.
I used to think I knew what smooth was, and sure, my games ran at whatever framerate I was targeting with modest settings. But now, with the same 1060, it has really highlighted how important a good CPU upgrade can be.