My trouble is that, according to this graphic, a Ryzen 7 1700 (and also the Ryzen 3300) released in 2017 already bottlenecks a 2080 Ti (released in 2018). Maybe an i9 9900K released in 2018 will bottleneck the next GPUs.
https://i.imgur.com/8nDA1OF.jpg
If they were AMD's flagship processors, you might have a point, but both of those CPUs were outdated on release; the 3300 was comfortably beaten by Intel's 6th-gen Core CPUs released in 2015.
https://www.youtube.com/watch?v=GYNI5Nyk5KE
Bandwidth only matters in multi-GPU setups, or when you're running a lot of devices that use lanes (SSDs, PCIe add-in cards, etc.).
So no, I don't think your PC using PCIe 3.0 is going to be a problem with those GPUs.
The 9900K probably won't bottleneck much; maybe barely.
Plus, a 2080 Ti doesn't even make full use of a PCIe 3.0 x16 slot anyway, so PCIe 4.0 is a toss-up, but probably not a necessity.
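For rough numbers behind that claim, here's a back-of-the-envelope sketch of my own (the per-lane transfer rates and encodings are the published spec values; real-world throughput is lower due to protocol overhead):

```python
# Theoretical one-way PCIe bandwidth from published spec values.
# Real-world throughput is lower due to protocol overhead.

SPECS = {
    # generation: (transfer rate in GT/s per lane, encoding efficiency)
    "3.0": (8.0, 128 / 130),   # 128b/130b encoding
    "4.0": (16.0, 128 / 130),  # 128b/130b encoding
}

def bandwidth_gb_s(gen: str, lanes: int = 16) -> float:
    """Theoretical one-way bandwidth in GB/s for a generation and lane count."""
    rate, efficiency = SPECS[gen]
    return rate * efficiency * lanes / 8  # GT/s -> GB/s (8 bits per byte)

for gen in SPECS:
    print(f"PCIe {gen} x16: ~{bandwidth_gb_s(gen):.1f} GB/s")
# PCIe 3.0 x16: ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s
```

So a 4.0 x16 slot roughly doubles the 3.0 x16 ceiling, which only matters if a card actually saturates the 3.0 link in the first place.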
Games like ACO can't even hold 144 fps constantly on a 5.0 GHz i9 9900K, and neither can RDR2. But with that said, no CPU can right now.
It's not really worth upgrading the CPU until we get a decent generational improvement. You've basically got the best money can buy for gaming; there's nothing you can upgrade to right now that makes any sense for games.
It's pretty hard to say how realistic that is, considering no one has one and we only have Sony's claims. Each of the big console companies has had claims that were, let's say, "over-ambitious," to put it nicely.
The same can kind of be said about how fast Big Navi or the 3080 is going to be, though most of that isn't official at all; it's rumors. TBH, I'd wait until they release, or until official information and comparisons against previous cards come out.
PCIe 4.0, again, might not even be that useful this generation; current GPUs running on 4.0 see almost no gain over 3.0. It's possible it becomes more relevant this time around, but again, no one knows yet, since, you know, we don't have the new cards.
Just hold back on upgrades for the time being is my advice.
Why do you say I shouldn't upgrade? I don't think my PC can actually handle the next upcoming games.
lol, you said you already have a Z390 Master with an i9 9900K.
Why upgrade right now? Just wait until actual information about the new hardware releases. You're basing your upgrade path upon rumors right now.
For the CPU, maximum fps depends on per-thread speed plus having enough threads for the game's technical architecture. A 9900K has the fastest single-thread speed and more than enough threads. A 9900K is only going to possibly bottleneck a GPU at 1080p anyway.
For memory, different memory-stick models run at different real-world speeds even for the same frequency/CL values. Faster is obviously better.
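As a rough illustration of how frequency and CL trade off (a standard back-of-the-envelope formula, not something from this thread): first-word latency in ns is roughly CL × 2000 / data rate in MT/s, so a higher frequency can cancel out a looser CL.

```python
# Back-of-the-envelope "true latency" for DDR4 kits:
#   latency_ns = CAS_latency * 2000 / data_rate_MT_s
# (2000 because the clock runs at half the data rate, converted to ns)

def true_latency_ns(data_rate_mt_s: int, cas_latency: int) -> float:
    """Approximate first-word latency in nanoseconds."""
    return cas_latency * 2000 / data_rate_mt_s

for kit, (rate, cl) in {
    "DDR4-3200 CL16": (3200, 16),
    "DDR4-3600 CL18": (3600, 18),
    "DDR4-3600 CL16": (3600, 16),
}.items():
    print(f"{kit}: ~{true_latency_ns(rate, cl):.1f} ns")
# DDR4-3200 CL16: ~10.0 ns
# DDR4-3600 CL18: ~10.0 ns
# DDR4-3600 CL16: ~8.9 ns
```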
The PCIe bus speed won't matter that much, as tests show negligible or no difference between PCIe 3.0 and 4.0 with current GPUs (which effectively means a negligible difference between recent PCIe versions). That might change, but it might not. Time will tell, and by that time there will be new CPUs and GPUs.
SSDs in some situations perform faster on PCIe 4.0, though maybe not in gaming.
For example, an i9-9900K will bottleneck a 2060 in CS:GO, but in some games a Pentium won't bottleneck a 2080 Ti. It all depends.
Generally it's not worth worrying about if you have decent hardware.
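To make the "it all depends" point concrete, here's a toy model of my own (all fps numbers below are made up purely for illustration): the frame rate you actually see is roughly the minimum of what the CPU can prepare and what the GPU can render, so whichever side is slower in a given game is the bottleneck.

```python
# Toy bottleneck model: effective fps ~= min(cpu_fps, gpu_fps).
# cpu_fps = frames per second the CPU can prepare in a given game;
# gpu_fps = frames per second the GPU can render at given settings.
# All numbers below are made up for illustration only.

def effective_fps(cpu_fps: float, gpu_fps: float) -> tuple[float, str]:
    """Return the achievable fps and which component limits it."""
    if cpu_fps < gpu_fps:
        return cpu_fps, "CPU-bound"
    return gpu_fps, "GPU-bound"

for game, (cpu, gpu) in {
    "CPU-light esports title": (400, 600),  # GPU has headroom; CPU is the limit
    "GPU-heavy title at 4K": (110, 70),     # CPU has headroom; GPU is the limit
}.items():
    fps, limit = effective_fps(cpu, gpu)
    print(f"{game}: ~{fps} fps ({limit})")
```

That's why the same CPU can be "the bottleneck" in one game and completely irrelevant in another.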
To answer: no, it would not be wasteful to pair any upcoming GPU with a Core i9 9900K. Considering that's one of the better performers for games these days, it would be a bit silly to expect it to be a waste, unless you think EVERY existing CPU is already on the edge of being unable to push any more GPU power than it does now, and that we are FUBAR without a major improvement in CPU IPC, which is not the case.