Hearts of Iron IV
But because the AI is limited to one core, it will still lag. That said, it will make the calculations a little bit faster.
My 9900K is really close to being a relevant bottleneck, and I'm playing at 3840x1600.
I get over 30 frames back at max settings just by dropping from 1440p to 1080p
Whether it should or shouldn't, I don't know, but I've spent around 6 hours testing the game over the past few days, and I can say for sure there were occasions where I was at 60-70% CPU usage.
Thanks Ralf. But this game is at version 1.6 now; the testing in your link was done with version 1.04.
Most settings have changed, especially the demanding ones, and the variation in demands across different areas is vast. I'm doubtful those were taken into consideration when doing a broad CPU evaluation.
I also don't want to make my testing results a bigger part of this thread anyway.
When I tested 1080p with Psycho settings, I was getting a minimum of 80-90 FPS in the middle of the city near Jig Jig Street. At 1440p with the same settings, the range in the same area is 60-70 FPS at best. This is all with DLSS on Balanced, not Quality.
An overclocked i9-9900K should have no problems handling that. It's not as considerable a bottleneck as my old R9 3900X, which gained pretty much nothing when going down from 1440p to 1080p, and that was after Patch 1.3. My 3900X was a garbage-bin chip that couldn't even do better than 4.1 GHz.
I won't continue participating in this. I understand you are not of ill intent, but I definitely won't go over countless charts and results from what is now around 8 hours of testing, plus another 4 hours of watching YouTube, reading Reddit, etc.
You may have your own findings and research. Maybe those are better. But after around 12hrs of research and testing I stand by mine.
Have a good one.
What?! I will agree with you that games (well, some games) will benefit from lots of cache and fast memory, but what games will benefit from the PCIe revision? That is, PCIe 3.0 vs. 4.0. You will notice that there is almost no difference when running a current-gen GPU at PCIe 3.0 vs. 4.0.
And to say that the 9900K falls short of all these by 2022 standards is utter nonsense. I will agree that the 5800X3D is a faster processor than the 9900K, but outside of 1080p, there is almost no difference in most games between the two. At 1080p, the 5800X3D will perform better in CPU-intensive games, but at 1440p that gap will close significantly. And at 4K there will literally be no difference between the two CPUs.
Only in the rare case of a really CPU-intensive game will there be much difference, and at 1440p or 4K it will still be slight. Most games are GPU bound and GPU intensive. While the 5800X3D is a faster processor with more L3 cache, the 9900K is no slouch and still a performer in 2022. I would have no issue pairing either with a high-end GPU. And if playing at a resolution like 4K, it would make more sense to buy a cheaper 8-core CPU, like a 5700X, than to buy a 5800X3D.
If you are playing at 3840 x 1600, your 9900K WILL NOT bottleneck any GPU you pair with it in Cyberpunk 2077. If you are having performance issues, it most certainly isn't your CPU.
What GPU are you using?
https://www.techspot.com/review/2502-upgrade-ryzen-3600-to-5800x3d/
Look at this article. It is comparing a 3600, a 5600, and a 5800X3D. Now go down to the Cyberpunk benchmark. At 1080p, there is quite a difference between the CPUs. But when you go up to 1440p, there is almost no difference. At 4K, which is close to what you are running, there would literally be no difference.
And there is quite a difference between these 3 CPUs. But aside from 1080p, and CPU intensive games, CPU really doesn't matter as much as GPU in gaming. Most games are GPU bound. These benchmarks were taken with a 6950XT and 6600XT.
Techspot has other articles comparing other CPUs. Here is one comparing the 5800X3D and 5800X.
https://www.techspot.com/review/2451-ryzen-5800x3D-vs-ryzen-5800x/
As you can see, there is a big difference in performance at 1080p. At 4K? Virtually no difference.
I don't have a 9900K but I was using a 10700K, very similar to a 9900K with a 3070 Ti and Resizable Bar. The 10700K is only PCI-e 3.0, so it is like running PCI-e 4.0 x 8, and there was absolutely no performance loss compared to running on a system that is PCI-e 4.0. Trust me. And every publication online will say the same thing.
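The "PCIe 3.0 x16 is like PCIe 4.0 x8" point above comes down to simple bandwidth arithmetic: PCIe 4.0 doubles the per-lane transfer rate, so halving the lane count lands you back at the same link bandwidth. Here is a back-of-the-envelope sketch of that math (it accounts only for 128b/130b line encoding and ignores other protocol overhead, so the numbers are approximate):

```python
# Per-lane raw transfer rates in GT/s for each PCIe generation.
GEN_RATE_GT = {3: 8.0, 4: 16.0}

# PCIe 3.0 and 4.0 both use 128b/130b line encoding.
ENCODING = 128 / 130

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable one-direction link bandwidth in GB/s."""
    # GT/s * encoding efficiency = effective Gb/s per lane; /8 converts to GB/s.
    return GEN_RATE_GT[gen] * ENCODING / 8 * lanes

print(f"PCIe 3.0 x16: {bandwidth_gbps(3, 16):.2f} GB/s")
print(f"PCIe 4.0 x8:  {bandwidth_gbps(4, 8):.2f} GB/s")
```

Both work out to roughly 15.75 GB/s per direction, which is why a full x16 slot on a PCIe 3.0 board gives a modern GPU about the same bandwidth as a 4.0 x8 link, and why cut-down x4/x8 cards like the 6500 XT are the cases where the older generation actually hurts.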
Now, like you said, if running a GPU like the 6500XT, and maybe the 3050 if am not mistaken, it could be an issue. As those GPUs are not running at x16. But I guarantee that if you pair one of those GPUs with a 9900K, the bottleneck will not come from the CPU, as most games are GPU bound, and those GPUs will be the bottleneck.
And again. What resolution you are running at makes a big difference. Just look at the article I posted comparing the 3600, 5600, vs. 5800X3D, even the 5800X vs. the 5800X3D. On paper, there is quite a discrepancy between all of those CPUs. And while the 5800X3D is much faster at 1080p, at 1440p that gap between the CPUs is lowered considerably. At 4K, there is virtually no difference. Also, Techspot has other extensive articles comparing all sorts of CPUs. The story is the same.
Now, I don't know about VR, and quite frankly don't care, personally. That wasn't the topic here; just straight-up gaming performance. And my point remains the same: when running a modern GPU that is PCIe 4.0 x16, the difference between running that GPU with a CPU that is PCIe 3.0 capable vs. 4.0 capable will be very negligible.
But this right here IS utter nonsense. Go look at benchmarks. The IPC of Zen 3 is not all that dissimilar from Intel 11th Gen (Rocket Lake, NOT Alder Lake). In fact, a processor like the 5800X is trading blows with a processor like the 11700K. In some games the 5800X is faster; in some, the 11700K. And honestly, the 11700K is not that much better than a 10700K. In fact, the 10700K performs better in some games.
https://www.techspot.com/review/2261-ryzen-5800x-vs-core-11700k-vs-10700k/
The performance between all these processors is almost identical in gaming, especially at 1440p and 4K. At 1080p, the 5800X is slightly faster (in some games), but not by enough to make a difference. In some, the 10700K is better than both (at all resolutions). And the 9900K is almost identical to a 10700K.
So, to say that the experience is far, far more refined with Zen 3, or Alder Lake, IS utter nonsense.
Again, I don't know where you are getting your info, but you show no evidence to support your claims. So, you are trying to defend the indefensible. To say that they do not meet this year's performance standards is just ridiculous. Why? Because they don't use PCIe 4.0? Ridiculous. Like I said, running a GPU in PCIe 3.0 will not hinder its performance vs. a system that is PCIe 4.0.
Sure, Alder Lake, and the 5800X3D are faster than these CPUs. But in gaming, that is really only at 1080p. At 1440p, and especially at 4K, there will be no difference. Just go read the articles I posted. There was no difference between the 5800X, the 11700K, and the 10700K. And in turn that would mean the 9900K. In fact, in some games, the 10700K was faster than both the 5800X and 11700K. I also posted some articles comparing the 5800X3D to the 3600, 5600, and 5800X. The story was the same. At 4K, there was virtually no difference.
So, how can you sit there and say that these platforms are starting to show their age? If one is using a high-end GPU (or really any GPU) and playing at 1440p or 4K, they will see virtually no difference in performance between the 9900K, Alder Lake, and the 5800X3D, especially at 4K. There can still be some difference at 1440p in more CPU-intensive games. However, most games are GPU intensive nowadays, so CPU doesn't really matter as much as GPU does.
The 9900K and 10700K are not the newest CPUs on the block, but to say something like they don't meet 2022's performance standard, or that they are starting to show their age, is laughable. The 10700K was released in 2020 and is more than adequate to meet the needs of games in 2022. And in certain conditions, like playing at 1440p or 4K, it will perform just as well as Zen 3, Alder Lake, and Rocket Lake. That is the bottom line.
The article showed the 10700K outperforming BOTH the 5800X AND the 11700K in select games. However, the 1% lows were still better on the 11700K. And again, these performance differences were very minor; overall, the 11700K is slightly faster across games than the 10700K. But my point is that all three of these processors perform very close to one another, trading blows. So, while Alder Lake is faster than those three platforms, my point is that the 9900K and 10700K are still more than adequate in 2022. At higher resolutions, like 4K, they will not perform any worse than even Alder Lake or the 5800X3D. In fact, if you are playing at 1440p, and especially 4K, and want to save some money, going with a CPU like the 10700K, which is going for very cheap now, might be a better value in 2022 than buying a more expensive CPU. Even if it is only PCIe 3.0, as that will make very little difference in gaming.
So, what I was trying to say, is that making a statement that Zen 3, or Alder Lake, will provide a far, far more refined gaming experience than Comet Lake, Rocket Lake, or even Coffee Lake(the 9900K) is just foolish. Again, GPU is what matters, not CPU. Especially at 1440p and 4K.
And I am not trying to get defensive, but I just thought the notion that the 9900K, or the 10700K, is not meeting 2022 standards was ridiculous. Sure, newer processors have IPC gains and faster overall performance, but in gaming, especially when you are GPU bound, like playing at higher resolutions, these processors will perform just as well as Zen 3, Rocket Lake, and Alder Lake.
That is just the point I was making, and proving. The 9900K was released in 2018, and the 10700K was released in 2020, and sure while they were chart toppers at the time, and will not be considered chart toppers today, that doesn't mean they are inadequate. And I was showing that these charts are based on 1080p resolution, which makes games more and more CPU bound. 1080p is the new 720p. Previously 720p, and below, was the testing resolution that made games CPU bound, but now 1080p does this as well. They test at lower resolutions to show the differences between the CPUs. Testing at higher resolutions makes the games more and more GPU bound, and thus less and less difference between the CPUs. Unless a game is extremely CPU intensive, but this tends to be older games that typically don't scale well and are not multi-threaded or designed to make use of newer processors with more cores.
So, my point is that, this is Steam. Steam is for gaming. When playing at higher resolution, like 1440p, 4K, and some Ultrawide resolutions, more and more you will become GPU bound. And more so the higher resolutions you go. So, that is why I always say to get yourself a good CPU, and then spend twice as much on the GPU. The GPU is what will determine your overall performance in games. So, midrange CPUs, and CPUs like the 9900K, will give you just as good performance as the newer more expensive CPUs when paired with a modern GPU and playing at higher resolutions.
I think the 40 series will be better than just 20% better than the 30 series. At least, from what I am seeing from early leaks. But of course, it is all speculation until we see actual benchmarks and hardware.
And 4K is not as hard to run as you think. I know you are always claiming this, and many times I refute this, and then delete my comments, lol, because I don't want to be in an argument, but I have been playing at 4K for a couple years, and I have no issue maintaining 60FPS on high/ultra in every single game that I play. And this is with a 3070 Ti.
First, not every game is super demanding. Second, I don't play Cyberpunk, lol, so that might be an issue. But I am sure with tweaked settings, and DLSS, I would be able to maintain 60FPS with my 3070 Ti in Cyberpunk. But like I said, I am able to maintain a constant 60FPS in every single game that I play, with settings pretty much maxed, in all games that I own. So, personally, I just think this notion that 4K is so demanding, even in 2022, is overblown.
Like I said, not every game is super demanding. But even in my most demanding games, I am still able to maintain a constant 60 FPS, with a 3070 Ti. Currently I am in the market for a new display. I want OLED, burn-in risk and all; they are just too sweet. And I am pretty sure I will stay at 4K, unless I get one of these new OLED ultrawide screens with a 3440 x 1440 resolution. And with my new display, my target might become 120 FPS, as it will have a 120Hz or higher refresh rate.
So, while my 3070 Ti is capable of 4K/60, it will not be capable of 4K/120. Which is why I will probably be upgrading to a next gen GPU. Hoping that the 4090, or 4080, or maybe even the RX7900, or whatever, will be capable of 4K/120. Even if not in every game, I have been having a glorious experience running 4K/60 with my 3070 Ti, so even if I just get 60 or higher in 4K with the newer GPUs, it will still be a great experience. I love the higher fidelity that 4K brings. Especially in 2022. Pair that with a fast OLED screen with those outstanding blacks and HDR. Sounds fantastic.
BTW, I went from 1920 x 1080 to 2560 x 1440 to 3440 x 1440 to 3840 x 2160. I personally love the extra clarity and screen real estate that 4K brings. It would be very hard for me to go down to a lower resolution, not to mention something like a 34" or 27" screen. I think it would have to be at least 48" (maybe 42") to equal the gaming experience I have now. And looking at these new displays coming out, they are all OLED, either 4K or 3440 x 1440, and they are all big screens.
I've personally tested a 5950X vs. a 9900K and I saw absolutely no difference in CP2077 performance.