In games like Baldur's Gate 3, the new Ryzen is the ONLY CPU that never drops below 60fps, according to Gamer's Nexus benchmarks.
These numbers are important for most people, even at 4K. I agree that most people don't care about anything above 120/144fps, but almost everyone cares about stutters, dips, and frametime stability.
AMD also isn't the one that just posted an over $16 billion loss last quarter; Intel is. Their tactics aren't working and haven't been for years.
That's your problem.
I gave you the Techpowerup test results @ 4K and you're invited to check out the numbers. Techpowerup is a definitive source!
However, like an ostrich, you prefer to keep your head firmly in the sand. Don't bother replying!!
The whole thing stinks of FX-era AMD fanboying. I liked Intel when they didn't make garbage; Comet Lake was the last generation remotely worthy of praise, in my opinion, and I still have my 10850K.
This is the only balanced comment you've made
I'll never sell my 10900K!!!
I own a 9900K, 9900KS, 10900K, 12900KS, and a 13900K. These are all keepers. I still game on the 12900KS, because it's still a stonking CPU, and because of the 13th- and 14th-gen degradation issue, which is now fully solved. Lol, my 13900K is still in the box!
Seriously, I gave a reputable source for my view, and the fact that you haven't addressed it shows you can't score any points against it.
Are you still using a decade-old 24-inch 1080p LED TN monitor?? It sure sounds like it.
If it makes you feel better, yes, I'll concede {much more gracefully than Harris} that you can enjoy Counter-Strike 2 @ 1080p at 572.9 FPS, while my 13900K gets 552.2 FPS {TechPowerUp review of the 9800X3D}.
What a nothingburger you champion!
2160p is a stupid reason to minimize the 9800X3D, because you're GPU-bound. No ♥♥♥♥ you're not going to see much of a difference when your performance is hard-capped by your video card at that resolution. That's why most CPU tests are done at lower resolutions (or across all popular resolutions): they're primarily showing the best-case scenarios.
And the fact that you have five fairly recent i9s just means you have more money than sense, considering you've been upgrading almost every generation for the last five years; all the more reason for people not to take much of what you say to heart. I'm not buying another CPU until I feel like mine isn't cutting it anymore for what I want to use it for, and after that, it'll more than likely end up in the hands of one of my friends who desperately needs an upgrade but can't afford one.
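The GPU-bound argument above can be sketched as a toy model. All the numbers in it are made up purely for illustration, not taken from any benchmark:

```python
# Toy bottleneck model: delivered FPS is capped by whichever component
# is slower. cpu_cap/gpu_cap below are hypothetical, illustrative numbers.
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """FPS the system can actually show: the lower of the two caps."""
    return min(cpu_cap, gpu_cap)

# At 1080p the GPU cap is high, so CPU differences show up on screen:
print(delivered_fps(cpu_cap=145, gpu_cap=300))  # 145 -> CPU-bound
# At 4K the GPU cap sits below either CPU, so two different CPUs
# produce the same on-screen result:
print(delivered_fps(cpu_cap=145, gpu_cap=78))   # 78
print(delivered_fps(cpu_cap=136, gpu_cap=78))   # 78 -> identical
```

This is why reviewers drop to 1080p or lower for CPU tests: it pushes the GPU cap out of the way so the CPU is the thing actually being measured.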
Well, I conceded everything to you, and yet your resentment predominates over your reason, and your manners, I might add.
Probably the most recognised CPU-limited gaming test is Counter-Strike 2 @ 1080p. YOU ONLY GET 20 EXTRA FRAMES OVER A 13900K, AND WORST OF ALL, THOSE FRAMES ARE ABOVE 550 FPS FOR BOTH CPUS!!!! Is this your victory? This is absolutely nothing and a meaningless victory. Enjoy eating your nothingburger.
No meaningful advantage @ 1080p and even less so @ 4K.
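Taking the Counter-Strike 2 figures quoted in this thread at face value (572.9 vs 552.2 FPS), the relative gap works out to under 4%:

```python
# Relative uplift between the two CS2 averages cited above
# (figures quoted in this thread from the TechPowerUp 9800X3D review).
def uplift_pct(new: float, old: float) -> float:
    return (new - old) / old * 100

cs2 = uplift_pct(572.9, 552.2)
print(f"{cs2:.1f}%")  # 3.7% -- about 20 frames on top of 550+
```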
Gamer's Nexus and a few others do a good job showing the numbers.
Baldur's Gate 3, for example, only stays above 60fps on the Ryzen 9800X3D.
Dragon's Dogma 2 will drop below 60fps on any CPU, but the 9800X3D will stutter and dip far less than anything else.
Rainbow Six Siege might be easy to run and average 500-600fps, but the 9800X3D is the only CPU that doesn't drop below 120fps. That is huge for competitive players.
I think the 9800X3D is even more impressive than some reviews suggest, due to massive improvements in 1% and 0.1% lows (aka dips and stutters). This affects 4K gaming as well.
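For anyone unfamiliar with the metric: "1% lows" are usually derived by taking the slowest 1% of captured frames and reporting their average as FPS. Exact methodology varies between outlets; this is a rough sketch of one common definition, using made-up sample data:

```python
# Rough sketch of a "1% low" calculation from per-frame render times.
def one_percent_low(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # worst 1% of samples
    avg_ms = sum(worst[:n]) / n                   # average slow frame time
    return 1000 / avg_ms                          # express as FPS

# A mostly-smooth ~120fps run with a handful of 25 ms stutter spikes:
frames = [8.3] * 990 + [25.0] * 10
print(round(one_percent_low(frames)))  # 40 -- the spikes dominate the metric
```

Which is why a run can average 120fps and still feel stuttery: the average hides exactly what this metric exposes.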
Any benchmarker that hides, let alone ignores, power draw is a knob.
Which myth are you referring to?
I was talking about averages, based on the TechPowerUp review of the 9800X3D. No myths here, just test results. If you don't like them, that's your problem. I think the myth you're referring to is the one you pulled out of your fundament.
TechPowerUp has a 10-game average of 1% lows at both 1080p and 4K. {Check the review for the games, but they're a good sample.} All stock numbers... no overclocking.
10-game average {no ray tracing} @ 1080p, 1% FPS lows:
9800X3D - 145.1 FPS
14900K - 135.9 FPS
13900K - 135.0 FPS
12900K - 121.3 FPS
Good results for the 13th- and 14th-gen CPUs @ 1080p; the 12900K is falling behind, but nevertheless still showing solid numbers.
10-game average {no ray tracing} @ 4K, 1% FPS lows:
9800X3D - 77.8 FPS
14900K - 78.2 FPS
13900K - 78.1 FPS
12900K - 75.8 FPS
Class-leading results for the 13th- and 14th-gen CPUs @ 4K, while the 12900K shows great performance. Looks like C1REX is indulging in some myth-making of his own!!!! Oh dear.
What about Spider-Man Remastered??? Here's more myth-busting for our hapless friend C1REX. At 1080p, 1% lows, no ray tracing:
14900K - 145.9 FPS
13900K - 141.9 FPS
12900K - 121.8 FPS
9800X3D - 117.4 FPS
This is, frankly, an embarrassing result for the 9800X3D, and for C1REX.
Should I care what brand is making some silicon?
What matters to me are the 1% and 0.1% results in the games I play. I do care about stutters, even at 4K.
With all due respect, who T F gives a flying S about CPU 1% and 0.1% lows when playing at 4K, or even at 1440p?
Who T F notices?
I've seen people rocking a 5800X3D with a 7900XTX/4090/4080S (heck, even a 6800XT) at 4K and 1440p, where you'll NEVER, EVER run into a CPU bottleneck.
From a gaming perspective, CPU benchmarks alone are negligible and not worth much.
You know that those CPU benchmarks are recorded in a situation where the CPU is the bottleneck, right?
I hardly believe someone will use a 9800X3D at 480p/720p with a GTX 1050 and be glad they got approx. 20 more FPS than on a 7800X3D, or 40 FPS more than on a 5800X3D.
Nobody will do that.
Everybody and their mum knew that the 9800X3D would be around 20% faster than the 7800X3D, which itself was already 20% faster than the 5800X3D.
But you won't notice ANY of that while playing a game where your CPU is not the bottleneck. Sure, better 0.1% and 1% lows here and there are noticeable while monitoring your FPS and keeping track of the dips like a lunatic, but you won't notice any difference in-game and in motion. Well, you will at lower resolutions like 1080p/720p/you name it.
edit:
But I guess the hype will never stop, right?
I should have stayed on AM4, at least for the next few years. My bad.
I don’t think I’m the only one experiencing stutters and drops. It’s very common to see people on the Elden Ring forum talking about stutters and drops below 60fps. Or how hard it is to hold a stable 60fps in Dragon's Dogma 2. Or that the 7800X3D was bottlenecking the 4090 at 4K in Monster Hunter Wilds Beta, and the CPU could drop below 60fps in the main hub.
Actually, stutters are a very common complaint about modern AAA games and Unreal Engine 5 games in particular. A strong CPU can brute-force a flatter frametime graph and reduce some of the problem.
Weird take by some users here. Just because your fave games don't demand plenty of CPU power doesn't mean a CPU upgrade is pointless for others.
Look at the MSFS 2020 benchmark. The 9800X3D is almost 20% better than the next-best CPU, the 7800X3D (84fps vs 70fps).
The difference will be even more noticeable once you add frame generation on top of it.
Personally, I'm not going to upgrade from my current 7950X3D. Upgrading every gen is a waste of money, imo.
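For reference, the MSFS averages quoted above (84fps vs 70fps) imply a 20% relative gain:

```python
# Relative gain implied by the quoted MSFS 2020 averages (84 vs 70 fps).
uplift = (84 - 70) / 70 * 100
print(f"{uplift:.0f}%")  # 20%
```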