r.linder Nov 6, 2024 @ 9:07am
Ryzen 7 9800X3D Review: "Devastating Gaming Performance" (Tom's Hardware)
https://www.tomshardware.com/pc-components/cpus/amd-ryzen-7-9800x3d-review-devastating-gaming-performance/2

So the rumors mostly appear to be true: the 9800X3D really does dominate more than expected. Intel is now at least around 30% behind with the 14900K and even further behind with the 285K. They need to fix this, and fast, or we might end up with diminished upgrades per generation again.
C1REX Nov 6, 2024 @ 11:23pm 
I believe people greatly underappreciate 1% and 0.1% lows when they focus too much on averages.

In games like Baldur's Gate 3, the new Ryzen is the ONLY CPU that never drops below 60fps, according to Gamer's Nexus benchmarks.

These numbers matter for most people, even at 4K. I agree that most people don't care about anything above 120/144fps, but almost everyone cares about stutters, dips, and stable frametimes.
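For anyone curious what averages hide, here's a rough sketch of how 1% and 0.1% lows are usually derived from a frametime capture. The exact method varies between reviewers, and the frametimes below are made up for illustration:

```python
# Rough sketch: derive average FPS and 1% / 0.1% lows from a frametime log.
# Reviewers differ in the exact method; this uses "average FPS of the
# slowest N% of frames". All frametimes below are invented for illustration.

def percentile_low(frametimes_ms, percent):
    slowest = sorted(frametimes_ms, reverse=True)   # biggest frametimes first
    n = max(1, int(len(slowest) * percent / 100))   # slowest N% of frames
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms                          # convert back to FPS

# Hypothetical capture: mostly ~7 ms frames (~140 FPS) with ten 25 ms spikes.
frames = [7.0] * 990 + [25.0] * 10

print(f"average:  {1000.0 / (sum(frames) / len(frames)):6.1f} FPS")
print(f"1% low:   {percentile_low(frames, 1.0):6.1f} FPS")
print(f"0.1% low: {percentile_low(frames, 0.1):6.1f} FPS")
```

The average barely moves (about 139 FPS) while the 1% low collapses to 40 FPS. Those ten spikes are exactly the stutter you feel.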
Last edited by C1REX; Nov 7, 2024 @ 12:01am
r.linder Nov 6, 2024 @ 11:24pm 
Originally posted by Peter:
Originally posted by r.linder:
https://www.youtube.com/watch?v=s-lFgbzU3LY

Benchmarks don't lie. There are plenty of instances, like BG3, where it pulls way ahead of Intel and other AMD processors. AMD only promised 8%, but they underestimated their own chip in many cases.

Your claims are also for 2160p resolution, which is freaking GPU bound, genius. If you're running 2160p, you don't even need a 9800X3D, because it won't make a noticeable difference until far more powerful GPUs are available.

I'll stick with Intel, but my point is applicable to the 9800X3D.

Well, if you really think that the vast majority of people who buy i9 CPUs {I own a 12900KS and a 13900K} play at 1080p, then your thinking is out of date, irrelevant, and at least a full decade behind the times.

We play with 3090 and 4090 GPUs {yep, I own both} on 4K OLED and 4K MiniLED monitors like the Asus PG42UQ and Asus 32UQX respectively {yep, I own both}.

Sorry to burst your bubble, but the 9800X3D has barely a ONE FPS ADVANTAGE at 4K over my 2 year old 12900KS, which will go down as one of the greatest CPUs ever released for the enthusiast.
Think what you want and stay in your bubble, but the 5800X3D made an absolute mockery of the 12900K.

AMD also isn't the one that just posted a loss of over 16 billion dollars last quarter, Intel is. Their tactics aren't working and haven't been working for years.
Last edited by r.linder; Nov 6, 2024 @ 11:28pm
Peter Nov 6, 2024 @ 11:34pm 
Originally posted by r.linder:

Think what you want

That's your problem.

I gave you the Techpowerup test results @ 4K and you're invited to check out the numbers. Techpowerup is a definitive source!

However, like an ostrich, you prefer to keep your head firmly in the sand. Don't bother replying!!
Last edited by Peter; Nov 7, 2024 @ 12:32pm
r.linder Nov 6, 2024 @ 11:37pm 
Originally posted by Peter:
*angry Intel fanboy noises*
You're more than welcome not to participate in AMD threads. All you've really done is make excuses for Intel failing to keep up with the competition and hype up a generation that nobody ended up caring about because of the terrible design choice of E-cores.

The whole thing stinks of FX-era AMD fanboying. I liked Intel when they didn't make garbage; Comet Lake was the last generation remotely worthy of praise in my opinion, and I still have my 10850K.
Last edited by r.linder; Nov 6, 2024 @ 11:39pm
Peter Nov 7, 2024 @ 12:34am 
Originally posted by r.linder:
Comet Lake was the last generation remotely worthy of praise in my opinion, and I still have my 10850K.

This is the only balanced comment you've made.

I'll never sell my 10900K!!!

I own a 9900K, 9900KS, 10900K, 12900KS, and 13900K. These are all keepers. I still game with the 12900KS because it's still a stonking CPU, and because of the 13th and 14th gen degradation issue, which is now fully solved. Lol, my 13900K is still in the box!

Seriously, I gave a reputable source for my view, and the fact that you haven't addressed it shows you can't score any points against it.

Are you still using a decade-old 24-inch 1080p LED TN monitor?? It sure sounds like it.

If it makes you feel better, yes, I'll concede {much more gracefully than Harris} that you can enjoy Counter-Strike 2 @ 1080p at 572.9 FPS while my 13900K gets 552.2 FPS {Techpowerup review of the 9800X3D}.

What a nothing burger you champion!
r.linder Nov 7, 2024 @ 12:41am 
Originally posted by Peter:
Originally posted by r.linder:
Comet Lake was the last generation remotely worthy of praise in my opinion, and I still have my 10850K.

This is the only balanced comment you've made.

I'll never sell my 10900K!!!

I own a 9900K, 9900KS, 10900K, 12900KS, and 13900K. These are all keepers. I still game with the 12900KS because it's still a stonking CPU, and because of the 13th and 14th gen degradation issue, which is now fully solved. Lol, my 13900K is still in the box!

Seriously, I gave a reputable source for my view, and the fact that you haven't addressed it shows you can't score any points against it.

Are you still using a decade-old 24-inch 1080p LED TN monitor?? It sure sounds like it.

If it makes you feel better, yes, I'll concede {much more gracefully than Harris} that you can enjoy Counter-Strike 2 @ 1080p at 572.9 FPS while my 13900K gets 552.2 FPS {Techpowerup review of the 9800X3D}.

What a nothing burger you champion!
I've been on 1440p for years, and I already addressed your point earlier in my first response.

2160p is a stupid reason to minimize the 9800X3D, because you're GPU bound there. No ♥♥♥♥ you're not going to see much of a difference when your performance is hard-capped by your video card at that resolution. That's why most CPU tests are done at lower resolutions (or across all popular resolutions): they're primarily showing the best-case scenarios.

And the fact that you have five fairly recent i9s just means you have more money than sense, considering you've been upgrading almost every generation for the last 5 years. All the more reason for people not to take much of what you say to heart. I'm not buying another CPU until I feel like mine isn't cutting it anymore for what I want to use it for, and after that, it'll more than likely end up in the hands of one of my friends who desperately needs an upgrade but can't afford one.
Peter Nov 7, 2024 @ 1:06am 
Originally posted by r.linder:
2160p is a stupid reason to minimize the 9800X3D, because you're GPU bound there. No ♥♥♥♥ you're not going to see much of a difference when your performance is hard-capped by your video card at that resolution. That's why most CPU tests are done at lower resolutions (or across all popular resolutions): they're primarily showing the best-case scenarios.

Well, I conceded everything to you, and yet your resentment wins out over your reason, and your manners, I might add.

Probably the most recognised CPU-limited gaming test is Counter-Strike 2 @ 1080p. YOU ONLY GET 20 EXTRA FRAMES OVER A 13900K, AND WORST OF ALL, THOSE FRAMES ARE ABOVE 550 FPS FOR BOTH CPUS!!!! Is this your victory? This is absolutely nothing and a meaningless victory. Enjoy eating your nothing burger.

No meaningful advantage @ 1080p and even less so @ 4K.
Last edited by Peter; Nov 7, 2024 @ 1:07am
C1REX Nov 7, 2024 @ 1:32am 
Originally posted by Peter:
No meaningful advantage @ 1080p and even less so @ 4K.
This is a myth that persists because people don't understand the importance of 1% and 0.1% lows and focus on less impactful averages.

Gamer's Nexus and a few others do a good job showing the numbers.
Baldur's Gate 3, for example, only stays above 60fps on the Ryzen 9800X3D.
Dragon's Dogma 2 will drop below 60fps on any CPU, but the 9800X3D stutters and dips far less than anything else.
Rainbow Six Siege might be easy to run, averaging 500-600fps, but the 9800X3D is the only CPU that doesn't drop below 120fps. That is huge for competitive players.

I think the 9800X3D is even more impressive than some reviews suggest, due to massive improvements in 1% and 0.1% lows (aka dips and stutters). This affects 4K gaming as well.
Last edited by C1REX; Nov 7, 2024 @ 1:41am
Skkooomer Lord Nov 7, 2024 @ 2:23am 
Originally posted by r.linder:
Originally posted by Capt Spack Jarrow:
The difference between the 7800X3D and 9800X3D is not the same as the difference between Sandy Bridge and first gen FX.

Yes, I read what you said correctly; you just don't like the answer, is all.
That isn't what they said, you just didn't read properly.

Illusion said:
Originally posted by Illusion of Progress:
This same thing gets said every time.

I remember back when Sandy Bridge and Bulldozer were competing, there were a lot of people going "yeah the difference exists but it doesn't matter since most people are GPU limited more often than CPU limited so Bulldozer is fine".

And yeah, that can be true to an extent, and most people don't NEED the fastest CPU. Buying a 7800X3D/9800X3D and pairing it with an entry-level GPU like an RTX 4060 might not be the best approach.
It's not about the difference between the CPUs; it's about the fact that most people don't need the fastest CPUs available if they aren't going to push them with the fastest video card available.

The thread topic wasn't even about anyone needing a 9800X3D. It was simply to marvel at how far AMD has come and how far Intel has fallen, and at how insane the performance gap is between AMD's and Intel's best gaming processors in a best-case scenario. You really didn't need to explain that, because people already understood it.

If I could spare the cash, I would definitely drop thousands on a 9950X3D and 7900 XTX (specifically because I use Linux and AMD has better support there, although I would prefer a 4090), but I really can't, because there are more important things to deal with. My 10850K and 3080 are also more than adequate.
So they just repeat what I stated?
Rin Nov 7, 2024 @ 3:24am 
The cache on the X3D can be the difference between having to use FSR/XeSS/DLSS to reach 120fps at 4K, or just leaving it off so you skip the vaseline blur and ghosting, not to mention the added input lag. They also eat half the wattage that Intel CPUs typically do to achieve similar framerates.

Any benchmarker that hides, let alone ignores, power usage is a knob.
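When a review does list power, the comparison is trivial to make. A quick sketch with invented numbers (swap in a reviewer's measured gaming power draw and FPS for real chips):

```python
# Perf-per-watt sketch. All numbers are made up for illustration only;
# substitute a reviewer's measured gaming power draw and average FPS.
systems = {
    "hypothetical X3D chip":   {"fps": 140.0, "watts": 60.0},
    "hypothetical Intel chip": {"fps": 135.0, "watts": 150.0},
}

for name, s in systems.items():
    print(f"{name}: {s['fps'] / s['watts']:.2f} FPS per watt")
```

Similar framerates, wildly different efficiency, which is exactly why hiding the wattage column is a problem.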
Last edited by Rin; Nov 7, 2024 @ 3:27am
Peter Nov 7, 2024 @ 3:34am 
Originally posted by C1REX:
This is a myth that persists because people don't understand the importance of 1% and 0.1% lows and focus on less impactful averages.

I think the 9800X3D is even more impressive than some reviews suggest, due to massive improvements in 1% and 0.1% lows (aka dips and stutters). This affects 4K gaming as well.

Which myth are you referring to?

I was talking about averages, based on the Techpowerup review of the 9800X3D. No myths here, just test results. If you don't like them, that's your problem. I think the myth you're referring to is the one you pulled out of your fundament.

Techpowerup gives 10-game-average 1% lows at both 1080p and 4K. {Check the review for the game list, but it's a good sample.} All stock numbers... no overclocking.


10 game average {no ray tracing} @ 1080p, 1% FPS lows:
9800X3D: 145.1 FPS
14900K: 135.9 FPS
13900K: 135.0 FPS
12900K: 121.3 FPS


Good results for the 13th and 14th gen CPUs @ 1080p; the 12900K falls behind but nevertheless still shows solid numbers.


10 game average {no ray tracing} @ 4K, 1% FPS lows:
9800X3D: 77.8 FPS
14900K: 78.2 FPS
13900K: 78.1 FPS
12900K: 75.8 FPS


Class-leading results for the 13th and 14th gen CPUs @ 4K, while even the 12900K shows great performance. Looks like C1REX is indulging in some myth-making of his own!!!! Oh dear.


What about Spider-Man Remastered??? Here is more myth-busting for our hapless friend C1REX. At 1080p, 1% lows, no ray tracing:

14900K: 145.9 FPS
13900K: 141.9 FPS
12900K: 121.8 FPS
9800X3D: 117.4 FPS

This is, frankly, an embarrassing result for the 9800X3D and C1REX.
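If percentages are easier to read than raw FPS, here's a quick back-of-the-envelope script using only the numbers quoted above {my own sketch, not Techpowerup's}:

```python
# Gap of each Intel chip vs the 9800X3D, computed from the Techpowerup
# 10-game 1% lows quoted above (values copied from this post).
lows = {
    "1080p": {"9800X3D": 145.1, "14900K": 135.9, "13900K": 135.0, "12900K": 121.3},
    "4K":    {"9800X3D": 77.8,  "14900K": 78.2,  "13900K": 78.1,  "12900K": 75.8},
}

for res, chips in lows.items():
    base = chips["9800X3D"]
    for chip, fps in chips.items():
        if chip != "9800X3D":
            print(f"{res}: {chip} vs 9800X3D: {100.0 * (fps - base) / base:+.1f}%")
```

At 1080p the Intel chips trail by roughly 6% to 16%; at 4K the 13900K and 14900K come out a hair ahead. That's my whole point.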
Last edited by Peter; Nov 7, 2024 @ 3:34am
C1REX Nov 7, 2024 @ 4:48am 
Originally posted by Peter:
This is, frankly, an embarrassing result for the 9800X3D and C1REX.
How is it embarrassing for me? Am I AMD?
Should I care what brand is making some silicon?
What matters to me are the 1% and 0.1% results in the games I play. I do care about stutters, even at 4K.
Schrute_Farms_B&B Nov 7, 2024 @ 5:26am 
Originally posted by C1REX:
I believe people greatly underappreciate 1% and 0.1% lows when they focus too much on averages.

In games like Baldur's Gate 3, the new Ryzen is the ONLY CPU that never drops below 60fps, according to Gamer's Nexus benchmarks.

These numbers matter for most people, even at 4K. I agree that most people don't care about anything above 120/144fps, but almost everyone cares about stutters, dips, and stable frametimes.

With all due respect, who T F gives a flying S about CPU 1% and .1% lows when playing at 4K, or even at 1440p?
Who T F notices?

I've seen people rocking a 5800X3D and a 7900XTX/4090/4080S, heck, even a 6800XT, at 4K and 1440p, where you'll NEVER, EVER run into a CPU bottleneck.

From a gaming perspective, CPU benchmarks alone are negligible and not worth anything.
You know that those CPU benchmarks are recorded in an environment/situation where the CPU is the bottleneck, right?

I hardly believe that someone will use a 9800X3D playing at 480p/720p with a GTX 1050 and be glad they got approx. 20 more FPS than on a 7800X3D, or 40 more FPS than on a 5800X3D.
Nobody will do that.

Everybody and their mum knew that the 9800X3D would be around 20% faster than the 7800X3D, which was already 20% faster than the 5800X3D.
But you won't notice ANY of that while playing a game where your CPU is not the bottleneck. Sure, better 0.1% and 1% lows here and there are noticeable while monitoring your FPS and tracking the dips like a lunatic, but you won't notice any difference in-game and in motion. Well, you do, but at lower resolutions like 1080p/720p/you name it.
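A toy model of why that is, with completely made-up numbers: each frame takes roughly the longer of the CPU's and the GPU's work per frame, so a faster CPU only shows up when the GPU isn't the slower of the two.

```python
# Toy bottleneck model: frame time ~= max(CPU ms per frame, GPU ms per frame).
# All numbers are invented purely to illustrate the idea.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

FAST_CPU, SLOW_CPU = 4.0, 6.0   # hypothetical CPU work per frame (ms)
GPU_1080P, GPU_4K = 5.0, 14.0   # hypothetical GPU work per frame (ms)

print(f"1080p: {fps(FAST_CPU, GPU_1080P):.0f} FPS (fast CPU) "
      f"vs {fps(SLOW_CPU, GPU_1080P):.0f} FPS (slow CPU)")  # gap visible
print(f"4K:    {fps(FAST_CPU, GPU_4K):.0f} FPS (fast CPU) "
      f"vs {fps(SLOW_CPU, GPU_4K):.0f} FPS (slow CPU)")     # gap disappears
```

At 1080p the faster CPU wins 200 to 167; at 4K both sit at 71 because the GPU sets the pace.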

edit:
But I guess the hype will never stop, right?
I should have stayed on AM4, at least for the next few years. My bad.
Last edited by Schrute_Farms_B&B; Nov 7, 2024 @ 5:30am
C1REX Nov 7, 2024 @ 5:55am 
Originally posted by Schrute_Farms_B&B:
But you won't notice ANY of that while playing a game where your CPU is not the bottleneck. Sure, better 0.1% and 1% lows here and there are noticeable while monitoring your FPS and tracking the dips like a lunatic, but you won't notice any difference in-game and in motion. Well, you do, but at lower resolutions like 1080p/720p/you name it.

I don't think I'm the only one experiencing stutters and drops. It's very common to see people on the Elden Ring forum talking about stutters and drops below 60fps, or how hard it is to hold a stable 60fps in Dragon's Dogma 2, or how the 7800X3D was bottlenecking the 4090 at 4K in the Monster Hunter Wilds beta, where the CPU could drop below 60fps in the main hub.

Actually, stutters are a very common complaint about modern AAA games and Unreal Engine 5 games in particular. A strong CPU can brute-force a flatter frametime graph and reduce some of the problem.
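To put a number on "flatter frametime graph", here's a rough sketch with hypothetical traces (not real captures) of how two CPUs with nearly the same average FPS can feel completely different:

```python
# Count frames that blow the 60 FPS budget (16.7 ms). Both traces below are
# invented: nearly identical averages, very different stutter.
BUDGET_MS = 1000.0 / 60.0

flat_cpu  = [16.0] * 995 + [20.0] * 5    # flat frametime graph, rare small spikes
spiky_cpu = [15.0] * 950 + [40.0] * 50   # similar average, frequent 40 ms spikes

for name, trace in [("flat", flat_cpu), ("spiky", spiky_cpu)]:
    avg_fps = 1000.0 / (sum(trace) / len(trace))
    misses = sum(1 for t in trace if t > BUDGET_MS)
    print(f"{name}: {avg_fps:.0f} FPS average, {misses} frames over the 60 FPS budget")
```

Both average about 62 FPS, but one misses the 60 FPS frame budget ten times as often. That gap is exactly what 1% and 0.1% lows measure.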
Last edited by C1REX; Nov 7, 2024 @ 5:55am
Tiberius Nov 7, 2024 @ 6:43am 
Intel has been irrelevant since the 5800X3D.

Weird take by some users here. Just because your fave games don't demand plenty of CPU power doesn't mean a CPU upgrade is pointless for others.

Look at the MSFS 2020 benchmark: the 9800X3D is almost 20% better than the next best CPU, the 7800X3D (84fps vs 70fps).

The difference will be even more noticeable once you add frame generation on top of it.

Personally, I'm not going to upgrade from my current 7950X3D. Upgrading every gen is a waste of money imo.
