I bought an RTX 4070 Ti but my CPU is bottlenecking me.
I'm thinking of buying a
Ryzen 7 5800X3D.
This is the best CPU my motherboard can handle. Is it going to be good enough to handle my GPU, or should I also upgrade my motherboard too?
Originally posted by Illusion of Progress:
The Ryzen 5 5500 in particular is sort of between Zen 2 (Ryzen 3000 series) and Zen 3 (Ryzen 5000 series). It's not as good as the rest of the Ryzen 5000 series for a number of reasons, so you'll get a small uplift from something like a 5800X3D even in worst case scenarios, and that's before factoring in the extra cores. Not quite as much of an uplift as if you were coming from Zen 2 (Ryzen 3000 series), like I would have thought you might, though. Still probably your best "value" proposition.

Newer platforms would of course be a bit faster but cost more relative to how much faster they are. It's an option, though, if you are willing to spend as much.

And higher resolutions (I presume you're referring to 1440p by 2K) are often more GPU reliant, not more CPU reliant. The load on the CPU is actually the same at 1080p, 1440p, or 4K, but since you're less likely to be GPU bottlenecked at lower resolutions, the frame rate goes higher and you see the CPU become the bottleneck. It's not worth worrying about; it's natural.

For example, let's say in game A, at second B, your CPU can do 90 FPS. But you're playing at 4K, so you're GPU bottlenecked and you get 75 FPS, because the GPU can only produce that much in second B. You change to 1080p, where the GPU COULD do, let's say, 120 FPS. But you get 90 FPS, because of course that's what the CPU can do at that very second. Again, it's not worth micro-managing which part is the bottleneck; instead focus on "is this performance good enough for me". But if you're going UP in resolution in the future, then you're more likely to not need as fast of a CPU, so it's the other way around from what you're thinking. If the CPU is okay at 1080p, then it's no worse at 1440p or 4K. The worst you'll get is the same performance as now (but likely lower due to the GPU bottlenecking).

Disclaimer: if you go to an ULTRA-WIDE 1440p and compare it to 16:9 1440p, then yes, the CPU will have to do more (but so will the GPU).
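To put the example above in concrete terms, here's a minimal sketch in Python of the idea that the frame rate you actually see is simply the lower of what the CPU and GPU can each deliver at that moment. The numbers are the illustrative figures from this post, not measurements.

```python
# Minimal sketch of the bottleneck idea above. The numbers are the example
# figures from this post (assumed for illustration), not real benchmarks.

def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The slower component sets the frame rate you actually see."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 90  # what the CPU can prepare per second in this scene

# At 4K the GPU only manages 75 FPS, so it is the bottleneck.
print(delivered_fps(cpu_limit, gpu_fps_limit=75))   # -> 75 (GPU-bound)

# At 1080p the GPU could do 120 FPS, so the CPU becomes the limit.
print(delivered_fps(cpu_limit, gpu_fps_limit=120))  # -> 90 (CPU-bound)
```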

Anyway, it sounds like you're not drowning in disposable income so I'd either consider the 5800X3D, or even just stay with the Ryzen 5 5500 (if you're not bothered by your current performance). Do note that the 5800X3D will warrant considerations on cooling. It is a WARM chip, even under good air cooling. I have mine under a Dark Rock Pro 4 and it still gets very warm at times. Not unexpected, but I wouldn't use anything less than near top end air cooling (like the aforementioned Dark Rock Pro 4, Noctua NH-D14/15, etc.), or good AIO/water cooling. The stock AMD coolers or lower end third party ones wouldn't be good for a 5800X3D IMO, so you'd have to factor in up to another $100 (or a bit less) for cooling.
Originally posted by Illusion of Progress:
Originally posted by emoticorpse:
Man, I hate this. I really feel like you should be getting better performance. Your performance hurts me more than you I think lol.
Why? Didn't OP say they were fine with performance?

Why are you sure they should have better performance? "I feel like it should be better" doesn't sound like you know it should.

You can't look at one other data point, which has countless variables, and just wave them away with "I feel it should be more similar". This subject usually comes up when a certain user (I forget the name) comes and links to a random video and asks why they aren't getting the same performance, and the answer is always the same, yet the threads go on for pages and pages for no reason. The answer is the variables.

Well, I mean, I'm fine with the truck I have, but if I can get a brand new 2023 one for free, I would?

I'm not sure they should have better performance. I'm speculating, but I highly suspect that's the case. Instincts/intuition are a part of it, but the technical reason was that the first comparison I did was between the 5500 and the 5800X3D, and the YouTube videos I saw showed them being on par. This is what led me to believe he should be getting higher results, comparable to the 5800X3D. I understand this is counterintuitive, since the 5800X3D is supposed to trump anything without 3D cache, so how could they be equal? I was going off those videos.

Now that I saw the other videos showing the 5800x3d beat the 12400f which I saw beat the 5500, I'm starting to re-think my line of thought. That's why I said I was confused up to now.

What data points are you looking at? If you mean all the numbers/digits/features and paperwork of what is put into each chip, I don't look at that. I just look at end-result benchmarks/real-life comparisons, which is the same thing via a faster route/shortcut. That's the crux of the matter, and in the end the specs don't mean much.
Okay, for Atomic Heart:
https://www.youtube.com/watch?v=iC5p7YweFxM
I'm getting 20-30 less FPS with the settings at 09:24.

For A Plague Tale: Requiem:
https://www.youtube.com/watch?v=JLThToV5SB0&t=1s
I'm getting 5-10 less FPS with the options at 7:15.

And for Watch Dogs 2:
https://www.youtube.com/watch?v=wf5JInVJg90
I'm getting 10-15 FPS more. This one is a different CPU, but still.

For Battlefield 2042:
https://www.youtube.com/watch?v=WD2__hROy8w&t=369s
The game in that video is on a different version; it's Season 3 there, and they've since released one of their biggest updates so far (it's Season 4 now), so I think they messed something up.

After personally trying their settings, I decided I'm totally fine with my CPU, right? The problem is probably Battlefield.
Αναρτήθηκε αρχικά από Gökyüzü:
Okay, for Atomic Heart:
https://www.youtube.com/watch?v=iC5p7YweFxM
I'm getting 20-30 less FPS with the settings at 09:24.

For A Plague Tale: Requiem:
https://www.youtube.com/watch?v=JLThToV5SB0&t=1s
I'm getting 5-10 less FPS with the options at 7:15.

And for Watch Dogs 2:
https://www.youtube.com/watch?v=wf5JInVJg90
I'm getting 10-15 FPS more. This one is a different CPU, but still.

For Battlefield 2042:
https://www.youtube.com/watch?v=WD2__hROy8w&t=369s
The game in that video is on a different version; it's Season 3 there, and they've since released one of their biggest updates so far (it's Season 4 now), so I think they messed something up.

After personally trying their settings, I decided I'm totally fine with my CPU, right? The problem is probably Battlefield.

In that case, and after realizing the 5500 isn't as close to the 5800X3D as I thought, I suppose I'll just accept that you have reasonable performance.

I think in time as you format and learn more about your new configuration, you'll probably even be able to increase it.
Originally posted by emoticorpse:
Well, I mean, I'm fine with the truck I have, but if I can get a brand new 2023 one for free, I would?
I don't see how that's analogous though o.o?

Accepting a free improvement with a physically faster thing is different from saying something is underperforming for what it is.
Originally posted by emoticorpse:
I'm not sure they should have better performance. I'm speculating, but I highly suspect that's the case. Instincts/intuition are a part of it, but the technical reason was that the first comparison I did was between the 5500 and the 5800X3D, and the YouTube videos I saw showed them being on par. This is what led me to believe he should be getting higher results, comparable to the 5800X3D. I understand this is counterintuitive, since the 5800X3D is supposed to trump anything without 3D cache, so how could they be equal? I was going off those videos.
I'd never expect a 5500 to be equal with a 5800X3D. The 5800X3D's baseline performance is that of the 5800X, and it can be up to much faster. The 5500, meanwhile, is slower per core (the lack of cache hurts) AND missing 25% of the cores relative to the 5800X. In other words, you're comparing a sub-Zen 3 baseline to an above-Zen 3 baseline (on average), and you therefore can't expect them to be the same. It's not like, say, comparing a 5600X and 5900X and saying "they can be close to equal", which COULD be true as they're both roughly baseline Zen 3 performance. This is comparing two extremes relative to Zen 3 baseline performance, and the 5800X3D in particular is an inconsistent wild card.

That said, I also tried searching 5800X3D reviews and hoped to find stuff that included Battlefield 2042 to see if it was a title that benefited from the cache, but little turned up. I don't pay attention to Battlefield games but is this not a new(ish) title? I never saw it included, only Battlefield V here and there.

And worse, modern reviews seem to use smaller and smaller pools of games for their tests compared to years past, which makes this harder to find than I thought.
Originally posted by emoticorpse:
What data points are you looking at?
None.

I didn't have a firm idea in mind that there SHOULD be a given gap. But I also didn't have a firm idea in mind that there SHOULDN'T be, which is why I asked why you were expecting OP's results to match one random YouTube video's results when there are no doubt variables at play.

(By the way, I wasn't questioning you to shut down your exploration of questioning IF the gap should or shouldn't be there, so feel free to question it and explore that, but I was just asking if you yet had firm reason to believe they should perform the same or not, beyond "I feel like they should" because I merely wanted to point out that's far from good enough.)
Originally posted by emoticorpse:
I just look at end-result benchmarks/real-life comparisons, which is the same thing via a faster route/shortcut. That's the crux of the matter, and in the end the specs don't mean much.
Right, as do I.

By "data point" I meant that particular Youtube video you were referencing and using for grounds of "you should be getting more according to this example", when said example (said "data point") had variables, both known and unknown. And those have to be accounted for.
Last edited by Illusion of Progress; 10 Mar 2023, 16:18
Originally posted by Illusion of Progress:
Originally posted by emoticorpse:
Well, I mean, I'm fine with the truck I have, but if I can get a brand new 2023 one for free, I would?
I don't see how that's analogous though o.o?

Accepting a free improvement with a physically faster thing is different from saying something is underperforming for what it is.
Originally posted by emoticorpse:
I'm not sure they should have better performance. I'm speculating, but I highly suspect that's the case. Instincts/intuition are a part of it, but the technical reason was that the first comparison I did was between the 5500 and the 5800X3D, and the YouTube videos I saw showed them being on par. This is what led me to believe he should be getting higher results, comparable to the 5800X3D. I understand this is counterintuitive, since the 5800X3D is supposed to trump anything without 3D cache, so how could they be equal? I was going off those videos.
I'd never expect a 5500 to be equal with a 5800X3D. The 5800X3D's baseline performance is that of the 5800X, and it can be up to much faster. The 5500, meanwhile, is slower per core (the lack of cache hurts) AND missing 25% of the cores relative to the 5800X. In other words, you're comparing a sub-Zen 3 baseline to an above-Zen 3 baseline (on average), and you therefore can't expect them to be the same. It's not like, say, comparing a 5600X and 5900X and saying "they can be close to equal", which COULD be true as they're both roughly baseline Zen 3 performance. This is comparing two extremes relative to Zen 3 baseline performance, and the 5800X3D in particular is an inconsistent wild card.

That said, I also tried searching 5800X3D reviews and hoped to find stuff that included Battlefield 2042 to see if it was a title that benefited from the cache, but little turned up. I don't pay attention to Battlefield games but is this not a new(ish) title? I never saw it included, only Battlefield V here and there.

And worse, modern reviews seem to use smaller and smaller pools of games for their tests compared to years past, which makes this harder to find than I thought.
Originally posted by emoticorpse:
What data points are you looking at?
None.

I didn't have a firm idea in mind that there SHOULD be a given gap. But I also didn't have a firm idea in mind that there SHOULDN'T be, which is why I asked why you were expecting OP's results to match one random YouTube video's results when there are no doubt variables at play.

(By the way, I wasn't questioning you to shut down your exploration of questioning IF the gap should or shouldn't be there, so feel free to question it and explore that, but I was just asking if you yet had firm reason to believe they should perform the same or not, beyond "I feel like they should" because I merely wanted to point out that's far from good enough.)
Originally posted by emoticorpse:
I just look at end-result benchmarks/real-life comparisons, which is the same thing via a faster route/shortcut. That's the crux of the matter, and in the end the specs don't mean much.
Right, as do I.

By "data point" I meant that particular Youtube video you were referencing and using for grounds of "you should be getting more according to this example", when said example (said "data point") had variables, both known and unknown. And those have to be accounted for.

Well as far as the analogy goes, I'll try again. I guess I would say "I'm fine with my truck's performance but if I can get higher mileage per gallon I would"?

Entering this thread, I just had a rough assumption that the 5500 would be maybe 15% slower than the 5800X3D (I don't think I'd ever actually realized the 5500 even existed before now). That's what I kind of thought, and then I saw the 5500 and 5800X3D on par in the videos (lol, yeah, I know how the internet works and you have to take everything with a grain of salt).

I do know the 5800X3D offers a performance leap, but I figured it was mostly relative to earlier CPU models like the 3xxx series. I didn't really think it was that much more powerful than anything in the same 5xxx line of CPUs.

By the way, I do respect the variables you're talking about. It might not seem like it, but I have kept that in mind through the course of this thread. I understand his list of installed software isn't going to match up with whoever's video we're watching at the moment, or his motherboard model, or his RAM, or his startup or any open programs, and all of that leads to possible differences. But for the sake of simplicity I leave all that out. The thread would already be 10 pages long if we seriously went into every BIOS setting, NVIDIA Control Panel setting, or driver version. I get it.

But at this time I am honestly getting a little worn out from the research in this thread. I'm satisfied with "his PC is working well enough". I still would love to get my hands on it and give it a shot myself. That might be my overconfidence, though, just like I got proven wrong in the UserBenchmark idle CPU usage matter.
No no no, I totally got what you meant with the example of the truck. I would take free improvements, too.

I was just confused because that doesn't seem analogous to "that same thing should perform better than it is".

And yeah, even though I was approximating too, just like you were, I had a feeling you were overestimating the 5500 and/or underestimating the 5800X3D, so I knew that some gap was likely to be expected between them, and that's before factoring in other variables, of which I would expect there to be some (if not many). If OP isn't getting MASSIVE differences with stuff they can rule out variables on, and is okay with the performance they are getting, then all seems well to me.

But yeah, again, I wasn't trying to step on your toes of trying to figure out IF OP could get more out of things, so feel free to explore those avenues if you so wish. I just wanted to add that the difference may not necessarily be unexpected (and some level of one would certainly be expected).
Last edited by Illusion of Progress; 10 Mar 2023, 16:37
I mean, I was also expecting to see some differences between CPUs; that's why I was considering buying a new one. But Battlefield is just wild. For the other games that I have tested, the difference is nothing more than 10-15%, which is totally fine for me, so I'll carry on with what I have.

I made the upgrades about 9 days ago and Battlefield's update came about 11 days ago. I played Battlefield 2042 for about a week and tested a lot of different options, but no matter what option I tried I just couldn't feel comfortable. And the weirdest one is DLSS: it's supposed to give me more FPS and yet somehow it gives me less. I've played 10 or maybe 20 games with ray tracing so far and none of them were like this; DLSS always gives me more FPS. I was also getting more FPS with the 2060 + DLSS, but that was before the update. I'm still not sure if it's because of the new update or not.

Compared to that video I'm still getting 10% less FPS at low settings, but at ultra settings the gap becomes more than 50%.

I'm still hoping it might work if I try again, but the servers are down right now so I can't even try it. I'm going to test it tomorrow, and if it doesn't work I'm going to contact support.

Edit: By the way, thanks for your interest; if it weren't for you guys I wouldn't even have bothered testing the other games.
Last edited by Gökyüzü; 10 Mar 2023, 16:57
The Ryzen 5 5500 can sometimes be better than the R5 3600, but it can sometimes be slower than it too.
The 16 MB level 3 cache does hurt its performance.

https://www.youtube.com/watch?v=4JebBhH-B88

I think even the Ryzen 7 5700X would be a good upgrade over the R5 5500.
Today I actually got the chance to test this again. Apparently that video misguided me, because I'm also getting around 10-15% less FPS on the same map against bots; with real players it might change, I'm not sure. FPS around 60-70 usually happens on some of the other maps (there are 10 more of them), and the map in the video is the most optimized one. And the FPS is still all over the place, fluctuating from 70 to the 170s; it's never stable.

After this I finally realized there is nothing wrong with my components; it's all because of Battlefield 2042.

I'm still going to upgrade to 2K though, I'm not happy with my monitor :)
Yeah, certain games, like ones with random components, are hard to benchmark. Stuff with multiplayer, Minecraft, etc.; you don't see performance numbers for those as often, for that sort of reason.

So we found out what it was, which, like I said in the beginning, was "variables". If you can't "control your variables", then you can't apply your results to another data point and expect 1:1 results.
I would first start by moving up to 1440p at 144 Hz or higher.
If not G-Sync, then something with FreeSync Premium that is G-Sync compatible.

Then run the OS + games at 1440p.

If need be, you can apply a max FPS cap within NVIDIA Control Panel, as not all games will run butter-smooth when left uncapped.

Then see if a better CPU is really needed.
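For a rough idea of what a frame rate cap does (the actual cap set in NVIDIA Control Panel is applied at the driver level; this is only a conceptual sketch with made-up numbers), a limiter is essentially a loop that pads every frame out to a fixed time budget so frames arrive evenly instead of in bursts:

```python
import time

TARGET_FPS = 141               # e.g. a few FPS below a 144 Hz refresh rate
FRAME_TIME = 1.0 / TARGET_FPS  # time budget per frame, in seconds

def render_frame() -> None:
    """Stand-in for one frame of simulation + rendering work."""
    time.sleep(0.002)

for _ in range(10):
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of this frame's budget so frame times stay even.
    remaining = FRAME_TIME - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)
```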
I recently encountered a situation where I had 40 FPS in Helldivers 2 with an RTX 4070 Ti Super video card at Full HD, a 5600X processor, and 3600 MHz RAM. That is what I'd really call a bottleneck. I started thinking about upgrading my processor, but I don't want to overpay for a 5800X3D, because I think my current 5600X just doesn't have enough cores, and a 5800X could solve this problem.
Originally posted by Vaitness:
I recently encountered a situation where I had 40 FPS in Helldivers 2 with an RTX 4070 Ti Super video card at Full HD, a 5600X processor, and 3600 MHz RAM. That is what I'd really call a bottleneck. I started thinking about upgrading my processor, but I don't want to overpay for a 5800X3D, because I think my current 5600X just doesn't have enough cores, and a 5800X could solve this problem.
First of all, you didn't need to revive a thread that's over a year past its last comment, and second, 40 FPS with those specs is not a bottleneck situation at all; that's something like drivers not working correctly.
The Ryzen 5500 is roughly equivalent to a Ryzen 3600 or a 10700K. The 4070 Ti was released at least an upgrade cycle past those processors and it's at the upper end of the NVIDIA stack. The 5500 is at the lower end of the AMD stack, and was released just a year before the 4070 Ti. It's really only natural that these products would be mismatched.

The 5800X3D is arguably the best processor your motherboard can support, but it is too expensive for what it is. If you pick up a 5700X3D you're only losing about 5% performance,[gamersnexus.net] and only spending $192[www.amazon.com], whereas a 5800X3D will currently cost you about $335. That's a ridiculous cost differential for such a small difference in performance.
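A quick back-of-the-envelope check of that cost differential, using the prices and the roughly 5% figure quoted above (assumed values, so treat this as a sketch rather than current pricing):

```python
# Prices and the ~5% performance gap are taken from the post above (assumed).
price_5700x3d = 192.0
price_5800x3d = 335.0
relative_perf_5700x3d = 0.95   # ~5% slower than the 5800X3D

extra_cost = price_5800x3d - price_5700x3d          # $143 more
extra_perf = (1.0 / relative_perf_5700x3d) - 1.0    # ~5.3% faster

print(f"${extra_cost:.0f} extra for about {extra_perf * 100:.1f}% more performance")
# -> $143 extra for about 5.3% more performance
```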

The 5700X3D is probably your best bet if you insist on sticking with AM4, but it should be noted that the 3D V-Cache chips don't universally outperform their non-3D V-Cache counterparts. The improvements scale with how well the extra cache is actually utilized, plus they come with locked clocks and a lower base clock speed to keep the thermally sensitive V-Cache safe.

But really, with a 4070 Ti, I'd be looking to jump to AM5 when the Ryzen 9000 series releases at the end of the month. Even if you don't buy a 9000 series processor, the whole market's going to be shaken up and you can probably snag some good deals on 7000 series processors. Maybe LGA 1700 prices will sink too, if you really want to keep your existing DDR4 RAM, but it's harder to say.
Last edited by Tonepoet; 13 Jul 2024, 1:12
Originally posted by Tonepoet:
The Ryzen 5500 is roughly equivalent to a Ryzen 3600 or a 10700K.
The R5 3600 is more comparable to an 8600K; it's the 5600 that compares rather evenly with the 10700K.

The 1000 series was up to around Devil's Canyon performance, the 2000 series was around Skylake, the 3000 series was around Coffee Lake, and the 5000 series was highly competitive with Comet Lake.

Date posted: 10 Mar 2023, 10:06
Posts: 84