You'd most likely get 200-250 FPS easily once you move to 1440p, as your CPU will be able to catch up with your GPU. Of course, there may be a 5-15 fps difference depending on the CPU and how it performs in different games.
7900 XTX reference model. He had thermal throttling and junction temperatures of 110°C. He took it apart, repasted it, put it back together, undervolted it by 10%, and has it paired with DDR5-5600 RAM and a Ryzen 7700X. Supposedly this is the perfect combination.
I am guessing your XTX is rasterizing a crapload of frames at 1080p and is overheating itself. Just cap your frame rate and undervolt the card; that is literally all you need to do until there are driver updates.
Hope this helped. If not, then you are experiencing a different issue. That CPU is going to hit the highest possible clock speed at all times; it is not bottlenecking the card at all. The GPU is likely just rendering way too many frames and is not in sync with the monitor.
The CPU serves as the baseline that computes everything. It passes information to the GPU on what to render, and the GPU just renders it. Ergo, if OP is getting a given performance level at 1080p, and the reason for not getting more performance is the CPU, that cannot change by raising the resolution. How could it? The CPU is already running at its limit. Raising the resolution doesn't increase the work the CPU can do; it merely increases the chance that the GPU is the bottleneck instead. Ergo, you'll get the same performance at best (if still CPU limited), or lower performance at worst (if now GPU limited), by raising the resolution.
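To put that in concrete terms, here's a toy model (all the numbers below are made up for illustration, not measurements): the CPU's cost per frame is roughly resolution-independent, the GPU's cost scales with pixel count, and the frame rate is capped by whichever is slower.

    # Toy bottleneck model: hypothetical numbers, for illustration only.
    # CPU work per frame is roughly resolution-independent; GPU work scales
    # with pixel count. FPS is capped by whichever side is slower.

    CPU_MS_PER_FRAME = 5.0    # hypothetical: CPU needs 5 ms per frame
    GPU_MS_PER_MPIXEL = 2.0   # hypothetical: GPU needs 2 ms per megapixel

    def fps(width: int, height: int) -> float:
        gpu_ms = (width * height / 1e6) * GPU_MS_PER_MPIXEL
        return 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)  # the bottleneck wins

    for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
        print(f"{w}x{h}: {fps(w, h):.0f} FPS")
    # 1920x1080: 200 FPS  (CPU-limited: the GPU only needs ~4.1 ms)
    # 2560x1440: 136 FPS  (now GPU-limited: ~7.4 ms)
    # 3840x2160: 60 FPS   (GPU-limited: ~16.6 ms)

Note how raising the resolution in this model never makes the CPU any faster; it just moves the limit onto the GPU.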
With higher-end GPUs, move toward 1440p (or 21:9 ultrawide) or higher.
With an RTX 3080/4080 or RX 6800/7800 or better, it's just foolish to run at 1080p as a daily-driver kind of config. Your GPU would be wasted and go mostly unused at that low a res.
I have to disagree with this, OP. These same people would come up with some ridiculous excuse for spending on a car that can do 120 MPH even though they've never once gone over 80, but when it comes to GPUs it's either run that sucker at 99-100% usage, simple as, or return it and get a slower one. That way, when you find that one game that makes your computer choke, you can sell? throw out? return? the 6700 XT and find another one that's just right for your games and nothing more.
I mean, it's like telling someone they shouldn't have a fridge if it's only half full. Or, I couldn't even imagine owning a stove if you don't even cook. The horror!
I'm being snarky and I don't want to be, but I really don't like the logic. I want to get a card that's more than I need so I'm ready for the future; I think that's smart.
But I'm open to discussion. Am I wrong? Set me straight.
The frame of reference you're responding to seems to come from the position of someone who upgrades frequently (or at least is willing to). In that case, there's definitely an argument to be made for not going far over what you need. But that's not everyone. Enthusiasts often forget that not everyone is like them. If you plan to stick with something for the longer term, which I think many people will given rising prices and slowing improvements, buying just enough right now won't be as good a fit.
Had a similar discussion with someone on the Asus forum about the then-latest BIOS update removing the ability to undervolt. Myself and others all complained, especially those who could not roll back, as that BIOS changed something that made rolling back no longer possible. The question put to us was: why do you all have such CPUs and boards if you're restricting their performance by undervolting?
The answer from all of us was:
1. We can undervolt and still get the advertised speeds, but with lower temperatures, thus lower fan RPM, thus less noise.
2. Better than needed means that when we need a boost we can boost it, versus buying a completely new component. It was £50 between the 12600K and the 12700K when I got mine; more than worthwhile to have better now for £50 than to have to buy a completely new CPU in the years to come. Coming from a 7700K, a stock 12700K is plenty without needing to be OC'd right now.
I apply the same logic to PSUs: more wattage than required, for a little more money, can save more in the long run. Especially if you get one with a 10+ year warranty. The last thing you want is to need to replace it when you upgrade something in 5 years.
What's odd for me is that both my CPU and GPU don't reach anywhere near 100% utilization in the games I'm underperforming in. Usually my CPU is around 40% utilization while my GPU is at 60-80%.
I tried Cyberpunk and I'm far lower in performance than Hardware Unboxed (as low as 60 fps in the more demanding areas with the "High" preset at 1080p). It should be noted this is the same engine as The Witcher 3 (except more up to date), so it could be argued that whatever's going on is tied to the engine and my particular setup for these two games.
What's REALLY weird is that when playing Warhammer 40,000: Darktide I'm getting around 90-115 fps (which is around the average for Hardware Unboxed at 1440p), BUT if I turn FSR 2.0 "Ultra Performance" on, the image quality is noticeably worse while the performance is the EXACT SAME. I get no performance uplift with FSR 2.0 on??? And again, CPU utilization is below 40% while GPU utilization is 60%... That ain't right.
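For context on why that result is telling: FSR 2's modes render internally at a fraction of the output resolution and then upscale. The per-axis scale factors below are AMD's published FSR 2 values, quoted from memory, so treat the exact numbers as an assumption:

    # FSR 2 per-axis scale factors (AMD's published values, from memory).
    FSR2_MODES = {
        "Quality": 1.5,
        "Balanced": 1.7,
        "Performance": 2.0,
        "Ultra Performance": 3.0,
    }

    def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
        scale = FSR2_MODES[mode]
        return round(out_w / scale), round(out_h / scale)

    print(internal_resolution(1920, 1080, "Ultra Performance"))  # (640, 360)

Rendering at 640x360 should massively cut GPU load, so if the FPS doesn't move at all, the GPU can't be what's limiting the frame rate; something upstream of it is.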
In contrast, when playing Metro Exodus: Enhanced Edition with Ray Tracing on Ultra (other settings on Ultra), I'm getting fairly decent performance (130-150 fps at 1080p) with 100% GPU utilization (and again CPU at 40% utilization; this is to be expected, as ray tracing is far more GPU-bound than CPU-bound, and in this situation the GPU is clearly the bottleneck).
Sorry guys, I'm flat out not buying that 1080p is bottlenecking me here (it doesn't even make sense!). I'm not about to spend 300-400 on a 1440p monitor just to come back and say my computer is underperforming compared to where it should be.
There has to be something else going on here. Either it's the RAM speed that's heavily impacting my performance, or there's something wrong with this GPU/CPU combo.
Since GPU and CPU utilization are both so low in these games, I'm left with the impression the RAM may actually be the bottleneck. Is it likely that these games (where ray tracing wouldn't make the GPU the bottleneck) are bottlenecked by the RAM? If my GPU can push very high frame rates, especially at 1080p, which it can, and my CPU is still one of the faster CPUs on the market and can also push high frame rates, then the old 2400 MHz RAM (3000 MHz OC) might be the culprit. Even going from 2400 MHz to 3000 MHz (OC) gained me 30+ FPS, and from what I've read the sweet spot for the Ryzen 5800X is 3600 MHz CL16 or 4000 MHz.
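As a rough sanity check on what those speeds mean, theoretical peak bandwidth for dual-channel DDR4 is just transfer rate x 8 bytes x 2 channels. Back-of-the-envelope only: real-world gains also depend heavily on timings, and on Ryzen on keeping the Infinity Fabric clock 1:1 with the memory clock, which is part of why DDR4-3600 is usually called the 5800X's sweet spot.

    # Theoretical peak DDR4 bandwidth: MT/s x 8 bytes per transfer x channels.
    # Rough figure only; real-world gains also depend on timings/latency.
    def ddr4_bandwidth_gbs(mts: int, channels: int = 2) -> float:
        return mts * 8 * channels / 1000  # GB/s

    for mts in (2400, 3000, 3600):
        print(f"DDR4-{mts}: {ddr4_bandwidth_gbs(mts):.1f} GB/s")
    # DDR4-2400: 38.4 GB/s
    # DDR4-3000: 48.0 GB/s
    # DDR4-3600: 57.6 GB/s

So 2400 to 3600 is a ~50% bandwidth jump on paper, which is consistent with memory speed mattering a lot in CPU-limited scenarios.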
Can you run a UserBenchmark as mentioned earlier? I mean, do you want help? It only takes a couple of minutes to do.
Also, considering both the GPU and CPU are not being fully utilized, it seems logical that that's the simple reason your performance is lacking. So the issue should now be narrowed down to figuring that out.
I mean, if you absolutely refuse to run a UserBenchmark, I understand, but can you at least say so? It does give some useful info, even though I'm aware there's basically a boycott against the site for their bias.
So a few things:
Yep, those temperatures are fine, so that rules out any possible overheating going on here.
For the Warhammer situation, it isn't weird at all that you don't see an increased framerate from turning on FSR; you would get the same performance if you ran the game at 800x600. Something in your system is already working flat out and holding the GPU back.
The 40% utilisation you are seeing on your CPU is the average utilisation over all cores and threads, so in your case over all 16 threads. I have an R9 5900X and it hardly ever goes over 30% utilisation in games; however, if I were to run them at 1080p I would see a CPU bottleneck, and I even see a bottleneck at 1440p occasionally. Many games use only a few cores: some will be heavily utilised while the rest remain unused by the game engine. This is incredibly common. Quite frankly, it's been a while since I've seen a CPU pegged at 100%, or even close to that, in any game.
Maybe look at CPU utilisation PER THREAD rather than overall; this may give some insight as to which threads are being utilised heavily.
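Any monitoring tool (Task Manager's logical-processor view, HWiNFO, etc.) will show this, but if you'd rather script it, a minimal sketch with the psutil Python package (assuming you have Python and psutil installed) would do it:

    # Print per-logical-core CPU utilisation once per second.
    # Requires: pip install psutil
    import psutil

    while True:
        # percpu=True returns one percentage per logical core (16 on a 5800X).
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" ".join(f"{p:5.1f}" for p in loads),
              f"| busiest thread: {max(loads):5.1f}%")

If one or two threads sit near 100% while the overall average reads 40%, that's your bottleneck.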
I think, from the specs you've provided, the memory seems to be the weakest link in your system. Might be worth upgrading, but then again I couldn't possibly say how much of an improvement you'd see, or whether it would get your FPS close to the Hardware Unboxed results. The only thing you can do is try it, really.
And yet, did you try:
Setting the GPU to a minimum clock rate of 2300 MHz+ in Radeon Control Center, to see how it affects performance?
Alternatively, turn on VSR and run at 1440p on your 1080p screen. No need to buy a 1440p monitor just to see if people are right or wrong; just run the tests at 1440p on the screen you already own.
Good idea.
I just tried this and can report that I have no performance difference between 1080p, 1440p, and 4K. It's always the same FPS regardless. The only time my FPS is impacted is when I switch to 8K (low 30s, versus the 90-110 in built-up areas and 150 in the less built-up outside areas that I get at 1080p/1440p/4K).
My clocks run at 2500-2600 MHz throughout with no downclocking.
Edit: In the outside areas, where my FPS at 1080p and 1440p is 150-160, at 4K it's 100-110 fps instead. In towns and built-up areas the FPS is the same across the board, except at 8K.
Edit 2: Also, looking at the performance gap in The Witcher 3 between the Ryzen 7 5800X and the Ryzen 7 5800X3D, Hardware Unboxed noted a 0% difference between the two at 1080p, 1440p, and 4K. So I doubt very much this is the result of a performance gap between the 5800X and the 5800X3D, at least specifically in relation to The Witcher 3.
Isn't the performance gain from using that only like 10-20% at most? Seems like that couldn't account for the large difference in FPS the OP is talking about.
OP... maybe you have a virus or malware or something that's eating up system resources?