Granted I am not running at low quality.
Do you mean that on average *all* of your 16 cores / 32 threads show 50% usage? If so, that would actually be using a LOT of CPU time, only comfortable because you have so many cores. If not, then I don't understand what kind of CPU load you do have. Not that it's always easy to describe in few words, I know. :)
Why are you asking me this? I specifically said "never had a *thread*", so I'm clearly not talking about the cores' averaged usage (which never went beyond 15% and most of the time floats around 10%).
Edit: here's an example: https://i.imgur.com/UCGq7Yk.jpg
Thank you. I was asking this for two reasons.
One is that a lot of "people on the Internet" use core and threads interchangeably, and I didn't know you, so I thought I'd politely ask for confirmation.
The other is that, in my personal experience, I find it easier (or rather, less hard) to understand how an application uses the CPU by first looking at how many cores are used, before looking at threads.
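To illustrate why the cores-vs-threads distinction matters, here's a toy example (the numbers are invented, not from anyone's actual system): on a 16-core/32-thread CPU, a single fully saturated thread barely moves the overall average, so "low average usage" and "a pegged thread" can both be true at once.

```python
# Hypothetical per-logical-CPU usage samples (percent) on a 32-thread CPU:
# one worker thread is pegged at 100% while everything else mostly idles.
per_thread = [100.0] + [5.0] * 31

aggregate = sum(per_thread) / len(per_thread)  # what "overall CPU %" shows
busiest = max(per_thread)                      # what a per-thread view shows

print(f"aggregate usage: {aggregate:.1f}%")    # ~8.0% -- looks nearly idle
print(f"busiest thread:  {busiest:.1f}%")      # 100.0% -- a real bottleneck
```

So a game can be thread-bound (one main thread maxed out) while the whole-CPU graph stays in the single digits.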
Your screenshot provides a fair bit of information, thanks! May I ask what GPU you used here? And what tool did you use for this overlay?
That's on the table, but... currently TTP2 is the only app I use that *could* benefit from the full upgrade, so I am still pondering if there is a cheaper way out.
Predictions are hard to do though, so in the end I'll probably just buy a used mid-range GPU and see how it goes.
The 6700 XT ($300) is generally 3-4X faster than your RX 470, and the $500 7800 XT is 4.5-5X faster, with 3-4X the VRAM (4GB vs 12 or 16GB)
Additionally, on the CPU side, even a low-end 7600X ($230) is about 5-6X faster in multicore performance (and at least 2X faster singlecore, or more), with RAM that has 4-5X the bandwidth (DDR5-6000 vs DDR3-1333/1600)
In short:
$1200 PC will be
CPU: 6X faster in multicore (most games now want 12 threads to 16 at 5GHz clocks for 1080P 144hz+) and 2X faster in singlecore
GPU: 4-5X faster depending on where your budget goes, and will let you run 1440P 90fps, especially with FSR3 or DLSS2+FrameGen
RAM: 32GB is more common and honestly the minimum for any new build; it's $100ish
USB3: 5-10X more bandwidth, not to mention PCIE4 and PCIE5 SSDs being 10x faster sequential read and write on good drives (SATA III 600MB/s vs 6000MB/s NVME)
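The rough math behind the bandwidth multipliers above can be sketched out (assuming dual-channel, 64-bit-per-channel configs for both DDR generations; real-world numbers vary with timings and channel count):

```python
# Theoretical peak bandwidth for a dual-channel memory config, in GB/s:
# transfers/s * 8 bytes per 64-bit transfer * 2 channels.
def dual_channel_gbps(mt_per_s):
    return mt_per_s * 8 * 2 / 1000

ddr3_1333 = dual_channel_gbps(1333)  # ~21.3 GB/s
ddr3_1600 = dual_channel_gbps(1600)  # 25.6 GB/s
ddr5_6000 = dual_channel_gbps(6000)  # 96.0 GB/s

print(f"vs DDR3-1333: {ddr5_6000 / ddr3_1333:.2f}x")  # ~4.5x
print(f"vs DDR3-1600: {ddr5_6000 / ddr3_1600:.2f}x")  # 3.75x

# Storage: SATA III ceiling vs a good PCIe 4.0 NVMe drive (sequential MB/s)
sata, nvme = 600, 6000
print(f"sequential storage: {nvme / sata:.0f}x")      # 10x
```

So the "4-5X" RAM figure holds against DDR3-1333 and is closer to ~3.75X against DDR3-1600, and the 10X storage claim is just the SATA III ceiling vs a 6000 MB/s NVMe drive.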
In other words... yeah, you're just going to keep feeling this and there's nothing you can do other than play Optifine Minecraft or Portal 2 for puzzles (and Talos 1 ofc)
You're right, most of the time people complain and/or make blanket statements without having the knowledge or the will to do some basic research first.
Anyway, back on topic: I have a 7900 XTX (core undervolted to 1090mV, VRAM OC'd to 2700MHz). That screenshot was taken at 5120x1440 resolution, with the game set to Native + FSR and every graphics option at max, except Global Illumination set to High instead of Ultra (I couldn't see any real difference between the two settings, except the higher frametime for the latter).
The overlay is CapFrameX, updated to the latest beta that implements the new GPU Busy stat from Presentmon.
As you can see, the game is very light on newer CPUs. I was surprised by that other user who reported their 9700 at 100% in conjunction with a 7900 XT; unless they're playing at 1080p, which would explain the high CPU usage, I didn't think UE5 could saturate a 9700.
But yeah, as others have said here, I guess it's time to upgrade your PC; for all intents and purposes it's really old and won't be able to run any UE5 game properly (UE5 is already very heavy on my machine lol).
Getting a whole new PC would probably be cheaper than purchasing an old-socket CPU that's still available, or a low-end GPU lol
Are you running that 3570k at stock?
I've had mine at 4.7GHz on all cores, but you should easily be able to get at least 4.5GHz with a half-decent cooler. That should give you a noticeable performance boost.
I kept using mine paired with a GTX 980 Ti up until the end of 2020, when I upgraded for Cyberpunk 2077. It was really starting to show its age, but it still got me through Control, ME Andromeda and other stuff from that time. Still have it in a box somewhere.
If you find a dirt-cheap i7-3770K you could replace the CPU, but only if it's dirt cheap. Otherwise it's not worth investing money into a platform this old imo.
Ah, you gave me hope! But after a number of tries I couldn't get better than 4GHz (unstable or too hot at higher freqs/voltages)... not bad for a free upgrade, but 4.5 sounded VERY nice. :)
Thanks for the tip anyway, I might give it another try when the full game is released.
Sure, it's the end of the line for this CPU/MB combo. 10 years of good use isn't too bad. Maybe it'll go to eleven... ;)