To be fair though, I have an old-ass i7-6800K. I run it overclocked at 4.4GHz all-core, and it makes the 4600G look like a toy even though they're both 6c/12t CPUs. Add the fact that mine is an HEDT chip with quad-channel memory support, and the gap widens further.
Mostly, though, it comes down to Ryzen having cheaped out on L3 cache in those budget models. Mine has 15MB of L3; a 4600G has 8MB, and that basically neuters it. It ends up about half the chip, more on par with an i7-6700, while mine is closer to an i7-8700K, even though both run at similar clocks and the 4600G has a large age and architecture advantage. It's still just an OK chip because it's missing the L3 cache it would need to be a powerhouse like the 5600X (which has 32MB of L3).
Look at how the upcoming Ryzens with 3D V-Cache simply stack more cache onto existing dies, and how that alone lets those refresh chips greatly surpass everything up to and including the R9 5950X.
That lightning-fast memory used as L3 makes or breaks a CPU. You have to look beyond clock speeds and even IPC to see a CPU's real performance. Cache size is one of the key factors for CPUs, just like memory bandwidth is for GPUs.
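The cache argument above is really about locality of access. Here's a minimal Python sketch (assumption: a plain list-of-lists stand-in for memory; in CPython the interpreter overhead mutes the effect, so in compiled C/C++ the gap is far more dramatic) comparing a cache-friendly traversal with a cache-hostile one doing the exact same work:

```python
import time

# Build a reasonably large 2D grid (list of lists).
N = 1500
grid = [[(i * N + j) % 97 for j in range(N)] for i in range(N)]

def sum_row_major(g):
    """Walk elements in the order rows are laid out: cache-friendly."""
    total = 0
    for row in g:
        for v in row:
            total += v
    return total

def sum_col_major(g):
    """Jump between rows on every single access: cache-hostile."""
    total = 0
    n = len(g)
    for j in range(n):
        for i in range(n):
            total += g[i][j]
    return total

t0 = time.perf_counter()
a = sum_row_major(grid)
t1 = time.perf_counter()
b = sum_col_major(grid)
t2 = time.perf_counter()

assert a == b  # identical work, identical answer
print(f"row-major: {t1 - t0:.3f}s  column-major: {t2 - t1:.3f}s")
```

Same instructions, same data, same clocks; only the access pattern (and therefore how well the cache is used) differs. That's the kind of gap more L3 papers over.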
Yeah, you mean in task manager? It's been wrong forever. Use MSI Afterburner or EVGA Precision XOC to monitor.
If you're getting more than 6-10 FPS on a GTX 1070 Ti, then you're realistically not at 10% GPU usage. You'd be getting a slideshow.
PC port is broken.
How are you monitoring your GPU usage, though? Like I said above: 10% usage would be absolutely abysmal. You'd know it right away.
Too many people think that Windows Task Manager is accurate. IT IS ABSOLUTELY NOT.
It also tells me that my GTX 1080 is only running at 10-15% while I'm getting 50-60 FPS. That is just not even close to being realistic. I'd be lucky to see 20 FPS at 10% usage.
Meanwhile in Afterburner, I see 85-100% usage.
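If you want a second opinion besides Afterburner on an NVIDIA card, you can cross-check with the `nvidia-smi` command-line tool that ships with the driver. A small sketch (the `parse_utilization` helper and the sample string are my own illustration; the `nvidia-smi` query flags are real):

```python
import subprocess

def parse_utilization(csv_text: str) -> list[int]:
    """Parse 'utilization.gpu' percentages from nvidia-smi CSV output
    (one bare integer per line, one line per GPU)."""
    return [int(line.strip()) for line in csv_text.strip().splitlines() if line.strip()]

def query_gpu_utilization() -> list[int]:
    # Requires an NVIDIA driver; these flags ask for utilization only,
    # without the CSV header or the '%' unit.
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_utilization(out)

# Example of the text nvidia-smi emits on a single-GPU system:
sample = "87\n"
print(parse_utilization(sample))  # -> [87]
```

If that number agrees with Afterburner and disagrees with Task Manager, you know which one to trust.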
If your FPS can't improve anyway, use a custom resolution or DSR to play at a higher resolution instead.