It's kinda pointless to get cards like the RTX 4080 and 4090 for 1080p resolution; no matter how powerful the CPU is, it will hit its performance limit and can't spit out more frames, so to speak.
If it's not a single core CPU, which I'm really guessing it's not in this day and age and with an RTX 4080, then overall CPU use is almost pointless to look at. Looking at individual cores is necessary, but still not foolproof. All it takes is a single core at 100% (or sometimes not even that) to have a potential CPU bottleneck.
Overall CPU use as an indicator of whether the CPU is the limitation hasn't been accurate since single core CPUs.
Tarkov: 30-75% usage, 1080p maxed
Cyberpunk: 70-80% usage, everything high except RT Psycho, getting 80-120 fps
Minecraft: 15-50% usage, 32 chunks render distance with Chocapic shaders at maximum, on 1.19.1
The answer may be even simpler: OP should get a monitor that is 1440p or higher.
Stop using high-end GPUs with 1080p monitors, folks.
Is 80% your overall use? If so, you cannot use that to determine anything. You have to look at it on a per-core basis to better determine this, and even that isn't foolproof.
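If anyone wants to actually check this, here's a quick sketch (assuming Python with the psutil library installed) that prints overall load next to per-core load; run it while the game is running and the difference becomes obvious:

```python
# Sketch: overall CPU usage vs. per-core usage (assumes psutil is installed).
import psutil

# Overall usage averages all cores, which hides a single maxed-out core.
overall = psutil.cpu_percent(interval=1)

# Per-core usage reveals the core that may actually be limiting frame rate.
per_core = psutil.cpu_percent(interval=1, percpu=True)

print(f"Overall: {overall:.0f}%")
for i, load in enumerate(per_core):
    flag = "  <-- possible bottleneck" if load > 90 else ""
    print(f"Core {i}: {load:.0f}%{flag}")
```

On an 8-core/16-thread chip, one thread pegged at 100% still only shows up as ~6% overall, which is why the overall number tells you nothing here.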
Minecraft Java is largely single core limited. It IS multi-threaded, largely when the client first loads the game in the recent few versions (okay, that doesn't count) or when loading a lot of chunks fast. It also splits threads for dimensions and some other things. But Minecraft is MOSTLY single core limited in typical play, and almost always CPU bound. Versions 1.18 and newer are also rather heavy due to the terrain height changes.
But 32 render distance with shaders? Goooood luck! Every time you double the render distance, you quadruple the loaded chunks. The new simulation distance helps lower the CPU load from that somewhat, but with shaders that's still a high render distance.

Even shaders can be CPU limited in weird situations, especially if you're near a lot of entities and don't have a shader pack that disables entity shadows, because those became MUCH more taxing after 1.18 for me (the default video setting for this does NOT apply to shaders, so you NEED a shader pack that allows this to be adjusted). Found this out the hard way with BSL. Once I updated it and a new version added that setting, it was a MASSIVE difference. Easy way to find out: turn entity distance down from 100% to 50% and see if it does anything. If not, this isn't your issue.

In general, 32 render distance in 1.18 and up is harder, and shaders above 16 are already often hard (though I'm not sure whether an RTX 4080 should handle it or not).
So you could be CPU AND GPU limited here. Minecraft Java with shaders will vary a LOT. Like a LOT.
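To put numbers on the "double the render distance, quadruple the chunks" point above, a rough sketch assuming the usual (2r + 1)^2 square of chunks loaded around the player:

```python
# Rough sketch: loaded chunk count per render distance r,
# assuming the common (2r + 1)^2 square of chunks around the player.
for r in (8, 16, 32):
    chunks = (2 * r + 1) ** 2
    print(f"render distance {r:>2}: ~{chunks} chunks loaded")

# render distance  8: ~289 chunks loaded
# render distance 16: ~1089 chunks loaded
# render distance 32: ~4225 chunks loaded  (roughly 4x per doubling)
```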
The 10700K is not too slow for a 4080. However, at 1080p, the 10700K WILL bottleneck a 4080 in many instances, especially CPU intensive games. Thus, you would not see full GPU usage in those games.
A 10700K would do the same thing to a 3080. Not every game, but at 1080p, in many CPU intensive games, it WILL bottleneck the GPU. Use 1440p or 4K resolution, and those CPU bottlenecks will be mostly eliminated.
With that said, GPUs like the 4080 and 4090 are not really designed for 1080p. To get the most out of them, you want to be using higher resolutions like 1440p or 4K. Now, if you have a 1080p 240Hz or even 360Hz monitor and are trying to maximize it, then you will want a high-end GPU. However, you will also need one of the higher end CPUs to ensure that the CPU is able to push those high frame rates that the GPU is rendering.
Nothing wrong with getting a modern mid-range CPU and a modern high-end GPU for 1440p or 4K. But at 1080p, you will definitely run into CPU bottlenecks; at that resolution, a mid-range CPU will not be enough to push the GPU to its full extent.
If playing at 1080p 300+ Hz/fps, the CPU will be the bottleneck.
If at 4K resolution and/or multiple displays, the GPU will be the bottleneck with almost any modern CPU.
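To illustrate why resolution moves the bottleneck, here's a toy model with made-up numbers, assuming GPU frame time scales roughly with pixel count while the CPU-limited frame rate stays flat:

```python
# Toy bottleneck model (made-up numbers): actual fps = min(CPU-limited fps, GPU-limited fps).
# Assumes GPU frame time scales roughly linearly with pixel count,
# while CPU frame time is resolution-independent.
CPU_FPS = 160          # hypothetical CPU-limited frame rate (same at any resolution)
GPU_FPS_1080P = 400    # hypothetical GPU-limited frame rate at 1080p

resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, pixels in resolutions.items():
    gpu_fps = GPU_FPS_1080P * resolutions["1080p"] / pixels
    actual = min(CPU_FPS, gpu_fps)
    limiter = "CPU" if CPU_FPS < gpu_fps else "GPU"
    print(f"{name}: GPU could do {gpu_fps:.0f} fps, CPU caps at {CPU_FPS} "
          f"-> {actual:.0f} fps ({limiter}-bound)")
```

With these numbers, 1080p and 1440p end up CPU-bound at 160 fps while 4K drops to ~100 fps GPU-bound, which is exactly the pattern people are describing in this thread.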