Your "sources" are dumber than a box. Please give me a logical explanation as to how CPU load decreases at higher resolutions. If you're getting lower FPS, it can be because of EITHER the CPU OR the GPU.
As you can see in the video, increasing the resolution increases GPU usage and lowers CPU usage. At 4K you're completely GPU bound with a 1080 Ti unless the game is very CPU demanding (such as AC Origins).
Again, just because you saw 10% lower CPU usage when you increased the resolution doesn't mean anything. If you understand these things, you'll know there are factors that can actually add up to EVEN higher CPU load once you increase the resolution. Open world games, where you need to stream in a lot of assets, are good examples of this.
You were loading into the map, maybe other things were running in the background... many factors can lead to that. The moral of the story is that, looking at the GamersNexus charts, you can see those chips, even overclocked, can barely hold 60 FPS in CPU intensive games.
I'd agree with Infinity Josh in this case: just get a 1070 and you should be good to go. A 1080 Ti will look ludicrous with that CPU, and you'll hit diminishing returns in BF V.
*Facepalm* indeed. Anyone who thinks increased resolution will in one way or another DECREASE the load on ANYTHING doesn't know what they're talking about. All a higher resolution does is tax your system MORE, NOT LESS. Duh.
But an aftermarket GPU will allow it to run Windows.
First, I would like to ask OP two things before answering with my non-expert but more down-to-earth advice, rather than a bunch of numbers.
FIRST: What exactly do you want to put that CPU and GPU combo through? Casual stuff? 1080p gaming? 4K? Super extreme lol godly frames per second everywhere?
SECOND: Would you be willing to swap to a newer CPU (plus, most likely, a new motherboard and maybe new RAM) if you do happen to want a stronger setup? Or are you fine and well with your CPU and just want a solid GPU?
BASED ON THAT, a more appropriate answer might come from the pros "discussing" this whole spaghetti of numbers, cores and bottleneck scenarios.
I am posting here because I'm also mildly curious. I have a 3770K as well and I totally love it. I paired it with a 1060 because of budget issues, but I might have bought a 1070 Ti if I'd had the cash on hand. Still, my 1060 does exceptionally well for what I'm using it for: good old 1080p casual gaming, plenty of old games, plus a single-core-intensive Korean MMO or two.
Intel Turbo peaks it at around 4.1 GHz; no other kind of OC needed.
The best you can afford and feel happy with. The better the GPU, the longer it will last before an upgrade is needed.
I use a 1080 Ti and an i7-2600 at 4.3 GHz with a 1080p 120 Hz monitor. Most newer games will push the 1080 Ti to 100% usage even at 1080p, producing higher framerates. (The 3770K is about 5% faster than the i7-2600.)
A better GPU also enables the use of Nvidia's DSR, where you can run games at 4K on a 1080p monitor. This can produce a better image, as the downsampling filter has more pixels to work with.
And if you ever go 1440p or higher, you won't need a new GPU.
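To make the DSR point concrete, here's a minimal Python sketch of the idea behind downsampling. This is not Nvidia's actual implementation (DSR uses a tunable Gaussian filter; this uses a plain 2x2 box average), and the random framebuffer is just a stand-in for a rendered frame.

import numpy as np

# Hypothetical 4K framebuffer (2160 x 3840, RGB); random noise stands
# in for a rendered frame.
frame_4k = np.random.rand(2160, 3840, 3)

# 4K is exactly twice 1080p in each dimension, so each output pixel
# averages a 2x2 block of rendered pixels.
frame_1080p = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

print(frame_1080p.shape)  # (1080, 1920, 3)

Four rendered pixels feed every displayed pixel, which is where the extra image quality comes from.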
I think it's possible. Queuing Theory states that it would be.
(I used to do performance management for everything from Oracle Superclusters to Windows servers. Before that, Windows development.)
At 4K the GPU load per frame is usually higher than at lower resolutions, so framerates at 4K will be a lot lower than at 1080p.
Consequently the CPU can have less work to do, as it only needs to produce frames as fast as the GPU's buffer can accept them. The part of the game engine that talks to the DX DLL is likely to run slower: it only needs to update the frame buffer when a frame is cleared. So a small decrease in CPU usage can be observed due to the lower framerate.
The effect depends on the CPU, GPU and game, and is difficult to predict. No two systems ever perform identically.
But monitoring tools never lie (except on Ryzens and laptop GPUs).
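To illustrate the frame-pacing argument with a toy model (nothing here comes from a real engine; the 4 ms per-frame CPU cost and both FPS caps are made-up numbers), here's a Python sketch where the CPU does a fixed amount of work per frame and then waits for the next frame slot. The busy fraction tracks the framerate cap:

import time

def simulate(fps_cap, cpu_ms_per_frame=4.0, seconds=2.0):
    # Burn cpu_ms_per_frame of CPU time each frame, then sleep until
    # the next frame slot. Returns the fraction of time spent busy.
    frame_budget = 1.0 / fps_cap
    busy = 0.0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        t0 = time.perf_counter()
        while time.perf_counter() - t0 < cpu_ms_per_frame / 1000.0:
            pass  # stand-in for game logic and draw call submission
        busy += time.perf_counter() - t0
        leftover = frame_budget - (time.perf_counter() - t0)
        if leftover > 0:
            time.sleep(leftover)  # "GPU can't take another frame yet"
    return busy / (time.perf_counter() - start)

# Same per-frame CPU work; only the achievable framerate differs.
print(f"1080p-ish (144 FPS cap): ~{simulate(144):.0%} of one core")
print(f"4K-ish (40 FPS cap): ~{simulate(40):.0%} of one core")

With these numbers the busy fraction drops from roughly 58% to 16%, purely because fewer frames are being produced.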
kind regards
CPU usage goes down because FPS goes down, and with it the number of draw calls the CPU has to issue. If you ran the same FPS at 4K as you do at 1080p, then CPU usage would stay the same.
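A back-of-envelope version of that, with made-up numbers (5 ms of CPU work to build and submit each frame):

cpu_ms_per_frame = 5.0  # assumed, constant across resolutions

for label, fps in [("1080p @ 120 FPS", 120), ("4K @ 45 FPS", 45)]:
    # Draw-call work scales with frames submitted, not with pixels.
    print(f"{label}: {cpu_ms_per_frame * fps / 1000:.0%} of one core")

Same per-frame cost, fewer frames, lower CPU usage; cap 4K at 120 FPS (if the GPU could deliver it) and the two numbers would match.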