How fully a game utilizes the resources of your setup depends on many factors: how well the game is optimized for your specific architecture, what resolution you're playing at, what graphics settings you're using, and so on.
Old games will hardly be able to saturate a modern system, and neither will new 2D games, for example.
A general piece of advice: if you're unhappy with the performance, tinker with your in-game settings. The diminishing returns of visual quality versus performance lost at higher settings are, most of the time, not worth it.
And as Snow wrote, 99% usage introduces many other unwanted factors.
With a decently balanced rig, a decently optimized game, and some basic knowledge, there is zero reason not to get every penny's worth out of your GPU and a high-Hz monitor.
I would also suggest capping your FPS below your panel's refresh rate using RTSS instead of VSync. Microstutters shouldn't happen at 99% usage within your panel's refresh rate; if they do, it's likely a CPU or VRAM limitation causing big FPS swings, something running in the background, or one of the many overlays people run at the same time.
https://www.youtube.com/watch?v=7CKnJ5ujL_Q
Here's another video from a well-known source. Unfortunately, they didn't understand the first video, so they have only one test with the GPU at 99%, which shows the same input latency despite a 30% FPS difference. The added latency at the higher framerate is the result of the GPU being at 99%.
https://www.youtube.com/watch?v=VtSfjBfp1LA
You can't technically use RTSS instead of VSync, because the two have nothing in common. VSync locks the graphics card's front buffer while the screen reads the image, to eliminate tearing. Some VSync techniques come with FIFO-queued triple buffering, which also limits the number of frames the card can push per second, but make no mistake: that's a side effect and should not be used as a frame-limiting technique. RTSS limiting sets the desired amount of time the CPU spends on a single frame, and each time the CPU finishes its job faster than that, RTSS inserts a little "busy-wait", so the CPU rests until the end of the frame and is able to produce the next one just in time, since it wasn't busy with anything else. This provides smooth and responsive gameplay, but does not eliminate tearing. Using RTSS and VSync together is the best option for smooth, tear-free gameplay.
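The busy-wait idea described above can be sketched in a few lines of Python. This is only an illustration of the concept, not RTSS's actual implementation; the limiter function, the 100 fps target, and the 2 ms stand-in workload are all my own assumptions:

```python
import time

def frame_limited_loop(target_fps, simulate_frame, n_frames):
    """Minimal busy-wait frame limiter in the spirit of RTSS.

    simulate_frame() stands in for the CPU's per-frame work; the
    limiter burns the leftover time so each frame completes exactly
    one frame interval after the previous one.
    """
    interval = 1.0 / target_fps
    next_deadline = time.perf_counter() + interval
    timestamps = []
    for _ in range(n_frames):
        simulate_frame()
        # Busy-wait: spin until the deadline instead of sleeping,
        # trading CPU time for precise frame pacing.
        while time.perf_counter() < next_deadline:
            pass
        timestamps.append(time.perf_counter())
        next_deadline += interval
    return timestamps

# Example: 5 frames at 100 fps with ~2 ms of "work" per frame.
stamps = frame_limited_loop(100, lambda: time.sleep(0.002), 5)
deltas = [b - a for a, b in zip(stamps, stamps[1:])]
print([round(d * 1000, 1) for d in deltas])  # each delta ≈ 10 ms
```

The key point is that the CPU finishes its work early every frame, so it is idle (rather than queuing extra frames) when the next frame is due.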
Oh, and if you limit your framerate below your refresh rate while VSync is on, you'll get constant stutters; e.g. 59 fps at 60 Hz will show one frame twice every second. You want the screen to show each frame for the same amount of time, i.e. 60/30 fps at 60 Hz, 144/72/36 fps at 144 Hz, and so on.
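That divisibility rule is easy to check mechanically. A tiny sketch (the helper names are mine, not from any tool mentioned in the thread):

```python
def even_cadence(fps_cap, refresh_hz):
    # Every frame is displayed for the same whole number of scans
    # only when the refresh rate is an integer multiple of the cap.
    return refresh_hz % fps_cap == 0

def repeated_frames_per_second(fps_cap, refresh_hz):
    # With VSync and a cap below the refresh rate, the leftover scans
    # are filled by showing some frames twice: 59 fps at 60 Hz repeats
    # one frame each second, which is the periodic stutter.
    return refresh_hz - fps_cap

for hz, cap in [(60, 60), (60, 59), (60, 30), (144, 72), (144, 138)]:
    print(f"{cap} fps @ {hz} Hz: even={even_cadence(cap, hz)}, "
          f"repeats/s={repeated_frames_per_second(cap, hz)}")
```

So 30/60 at 60 Hz and 36/72/144 at 144 Hz pace evenly, while 59 at 60 Hz and 138 at 144 Hz do not.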
That's partially true. Because the CPU doesn't know at the hardware level how many frames the GPU is rendering, even with the GPU at 99% the CPU keeps pushing as many frames as it can until the software tells it otherwise. So far the only game I've found that correctly throttles the CPU when the GPU is at 99% usage is Devil May Cry 5, and I've played hundreds of games of all sorts, genres, and ages. RTSS is a nice workaround for this issue.
While this sounds amazing for people with extremely old and low-end hardware that can only provide ~80 fps in an extremely optimized game like Overwatch at 1080p, how is it for people with better hardware?
He shows this himself in the video:
81 fps, RTSS off, 99% load: avg 72 ms
60 fps, RTSS on, 77% load: avg 41 ms
145 fps, RTSS off, 99% load: avg 48.2 ms
138 fps, RTSS on, 82% load: avg 34 ms
How do most people play a competitive shooter like Overwatch? Right, not on the Epic preset at 200% render scale. That means a 1080 or even lower GPUs can provide 300 fps on decent settings.
If a jump from 81 fps to 145 fps reduces input lag by 23.8 ms, what happens if you play at 300 fps with 99% load?
Another −20 ms, less, or nothing? That would be interesting to know.
I mean, in both videos about input lag and "stutter" he maxed out all settings and increased the render scale at 4K. That's not a normal scenario, but what does this look like with normal settings?
EDIT: OK, I've just watched the second video. So every game acts differently: some games run better with RTSS and some worse when it comes to input lag. Same with LL.
From your second video, at 9m35s:
The conclusion of the video was clearly that an RTSS cap below 99% load is not a universal fix for input lag/stutter. It can be in some games, but in others it can run worse!
As you pointed out, a nice graphics card can push 300 FPS in such a game in a "normal scenario". But that "normal scenario" will likely end up CPU-limited well before the GPU reaches 99% usage. The point of the video was to show how a game behaves at 99% GPU usage, and definitely not how people typically play Overwatch. What was shown can be applied to other games that make the graphics card run at 99% usage.
The second video is a bit... complicated. See, the guys from Hardware Unboxed didn't fully understand the point of Battle(non)sense's video. Of all the games they tested, only Gears 5 was utilizing the GPU at 99%, and it showed the same input latency at 184 fps with the GPU at 99% as at 144 fps with the RTSS limiter. A 30% higher framerate resulted in the same input latency, so they actually proved the point Battle(non)sense made, as the video wasn't about 97% or 95% or anything else, but 99%. Every game indeed works differently, and some don't put 99% load on the GPU at all, so even if there is some added input latency, it might not make much of a difference. Hardware Unboxed simply didn't get the point: RTSS limiting does not decrease input latency by itself, it just prevents the added latency caused by 99% GPU usage.
Now this is the hardest part. While it's clear that 99% GPU usage adds some input latency, there are too many things in this equation, but I'll give it a try.
First we have frametime. RTSS adds busy-wait cycles between frames so each frame is rendered in exactly 1000/60 = 16.67 ms, thus the input latency caused by the I/O devices and camera is 41 − 16.67 = 24.33 ms. Same with the 138 fps limit: 1000/138 = 7.25 ms, so 34 − 7.25 = 26.75 ms. Judging by this, the I/O devices and camera add approximately (24.33 + 26.75)/2 = 25.54 ms of input latency.
Now let's calculate the added input latency caused by 99% GPU usage.
1000/81 = 12.35 ms per frame.
72 − 12.35 − 24.33 = 35.32 ms of added input latency. Now we need the relative added latency:
35.32/12.35 × 100 = 286% of one frametime.
1000/145 = 6.9 ms per frame.
48.2 − 6.9 − 26.75 = 14.55 ms of added input latency.
14.55/6.9 × 100 = 211%.
So, judging by the Battle(non)sense tests, the GPU being at 99% adds around (286 + 211)/2 = 248.5% of one frametime in input latency, which is indeed a huge number.
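The arithmetic above can be replayed in a few lines of Python, using the measurements quoted from the video. The variable names and the decomposition latency = frametime + added + I/O are my own framing of the post's reasoning:

```python
# Numbers from the Battle(non)sense measurements quoted above;
# "io" is the estimated latency of the input chain and camera.
def frametime(fps):
    return 1000.0 / fps  # milliseconds per frame

# RTSS-capped runs: latency = frametime + io  =>  io = latency - frametime
io_60  = 41.0 - frametime(60)     # ≈ 24.33 ms
io_138 = 34.0 - frametime(138)    # ≈ 26.75 ms
io_avg = (io_60 + io_138) / 2     # ≈ 25.54 ms

# Uncapped (99% GPU) runs: added = latency - frametime - io
extra_81  = 72.0 - frametime(81)  - io_60    # ≈ 35.32 ms
extra_145 = 48.2 - frametime(145) - io_138   # ≈ 14.55 ms

# Added latency relative to one frametime
rel_81  = extra_81  / frametime(81)  * 100   # ≈ 286%
rel_145 = extra_145 / frametime(145) * 100   # ≈ 211%
print(round(io_avg, 2), round(rel_81), round(rel_145))
```
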
Now let's calculate the 300 fps case.
1000/300 = 3.33 ms per frame.
RTSS on: 3.33 + 25.54 = 28.87 ms.
RTSS off: the added latency is ~248.5% of the frametime on top of the frametime itself, so 3.33 + 3.33 × 2.485 + 25.54 = 37.15 ms.
So, according to the Battle(non)sense tests, these are the approximate results at 300 fps:
300 fps, RTSS off, 99% load: avg 37.15 ms
300 fps, RTSS on, <99% load: avg 28.87 ms
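The same decomposition gives the 300 fps extrapolation, keeping the base frametime term explicit. The 25.54 ms I/O estimate and the 248.5% ratio are the rough averages derived above, so these are approximations, not measurements:

```python
# Rough averages derived from the measurements earlier in the thread.
io_avg = 25.54       # ms of input-chain latency
added_ratio = 2.485  # added latency as a fraction of one frametime at 99% GPU

ft = 1000.0 / 300                            # ≈ 3.33 ms per frame
capped = ft + io_avg                         # RTSS on, GPU < 99%
uncapped = ft * (1 + added_ratio) + io_avg   # GPU at 99%; note the
                                             # base frametime term
print(round(capped, 2), round(uncapped, 2))  # ≈ 28.87 and 37.16 ms
```
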
Is there a difference? Sure. But is it anything to be concerned about? Well, at a framerate that high, it'll be pretty much impossible for an average human to tell the difference.
Conclusion: if your framerate is in the 0–60 zone, 99% GPU load causes a noticeable input latency difference. In the 60–144 zone, 99% GPU load might be a problem in fast-paced games like Overwatch. In the 144–240 zone, it's likely a concern only for hardcore competitive gamers. At 240+, whatever.
I'm glad you brought up this question, as now I have a better understanding of when and why.