And VRR would have (more) dips and stutters, as most games are designed with a static performance/fps target in mind. What the game does in the time beyond 16.6ms/8.3ms, most developers care little about, I think. And VRR lays this bare, in my experience more so than a locked refresh rate.
I personally also dislike the fluctuating input latency that comes with VRR and dips. I dislike a steady 30ms latency less than 20ms latency with dips to 30ms; to me the latter is more noticeable/disruptive.
True, but you need to be able to display these extra frames. There's no need to run 280fps on a 120Hz monitor except maybe for input latency; otherwise you're just wasting compute and worsening your frame pacing. And yes, on a per-game basis. I think he does as well, but just generalized with the 60-120. I go between 60, 96, 100, 120, 180 and 240. I reckon he does the same between 60 and 120fps (base; he interpolates).
You also seem to overestimate a bit most games' ability to render stably over 120fps. Games like that are more of an exception for now. Most games are only now starting to be developed with a 120fps target in mind, with consoles and TVs recently allowing for it. I'm talking about AAAs.
Agreed
Lol, is that so? He said 240Hz OLED, which makes it likely he's on 1440p at least, ultrawide perhaps, and 4K maybe.
The way I understand it, in VRR mode a display is always scanning out at its fastest speed; it just varies the length of the blanking margin after the frame is already drawn.
Low fps = longer "back porch", high fps = shorter "back porch". This effectively leads to a 60fps frame being drawn 4 times as fast on a 240Hz display running in VRR mode vs. on a fixed-refresh 60Hz one, which in turn leads to reduced display latency. A happy coincidence of how VRR works.
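A rough back-of-the-envelope Python sketch of that arithmetic, assuming the simple always-scan-at-max-rate model described above (the function and numbers are mine, not measured from any display):

    # In VRR mode the panel scans every frame out at its max rate and
    # stretches the blanking interval ("back porch") to fill the rest.
    def vrr_timings(content_fps, panel_max_hz):
        frame_ms = 1000.0 / content_fps      # how long the frame is held
        scanout_ms = 1000.0 / panel_max_hz   # frame is drawn at max speed
        blanking_ms = frame_ms - scanout_ms  # variable "back porch" padding
        return scanout_ms, blanking_ms

    scan_vrr, blank_vrr = vrr_timings(60, 240)  # 60fps content, 240Hz VRR panel
    scan_fixed, _ = vrr_timings(60, 60)         # 60fps on fixed 60Hz
    print(f"VRR 240Hz: scanout {scan_vrr:.2f}ms + blanking {blank_vrr:.2f}ms")
    print(f"Fixed 60Hz: scanout {scan_fixed:.2f}ms")  # ~4x slower scanout

That ~4.17ms vs ~16.67ms scanout is where the display-latency reduction comes from.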
And I agree that when you can't avoid varying frametimes, VRR is the way to go, as strict vsync at a fixed refresh rate will present really choppily when fps is unstable.
In competitive FPSes, for example, I don't see why you wouldn't, unless you want strobing.
Heh, I'm the opposite. Big fps swings will bother me more than a steady lower fps.
Yep. I'm not sure what your precise preference is, but I for one am still waiting for that The Fifth Element beginning-of-the-movie experience. Beyond Good and Evil 2 had the flying vehicle among buildings, but those 'tards just threw it in a drawer for later or whatever.
My opinions on comp-level settings are just from what I glean from their settings videos explaining how they configure their hardware for the lowest possible latency. It's nearly all high refresh rates locked to the highest fps their hardware can maintain without going under. The specific ways they lock that fps can vary by user.
The fluctuating fps of AAA games is where VRR shines, because if the fluctuation is in the high-fps range you won't even notice it as long as it doesn't dip too low. But I still usually cap them at 60~120, if only to reduce power consumption. Open-world games play smoothly at 60 and fast-paced games at 120.
Your 3090 Ti has big undervolting potential, I hope you use it :) My 3080 Ti at stock settings draws 275W at 60fps native in Witcher 3 at 45% load. Undervolted to 843mV/1800MHz, I can do a 60 -> 120fps framegen and it uses only 230W at 67% load.
Yes, both the 3090 Ti and the 11900K are undervolted. It really cuts down on heat generation, which is the main issue. So simple to do, too.
So I got curious and spent some time testing VRR on: The Witcher 3, The Talos Principle 2, Shadow of the Tomb Raider, Forza Horizon 5, No Man's Sky, the System Shock remake and theHunter: Call of the Wild.
I tested all games with NVCP settings at their defaults, vsynced in-game if that was satisfactory or in NVCP otherwise, on their highest settings, on the XG2431 G-Sync-compatible monitor, which, when games allow for it, can display VRR flawlessly. Where needed I set resolutions higher than native to put the bottleneck on the GPU and get games off the display's refresh limit.
Of the games I tested, only one worked flawlessly (SOTTR), and one other (CotW) worked flawlessly only with a gamepad. The rest exhibited varying amounts of noticeably uneven frame pacing and asset-loading stutters, which seem bigger when contrasted with the higher fps VRR allows. For VRR usability I'd rate SOTTR and CotW an average 4.75/5 and the rest an average 2-2.5/5.
If I were forced to use VRR in these games, I would have to use a sufficiently low fps cap to stabilize frame pacing. At which point I'd rather turn some Ultras down to High and get a 99.9% solid fixed refresh rate, which will work with frame interpolation and backlight strobing as well.
It seems that most of the games I've tested are not developed with VRR in mind, or to work smoothly with it. Consoles and TVs recently allowing for VRR might change this. Though I doubt it would have been prohibitively hard to implement a "VRR" vsync mode into almost any game right now or in the past 10 years: keep some margin for engine limitations and limit frames to some kind of running average. I get the feeling both SOTTR and CotW do this to some degree. That, or the programming on them is stellar.
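For illustration, the "running average plus margin" limiter I'm imagining could be as small as this Python sketch (the class name, the 120-frame window and the 0.9 margin are all my own assumptions, not something taken from any of these games):

    import time
    from collections import deque

    class AdaptiveFrameLimiter:
        """Pace frames to slightly below the recent average framerate."""

        def __init__(self, window=120, margin=0.9):
            self.frametimes = deque(maxlen=window)  # recent frametimes, seconds
            self.margin = margin                    # headroom for engine spikes

        def pace(self, frametime):
            """Call once per frame with how long the frame took to render."""
            self.frametimes.append(frametime)
            avg = sum(self.frametimes) / len(self.frametimes)
            target = avg / self.margin       # e.g. avg 10ms -> pace to ~11.1ms
            if frametime < target:
                time.sleep(target - frametime)  # pad fast frames to the target

A real engine would want a high-resolution timer and a busy-wait for the last fraction of a millisecond, but the idea is the same: the cap follows the engine's recent average instead of chasing every spike.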
But after having done some admittedly brief and somewhat narrow testing, my notions of VRR remain unchanged for now.
It is entirely possible that any AA/AAA game released in the last few years works flawlessly with VRR. I wouldn't know, though I have reason to doubt it. I'll also admit that it's very likely VRR works miles better when 100% GPU-bottlenecked on a low- to mid-tier GPU.
I'm interested in hearing more people's experiences, thoughts and opinions on VRR and different usage scenarios.
Yep. I've had a 165Hz G-Sync monitor for almost seven years and I never just let the framerate run free. I always set a cap where I try to make sure the GPU usage doesn't often exceed 85-90%, or a cap where I get fewer stutters due to CPU/game-engine/SSD limits.
Lol, you can set up VRR in a way that works acceptably 99% of the time. You can do this by making sure you are sufficiently GPU-bound, 95-100% preferably, or by setting a realistic framerate cap, which will likely be quite a bit lower than your max fps.
Also note that my testing was not done in the latest games; like I mentioned, it's possible modern AAAs behave better.
I find VRR most useful when looking at it as a way to set a fixed refresh rate anywhere you want, up to your monitor's max refresh.
From there you decide how deep a dip you'll tolerate. For example: if you are fine with your framerate dipping 25% of the time, set a cap of, let's say, 180fps. If you want your fps to be locked 95% of the time, set a cap of 150fps. It's entirely up to your preference, and VRR allows you this freedom without having to mess with custom resolutions.
Not to even mention the tearing and latency benefits.
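If you log per-second fps from a benchmark run, picking the cap for a given dip tolerance is basically just taking a percentile. A rough Python sketch of that rule (the helper and the sample log are made up for illustration):

    def pick_fps_cap(fps_samples, locked_fraction):
        """Return a cap the measured fps stayed at or above roughly
        `locked_fraction` of the time (0.95 -> locked ~95% of the time)."""
        ordered = sorted(fps_samples)
        idx = int((1.0 - locked_fraction) * len(ordered))
        return ordered[min(idx, len(ordered) - 1)]

    # e.g. a log of per-second fps readings from a benchmark run:
    log = [210, 195, 188, 175, 180, 172, 168, 190, 200, 185]
    print(pick_fps_cap(log, 0.95))  # -> 168 (~5th-percentile fps)
    print(pick_fps_cap(log, 0.75))  # -> 175 (dips below it ~25% of the time)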
They basically all work well with VRR, but don't expect good results with uncapped fps in games where fps jumps wildly between 120 and 240fps.
I don't think I ever saw my 4090 reaching 95%. LOL. But I see, you were just testing it uncapped. Then not to worry, because capping it "a bit lower than your max fps" is perfectly acceptable. People say 3fps less is enough. How much do you cap? Or do you take the approach from that video you shared here the other day? I don't quite remember the exact math of it.