TEITANBLOOD May 10, 2024 @ 7:23pm
Will more FPS mean a higher electricity bill even if I have a 60 Hz monitor?
I play TF2 without vsync and get around 150-250 FPS. I checked in MSI Afterburner and my GPU is not running at more than 40%.

Does that mean my GPU is drawing roughly 4 times more power when vsync is off? Because MSI Afterburner reports significantly fewer watts compared to my next question below....

I'm confused because in some games the GPU actually draws around 120 watts while still capped at 60 FPS.

In which case is it drawing significantly more electricity?
Showing 1-12 of 12 comments
Electric Cupcake May 10, 2024 @ 7:49pm 
Depends. Is your power grid 60 Hz?
nullable May 10, 2024 @ 7:52pm 
What's the question? Do you see the problem with describing TF2 as only using 40% of your GPU, and then describing other games as drawing 120 watts even though they're capped at 60 FPS?

We don't know which GPU. And you're asking us to compare an unquantified percentage to a specific number of watts. You're missing a few bits of data to make that work.

And I don't understand: if you can see how many watts the GPU is drawing (120 watts in some games capped at 60 FPS), then how many watts is it drawing when you're running TF2 without vsync? So run TF2 uncapped, see how many watts you're drawing, and let's call that X.

Is X > 120 or is X < 120? Is the difference significant? How much does electricity cost per kilowatt-hour? How much do you play? Is the cost enough to care about, or not? Why ask us to make random guesses when you can do some pretty basic arithmetic and have a definitive answer tailored to you?
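The arithmetic suggested above can be sketched in a few lines of Python. The wattage figures, hours per day, and price per kWh are placeholder assumptions; substitute your own MSI Afterburner readings and utility rate.

```python
# Rough sketch of the cost arithmetic above. All numbers here are
# placeholder assumptions -- plug in your own measured values.

def monthly_cost(watts, hours_per_day, price_per_kwh):
    """Estimate monthly electricity cost for a component drawing `watts`."""
    kwh_per_month = watts / 1000 * hours_per_day * 30
    return kwh_per_month * price_per_kwh

# Example: uncapped draw X = 120 W vs. a capped draw of 60 W,
# 3 hours of play per day, $0.15/kWh (assumed rate).
uncapped = monthly_cost(watts=120, hours_per_day=3, price_per_kwh=0.15)
capped = monthly_cost(watts=60, hours_per_day=3, price_per_kwh=0.15)

print(f"uncapped: ${uncapped:.2f}/month, capped: ${capped:.2f}/month")
print(f"difference: ${uncapped - capped:.2f}/month")
```

Under these assumed numbers the difference is under a dollar a month, which answers the "is it enough to care about" question for you.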
Last edited by nullable; May 10, 2024 @ 8:32pm
_I_ May 10, 2024 @ 7:56pm 
get a kill-a-watt meter, and compare
TEITANBLOOD May 10, 2024 @ 8:03pm 
Originally posted by nullable:
What's the question? Do you see the problem with describing TF2 as only using 40% of your GPU, and then describing other games as drawing 120 watts even though they're capped at 60 FPS?

We don't know which GPU. And you're asking us to compare an unquantified percentage to a specific number of watts. You're missing a few bits of data to make that work.

And I don't understand: if you can see how many watts the GPU is drawing (120 watts in some games capped at 60 FPS), then how many watts is it drawing when you're running TF2 without vsync? So run TF2 uncapped, see how many watts you're drawing, and let's call that X.

Is X > 120 or is X < 120? Is the difference significant? How much does electricity cost per kilowatt-hour? How much do you play? Is the cost enough to care about, or not? Why ask us to make random guesses when you can do some pretty basic arithmetic and have a definitive answer tailored to you?


My question is simple: which situation draws more power?
More FPS, or more % GPU usage?
Last edited by TEITANBLOOD; May 10, 2024 @ 8:03pm
nullable May 10, 2024 @ 8:48pm 
Originally posted by TEITANBLOOD:

My question is simple: which situation draws more power?
More FPS, or more % GPU usage?

Well, FPS is kind of arbitrary and not all frames are equal. What I mean is, 1 frame of Half-Life takes a lot less work than 1 frame of Cyberpunk 2077 at 4K, ultra quality, fully ray traced. It's really hard to quantify power usage based on FPS; you need more data.

You could run a simple enough game at 1,000 FPS and use only a little bit of the GPU. Or have the GPU running at 100% and rendering 6 FPS.

Percentage of GPU usage gets you a little closer: a higher percentage of usage would likely mean more power draw. Although there are some caveats with that. Percentage is a fairly vague, general-purpose metric. GPUs have different bits, and those bits may use varying amounts of power, so 50% in one game may differ from 50% in another, and it's hard to say how much variation that could amount to.

Again, you have a specific question in mind, maybe with specific games. And if you can see the watt usage, you can sort out the answer. But as is, you're asking a pretty vague, unanswerable question, or at least a question with enough answers that it's not much use.

I think part of your question is confused because you don't take being CPU-bound into account. If the CPU can't keep up with the GPU, then even at 250 FPS the GPU could be using less power than a more demanding, non-CPU-bound game at 60 FPS.
Last edited by nullable; May 10, 2024 @ 8:51pm
_I_ May 10, 2024 @ 9:00pm 
if you use vsync or RTSS, the GPU will go idle instead of drawing extra frames that would just be dropped
Last edited by _I_; May 10, 2024 @ 9:01pm
Karumati May 10, 2024 @ 9:39pm 
savings are not significant in this case.
A&A May 10, 2024 @ 9:48pm 
This question has an obvious answer, and the answer is yes. If you play the same game at 60 FPS and then raise the frame limit to 240 FPS, your CPU and GPU have to do four times as much work, and that work requires electricity. So if in the first case the computer draws 25 watts, the second should draw around 100 watts. There may be inconsistencies, such as the second case landing at 100 ± 10-20 watts, because many different factors affect a computer's efficiency.

Like the FPS in TF2 you mentioned yourself: when the FPS is not limited, it sits around 150-250, and the reason is that the scenes being simulated and rendered are inconsistent in their CPU and GPU demands. It's the same with any other game: some games have less demanding scenes and others have more.
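The proportionality argument above can be written as a toy model. It assumes power scales linearly with frame rate, which real GPUs only roughly follow (hence the ± band mentioned).

```python
# Toy linear model of the argument above: rendering work scales with FPS,
# and power scales with work. Real GPUs deviate from this (boost clocks,
# voltage/frequency curves), so treat the result as a rough estimate.

def estimated_power(base_watts, base_fps, target_fps):
    """Scale power draw linearly with frame rate (a simplification)."""
    return base_watts * (target_fps / base_fps)

print(estimated_power(25, 60, 240))  # 60 -> 240 FPS: 4x the work, prints 100.0
```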
Last edited by A&A; May 10, 2024 @ 9:55pm
Pocahawtness May 11, 2024 @ 11:51am 
Originally posted by TEITANBLOOD:
I play TF2 without vsync and get around 150-250 FPS. I checked in MSI Afterburner and my GPU is not running at more than 40%.

Does that mean my GPU is drawing roughly 4 times more power when vsync is off? Because MSI Afterburner reports significantly fewer watts compared to my next question below....

I'm confused because in some games the GPU actually draws around 120 watts while still capped at 60 FPS.

In which case is it drawing significantly more electricity?

Buy one of those power meters you plug into the wall and monitor what your PC is using. They're really cheap and take the guesswork out of it.
emoticorpse May 11, 2024 @ 12:16pm 
I really don't get this, but I want to say the 60 Hz monitor part has nothing to do with this equation, and I can't keep thinking about it, so I'll move on to the other stuff.
SlowClick May 11, 2024 @ 5:55pm 
The GPU might be putting out 150-250 FPS according to the FPS meter, but the monitor is only showing 60 FPS; it is incapable of showing any more. (I have a 60 Hz monitor and have seen this too.) The higher the GPU power usage, the higher the power bill (but by such a small amount you won't notice). The less work the GPU has to do, the less power it will use, so I'd limit FPS to 60 to make its job easier.

Because of my 60 Hz monitor, my 3070 rarely goes above 75% usage on high/ultra settings.
_I_ May 11, 2024 @ 6:09pm 
with vsync off, a 60 Hz display will show parts of each of the 250+ frames per second, stitched together with tear lines

new vsync will cap it at the display's refresh rate
draw -> wait -> display
RTSS is similar, but estimates the wait time before drawing
wait -> draw -> display

old vsync or fast sync will never put the GPU to idle, keeping it drawing and dropping frames
draw -> dr -> display last complete draw -> aw

adaptive sync just toggles vsync on/off:
on when at or above the refresh rate, off when below, so there's less stuttering when FPS tanks
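The RTSS-style "wait -> draw -> display" pacing described above can be sketched as a simple frame limiter loop. `render()` and `present()` are hypothetical stand-ins for the game's actual work.

```python
import time

# Sketch of an external frame limiter's "wait -> draw -> display" pacing.
# Waiting *before* drawing means the GPU sits idle between frames instead
# of rendering frames that would be dropped.

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def run_frames(n, render, present):
    """Run n frames, pacing each to the target frame time."""
    next_deadline = time.perf_counter()
    for _ in range(n):
        # wait: sleep until the next frame slot, letting the GPU idle
        now = time.perf_counter()
        if next_deadline > now:
            time.sleep(next_deadline - now)
        render()   # draw
        present()  # display
        next_deadline += FRAME_TIME
```

The deadline is advanced by a fixed step rather than re-read from the clock each frame, so small sleep inaccuracies don't accumulate into drift.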
Last edited by _I_; May 11, 2024 @ 6:13pm

Date Posted: May 10, 2024 @ 7:23pm
Posts: 12