We don't know which GPU. And you're asking us to compare an unquantified percentage to a specific number of watts. You're missing a few bits of data to make that work.
And I don't understand: if you can see how many watts the GPU is drawing (120 watts in some games capped at 60 FPS), then how many watts is it drawing when you run TF2 without vsync? Run TF2 uncapped, note how many watts you're drawing, and call that X.
Is X > 120 or is X < 120? Is the difference significant? How much does electricity cost per kilowatt-hour? How much do you play? Is the cost enough to care about, or not? Why ask us to make random guesses when you can do some pretty basic arithmetic and get a definitive answer tailored to you?
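For reference, here's a minimal sketch of that arithmetic. Every number in it is a made-up example (the uncapped reading, the hours per week, the $/kWh rate), so swap in your own measurements:

```python
# Rough monthly cost of the power difference between two settings.
# All numbers here are placeholder examples; plug in your own readings.
watts_uncapped = 150      # X: measured draw running TF2 uncapped (example)
watts_capped = 120        # measured draw capped at 60 FPS (example)
hours_per_week = 20       # how much you actually play (example)
price_per_kwh = 0.15      # your electricity rate in $/kWh (example)

extra_watts = watts_uncapped - watts_capped
extra_kwh_per_month = extra_watts / 1000 * hours_per_week * 4.33  # ~4.33 weeks/month
extra_cost = extra_kwh_per_month * price_per_kwh

print(f"Extra energy: {extra_kwh_per_month:.1f} kWh/month")
print(f"Extra cost:   ${extra_cost:.2f}/month")
```

With those example numbers the difference works out to about 2.6 kWh and roughly 40 cents a month, which is the kind of answer only your own readings can give you.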
My question is simple: which situation draws more power?
More FPS and more % GPU usage?
Well, FPS is kind of arbitrary and not all frames are equal. What I mean is that one frame of Half-Life takes a lot less work than one frame of Cyberpunk 2077 at 4K, ultra quality, fully ray traced. It's really hard to quantify power usage based on FPS; you need more data.
You could run a simple enough game at 1,000 FPS and use only a little bit of the GPU, or have the GPU running at 100% and rendering 6 FPS.
Percentage of GPU usage gets you a little closer; a higher percentage of usage would likely mean more power draw. There are some caveats with that, though: percentage is a fairly vague, general-purpose metric. GPUs have different parts, and those parts may use varying amounts of power, so 50% in one game may not mean the same thing as 50% in another game, and it's hard to say how much variation that could amount to.
Again, you have a specific question in mind, maybe with specific games. And if you can see the watt usage, you can sort out the answer yourself. But as is, you're asking a pretty vague, unanswerable question, or at least a question with enough possible answers that it's not much use.
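If you're on an NVIDIA card and want to see utilization and watts side by side while you play, something like this works (it assumes nvidia-smi is installed and on your PATH; AMD and Intel need their own tools):

```python
import subprocess
import time

# Poll GPU utilization and power draw once per second (NVIDIA only).
# Press Ctrl+C to stop logging.
QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,power.draw",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    line = out.splitlines()[0]   # first GPU if there's more than one
    util, power = (field.strip() for field in line.split(","))
    print(f"GPU util: {util:>3}%   power: {power} W")
    time.sleep(1)
```

Run your capped game, then TF2 uncapped, and compare the power column; that's the X from above.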
I think part of your question is confused because you don't take being CPU bound into account. If the CPU can't keep up with the GPU, then even at 250 FPS the GPU could be using less power than it would in a more demanding, non-CPU-bound game at 60 FPS.
Like the FPS in TF2: you said yourself that when the FPS is not limited it sits between 150 and 250, and the reason is that the scenes being simulated and rendered are inconsistent in terms of CPU and GPU demands. It's the same with any other game; some games have less demanding scenes and others have more.
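A rough way to see why: whichever of the CPU or GPU is slower sets the frame time, and the GPU only works for its slice of that time. A toy model, with made-up per-frame timings:

```python
# Toy model: how CPU-bound frames leave the GPU partly idle.
# The millisecond figures below are invented for illustration only.

def gpu_busy_fraction(cpu_ms: float, gpu_ms: float):
    frame_ms = max(cpu_ms, gpu_ms)      # the slower side sets the frame time
    fps = 1000 / frame_ms
    return fps, gpu_ms / frame_ms       # share of each frame the GPU spends working

# CPU-bound, TF2-like: CPU needs 4 ms per frame, GPU only needs 2 ms.
fps, busy = gpu_busy_fraction(cpu_ms=4.0, gpu_ms=2.0)
print(f"CPU-bound:  {fps:.0f} FPS, GPU busy {busy:.0%}")    # 250 FPS, GPU 50% busy

# GPU-bound, heavy game at ~60 FPS: the GPU needs the whole 16.7 ms.
fps, busy = gpu_busy_fraction(cpu_ms=8.0, gpu_ms=16.7)
print(f"GPU-bound:  {fps:.0f} FPS, GPU busy {busy:.0%}")    # ~60 FPS, GPU 100% busy
```

So the 250 FPS case can still be the lower-power one, because the GPU spends half of every frame waiting on the CPU.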
Buy one of those power meters you plug into the wall and monitor what your PC is using. They're really cheap and take the guesswork out of it.
Because of my 60 Hz monitor, my 3070 rarely goes above 75% usage on high/ultra settings.
New vsync will cap it at the display's refresh rate:
draw -> wait -> display
RTSS is similar, but it estimates the wait time before drawing:
wait -> draw -> display
Old vsync or fast sync will never put the GPU to idle, keeping it drawing and dropping frames:
draw -> dr -> display last complete draw -> aw
Adaptive sync just toggles vsync on/off:
on when at or above the refresh rate, and off when below, so there's less stuttering when the FPS tanks.
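For the first two orderings, here's a minimal sketch of how the loops differ, assuming a simple single-threaded render loop (render() and present() are placeholder names, not a real graphics API):

```python
import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ        # ~16.7 ms per displayed frame

def render():
    time.sleep(0.004)                  # stand-in for ~4 ms of GPU work

def present():
    pass                               # stand-in for handing the frame to the display

def vsync_style_frame():
    """draw -> wait -> display: render first, then idle until the next refresh."""
    start = time.perf_counter()
    render()
    remaining = FRAME_BUDGET - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)          # the idle period is where power is saved
    present()

def rtss_style_frame(render_estimate=0.004):
    """wait -> draw -> display: wait first, based on an estimate of the render time."""
    wait = FRAME_BUDGET - render_estimate
    if wait > 0:
        time.sleep(wait)
    render()                           # drawn just before display, so it's fresher
    present()

for _ in range(3):
    vsync_style_frame()
for _ in range(3):
    rtss_style_frame()
```

The wait-then-draw ordering is why an RTSS-style cap tends to feel a bit more responsive: the frame you see was rendered closer to the moment it's displayed.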