R3dAlert93 Jul 7, 2024 @ 5:22pm
Wattage?
Is 402 watts normal in some newer games on a 1440p 165Hz monitor? I currently own the ASUS ROG Thor 850W PSU.
Rumpelcrutchskin Jul 7, 2024 @ 5:52pm 
Depends on your hardware. An RTX 4090 can draw more than that alone.
Dutchgamer1982 Jul 7, 2024 @ 6:24pm 
Sadly, we get screwed over.

For over 15 years, nearly 20, the best two cards in every NVIDIA series used 250W (sometimes even just 220 or 230W).

Usually that best card was called the x80 Ti (x90 wasn't used for a while), and the elite card was called the Titan.

Today, what used to be called the x80 Ti has been rebranded as the x90, and the Titan as the x90 Ti (which this series doesn't have).

Further evidence: the x80 Ti two generations apart roughly doubled in performance but kept the same 250W, while an x70 always used 120-130W and matched the x80 Ti of two generations earlier (or half the performance of its own generation's x80 Ti).


If we look at the 2080 Ti, the last card to still use 250W: indeed, the 4070 performs roughly the same and the 4090 roughly double.

But what has changed is that a 4070 uses 220W, not 120W, so basically only a 30W reduction for that performance in all those years.

Meanwhile the 4090 now uses 450W instead of 250W, meaning the doubling in performance also means nearly doubling the wattage.

This is UNACCEPTABLE.
If you live where power is expensive, then including taxes you pay 65 euro cents per kWh (about 72 US cents per kWh). At that rate you will spend far more on powering a GPU than you ever spent on purchasing it, so power draw matters.
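As a rough Python sketch of what that means in money terms (the 4 hours of gaming per day and the 4-year lifespan are my own assumptions, not figures from this post; the 450W and 0.65 EUR/kWh are):

# Rough running-cost estimate for a 450W card at expensive European rates.
watts = 450            # RTX 4090 board power
price_per_kwh = 0.65   # EUR, including taxes
hours_per_day = 4      # assumed gaming time
years = 4              # assumed useful lifetime

kwh_per_year = watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> {cost_per_year:.0f} EUR/year, "
      f"{cost_per_year * years:.0f} EUR over {years} years")
# ~657 kWh/year -> ~427 EUR/year, ~1708 EUR over 4 years:
# roughly the card's purchase price again, just in electricity.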

I also have a house without AC, like most houses in Europe, and having something put out that much more heat makes the house so hot in summer that gaming isn't enjoyable at all.
It's like sitting next to a red-hot stove that blows hot air right in your face when it's already 40°C in your house (104°F for the Americans).


What this basically means is that NVIDIA has become lazy. First they screwed us gamers over by making mega profits off crypto miners buying up all the cards, doing nothing to make sure that we, their core customers, on whom their business has always depended, could get them.

And then they abused this further by hiking up the prices of their cards.

4090 = 1800 euro

I compare it to cards like the 1080 Ti or perhaps a Titan.

The Titan was 1050 euro in 2016 (as was every Titan at its release, before and after).
Corrected for inflation, that would be about 1320 euro now.

The 1080 Ti was 700 euro (as was every best-in-series card, usually called x80 Ti but sometimes x90).
That would be about 880 euro now.

So a 4090 should cost about 900 euro, not 1800,
and if there were a 4090 Ti, it should cost 1350 euro at most, not 3000 like a 3090 Ti did.

Similarly, an x70 was always 350 euro (corrected for inflation, 440 euro), not the 600 euro they ask for it now.

And an x60 was always 200 euro (corrected for inflation, 250 euro now), not the 350 euro they ask for it.

Oh, and an x80 was always 500 euro (corrected for inflation, 630 euro now), not the 1100 euro they charge for one now.

So even AFTER we correct for inflation, NVIDIA has spiked the price of an x60 and x70 by roughly 40%, of an x80 by 75%, and of an x90 by 100%.
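A quick Python check of those percentages against the numbers above (the ~26% cumulative inflation factor is the one implied by this post's own before/after figures, not an official statistic):

# Compare launch prices (inflation-adjusted) with today's asking prices.
launch_prices = {   # EUR at launch, per the figures above
    "x60": 200, "x70": 350, "x80": 500, "x80 Ti / x90": 700, "Titan": 1050,
}
current_prices = {  # EUR today, per the figures above
    "x60": 350, "x70": 600, "x80": 1100, "x80 Ti / x90": 1800,
}
inflation = 1.26    # rough cumulative inflation implied above

for tier, old in launch_prices.items():
    adjusted = old * inflation
    now = current_prices.get(tier)
    if now:
        print(f"{tier}: {old} EUR -> {adjusted:.0f} EUR adjusted, "
              f"asking {now} EUR (+{now / adjusted - 1:.0%})")
# x60 +39%, x70 +36%, x80 +75%, x80 Ti / x90 +104% over inflation.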

No wonder they've NEVER made as much profit as they do now.
But did they invest all this cash into R&D? Nope, they SKIMPED on R&D. Why bother making a good product if it sells anyway?

Sure, they could have done more research and made a 4070 that actually drew 130W and a 4090 that actually drew 250W, but they didn't want to spend that much and basically just overvolted their old designs and called it a day.

I mean, where before their performance per watt doubled every two generations,
now they only went from 250W to 220W for the same performance (roughly a 10% gain),
and +100% performance at 180% of the power (again only about a 10% real gain).

Making only a ~10% real efficiency gain over two(!) generations of GPUs, where before they made 40% per generation and 100% over two, just shows how greedy they've become and how little they care to invest in R&D.
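A small Python sketch of that performance-per-watt comparison (the relative performance figures, 4070 ≈ 2080 Ti and 4090 ≈ 2x 2080 Ti, are this post's rough estimates, not benchmark data):

# Performance per watt relative to the 2080 Ti baseline.
cards = {
    # name: (relative performance vs 2080 Ti, board power in watts)
    "RTX 2080 Ti": (1.0, 250),
    "RTX 4070":    (1.0, 220),
    "RTX 4090":    (2.0, 450),
}
baseline = cards["RTX 2080 Ti"][0] / cards["RTX 2080 Ti"][1]
for name, (perf, watts) in cards.items():
    gain = (perf / watts) / baseline - 1
    print(f"{name}: {gain:+.0%} performance per watt vs 2080 Ti")
# RTX 4070: +14%, RTX 4090: +11% -- in the same ballpark as the ~10%
# cited above, versus the ~100% per two generations claimed for earlier eras.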
skOsH♥ Jul 7, 2024 @ 7:09pm 
My CPU, when I'm just doing stuff like browsing Steam or the internet, sits at about 13W. It's a Ryzen, so AMD for the CPU.

For the GPU, AMD also has cards that use less wattage. Their top card doesn't outperform a 4080, but your power bill will be about 2x lower for the same usage.
Dutchgamer1982 Jul 7, 2024 @ 8:41pm 
Originally posted by dc_:
My CPU, when I'm just doing stuff like browsing Steam or the internet, sits at about 13W. It's a Ryzen, so AMD for the CPU.

For the GPU, AMD also has cards that use less wattage. Their top card doesn't outperform a 4080, but your power bill will be about 2x lower for the same usage.

False. Firstly, a 7900 XTX will easily outperform a 4080 Super, by 5% or more.
Secondly, Radeon cards are even less efficient in performance per watt than NVIDIA cards, and MORE power-hungry.

Cards listed in order of most powerful to least powerful, with how many watts each needs:

4090: 450W
RX 7900 XTX: 355W
4080S: 320W
4080: 320W
RX 7900 XT: 315W
4070 Ti S: 285W
4070 Ti: 285W
RX 6950 XT: 335W
4070S: 220W
RX 7800 XT: 263W
RX 6900 XT: 300W
RX 6800 XT: 300W
4070: 200W
RX 7700 XT: 245W
RX 6800: 250W
RX 6750 XT: 250W
4060 Ti: 160W
RX 7600 XT: 190W
RX 6700 XT: 230W
4060: 115W
skOsH♥ Jul 8, 2024 @ 12:24am 
Originally posted by Dutchgamer1982:
False. Firstly, a 7900 XTX will easily outperform a 4080 Super, by 5% or more.
Secondly, Radeon cards are even less efficient in performance per watt than NVIDIA cards, and MORE power-hungry.

Ah. I thought the technology they put into reducing wattage worked a little better.

I'm still glad I have an AMD card. It's a nice card, and I figure it's better in most games due to its strong rasterization performance.
r.linder Jul 8, 2024 @ 12:31am 
Originally posted by dc_:
Ah. I thought the technology they put into reducing wattage worked a little better.

I'm still glad I have an AMD card. It's a nice card, and I figure it's better in most games due to its strong rasterization performance.
AMD had to increase their wattage to keep up with NVIDIA this generation
_I_ Jul 8, 2024 @ 1:49am 
They are increasing performance,
but they've pretty much hit a wall in performance per watt,
and increasing power is the only way to overcome it.
Rumpelcrutchskin Jul 8, 2024 @ 1:52am 
Originally posted by _I_:
They are increasing performance,
but they've pretty much hit a wall in performance per watt,
and increasing power is the only way to overcome it.

Maybe they should just stop there until some new technological breakthrough comes along.
Hell, who am I kidding, capitalism.
_I_ Jul 8, 2024 @ 2:09am 
Intel always knew HT was a power hit for a small performance gain,
and is finally dropping it, but only to save space on the die for more cores and cache.
A&A Jul 8, 2024 @ 3:00am 
Originally posted by _I_:
Intel always knew HT was a power hit for a small performance gain,
and is finally dropping it, but only to save space on the die for more cores and cache.
It's like they're chasing good single-core performance or higher IPC: if there's nothing pushing the execution resources further, you're left with a gap that needs to be filled.
_I_ Jul 8, 2024 @ 3:39am 
Their E-cores kinda fill that gap though.

A single E-core and a single P-core combined > a P-core with HT,
in both power and performance.

But they do take more space, for their own cache and other parts.
C1REX Jul 8, 2024 @ 4:05am 
Originally posted by SoloPlayahSnc93:
Is 402 watts normal in some newer games on a 1440p 165Hz monitor? I currently own the ASUS ROG Thor 850W PSU.


Sounds normal.

My PC draws between 130W and 550W, but I have a huge screen and a very power-hungry GPU.

It's GPU + CPU + motherboard with RAM and chipset + monitor. There is also some energy wasted, since nothing is 100% efficient.

You can save some energy by capping your FPS. If you cap close to your 1% lows you may save energy, get a more stable frame rate and frame time, and actually have a better experience.

If I cap my FPS, the power draw of my GPU drops from 380W to about 160-200W.
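As a rough Python sketch of what such a cap saves, using the 380W -> ~180W figures above (the 3 hours of gaming per day and the 0.30 EUR/kWh price are my own assumptions for illustration):

# Estimated yearly savings from capping FPS.
uncapped_w, capped_w = 380, 180
hours_per_day = 3
price_per_kwh = 0.30   # EUR

saved_kwh_per_year = (uncapped_w - capped_w) / 1000 * hours_per_day * 365
print(f"~{saved_kwh_per_year:.0f} kWh/year saved, "
      f"~{saved_kwh_per_year * price_per_kwh:.0f} EUR/year at {price_per_kwh} EUR/kWh")
# ~219 kWh/year, ~66 EUR/year -- plus less heat dumped into the room.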
Last edited by C1REX; Jul 8, 2024 @ 5:52am
A&A Jul 8, 2024 @ 4:22am 
Originally posted by _I_:
Their E-cores kinda fill that gap though.

A single E-core and a single P-core combined > a P-core with HT,
in both power and performance.

But they do take more space, for their own cache and other parts.
I mean, more likely the gap is in the P-cores: disabling HT lowers power consumption by ≈20% and multicore performance by ≈30-40%, and removing it from the design completely will probably win back at least a little more power. So either way there is power/heat headroom and extra die space, which should allow several options:
1. Clock the P-cores further
2. Add more cache
3. Add more E-cores
4. Make them cheaper

And they can always put a tag on it: "the best in something".
_I_ Jul 8, 2024 @ 5:46am 
Agreed, they have more room for higher TDP,
but a bigger die = fewer chips from each wafer
and more chance of defective or lower-performing parts, which greatly increases costs
and produces more lower-rated SKUs and mobile CPUs.
Last edited by _I_; Jul 8, 2024 @ 5:46am
Tonepoet Jul 8, 2024 @ 7:37am 
You spec your system out based on the maximum potential sustained load you anticipate, using the T.D.P. of the G.P.U. and C.P.U. plus some additional overhead for the other auxiliary components.

A component isn't supposed to run at T.D.P. at all times though. It's a theoretical maximum sustained load. If the job it is instructed to do can be done with less than that, then it should use less wattage. This is true even while gaming.

You might note, for example, in TechPowerUp power consumption reviews that there are different categories of power consumption, and while the RTX 4090 F.E. does sport a 450 watt T.D.P., the expected average consumption is actually more along the lines of 346 watts[www.techpowerup.com], and it can also go above 450 watts depending on the workload in actual practice. Playing Cyberpunk with maximum details and ray tracing enabled can push 477 watts.

Power supplies run most efficiently at around 50% load, so I'd suppose that the 400-500 watt range is about normal consumption for a system that's appropriately specced out with an 850 watt power supply. Mind you, I don't normally monitor the power consumption or heat output of a system, so this is more of an educated guess than certain knowledge, but it seems reasonable.
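A tiny Python sanity check of that rule of thumb against the original post's numbers (treating the 50% figure as a rough guideline; modern PSU efficiency curves are fairly flat from roughly 20-80% load):

# Compare the measured draw with the PSU's 50%-load sweet spot.
psu_rating_w = 850
measured_draw_w = 402

sweet_spot_w = psu_rating_w * 0.5
load_pct = measured_draw_w / psu_rating_w
print(f"Sweet spot ~{sweet_spot_w:.0f} W; measured {measured_draw_w} W "
      f"is {load_pct:.0%} of the PSU's rating")
# ~425 W sweet spot; 402 W is ~47% load -- right where a well-matched
# 850 W build would be expected to sit under a gaming load.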
Last edited by Tonepoet; Jul 8, 2024 @ 7:50am
