However, with that having been said, I would probably advise you against committing to a full platform upgrade just yet. The 9000 series processors aren't the only new options coming in July. The 5800xt and the 5900xt were also announced, and you might also want to give the 5700x3D some due consideration, since it seems as if the price has dropped and it already offered a pretty good value proposition compared to the 5800x3D in the first place. I won't get into the specifics though, since I have no idea what Greek pricing is like or what your usage application is.
Not sure where you get that from since 4090 non-D can use between 450-600 Watts alone
If it's so grossly wrong, you'll have to take it up with the manufacturers of this stuff 'cause that's where I'm getting most of my information. From their spec. sheets.
When you click "View Full Specs." on Nvidia's page for the 4090[www.nvidia.com], the last listed specifications before the footnotes are Thermal and Power Specs:
Nvidia's website makes it annoying to provide the full spec. sheet with a direct link, but you can also see specs. on Tech Power Up's page for the 4090[www.techpowerup.com], which also includes this snippet of text:
450 watts is supposed to be the maximum power draw for a reference 4090.
Now I concede that board partner 4090 variants might use more in practice, but if you look at the specs. of the M.S.I. GeForce RTX® 4090 SUPRIM X 24G[us.msi.com] for example, they rate the power draw at 450 watts in "silent mode" and 480 watts in "gaming mode", and similar to Nvidia, they suggest you can get by with an 850 watt power supply (although they recommend a thousand watt power supply, despite only drawing 30 watts more).
Granted, it does seem as though you folks are right about the 4090 having 600 watt spikes. Chillblast[www.chillblast.com] notes:
However it goes on to state:
However, with that having been said, a power supply should be able to handle momentary spikes beyond its continuous power rating. From the Intel guide P.C. Power Supply: How to Choose the Right One for You[www.intel.com]:
Simply put, we're not necessarily factoring in the 600 watt spikes when selecting the continuous power draw rating of our P.S.U. I'd like to see what sort of spike power draw the Corsair CX 750M can handle, but I can't seem to easily find that information.
If I made any error in my previous statement, it's simply not factoring in the power draw of the auxiliary components[www.buildcomputers.net] under the assumption that they wouldn't draw more than 50 watts. Each stick of R.A.M. and a S.A.T.A. based S.S.D. might only use 3 watts each, and each fan might only consume up to 6 watts.
Assuming 2 sticks of R.A.M., 1 S.S.D. and say 5 fans, that's only 41 watts. Mouse and keyboard together might be 5 watts[remoteworkingwarriors.com], bringing us up to maybe 46. 46+703=749 which brings us to the limit of a 750 watt power supply.
Thing is that I forgot the motherboard needs power too, and that can be up to 80 watts. >_< With that in mind, we're looking at more like 830 watts based on reference 4090 specs, and this is roughly in line with Nvidia's recommendation of an 850 watt power supply for the reference card. We end up exceeding the 850 limit if we assume M.S.I.'s 480 watt rating though. Granted, it's not necessarily advisable to go up so close to the limit. Some power supplies may be misrated, and power conversion isn't 100% efficient.
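The component sum above can be sketched out like this. All wattages are the ballpark figures discussed in this thread (Nvidia's 450 watt reference spec, Intel's 253 watt max turbo power for the i9, and the per-component estimates from buildcomputers.net), not measurements:

```python
# Rough total-system draw, using the estimates quoted in this thread.
components = {
    "gpu_reference_4090": 450,  # Nvidia's total graphics power spec
    "cpu_i9_14900k": 253,       # Intel's max turbo power figure
    "motherboard": 80,          # high-end estimate
    "ram_2_sticks": 6,          # ~3 W per stick
    "sata_ssd": 3,
    "fans_5x": 30,              # ~6 W per fan
    "mouse_and_keyboard": 5,
}

total = sum(components.values())
print(total)  # 827 -- roughly the ~830 W figure above
```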
Still though, my point was that a 750 watt power supply can handle significantly more power hungry components. The T.D.P. of the 4080 is 320 watts, and the suggested P.S.U. is only 700 watts.[www.techpowerup.com]
A 6700xt[www.amd.com] doesn't consume as much power as a 4080. A.M.D. suggests typical board power is 230 watts and only recommends a 650 watt supply. Those figures fall in line with the board power and recommended minimum P.S.U. Sapphire Pulse 6700 XT O.C.[www.sapphiretech.com]
Powercolor doesn't list a T.D.P. or board power of their Red Devil variant[www.powercolor.com], but they recommend a 700 watt power supply. Based on that, 'let's suppose the Red Devil variant actually consumes 50 watts more than a typical 6700 xt, just to be on the safe side.
What we have is a 280 watt G.P.U., a 170 watt C.P.U. for the 9950x (the most power hungry processor of the lot), maybe an 80 watt motherboard and 50 watts for the rest of the internal components.
280+170+80+50=580.
Now I'm not so sure about how a power supply is rated. Is the rating supposed to be the power it delivers to the system, or the power it draws from the wall?
Regardless, the worst case scenario is that it's the latter, so let's just go with that for now. An 80+ rated P.S.U. is supposed to be at least 80% efficient under load, and 80% of 750 is 600. That gives us at least 20 watts of overhead. Plus the odds are that Shredder isn't going to be using a 9950x, and the step down from the 9950x to the 9900x makes it more like 120 watts, which is 50 watts less than the 9950x. In that case we have 70 watts of overhead. The other two processors have a fairly nominal 65 watt T.D.P., which is actually lower than the T.D.P. of the 1700x (which is 95). A 750 watt power supply seems as if it should be fine no matter which of these processors is chosen.
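The worst-case reading above (rating = wall draw, 80% efficiency) can be checked per processor. The TDPs are the figures quoted in the thread, and I'm assuming the two 65 watt parts are the 9700X and 9600X:

```python
# Pessimistic reading: the 750 W rating is draw *from the wall*, and the
# unit is only 80% efficient, so usable output is 750 * 0.80 = 600 W.
PSU_WATTS = 750
USABLE = PSU_WATTS * 0.80        # 600 W
BASE_LOAD = 280 + 80 + 50        # GPU + motherboard + other components = 410 W

cpu_tdps = {"9950X": 170, "9900X": 120, "9700X": 65, "9600X": 65}
headroom = {cpu: USABLE - (BASE_LOAD + tdp) for cpu, tdp in cpu_tdps.items()}
print(headroom)  # 9950X: 20 W, 9900X: 70 W, 65 W parts: 125 W each
```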
Are we all on the same page now?
For optimal efficiency, wattage should be based more on the load that's going to be on it most of the time. Get one with double that wattage so that it runs as close to 50% load as possible most of the time; that's the most power efficient point, and it's much easier on the power supply than running at 80%+ load.
No you just can't properly interpret what you read. It clearly says '450W or greater'. Many of the aftermarket 4090 have been well known to be able to pull upwards of 600W. They can use around 450W when the power limit and voltage is turned down. You can't go by NVIDIA specs for aftermarket GPUs as many come with higher specs out of the box due to being pre-OC'ed
We're not on the same page 'cause you're lagging behind in the reading. We should've flipped past that page already, since I admitted to underestimating the draw of the card. The basis of your objection, though, didn't regard the power supply Nvidia recommended. It regarded the power draw of the card. Reference spec is that the 4090 draws 450 watts.
If Nvidia were accounting for 600 watts of power, the math works out so that they should be recommending a thousand watt P.S.U., because the LGA 1700 i9 series C.P.U. they anticipate you using will consume at least 245 watts of power.
No, I'm interpreting the spec. sheet correctly. I'm looking at the "total graphics power" portion of the spec. sheet, which only reads "450 watts". The part that reads "or greater" is the part that regards the power handling of the auxiliary 12VHPWR cable or adapter. They're saying you at least need a 12VHPWR cable[videocardz.com] that's rated to handle 450 watts of power, and one rated for 600 watts is okay to use because it's in excess of the anticipated power draw of the card.
In any case, the 4090 doesn't particularly matter for the purposes of this topic, since Shredder plans to use a 6700xt, and any mention of the 4090 on my part was just a backfired piece of rhetoric to illustrate that the power requirements of a 4090 are so far beyond those of the 6700xt that he could put whatever motherboard-compatible C.P.U. he wants in the system and still be okay, which still holds true and seems to be something we all agree upon.
Also, "a 4090" does not mean any particular 4090, so we go by reference spec...
Even the FE model can draw up to 500W in a peak spike. On a cheap power supply that isn't ATX 3.0, such as the OP's CX750M, that isn't sustainable, and the unit could easily shut off if the spikes trigger any of its protections. It's happened with weaker cards from the RX 6000 and RTX 30 series, like the 6900 XT and 3080, which can also have high transient power draw.
And yeah, you CAN make assumptions based on the FE model, but there's a flaw in your logic: most people don't buy the Founders Edition models. They're buying custom models from ASUS, MSI, Gigabyte, etc. because those models tend to be more available and often cheaper. The FE is 550-600 dollars more expensive than the cheapest 4090 models; people aren't paying that much more for effectively the same card, they'd just get the cheaper card and buy a better power supply anyway.
And then there's also the fact that a better and more efficient power supply will also waste less power than a cheaper old power supply that's being pushed hard. It's a stupid idea to put that kind of load on a Tier C power supply when there are actual older Tier A units that have also struggled with it.
ASUS ROG STRIX model[www.techpowerup.com]
Just for reference, a high end model like the ROG STRIX draws over 500W at a consistent peak without power spikes, and over 550W during a spike. That's why ASUS recommends at least 850W, but if you're using a 14900K, 1000W minimum and 1200W is a safer bet. And if your average total system draw is around 600W while playing games that are very demanding on the CPU and GPU, then a 1200W unit would be the most power efficient option for the PSU; running at 50% load is as ideal as it gets for it. Someone who can afford these top end cards can afford to splurge on a good 1000-1200W power supply.
For a CPU like the 7800X3D, even the ROG STRIX would realistically be fine with an 850W ATX 3.0 unit, provided it isn't a complete crap unit, because that CPU can draw as little as a freaking 50 watts while gaming. It SIPS power whereas the 14900K CHUGS, and gets similar performance.
If you go with a Ryzen 7 or 9 series + 4090, then you want an 850 watt quality gold or platinum PSU. If it's a 13th/14th Gen i7/i9 + 4090, then you want a 1000 watt PSU.
Founders Edition and reference are technically two different things. Founders Edition is the end-user-focused card Nvidia themselves make, whereas reference is the spec Nvidia sends out to its manufacturing board partners to try and match. There are reference spec cards out there that aren’t made by Nvidia, and while it might be reasonable to consider that some overclocked cards might need more than that, even some of the stock overclocked cards match Nvidia’s 850 watt recommendation, such as the M.S.I. Ventus O.C.[www.msi.com] and the Asus Tuf O.C.[www.asus.com] Asus even goes so far as to state this in their footnotes:
This should at least be enough to establish that it’s perfectly realistic to be able to buy 4090 where the stipulated specification is an 850 watt power supply, even with a stock overclock, which is all I need to establish for my statement to remain reasonably accurate.
Regarding the real-world figures of the Strix, 550 watts is only achieved during 10 millisecond excursions, and it’s honestly rather irrelevant. Although I have difficulty finding information for the CX750 specifically, I already stated that a power supply should be able to handle excursions in excess of its rating, and the general rule of thumb seems to be that it can handle an excursion of 130% of its rating, according to M.S.I.[www.msi.com]
Ergo, a 750 watt power supply should be able to handle a 225 watt excursion that brings the system up to 975 watts. An excursion up to 550 is, at most, 100 watts above the 450 spec. Now the 30% excess isn’t a standard per se, but 100 watts is also under half of the excess power handling. The reason excursion data is so hard to find is likely because the anticipated threshold is so high that it is of no relevance to a builder speccing out a system. We’re only concerned about sustained power draw.
This is not in consideration of A.T.X. 3.0 or P.C.I.E. 5, either. Power supplies adhering to either of those standards are supposed to be able to handle an excursion that is double the rating of the power supply. Ergo, a 750 watt A.T.X. 3.0 power supply should be able to handle an excursion that brings the total system power consumption up to 1500 watts.
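The two excursion rules above work out like this (integer math to keep the figures exact; the 130% figure is M.S.I.'s rule of thumb, the 200% figure is the A.T.X. 3.0 / P.C.I.E. 5 requirement for short spikes):

```python
# Transient ("excursion") tolerance under the two rules discussed above.
def excursion_limit(rated_watts, percent):
    # e.g. percent=130 means the unit can briefly supply 130% of its rating
    return rated_watts * percent // 100

print(excursion_limit(750, 130))  # 975  -- rule-of-thumb spike tolerance
print(excursion_limit(750, 200))  # 1500 -- ATX 3.0 / PCIe 5 spike requirement
```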
And regarding sustained power draw, 500 watts is only sustained using Furmark, which is in part a G.P.U. power handling stress test that will push a card beyond its anticipated limits and has, in the past, been known to kill hardware as such. In realistic practical application, the most the 4090 Strix sustains is 473 watts, with maximum settings and ray tracing enabled. That's 23 watts above the 450 watt reference figure, so the 830 watt system total I previously calculated for a 4090/i9 14900 system becomes 853 watts. While we do realistically exceed the 850 watt threshold, it's only by 3 watts. That's rather unlikely to make an actual difference, and at that point we need to take manufacturing tolerances into consideration.
Now, if you actually go to Asus's provided power supply U.R.L., they do specify a thousand watt power supply for an i9/RTX 4090 configuration, but that webpage mentions accounting for some extra overhead for efficiency and upgrading headroom. If we cross-reference with Corsair’s recommended process for selecting a power supply, it’s a simple four step process[www.corsair.com] that goes something like this:
Take the T.D.P. of your C.P.U., add the T.D.P. of your G.P.U., add 150 watts to account for all other components and peripherals, and finally add another 150 watts for efficiency and future upgrades. That last step is largely optional though, and if it’s in part for upgrading headroom, you’re expected to dip into it eventually.
We can infer that the process A.M.D., Nvidia and some manufacturers use to determine how strong of a power supply to recommend goes something like this:
Take the T.D.P. of the strongest processor on the market, since we can’t predict which one the customer wants to use, add the anticipated power draw of our G.P.U., add 150 watts to account for all other components and peripherals, then round up to the next highest 50 watt increment. Account for no additional overhead.
At the time the 4090 was introduced to market, the hungriest processor was the 12900, which has a T.D.P. of 245 watts, and my hypothesis checks out rather well mathematically speaking: 245+450+150=845. Round up to 850 because nobody makes 845 watt power supplies.
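The hypothesized sizing rule above can be written out as a small function. To be clear, this is my reconstruction of what the vendors appear to do, not anything they publish; the TDP and board power figures are the ones quoted in this thread:

```python
import math

# Hypothetical reconstruction of the vendor sizing rule described above:
# hungriest CPU's TDP + GPU board power + 150 W misc, rounded up to 50 W.
def recommended_psu(cpu_watts, gpu_watts, misc=150, step=50):
    return math.ceil((cpu_watts + gpu_watts + misc) / step) * step

print(recommended_psu(245, 450))  # 850 -- matches Nvidia's 4090 recommendation
print(recommended_psu(245, 230))  # 650 -- matches AMD's 6700 XT recommendation
```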
I was in error, but the only error I admit to is the error of underestimating how much power the other components might draw. Considering that it’s a little discussed topic, and fans and R.A.M. sticks only take a handful of watts each, that’s more of an honest mistake than gross negligence, especially considering it's often questioned whether manufacturers overestimate their recommendations.
If we apply this process to the RX 6700 xt, it still checks out. 230+245+150=625. Round up to 650 and you get A.M.D.'s power supply recommendation for the 6700xt.[www.amd.com] PowerColor actually recommends a 700 watt power supply for the Red Devil variant[www.powercolor.com], but this is possibly due to the stock overclock pushing past A.M.D.'s base specification, which is possible with as little as 35 watts of extra consumption. Regardless, 700 watts is under 750 watts, so Shredder should be fine with his current power supply. He could even upgrade to an RTX 4080 Super[www.nvidia.com] or a 7900xt[shop-us-en.amd.com] and remain within the recommended reference spec of 750 watts. I’m thinking it’s a bit too soon for that with the RTX 5000 series cards around the corner though.
Plus, unlike the manufacturers, we actually know what the actual T.D.P. of the components we’re choosing is, so we can base our estimate upon that, rather than the hypothetical maximum the customer might elect to use. If we wanted to cut it as close as possible to stick to a budget, the C.P.U.+G.P.U.+150 equation works out so that you could hypothetically get away with running a 9950x/6700xt on a 550 watt power supply, or a 9900x/6700xt with a 500 watt supply, provided the supplementary connections were there. I don’t seem to be the only one to think this either, since Techpowerup mentions a 550 watt power supply on their specs. page for the RX 6700xt for some reason. That could just be a typo for 650, and even if not, it assumes the reference wattage of 230 watts instead of whatever unspecified excess the Red Devil variant consumes.
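Cutting it as close as described works out like this. These are the thread's quoted TDPs with no rounding headroom added, so treat it as the bare-minimum illustration, not a recommendation:

```python
# Bare-minimum PSU sizing: CPU TDP + GPU board power + 150 W misc, no rounding.
GPU_6700XT = 230  # AMD's reference board power figure

psu_9950x = 170 + GPU_6700XT + 150  # 9950X build
psu_9900x = 120 + GPU_6700XT + 150  # 9900X build
print(psu_9950x, psu_9900x)  # 550 500
```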
Regarding efficiency, I see it as utterly irrelevant to picking the wattage of a power supply. Yes, it is technically true that a power supply runs most efficiently at 50% load, but the differential between that and running it at full load is rather marginal. It’s only 3% at bronze level based on 80+ certification thresholds, and the CX750M is no different, and this wasted amount decreases as the load reduces, since a lower load represents a smaller amount of conversion. Running a 750 watt power supply at 100% load only wastes an additional 23 watts, and running it at 20% load only wastes an additional 5 watts.
Moreover, you're probably not going to make an accurate guess as to how much power your system consumes as a typical load unless you already have most of the components in hand, as the specs. only concern themselves with maximum loads.
If you are worried about it, the way the 80+ certification thresholds[www.msi.com] work, it’d generally be better to buy the next certification up if you can afford it, because the minimum efficiency of a higher level is approximately the same as the 50% load efficiency of the level below it. So while bronze power supplies are 85% efficient at best, silver power supplies are 85% efficient at least; and whereas silver power supplies are 88% efficient at most, gold power supplies are at least 87% efficient and at most 90% efficient. Platinum is least efficient at 100% load, where it's 89% efficient; at 20% load it's at least 90% efficient and at 50% load at least 92%. Simply put, it's not worth worrying about 50% load unless you're buying a titanium power supply, since there is no higher level of certification.
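The tier comparison above can be laid out as a table. These are the standard 80 Plus minimum-efficiency thresholds at 115 V for 20% / 50% / 100% load, matching the percentages quoted above:

```python
# 80 Plus minimum efficiencies (115 V) at 20%, 50% and 100% load.
tiers = {
    "Bronze":   {0.20: 0.82, 0.50: 0.85, 1.00: 0.82},
    "Silver":   {0.20: 0.85, 0.50: 0.88, 1.00: 0.85},
    "Gold":     {0.20: 0.87, 0.50: 0.90, 1.00: 0.87},
    "Platinum": {0.20: 0.90, 0.50: 0.92, 1.00: 0.89},
}

# The point above: each tier's *worst* guaranteed efficiency roughly matches
# the tier below's *best* (50% load) figure, so stepping up a certification
# tier beats trying to hold a unit at exactly 50% load.
for name, effs in tiers.items():
    print(f"{name}: worst {min(effs.values()):.0%}, best {max(effs.values()):.0%}")
```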
Plus, the components should only sustain their full rated wattage consumption in the worst case scenarios. Most of the time, a system should be running at closer to 50% load anyway, without needing to make special accommodations for the wattage.
The way general pricing trends are going, you’d rather buy an 850 watt platinum power supply than a 1000 watt gold supply if efficiency is the concern. It'll likely be cheaper and definitely more power efficient.
The only real fault in my process is that I would have never considered that the needed headroom for components and peripherals would take up to 150 watts. It’s a little discussed topic in my experience, and things like S.S.D.s, R.A.M. sticks and fans take such small amounts of energy to run individually. A mouse and keyboard together might only need a single watt. That’s more of an honest mistake than gross negligence, esp. since it’s often questioned whether the manufacturers overrate their P.S.U. recommendations, which they actually seem to do by considering the strongest C.P.U. on the market, rather than the C.P.U. that’s likely to be paired with the G.P.U. in the first place.
750 watts is too little for a 4090, but 850 should be fine generally speaking.
They certainly have little to no reason to underrate the P.S.U. recommendation. They’re not gaining enough extra sales on low volume products like high end G.P.Us. by underestimating the power draw, and would likely rather avoid an unnecessary R.M.A.
Personally, I think there is no case in which a Red Devil 6700 xt needs a power supply that exceeds 750 watts. You shouldn’t upgrade your P.S.U. until you decide to upgrade your G.P.U., because there are a few disadvantages to a premature upgrade.
The first is that your warranty starts ticking down again the moment you buy a new component. You probably shouldn’t buy a new component until you actually need in excess of 750 watts of power handling. Even going by the P.S.U. recommendation of your card, you’re already in excess of what you need.
The second is that it’s too committal to buy a new part when you don’t need it. You’re going to be spending somewhere in the ballpark of $150 to buy a 1200 watt gold rated P.S.U., and not only will you not be taking advantage of the extra 550 watts it provides you, but power consumption trends tend to go up with time. That means by the time you’re thinking of buying an A.M.D. RX 9000 or Nvidia RTX 7000 series G.P.U. on your next upgrade cycle, you might actually need more than 1200 watts. You can’t actually know until you have the need.
The third is that if you overspec your P.S.U., running it at a small fraction of its rating can be just as inefficient as running an underspec'd one at full load, if not more so. This defeats the purpose of buying a higher wattage to increase power supply efficiency in the first place.
It’d be a shame to spend $150 on a 1200 watt P.S.U. and not realize any benefit from it. I’d recommend saving the money for now, and investing it in future proofing the next P.S.U. you buy when you finally decide to upgrade your G.P.U.
The only thing I’m concerned about is the P.S.U.'s age. Even the longest P.S.U. warranties are only 10 years, and although you might hope that a product would last beyond its warranty period, the CX750M’s is significantly shorter, at only 3 years. It’d really suck to have a leaky capacitor blow and take out other components alongside it. It’s doubtful that will happen, but considering you’d have hundreds of dollars worth of new components hooked up to it, it might be better safe than sorry.
If you do upgrade your power supply, accounting for some headroom above what you actually need so you don’t have to buy another power supply again in the future might be wise, but there’s little reason to rush it and with an adequate power supply already in hand, it is a bit too late to take that into consideration. You’re fine with your current supply for now anyway, insofar as wattage is concerned. I'd lean more heavily towards a higher certification rating than higher wattage though.