Got a GTX 650 and I don't know what amount of wattage it needs
As the title implies, I'm trying to figure out if 300W is enough, because I'm not good with hardware and can't find a straight answer on the internet.
Showing 16-30 of 30 comments
Shaggin'Wagon 11 Jun 2024 at 20:56
Originally posted by _I_:
Nvidia posts all the specs on their site
cores, clock speeds, RAM, memory speeds, TMUs, ROPs, everything
performance you can calculate out, or use GPU-Z results to show them

https://www.nvidia.com/en-us/geforce/graphics-cards/geforce-gtx-650/specifications/

For some time, AMD only listed "RX x00" or whatever in Device Manager, dxdiag, and other software as the GPU, which forced people to use GPU-Z and hwcompare to figure out what GPU they actually had, since those listed the core counts along with TMUs and ROPs.

I guess I need to repeat myself, as it seems you do not read things: the problem with hwcompare is that no one knows where they get their information from. It could be the official specifications, or it could be literally anything they decide to fabricate that looks good. Because they do not list the sources for their information, they cannot be trusted to provide factual specifications about any hardware.

No one can ever refer to hwcompare for the actual correct specifications for any hardware. Everything that they state there must be considered suspect unless we can verify their sources.
Bad 💀 Motha 11 Jun 2024 at 22:57
Originally posted by Shaggin'Wagon:
Originally posted by _I_:
Nvidia posts all the specs on their site
cores, clock speeds, RAM, memory speeds, TMUs, ROPs, everything
performance you can calculate out, or use GPU-Z results to show them

https://www.nvidia.com/en-us/geforce/graphics-cards/geforce-gtx-650/specifications/

For some time, AMD only listed "RX x00" or whatever in Device Manager, dxdiag, and other software as the GPU, which forced people to use GPU-Z and hwcompare to figure out what GPU they actually had, since those listed the core counts along with TMUs and ROPs.

I guess I need to repeat myself, as it seems you do not read things: the problem with hwcompare is that no one knows where they get their information from. It could be the official specifications, or it could be literally anything they decide to fabricate that looks good. Because they do not list the sources for their information, they cannot be trusted to provide factual specifications about any hardware.

No one can ever refer to hwcompare for the actual correct specifications for any hardware. Everything that they state there must be considered suspect unless we can verify their sources.

You're being ridiculous. Why does everyone need to post sources? My sources are my sources, end of story. When anyone can easily and accurately google the specs and bring them right up, what's the issue? It's not as if someone like TechPowerUp (makers of GPU-Z) or HWCompare or others has any need or reason to disclose their sources when it's all public domain knowledge.

Posting a source is more of a need when it's a "rumor"
When it's truth, it's truth, nothing more/less.
Shaggin'Wagon 12 Jun 2024 at 1:48
Originally posted by Bad 💀 Motha:
You're being ridiculous. Why does everyone need to post sources? My sources are my sources, end of story. When anyone can easily and accurately google the specs and bring them right up, what's the issue? It's not as if someone like TechPowerUp (makers of GPU-Z) or HWCompare or others has any need or reason to disclose their sources when it's all public domain knowledge.

Posting a source is more of a need when it's a "rumor"
When it's truth, it's truth, nothing more/less.
It appears that you are confused. I will help to enlighten you: I was not speaking to you, referring to you, nor did I ever even mention you. I was responding to user _I_, not you. This does not concern you.
_I_ 12 Jun 2024 at 6:05
What's the difference?

I give sources and the specs are correct.

Anyone can edit the wiki.
Last edited by _I_; 12 Jun 2024 at 6:05
Bad 💀 Motha 12 Jun 2024 at 6:11
Originally posted by Shaggin'Wagon:
Originally posted by Bad 💀 Motha:
You're being ridiculous. Why does everyone need to post sources? My sources are my sources, end of story. When anyone can easily and accurately google the specs and bring them right up, what's the issue? It's not as if someone like TechPowerUp (makers of GPU-Z) or HWCompare or others has any need or reason to disclose their sources when it's all public domain knowledge.

Posting a source is more of a need when it's a "rumor"
When it's truth, it's truth, nothing more/less.
It appears that you are confused. I will help to enlighten you: I was not speaking to you, referring to you, nor did I ever even mention you. I was responding to user _I_, not you. This does not concern you.

I'm just saying that for general purposes. There are too many idiots on this forum, and then they be like "Oh, where are your sources to back up your information?" LOL, ain't nobody got time for that BS, and 90% of the time I talk on here I'm on my phone's web browser. I ain't got no time to be pulling up all that info. If you can't look it up yourselves, then you need to learn how.
Shaggin'Wagon 12 Jun 2024 at 6:16
Originally posted by _I_:
Anyone can edit the wiki.
I already explained that to you. In the case of the Nvidia wikipedia page: NO. Random people can NOT edit it. Please take a moment to go re-read my comment where I explained it to you again. Here's a link for you since you missed it the first time: https://steamcommunity.com/discussions/forum/11/4339860642014530491/?tscn=1718197505#c4333106094377805752
Bad 💀 Motha 12 Jun 2024 at 6:20
You also cannot always just go off of what NVIDIA has to share for that info. There are plenty of OEM GPU cards for HP, Dell, and such that have different specs, memory counts, max TDP, etc.

Much better to use and go by the Device ID shown within GPU-Z
Last edited by Bad 💀 Motha; 12 Jun 2024 at 6:20
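(Editor's note: for anyone who wants to check the Device ID themselves, here is a minimal Python sketch, assuming a Linux machine with lspci available; GPU-Z on Windows shows the same vendor:device pair in its Device ID field. 0x10DE is NVIDIA's PCI vendor ID, and the device ID pins down the exact chip regardless of the marketing name shown in Device Manager.)

    # Sketch: list GPUs with their numeric PCI [vendor:device] IDs via lspci.
    # Assumes Linux with pciutils installed; the IDs shown are the same ones
    # GPU-Z reports on Windows and can be looked up in a PCI ID database.
    import subprocess

    def list_gpu_ids():
        # 'lspci -nn' prints the numeric [vendor:device] ID next to each device name
        out = subprocess.run(["lspci", "-nn"], capture_output=True,
                             text=True, check=True).stdout
        return [line for line in out.splitlines()
                if "VGA compatible controller" in line or "3D controller" in line]

    if __name__ == "__main__":
        for gpu in list_gpu_ids():
            print(gpu)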
Shaggin'Wagon 12 Jun 2024 at 6:23
Originally posted by Bad 💀 Motha:
You also cannot always just go off of what NVIDIA has to share for that info. There are plenty of OEM GPU cards for HP, Dell, and such that have different specs, memory counts, max TDP, etc.

Much better to use and go by the Device ID shown within GPU-Z

In the case of Nvidia this is not true. Nvidia defines the specifications for a given model number and a maximum allowed power limit for each model. All vendors have to comply with Nvidia's mandated limitations, even OEMs. OEM versions of Nvidia cards are not fundamentally different from any other version made by anyone else.

For example, all versions of the RTX 4070 have the same amount of VRAM, even OEM and AIB versions.
_I_ 12 Jun 2024 at 6:24
And there are cases like the GTX 970, where Nvidia's marketing didn't even know the complete specs.
Bad 💀 Motha 12 Jun 2024 at 6:30
Originally posted by Shaggin'Wagon:
Originally posted by Bad 💀 Motha:
You also cannot always just go off of what NVIDIA has to share for that info. There are plenty of OEM GPU cards for HP, Dell, and such that have different specs, memory counts, max TDP, etc.

Much better to use and go by the Device ID shown within GPU-Z

In the case of Nvidia this is not true. Nvidia defines the specifications for a given model number and a maximum allowed power limit for each model. All vendors have to comply with Nvidia's mandated limitations, even OEMs. OEM versions of Nvidia cards are not fundamentally different from any other version made by anyone else.

For example, all versions of the RTX 4070 have the same amount of VRAM, even OEM and AIB versions.

There were plenty of variants of OEM GPUs for HP and Dell made for lower power needs that are not specified anywhere with information directly from NVIDIA for the end-user/consumer to see. The GTX 650 is one of those; they also did that for many of the lower-end GTX 700 series GPUs, such as the 750 and lower models.

But overall there is no need to drag this out any further; there weren't many variants of the GTX 650
Shaggin'Wagon 12 Jun 2024 at 6:36
Originally posted by Bad 💀 Motha:
There were plenty of variants of OEM GPUs for HP and Dell made for lower power needs that are not specified anywhere with information directly from NVIDIA for the end-user/consumer to see. The GTX 650 is one of those; they also did that for many of the lower-end GTX 700 series GPUs, such as the 750 and lower models.
I'm sorry, but that's just not true. Nvidia has explicit control over their video cards. They do not allow OEMs to modify or change Nvidia video cards at all. No modified video memory. No modified power limits. Nothing is allowed to be changed. OEM versions of Nvidia video cards (at least the ones using the same model numbers as normal consumer cards) are exactly the same as AIB versions.

The main difference, more than anything, is that some OEMs use incredibly crappy cooling solutions on their cards, which causes them to run artificially hot, which in turn makes them draw more power than they otherwise would and throttle to lower clocks due to how Nvidia's boost works.
Last edited by Shaggin'Wagon; 12 Jun 2024 at 6:37
_I_ 12 Jun 2024 at 6:38
Dell and other manufacturers can make their own PCB and cooler designs;
Nvidia only sells the silicon.

Remember back when MSI/Gigabyte/Asus and the others would sell 'OC' versions? They binned the GPU cores, and the ones that performed better (better cores, memory controller, or whatever) got paired with better RAM and coolers and were factory overclocked.

Back on topic:
The GTX 650 is a 65W GPU.
It has a 6-pin PCI-E power plug in case a board can't deliver 65W through the PCI-E slot,
and it also makes sure the PSU has a PCI-E power plug, so it should be more than strong enough to run the build with the GPU.
Last edited by _I_; 12 Jun 2024 at 6:42
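(Editor's note: to put the original 300W question in numbers, here is a rough power-budget sketch in Python. The 65W GPU figure is the TDP quoted above; every other wattage is an assumed, typical value for a budget desktop and should be swapped for the actual parts in the build.)

    # Rough sketch: will a 300W PSU cover a GTX 650 build?
    # Only the GPU figure comes from the thread (~65W TDP); the rest are
    # assumed placeholder values for a typical budget desktop.
    PSU_WATTS = 300

    parts = {
        "GTX 650 (TDP)": 65,
        "CPU (assumed 65W-class)": 65,
        "Motherboard + RAM (assumed)": 40,
        "Drives, fans, USB devices (assumed)": 30,
    }

    total = sum(parts.values())
    headroom = PSU_WATTS - total

    print(f"Estimated peak draw: {total} W of a {PSU_WATTS} W PSU")
    print(f"Headroom: {headroom} W ({headroom / PSU_WATTS:.0%})")
    # With these assumptions the build lands around 200 W, which is why a
    # decent-quality 300 W unit is generally considered enough here.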
Shaggin'Wagon 12 Jun 2024 at 6:39
Originally posted by _I_:
It has a 6-pin PCI-E power plug in case a board can't deliver 65W through the PCI-E slot
This is also not true. Not all GTX 650s have a PCI-E power plug. Quite a lot do not have one at all. Some do, some do not.
_I_ 12 Jun 2024 at 6:43
Originally posted by Shaggin'Wagon:
Originally posted by _I_:
It has a 6-pin PCI-E power plug in case a board can't deliver 65W through the PCI-E slot
This is also not true. Not all GTX 650s have a PCI-E power plug. Quite a lot do not have one at all. Some do, some do not.
That may be the board manufacturer's design; not all are done by Nvidia.
Nvidia's spec lists the PCI-E 6-pin power connector.

The PCI-E x16 slot spec is 75W, but many OEM or low-end boards do cheap out and can't deliver it.
Last edited by _I_; 12 Jun 2024 at 8:32
Shaggin'Wagon 12 Jun 2024 at 14:22
Originally posted by _I_:
Originally posted by Shaggin'Wagon:
This is also not true. Not all GTX 650s have a PCI-E power plug. Quite a lot do not have one at all. Some do, some do not.
That may be the board manufacturer's design; not all are done by Nvidia.
Nvidia's spec lists the PCI-E 6-pin power connector.
Of course they're slightly different. For many, many years (specifically back when the GTX 650 was new), Nvidia did not even make their own versions of the cards. They only sold their chips to third-party companies who made the video cards.

The only things that were the same across all GTX 650s were the technical specifications and power limits.

Date Posted: 1 Jun 2024 at 12:58
Comments: 31