ZeekAncient 15 Nov 2022 at 20:36
Thoughts on Nvidia's Geforce RTX 4080...
I was wondering what people thought of the RTX 4080.

I have been reading some initial reviews, and I have some mixed feelings on it myself.

Here are some of the reviews that I have read:

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4080-review

https://www.techspot.com/review/2569-nvidia-geforce-rtx-4080/

https://www.pcgamer.com/nvidia-rtx-4080-16gb-review-performance-benchmarks/

I have read some others, but I figured I would post these, as their scores vary from one another and they provide the same important and relevant info as other reviews you will find.

The consensus is that the 4080 provides excellent gen-on-gen performance increases over the 30 series, and compared to the 4090, is very power efficient, and probably more worth purchasing if you are eyeing resolutions lower than 4K, like 2560 x 1440.

At 4K, it can be up to 30% less powerful than an RTX 4090, but it is still a great performer at 4K, outperforming the flagships of the last generation.

However, it seems to be way overpriced. Obviously not a surprise here, as I think everyone was expecting this to be the case. But with the advent of AMD's 7900 XTX and 7900 XT, to be released at $999 and $899 respectively, I was wondering what you all thought about the 4080's performance and how it will compare to AMD's RDNA 3 offerings.

Some seem to think that even the 7900 XTX will fall short of the 4080's overall performance, while others think that AMD's flagship will easily outperform the 4080 in rasterization performance, but fall short when it comes to ray tracing.

While most don't seem to care about ray tracing, or upscaling tech for that matter, I for one DO care about ray tracing performance, and I love upscaling tech, especially at 4K. Most games nowadays are going to have some kind of ray tracing implemented, and more and more new titles are going to implement it better and more extensively. I don't think it will be very long before we see games completely rendered with ray tracing. And not just older games like Quake II and Portal.

So, I think ray tracing performance is very important, and I don't want my next generation GPU to fall flat in that area. So, that is why I am very curious to see how AMD's next GPUs will perform in that area. I mean, if the 7900XTX can outperform the 4080 in rasterization but then be more on the level of the 40 series midrange cards when ray tracing is implemented, that might turn me off.

And then when it comes to upscaling tech, I am very interested in what DLSS 3.0 has to offer. DLSS 2.0 is already fantastic, but the performance gains that 3.0 seems to be providing look like a game changer. Considering that DLSS 3.0 can only be used on 40 series cards, unlike FSR, which can be used on any card, FSR 3.0 might need to be a game changer as well if I am going to consider buying an AMD card.

I have been on Nvidia a long time, but I will be wanting to upgrade my GPU when I upgrade my display to a 4K 120Hz+ display. The RTX 4090 is just too expensive and too power hungry for me to realistically consider it. And while the 4080 seems to push all the right buttons for me when it comes to performance and power consumption, even if it still uses the 16-pin power connector, that $1199 price tag, and even more when you consider AIB cards, is just way too high.

If it had been $999 or less, I think it would have been a no-brainer. But it will be hard to drop $1200 or more on a GPU when AMD's $999, or even $899, offerings provide better rasterization performance. Not to mention more VRAM. Even if the ray tracing will not be quite as good, and DLSS 3.0 will be a no-go.

I've been impressed by the specs and prices of the 7900 XTX and XT, but I guess now it all depends on how well they perform in ray tracing and how good, and how well adopted, FSR 3.0 will be. I guess we will have to wait and see in December.

I'm kind of one that would like to stick with Nvidia, since I have been there for so long and it is comfortable for me, even with Nvidia's shady business practices, but $1200 or more for a GPU is just too much to stomach. The 4080 really should have been $200-$300 cheaper. But if AMD falls flat in ray tracing, and FSR 3.0 just can't match the quality of DLSS 3.0, I just can't see myself going the AMD route. Might have to wait for Nvidia to come to their senses and drop their prices....

I might be waiting long....

Anyway, those are my very long two cents on the matter...sorry....
Last edited by ZeekAncient; 15 Nov 2022 at 20:37
Showing comments 136-150 of 205
Seamus 22 Nov 2022 at 19:44
Originally posted by ZeekAncient:
But lets get back on topic. This thread was about the 4080. And it probably should be dead by now anyway. The GPU is fine, but Ngreedia has decided to overcharge incredibly for this thing and thus has made it a bad product.

The end, lol.
Yeah, a major MAJOR cut to the price would help matters as at a lower price point it'd probably honestly be a pretty decent card with how overbuilt the cooler is.
Jamebonds1 22 Nov 2022 at 19:46
Originally posted by ZeekAncient:
Ok, everyone needs to stop about undervolting.

And it is called undervolting Jamebonds1, no matter what you want to call it. In this industry, the PC building and PC gaming industry, people call it "undervolting" a GPU. No matter if that should be the correct term for it or not. It is what it is called.

And the consensus is, is that undervolting will not hurt your GPU. In fact, can be beneficial to the GPU. Especially in this day and age when overclocking a GPU does not yield the same results it once did.

But you can get lucky, like I did, and usually do, and still get some yields from traditional overclocking a GPU. So, until I do not find it beneficial, I will still OC my GPU in traditional ways. However, I might mess around with undervolting when I get a chance to see the results I get.

But lets get back on topic. This thread was about the 4080. And it probably should be dead by now anyway. The GPU is fine, but Ngreedia has decided to overcharge incredibly for this thing and thus has made it a bad product.

The end, lol.
I still don't understand how it is undervolting when it is within the operating voltage range and it is not your fault. So I use "low voltage" instead, to describe voltage decreased from normal but still within the operating range. I just don't agree with the article, because the writers are not EEs.

And besides, Seamus and some others are the ones who tried to force me to change my wording because of my poor English.
Jamebonds1 22 Nov 2022 at 19:47
Originally posted by Seamus:
Originally posted by Jamebonds1:
Nope, they are only built to prevent damage to the GPU.
That hasn't been the case for well over a decade.
I have proved that I'm correct.
No, you haven't.
Yes, I did. Now it is your turn, but no Wikipedia this time. The writer has to be an EE too. Without anything to prevent damage, undervolting and overvolting could get out of control and damage it.
Last edited by Jamebonds1; 22 Nov 2022 at 19:48
ZeekAncient 22 Nov 2022 at 19:49
Originally posted by Seamus:
Yeah, a major MAJOR cut to the price would help matters as at a lower price point it'd probably honestly be a pretty decent card with how overbuilt the cooler is.

Exactly. To save money, Nvidia and AIBs have just reused the same cooler shroud over the PCB as on the 4090, which has allowed the 4080 to have great thermals.

Some of these are still huge though. The ASUS ROG 4080, which would be a nice card to have, is 358 mm long. My RTX 3070 Ti is 300 mm long. Lol, that is a big difference.

I am not sure it would fit in my case, considering that I have front-mounted my AIO. It is also 3.5 slots thick, while my EVGA 3070 Ti is 2.75 slots thick. I thought my card was huge when I got it, but this ASUS card is GIGANTIC!!!
Last edited by ZeekAncient; 22 Nov 2022 at 19:51
Seamus 22 Nov 2022 at 19:49
Originally posted by Jamebonds1:
I still don't understand
We know.
Jamebonds1 22 Nov 2022 at 19:51
Originally posted by Seamus:
Originally posted by Jamebonds1:
I still don't understand
We know.
Know what? I still think it is the wrong word for decreased voltage that is within the operating voltage range. "Lower voltage" or "decreased voltage" is more proper. Even "power phase" is not the proper term to use, because it should be VRM.
Last edited by Jamebonds1; 22 Nov 2022 at 19:51
Seamus 22 Nov 2022 at 19:52
Originally posted by ZeekAncient:
Exactly. To save money, Nvidia and AIBs have just reused the same cooler shroud over the PCB as on the 4090, which has allowed the 4080 to have great thermals.

Some of these are still huge though. The ASUS ROG 4080, which would be a nice card to have, is 358 mm long. My RTX 3070 Ti is 300 mm long. Lol, that is a big difference.

I am not sure it would fit in my case, considering that I have front-mounted my AIO. It is also 3.5 slots thick, while my EVGA 3070 Ti is 2.75 slots thick. I thought my card was huge when I got it, but this ASUS card is GIGANTIC!!!
I've been content with my evga 2080 for quite a while.

I remain sad they've decided to leave the gpu marketplace due to nvidia's ♥♥♥♥.
Seamus 22 Nov 2022 at 19:53
Originally posted by Jamebonds1:
Know what?
We know you don't understand why undervolting is the correct term.

Your opinion doesn't matter when the entire computer industry uses the term.
r.linder 22 Nov 2022 at 19:54
Originally posted by Seamus:
Originally posted by ZeekAncient:
Exactly. To save money, Nvidia and AIBs have just reused the same cooler shroud over the PCB as on the 4090, which has allowed the 4080 to have great thermals.

Some of these are still huge though. The ASUS ROG 4080, which would be a nice card to have, is 358 mm long. My RTX 3070 Ti is 300 mm long. Lol, that is a big difference.

I am not sure it would fit in my case, considering that I have front-mounted my AIO. It is also 3.5 slots thick, while my EVGA 3070 Ti is 2.75 slots thick. I thought my card was huge when I got it, but this ASUS card is GIGANTIC!!!
I've been content with my evga 2080 for quite a while.

I remain sad they've decided to leave the gpu marketplace due to nvidia's ♥♥♥♥.
My hope is that Zotac will step up and get their ♥♥♥♥ together, because the big 3 won’t replace EVGA.
Seamus 22 Nov 2022 at 19:56
Originally posted by 尺.し工几句ヨ尺:
My hope is that Zotac will step up and get their ♥♥♥♥ together, because the big 3 won’t replace EVGA.
Zotac might be an okay choice if their shrouds weren't absolutely hideous. I mean, have you seen their 4090? That design should be in a dumpster alongside the AMD FX architecture.

Maybe they'll hire Vince Lucido and we'll get some cards that aren't absolutely awful looking.
Jamebonds1 22 Nov 2022 at 19:57
Originally posted by Seamus:
Originally posted by Jamebonds1:
Know what?
We know you don't understand why undervolting is the correct term.

Your opinion doesn't matter when the entire computer industry uses the term.
It is not opinion, it is fact. I provided evidence that undervolting could damage your GPU. Undervolting means it is below the manufacturer's declared voltage range.
ZeekAncient 22 Nov 2022 at 20:00
Originally posted by Seamus:
I've been content with my evga 2080 for quite a while.

I remain sad they've decided to leave the gpu marketplace due to nvidia's ♥♥♥♥.

I think we are seeing now why EVGA decided to pull the plug on the 40 series. The 4090 was one thing, but now that the 4080 has been released, and the industry wide consensus is that Nvidia has really overcharged for this, you can see why EVGA wanted out.

The AIBs don't make much profit on these things anyway, and EVGA probably less than most, so seeing how poorly the 4080 is selling because of Nvidia's pricing tactics, EVGA knew they were going to lose money on these cards.

It is clear Nvidia priced the 4080 how it is to force consumers to choose between the 30 series or going all the way for the 4090. But even then the 4090 is stuck in this melting adapter controversy. EVGA probably saw all this coming.

Look, if Nvidia needed to sell the 30 series this badly, then they could have just waited to release the 40 series. No one said they had to release these things this year. The 30 series was still fine for a while, and they could have waited. But TSMC probably tied their hands so they didn't have a choice, and the corporate schedule was to release them this year, so they kind of had to.

Doesn't mean we have to get screwed with the 4080 price. My thinking is that the price will go down once 30 series supply dries up, AMD releases the 7000 series, and they are ready to release other 40 series cards. They will cut the price of the 4080, call it a bargain, say they are the good guys, and put a 4080 Ti at that price point or slightly higher.
Seamus 22 Nov 2022 at 20:00
Originally posted by Jamebonds1:
It is not opinion,
No, it really is.

Undervolting gpus is a standard procedure.
I provided evidence that undervolting could damage your GPU.
No, you linked something unrelated to gpus.
Undervolting means it is below the manufacturer's declared voltage range.
Undervolt is and remains the proper term.
Jamebonds1 22 Nov 2022 at 20:01
Originally posted by Seamus:
Originally posted by Jamebonds1:
It is not opinion,
No, it really is.

Undervolting gpus is a standard procedure.
I provided evidence that undervolting could damage your GPU.
No, you linked something unrelated to gpus.
Undervolting means it is below the manufacturer's declared voltage range.
Undervolt is and remains the proper term.
No, it is not. Provide your evidence as I did. This is why I won't prove anything more, because you will just reject my most reliable source.

Also, did you see Zeek ask you to get back on topic? It shows you're no better than me.
Last edited by Jamebonds1; 22 Nov 2022 at 20:03
Seamus 22 Nov 2022 at 20:02
Originally posted by ZeekAncient:
I think we are seeing now why EVGA decided to pull the plug on the 40 series. The 4090 was one thing, but now that the 4080 has been released, and the industry wide consensus is that Nvidia has really overcharged for this, you can see why EVGA wanted out.

The AIBs don't make much profit on these things anyway, and EVGA probably less than most, so seeing how poorly the 4080 is selling because of Nvidia's pricing tactics, EVGA knew they were going to lose money on these cards.

It is clear Nvidia priced the 4080 how it is to force consumers to choose between the 30 series or going all the way for the 4090. But even then the 4090 is stuck in this melting adapter controversy. EVGA probably saw all this coming.

Look, if Nvidia needed to sell the 30 series this badly, then they could have just waited to release the 40 series. No one said they had to release these things this year. The 30 series was still fine for a while, and they could have waited. But TSMC probably tied their hands so they didn't have a choice, and the corporate schedule was to release them this year, so they kind of had to.

Doesn't mean we have to get screwed with the 4080 price. My thinking is that the price will go down once 30 series supply dries up, AMD releases the 7000 series, and they are ready to release other 40 series cards. They will cut the price of the 4080, call it a bargain, say they are the good guys, and put a 4080 Ti at that price point or slightly higher.
Did you see EVGA's "not 4090" some tech reviewers got sent?

Absolutely beautiful design with some fantastic choices in the hardware setup. And it'll never be.

Really hope the 7000's do well and absolutely murder nvidia's bottom line.

Posted: 15 Nov 2022 at 20:36
Messages: 205