ZeekAncient 15 November 2022 at 20:36
Thoughts on Nvidia's Geforce RTX 4080...
I was wondering what people thought of the RTX 4080.

I have been reading some initial reviews, and I have some mixed feelings on it myself.

Here are some of the reviews that I have read:

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4080-review

https://www.techspot.com/review/2569-nvidia-geforce-rtx-4080/

https://www.pcgamer.com/nvidia-rtx-4080-16gb-review-performance-benchmarks/

I have read some others, but I figured I would post these since their scores vary from one another while covering the same important and relevant info as the other reviews you will find.

The consensus is that the 4080 provides excellent gen-on-gen performance gains over the 30 series and, compared to the 4090, is very power efficient and probably the better buy if you are eyeing resolutions lower than 4K, like 2560 x 1440.

At 4K it can be nearly 30% slower than an RTX 4090, but it is still a great 4K performer, outperforming the flagships of the last generation.

However, it seems to be way overpriced. Obviously that is no surprise, as I think everyone was expecting it. But with AMD's 7900XTX and 7900XT set to release at $999 and $899 respectively, I was wondering what you all think about the 4080's performance and how it will compare to AMD's RDNA 3 offerings.

Some seem to think that even the 7900XTX will fall short of the 4080's overall performance, while others think that AMD's flagship will easily outperform the 4080 in rasterization but fall short when it comes to ray tracing.

While most don't seem to care about ray tracing, or upscaling tech for that matter, I for one DO care about ray tracing performance, and I love upscaling tech, especially at 4K. Most games nowadays ship with some kind of ray tracing, and newer titles are going to implement it more extensively and more effectively. I don't think it will be very long before we see games rendered entirely with ray tracing, and not just older games like Quake II and Portal.

So I think ray tracing performance is very important, and I don't want my next-generation GPU to fall flat in that area. That is why I am very curious to see how AMD's next GPUs will perform there. If the 7900XTX can outperform the 4080 in rasterization but then drop to the level of the 40-series midrange cards once ray tracing is enabled, that might turn me off.

And when it comes to upscaling tech, I am very interested in what DLSS 3.0 has to offer. DLSS 2.0 is already fantastic, but the performance gains that 3.0 seems to be providing look almost like a game changer. Considering that DLSS 3.0 can only be used on 40-series cards, unlike FSR, which can be used on any card, FSR 3.0 might need to be a game changer as well if I am going to consider buying an AMD card.

I have been on Nvidia a long time, but I will be wanting to upgrade my GPU when I upgrade my display to a 4K 120Hz+ display. The RTX 4090 is just too expensive and too power hungry for me to realistically consider it. And while the 4080 seems to push all the right buttons for me when it comes to performance and power consumption, even if it still uses the 16-pin power connector, that $1,199 price tag, and even more when you consider AIB cards, is way too high.

If it had been $999 or less, I think it would have been a no-brainer. But it will be hard to drop $1,200 or more on a GPU when AMD's $999, or even $899, offerings provide better rasterization performance, not to mention more VRAM, even if their ray tracing will not be quite as good and DLSS 3.0 is off the table.

I've been impressed by the specs and pricing of the 7900XTX and XT, but now it all depends on how well they perform in ray tracing and on how good, and how widely adopted, FSR 3.0 will be. I guess we will have to wait and see in December.

I'm inclined to stick with Nvidia since I have been with them for so long and it is comfortable for me, even with Nvidia's shady business practices, but $1,200 or more for a GPU is just too much to stomach. The 4080 really should have been $200-$300 cheaper. But if AMD falls flat in ray tracing and FSR 3.0 just can't match the quality of DLSS 3.0, I can't see myself going the AMD route. I might have to wait for Nvidia to come to their senses and drop their prices...

I might be waiting a long time...

Anyway, those are my very long two cents on the matter...sorry....
Last edited by ZeekAncient; 15 November 2022 at 20:37
Showing 151-165 of 205 comments
Seamus 22 November 2022 at 20:03
Originally posted by Jamebonds1:
No it is not. Prove your evidence as I did. This is why I won't prove anything, because you will just reject my most reliable source.
Your "reliable source" has nothing to do with video cards.

It also uses the term undervolting which you're trying to say isn't correct.

Just sod off. You're wrong and you aren't contributing anything.
Jamebonds1 22 November 2022 at 20:05
Originally posted by Seamus:
Originally posted by Jamebonds1:
No it is not. Prove your evidence as I did. This is why I won't prove anything, because you will just reject my most reliable source.
Your "reliable source" has nothing to do with video cards.

It also uses the term undervolting which you're trying to say isn't correct.

Just sod off. You're wrong and you aren't contributing anything.
Yes it is. Even my EE friend agreed with me. How can your video card work without MOSFETs like the one in the source? A MOSFET-free video card does not exist.
ZAP 22 November 2022 at 20:07
Repost for night crew :stein:

https://youtu.be/UmjhPuMI9Es
Seamus 22 November 2022 at 20:07
Originally posted by Jamebonds1:
Yes it is. Even my EE friend agreed with me. How can your video card work without MOSFETs like the one in the source? A MOSFET-free video card does not exist.
Oh so it's your friend that's the electrical engineer?

Funny, earlier you said you were.

That rings about as true as your claim you passed a college level English class.

Seriously. Go somewhere else if you're not going to contribute.
ZeekAncient 22 November 2022 at 20:07
Originally posted by Seamus:
Did you see EVGA's "not 4090" some tech reviewers got sent?

Absolutely beautiful design with some fantastic choices in the hardware setup. And it'll never be released.

Really hope the 7000s do well and absolutely murder Nvidia's bottom line.

Yes! From JayzTwoCents? Awesome-looking card. Nicely built. And the bracket that attaches to the case was tied right into the GPU, so it wouldn't even have sagged.

I believe Gamers Nexus did a video as well, which I am about to go watch, lol.

Anyway, that would have been the card to get, IMO. And it definitely isn't as overbuilt as Asus's behemoth. Though Asus's card does provide great thermals. Plus, Asus has had the middle fan spin in the opposite direction for a couple of generations now, which they say helps thermals.
Last edited by ZeekAncient; 22 November 2022 at 20:11
Jamebonds1 22 November 2022 at 20:08
Originally posted by Seamus:
Originally posted by Jamebonds1:
Yes it is. Even my EE friend agreed with me. How can your video card work without MOSFETs like the one in the source? A MOSFET-free video card does not exist.


Seriously. Go somewhere else if you're not going to contribute.
You too, and quit trying to force me to change the way I speak my English. You're not a college teacher.
Last edited by Jamebonds1; 22 November 2022 at 20:09
Seamus 22 November 2022 at 20:11
Originally posted by ZeekAncient:
Yes! From JayzTwoCents? Awesome-looking card. Nicely built. And the bracket that attaches to the case was tied right into the GPU, so it wouldn't even have sagged.

I believe Gamers Nexus did a video as well, which I am about to go watch, lol.

Anyway, that would have been the card to get, IMO. And it definitely isn't as overbuilt as Asus's behemoth.
Steve got one too, yeah. I believe Linus got one as well.

The card design is beautiful. I'm really going to miss them.

It's less that it's not overbuilt and more that it's built better. The higher fin density, relocated power connector, etc.
Last edited by Seamus; 22 November 2022 at 20:12
Seamus 22 November 2022 at 20:13
Originally posted by ZAP:
Repost for night crew :stein:

https://youtu.be/UmjhPuMI9Es
Kyle does love him some ♥♥♥♥ posting.

I still remember him being the first one to get DMCA'd during the whole Verge PC guide fiasco.
ZeekAncient 22 November 2022 at 20:16
Pretty much every card I have ever bought has been from EVGA, except for my 980 Ti, which was Asus. And guess what? That was the only card I have had that had something go wrong with it. It clocked well, but a year into using it one of the fans sh1t the bed. So I RMA'd it and Asus sent me a 1070. I bought another and SLI'd them. Then I sold them and my friend gave me an EVGA 1070 Ti for free. So now all my working PCs have an EVGA card in them: my current rig, which has an EVGA 3070 Ti; my last PC, which my Dad is using, with an EVGA 1070 Ti; and my backup/work/business PC, which has an EVGA 780 in it, lol.
Jamebonds1 22 November 2022 at 20:19
I still remember my first GPU was an EVGA GT9500.
ZeekAncient 22 November 2022 at 20:20
Nice! How did that GT9500 run Crysis? lol

I didn't have a good working 'gaming' PC when Crysis first released, so I didn't really get to try it until 2010, when I bought a PC with a GTX 460. And even that card couldn't run it maxed out at 1080p, lol.

Though that PC got me back into PC gaming, which I had been absent from for a couple of years.
Last edited by ZeekAncient; 22 November 2022 at 20:22
Seamus 22 November 2022 at 20:21
Originally posted by ZeekAncient:
Pretty much every card I have ever bought has been from EVGA, except for my 980 Ti, which was Asus. And guess what? That was the only card I have had that had something go wrong with it. It clocked well, but a year into using it one of the fans sh1t the bed. So I RMA'd it and Asus sent me a 1070. I bought another and SLI'd them. Then I sold them and my friend gave me an EVGA 1070 Ti for free. So now all my working PCs have an EVGA card in them: my current rig, which has an EVGA 3070 Ti; my last PC, which my Dad is using, with an EVGA 1070 Ti; and my backup/work/business PC, which has an EVGA 780 in it, lol.
The last card I had that had a serious issue was... ♥♥♥♥. A 9000-series Radeon, I think. Either Acer or Asus, I forget which. It died so badly it killed my entire motherboard.

Fun day.

Had things to do that night so I had to go out and get a full set of components and rebuild and get set back up in like three hours.

Glad it didn't kill my drives, but I still had to do a fresh Windows install. Windows has never liked being moved from one PC to another.

Haven't used an AMD card in a personal build since then, which will make it really weird if I end up going for an RX 7000 of some kind to replace my 2080.
Jamebonds1 22 November 2022 at 20:23
Originally posted by ZeekAncient:
Nice! How did that GT9500 run Crysis? lol
My first Crysis game was Crysis 2, and it ran so-so. I ran low to medium settings.
Last edited by Jamebonds1; 22 November 2022 at 20:25
ZeekAncient 22 November 2022 at 20:25
I actually have never had an ATI/AMD card. But I was thinking that the 7000 series might be my first. It all depends on how well it performs in 4K with ray tracing.

Most don't care about RT. But I do. It is the future and most new games will be implementing it more and more. Thus, I don't want to have my next-gen GPU be great at rasterization but then be midrange when it comes to ray tracing.

If the 7900XTX can still hold its own with ray tracing, that is, still be able to have settings maxed and ray tracing on at 4K, and get great frames, I will probably snag one.
Last edited by ZeekAncient; 22 November 2022 at 20:27
Komarimaru 22 November 2022 at 20:26
Originally posted by ZeekAncient:
I actually have never had an ATI/AMD card. But I was thinking that the 7000 series might be my first. It all depends on how well it performs in 4K with ray tracing.

Most don't care about RT. But I do. It is the future and most new games will be implementing it more and more. Thus, I don't want to have my next-gen GPU be great at rasterization but then be midrange when it comes to ray tracing.
Most will bash ray tracing for years to come, just like people bashed games that required T&L.

Date Posted: 15 November 2022 at 20:36
Posts: 205