Coffee Apr 30, 2024 @ 3:47pm
I feel like I got scammed by Nvidia (nvm I was trolled)
I just found out from the Gray Zone Warfare forum that a 3050/3060 is just a rebranded GTX 1080 card but...with raytracing...wooohooo...a feature that is barely used and is a huge gimmick

I feel extremely ripped off :|

I'm buying AMD in the future. Nvidia just lost me as a customer. Sure I'm just one customer, but if others feel ripped off, they'll also switch to AMD

Anyone know what AMD GPUs are decent these days? Like what is the AMD equivalent of an RTX 4080 or 4090? Looking for something in the $300-500 USD range.
Last edited by Coffee; Apr 30, 2024 @ 4:33pm

Showing 1-15 of 15 comments
nullable Apr 30, 2024 @ 3:56pm 
Originally posted by Artoria:
I just found out from the Gray Zone Warfare forum that a 3050/3060 is just a rebranded GTX 1080 card but...with raytracing...wooohooo...a feature that is barely used and is a huge gimmick

I feel extremely ripped off :|

Well, do you have a source for that claim? Because it sounds like a bit of an oversimplification. And if the complaint is that your entry-level GPU (3050) from 2021 is only as good as the best GPU from 2016, well, I'm not really sure what you're expecting or imagining.

But it doesn't seem that weird to me.


Originally posted by Artoria:
I'm buying AMD in the future. Nvidia just lost me as a customer. Sure I'm just one customer, but if others feel ripped off, they'll also switch to AMD

Anyone know what AMD GPUs are decent these days? Like what is the AMD equivalent of an RTX 4080 or 4090? Looking for something in the $300-500 USD range.

You're not going to find an AMD GPU that's equivalent to Nvidia's flagship cards for $300-500. That might be some of your mismanaged expectations leaking through, though.

Although you can probably get an RX 7800 XT in the $500 range, and it's a good card. Often beats out a GeForce RTX 4070, so that's pretty good. Although someone might claim it's just a rebadged 6800 XT, and that might make you feel ripped off.

https://www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/3
Last edited by nullable; Apr 30, 2024 @ 4:01pm
Coffee Apr 30, 2024 @ 4:00pm 
It was this post

https://steamcommunity.com/app/2479810/discussions/0/4355620303682252215/#c4355620303682327830

I always thought GPUs were supposed to evolve and get more powerful, not just rebrand themselves with a gimmick and some slight changes to upscaling or whatever that I also barely use.

And I dunno, what's a good AMD card for $300-500? That's how much I paid for my RTX 3080 (that's actually what I got, not a 3060). But looking into it, it's also a rebranded version of an old card. That's what made me angry: I bought an old card rebranded with some features I don't even care about...I thought I was getting a more powerful card lol. A GTX 1080 for example was made way before an RTX 3060, and that is what an RTX 3060 is :|

I do know prices have gone up with inflation overall, so maybe my budget needs to be higher, but my entire PC cost $1.5 thousand and it was pretty top end. But a brief look shows I could resell my SAME PC for $3,000, and my PC is kinda not that new anymore...have prices increased THAT much?
nullable Apr 30, 2024 @ 4:05pm 
Originally posted by Artoria:
It was this post

https://steamcommunity.com/app/2479810/discussions/0/4355620303682252215/#c4355620303682327830

I always thought GPUs were supposed to evolve and get more powerful, not just rebrand themselves with a gimmick and some slight changes to upscaling or whatever that I also barely use.

Well, compare a 3050 to a 1050. Expecting a 3050 to be more powerful than every GPU released previously isn't reasonable. It's really only supposed to be better than the previous xx50 card, or maybe the xx60 card, depending on how big the generation-to-generation performance increase ended up being.


Originally posted by Artoria:
And I dunno, what's a good AMD card for $300-500? That's how much I paid for my RTX 3080 (that's actually what I got, not a 3060). But looking into it, it's also a rebranded version of an old card. That's what made me angry: I bought an old card rebranded with some features I don't even care about...I thought I was getting a more powerful card lol. A GTX 1080 for example was made way before an RTX 3060, and that is what an RTX 3060 is :|

I do know prices have gone up with inflation overall, so maybe my budget needs to be higher, but my entire PC cost $1.5 thousand and it was pretty top end. But a brief look shows I could resell my SAME PC for $3,000, and my PC is kinda not that new anymore...have prices increased THAT much?

You can probably get an RX 7800 XT in the $500 range, and it's a good card. Often beats out a GeForce RTX 4070, so that's pretty good. Although someone might claim it's just a rebadged 6800 XT, and that might make you feel ripped off. Make sure you look at how the 7800 XT and the 6800 XT are often neck and neck...

https://www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/3
Last edited by nullable; Apr 30, 2024 @ 4:06pm
Coffee Apr 30, 2024 @ 4:05pm 
No, prices are still low on AMD

Are these good cards? Apparently this is a RTX 4060 equivalent?

https://www.microcenter.com/product/677012/gigabyte-amd-radeon-rx-7600-xt-gaming-overclocked-triple-fan-16gb-gddr6-pcie-40-graphics-card

and

this is below 500 and apparently a RTX 4080 equivalent? Gets bad reviews though

https://www.microcenter.com/product/676123/asus-amd-radeon-rx-7700-xt-dual-overclocked-dual-fan-12gb-gddr6-pcie-40-graphics-card

first one has way more video ram
nullable Apr 30, 2024 @ 4:09pm 
Originally posted by Artoria:
No, prices are still low on AMD

Are these good cards? Apparently this is a RTX 4060 equivalent?

https://www.microcenter.com/product/677012/gigabyte-amd-radeon-rx-7600-xt-gaming-overclocked-triple-fan-16gb-gddr6-pcie-40-graphics-card

and

this is below 500 and apparently a RTX 4080 equivalent? Gets bad reviews though

https://www.microcenter.com/product/676123/asus-amd-radeon-rx-7700-xt-dual-overclocked-dual-fan-12gb-gddr6-pcie-40-graphics-card

first one has way more video ram

Well, I'm not sure what resource is claiming an RX 7700 XT is equivalent to a 4080, but if you look up a few independent reviews and they all show it beating a 4080 regularly, then sure.

Based on the Tom's Hardware benchmarks I posted, it's closer to a 4060 Ti if I'm eyeballing it. And it's regularly beaten by the 4070, so 4080 equivalent seems like wishful thinking to me.
Coffee Apr 30, 2024 @ 4:15pm 
Originally posted by nullable:
Originally posted by Artoria:
No, prices are still low on AMD

Are these good cards? Apparently this is a RTX 4060 equivalent?

https://www.microcenter.com/product/677012/gigabyte-amd-radeon-rx-7600-xt-gaming-overclocked-triple-fan-16gb-gddr6-pcie-40-graphics-card

and

this is below 500 and apparently a RTX 4080 equivalent? Gets bad reviews though

https://www.microcenter.com/product/676123/asus-amd-radeon-rx-7700-xt-dual-overclocked-dual-fan-12gb-gddr6-pcie-40-graphics-card

first one has way more video ram

Well, I'm not sure what resource is claiming an RX 7700 XT is equivalent to a 4080, but if you look up a few independent reviews and they all show it beating a 4080 regularly, then sure.

Based on the Tom's Hardware benchmarks I posted, it's closer to a 4060 Ti if I'm eyeballing it. And it's regularly beaten by the 4070, so 4080 equivalent seems like wishful thinking to me.

for the first card

It's this site, the first Google result. I don't know anything about AMD so I just googled "rtx 4060 amd equivalent"

https://www.digitaltrends.com/computing/nvidia-rtx-4060-vs-amd-rx-7600-xt/#:~:text=When%20looking%20at%20raw%20game,it%20with%20ray%20tracing%2C%20however.

This one also says it's roughly equal to a 4060

https://gamersnexus.net/gpus/amd-radeon-rx-7600-xt-gpu-benchmarks-review-power-efficiency-gaming#:~:text=For%20directly%20comparable%20data%20at,lead%20over%20the%20new%20XT.

But I guess I expect technology to improve.

Seems AMD might do the same thing with the link you sent.

Like...one kinda expects technology to improve.

Like, as an example: why buy a $10,000 TV that has a couple of extra features but is otherwise exactly the same as a TV made 10 years ago for $1,000, when they both look almost exactly the same 99% of the time unless a show or movie makes use of those one or two extra features? And almost no show or movie does.

Most games don't use raytracing at all, and if they do it's not that greatly done, or very gimmicky. Cyberpunk 2077 is the only game I've played that looks amazing with raytracing, so one single game that I've personally played. The rest don't look great...I guess uh Watch Dogs: Legion is the 2nd, that one looks super good, I think that's raytraced. So probably two games.

No other game makes use of it like those two (and I forget if Legion actually uses it or not)

Just an example lol, probably a junky one. If AMD does the same thing, then I guess it's a bit moot.
MancSoulja Apr 30, 2024 @ 4:18pm 
A 7700 XT is anywhere from 35-50% slower than a 4080, depending on the game.
MancSoulja Apr 30, 2024 @ 4:23pm 
Originally posted by Artoria:
I just found out from the Gray Zone Warfare forum that a 3050/3060 is just a rebranded GTX 1080 card but...with raytracing...wooohooo...a feature that is barely used and is a huge gimmick

The 10 series used Pascal GPUs; the RTX 20 series uses Turing and the 30 series uses Ampere. They don't even use the same generation of GPU, so how can a 3050 be a rebranded 1080?
Last edited by MancSoulja; Apr 30, 2024 @ 4:23pm
Coffee Apr 30, 2024 @ 4:24pm 
Originally posted by MancSoulja:
Originally posted by Artoria:
I just found out from the Gray Zone Warfare forum that a 3050/3060 is just a rebranded GTX 1080 card but...with raytracing...wooohooo...a feature that is barely used and is a huge gimmick

The 10 series used Pascal GPUs; the RTX 20 series uses Turing and the 30 series uses Ampere. They don't even use the same generation of GPU, so how can a 3050 be a rebranded 1080?

Did the person who told me that in that forum just not know what they were talking about? Was this whole thread based on a false premise from what that person told me...
The author of this thread has indicated that this post answers the original topic.
MancSoulja Apr 30, 2024 @ 4:29pm 
Originally posted by Artoria:
Originally posted by MancSoulja:

The 10 series used Pascal GPUs; the RTX 20 series uses Turing and the 30 series uses Ampere. They don't even use the same generation of GPU, so how can a 3050 be a rebranded 1080?

Did the person who told me that in that forum just not know what they were talking about? Was this whole thread based on a false premise from what that person told me...

They're probably just trying to wind you up because you have a decent GPU and they don't, so they try to make you feel bad about your stuff.

Here's the Wikipedia page for Ampere; as you can see, it's not repurposed from an older generation, and definitely not one as old as Pascal.

https://en.wikipedia.org/wiki/Ampere_(microarchitecture)
Last edited by MancSoulja; Apr 30, 2024 @ 4:31pm
Guydodge Apr 30, 2024 @ 4:39pm 
You only had to check to find out it has 20-25% better fps than a 1080.

Last edited by Guydodge; Apr 30, 2024 @ 4:43pm
nullable Apr 30, 2024 @ 4:40pm 
Originally posted by Artoria:
Originally posted by MancSoulja:

The 10 series used Pascal GPUs; the RTX 20 series uses Turing and the 30 series uses Ampere. They don't even use the same generation of GPU, so how can a 3050 be a rebranded 1080?

Did the person who told me that in that forum just not know what they were talking about? Was this whole thread based on a false premise from what that person told me...

Lots of pseudo-experts out there. Nothing wrong with listening to people, but there's a saying: "trust, but verify." Ideally, when people make claims, they provide some resources that you can look over yourself to make sure the claim matches the data. If they don't, you should really fact-check unsourced claims.

Sometimes people make mistakes. Sometimes they make assumptions. Sometimes they misremember. Sometimes they exaggerate. Sometimes they have biases. Sometimes they're just confidently incorrect.

And as I mentioned, I thought the "3050 is a rebranded 1080" claim was a bit oversimplified. My guess is that because the 3050 might offer a similar level of performance to a vanilla 1080, they made the statement based on raster performance and dismissed all the differences that exist. If all you care about is the raster performance, I understand the rationale, but it's still a lazy statement, because the differences do matter sometimes, to some users. And invariably it creates misinformation too; when that stuff is mindlessly repeated, it creates more of a mess than any simplification intended. Better to stick with the hard facts when you can. Less ambiguous, less open to interpretation.
Last edited by nullable; Apr 30, 2024 @ 4:46pm
Coffee Apr 30, 2024 @ 4:47pm 
Originally posted by nullable:
Originally posted by Artoria:

Did the person who told me that in that forum just not know what they were talking about? Was this whole thread based on a false premise from what that person told me...

Lots of pseudo-experts out there. Nothing wrong with listening to people, but there's a saying: "trust, but verify." Ideally, when people make claims, they provide some resources that you can look over yourself to make sure the claim matches the data. If they don't, you should really fact-check unsourced claims.

Sometimes people make mistakes. Sometimes they make assumptions. Sometimes they misremember. Sometimes they exaggerate. Sometimes they have biases. Sometimes they're just confidently incorrect.

And as I mentioned, I thought the "3050 is a rebranded 1080" claim was a bit oversimplified. My guess is that because the 3050 might offer a similar level of performance to a vanilla 1080, they made the statement based on raster performance and dismissed all the differences that exist. If all you care about is the raster performance, I understand the rationale, but it's still a lazy statement, because the differences do matter sometimes, to some users. And invariably it creates misinformation too; when that stuff is mindlessly repeated, it creates more of a mess than any simplification intended. Better to stick with the hard facts when you can. Less ambiguous, less open to interpretation.

yeah. I thought of deleting this thread cause it looks pretty foolish now. But I'll leave it up so others don't do what I did (if they read the forums before assuming).

See, I just listened to one random person, assumed, and kinda made a fool of myself.

Don't do what I did.

Yeah, it was really oversimplified, and I kinda fell for rage bait in a way, whether the person believed what they said or not.

So it's not really a rebrand at all; there are definite differences.
A&A Apr 30, 2024 @ 4:53pm 
Wtf are they talking about???
The RTX 3050 is at least 30% slower than the GTX 1080!
Last edited by A&A; Apr 30, 2024 @ 4:53pm
nullable Apr 30, 2024 @ 5:08pm 
Originally posted by Artoria:

Like...one kinda expects technology to improve.

Like, as an example: why buy a $10,000 TV that has a couple of extra features but is otherwise exactly the same as a TV made 10 years ago for $1,000, when they both look almost exactly the same 99% of the time unless a show or movie makes use of those one or two extra features? And almost no show or movie does.

Well, I'd argue a 3050 doesn't cost 10x what a 1080 did in 2016. I understand your analogy, but you're not accurately representing the situation. A 1080 was $500-$600 in 2016, and a 3050 is a lot cheaper than that now: https://store.nvidia.com/en-us/geforce/store/?page=1&limit=9&locale=en-us&category=DESKTOP,GPU&gpu=RTX%203050 shows $200, and I don't think the 3050 was $500 even in 2021-2022.

Granted, if you bought a 1080 in 2016, then maybe buying a 3050 in 2022 doesn't make the most sense. But generally I don't think Nvidia's expectation is for people to buy a flagship card in one generation and have their next upgrade be an entry-level card. I don't think they're responsible for that decision, or for any assumption that you should be able to do that AND expect a huge performance uplift.

Originally posted by Artoria:
Most games don't use raytracing at all, and if they do it's not that greatly done, or very gimmicky. Cyberpunk 2077 is the only game I've played that looks amazing with raytracing, so one single game that I've personally played. The rest don't look great...I guess uh Watch Dogs: Legion is the 2nd, that one looks super good, I think that's raytraced. So probably two games.

No other game makes use of it like those two (and I forget if Legion actually uses it or not)

Just an example lol, probably a junky one. If AMD does the same thing, then I guess it's a bit moot.

Well, give it another decade, when ray tracing is the norm and hardware can run it without too much fuss, and you might sing a different tune. Early adoption on entry-level hardware might not provide the awe-inspiring experience you wish for. I'm not shocked by that.

But I've seen lots of new technologies come along and be dismissed, and years later they're so mainstream and necessary you can't imagine a world without them. FSAA, anyone? It was hardly feasible on the first generation of hardware, and it took a number of years and multiple hardware generations before all games had it and all users had it enabled by default. Ray tracing might be like that; adoption and evolution take time, even longer on entry-level hardware.
Last edited by nullable; Apr 30, 2024 @ 5:10pm

Date Posted: Apr 30, 2024 @ 3:47pm
Posts: 15