Should I wait until the RX 7000 series releases or buy a 4000-series NVIDIA GPU?
I'm looking to do a complete upgrade (new mobo, DDR5 RAM, Ryzen 7000-series or Raptor Lake CPU) soon.

However, on the GPU side I'm tempted to switch back to team red after 10 years of buying NVIDIA cards only. I just feel that at the moment NVIDIA doesn't deserve my money after their scummy practices of the last couple of years.

Anyone here with a recent high-powered AMD GPU who can tell me what the driver situation is like now with AMD cards?

I play older games too, so DX9 and DX11 titles and emulation need to work fine as well, without stuttering or major FPS loss.

I can live without DLSS 3.0 personally, since I game mostly at 1440p.

Should I stick with NVIDIA, which has the most market share and thus better-supported drivers and features (although completely overpriced), or switch back to AMD after all these years?
Last edited by Zef; 25 Oct 2022 @ 11:45
Originally posted by 🦜Cloud Boy🦜:
I don't think AMD will bring anything new to the table. When it comes to graphics cards, AMD is always one step behind Nvidia. They just follow in Nvidia's footsteps (or copy them).

Look at this gen's history, for example: Nvidia released the RTX 3080 at a $700 MSRP, and shortly after AMD released the RX 6800 XT at $650. Later, Nvidia released the RTX 3070 for $500, and AMD followed with their RX 6700 XT at $480, and so on...

Even if AMD releases something that's substantially cheaper, Nvidia will also cut their prices the next day. That's how the market works.

LOL, what an Nvidia cuck

OP, I have had zero issues with my RX 6800, while with my previous Nvidia 3060 I did have a few driver issues.

Now, with my last AMD card, the 5700 XT, I had some issues as well.

But the 6800 has been rock solid.
I personally am not loyal to either Nvidia or AMD. I don't like either company, lol. And I know that AMD fans don't like hearing this, but they are as much of a shady, greedy conglomerate as Intel and Nvidia are. Just look at their current prices for the Ryzen 7000 series.

I don't like any of these companies. But obviously, being a PC gamer, and having been one since the early 90s (80s if you consider the Commodore 64, lol), I am forced to fork over money to these companies. While I have been on Intel and Nvidia for many years, I was strongly considering going all AMD this time.

I will probably build a new PC in 2023, and looking at the current-gen products for CPU and GPU, I am not completely sold on either Intel's offerings or Nvidia's. I like the performance of Raptor Lake in gaming, and the price/performance, but being on an 11700K already (I know, not the greatest, but plenty for gaming at 4K), I think I would need something that would be more of a game-changer. Intel 13th Gen and Ryzen 7000 aren't it for me. I was looking more at Intel's Meteor Lake, but after reading that it has been delayed, I think I might be looking at AMD's Zen 4 3D V-Cache line.

However, getting on topic of whether I would look at Nvidia's Ada Lovelace (40 series) or AMD's RDNA 3 (7000 series): personally, I haven't been that impressed with the 40 series, on paper anyway. The 4090 is impressive performance-wise, but it is very power-hungry and expensive, and I'm not sure it is something I truly need just for gaming. And the 4080 16GB doesn't look like it will be as impressive price/performance-wise. Considering its power needs, the 4080 would be a card I would definitely look at, but at $1199 it seems far too high for what it will bring performance-wise. Maybe if it were $200 or so less. I guess we will have to see the performance numbers.
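For anyone weighing that up, the usual way reviewers frame it is dollars per frame: price divided by average FPS across a test suite. A quick sketch of the arithmetic; the MSRPs are the announced launch prices, but the FPS numbers are placeholders, not real benchmark results:

```python
# Dollars-per-frame: MSRP divided by average FPS across a test suite.
# MSRPs are the announced launch prices; the FPS values are placeholders,
# NOT real benchmark results.
cards = {
    "RTX 4090": {"msrp": 1599, "avg_fps": 100},  # placeholder FPS
    "RTX 4080": {"msrp": 1199, "avg_fps": 80},   # placeholder FPS
}

for name, card in cards.items():
    print(f"{name}: ${card['msrp'] / card['avg_fps']:.2f} per average frame")
```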

So really, right now, it looks like I might be looking at an all-AMD build for my next PC. I'd have to wait to see what Radeon 7000 brings. We already know they will be more power-efficient than the 40 series, but how they will perform remains to be seen. I think a lot of people on here are assuming that AMD's 7000 series will be like the 6000 series and pretty much match Nvidia's GPUs, but I am not so sure. And I think Nvidia knew this when they announced the 4090 and 4080s. People are expecting that AMD's 7000 will match them (or exceed them) in rasterization performance, but that doesn't look to be the case.

I am already pretty certain that AMD will not have something that can compete with the 4090 performance-wise, but maybe that is okay. For me, if they can release something that can perhaps match, or maybe exceed, the 4080's performance, and be cheaper, that is what I would be looking for.

Only problem is that I do consider ray tracing performance. Ray tracing is definitely something I care about. Almost every new game that is coming out uses some kind of ray tracing, and in every game I own that has it, I would much rather have it on than off. Sorry, but for me ray tracing looks amazing. In games that implement it well, it is fantastic, and I definitely want to play with it on. Not only that, but developers will soon start rendering games completely with ray tracing. Ray tracing is in its infancy, but it is here to stay; it is not going anywhere. So ray tracing performance is definitely important to me, and I am not ashamed to admit it.

I know I am writing an essay, and most will not read this, but the tl;dr version: basically, I am looking at an all-AMD build for my next PC. While I have owned AMD CPUs in the past, I have never had an AMD (or ATI) GPU. There are several things that worry me about going all AMD. First, having been on Nvidia for so long, drivers are an issue. Now, I know that a lot of people on here are saying that they have not had any issues with AMD graphics drivers recently, but I know that in the past AMD was not as solid as Nvidia. For as long as I have owned Nvidia cards, I have never had any issue (that I know of, lol) with Nvidia drivers, and I update to the newest drivers every time there is a new one. Been doing this for well over a decade.

But it is not just driver stability. When you have been doing things a certain way for so long, changing that routine is scary. I mean, I am so used to the Nvidia Control Panel: I know what every setting does and where I want things set. I know this isn't the hardest thing in the world, but when I think of starting to use AMD's Adrenalin drivers/control panel (or whatever it is called), it feels like a foreign language. Lol, I know I am being dramatic here and it isn't that bad or different, but you know.

And then obviously another thing that worries me about going AMD for the GPU is of course ray tracing performance, and Nvidia's DLSS. You can say what you want about both, but I care about ray tracing performance, and I play at 4K res (say what you will about that as well, but I am at 4K to stay), so I am a big fan of upscaling tech. For one, being at 4K, I find upscaling tech has not just improved performance, but has really been a godsend for image quality. I mean, if I can render a game at 1440p, or even 1080p, but have it look almost identical to 4K and get the performance of the rendered res, sorry, but that is just amazing. And so far, after trying both FSR and DLSS, DLSS is just superior, bar none. Leaving that behind would be hard for me.
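To put numbers on that: both DLSS 2 and FSR 2 render internally at a fraction of the output resolution and reconstruct the rest. A rough sketch of what a 4K output actually renders at, using the commonly cited per-axis scale factors (treat the factors as approximate; exact values vary by version):

```python
# Internal render resolution for a 4K output at common upscaler presets.
# Per-axis scale factors are the commonly cited DLSS 2 / FSR 2 values;
# exact numbers vary by version, so treat these as approximate.
OUT_W, OUT_H = 3840, 2160

presets = {
    "Quality":     2 / 3,  # 4K output rendered at ~1440p internally
    "Balanced":    0.58,
    "Performance": 0.50,   # 4K output rendered at 1080p internally
}

for name, scale in presets.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    shaded = scale ** 2  # fraction of native pixels actually shaded
    print(f"{name:12s}: {w}x{h} internal ({shaded:.0%} of native pixels)")
```

That Quality-mode line is exactly the "render at 1440p, output at 4K" case described above.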

I am currently on a 3070 Ti, and it is a decent GPU, even for 4K. I make it work and get 60 FPS, but obviously in some games I need to use upscaling. So upscaling tech is very important to me. Recently I was looking at getting a 6900 XT. I mean, I can get one right now for like $650, or less. Considering its performance, rasterization-wise, it should be much better than my 3070 Ti, right? Well, not really. When you factor in ray tracing, which, like I said, I am a fan of, and a lot of the games I play use some sort of ray tracing, its performance is about on par with a 3070 Ti. Then you factor in that I play at 4K and will need some sort of upscaling, and not having DLSS might be a bummer.

So anyway, like I said, I am not a fan of either Nvidia or AMD. I currently have an Nvidia GPU and Intel CPU, but looking ahead to my next PC build, I was strongly considering an all-AMD build. There are several things that scare me about going all AMD, though, and it isn't just about performance, like I said. But of course, in the end, it always boils down to performance. And really, it doesn't matter much what you have in your PC anymore: Intel, AMD, and Nvidia all make great, viable CPUs and/or GPUs. And all are still evil, shady, greedy conglomerates. But when looking at my needs, price/performance value, and what CPU/GPU fits into those, I think that AMD's next-gen offerings might be better suited for me. Even if switching over scares me.
One piece of good news that has filtered out: AMD is skipping the problematic 12VHPWR connector found on the RTX 4090 and sticking with good, old, overbuilt PCIe 6/8-pin connectors on their upcoming RX 7000 series. I find it funny that Intel helped with the creation of the 12VHPWR spec, yet chose NOT to use it on their cards. Sure, as JayzTwoCents said, they don't require much power vis-à-vis the RTX 4090, but shouldn't Intel have had it on their cards for continuity/adoption? Maybe they'd realized this glaring design flaw.
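For reference, the arithmetic behind the connector choice: per the PCIe/ATX specs, an 8-pin connector is rated for 150 W, the x16 slot supplies up to 75 W, and 12VHPWR is specced up to 600 W through one much denser plug. A quick sketch of how many classic 8-pins a given board power needs on paper (the board-power figures are illustrative, and real cards usually add an extra connector for headroom):

```python
import math

# Power budget per the PCIe/ATX 3.0 ratings.
PCIE_8PIN_W = 150  # per 8-pin connector
SLOT_W = 75        # PCIe x16 slot itself
HPWR_MAX_W = 600   # 12VHPWR upper rating

def eight_pins_needed(board_power_w: int) -> int:
    """8-pin connectors needed on paper, after the 75 W the slot supplies.
    Real cards usually add one more for headroom."""
    return math.ceil(max(board_power_w - SLOT_W, 0) / PCIE_8PIN_W)

for watts in (300, 355, 450):  # illustrative board-power figures
    print(f"{watts} W card: {eight_pins_needed(watts)}x 8-pin on paper, "
          f"or a single 12VHPWR (rated to {HPWR_MAX_W} W)")
```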
Last edited by UserNotFound; 25 Oct 2022 @ 18:12
As always, wait for the market to mature and the releases to come out. You can either jump on the early adopter train, or wait and see how things go.

For previous gen, a 3080 Ti is around a 6950 XT in gaming performance. The 6950 XT will pull ahead in some games, the 3080 Ti in others. With both overclocked, the 3080 Ti pulls ahead in all regards by a little bit. The 3090 Ti is better than both, for obvious reasons.

We won't know what this generation holds until everything is released. So I'd wait not only until AMD has released their hardware, but at least a few months past that, to see if any X50 variants from AMD or Ti variants from Nvidia show up before making a purchase.
Originally posted by UserNotFound:
One piece of good news that has filtered out: AMD is skipping the problematic 12VHPWR connector found on the RTX 4090 and sticking with good, old, overbuilt PCIe 6/8-pin connectors on their upcoming RX 7000 series. I find it funny that Intel helped with the creation of the 12VHPWR spec, yet chose NOT to use it on their cards. Sure, as JayzTwoCents said, they don't require much power vis-à-vis the RTX 4090, but shouldn't Intel have had it on their cards for continuity/adoption? Maybe they'd realized this glaring design flaw.
Being an Nvidia fan, I've only owned Nvidia cards, and this is very troubling IMO. If I were in the market, this would affect my buying decision big time. This needs to be fixed, and not with some sketchy 90-degree cord, which is what will probably happen. I would wait for the second round of cards to see if they 86 that 12-pin. Honestly, it needs a redesign.
Last edited by Guydodge; 25 Oct 2022 @ 23:26
I'm waiting for next year before buying a new GPU. I like AMD CPUs, and just this minute (well, yesterday tbh) bought a computer with an AMD Ryzen 7 7700X. I've put a 2070 Super into the build to drive a big new 2K monitor. The old one had to be replaced anyway; it wouldn't take DP or HDMI feeds.
I'm waiting to see how AMD vs. Nvidia pans out with their mid-range GPUs, the 4060 or 4070 and the new AMD equivalents.
I think paying over £1k for a 4090 is way too much for a well-OTT card, unless it's driving a 4K monitor.
Originally posted by Komarimaru:
For previous gen, a 3080 Ti is around a 6950 XT in gaming performance. The 6950 XT will pull ahead in some games, the 3080 Ti in others. With both overclocked, the 3080 Ti pulls ahead in all regards by a little bit. The 3090 Ti is better than both, for obvious reasons.

Where do you get this stuff?...

The 6950 XT is a solid 3090/3090 Ti competitor and sits well above 3080/3080 Ti levels...

https://www.techspot.com/review/2463-amd-radeon-6950xt/

At 1080p, across 12 games, it's faster than the 3090 Ti.
At both 1440p and 4K it sits above the 3090, but below the 3090 Ti.

That's across a 12-game test suite from a reputable reviewer, i.e., real-world game performance on average.
Originally posted by xSOSxHawkens:
Originally posted by Komarimaru:
For previous gen, a 3080 Ti is around a 6950 XT in gaming performance. The 6950 XT will pull ahead in some games, the 3080 Ti in others. With both overclocked, the 3080 Ti pulls ahead in all regards by a little bit. The 3090 Ti is better than both, for obvious reasons.

Where do you get this stuff?...

The 6950 XT is a solid 3090/3090 Ti competitor and sits well above 3080/3080 Ti levels...

https://www.techspot.com/review/2463-amd-radeon-6950xt/

At 1080p, across 12 games, it's faster than the 3090 Ti.
At both 1440p and 4K it sits above the 3090, but below the 3090 Ti.

That's across a 12-game test suite from a reputable reviewer, i.e., real-world game performance on average.
https://www.youtube.com/watch?v=WSTmeZXHWyk
Because it's true? And last I checked, you do not get such GPUs for 1080p. Even at 1080p, the difference is so minimal it's not noticeable, and in ray-traced games the 30 series will always pull further ahead due to stronger and more numerous RT cores.

And this isn't even counting the AIB models of the 30 series cards, which are nearly all above the FE versions in performance, e.g. the EVGA FTW3 3080 Ti smokes the Sapphire Nitro+, one of the best 6950 XT cards.

Maybe before trying to start a fight, try owning both GPUs and doing the research first. 1080p performance... give me a break... Derailing a thread because you think 1080p performance for these cards is important...

But yes, if you want the best 1080p performance without ray tracing, then yes, a 6950 XT will get a few more FPS than a 3080 Ti and the higher 30-series cards. There, happy? I don't assume people buy high-end hardware to play at 1080p.
Last edited by Komarimaru; 26 Oct 2022 @ 11:49
Maybe that's why they push DLSS so hard. Turn it off and the smoke starts rolling.
Originally posted by Komarimaru:
Originally posted by xSOSxHawkens:

Where do you get this stuff?...

The 6950 XT is a solid 3090/3090 Ti competitor and sits well above 3080/3080 Ti levels...

https://www.techspot.com/review/2463-amd-radeon-6950xt/

At 1080p, across 12 games, it's faster than the 3090 Ti.
At both 1440p and 4K it sits above the 3090, but below the 3090 Ti.

That's across a 12-game test suite from a reputable reviewer, i.e., real-world game performance on average.
https://www.youtube.com/watch?v=WSTmeZXHWyk
Because it's true? And last I checked, you do not get such GPUs for 1080p. Even at 1080p, the difference is so minimal it's not noticeable, and in ray-traced games the 30 series will always pull further ahead due to stronger and more numerous RT cores.

And this isn't even counting the AIB models of the 30 series cards, which are nearly all above the FE versions in performance, e.g. the EVGA FTW3 3080 Ti smokes the Sapphire Nitro+, one of the best 6950 XT cards.

Maybe before trying to start a fight, try owning both GPUs and doing the research first. 1080p performance... give me a break... Derailing a thread because you think 1080p performance for these cards is important...

But yes, if you want the best 1080p performance without ray tracing, then yes, a 6950 XT will get a few more FPS than a 3080 Ti and the higher 30-series cards. There, happy? I don't assume people buy high-end hardware to play at 1080p.

Lol...

I said it was faster at 1080p. And as for thinking no one uses 1080p, the 240 Hz and 360 Hz crowd would like a word...
I incorrectly said it was below the 3090 Ti at 1440p; it's not, it's equal. I was doing a quick at-lunch post from work. But yeah, at 1440p the 6950 XT and 3090 Ti are *equal*.
And it's 7% behind the *3090 Ti*, not the 3080 Ti lol, at 4K res.

So all around, the 6950 XT is roughly equal to the 3090/3090 Ti, depending on 1080p/1440p/4K.

You linked to a review with fewer data points (only 5 games) that was run using DLSS vs. native on Radeon.

Meanwhile, I linked to a review with more than twice the data points (12 games) that was run mostly native vs. native.

Using DLSS on one but not the other and saying one is "faster" was a straw-man argument at best, and that was before FSR existed. Now that FSR is out, trying to pull a DLSS-vs-native comparison of NV vs. AMD is both trollish and wrong.

You compare native vs. native, or you compare DLSS vs. FSR with both at the same quality presets. Anything else is a meaningless FPS comparison.
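To make that concrete, this is roughly how a suite-wide comparison gets aggregated: matched settings per game, then a geometric mean of the per-game ratios so no single outlier title dominates. A toy sketch with made-up FPS numbers, purely to show the method:

```python
from math import prod

# Matched-settings FPS per game for two cards (made-up numbers, purely
# illustrative). Compare like-for-like per title, then aggregate.
fps_a = {"Game 1": 120, "Game 2": 95, "Game 3": 143, "Game 4": 88}
fps_b = {"Game 1": 115, "Game 2": 101, "Game 3": 139, "Game 4": 90}

# Geometric mean of per-game ratios, so one outlier title can't dominate.
ratios = [fps_a[g] / fps_b[g] for g in fps_a]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Card A averages {geomean:.1%} of card B's speed "
      f"across {len(ratios)} matched-settings games")
```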

lol

Originally posted by ZAP:
Maybe that's why they push DLSS so hard. Turn it off and the smoke starts rolling.
pretty much ;)
Last edited by xSOSxHawkens; 26 Oct 2022 @ 12:38
Originally posted by xSOSxHawkens:
Originally posted by Komarimaru:
https://www.youtube.com/watch?v=WSTmeZXHWyk
Because it's true? And last I checked, you do not get such GPUs for 1080p. Even at 1080p, the difference is so minimal it's not noticeable, and in ray-traced games the 30 series will always pull further ahead due to stronger and more numerous RT cores.

And this isn't even counting the AIB models of the 30 series cards, which are nearly all above the FE versions in performance, e.g. the EVGA FTW3 3080 Ti smokes the Sapphire Nitro+, one of the best 6950 XT cards.

Maybe before trying to start a fight, try owning both GPUs and doing the research first. 1080p performance... give me a break... Derailing a thread because you think 1080p performance for these cards is important...

But yes, if you want the best 1080p performance without ray tracing, then yes, a 6950 XT will get a few more FPS than a 3080 Ti and the higher 30-series cards. There, happy? I don't assume people buy high-end hardware to play at 1080p.

Lol...

I said it was faster at 1080p.
I incorrectly said it was below the 3090 Ti at 1440p; it's not, it's equal. I was doing a quick at-lunch post from work. But yeah, at 1440p the 6950 XT and 3090 Ti are *equal*.
And it's 7% behind the *3090 Ti*, not the 3080 Ti lol, at 4K res.

So all around, the 6950 XT is roughly equal to the 3090/3090 Ti, depending on 1080p/1440p/4K.

You linked to a review with fewer data points (only 5 games) that was run using DLSS vs. native on Radeon.

Meanwhile, I linked to a review with more than twice the data points (12 games) that was run mostly native vs. native.

Using DLSS on one but not the other and saying one is "faster" was a straw-man argument at best, and that was before FSR existed. Now that FSR is out, trying to pull a DLSS-vs-native comparison of NV vs. AMD is both trollish and wrong.

You compare native vs. native, or you compare DLSS vs. FSR with both at the same quality presets. Anything else is a meaningless FPS comparison.

lol

Originally posted by ZAP:
Maybe that's why they push DLSS so hard. Turn it off and the smoke starts rolling.
pretty much ;)
None of those tests were with DLSS... Again, you're trying to start an argument over nothing, to try and look correct. Like when you said 100°C was normal for GPU temps. That ended real well, didn't it?

I own the 6950 XT Nitro+ and an EVGA 3080 Ti FTW3. The 3080 Ti from EVGA wins at 1440p and above, and excels further when overclocked, reaching 3090 Ti performance.

So, again, quit making things up. The tests did not use DLSS, and I'd trust Gamers Nexus due to how accurate their testing is. Many other benchmarks agree, especially when using high-quality AIB cards.

Let's not forget, you're basing your performance case on 1080p, for someone asking about a 40-series card and 1440p. My advice was far more sound than yours, since I told them to wait a bit for things to mature, or to look into the previous generation based on benchmarks and personal results.

Take your bias and ignorance elsewhere.
Last edited by Komarimaru; 26 Oct 2022 @ 13:01
Both manufacturers make good GPUs. As simple as that.
So let's break this down into two parts:

Originally posted by Komarimaru:
None of those tests were with DLSS...

I own the 6950 XT Nitro+ and an EVGA 3080 Ti FTW3. The 3080 Ti from EVGA wins at 1440p and above, and excels further when overclocked, reaching 3090 Ti performance.

So, again, quit making things up. The tests did not use DLSS, and I'd trust Gamers Nexus due to how accurate their testing is.

1) I will admit that I didn't watch the entire video you linked. I did skip through the 5 individual games and review the performance charts. I made the (apparently incorrect) assumption, based on the earlier third-party reply to mine that mentioned DLSS, that it was indeed used in the shown results. If it was *not*, then I will accept that the results for the five games shown are indeed correct and acceptable. The assumption was wrong.

Those 5 results don't change the averaged results across 12 games. And a 12-game average is more valid than a 5-game sample.

Those are facts. You can claim otherwise all you want, but across 12 games, across three resolutions, the 6950 XT beats the 3080 Ti. Period. As you pointed out (in an ironically reversed way), there were cases where the 3080 Ti might keep up, or win by a frame or two, but overall it is the grand loser, left in the dust by the 69xx and 3090/3090 Ti cards.
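And the sample-size point isn't just rhetoric. A quick simulation shows why: if per-game results scatter around the true average, a 5-game average lands noticeably farther from the truth than a 12-game one. The scatter value below is a made-up assumption, purely to illustrate the statistics:

```python
import random
import statistics

random.seed(42)
TRUE_RATIO = 1.00  # assume the two cards are truly equal on average
SPREAD = 0.08      # assumed per-game scatter (8%), purely illustrative

def suite_average(n_games: int) -> float:
    """Average card-vs-card FPS ratio over a random n-game suite."""
    return statistics.fmean(random.gauss(TRUE_RATIO, SPREAD) for _ in range(n_games))

for n in (5, 12):
    runs = [suite_average(n) for _ in range(10_000)]
    print(f"{n:2d}-game suite: typical error ±{statistics.stdev(runs):.1%}")
```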

____________________________________

Now, part two

Originally posted by Komarimaru:
Again, you're trying to start an argument over nothing, to try and look correct. Like when you said 100°C was normal for GPU temps. That ended real well, didn't it?

You mean this?

Originally posted by xSOSxHawkens:
I think you are over-reacting a bit to think the card is dead or your gaming days are over. 90-100°C has been the top acceptable operating temperature for boosting for some time on both AMD and NV GPUs.

If you are unhappy with the temps being that high, use a custom fan curve to crank the fans up to a higher RPM sooner in the temperature curve.

That time when, at the time of my post, the OP was in panic mode, claiming that his "gaming days were over" and lamenting that his card was useless before being paid off?

The post where I specifically said top acceptable operating temp? Not:

Originally posted by Komarimaru:
Like when you said 100°C was normal for GPU temps.

The one where I made a multi-generational (Pascal/Vega > now) statement across two brands, and thus included the (slightly generic) range of 90-100°C as the top acceptable operating temp? Which, depending on card, maker, and generation, is an entirely true range to list in such a vague, wide-covering statement?

The one where all I tried to do was calm a slightly over-reacting OP by reassuring him that despite being hot, his GPU was likely fine?

The one where I tried for *pages* to get you and one other stubborn poster to understand the difference between max acceptable vs recommended, and backed myself up with three sources, including Nvidia themselves?

The one where you willingly ignored one source on the grounds that you didn't find its author worthy of your acceptance, straight up said the other was wrong despite it coming from an EVGA mod, and then downplayed the final NV source itself by implying they were technically right but their temp listing is wrong for what they describe?

The one where you never *once* bothered to back yourself up, and instead looked at three sources and shrugged without bothering to counter-source at all, leaving your claims as nothing more than biased, unsourced, personal claims?

Well...

Originally posted by Komarimaru:
That ended real well, didn't it?

Yeah, sure. But not because I didn't try and help ya learn a bit by linking you to relevant source material...

I will give credit: at least this time, in this thread, you came back with a respectable source to support your view. It's still a substantially smaller sample size, which demonstrates the need for more data points before drawing conclusions, but it's still a good source.
If you own an RTX 3070- or RTX 3080-class card, stick with it, as these are still overkill. You can surely skip the 40-series generation, especially the AMD/ATI side.