r.linder 19 Jan 2022 @ 8:16
PSA: DON'T Buy The RX 6500-XT (Unless There's LITERALLY Nothing Else)
https://www.youtube.com/watch?v=ZFpuJqx9Qmw
https://www.youtube.com/watch?v=ArW4-mkGHSw
https://www.youtube.com/watch?v=DmE8iZWaLWE
https://www.youtube.com/watch?v=M5_oM3Ow_CI

Right out of the gate, the RX 6500-XT is being outperformed by the RX 580, a 5-year-old refresh of a 6-year-old GPU architecture. It can sometimes win against the 580, but it loses to it more often than it wins, and there are other issues with the card, like the fact that it only uses up to 4 PCIe lanes out of an x16 slot, and that it lacks encoding support such as HEVC.
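For a rough sense of what that x4 link costs, here's a back-of-envelope sketch; the per-lane figures are the usual theoretical ones (~1 GB/s for PCIe 3.0, ~2 GB/s for PCIe 4.0 after encoding overhead), so treat the results as ballpark only:

# Approximate one-direction PCIe bandwidth: per-lane rate (GB/s) times lane count.
# 0.985 / 1.969 GB/s per lane are the theoretical PCIe 3.0 / 4.0 figures.
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969}

def pcie_bandwidth(gen, lanes):
    return PER_LANE_GBS[gen] * lanes

print(round(pcie_bandwidth("4.0", 4), 1))    # ~7.9 GB/s  - 6500-XT in a PCIe 4.0 slot
print(round(pcie_bandwidth("3.0", 4), 1))    # ~3.9 GB/s  - 6500-XT in a PCIe 3.0 slot
print(round(pcie_bandwidth("3.0", 16), 1))   # ~15.8 GB/s - an ordinary x16 card on PCIe 3.0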
AMD seems to have intentionally stripped down the 6500-XT, I guess to cut cost, but the MSRP is still basically the same as what the RX 580's was, and that number doesn't mean anything anymore because of scalpers. The card is probably still going to be scalped even though it's terrible, which makes it even worse for people who might end up buying a scalped 6500-XT.

In my opinion, don't buy the 6500-XT; you're much better off paying more for a GTX 1660 SUPER or RTX 2060. If the 6500-XT sells out, it'll only show AMD that their shady design choices are a-okay with us, and they shouldn't be. It's absolutely disgusting that a 4-year-old GPU is still consistently faster than their latest. And don't give me that "well, the RX 580 was the top end of Polaris" crap: they went on to release the RX 590, which was even faster, then the 5500-XT, which was slower than that, and now the 6500-XT, which is basically the same GPU but actually worse in some aspects. The 580 was always a mid-range GPU in actual performance when it launched; now it's just barely above entry level, and this isn't acceptable.
I don't really recommend buying Radeon GPUs at all at this point. AMD is starting to become more like Intel and NVIDIA, and they're doing it wrong: they're making it too obvious that they're flipping their budget users off. At least their competitors are a lot more subtle about it.
Last edited by r.linder; 23 Jan 2022 @ 12:26
Originally posted by Monk:
I'm not saying AMD is worse, I'm more saying AMD is no better. And by "on top", the Athlons and Phenoms were superb chips overall at a great value; then came Bulldozer's BS 8-core, when their prices shot way up despite not gaining any real performance, but fools bought them anyway.

Frankly, I'm done. Anyone who defends this rubbish card is either looking to cause an argument or is fanboying, simple as that. No one should buy it and no one should recommend it.
Showing 226-240 of 452 comments
AbedsBrother 23 Jan 2022 @ 12:56
Originally posted by Fuzzy_Dunlop:
Originally posted by AbedsBrother:
Saw that outrage - a lot of it centered around the RTX 3070 and 6600XT. 8GB is more than enough for any game.

Good joke. Almost every current AAA game at high resolution on Ultra needs more than 8GB of VRAM. I'm currently playing Cyberpunk, RDR2, Far Cry 6, FS2020 and Control, and every single one needs more than 8GB of VRAM.
I honestly don't know what you're talking about.

RDR2 even uses 13-15GB at 1440p-4K on Ultra.
The joke is you interpreting
"8GB is more than enough for any game"
as
"8GB is more than enough for any game on Ultra settings." Which isn't what I said.
Last edited by AbedsBrother; 23 Jan 2022 @ 13:11
r.linder 23 Jan 2022 @ 13:40
Originally posted by ZeekAncient:
Originally posted by Fuzzy_Dunlop:

Good joke. Almost every current AAA game at high resolution on Ultra needs more than 8GB of VRAM. I'm currently playing Cyberpunk, RDR2, Far Cry 6, FS2020 and Control, and every single one needs more than 8GB of VRAM.
I honestly don't know what you're talking about.

RDR2 even uses 13-15GB at 1440p-4K on Ultra.

I use a 3070 Ti at 4K and have had no issue with any game. Even if a game states that it uses more than 8GB of VRAM, I have had no issues running it and getting good frame rates.

You people are out of touch with how VRAM allocation actually works, and put way too much stock in how much VRAM a GPU has. Look at 4K benchmarks: a 3070 Ti will beat out a 6700 XT, and in a lot of cases the 6800, both of which have more VRAM.

The issue with the 6500 XT is not that it has only 4GB of VRAM. It is that it has only 4GB of VRAM AND uses a PCIe 4.0 x4 interface. So when a game uses more than 4GB of VRAM, it will have to dip into system RAM, and that limited interface will slow it down considerably when doing so. And if you are on a PCIe 3.0 interface, it can be almost crippling.

When a game uses more than 8GB of VRAM with my 3070 Ti, it will also dip into system RAM. However, my card uses a PCIe 4.0 x16 interface, so even on my PCIe 3.0 system there is not a noticeable effect on performance.

I am sure that when usage goes considerably over 8GB of VRAM there can be performance issues. But from personal experience, I play a lot of games that go well over 8GB of VRAM at 4K and I have not had any drop in performance.

But the limited PCIe 4.0 x4 interface of the 6500 XT will slow things down considerably when dipping into that system RAM.
Don't forget that it's also using a 64-bit memory bus instead of the 128-bit bus the 5500-XT had (the RX 580 was 256-bit), so memory bandwidth is effectively halved on top of that.

The card basically just gets an "Eh, at least it's better than the 1050 Ti" award.
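To put rough numbers on that bus cut (assuming the commonly quoted memory specs — roughly 18 Gbps GDDR6 on the 6500-XT, 14 Gbps on the 5500-XT and 8 Gbps GDDR5 on the 580 — so take these as approximations):

# Peak memory bandwidth in GB/s = (bus width in bytes) * (per-pin data rate in Gbps).
def mem_bandwidth(bus_bits, rate_gbps):
    return (bus_bits / 8) * rate_gbps

print(mem_bandwidth(64, 18))    # 144.0 GB/s - RX 6500 XT (64-bit)
print(mem_bandwidth(128, 14))   # 224.0 GB/s - RX 5500 XT (128-bit)
print(mem_bandwidth(256, 8))    # 256.0 GB/s - RX 580 (256-bit)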
Last edited by r.linder; 23 Jan 2022 @ 13:41
Andrius227 23 Jan 2022 @ 13:52
Originally posted by 尺.ㄥ丨几ᗪ乇尺:
Originally posted by ZeekAncient:

I use a 3070 Ti at 4K and have had no issue with any game. Even if a game states that it uses more than 8GB of VRAM, I have had no issues running it and getting good frame rates.

You people are out of touch with how VRAM allocation actually works, and put way too much stock in how much VRAM a GPU has. Look at 4K benchmarks: a 3070 Ti will beat out a 6700 XT, and in a lot of cases the 6800, both of which have more VRAM.

The issue with the 6500 XT is not that it has only 4GB of VRAM. It is that it has only 4GB of VRAM AND uses a PCIe 4.0 x4 interface. So when a game uses more than 4GB of VRAM, it will have to dip into system RAM, and that limited interface will slow it down considerably when doing so. And if you are on a PCIe 3.0 interface, it can be almost crippling.

When a game uses more than 8GB of VRAM with my 3070 Ti, it will also dip into system RAM. However, my card uses a PCIe 4.0 x16 interface, so even on my PCIe 3.0 system there is not a noticeable effect on performance.

I am sure that when usage goes considerably over 8GB of VRAM there can be performance issues. But from personal experience, I play a lot of games that go well over 8GB of VRAM at 4K and I have not had any drop in performance.

But the limited PCIe 4.0 x4 interface of the 6500 XT will slow things down considerably when dipping into that system RAM.
Don't forget that it's also using a 64-bit memory bus instead of the 128-bit bus the 5500-XT had (the RX 580 was 256-bit), so memory bandwidth is effectively halved on top of that.

The card basically just gets an "Eh, at least it's better than the 1050 Ti" award.

Comparing it to a ~5-year-old GPU is unfair. The RTX 3050 comes out soon and that's what the RX 6500 XT will have to compete with.

And I think I know which one is going to be better.

The RX 6500 XT would only get a 'Better than nothing' or 'The only GPU available because no one wants it' award from me.
Last edited by Andrius227; 23 Jan 2022 @ 14:05
A&A 23 Jan 2022 @ 13:58
64-bit bus vs 128-bit bus vs 256-bit (butt)

That means you get 2×2GB vs 4×1GB vs 8×1GB memory chips.

As AMD would put it: "Cheap edition"
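If anyone's wondering where those configurations come from: each GDDR5/GDDR6 package exposes a 32-bit interface, so the bus width basically falls out of the chip count. A quick sketch (the chip capacities are the typical ones for these cards, so treat it as illustrative):

# Bus width = number of memory chips * 32 bits per chip; VRAM = chips * capacity per chip.
cards = [
    ("RX 6500 XT", 2, 2),  # 2 chips x 2GB -> 64-bit, 4GB
    ("RX 5500 XT", 4, 1),  # 4 chips x 1GB -> 128-bit, 4GB
    ("RX 580",     8, 1),  # 8 chips x 1GB -> 256-bit, 8GB
]
for name, chips, gb_each in cards:
    print(f"{name}: {chips * 32}-bit bus, {chips * gb_each}GB VRAM")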


Somebody should start selling us packs of memory chips, because miners would love mining with higher capacity and higher bandwidth :D

Well, somehow I can still play Mafia 2 with a GT 240 on PCI-E 2.0 x2, at 20 fps on Ultra.
Last edited by A&A; 23 Jan 2022 @ 14:16
ZeekAncient 23 Jan 2022 @ 14:02
Originally posted by 尺.ㄥ丨几ᗪ乇尺:
Don't forget that it's also using a 64-bit memory bus instead of the 128-bit bus the 5500-XT had (the RX 580 was 256-bit), so memory bandwidth is effectively halved on top of that.

Exactly, and this is exactly what my problem with this card has been. Only 4GB is bad enough, but couple that with a minuscule 64-bit memory bus and only 4 lanes on the PCIe 4.0 interface, and you could be running into some serious problems.

The smaller the memory buffer is, the more likely you are to have to use system memory. Keep usage under 4GB and you should be fine, but even AMD said that is nigh impossible with today's games, and they said that in 2020. Once you exceed 4GB, the GPU will dip into system memory. The 64-bit memory bus and the limited PCIe bandwidth make this process extremely slow compared to a card with a larger memory bus and faster PCIe bandwidth. This is where performance can potentially be crippled, no matter what settings or resolution you are using.

Someone in the know can make sure they keep a game's VRAM usage to around 3GB so they never exceed that 4GB. That is a process of optimizing settings and resolution, benchmarking, and keeping a strict eye on VRAM usage.

Someone who is not in the know may have a hard time doing this and making sure they never exceed that 4GB memory buffer. And if they are on a PCIe 3.0 interface, which most people are, the results of exceeding 4GB could be devastating for performance. If they are not too tech-savvy, they may be shaking their heads and wondering why on Earth they cannot maintain decent frame rates with this card.

I have never seen Nvidia, or even AMD for that matter, release a card with as many shortcomings as this, at least not for $200 and branded as part of the same series as their current generation of GPUs. I don't care what the market is, or what excuse you give for AMD doing this; they could have released a card that was much better than this. Nvidia is about to release the 3050, which is apparently crap for mining, but I am sure it will perform a hell of a lot better than the 6500 in gaming. And you won't have to worry about whether you have PCIe 4.0 or 3.0, and won't have to worry if, god forbid, you happen to exceed the memory buffer.
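Just to illustrate how that spill-over scales, here's a sketch with purely made-up numbers: the 500MB-per-frame figure is not a measurement, the link speeds are the rough theoretical ones, and real drivers stream far more intelligently than this, but the proportions are the point.

# Time to move 'spill_mb' across the PCIe link each frame, given rough link speeds in GB/s
# (~7.9 GB/s for PCIe 4.0 x4, ~3.9 GB/s for 3.0 x4, ~15.8 GB/s for 3.0 x16).
def spill_ms(spill_mb, link_gbs):
    return spill_mb / 1024 / link_gbs * 1000

for label, gbs in [("PCIe 4.0 x4", 7.9), ("PCIe 3.0 x4", 3.9), ("PCIe 3.0 x16", 15.8)]:
    print(f"{label}: ~{spill_ms(500, gbs):.0f} ms for 500MB")
# ~62 ms, ~125 ms and ~31 ms respectively - even the best case blows way past
# a 16.7 ms (60 fps) frame budget, which is why the spill hurts so much.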
Last edited by ZeekAncient; 23 Jan 2022 @ 14:05
r.linder 23 Jan 2022 @ 14:18
Originally posted by Andrius227:
Originally posted by 尺.ㄥ丨几ᗪ乇尺:
Don't forget that it's also using a 64-bit memory bus instead of the 128-bit bus the 5500-XT had (the RX 580 was 256-bit), so memory bandwidth is effectively halved on top of that.

The card basically just gets an "Eh, at least it's better than the 1050 Ti" award.

Comparing it to a ~5-year-old GPU is unfair. The RTX 3050 comes out soon and that's what the RX 6500 XT will have to compete with.

And I think I know which one is going to be better.

The RX 6500 XT would only get a 'Better than nothing' or 'The only GPU available because no one wants it' award from me.
It's in the same price range as the 1050 Ti right now, so it has a place in the current market as a barebones entry-level card, but AMD could've filled that spot with an entirely different product. It doesn't make sense why they'd make a 6500-XT that's slower than the 5500-XT and too slow to realistically use its raytracing support, which has to be backed by FSR, which doesn't have widespread support. FSR will never make it to Cyberpunk, for example; that's pretty much an NVIDIA title, and other NVIDIA-leaning titles won't get it either.

The 3050 will probably smash the 6500-XT, unless NVIDIA is doing pretty much the same thing and just refreshing the 1050 Ti or 1650 with RT support. Then they'll both be pretty lackluster, but if you have literally nothing and just need a GPU you can afford, I guess it doesn't really matter unless you're looking for better performance and visuals than the latest consoles.
Last edited by r.linder; 23 Jan 2022 @ 14:20
r.linder 23 Jan 2022 @ 14:25
Originally posted by ZeekAncient:
Originally posted by 尺.ㄥ丨几ᗪ乇尺:
Don't forget that it's also using a 64-bit memory bus instead of the 128-bit bus the 5500-XT had (the RX 580 was 256-bit), so memory bandwidth is effectively halved on top of that.

Exactly, and this is exactly what my problem with this card has been. Only 4GB is bad enough, but couple that with a minuscule 64-bit memory bus and only 4 lanes on the PCIe 4.0 interface, and you could be running into some serious problems.

The smaller the memory buffer is, the more likely you are to have to use system memory. Keep usage under 4GB and you should be fine, but even AMD said that is nigh impossible with today's games, and they said that in 2020. Once you exceed 4GB, the GPU will dip into system memory. The 64-bit memory bus and the limited PCIe bandwidth make this process extremely slow compared to a card with a larger memory bus and faster PCIe bandwidth. This is where performance can potentially be crippled, no matter what settings or resolution you are using.

Someone in the know can make sure they keep a game's VRAM usage to around 3GB so they never exceed that 4GB. That is a process of optimizing settings and resolution, benchmarking, and keeping a strict eye on VRAM usage.

Someone who is not in the know may have a hard time doing this and making sure they never exceed that 4GB memory buffer. And if they are on a PCIe 3.0 interface, which most people are, the results of exceeding 4GB could be devastating for performance. If they are not too tech-savvy, they may be shaking their heads and wondering why on Earth they cannot maintain decent frame rates with this card.

I have never seen Nvidia, or even AMD for that matter, release a card with as many shortcomings as this, at least not for $200 and branded as part of the same series as their current generation of GPUs. I don't care what the market is, or what excuse you give for AMD doing this; they could have released a card that was much better than this. Nvidia is about to release the 3050, which is apparently crap for mining, but I am sure it will perform a hell of a lot better than the 6500 in gaming. And you won't have to worry about whether you have PCIe 4.0 or 3.0, and won't have to worry if, god forbid, you happen to exceed the memory buffer.
You only really need to worry about 3.0 vs 4.0 if they do the same thing and limit the link to x8 or x4. If it's x16, you're golden.

They really don't need to be pushing 4.0 and 5.0; they're just going to alienate a lot of ultra-budget users who can't afford to upgrade pretty much their entire system just to get the most out of a $200 GPU.
A&A 23 Jan 2022 @ 14:42
Even if the GPU were 128-bit, miners would buy it again.
We saw this with the RX 6600 XT.
One reason I've seen for having the RT-capable cards is that you can use the DLSS setting, if the game allows it.
Around 120 apps and games support DLSS and around 70 games support FSR, and there are also some programs that try to emulate it or enable it for other games.

Well, today you can't mine ETH or BTC with a GTX 1060 3GB.


Well, the other option is to OC the bus speed and the memory clock.
Last edited by A&A; 23 Jan 2022 @ 15:12
r.linder 23 Jan 2022 @ 15:36
Originally posted by A&A:
Even if the GPU were 128-bit, miners would buy it again.
Yeah, what you can't make up in performance per card, you can make up for by running a crapton of those cards. It's one reason why the RX 570 8G was super popular with miners: it was cheap enough to buy in bulk.
Last edited by r.linder; 23 Jan 2022 @ 15:36
UserNotFound 23 Jan 2022 @ 19:48
Originally posted by 尺.ㄥ丨几ᗪ乇尺:

The 3050 will probably smash the 6500-XT, unless NVIDIA is doing pretty much the same thing and just refreshing the 1050 Ti or 1650 with RT support. Then they'll both be pretty lackluster, but if you have literally nothing and just need a GPU you can afford, I guess it doesn't really matter unless you're looking for better performance and visuals than the latest consoles.
Nobody doubts the RTX 3050 will 'smash' the RX 6500 XT; on paper and at MSRP, it's not even close. But early price leaks show the former being listed at 460 USD to well clear of 500 USD, so how is it a fair comparison? The RX 6500 XT was deliberately crippled so it wouldn't attract miners; the RTX 3050 wasn't, so be prepared for short supply, and when demand exceeds supply by a good margin, I do not expect to see the card sold for less than 500 USD. MSRP is a bad joke now....
_I_ 23 Jan 2022 @ 20:07
A lower-end GPU will not attract anyone unless its price is very low.
It may attract miners if its mining performance vs. electricity cost is profitable; mining does not need much bandwidth, USB 3.0 is enough for a single 3090 to mine on.
xSOSxHawkens 23 Jan 2022 @ 20:30
Originally posted by _I_:
A lower-end GPU will not attract anyone unless its price is very low.
It may attract miners if its mining performance vs. electricity cost is profitable; mining does not need much bandwidth, USB 3.0 is enough for a single 3090 to mine on.
Mining needs VRAM though; that's part of the point of the 4GB limit. No one thinks 4GB is good, but it is 100% Ethereum-mining-proof. Anything with 6GB or more can be mined on.

Now if another profitable altcoin comes along we will have to wait and see, but for now the 6500 XT is worthless for miners. That 8GB 3050 though... it will likely sell like hot cakes with them.
ZeekAncient 23 Jan 2022 @ 20:54
Originally posted by xSOSxHawkens:
Now if another profitable altcoin comes along we will have to wait and see, but for now the 6500 XT is worthless for miners. That 8GB 3050 though... it will likely sell like hot cakes with them.

From what I am reading, the 3050 will not be very good at mining. Perhaps it won't be worth it at all to mine with the 3050, but we will have to wait and see, I guess.

https://www.techspot.com/news/93088-numbers-show-rtx-3050-isnt-worth-mining.html

I personally think AMD should have gone a route similar to this one. At the very least, if they were set on making it 4GB only, give it more bandwidth than PCIe 4.0 x4.
Last edited by ZeekAncient; 23 Jan 2022 @ 20:56
Monk 23 Jan 2022 @ 20:56
I cannot believe how many people are defending this blatant money-grubbing poor excuse for a card.

Frankly, I don't care if it's 'cheap', because it isn't when you think about it; it is underpowered and awful.

Honestly, it's AMD back to being AMD. I guess a lot of people are too young to remember what they were like pre-FX, when they were on top. They are just as bad as Intel and NVIDIA, and fanboys for any of them are the only ones who will blindly defend their BS.

If you only have 200 bucks to spend, look for an old second-hand 1060 or 980; you'll be better off.

Oh, and it isn't miners buying up these low-end cards en masse and pushing the prices up; the same mostly goes for the high end too.
UserNotFound 23 Jan 2022 @ 21:10
Originally posted by Monk:
I cannot believe how many people are defending this blatant money-grubbing poor excuse for a card.

Frankly, I don't care if it's 'cheap', because it isn't when you think about it; it is underpowered and awful.

Honestly, it's AMD back to being AMD. I guess a lot of people are too young to remember what they were like pre-FX, when they were on top. They are just as bad as Intel and NVIDIA, and fanboys for any of them are the only ones who will blindly defend their BS.

If you only have 200 bucks to spend, look for an old second-hand 1060 or 980; you'll be better off.

Oh, and it isn't miners buying up these low-end cards en masse and pushing the prices up; the same mostly goes for the high end too.
Well, that's your opinion or take on the matter; what about the RTX 3050, isn't it a big money grab as well? As stated, when the RX 6500 XT is used for games at 1080p with a mix of low, medium and high in-game graphics settings, it does pretty well. And given it's a new card from AMD, driver support will be forthcoming for years to come.

As for getting a used GTX 1060 (I assume you meant the 6GB version) for 200 bucks, good luck with that! A GTX 980 at 200 USD, though, is doable.

The RX 6500 XT was never meant for high-res, maxed-out gaming at 1080p (except perhaps in eSports titles). Many reviewers even ran HD texture packs (just to cripple it more), but when running games at 1080p with reasonable settings (as long as one doesn't exceed the 4GB frame buffer), it does well enough, as was intended. That it costs close to what was considered mid-range 3 years or so back is irrelevant; we can't keep living in the past, COVID-19 has seen to that.
Last edited by UserNotFound; 23 Jan 2022 @ 21:12