"8GB is more than enough for any game"
as
"8GB is more than enough for any game on Ultra settings." Which isn't what I said.
The card basically just gets an "Eh, at least it's better than the 1050 Ti" award.
Comparing it to a ~5-year-old GPU is unfair. The RTX 3050 comes out soon, and that's what the RX 6500 XT will have to compete with.
And I think I know which one is going to be better.
The RX 6500 XT would only get a "Better than nothing" or "The only GPU available because no one wants it" award from me.
That means you have 2×2GB vs 4×1GB vs 8×1GB.
As AMD would say: "Cheap edition."
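Worth spelling out what that chip count implies. GDDR6 packages are 32 bits wide each, so the number of chips sets the total bus width and, with it, the peak memory bandwidth. A quick back-of-envelope sketch in Python (the 18 Gbps data rate is an assumed speed grade, not a quoted spec):

    # GDDR6 chips expose a 32-bit interface each, so chip count
    # fixes the bus width and the peak memory bandwidth.
    CHIP_BUS_BITS = 32
    DATA_RATE_GBPS = 18  # per pin; assumed speed grade for illustration

    for chips, gb_per_chip in [(2, 2), (4, 1), (8, 1)]:
        bus_bits = chips * CHIP_BUS_BITS
        bandwidth_gbs = bus_bits * DATA_RATE_GBPS / 8
        print(f"{chips} x {gb_per_chip}GB chips: {chips * gb_per_chip}GB total, "
              f"{bus_bits}-bit bus, ~{bandwidth_gbs:.0f} GB/s peak")

The 2×2GB row is the 6500 XT's configuration: a 64-bit bus and roughly 144 GB/s, versus ~288 GB/s for the same 4GB capacity spread over four 1GB chips.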
Somebody should start selling us packets of memory chips, because miners would love the higher capacity and higher bandwidth :D
Well, somehow I can still play Mafia 2 on a GT 240 over PCIe 2.0 x2, 20 fps at Ultra.
Exactly. And this is exactly what my problem with this card has been. Not just the 4GB on its own, but couple that with a minuscule 64-bit memory bus and only 4 lanes on the PCIe 4.0 interface, and you could be running into some serious problems.
The smaller the memory buffer is, the more likely you are to have to use system memory. Keep usage under 4GB and you should be fine, but even AMD said that is nigh impossible with today's games, and they said this in 2020. Once you exceed 4GB, the GPU will dip into system memory. The 64-bit memory bus and the limited PCIe bandwidth make this process extremely slow compared to a card with a larger memory bus and faster PCIe bandwidth. This is where performance can potentially be crippled, no matter what settings or resolution you are using.
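Rough numbers make the cliff obvious. The per-lane rates below are the usual usable figures after encoding overhead, and 144 GB/s is the 6500 XT's quoted peak VRAM bandwidth:

    # Local VRAM vs. the PCIe link the card falls back to once
    # the 4GB buffer overflows into system memory.
    PCIE_GBS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}  # usable GB/s per lane
    VRAM_GBS = 144  # RX 6500 XT peak memory bandwidth

    for gen, per_lane in PCIE_GBS_PER_LANE.items():
        link = per_lane * 4  # the card only wires up 4 lanes
        print(f"PCIe {gen} x4: {link:.2f} GB/s "
              f"(~{VRAM_GBS / link:.0f}x slower than VRAM)")

So the moment textures spill over, the card is fetching them across a link roughly 18x (PCIe 4.0) or 37x (PCIe 3.0) slower than its own VRAM.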
Someone in the know can keep the VRAM usage of games to around 3GB to make sure they never exceed that 4GB. That is a process of optimizing settings and resolution, benchmarking, and keeping a strict account of VRAM usage (see the sketch below).
Someone who is not in the know may have a hard time making sure they never exceed that 4GB memory buffer. And if they are on a PCIe 3.0 interface, which most people are, the results of exceeding 4GB could be devastating for performance. And if they are not too tech-savvy, they may be shaking their heads and wondering why on Earth they cannot maintain decent frame rates with this card.
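For the "strict account of VRAM usage" part, on Linux the amdgpu driver exposes the counters in sysfs, so a few lines of Python can watch the budget. A minimal sketch, assuming amdgpu and that the GPU is card0 (on Windows you'd reach for Task Manager or MSI Afterburner instead):

    # Poll the amdgpu sysfs counters and flag when usage nears 4GB.
    import time

    DEVICE = "/sys/class/drm/card0/device/"  # adjust card0 if needed

    def read_mib(name):
        with open(DEVICE + name) as f:
            return int(f.read()) / 2**20  # sysfs reports bytes

    total = read_mib("mem_info_vram_total")
    while True:
        used = read_mib("mem_info_vram_used")
        note = "  <-- nearing the 4GB cliff" if used > 3 * 1024 else ""
        print(f"VRAM: {used:.0f} / {total:.0f} MiB{note}")
        time.sleep(2)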
I have never seen Nvidia, or even AMD for that matter, release a card with as many shortcomings as this, at least not for $200, and brand it as part of the same series as their current generation of GPUs. I don't care what the market is, or what excuse you give for AMD doing this: they could have released a card that was much better than this. Look, Nvidia is about to release the 3050, a card that is apparently crap for mining, but I am sure it will perform a hell of a lot better than the 6500 XT in gaming. And you won't have to worry about whether you have PCIe 4.0 or 3.0, and won't have to worry if, God forbid, you happen to exceed the memory buffer.
The 3050 will probably smash the 6500 XT, unless NVIDIA is doing pretty much the same thing and just refreshing the 1050 Ti or 1650 with RT support. Then they'll both be pretty lackluster, but if you have literally nothing and just need a GPU you can afford, I guess it doesn't really matter unless you're looking for better performance and visuals than the latest consoles.
They really don't need to be pushing 4.0 and 5.0; they're just going to alienate a lot of ultra-budget users who can't afford to upgrade pretty much their entire system just to get the most out of a $200 GPU.
We saw this with the RX 6600 XT.
I did see something like a reason to have an RTX card: you can use DLSS, if the game supports it.
Around 120 apps and games support DLSS, and around 70 games support FSR, but there are also some programs that try to emulate them or enable them for other games.
Well, today you can't mine ETH or BTC with a GTX 1060 3GB.
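That's a DAG-size problem more than a speed problem: the ethash dataset starts near 1 GiB and grows roughly 8 MiB every epoch (30,000 blocks), so you can estimate when a given card ages out. A rough sketch using those approximate constants:

    # Estimate the ethash epoch at which the DAG outgrows a card's VRAM.
    GIB, MIB = 2**30, 2**20
    START, GROWTH = 1 * GIB, 8 * MIB  # approximate ethash DAG constants

    for vram_gb in (3, 4, 6, 8):
        epoch = (vram_gb * GIB - START) // GROWTH
        print(f"{vram_gb}GB card runs out around epoch {epoch} "
              f"(block ~{epoch * 30_000:,})")

The 4GB row (around epoch 384) lines up with 4GB cards dropping off ETH in late 2020, and it's also why the 4GB 6500 XT can't hold the DAG at all today.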
Well, the other option is to OC the FSB and the memory clock.
It may attract miners if its mining performance vs. electricity cost is profitable. Mining does not need much bandwidth; a USB 3.0 riser (PCIe x1) is enough for a single 3090 to mine.
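The reason PCIe bandwidth barely matters is that ethash is bound by VRAM bandwidth: each hash does 64 random 128-byte reads from the DAG sitting in VRAM, and only tiny job/share packets cross the PCIe link. A back-of-envelope ceiling, taking that 64 x 128-byte figure as given:

    # Ethash hashrate ceiling implied by memory bandwidth alone.
    BYTES_PER_HASH = 64 * 128  # 8 KiB of DAG reads per hash

    for name, vram_gbs in {"RTX 3090": 936, "RX 6500 XT": 144}.items():
        mhs = vram_gbs * 1e9 / BYTES_PER_HASH / 1e6
        print(f"{name}: ~{mhs:.0f} MH/s ceiling")

That puts the 3090 near its real-world ~120 MH/s, while the 6500 XT's figure is moot anyway since the DAG no longer fits in 4GB.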
Now if another profitable altcoin comes along we will have to wait and see, but for now the 6500 XT is worthless for miners. That 8GB 3050, though... it will likely sell like hot cakes to them.
From what I am reading, the 3050 will not be very good at mining. Perhaps it won't be worth it at all to mine with the 3050. But we will have to wait and see, I guess.
https://www.techspot.com/news/93088-numbers-show-rtx-3050-isnt-worth-mining.html
I personally think AMD should have gone with a route similar to this one. At least, if you were set on making it 4GB only, give it higher bandwidth than PCIe 4.0 x4.
Frankly, I don't care if it's "cheap", as it isn't when you think about it; it is underpowered and awful.
Honestly, it's AMD back to being AMD. I guess a lot of people are too young to remember what they were like pre-FX, when they were on top; they are just as bad as Intel and Nvidia, and fanboys for any of them are the only ones who will blindly defend their BS.
If you only have 200 bucks to spend, look for an old second-hand 1060 or 980; you'll be better off.
Oh, and it isn't miners buying up these low-end cards en masse pushing the prices up; same for the high end, for the most part.
As for getting a used GTX 1060 (assuming you meant the 6GB version) for 200 bucks, good luck with that! As for the GTX 980 at $200, that's doable.
The RX 6500 XT was never meant for high-res, maxed-out gaming; at 1080p (perhaps excepting eSports titles) many reviewers even ran HD texture packs just to cripple it more. When running games at 1080p with reasonable settings (as long as one doesn't exceed the 4GB frame buffer), it does well enough, as was intended. That it costs close to what was considered mid-range three or so years back is irrelevant, as we can't be living in the past; COVID-19 has seen to that.