But when the 5500 XT and 6500 XT see more benefit from PCIe 4.0 than the 5700 XT and 6700 XT do, that's bad. They're intentionally designed that way, possibly to give lower-end users more incentive to upgrade to something better that will run much faster even on PCIe 3.0.
The gap between the 6500 XT and the 6600 XT is insulting by itself, and even the 5500 XT is faster.
It's sad to see the current state of GPUs.
Again, I ain't defending it. Like the older GTX 1650, I firmly believe the RX 6500 XT is a terrible card for modern gaming. But it targets 1080p gamers who are willing to compromise on graphics settings, and IF the 6500 XT is easily available near MSRP, there's no reason some budget gamers with PCIe 4.0-compatible motherboards wouldn't go for it, given how limited the choices are in the 'budget' GPU market.
The comparisons I've seen show it beating the GTX 1650 at 1080p (remember, the GTX 1650 is still being sold at crazy high prices given its poor gaming performance), at a mix of Medium-High settings, never at Ultra. Nobody expects an entry-level card to run modern games at Ultra, as that would exhaust the VRAM and saturate the memory interface, but for testing/benchmarking purposes it's understandable, since it shows the strengths and weaknesses of the card under test.
Who can forget the GTX 750 Ti? I guess that was just such a terrible product, being one of the earliest budget-range products that could play most games at 1080p and higher settings well. Less well known, but the GTX 650 Ti Boost a generation earlier wasn't far off the GTX 660 either (which was itself close-ish to the GTX 660 Ti). Even the "vanilla" GTX 650, though substantially weaker than these, was pretty good considering it sold at a price point that would make us cry today, while still being overall okay for gaming at the time (of course there were compromises; it wasn't going to get you insane frame rates at max settings in all titles). The GTS 250/450 were popular in their time for much the same reasons.
No, this isn't an exciting or groundbreaking end of the market, and most of these cards are pretty "boring" with just a few standout exceptions (like the GTX 750 Ti I mentioned), but they usually aren't bad for the purpose they're intended to serve either.
I had issues with the 970 and 980 due to having only 4GB of VRAM, while my 780 6GB provided just enough VRAM while performing the same as a 970 in most games. The real downside was that the 780 used almost twice as much power. I solved that by moving to a 980 Ti at the time.
I wouldn't even give an AMD or NVIDIA x50/xx50/x500 GPU to a young PC gamer. They aren't going to enjoy games as much with terrible graphics, when even most games on PS3 and PS4 look and run better. You are NOT going to enjoy games like GTA V or RDR2 on a GPU like a GTX 1050, 1050 Ti, or AMD 5500/6500 series; it's just never going to happen. They are too low-end. In that case, playing a game like RDR2 on even the older non-Pro PS4 would be more fun, run smoother, and look a heck of a lot better, all for way cheaper.
If all you need is a work PC, then something with an 8th Gen i5 or later using the onboard Intel GPU would be plenty.
But I'm still enjoying playing GTA 5 on a Vega 7 with 2 GB VRAM.
Possible. To answer you specifically: I used to have an extremely weak PC back then. 4 GB of low-clocked RAM and a very weak CPU. The complete package was simply too bad, and low settings were the best I could manage. That is why Bad Motha's post isn't realistic at all to me.
They're looking for a card that can handle 1080p gaming decently, and that's the niche these two cards fall into. With a decent CPU and enough RAM, these cards can run games well enough at Low-to-High settings and still deliver a playable framerate. Yes, some game engines are harder on them than others, so owners of these cards need to find the right mix of in-game graphics settings to get the most out of them.