He ran AMD for years until a few months back, when he swapped to Intel / Nvidia.
And how does warning people off a product the entire tech space has pretty much condemned as awful give him an agenda?
While you can buy cards second hand, there is zero reason to buy this thing.
I do love how you think announcing you're blocking someone should make anyone give a flying monkey's left hairy testicle about it; just do it.
Frankly, I never block anyone, regardless of any disagreements we have or how wrong I think they are, as they may one day have the answer to a problem I'm having.
As for Hardware Jesus, aka Gamers Nexus 'Steve': he's one of the more advanced tech YouTubers, does a lot of proper in-depth testing of kit, and has been around for years. Despite his smaller channel size, he's widely known as one of the big tech YouTubers and is very knowledgeable overall.
I'm so lucky, I'm living rent-free in this guy's head so much that he has to announce he's blocking me!
What ever will I do?
Anyway...
Anyway, it's pleasantly interesting hearing you state that as your opinion on the Phenom (that it was great for being a value alternative to Intel's offerings at the time) when you seem to give Zen 2 such a hard time for being a poor choice, despite it fitting that description even better than the K10 lineup did, IMO. So it surprises me you'd state that.
I didn't defend nor recommend the 6500 XT though.
You made a statement and I poked you for your reasoning on it. Nothing more. I like to have discussions with people to understand their reasoning for things. If you take that as people wanting to have arguments, or as them only disagreeing with you because THEY are the ones with the bias, then I'm not really sure what to tell you.
In a thread that's been rife with distaste towards AMD in general, well beyond just the 6500 XT being bad, it's intriguing to see the discussion steered back towards how bad that card is whenever any other statement gets questioned.
Also, the OP marked this as the answer to the main post (my bolding for emphasis of certain parts):
Meanwhile 100 posts ago...
I kinda stand by that as being both an accurate description of how the thread has turned out and a current running theme, from the looks of it, among other commenters.
_______________________
As for valid tech discussion vs critique of this whole thread's collapse...
The VII was far from a flop, and you should know this. While it might not have sold like hot cakes, it did well enough, and more importantly it gave AMD a test run to iron out 7nm before using the process on Ryzen, to great success with Ryzen 3000. It doesn't take a rocket scientist to see that they ran a well-understood architecture on a not-well-understood fab process rather than trying a new fab and a new arch at the same time. The result was glorious for the next hop (Ryzen 3000). It's less obvious with the 6nm node, but maybe they're doing that again; I suspect it's more about limits on fab line capacity than anything, but who knows outside AMD.
Umm... fewer cores = faster cores. This is pretty common knowledge. So I think AMD is likely right when they say it has the highest sustained GPU clock rates ever (for an out-of-box product), considering it runs at 2.6 GHz out of the box. For comparison, that is a faster core speed than either my RX 6800 or 6900, both of which top out around 2500 MHz, and much faster than many overclocked GPUs from others (I have a buddy with a high-end ASUS 3080 that runs at about 2.3 GHz). Now, if you want to argue that core speed is a bad raw comparison of power, go ahead; that argument has been in play since the GHz wars 20-25 years ago. But their claim is likely not wrong.
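For a rough sense of scale, the clock figures quoted above work out like this (numbers are the ones from this post; actual boost clocks vary card to card):

```python
# Out-of-box clock speeds as quoted in the post above (MHz).
clocks = {
    "RX 6800 / 6900": 2500,
    "ASUS RTX 3080": 2300,
}
rx6500xt = 2600  # AMD's claimed out-of-box clock for the 6500 XT

for card, mhz in clocks.items():
    # Percentage clock advantage of the 6500 XT over each card.
    pct = (rx6500xt - mhz) / mhz * 100
    print(f"6500 XT clocks {pct:+.0f}% vs {card} ({mhz} MHz)")
# 6500 XT clocks +4% vs RX 6800 / 6900 (2500 MHz)
# 6500 XT clocks +13% vs ASUS RTX 3080 (2300 MHz)
```

Which is exactly why clock speed alone says little: a 4-13% clock lead doesn't come close to offsetting the much wider cores and buses on the bigger cards.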
As for a reply, to offer a contrasting opinion from someone who can speak to AMD use both recently and historically: their GPUs are not nearly as bad as they get made out to be. I have three of their upper-end modern cards (6900 XT, 6800 (non-XT), and a Vega 64) along with some legacy cards such as the HD 7870, HD 5770, and HD 3850 AGP, all of which have been used in recent years with either Windows 7 or 10, with the Vega and up seeing daily use.
None have had major issues. I did have one minor and highly specific bug that lasted about 1.5 years but has since been fixed. It required a specific generation of GPU, 3 or more screens, and two or more refresh rates in use; I think it may also have needed differing resolutions on the screens, I can't remember. It would cause occasional driver hangs.
My biggest driver issue of note in the past 12 months? When I swapped my 6800 out and my 6900 XT in, then didn't bother to reinstall drivers and ran on the same install. Everything worked great, but something got messed up with the tessellation settings (or something like that) despite no driver-level changes, which invalidated some of my 3DMark runs. A clean uninstall/reinstall using the driver's included clean-install feature fixed all the issues; issues caused, mind you, by me being lazy, not by the drivers themselves. That is exactly why I generally do clean driver installs (and you all should too!).
I also run NV as well, though nothing super new anymore. On the supported side I have a mobile Quadro and a GTX 960 4GB; unsupported, a GTX 670 4GB and a desktop Quadro. If I need a specific or additional NV resource/benchmark/user review, two of my close real-life friends allow me access to (or provide results from) benching more or less at my leisure (as I am their main computer-repair friend) on a 1060 6GB, RTX 2060, or RTX 3080 (one also has a 4GB RX 580 in the mix).
Both AMD and Nvidia have driver crashes. I will admit that, though both are infrequent, the AMD ones are slightly more common. That doesn't mean they are bad, or common at all, just *slightly* more so than Nvidia's. That said, AMD handles failure with grace where NV pukes its guts. Most AMD driver hangs reset and restore usability of either 2D (Windows) or both 2D and 3D within a few seconds. When an NV driver hangs or fails, you are lucky to get anything back working even enough for a clean shutdown. If I have to choose between 99% stability with hard crashes or 98% stability with soft crashes, I know which I would prefer. It's better to at least be able to close things down and shut down.
Neither card was or is honestly misrepresented, though.
Yes, the 430/730 sucked, but as you say, they were not often marketed as gaming cards (outside of shoddy OEMs which sold them as such). From NV's side of things, most of those were meant to be graphics accelerators for video consumption, office use, or other non-gaming purposes.
Likewise, the 6500 XT *is* a gaming card and is marketed correctly. The box itself states the specific resolution it is meant for: 1080p gaming. At that resolution, with settings reasonable for an entry-level card, it will do its intended job quite well, especially on most Windows 11-era machines, which will more or less all have hardware modern enough for Gen 4 PCIe. Even on older systems the performance won't be total crap, provided you stick within its marketed and quite openly established limits. People can think it's crap value all day long, but the fact is it's a card you can buy that *will* offer usable 1080p performance, especially at low-to-mid-range settings. Let's remember, again, that this is an entry-level GPU that should not be seen as a maximum-settings option. We can argue all day over whether previous generations should dictate performance levels (and FWIW, I agree it would be nice if this card performed better than its predecessor), but at the end of the day the entry-level product gives entry-level performance, and that's low-to-medium 1080p, which is where the 6500 will do well enough.
Let's remember, it's not like AMD is trying to pull a 3.5 GB VRAM stunt on us here... They have been pretty open about the limitations and the compromises they have made; even if you don't believe their reasons, they have at least aired that the limits exist.
This I can agree with in spades, assuming both are at or near MSRP. The catch on the 6600 is: what price is it? I have seen those as high as what I paid for my 6800, which is silly af. If one is choosing between a $270 Gigabyte Eagle 6500 XT in stock or a 6600 at $460, well, that's a pretty big jump (both being the cheapest currently in-stock listings on Newegg). The 6600 is a much better card, but I am not sure it's $190 or more better; I guess it would mostly depend on the system and use case. Though I tend to think NV is a poor value, if one were looking in the $350-400 range then, depending on how it performs, the 3050 might be a great value. But from what we are already seeing on pricing, it may come down to:
$280-350 = 6500 XT models
$450-700 = 3050 / 6600
$600-800 = 6600 XT
$900 = 6700 XT
$1000+ = 3060+ / 6700 XT-6800+
Assuming one wants to buy new and/or current gen (based on prices for in-stock goods on 01/24/22 on newegg.com).
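Purely as a toy illustration of those tiers (price ranges copied from the list above; `cards_in_budget` is my own throwaway helper, not anything official):

```python
# Rough Jan 2022 Newegg price tiers from the post above (USD, new/current gen).
PRICE_TIERS = [
    ((280, 350), "6500 XT models"),
    ((450, 700), "3050 / 6600"),
    ((600, 800), "6600 XT"),
    ((900, 900), "6700 XT"),
    ((1000, None), "3060+ / 6700 XT-6800+"),  # open-ended top tier
]

def cards_in_budget(budget):
    """Return the tiers whose lower bound fits within the given budget."""
    return [name for (lo, _hi), name in PRICE_TIERS if lo <= budget]

print(cards_in_budget(500))  # ['6500 XT models', '3050 / 6600']
```

Which makes the poster's point visible: below roughly $450 the 6500 XT is the only new, current-gen option on the board.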
_______________________
To wrap up: yeah, the 6500 XT sucks and involves lots (and lots) of compromise. Yes, the 3050 will likely beat it. But for the right purchaser, the right budget, and with real availability, it's far from the worst someone could possibly do right now.
Regardless of how they are marketed, my opinion on these products still stands: they are very bad products. My main problem with the 6500 XT is that AMD did not need to cut as many corners; they could have released a better product and still kept the $200 price tag. Read pretty much every review from every publication out there and they all concur. Having 4GB is one thing, but the card has also been hamstrung by its PCIe 4.0 x4 interface. Anytime a game goes over that 4GB frame buffer, it has to dip into system memory, and that limited bandwidth will slow it down considerably. On PCIe 3.0 it can be almost crippling. You are almost forced to make absolutely sure games don't exceed that 4GB memory buffer, which, by AMD's own admission, can be nigh impossible in today's games, even at 1080p on lower settings. So this all adds up to a very bad product, one that will not perform how AMD claims, and one that will potentially leave many buyers very disappointed.
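To put numbers on why that x4 link hurts, here's a back-of-envelope sketch using the published PCIe spec rates (the 144 GB/s VRAM figure is the 6500 XT's 64-bit GDDR6 bus at 18 Gbps; real-world link throughput is a bit below these theoretical numbers):

```python
# Rough one-direction PCIe link bandwidth for the 6500 XT's x4 interface.
def pcie_bandwidth_gbps(transfer_rate_gt: float, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe link.
    PCIe 3.0+ uses 128b/130b encoding: 128 payload bits per 130 line bits."""
    return transfer_rate_gt * lanes * (128 / 130) / 8

gen3_x4 = pcie_bandwidth_gbps(8.0, 4)    # PCIe 3.0: 8 GT/s per lane
gen4_x4 = pcie_bandwidth_gbps(16.0, 4)   # PCIe 4.0: 16 GT/s per lane
vram = 144.0                             # local GDDR6 bandwidth, GB/s

print(f"PCIe 3.0 x4: {gen3_x4:.1f} GB/s")                 # ~3.9 GB/s
print(f"PCIe 4.0 x4: {gen4_x4:.1f} GB/s")                 # ~7.9 GB/s
print(f"VRAM is {vram / gen3_x4:.0f}x faster than Gen3")  # ~37x
```

So any texture that spills past the 4GB frame buffer is served at roughly 1/18th of VRAM speed over Gen 4, and about 1/37th over Gen 3, which is why the same card loses so much more performance in a PCIe 3.0 slot.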
The RX 6500XT is junk and should have been kept as a laptop GPU as it was intended to be. It does nothing to help the current landscape of the GPU market. But honestly, lol, I don't think any GPU release does at this point.
IMHO, this GPU should not even exist and I wouldn't buy it to be a paper weight.
I don't believe the 3050 will be the competition for the 6500 XT; it will effectively sit in a higher performance tier (around the GTX 1660 Ti) and a higher price tier.
I think the low budget gaming end of the market has been starved for a long while. AMD have thrown some crumbs.
The real competition will be what Intel does with their budget Arc GPU offering. Allegedly, their 96 EU desktop card will come with 4GB of video memory and a 64-bit memory bus. The Intel card's performance, features, and pricing will show whether AMD called the market right.
The article linked below relates to Intel's mobile GPUs, but I have seen YouTube channels speculate that the desktop versions will follow the same configurations.
https://www.techspot.com/news/93103-leaked-slide-confirms-intel-arc-alchemist-memory-configurations.html
Will Intel be keen enough to take market share by undercutting the competition?
Here's an angry AMD fanboy, angry about AMD. Not everyone's cup of tea, but he raises a laugh from me from time to time.
https://youtu.be/HH4foh0e1zc
The RTX 3050's MSRP is $250, but the card in this review had a mistake on EVGA's part: they installed a vBIOS intended for the $330 model, so the performance benchmarks aren't indicative of the weakest 3050s.
It generally sits just below the 2060 KO, which was $300 back when EVGA released that model, so a $330 card being slightly slower is a little disappointing, but not terrible if you can actually land one at a low enough price compared to a 2060. It compares well with the GTX 16 series cards and the 1070, so it's possible we won't see any more GTX-series GPUs in the future.
On average it's at LEAST 30 to 40% faster than the 6500 XT, as I pretty much expected, since NVIDIA learned their lesson a long time ago with the 9800 GT and 8800 GT and wouldn't make the same stupid mistake. No, it's not an amazing value or anything, but it's still a lot better than the 6500 XT.
But only buy the base model if you have no better options; the fancy versions are a total waste.
DLSS is the selling point for me, but I'm hearing initial stock at MSRP will be extremely limited where I am in the UK. Pricing is expected to go up afterwards, to as much as £400 at the upper estimate.
I paid £230 for a 1660 Super in 2020 and £400 for a RX 6600 XT in August 2021.
https://store.overclockers.co.uk/sapphire-radeon-rx-6500-xt-pulse-gaming-4gb-gddr6-pci-express-graphics-card-gx-39s-sp.html
Fact is, it's still 2x the cost for a used card that has no warranty.
The 6500 XT is 100% a compromise card, but it's targeted specifically at 1080p, at compromised settings (not ultra or high), and at 60-120 fps (not high refresh). It does all these things as well as or better than comparably priced cards in the current market (the 1650 being the worst case for comparable performance, and comparable at Gen3 x4).
It's available, it's usable, and it's better than an APU. It's up to the buyer whether paying twice as much for twice the power on a used card (not necessarily yours, just any used higher-powered card), or three times as much for twice the power on a new card, is worth it. Those are the options.
A used 1080 Ti vs a new 6600 XT? Same price? Same performance? I'll take the 6600 XT.