Hopefully this choice will lead to them further refining their cards and really nailing down that midrange. They generally hold the edge over Nvidia in the $300-700 bracket which I think is the range where most regular people will be buying.
AMD currently dominates the CPU market, though. All of the top performers are AMD, 11 of the 12 best-selling CPUs on Amazon are AMD, and Intel is currently a bit of a laughing stock. The only thing keeping Intel's market share respectable are prebuilt PCs.
All this aside I should note that one company dominating in hardware is NOT A GOOD THING and leads to excessive price gouging and lackluster performance gains. We've seen this time and time again, and we currently see both AMD and Nvidia doing this.
The way I understand it, when AMD makes GPU chips right now, they're forced to allocate between AI/compute products ("CDNA"; Instinct) and graphics products ("RDNA"; Radeon). The hardware has many similarities, but due to the complexity of AI software stacks and stability for big institutional customers, they more-or-less froze the Instinct family at the Vega instruction set.
So the key medium/long-term goals for AMD are:
1) Unify the AI/compute and graphics products into a single instruction set (and ideally also microarchitecture). This is already on the roadmap, with the apparent plan being to skip "RDNA5" in order to ship "UDNA6".
2) Get chiplet-based GPU production dialed in.
In other words, they're trying to make their GPU products strategically resemble their CPU products, and just haven't gotten there yet.
According to "some" it's indistinguishable from the real thing. (pretty funny, right?)
When they state that, I ask: then why even call it AI anything if it's indistinguishable? Why differentiate?
You can guess what happens: rage or a block, every time. It's a joke.
AI upscaling is ugly and makes my eyes hurt. Any game that forces me to use that is an automatic skip, not that there are any good games these days anyway. Most are riddled with bugs or are completely lazy.
The Ryzen 9 CPU is a joke, according to that comment? The Intel equivalent uses more electricity; the Ryzen 9 uses up to 20% less.
A fifth less on the electric bill is a joke? Plus less electrical wear and tear on the equipment.
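To put that "fifth less on the electric bill" claim in rough numbers, here's a minimal sketch. All of the figures below (a 250 W Intel part, 4 hours of load per day, €0.30/kWh) are hypothetical assumptions for illustration, not measured values from the thread.

```python
# Rough estimate of the annual electricity-cost gap between two CPUs,
# using entirely hypothetical wattage, usage, and price assumptions.

def annual_cost(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost for a component drawing `watts` while in use."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

intel_w = 250              # assumed load draw for the Intel part
ryzen_w = intel_w * 0.80   # "up to 20% less", per the comment above

intel_cost = annual_cost(intel_w, 4, 0.30)
ryzen_cost = annual_cost(ryzen_w, 4, 0.30)

print(f"Intel: EUR {intel_cost:.2f}/yr")
print(f"Ryzen: EUR {ryzen_cost:.2f}/yr")
print(f"Saved: EUR {intel_cost - ryzen_cost:.2f}/yr")
```

Under these assumptions the saving is around €20 a year — real numbers depend heavily on actual load draw, hours of use, and local electricity prices.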
but each has their own market and both survive
Otherwise I'll just go AMD, and with AMD it's also easier to run Linux.
The good old bait :D
The 7900 XTX only traded blows with the 4080 but drew almost as much power as the 4090, or more if you pit a top-end model like Sapphire's NITRO+ against a 4090 FE.
Some 7900 XTX models with a particular vapor-chamber design (the AMD reference model, PowerColor Red Devil, etc.) were plagued by overheating due to a flaw in the cooler, which drove many returns.
Many users, some of whom I know personally, have also noticed performance inconsistencies (larger FPS fluctuations, stutters, etc.) compared to 4080s and 4090s, which is probably down to AMD's drivers if it isn't caused by manufacturing defects. It wasn't the CPU, because this was all with top-end gaming chips of the time, like the 7800X3D, and it affected multiple games.
Multiple factors give AMD its lower share of the GPU market: fewer customers in general, but especially in the high end. Most of AMD's market share sits in the low end to mid-range, where people can't throw money around willy-nilly. So they dropped the high end to focus on their strengths, building up funds for what's likely a more lucrative project, until they're ready to re-enter the high-end graphics market and actually compete with NVIDIA.
Comments like this are just uncalled for and really speak to your own intelligence. If you feel the need to attack someone because they made the observation that NVIDIA is completely dominating the market (which they are, stop kidding yourself), then that's an issue you need to work on.
Radeon only offers one thing to its consumers: better performance per dollar. That only works for people who aren't interested in anything else and overlook every other detail.
GeForce has more to offer, and that's a fact:
1. In addition to AMD FSR and Intel XeSS, with GeForce RTX you get support for DLSS, so it wouldn't even matter if games didn't support one or the other, because you can use all of them and will always get access to extra performance if needed.
Speaking of DLSS, for those that can make do with frame generation, DLSS-FG is the most effective variation compared to AFMF.
Ray tracing isn't really an argument between AMD and NVIDIA anymore, because AMD performs almost as well as NVIDIA in every segment of the market; it's the other technologies and support that make up the difference.
2. NVIDIA's drivers are tried and proven to be better than AMD's, even post-2020. If AMD's drivers hadn't ranged from slightly behind in quality to borderline unusable, AMD wouldn't have earned its bad reputation for drivers.
3. NVIDIA now offers better performance per watt again compared to AMD, and the trend is upward: Blackwell is expected to be more efficient than Lovelace, while AMD has gone the opposite way. The RX 7000 series brought a sizeable increase in power consumption for not much improvement over the 6000 series, aside from the 7900 XTX, which uses far too much power to only trade blows with a 4080. Lower power usage means lower running costs and lower temperatures, and a more efficient design is also more durable and less prone to failure.
Pretty much this. AMD mainly appeals to people who don't have a lot of money and don't care about any kind of frills, whereas people who want the best go with NVIDIA.
The problem is that, these days, graphics-pipeline devs are either lazy or lack the tools (or both) to produce anything close to the crisp, optimized image we had a decade ago. So whenever the TAA blur is alleviated by DLSS, people call it "better" and conveniently omit the other problems this tech introduces.
It is what it is, and I have no hope we'll see any difference in the future unless AMD and NVIDIA magically design hardware able to do full-resolution path tracing.
This gives me 4080-level power, albeit without the RT capabilities of an Nvidia GPU.
It was the price that got me in the end. I've always had Nvidia cards, but them wanting over £2000 for their premium card this time around left a sour taste, especially since, in my opinion, RT is still in its infancy and has a long way to go.
There's no doubt that Nvidia has the better capabilities at the high end, but that doesn't change the fact that I got my 7900 XTX for around £400-500 less than a 4080, and it goes toe to toe with it in raw performance anyway. Not to mention the post after post I've seen on new-game forums about the problems 4090 owners are having getting games to work correctly.
Sad thing is that I'll likely be ready for a new rig next year, and if AMD aren't going to release anything high end, I'll either be forced to pay Nvidia's greedy prices or wait until AMD release something worth the upgrade cost.