However, I do remember a few examples:
- Final Fantasy 13 needed a mod for high-VRAM AMD GPUs. Super easy to install, but still.
- Resident Evil Revelations had a similar problem, but a later driver update fixed it.
- Earth Defense Force 6 was crashing on Radeon GPUs for the first few months. The devs finally fixed the problem (it was their fault), but only long after the mod community had already made a fix.
Both brands offer driver-level frame generation, but I personally prefer the Lossless Scaling app for upscaling and frame gen in games that don't have it natively.
I use the Radeon sharpening slider, as I like it a lot in almost all of my games, including the newest AAA ones.
Nvidia has lower power consumption.
But there is really not that much of a difference now. Choose whatever has a better price and makes you feel better. Swapping brands is a good way to avoid fanboyism.
Prior to the latest Nvidia drivers, both AMD and Nvidia had around 17-19 known issues on average; that said, they aren't the same ones.
On top of just the Nvidia drivers, there had already been some instability in games like The Last of Us, and it's happening in Part 2 as well, where Nvidia users on average experience more crashes than AMD users, or better said, AMD has no crashes that I know of with that game. Indiana Jones is another example: it's a ray-traced title, yet from the amount of complaints I could gather it seems to run better on average on AMD than on Nvidia.
I ain't a fanboy; like you, I've been changing teams each upgrade. That said, I'm going to stick with AMD. For one, the pricing here is now ridiculous, and AMD at least has better performance versus cost. The second part is that I fricking hate that 16-pin connector, and it's a given at this point, because now even 9070 XTs seem to melt even while the connector draws less than half its max rated capacity.
So if you do purchase a 9070 XT or the non-XT variant, avoid the 16-pin like the plague and stick with the trusty old 8-pin PCIe cable variants, as the board partners were allowed to pick whatever connector they wanted.
So for the first time I'll probably stick with AMD for my next upgrade. That said, I've got no reason to upgrade yet; I got a 6950 XT for cheap and it can handle games at 4K no problem.
Also, another reason I would stick with AMD is that I left the Windows ecosystem behind for Linux, and to this day Linux still favors AMD. On top of that, Linux gets better FPS than Windows in a lot of games if your GPU is AMD.
A quick example that comes to mind is Minecraft Java, which uses OpenGL. It used to rely on an nVidia OpenGL extension to render the fog in a consistently spherical fashion. On AMD (and Intel) hardware, that function was not supported, so the fog was more of a "straight wall" ahead that "moved" depending on your camera orientation. This changed the visibility of some terrain right along the fog line depending on the camera angle, which may be less pleasant or desirable.
It's not that making the fog behave that way on non-nVidia hardware was impossible. Minecraft simply used ridiculously outdated philosophies (fixed-function pipeline rendering in 2021, when hardware had been moving away from that for around two decades). The solution was to use programmable shaders and write their own fog, an added flexibility that more recent (or in this case, slightly less ancient) OpenGL versions offer. That is what it does now as of 1.17, when it updated to OpenGL 3. This goes back to Minecraft being a passion project of Notch in 2009, him choosing philosophies that were already long outdated in 2009, and technical debt piling up to the point where it took Mojang way too long to address it.
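To make the fog difference concrete, here's a tiny sketch of my own (not Mojang's actual code, and in plain Python rather than GLSL, just to show the math): planar fog keys off view-space depth, so the boundary is a flat wall that swings around as you turn, while spherical fog keys off true distance from the eye, so the boundary stays put. The fog range and position below are made-up numbers.

import math

def planar_fog(view_z, fog_start, fog_end):
    # Fog factor from view-space depth only (1.0 = clear, 0.0 = fully fogged).
    return max(0.0, min(1.0, (fog_end - view_z) / (fog_end - fog_start)))

def spherical_fog(view_pos, fog_start, fog_end):
    # Fog factor from true Euclidean distance to the eye.
    dist = math.sqrt(sum(c * c for c in view_pos))
    return max(0.0, min(1.0, (fog_end - dist) / (fog_end - fog_start)))

# A block far off to the side: small depth along the view axis,
# but a large actual distance from the camera.
p = (120.0, 0.0, 60.0)  # made-up view-space position (x, y, z)
print(planar_fog(p[2], 32.0, 96.0))   # ~0.56 -> still mostly visible
print(spherical_fog(p, 32.0, 96.0))   # 0.0  -> fully fogged out

Turn the camera so that same block sits straight ahead and the planar number drops too; that is exactly the "wall of fog that moves with you" effect.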
Now if you play Minecraft on AMD hardware, it's absolutely great. But if you want to delve into older versions, the fog will still be like that in them. So how important is stuff like that to you? I can't answer that for you.
Older emulation is another area where old OpenGL behavior comes into play. I've had no issues with DuckStation on my 7800 XT, but if you're using older software, results may vary.
If they are too old, there are always compatibility layers like dgVoodoo 1 and 2, 3DAnalyzer, and DXVK.
That said, if I want to play something 16-bit, I prefer using Linux over getting a 32-bit Windows system and then using "Turn Windows features on or off" to enable 16-bit support. Linux still allows 64-bit alongside 16-bit; that's how I can just play Animaniacs out of the box. Old but gold. Well, technically Wine is a compatibility layer too.
Technically you can also use DOSBox, but I have not been able to get every game working through that.
There are also older games that require Nvidia Inspector to fix ambient occlusion not working in the game, while AMD does not have that issue.
Coming off an OC'd and undervolted 1080 Ti, I think an RX 9070 XT is gonna pull about the same amount of power, so in a way, that's more efficient per watt 🤣
Oooohhhh yeah. I'm no expert in electronics, but I know enough to tell that the 16-pin just reeks of bad ideas and bad "nah it'll be fine" engineering across the board. What the hell were they thinking, pushing so much power through basically one wire anyway? No contingencies or anything, it's really bizarre. I'm surprised no one has raised a ruckus with an EU consumer protection agency; this is downright dangerous.
The models I'm interested in (Powercolor Reaper, Sapphire Pulse, XFX Swift, ASUS Prime) all have the 8-pin anyway.
This was a fascinating read, and exactly the kind of stuff I hoped for. I'll look into it.
Same, thanks. Will look into it.
True that, but for my use case, the only vaguely "work-y" thing I do with my PC is audio production and there the CPU does the bulk of the work. I can run Reaper on my R7 7700's iGPU just fine, heh.
Source: my ownership experience with an R7 260X, then an RX 470, then an RX 5700XT and finally an RX 7900XT.
If you only care about gaming and don't care about "future-proofing" for the latest titles, then I suggest the 9070 XT. Got mine last week and I have to say I'm impressed.
1. What do you perceive as old games? Mine go back to the 16-bit days.
2. Which AMD or Nvidia GPU from that time? Because the reality is that any GPU has the potential to break an older title when moving to a new architecture. Take Stubbs the Zombie, and I mean the original copy: no GPU of today works out of the box with that game anymore, yet when it was released both Nvidia and ATI had no issue running it.
There is also the unfortunate truth that while games are backwards compatible, they might no longer have the hardware support they once had and instead use legacy emulated support, so new features can take the place of deprecated features that no new games use.
Sometimes you lose nothing when hardware support changes to emulation; other times some loss occurs, but someone comes up with a workaround.
This issue ain't just with GPUs. When I moved to the 1st generation of Ryzen, Max Payne no longer worked; it would crash because an instruction set was missing. That was later fixed with a workaround.
And lastly, cores: a lot of bugs occur in very old games if they even let you run them with more than 1 or 2 cores. The game doesn't know what to do with the remaining cores and might refuse to even start. Some can work if you change the affinity settings in Task Manager to use fewer cores (there's a rough scripted version of that trick after this post), but for others that still doesn't work and you have to disable the cores in the BIOS for the game to run. Heck, it might even just be tied to the speed of your processor.
The point is that a lot of factors play into why a game runs or does not run. I have yet to come across a game I cannot get working one way or another; this would turn into an essay if I had to write out every solution I ever found.
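Since I brought up the affinity trick, here's a rough sketch of how you could script it in Python instead of clicking through Task Manager on every launch. It's only an illustration: "game.exe" is a placeholder path, it assumes the third-party psutil package is installed, and some old games spawn a launcher first, so the process you pin isn't always the real game.

import subprocess
import psutil  # third-party: pip install psutil

# Launch the old game, then pin it to core 0 so it never sees the extra cores.
# "game.exe" is a placeholder, not any specific title.
proc = subprocess.Popen(["game.exe"])
psutil.Process(proc.pid).cpu_affinity([0])  # restrict the process to a single core

On Windows you can also do a one-off with "start /affinity 1 game.exe" from a command prompt; the script above is just the repeatable version.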
ROCm is released; you just need the Windows Subsystem for Linux enabled to use it on Windows.
Also, ZLUDA has been revived.
Maaaybe they mean hardware support for 32 bit PhysX? But Nvidia's latest cards dropped support for it, anyway. If you have nostalgia for one of the 20 games that used it, just plug in an old Nvidia card. Anyone who's not a blind fanboy probably has something from the past 10 years in their basement or a closet somewhere. Or buy a used 750ti or 1030 on ebay for a couple bucks? Not a big deal.