What is "bare minimum" GPU by modern stndards?
I am about to complete my rig (AMD 5950X/32GiB RAM/2TB SSD) with a "reasonable" GPU.

Currently I have a paleolithic card suitable only for basic window rendering (G98 [Quadro NVS 295]).

OS is Linux (Debian Sid) and I plan to use it ALSO for gaming (mostly through Steam Proton).
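
(Side note, since Proton/DXVK translates everything to Vulkan: whatever card I end up with, I'll first verify that a Vulkan driver actually sees it before blaming any game. Rough sketch of the check I have in mind, assuming vulkan-tools/vulkaninfo is installed; it just greps vulkaninfo's text output rather than touching the Vulkan API, and the script and its function name are my own quick hack:)

#!/usr/bin/env python3
"""Rough check that a Vulkan driver reports at least one GPU (Proton/DXVK needs Vulkan).

Sketch only: assumes vulkan-tools (vulkaninfo) is installed and simply scans its
text output for "deviceName" lines instead of calling the Vulkan API directly.
"""
import subprocess
import sys


def vulkan_device_names() -> list[str]:
    out = subprocess.run(["vulkaninfo"], capture_output=True, text=True).stdout
    # vulkaninfo prints one "deviceName = <name>" line per physical device/driver.
    return sorted({
        line.split("=", 1)[1].strip()
        for line in out.splitlines()
        if "deviceName" in line and "=" in line
    })


if __name__ == "__main__":
    names = vulkan_device_names()
    if not names:
        sys.exit("No Vulkan device reported; Proton titles won't run properly.")
    print("Vulkan sees:", ", ".join(names))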

Question is:

What is considered "bare minimum" GPU to run modern titles? (BTW: I have all AC, FC, GoW, Death Stranding, TR and similar games in my lib)

I don't care much for super-high definition/effects, but I want decent rendering with little/no lag and a bit of "future-proofing" so I can use this thing for a few years.

I've seen a lot of guides comparing price/performance, but it's unclear whether a "great buy" will actually be able to run AAA games.

For comparison: my current rig (i7-8700K/32GiB RAM/1TB SSD/6TB RAID/RX580 GPU) is about to be shipped away, but it's still working OK.

Any advice?
TiA!
Originally posted by Illusion of Progress:
Bad optimization may indeed be at play, but that is a variable that's been around. It's not new. Something else also explains what is going on.

What's happening is the latest generation of games really need this much. They target the latest consoles and may not scale down too far.

Understood, but IMHO this is *not* what's happening in Hogwarts Legacy.

The game crashes (supposedly due to VRAM exhaustion) *only* while completely reloading maps, either when entering/exiting some dungeon or when fast-travelling.

There is NO reason to fail there, especially since relaunching the game gets past the same critical point without a hitch.
What I see is I have from 3 to 5 "reloads" available, then it crashes, rinse and repeat.
From a programmer's point of view this means there's either a memory leak (consuming VRAM over time) or garbage collection is failing (badly) somehow. I never had a crash "in action".
... this is somewhat "nice" because it forces me to take a break from the game from time to time ;)
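
If I want to confirm the leak theory, the simplest thing is probably to log VRAM usage across a few of those reloads and see whether the baseline keeps climbing instead of dropping back. A rough sketch of what I mean (assuming an NVIDIA card with nvidia-smi on the PATH; on amdgpu one could read /sys/class/drm/card0/device/mem_info_vram_used from sysfs instead; the interval and formatting are arbitrary):

#!/usr/bin/env python3
"""Log VRAM usage once per second; a leak shows up as a baseline that only climbs.

Rough sketch, assuming an NVIDIA card with nvidia-smi on the PATH. On an AMD
card (amdgpu) one could read /sys/class/drm/card0/device/mem_info_vram_used
from sysfs instead.
"""
import subprocess
import time


def vram_used_mib() -> int:
    # Ask nvidia-smi for just the used-VRAM figure, in MiB, without any table decoration.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())


if __name__ == "__main__":
    start = time.time()
    while True:
        print(f"{time.time() - start:8.1f}s  {vram_used_mib()} MiB", flush=True)
        time.sleep(1)

If that figure ratchets up after every dungeon entry/fast-travel and never comes back down, it points at the game leaking rather than at the card simply being too small.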


OTOH I fully appreciate Moore's law for RAM (my first Linux install, loooooong ago, was on 4 MB of RAM and a 40 MB HDD, yes MEGA bytes), and thus I think I'll shoot for at least 12 GB of VRAM, possibly 16 GB.

Thanks.
Which is why I also stated, and even started with, "bad optimization may indeed be at play".

The other thing I was mentioning (hardware requirements going up) isn't mutually exclusive with "flaws may exist".

Both can be true simultaneously. I was mostly pointing out that there are very real hardware requirement increases that a lot of people seem to want to just rule out as a possibility entirely, and write off as solely bad optimization.
Originally posted by Illusion of Progress:
Which is why I also stated, and even started with, "bad optimization may indeed be at play".

The other thing I was mentioning (hardware requirements going up) isn't mutually exclusive with "flaws may exist".

Both can be true simultaneously. I was mostly pointing out that there are very real hardware requirement increases that a lot of people seem to want to just rule out as a possibility entirely, and write off as solely bad optimization.

I think what he got from your post is that it seemed to focus a lot on the "optimization" aspect of the game's development, and the second paragraph...

"What's happening is the latest generation of games really need this much. They target the latest consoles and may not scale down too far."

...may have kind of led him to believe you were "for the most part" presenting that as somewhat of an answer to the Hogwarts issues he's facing.

This is speculation, though, but it's kind of how I would have interpreted it if you had responded to me, so I'm just saying.

Granted, for all the junk I've been criticizing Hogwarts over, I've never actually heard of those crashes, so I didn't even factor them into any hardware recommendations. It's actually an ENTIRELY different issue; no amount of RAM will escape that level of incompetence.

It is strange, but it might not even be a permanent issue (although I'm not sure what he's done to try to resolve it). It might be like the BioShock 1/2 Remasters: about half of players (guesstimating) face constant, repetitive random crashes. This is widely known/reported, but I almost never face them. I almost didn't believe it myself until I saw it during a brief period and couldn't fix it. I didn't know what caused it, but eventually it got fixed, I guess through my constant re-installs and all sorts of other configuration changes.

Definitely an issue with the games, but there's still hope of a fix that doesn't involve bumping up hardware.
Don't buy and stay away from "second-hand", "used" and/or "outlet" GPUs, as these were literally worked to death by crypto mining. You can never be sure. These cards will brick in no time. Wasted money. Even the manuals of the latest devices and machines tell you not to "abuse" them for crypto mining.

Only buy factory-new GPUs. These will support you for the next 10 years easily.

And we're talking about "bare minimum" and "future proof" here.

Of course a GTX 1080 can be counted as "bare minimum", but it is for sure NOT "future proof". We've entered the RTX era; GTX is obsolete, just like the GT era was after GTX appeared as its successor.

An RTX 3050? Sure, "bare minimum", but in no way "future proof". A 3050 is the biggest trap for a newbie. An RTX 3060 Ti will fit, though.

Just re-check my list which I had posted on page 2.
Last edited by N3tRunn3r; 27 Apr 2023, 17:46
The bare minimum GPU for 1080p these days, to play new AAA games at 60+ fps on High settings, is the RTX 3060. If you want a little future proofing, then the 6700 XT is the better choice.
Originally posted by 🦜Cloud Boy🦜:
The bare minimum GPU for 1080p these days, to play new AAA games at 60+ fps on High settings, is the RTX 3060. If you want a little future proofing, then the 6700 XT is the better choice.
Yep, they cost about the same, at least on Amazon US. Yes, the RTX 3060 is faster with RT enabled, but with GPUs of this tier it's better NOT to enable RT, as the performance hit is too big. Even with DLSS at 1440p, the RTX 3060 just about equals the RX 6700 XT at 1440p native in Returnal. Without RT, the RX 6700 XT owns the RTX 3060 in many games...
Last edited by UserNotFound; 27 Apr 2023, 20:34
Originally posted by emoticorpse:
I think what he got from your post is that it seemed to focus a lot on the "optimization" aspect of the game's development, and the second paragraph...

"What's happening is the latest generation of games really need this much. They target the latest consoles and may not scale down too far."

...may have kind of led him to believe you were "for the most part" presenting that as somewhat of an answer to the Hogwarts issues he's facing.
I was speaking broadly of the trend of recent games having high hardware demands and issues rather than of any one particular instance of an issue. It was largely your post and its comment about "games are approaching cripplingly bad optimization" immediately above that got me to say what I did.
Originally posted by N3tRunn3r:
Of course a GTX 1080 can be counted as "bare minimum", but it is for sure NOT "future proof". We've entered the RTX era; GTX is obsolete, just like the GT era was after GTX appeared as its successor.
Huh? There was no separate "GT era" and "GTX era", though. They were separate tiers of products that existed simultaneously, and nVidia merely rebranded everything as RTX when it shifted focus to supplementing its GeForce products with ray tracing.
Just as an example however.
:cgrazz:

Date Posted: 26 Apr 2023, 4:51
Posts: 70