You say max settings, yeah, per what that GPU allows for, maybe. And at what screen resolution? 1080p, of course. In many games you're going to get maybe 40-50 fps on a 1080 or 1080 Ti, whereas with a 3080 or better it's 90-120+ with even better graphics quality options, and that's before any DLSS comes into play.
Can you imagine, though: an almost decade-old graphics card playing new games at max settings. That is simply amazing. It's proof that graphics quality hasn't been improving like it did in the past.
It's because we're not seeing huge graphics leaps in games anymore. Compare the graphics of the PS1 vs the PS5: those were the huge increases, where polygon counts went up and you needed better hardware to play. That isn't happening anymore.
The major reason is that when the GTX 1080 and the Ti came out, they were already top of the line for processing large amounts of polygons. Since there have been no real increases in that kind of graphics in the past decade, and likely never will be again, PC gamers don't have a reason to upgrade anymore, unlike console users, who are basically locked into buying the latest console to play new games.
https://www.youtube.com/watch?v=3BbB-dE6oB8
I mean, still, the 1080 Ti has fared very well, a lot better than my GeForce 2 GTS 32MB card did.
Don't believe influencer and corporate bull dung doctrine.
I was going to make a similar analogy. There has never been a time in my life where I've been able to go this long without upgrading my PC. The fact is simply that there are no more games coming out with some major graphics improvement that would require a better graphics card, because we've hit a point where PC graphics are simply not going to improve anymore. There is a point where it's too costly to make a video game that looks hyper-realistic, so such a game has never been made and probably won't ever be made.
This would be a game that is indistinguishable from real life, but I do not think that game will ever be made!
This video explains it very well: basically, PC graphics are actually getting worse, believe it or not, and even the latest games don't have major improvements like the PS1-to-PS2 type of leap I mentioned, the kind that is noticeable and actually needs stronger hardware.
This is because making a game look extremely realistic costs a ton of money, and most studios are not spending billions of dollars to make a game look realistic.
What we are seeing is that even top AAA titles in some cases look worse than they did 5 years ago.
Well, your predictions about graphics are wrong. Easy raw performance increases might slow down, but they'll still exist.
The other limiting factor is consoles. Many games are cross-platform, and consoles will drive the decisions.
At least there have been times where I thought it might be pointless for games to look better, but then I see what better looks like, and it's usually impressive.
Indiana Jones is a great example.
Having gone from SLI 1080 Tis to a single 2080 Ti, then a 3090, 4090 and now a 5090, and wanting a constant 120 fps at 4K with ray tracing, I'd say there is a need to upgrade IF you want to maintain a certain performance and fidelity level. If you don't, then of course you don't need to change, but it is disingenuous at best to suggest that the 1080 Ti is the powerhouse it once was.
Also, ray tracing is incredible lighting, reflection and shadow tech, and the only people I really hear say that it's not are those who cannot use it.
Games have gotten far more complex over the past 10 years with more detail in both characters and environments plus larger scale overall.
The 1080 and 1080 Ti can't handle 4K 60 fps and can't do DLSS upscaling. A few games don't run at all.
If a Steam Deck can run most PC games then so can the 1080 Ti, but if someone wants a better-than-console (PS5 Pro) experience, then the 1080 Ti and regular 1080 are not enough anymore, regardless of how fantastic that card is and how well it has aged.
Even my much more capable RX 6900 XT (in my 2nd rig) can handle many games well enough at 3440x1440, but some AAA titles require visual compromises via in-game graphics settings to net a smooth framerate. As long as the card can handle games at pretty decent in-game graphics settings and still net a smooth enough framerate, I can deal with it.
The GTX 1080 Ti is capable of decent enough graphics settings at 1080p, even at 1440p, except in graphically intense games. I haven't gotten Doom TDA yet, but I believe the minimum requirement is an RT-capable card, the minimum being an RTX 2060 or AMD RX 6600. If more games are made with such requirements, it'll be harder and harder for the GTX 1080 Ti to stay relevant. With the passage of time, the GTX 1080 Ti will fade into obsolescence and irrelevance.
I must state that the GTX 1080 Ti is truly a revolutionary card that set the standard at the time; it is a testament to the card's quality and capability that it is, surprisingly, still relevant to this day.
In some ways, the 1080 Ti remained relevant for less time than previous high-end cards, as it was incapable of the tech introduced on the very next generation.
The 1080 Ti was a good card, but the 2080 Ti has stood the test of time far better and was a much better GPU overall.