Are publishers/developers just not bothering with optimization anymore?
If more games like Indiana Jones and the Great Circle are gonna start demanding ray tracing as a minimum, despite Nvidia cards still being the only ones capable of decent RT performance, then I can't see future games even being playable unless people are rocking the best Nvidia or AMD card (Intel only covers the mid-range, so they're already a no-go). If people start questioning whether PC gaming is even affordable or reasonable anymore, then I wouldn't be surprised if the platform gets ignored even further.

Does the gaming industry just expect PC gamers to have top-of-the-line, high-end cards just to PLAY a game now? At that point, people would be better off getting a console. What happened to "You can still adjust settings to achieve playable frames at better-than-console graphics"? That would still be true if the games could run on pure raster. Early RT games at least had the option to turn RT off, so you could play on a non-RT card or simply disable it for more frames.
Tonepoet Mar 20 @ 5:58pm 
The 5070 almost matches the 4070, and the 9070 XT almost matches the 5070 in terms of ray tracing performance, based on TechSpot's six-game average[www.techspot.com]. Yeah, if that was your main thing you'd rather have the Nvidia card, since it's a bit stronger and $50 cheaper, but the gap isn't anywhere near as large as it was previously, and the ray tracing performance looks like a significant bump up from the 7900 XTX. AMD is catching up in this regard. Also, these GPUs can probably still get better ray tracing performance than a console.
Last edited by Tonepoet; Mar 20 @ 6:01pm
Originally posted by Tonepoet:
The 5070 almost matches the 4070, and the 9070 XT almost matches the 5070 in terms of ray tracing performance, based on TechSpot's six-game average[www.techspot.com]. Yeah, if that was your main thing you'd rather have the Nvidia card, since it's a bit stronger and $50 cheaper, but the gap isn't anywhere near as large as it was previously, and the ray tracing performance looks like a significant bump up from the 7900 XTX. AMD is catching up in this regard.

Yeah, but people would say that only matters if you can actually get your hands on a 5070 [Ti] or 9070 XT, and at MSRP. These days, it doesn't seem like either is possible. The mid-range always was, and still is, where people care about performance and value. And now the 70 Ti class is more like yesterday's 60 class, so people are essentially priced out of the market.

A game you can't play without specific hardware might as well not exist. We've always hated consoles for their exclusivity BS (looking at you, Nintendo), and now it looks like even PC gaming is gonna demand certain "hardware" just to pay to play. But at least you can get your hands on a console for less than $500. The same can't be said for GPUs.
_I_ Mar 20 @ 6:06pm 
why bother optimizing?
pcs are always getting stronger/better, just upgrade to play the game as intended

most games have a lowest setting low enough to play on a low/entry-level gaming pc at 30+ fps
Most people would say to just not bother with AAA games and find better games that focus less on graphics. And you know what? They'd be right. One of my coworkers is a fan of the Yakuza and Final Fantasy games and is obsessed with them.

His only other game is Balatro on his phone. And he plays mostly on a base PS5. Personally? I'm looking forward to the THPS 3+4 remake and Mafia: The Old Country. They're more AA titles than AAA, but they do show there's demand for less eye candy and more actually fun gameplay.
It's not great in some regards, that's for sure.

A lot of this, at least as far as the state of the GPU market goes, seems to circle back to two things.

1. Node shrinks have hit a (for now) slowdown. On top of that, they're getting more expensive at a faster rate than before (add in post-pandemic market conditions). Node shrinks have traditionally been a big factor in driving progress and keeping price-to-performance advancing at a good rate. That has broken down.

2. AI is a bigger focus now, particularly for nVidia, but probably for AMD too. Since production time is finite, the most advanced node in particular will have the majority of its time dedicated to chips for AI (this compounds with number one above), so consumer chips get a sliver of the production pie, and when they do get some, they're kept small in size (look at the difference between the RTX 5080 and RTX 5090, and then remember that most people buy x50s or x60s or mayyybe x70s, certainly not x80s or higher).

Oh, and this also partly explains why VRAM is low. nVidia doesn't want its cheap chips becoming effective AI substitutes. On top of that, like everything else, RAM ICs have stagnated in capacity, which is why they've been stuck at 2 GB for a while. An increase to 3 GB is coming... (probably in time for a mid-generation refresh, or maybe next generation), and just like system DRAM, the increase is only 50% this time (2 GB to 3 GB) instead of the traditional 100% doubling (1 GB to 2 GB). A lot of things are genuinely beginning to hit walls now. If so, expect consumers to sit on existing products and upgrade less often... which means manufacturers miiight just try and encourage planned obsolescence. It would also mean optimization and support will need to improve, adding burden to those areas.

On the other hand, there are some positives here.

While this is bad for some games, a few games aren't the whole market. There are tons of games vying for our attention, and an endless sea of them are pretty good. I still play plenty of older games (and in many cases, that means ones now considered retro).

Also, turning down settings is still an option in games that require ray tracing.

Now, yes, if you want to play those specific games, and to do so near release, and with a certain frame rate target... you might need something reasonably expensive.

Game demands don't keep strict parity with the hardware median. Sometimes there's a jump up in game requirements, and this might be one of those times.

Personally, I only buy a game right on release if I really want it, and most of the time, those aren't truly demanding triple-A games. I think Silent Hill 2 was the most recent example of one like that, and funny enough, despite the common complaints of traversal stutter (which it does have), I have few complaints with the experience. That's the other side of the coin: while optimization is definitely skimped on, much (not saying all) of the noise is exaggerated, created by a vocal minority with high-end hardware who believe that hardware of an arbitrary status entitles them to perfect performance at all times, when that's never how hardware has worked.

I guess this is all a weird way of saying the market will find a way. I'm worried about all the studios/developers losing jobs more than anything; for us gamers, there's a vast sea of games to play. I believe FOMO for the latest hardware and games while they're in the discussion spotlight has simply gripped too many gamers. Slow down, find a random game and enjoy it, and ignore social media and YouTube clickbait outrage. Things can still be fun.
Last edited by Illusion of Progress; Mar 21 @ 1:34am
Lithurge Mar 21 @ 2:01am 
I played all the way through Indiana Jones with a 2070 and an i7 9700K (the former just above and the latter below the game's minimum specs), both of which I got over 5 years ago. I was getting somewhere around 70 fps at all times.

On the other hand, I've found certain UE5 games, not using ray tracing, where I had to use the lowest settings to get 30+ fps. So it's not just a case of ray tracing being the bogeyman.
Last edited by Lithurge; Mar 21 @ 2:01am
AmaiAmai Mar 21 @ 3:58am 
Originally posted by Iggy Wolf:
If more games like Indiana Jones and the Great Circle are gonna start demanding ray tracing as a minimum, despite Nvidia cards still being the only ones capable of decent RT performance, then I can't see future games even being playable unless people are rocking the best Nvidia or AMD card (Intel only covers the mid-range, so they're already a no-go). If people start questioning whether PC gaming is even affordable or reasonable anymore, then I wouldn't be surprised if the platform gets ignored even further.

Does the gaming industry just expect PC gamers to have top-of-the-line, high-end cards just to PLAY a game now? At that point, people would be better off getting a console. What happened to "You can still adjust settings to achieve playable frames at better-than-console graphics"? That would still be true if the games could run on pure raster. Early RT games at least had the option to turn RT off, so you could play on a non-RT card or simply disable it for more frames.

If you don't like what publishers are doing, then don't buy the game.

When unoptimised slop like Monster Hunter: Wilds is successful it sends a message that PC gamers find such performance acceptable.

Also, you should know that behind closed doors, studios don't really care about you or your ability to play games anymore -- they care about whether you will buy the game and how much money they can extract from you post-purchase.

This trend has little to do with hardware/software but everything to do with the fact that most of these companies exist on the stock market and they must generate returns or end up like Ubisoft.

But sales in the games industry overall dropped 9% YoY, meaning there's less money going around in the market -- yet game developers still need to post increasing profits.

Know what that means? It means you will continue to see broken games that run badly, as publishers will likely start pushing out more games or advancing release schedules to get ahead of a potential market downturn in the gaming industry.

The question is whether or not you will continue to purchase them.
matt Mar 21 @ 9:40am 
Consoles use older AMD hardware. They keep up by dynamically lowering visual quality. You can do the same thing with any recent AMD card by enabling HYPR-RX.
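For what it's worth, "dynamically lowering visual quality" mostly means dynamic resolution scaling: the game watches frame times and shrinks the internal render resolution whenever it falls behind, then upscales to your output resolution. Here's a minimal sketch of the idea in C++ (all names here are made up for illustration, not any engine's or HYPR-RX's actual API):

    // Minimal sketch of dynamic resolution scaling: render at a reduced
    // internal resolution whenever frame time exceeds the target, then
    // upscale to the output resolution. Hypothetical names throughout;
    // real engines (and HYPR-RX) do this far more carefully.
    #include <algorithm>

    struct DynResScaler {
        double targetMs = 16.7;   // ~60 fps frame-time target
        double scale    = 1.0;    // fraction of native resolution per axis
        double minScale = 0.5;    // never drop below 50% per axis

        // Call once per frame with the last frame's GPU time.
        void update(double frameMs) {
            if (frameMs > targetMs * 1.05)       // too slow: render smaller
                scale -= 0.02;
            else if (frameMs < targetMs * 0.90)  // headroom: render larger
                scale += 0.01;
            scale = std::clamp(scale, minScale, 1.0);
        }

        int renderWidth(int nativeW) const  { return int(nativeW * scale); }
        int renderHeight(int nativeH) const { return int(nativeH * scale); }
    };

The console version of this is the same feedback loop, just tuned per title, which is how old AMD hardware keeps hitting its frame-rate targets.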
C1REX Mar 21 @ 9:53am 
Originally posted by Iggy Wolf:
The mid-range always was, and still is, where people care about performance and value. And now the 70 Ti class is more like yesterday's 60 class, so people are essentially priced out of the market.
It’s not that bad.
Two-generations-old, mid-range AMD cards can handle the basic RT in Indiana Jones.
Those RT requirements are very minimal.

https://youtu.be/5MocqVS3ZG4?si=snK6qWbTAA2YQcFh

https://youtu.be/0djjL9xxBP0?si=hkSj3aAg3ChEZOUE
Last edited by C1REX; Mar 21 @ 9:54am
A&A Mar 21 @ 10:15am 
Indiana Jones with ray tracing emulation on Vega64

https://youtu.be/cT6qbcKT7YY?si=ckRevghiW0Bjj3rx


I don't know -- with these open source drivers and translation layers, isn't it possible to bring back multi-GPU support?
Last edited by A&A; Mar 21 @ 10:18am
Indy is a really bad example to bring up here. That's likely because enthusiast sites always bench with all settings cranked up to the max -- which means full path tracing instead of just the mandatory ray tracing. Even the suggested RTX 2060 Super (launched in 2019) can handle more than just the lower settings.

https://www.youtube.com/watch?v=Ul_SYM67VGs

Originally posted by matt:
Consoles use older AMD hardware. They keep up by dynamically lowering visual quality. You can do the same thing with any recent AMD card by enabling HYPR-RX.


Given the current chip market, it will be VERY interesting to see what the PS6 is gonna look like. The PS5 Pro already upped the price, despite still being based on the 2019 Zen 2 chip design.

But yeah, there are people still running their 2016 Pascal cards in 2025 -- GTX 1060 / 1070 / 1080 (Ti). You can actually still get money for those on the used market; I've just sold a 1050 Ti. The main reason for that? Consoles. And consoles dictating what gets made.

GPU prices have become ridiculous, no doubt about it. But unless you always aim for the max, you can at least keep cards longer than ever before -- and their value also isn't cut in half a year or two after you buy them. Nobody actually NEEDS a higher-end PC, unless they suffer from a serious case of FOMO.
Last edited by fourfourtwo79; Mar 22 @ 12:24am
This would perhaps bother me if these new games were interesting to begin with, but sadly it's all looks and no substance. I'm not interested in fisticuffs with FPS mechanics from two decades ago. And people who come up with the excuse of the plot being good should probably read books or watch cinema.
D. Flame Mar 21 @ 11:45pm 
They just expect you to brute force it with OP hardware, and clown-fluencers on youtube and the like use $2000 GPUs and $800 CPUs, so they never even notice the problem. Then they get booty bothered when people point out that the games run like crap on hardware real people actually run.
OP hardware isn't absolutely needed 9 times out of 10.

https://www.youtube.com/watch?v=0tnxybKwTqA


And you can thank consoles for that; they dictate what's being made. This isn't the 1990s anymore. Digital Foundry finds it pretty unlikely that even the PS6 will go head to head with current-gen high-end PC hardware that's already available. Expect people to write about the "legendary RTX 4090" in 10-12 years' time, just as they praise the 1080 Ti as "fine wine" now. In reality, this is mainly because of consoles.
Last edited by fourfourtwo79; Apr 2 @ 3:35am
If you are not rich then don't buy these AAA games and don't buy expensive hardware. There are probably thousands of good cheaper non-demanding games on Steam alone.

I hope nobody forces you to buy AAA.
Last edited by Ocelote.12; Apr 2 @ 4:06am