So no, they aren't relying on new features. They're relying on increasing hardware power, just as they always have, albeit more carelessly these days.
1. A lot of people don't seem to want to consider this, but it's not always a lack of optimization (at least not solely); sometimes returns are simply diminishing. Sometimes it's just that demanding to do what you're asking the hardware to do. Lower settings are usually more of a "sweet spot" in terms of performance, while turning the settings up usually results in vastly increased demands relative to the added visual benefit, which can read as poor optimization even when it isn't. In other words, tune the settings in accordance with the hardware.
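To put rough numbers on that curve (these figures are invented purely for illustration, not measurements from any real game), here's a quick Python sketch:

```python
# Invented preset numbers, purely to illustrate diminishing returns:
# each tier costs more frame time while adding less visual quality.
presets = {
    # preset: (frame time in ms, relative visual quality 0..1)
    "Low":    (8.0,  0.70),
    "Medium": (10.0, 0.85),
    "High":   (14.0, 0.95),
    "Ultra":  (22.0, 1.00),
}

prev = None
for name, (ms, quality) in presets.items():
    fps = 1000.0 / ms
    if prev is None:
        print(f"{name:6s}: {fps:3.0f} FPS (baseline)")
    else:
        d_ms, d_q = ms - prev[0], quality - prev[1]
        print(f"{name:6s}: {fps:3.0f} FPS  (+{d_ms:.0f} ms frame time "
              f"for +{d_q:.0%} visual quality)")
    prev = (ms, quality)
```

With these made-up numbers, the High-to-Ultra step costs four times the frame time of the Low-to-Medium step while adding a third of the visual gain, which is exactly the shape that makes top presets look "unoptimized" even when nothing is wrong.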
2. The average hardware level goes up over time, so it makes sense for developers to target that. Of course, developers won't intentionally make a game require what's common today if it could run on a lot less, since opening yourself up to more (and weaker) hardware increases your buying audience.
3. Time needs to be measured in this "optimization" equation. Time is finite, and time is money.
Some people only think of optimization as how well the game itself is optimized, but the game needs to be created, right? Looking at it more broadly, sometimes the development process is what gets optimized when you factor time in, and the game is less optimized as a result. For example, if a developer can spend half the time on development and the game is 25% worse off for it, there's an argument that the process is more optimized that way, even if the game itself isn't (see the rough arithmetic below). Past a certain point, "higher hardware requirements are cheaper than development time" becomes true, especially if the market can bear whatever that hardware floor is. You spend half the time, meaning less development cost in money and more time the studio can spend on other products to earn even more. If everything were "perfectly optimized" (which isn't possible anyway), we'd have fewer games, because each would need more development time. Just like the earlier example of higher settings giving diminishing returns, you eventually spend an increasing amount of development time for smaller performance gains. We live in a world where the most precious resource of all is time.
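Taking the "half the time, 25% worse" example above literally (the numbers are entirely hypothetical), the studio's side of the trade looks like this:

```python
# Hypothetical figures for the "half the time, 25% worse" trade-off.
full_months, full_fps = 24, 60        # the "fully optimized" route
fast_months = full_months / 2         # ship in half the time...
fast_fps = full_fps * 0.75            # ...at 75% of the performance

# Crude "performance shipped per month of development" metric:
print(full_fps / full_months)   # 2.5  FPS per dev-month
print(fast_fps / fast_months)   # 3.75 FPS per dev-month
```

By that (admittedly crude) metric, the rushed process is 50% more "efficient", which is the sense in which the process is optimized even though the game isn't.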
4. Many triple-A games in particular, which are made for consoles first, don't always translate to PC as well. Despite current consoles being closer to PCs than ever, there are some differences: the consoles have a shared (unified) RAM/VRAM pool, and they can bring assets directly from storage into VRAM. Neither is true on PC, and DirectStorage hasn't turned out to be the magical answer yet. Some people don't like to hear it, but modern mainstream graphics cards are short on VRAM, and that's sometimes part of this too. Turning textures down usually helps when it is, but most people try medium or even high, since texture quality is often "free" visual quality as long as you have the VRAM, and not everyone realizes they might not, because relatively recent mainstream GPUs are just a bit short here. The consoles have a shared pool of 16 GB, for reference, although not all of that can be used for graphics (it's probably often somewhere between 8 GB and 12 GB for the heavier games, I think?).
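For a sense of scale on the VRAM point, here's some back-of-the-envelope texture math (uncompressed RGBA; real games use block compression, so treat these as loose upper bounds):

```python
# Rough VRAM cost of a single texture, including its mipmap chain
# (a full mip chain adds about one third on top of the base level).
def texture_mb(width, height, bytes_per_pixel=4, mip_overhead=4 / 3):
    return width * height * bytes_per_pixel * mip_overhead / (1024 ** 2)

print(f"2048x2048: {texture_mb(2048, 2048):5.0f} MB")  # ~21 MB
print(f"4096x4096: {texture_mb(4096, 4096):5.0f} MB")  # ~85 MB

# Even with block compression cutting these by 4x or more, a scene
# streaming hundreds of 4K-class textures eats into an 8 GB card fast.
```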
5. The job market for developers has been in a pretty bad state the last year or two, in case you haven't noticed: constant news of layoffs, studio closures, and so on, sometimes right after a game releases. Turnover may be high too, meaning many developers are newer and inexperienced (and many of them get quickly ground down by the harsh market). This leads right into the next one...
6. Many current game engines (you've probably heard Unreal Engine 5 in particular take some blame) make things easier for the developer, but they still need care and knowledge to get the best results, which fresh developers may not have. The engine itself may also have issues. And again, these engines often target the current consoles, so 5+/7+ year old PCs aren't always going to fare as well. Sometimes people are simply expecting more than their hardware can do.
"But this older game looked better while needing less hardware"? That's usually the exception, not the norm. Other times, many of those same older games were considered brutal/unoptimized on what was aged hardware in their time, but a later retrospective with faster hardware masks that a bit. Other times, it's simply apples to oranges. A lot of games can certainly do with better development, but one game doing something doesn't mean any other can automatically get better results, minimum.
7. Most important of all, customers buy it. That's what makes it possible in the first place. The bottom line determines whether developers/publishers do this. As I said, if trading off xx% performance saves them xx% time, and customers accept it, of course they'll do it. It's not done with malicious intent; it's done because, to them, that's how they best optimize their time and costs. They aren't just optimizing the game itself, but the development process behind it.
Unfortunately, games (and even software as a whole) have shifted toward a "lay the foundation, get it out the door, and add the rest/fix it later" practice because... it's apparently still working for them. Consumers are accepting it. But there has been some pushback recently: you're hearing that despite how successful some releases are, they're not always meeting expectations (Final Fantasy XVI, and even VII Rebirth to an extent, come to mind). Sometimes high sales aren't enough if development/marketing costs are also climbing. Usually, these things only "fix themselves" with a soft crash where the entire market suffers, like it is now, because in normal times a single failure just gets written off as the game itself being poor, not as the process failing to give it enough time and care to have a fair chance.
While I think most of these reasons (and perhaps others I'm not even aware of or mentioning) might explain it, this is all purely speculation.
They do. The problem is that "optimization" is just a magic buzzword for you to trot out any time you think a game should be less demanding than it is.
And people have these fantasies about specific games they remember as finely tuned masterpieces, forgetting the other 10,000 normal games and the incessant complaining about optimization that even older games drew.
The reality has always been that developers have deadlines, and not many games can be delayed perpetually until they're "optimized" to some armchair developer's imagination.
The other problem is: how much is optimizing worth beyond a certain point? Does a game sell better if it gets 10 more FPS on some systems? How much better? Cite your sources.
Game engines already come with pre-programmed techniques, which makes production a lot faster.
Now the new Indiana Jones requiring RT tells me they took the old GI research and threw it in the trash bin.
And speaking of anti-aliasing, everybody is obsessed with TAA, which is cheap but makes games blurry at the same time.
2.) A lot of modern devs only know how to use RPG Maker-like tools, so they have no idea how to optimize code. They just slap things together in their tool (like UE5) and toss it out the door, hoping for the best, or hoping that GPU tech can brute-force past the lack of optimization.
And honestly, companies shouldn't be trying to force garbage like ray tracing when they can't even make games hold a stable 60 FPS with RT turned off.