I'd rather be able to play the game on Medium for the entirety of the game's lifespan than max it out for the first two years and dial everything to Low from the third year onwards.
A theoretical UE4 version would likely need a cel-shaded "comic book" filter, which would let Tripwire get away with lower-resolution textures and low polygon counts. Even then, that'd only delay the collapse from content bloat by a year or two.
It is solely up to whoever creates the project to set the baseline hardware requirements for running the game/software.
One major wrinkle here is that you can have cases (especially in games) where the end result doesn't justify the hardware resources needed to run the application at an acceptable speed. But again, that relies solely on the ability and willingness of the developer to optimize the application and ensure it isn't wasting resources on features that don't contribute to the product experience.
Side note: to address the OP's mention of Immortals of Aveum's horrible performance, it's worth keeping in mind that, as I mentioned at the top of this post, Unreal 4 and 5 can scale across a wide range of devices with vastly different hardware capabilities, and it's up to the dev how demanding they make their game relative to currently available hardware.
The engine simply provides the framework in which to build your project.
If the game doesn't use Lumen and Nanite, it won't be as demanding as something like Immortals of Aveum.
Pretty sure a UE5 game could be just as demanding as a UE4 game if it doesn't take advantage of the new features.
That is correct. The developer has to look closely at which features they want to implement in their game and whether the gameplay/visual gains are worth the increased resource usage (GPU, CPU, disk I/O, etc.).
This isn't restricted to features that ship with the engine, either.
There are many variables that can contribute to “how hard a game is to run” outside of the Nanite virtual geometry system or the Lumen lighting engine.
A few basic examples would be:
- Shader complexity
- Sheer triangle density
- Lighting complexity
- Number of different shading models present in the rendered frame at a given time
- Literal world scale (does it require full-precision floats for massive levels or to-scale planets?)
- AI complexity/count
Etc., etc., etc.
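For anyone who wants to see which of those variables is actually eating their frame time, both engine versions ship with built-in profiling commands you can type into the in-game console of a development build (the exact stat groups available vary by engine version):

```text
stat unit           shows frame, game-thread, render-thread and GPU times
stat gpu            per-pass GPU timings for the current frame
stat scenerendering draw-call and primitive counts
ProfileGPU          one-shot detailed GPU capture, dumped to the log
```

If "GPU" dominates in `stat unit`, the bottleneck is rendering work (shaders, triangles, lighting); if the game thread dominates, it's more likely AI/gameplay complexity.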
It can still go both ways: both UE4 and UE5 have the capacity to saturate any current-gen system you could possibly throw at them.
UE5 (being newer) does have baked-in features that allow for applications with a much higher ceiling in terms of hardware resources. However, this is really just an on-paper difference right now, as both engine versions are fully capable of going far beyond any hardware configuration that exists today.
Obviously, this will change in the future, but for now this is where they stand.
Nanite generally improves performance - it's by far the most efficient geometry rendering system in games today, allowing for polygon counts you could otherwise only achieve with mesh shaders (which it sadly doesn't use). While it lets you use really high-polygon models that would otherwise be off the table due to performance constraints, insane polygon counts still incur some cost. You can't ignore the polygon budget completely.
Use Nanite, and use Virtual Shadow Maps, which also help. Lumen should be supported too; on its most performant setting in UE5.2 it's not too demanding for console hardware or the very common RTX 3060, though RTXDI would ideally be implemented to significantly improve performance in scenes with lots of dynamic lights.
Tripwire also has to do PSO caching and async shader compilation.
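For reference, PSO caching is largely a config-level feature. A minimal sketch of the runtime side in a project's DefaultEngine.ini (the cvar is real; whether `[SystemSettings]` is the right place for it in a given engine version, and the packaging-side setup for bundling the recorded cache, I'm leaving out):

```ini
[SystemSettings]
; Enable the shader pipeline cache so PSOs are precompiled from a
; bundled cache file instead of hitching the first time they're used.
r.ShaderPipelineCache.Enabled=1
```

The other half of the work is recording PSOs during playtests and shipping that cache with the game, which is a build/packaging pipeline task rather than a single switch.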
The more items? Unless you mean adding more and more objects to levels (which may not increase demand noticeably when using Nanite, unless they increase scene complexity and create more RT light interactions and such), it won't.
Does not sound like it scales very well to me, at least from all the examples I've seen. Unless you like low frame rates.
Yes, Immortals of Aveum is full of technical issues and was seemingly released prematurely (the devs did open up about their experience). Plenty of games seem to be designed with upscaling in mind purely out of laziness, though laziness isn't the whole story for this game. It's a much worse showing than Fortnite, as well as previous showcases like the Matrix demo. Its performance is surprising to me compared to what I'm used to in UE5.
There will be an adjustment period as games made on UE4 transition to UE5. These may not use Nanite fully, because some meshes can't be effectively converted to Nanite, so it'd be disabled for those. Lumen is quite scalable for an RTGI solution (though it has plenty of shortcomings), so devs need to implement it correctly for every graphics settings preset, expose the important Lumen options in the graphics settings (not just on/off), and test their games better - a recurring theme for 2023 games on any engine.
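On "implementing it correctly for every preset": the engine already drives this through scalability groups, so per-preset Lumen tuning mostly means overriding group sections in the project's scalability config. A rough sketch (the section naming and the two cvars shown are real engine conventions; the specific values per preset are my illustrative picks, not recommendations):

```ini
[GlobalIlluminationQuality@1]
; "Medium" preset: Lumen on, software tracing only
r.DynamicGlobalIlluminationMethod=1
r.Lumen.HardwareRayTracing=0

[GlobalIlluminationQuality@3]
; "Epic" preset: Lumen with hardware ray tracing where supported
r.DynamicGlobalIlluminationMethod=1
r.Lumen.HardwareRayTracing=1
```

Exposing the group index (`sg.GlobalIlluminationQuality`) in the in-game settings menu is what gives players more than a bare on/off toggle.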
Not that UE5 is perfect or anything (there's a lot to be desired especially in CPU thread scaling), but many of these big problems are clearly problems with the game more than the engine.
UE5 has been out for years, its limitations are well understood. I'm still not sure what was meant by "items" though.