*quick Google search* Yep, it's using the RE Engine.
Take the Frostbite Engine: Great for shooters and such. But when EA started trying to shoehorn it into all their games, problems came up. BioWare allegedly had a ton of trouble making it work for a semi-open world RPG like Dragon Age: Inquisition because the engine wasn't designed for it. Same with Mass Effect: Andromeda.
the issue isn't the open world either, like a lot of you seem to be parroting without critically thinking about it. Yes, it's not made for open worlds, but the open world itself is acceptable.
DD2 runs fine when outside of a town. My i7-12700K and 3080 Ti were able to hit 70+ fps.
Walking into town would drop it to 40 fps.
In the Wilds beta I was getting 55 fps on the 3080 Ti (native 1440p ultra) / 65 fps on a 4080S (native 1440p ultra), until I got the same drops to around 40 fps in the towns.
What could be the linking factor? RE Engine NPCs. The NPCs in RE Engine are very CPU intensive, that's the major issue.
Monster Hunter has more pull than Dragon's Dogma, so we might see engine updates to help with it this time. (Which I hope would trickle down to DD2.)
UE5 is never to blame for dev incompetence, like inserting trees and foliage with absurdly high polygon counts, for example.
look at the gameplay segments of the benchmark to see this
the gameplay segments in open areas tank performance
like going from 80 fps down to 40
compare Cyberpunk at max settings to any Unreal 5 open world game to see the difference
Unreal has massive issues with stuttering
open world games show this off a ton
Even though this has been discussed, one big reason studios will choose UE5 over an older internal engine is that far more people know UE5, which makes scaling the team and hiring new people an easier process.
It can often take a long time for new hires to learn a complex internal engine, and there's also the cost of maintaining and updating your own engine pipeline. On paper there are real cost benefits to using a well-known, well-supported off-the-shelf engine.
Also. Just to be 'that guy', but I haven't had an issue with any UE5 games for as long as they have been launching. Even Callisto Protocol at launch I was able to get decent performance out of, and it's only improved since launch.
A huge part of this 'UE5 is a bad engine' dialogue is coming from people with weaker, outdated hardware. So while yeah, UE5 is very demanding, and if you don't have a well-put-together, modern system you are going to struggle with asset streaming and UE5's global illumination, it's still an incredible engine, and once hardware catches up people will realise that.
I'd rather game engines continued to move the industry forward in terms of visuals instead of everything being held back by PC players with budget systems.
Back in the day the PC master race used to criticise consoles for holding things back, but now we have this entire generation of budget PC gamers who seem to think the entire industry should cater to their systems while they also fail to understand how to do the most basic configuration of any game's settings.
Buuut...
*Throws PC out the window*
No, dearest poster. Engines are not forced onto developers. They're chosen or made based on whatever the game needs function-wise, or on the skill of the coders, or on what they want to use.
For instance, a lot of indies use Godot or Unity, since they're free or cheaper than a licence for a bigger engine, and may take less of a cut of sales.
Everyone uses Havok physics because it does the job, and it's easy to include with whatever engine they're running on, for instance. Or Noita, which needed its own engine because the game depended on every pixel having its own properties.
It's cheaper and a lot easier to use an off-the-shelf engine. Typically, unless you need features the others can't provide, the only other reason to create your own engine is to show off that you can. And since not every developer is a multi-building, multinational business, sometimes showing off is too expensive.
Not the... dumbest take on this I've heard, but it's up there. Still, hopefully I've cleared the waters for you.
Now...I just need to get my box out of the pond....
I'm running a 7900 XTX and a 7800X3D on one of the more expensive PCIe 4.0 SSDs, and I still get stutters constantly in Unreal 5 games.
You are comparing cutscenes to gameplay, which is moronic. You NEVER compare those when looking at performance, it's misleading.
I literally said the gameplay segments
they're only like a third of the benchmark or something