Immortals of Aveum™
KwizatZ Aug 26, 2023 @ 4:01pm
Nice game, but UE5 Lumen is pretty bad
Honestly, when Lumen isn't messing up half the shadows and making them flicker like crazy, the sun plus Lumen GI just burns out the scene and washes out all the colors:
https://www.youtube.com/watch?v=kiA65z7qLmY

This is at 4K, max settings, with DLSS 2 on Quality.

That's not even mentioning that Lumen in this game is 100% software, meaning a 4090's RT hardware goes unused.
Man, some hardware RT instead of Lumen would have made this game look so good, to the point where they could have enabled HDR 🙄. But I understand that's not possible with this Lumen garbage...
JinxTheWorld Aug 26, 2023 @ 4:28pm 
Yeah, Lumen is still a bit early and needs more time in the oven. I expected it to be worse than it is in this game, so bonus points, I guess. In terms of accuracy and speed it can't match hardware ray tracing, but after some fixes it should come pretty close. It will never match path tracing, though.
Squiggly1 Aug 27, 2023 @ 8:38am 
Adjust the gamma. The game's default is too high, leaving everything washed out with an extremely limited dynamic range. I dropped the gamma from its default of 2.2 to 1.6, and now the game looks like it should: darks are inky black while still retaining detail, colors pop, and skin tones no longer look washed out. It's not a replacement for HDR, but it at least corrects how awful the game looks at default settings in SDR.
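
If you want to see why dropping 2.2 to 1.6 darkens things, here's a rough sketch, assuming the slider applies a simple power curve relative to the 2.2 default (the game's actual response curve isn't documented, so treat the formula as an assumption):

    # Rough sketch of a gamma slider as a power curve. Assumption: the game
    # remaps each channel as value ** (default / chosen), so the 2.2 default
    # is an identity mapping.
    def apply_gamma(value: float, chosen: float, default: float = 2.2) -> float:
        return value ** (default / chosen)

    for v in (0.25, 0.50, 0.75):
        print(f"in {v:.2f} -> gamma 2.2: {apply_gamma(v, 2.2):.3f}, "
              f"gamma 1.6: {apply_gamma(v, 1.6):.3f}")

Under that assumption, mid-grey drops from 0.50 to about 0.39 while values near 1.0 barely move, which is why shadows deepen without the highlights getting crushed.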
Medusa Aug 27, 2023 @ 8:48am 
Lumen is there to provide something along the lines of ray tracing on consoles.
Consoles can't do ray-traced GI at a half-decent framerate, which is why Lumen came into play; but Lumen is actually much slower than ray-traced GI on high-end NVIDIA hardware, while also looking worse.
That's why I've said before that Unreal Engine is not really a PC-focused engine, and it also just doesn't make sense for a future in which all hardware does path tracing rather than software approximations like Lumen.
Squiggly1 Aug 27, 2023 @ 8:57am 
Originally posted by Medusa:
Lumen is there to provide something along the lines of ray tracing on consoles.
Consoles can't do ray-traced GI at a half-decent framerate, which is why Lumen came into play; but Lumen is actually much slower than ray-traced GI on high-end NVIDIA hardware, while also looking worse.
That's why I've said before that Unreal Engine is not really a PC-focused engine, and it also just doesn't make sense for a future in which all hardware does path tracing rather than software approximations like Lumen.


Lumen has a hardware-supported mode, and there's nothing stopping devs from using generic DX-based hardware ray tracing or NVIDIA/AMD-specific solutions. It's up to developers which implementation they want to support. Unreal Engine 5 also supports DLSS up to version 3.5. So I don't know what you're on about with it being console-first. Neither the PS5 nor the Xbox Series S/X has NVIDIA hardware, so why would the engine support a PC-only feature like DLSS 3.5 if it were console-only?
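
For reference, here's a minimal sketch of what switching Lumen to its hardware path looks like in a stock UE5 project, using the standard engine console variables in Engine.ini. Whether this shipping build exposes these cvars, or was packaged with hardware ray tracing support at all, is an assumption on my part:

    ; Sketch: prefer Lumen's hardware ray tracing path (stock UE5 console
    ; variables; only effective if the build shipped with hardware RT support).
    [SystemSettings]
    ; 1 = use Lumen for dynamic global illumination
    r.DynamicGlobalIlluminationMethod=1
    ; trace against the RT hardware instead of software distance fields
    r.Lumen.HardwareRayTracing=1

If the game was packaged without hardware RT support, these do nothing, which would explain the OP's "100% software" observation.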
Medusa Aug 27, 2023 @ 9:33am 
Originally posted by Squiggly1:
Originally posted by Medusa:
Lumen is there to provide something along the lines of ray tracing on consoles.
Consoles can't do ray-traced GI at a half-decent framerate, which is why Lumen came into play; but Lumen is actually much slower than ray-traced GI on high-end NVIDIA hardware, while also looking worse.
That's why I've said before that Unreal Engine is not really a PC-focused engine, and it also just doesn't make sense for a future in which all hardware does path tracing rather than software approximations like Lumen.


Lumen has a hardware-supported mode, and there's nothing stopping devs from using generic DX-based hardware ray tracing or NVIDIA/AMD-specific solutions. It's up to developers which implementation they want to support. Unreal Engine 5 also supports DLSS up to version 3.5. So I don't know what you're on about with it being console-first. Neither the PS5 nor the Xbox Series S/X has NVIDIA hardware, so why would the engine support a PC-only feature like DLSS 3.5 if it were console-only?
I didn't say the engine is console-only; I said its main focus is console hardware, to enable something similar to ray-traced GI on consoles.
But as far as I know, the hardware Lumen mode just builds on top of the software mode, and a fully hardware-based RT renderer is still much faster on capable PCs.
Of course you can do anything you want with the engine as a dev... but its main focus when it was built was delivering top-tier graphics on current-gen consoles, and they've failed at that.
Squiggly1 Aug 27, 2023 @ 9:59am 
Originally posted by Medusa:
Originally posted by Squiggly1:


Lumen has a hardware-supported mode, and there's nothing stopping devs from using generic DX-based hardware ray tracing or NVIDIA/AMD-specific solutions. It's up to developers which implementation they want to support. Unreal Engine 5 also supports DLSS up to version 3.5. So I don't know what you're on about with it being console-first. Neither the PS5 nor the Xbox Series S/X has NVIDIA hardware, so why would the engine support a PC-only feature like DLSS 3.5 if it were console-only?
I didn't say the engine is console-only; I said its main focus is console hardware, to enable something similar to ray-traced GI on consoles.
But as far as I know, the hardware Lumen mode just builds on top of the software mode, and a fully hardware-based RT renderer is still much faster on capable PCs.
Of course you can do anything you want with the engine as a dev... but its main focus when it was built was delivering top-tier graphics on current-gen consoles, and they've failed at that.


Try running the same titles on a PC equipped with hardware equivalent to the PS5 or Series X. The efficiency of shared RAM, highly efficient I/O with custom APIs, and a single SoC will eclipse the similarly specced PC's performance.

It's completely disingenuous to compare PCs running 4090s, 64 GB of DDR5, the latest 13th-gen Intel or Ryzen 7800X3D CPUs, and new PCIe Gen 5 NVMe drives to consoles that cost less than half of what the 4090 alone costs. And that's ignoring that AMD is a generation behind NVIDIA in ray-tracing efficiency, while on par with or better than NVIDIA in rasterization.

Of course Epic is looking at optimization on console and on PC. Full path tracing is still at least three to five GPU generations away for the masses. It's pretty awful even on the highest-end PCs with 4090s when you consider that you don't get anything resembling a decent framerate without copious amounts of DLSS and frame generation (which doesn't help latency at all) in anything modern that employs the tech, like Cyberpunk 2077's ray-tracing Overdrive mode.

I wouldn't count on NVIDIA for much longer, with their dominance in ML/AI being their mainstay for income. Gaming is just a blip on the radar for them now, meaning they can charge as much as they want and do as little as they want with no real impact on their business. Intel hasn't been in the game long enough to say whether they'll stick around. AMD has the console market to rely on, and it looks like they may be eschewing enthusiast-level hardware in favor of more mass-market products.
DeadlyKitten Aug 27, 2023 @ 10:14am 
They should stick with Unreal Engine 4 for now, until Epic fixes and optimizes their crappy new version.
Hobgoblin Aug 27, 2023 @ 2:35pm 
I beat the game and did not notice a single flicker. The only thing I noticed was shimmering from FSR, so I turned it off. I did experience shadow flickering in Redfall, but that was due to FSR too. Try turning off DLSS and see if the shadows still flicker; it could be that Nvidia just needs to release a driver update. So far this sounds like an AMD-optimized game: the only people reporting stutters and bugs are on Nvidia cards, so it's probably just a matter of an update or two. Rock-solid 60 FPS at 1800p, ultra, on a 6950 XT with FSR, or at 1440p, ultra, without.
Last edited by Hobgoblin; Aug 27, 2023 @ 2:37pm
Jrm Aug 27, 2023 @ 9:03pm 
Originally posted by CRÆVΞN:
I beat the game and did not notice a single flicker. The only thing I noticed was shimmering from FSR, so I turned it off. I did experience shadow flickering in Redfall, but that was due to FSR too. Try turning off DLSS and see if the shadows still flicker; it could be that Nvidia just needs to release a driver update. So far this sounds like an AMD-optimized game: the only people reporting stutters and bugs are on Nvidia cards, so it's probably just a matter of an update or two. Rock-solid 60 FPS at 1800p, ultra, on a 6950 XT with FSR, or at 1440p, ultra, without.

I have a 6900 XT overclocked to 6950 XT performance and didn't have any issues either.

I get pretty good performance in this game at 1440p on High or Ultra settings with just FSR 2 set to Quality. Not complaining.

I can't wait for FSR 3 to release so everyone can get a bump in performance (if you meet the requirements, that is).

I hate upscalers and frame generation, but it looks like that's where we are until the next one or two GPU generations catch up a bit.

That being said, in UE5 games I'm already getting on-par or slightly better FPS than an RTX 4070 Ti at any resolution, whether native or with FSR on Quality versus the 4070 Ti on DLSS Quality. Obviously the 4070 Ti is a sure win once frame generation is turned on, since FSR 3 isn't out yet.

With FSR 3 frame gen, if I gain even 20-30 frames I'll be beating a 4070 Ti with frame gen at the same settings.

With you having a 6950 XT, you'll be in the same boat lol.

Hopefully FSR 3 frame gen gives us a good boost, but even in the worst case where we only get a small one, it looks like we should be right on par with, or exceed, a 4070 Ti with frame gen at the same settings.

It will be fun to test all this when it's available.
Medusa Aug 28, 2023 @ 8:08am 
Originally posted by Squiggly1:
Originally posted by Medusa:
I didn't say the engine is console-only; I said its main focus is console hardware, to enable something similar to ray-traced GI on consoles.
But as far as I know, the hardware Lumen mode just builds on top of the software mode, and a fully hardware-based RT renderer is still much faster on capable PCs.
Of course you can do anything you want with the engine as a dev... but its main focus when it was built was delivering top-tier graphics on current-gen consoles, and they've failed at that.


Try running the same titles on a PC equipped with hardware equivalent to the PS5 or Series X. The efficiency of shared RAM, highly efficient I/O with custom APIs, and a single SoC will eclipse the similarly specced PC's performance.

It's completely disingenuous to compare PCs running 4090s, 64 GB of DDR5, the latest 13th-gen Intel or Ryzen 7800X3D CPUs, and new PCIe Gen 5 NVMe drives to consoles that cost less than half of what the 4090 alone costs. And that's ignoring that AMD is a generation behind NVIDIA in ray-tracing efficiency, while on par with or better than NVIDIA in rasterization.

Of course Epic is looking at optimization on console and on PC. Full path tracing is still at least three to five GPU generations away for the masses. It's pretty awful even on the highest-end PCs with 4090s when you consider that you don't get anything resembling a decent framerate without copious amounts of DLSS and frame generation (which doesn't help latency at all) in anything modern that employs the tech, like Cyberpunk 2077's ray-tracing Overdrive mode.

I wouldn't count on NVIDIA for much longer, with their dominance in ML/AI being their mainstay for income. Gaming is just a blip on the radar for them now, meaning they can charge as much as they want and do as little as they want with no real impact on their business. Intel hasn't been in the game long enough to say whether they'll stick around. AMD has the console market to rely on, and it looks like they may be eschewing enthusiast-level hardware in favor of more mass-market products.
The difference in I/O, custom APIs, and a single SoC is only relevant when raw GPU or CPU power is not the bottleneck...
So for specific workloads, like streaming textures and loading levels, a game can run faster on a console than on a much faster PC...
However, things like Lumen and virtual shadow maps just need raw GPU power, and all of the consoles' efficiency barely matters for that kind of workload.
And you can see this is correct in how the PS5 and XSX run this game at 720p internally to hold 60 fps most of the time, and they probably don't run at the PC's Ultra settings either.
A 2080 Super, which is already a bit faster than those consoles, will surely run it at a mostly stable 60 fps at 720p as well.
So no, in the end the efficiency doesn't matter for raw performance.
It can, however, matter for stutter, like loading hitches when streaming assets, which happen a lot less on the current consoles.
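
To put rough numbers on that (simple pixel-count arithmetic; treating cost as purely pixel-dominated is an assumption, since plenty of work is per-frame rather than per-pixel):

    # Pixel-count arithmetic behind the 720p-vs-4K point: per-pixel work
    # like Lumen's screen traces scales roughly with resolution.
    resolutions = {"720p": (1280, 720), "1440p": (2560, 1440), "4K": (3840, 2160)}
    base = 1280 * 720
    for name, (w, h) in resolutions.items():
        px = w * h
        print(f"{name}: {px:,} pixels ({px / base:.0f}x the 720p workload)")

Native 4K is nine times the 720p pixel load, and no amount of console I/O efficiency closes a gap like that.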