The Elder Scrolls IV: Oblivion Remastered

3080 Ti users
How does the game run for you?
Showing 16-25 of 25 comments
Originally posted by Edelgris:
Originally posted by patrick68794:
Because Lumen and Nanite.

Lumen and Nanite are the reason the game is literally bottlenecking on a 9800X3D?

We're talking about a game so old that the gameplay mechanics should be close to 99% negligible. So that would mean that, with virtually zero overhead, Lumen and Nanite cause bottlenecking on a high-end CPU just by existing and running on the system?

Personally, I think there's a big possibility that there's a major mess somewhere in the code. If Lumen and Nanite bottleneck this game, they should bottleneck every game just as hard.
Lumen is ♥♥♥♥ in every UE5 game it's used in. Software Lumen gains me 10 fps over hardware Lumen, and none of the quality settings for hardware Lumen (there are 4 of them) make any visible difference to performance or quality. That being said, I've had a feeling since day one that the Lumen implementation is broken in this game. The Finals runs UE5 on a custom branch from NVIDIA that uses their RTGI implementation instead, and that game runs really well even with RT on (despite what people with potato PCs claim). Part of the problem in this game is that software Lumen is forced.

I've also heard that Nanite isn't really any better than traditional LODs, and possibly worse.
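
(If anyone wants to test the software vs. hardware Lumen difference themselves: assuming the remaster exposes the standard UE5 console, for example through a console unlocker mod, the stock UE5 commands are below. The game may rename or lock these, so treat this as a rough sketch rather than anything official.)

    stat unit   - shows game/render/GPU thread timings, so you can see where the bottleneck actually is
    r.Lumen.HardwareRayTracing 0   - force software Lumen
    r.Lumen.HardwareRayTracing 1   - switch back to hardware Lumen (needs an RT-capable GPU)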
Last edited by Myztkl©-Kev; Apr 24 @ 1:44pm
patrick68794 (Banned) Apr 24 @ 1:44pm 
Originally posted by Edelgris:
Originally posted by patrick68794:
Because Lumen and Nanite.

Lumen and Nanite are the reason the game is literally bottlenecking on a 9800X3D?

We're talking about a game so old that the gameplay mechanics should be close to 99% negligible. So that would mean that, with virtually zero overhead, Lumen and Nanite cause bottlenecking on a high-end CPU just by existing and running on the system?

Personally, I think there's a big possibility that there's a major mess somewhere in the code. If Lumen and Nanite bottleneck this game, they should bottleneck every game just as hard.
They do in basically every single open-world UE5 game. Most of them are CPU-limited at sub-100fps.

Also, Oblivion is still a CPU-heavy game because all of the game logic is single-threaded. Even regular Oblivion with an uncapped framerate is CPU-limited on current high-end CPUs once you start increasing the number of cells to load in.
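
(For reference, the "number of cells" knob in classic Oblivion is the well-known uGridsToLoad setting in Oblivion.ini. The values below are just the old defaults for illustration; whether the remaster still reads anything like this is an open question.)

    [General]
    ; default: a 5x5 grid of exterior cells loaded around the player
    uGridsToLoad=5
    ; a 7x7 grid means far more objects and AI to simulate on the single main thread
    ;uGridsToLoad=7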
Edelgris Apr 24 @ 1:50pm 
Originally posted by Myztkl©-Kev:
Originally posted by Edelgris:

Lumen and Nanite are the reason the game is literally bottlenecking on a 9800X3D?

We're talking about a game so old that the gameplay mechanics should be close to 99% negligible. So that would mean that, with virtually zero overhead, Lumen and Nanite cause bottlenecking on a high-end CPU just by existing and running on the system?

Personally, I think there's a big possibility that there's a major mess somewhere in the code. If Lumen and Nanite bottleneck this game, they should bottleneck every game just as hard.
Lumen is ♥♥♥♥ in every UE5 game it's used in. Software Lumen gains me 10 fps over hardware Lumen, and none of the quality settings for hardware Lumen (there are 4 of them) make any visible difference to performance or quality. The Finals runs UE5 on a custom branch from NVIDIA that uses their RTGI implementation instead, and that game runs really well even with RT on (despite what people with potato PCs claim). Part of the problem in this game is that software Lumen is forced.

I've also heard that Nanite isn't really any better than traditional LODs, and possibly worse.

Well, that sounds really bad, and the game definitely doesn't run great. It would probably run far better on something like CryEngine.

However, I think there's also something like a memory leak, because my game can dip to less than 20fps every now and then, seemingly out of the blue. Other people's games are crashing, while others in turn say they don't have any problems. There are even people on 1080s saying they have no issues. There's something really iffy going on under the hood.
Originally posted by Edelgris:
Originally posted by Myztkl©-Kev:
Lumen is ♥♥♥♥ in every UE5 game it's used in. Software Lumen gains me 10 fps over hardware Lumen, and none of the quality settings for hardware Lumen (there are 4 of them) make any visible difference to performance or quality. The Finals runs UE5 on a custom branch from NVIDIA that uses their RTGI implementation instead, and that game runs really well even with RT on (despite what people with potato PCs claim). Part of the problem in this game is that software Lumen is forced.

I've also heard that Nanite isn't really any better than traditional LODs, and possibly worse.

Well, that sounds really bad, and the game definitely doesn't run great. It would probably run far better on something like CryEngine.

However, I think there's also something like a memory leak, because my game can dip to less than 20fps every now and then, seemingly out of the blue. Other people's games are crashing, while others in turn say they don't have any problems. There are even people on 1080s saying they have no issues. There's something really iffy going on under the hood.
There is definitely a memory leak of some kind; it just causes my game to crash randomly (usually when paused or during a level transition) after an hour or two of playtime.
micmereu Apr 24 @ 2:28pm 
At 2K with a 5600X and a 3080 Ti I'm getting a crash at least every 10-20 minutes of gameplay.
Lach Apr 24 @ 2:36pm 
I can't get more than 22fps with a 3080 Ti and a 5800X. All drivers are up to date. Something is seriously broken with this game.
Well, I take back what I said; I had to lower everything to High instead of Ultra.
The open world is rough, especially after you get the horse; it chugs a lot.
Edit: turned everything back to Ultra, disabled Lumen hardware RT, and set Lumen software RT to High.
The hardware RT is apparently borked; it's running much, much better.
Last edited by LestatLestotis; Apr 25 @ 4:15pm
Edelgris Apr 25 @ 4:33pm 
I downloaded a .ini tweak from Nexus Mods which fixed a lot of my problems. My game has gone from unplayable to playable. I'm running the game on High with RT enabled on my 3070 Ti now. It's not perfect, but it sure is a hell of a lot better.

The fact that changing things in the engine initialization file fixes this much proves that they didn't bother making sure the most basic stuff was up to snuff.

At least this indicates it's not some fundamental problem with the engine; it's sloppy work that should be easy to patch up.

The .ini tweak is called Ultimate Engine Tweaks.
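
(For anyone curious what these tweaks look like under the hood: I'm not reproducing the actual mod here, but tweaks like this are usually just stock UE5 console variables dropped into the user-level Engine.ini under the game's Saved\Config folder. The lines below are only an example of the general shape; the exact path, variables, and values will vary.)

    [SystemSettings]
    ; fall back to software Lumen instead of hardware ray tracing
    r.Lumen.HardwareRayTracing=0
    ; disable volumetric fog, which is expensive
    r.VolumetricFog=0
    ; turn off virtual shadow maps
    r.Shadow.Virtual.Enable=0
    ; raise the texture streaming pool (value is in MB)
    r.Streaming.PoolSize=3072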
i7-12700K, 64GB DDR5 @ 6000, 4TB NVMe Gen 4 SSD, 3080 Ti at 4K with DLSS Performance. The game runs like crap: 120fps indoors, dipping anywhere from 30fps to 75-ish fps outdoors. Changing settings doesn't make much difference other than shadows and RT...
Last edited by I am Groot; Apr 25 @ 4:41pm
Unoptimized piece of sh*t.
20-50fps in forests, 35-65fps indoors.
KCD2 holds a stable 70-85 frames at 2K ultra.
Likewise, RDR2 runs at 60-80 frames without problems, and the graphics there are 100 times better.
Last edited by ALEX-Encor; May 1 @ 1:22am

Date Posted: Apr 24 @ 12:34pm
Posts: 25