The Elder Scrolls IV: Oblivion Remastered

The game performance is good enough at 1080p or 1080p upscaled, if...
The game performance at a native resolution of 1080p (Ultra Preset) is very demanding.

I'm not going to get into framerates (or my own quite decent CPU and GPU), but you can see this by looking at the GPU usage of any currently available graphics card. Cap your FPS at 60 or 90 (or whatever) and you will see.

So, to make the game playable at a good framerate of 60 or higher, you will likely need to use frame generation.

Even at 1080p native resolution, or with upscaling at a display resolution of 1440p or 4K (it doesn't matter much which).

I suspect game performance is better optimised on consoles; this is another game that seems to rely heavily on shader compilation.
Last edited by cbdeakin; Apr 25 @ 1:02pm
Showing 1-15 of 17 comments
cbdeakin Apr 25 @ 12:57pm 
This performance review shows that you need a very expensive GPU (like an RTX 4090) to run the game at 1080p (native) Ultra with a 1% low framerate above 60 FPS:
https://en.gamegpu.com/action-/-fps-/-tps/the-elder-scrolls-iv-oblivion-remasterd-test-gpu-cpu

An RTX 5090 can handle 1440p (native).
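For context, "1% lows" summarise the slowest frames rather than the average, which is why they are a stricter bar than average FPS. A rough sketch of one common way to compute the figure (assuming the slowest-1%-of-frame-times definition used by most review sites; the sample numbers here are made up):

```python
# Hypothetical frame-time samples in milliseconds: 99 smooth frames
# at ~60 FPS plus one hitch. Sorted ascending, slowest frames last.
frame_times_ms = sorted([16.7] * 99 + [25.0])

# Take the slowest 1% of frames and convert their average frame time
# back to FPS. One bad frame in a hundred drags the 1% low well below
# the average framerate.
worst_1_percent = frame_times_ms[int(len(frame_times_ms) * 0.99):]
one_percent_low_fps = 1000.0 / (sum(worst_1_percent) / len(worst_1_percent))
print(round(one_percent_low_fps))  # 40
```

So a "1% low above 60 FPS" claim means even the occasional worst frames stay under a 16.7 ms budget.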
Last edited by cbdeakin; Apr 25 @ 1:12pm
Are you talking to someone?
cbdeakin Apr 25 @ 1:04pm 
Originally posted by EnfluenceXI:
Are you talking to someone?
Hopefully, if you are reading this.
Trvor Apr 25 @ 1:15pm 
ryzen 7 5800x, 6700 XT, 32gb ram. high settings. disabled FSR and started using TSR. capped to 60 fps and only dips to 58, no stutters, ghosting or other graphical issues.

i don't understand complaints of low performance when i'm pushing a constant stable 60 fps on a 6700 xt WITHOUT FSR or Frame Gen and only reaching 80-85% max GPU usage in very select cities or forests. averaging a GPU usage of 40-70% depending on cell type

edit: the only problem i've had with this release is the fact SSR doesn't work properly
Last edited by Trvor; Apr 25 @ 1:22pm
cbdeakin Apr 25 @ 1:16pm 
Lots of people seem to be claiming that the performance is good at 1440p or 4K, even on the Ultra preset.

They are wrong. Or, they are running with upscaling.
Last edited by cbdeakin; Apr 25 @ 1:16pm
Trvor Apr 25 @ 1:17pm 
upscaling looks like ass so i would believe most people are not
Originally posted by cbdeakin:
Lots of people seem to claiming that the performance is good at 1440 or 4K, even on the Ultra preset.

They are wrong. Or, they are running with upscaling.

Neither is the case. Some people are just not stupid enough to try running the game on ULTRA with hardware that cannot do it.

I am getting 60 fps rock solid @1440p with a mix of ultra/high.
cbdeakin Apr 25 @ 1:19pm 
Originally posted by Trvor:
ryzen 7 5800x, 6700 XT, 32gb ram. high settings. disabled FSR and started using TSR. capped to 60 fps and only dips to 58, no stutters, ghosting or other graphical issues.

Lots of people play on Ultra. That's probably why.

I haven't tried the High preset, but I think because the game defaults to Ultra for a lot of decent graphics cards, most stick with that.

The game is a Remastered version of the 2006 game, people want to max out the settings, even if that's at a relatively low base resolution like 1080p.
Last edited by cbdeakin; Apr 25 @ 1:29pm
Trvor Apr 25 @ 1:30pm 
Originally posted by cbdeakin:
Originally posted by Trvor:
ryzen 7 5800x, 6700 XT, 32gb ram. high settings. disabled FSR and started using TSR. capped to 60 fps and only dips to 58, no stutters, ghosting or other graphical issues.

Lots of people play on Ultra. That's probably why.

I haven't tried the High preset, but I think because the game defaults to Ultra for a lot of decent graphics cards, most stick with that.

The game is a Remastered version of the 2006 game, people want to max out the settings, even if that's at a low resolution like 1080p.
yeah uhm, here's the thing... specs vs expectations. UE5 is a graphics pig and resource hog.

personally, i understand that i only meet the recommended specs and set my graphics settings accordingly.

lumen and ray tracing are doing a big bulk of the work in visuals and aren't performance friendly
cbdeakin Apr 25 @ 1:40pm 
I guess there's always Geforce Now, that has Oblivion on it now.

I've got an RX 7900 XT, so not much need.

EDIT - Apparently Nvidia's server hardware uses RTX 4080s, which aren't quite high-end enough.
Last edited by cbdeakin; Apr 25 @ 2:33pm
cbdeakin Apr 25 @ 2:31pm 
Originally posted by LawrenceOfUrLabia:
Totally playable on old hardware.

https://www.youtube.com/watch?v=6cJluAXyAjw

Guess you don't wanna use frame gen?

Lot easier than messing about with 5-10 settings.

You should update that really old RX 7900 XTX to the newer RX 9070 XT, it will make all the difference :D
Last edited by cbdeakin; Apr 25 @ 2:32pm
Originally posted by cbdeakin:

Guess you don't wanna use frame gen?

Lot easier than messing about with 5-10 settings.

You should update that really old RX 7900 XTX to the newer RX 9070 XT, it will make all the difference :D

I can't stand any type of input latency and this game is far, far away from being ready to dump in frame gen. That is literally the last option to enable as a developer.

I know, I can't believe this thing is 2 years old and the best AMD has come up with is a subpar upgrade at best... crazy. Who do they think they are, Nvidia??! lol
Last edited by LawrenceOfUrLabia; Apr 25 @ 2:49pm
Originally posted by LawrenceOfUrLabia:
Originally posted by cbdeakin:

Guess you don't wanna use frame gen?

Lot easier than messing about with 5-10 settings.

You should update that really old RX 7900 XTX to the newer RX 9070 XT, it will make all the difference :D

I can't stand any type of input latency. Anything over about 7-8 ms frame times and it's no deal. Yes, I'm super spoiled.
Sounds like a 7900 XT user
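For reference, frame time and FPS are just reciprocals of each other, so the 7-8 ms budget mentioned above works out to roughly 125-143 FPS. A quick sketch of the conversion:

```python
def fps_to_frame_time_ms(fps: float) -> float:
    """Convert frames per second to frame time in milliseconds."""
    return 1000.0 / fps

def frame_time_ms_to_fps(ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / ms

# A 7-8 ms frame-time budget corresponds to roughly 125-143 FPS,
# well above a 60 FPS cap (16.7 ms per frame).
print(round(frame_time_ms_to_fps(8.0)))      # 125
print(round(frame_time_ms_to_fps(7.0)))      # 143
print(round(fps_to_frame_time_ms(60.0), 1))  # 16.7
```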
Originally posted by LawrenceOfUrLabia:
Originally posted by cbdeakin:

Guess you don't wanna use frame gen?

Lot easier than messing about with 5-10 settings.

You should update that really old RX 7900 XTX to the newer RX 9070 XT, it will make all the difference :D

I can't stand any type of input latency. Anything over about 7-8 ms frame times and it's no deal. Yes, I'm super spoiled.


Originally posted by Sgt. Flaw:
Originally posted by LawrenceOfUrLabia:
I can't stand any type of input latency. Anything over about 7-8 ms frame times and it's no deal. Yes, I'm super spoiled.
Sounds like a 7900 XT user

It's an XTX, not an XT. Tyvm.

Date Posted: Apr 25 @ 12:52pm
Posts: 17