The Elder Scrolls III: Morrowind

Anyone else frustrated with the performance in this game?
Hi, I really love Morrowind, but I feel like the performance is horrendous. I struggle to maintain a constant 60 fps at the highest draw distance settings, despite my hardware being more than adequate for this game. Are there any mods or external applications that improve frame rates? I know of the FPS Optimizer, but that doesn't really touch the game itself; it just adjusts the settings dynamically. I find frame rates below 60 incredibly jarring and annoying to watch.
It's a bit of an issue when The Witcher 3 sees better performance on my machine.
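As an aside on how tools like the FPS Optimizer work: the usual approach is a feedback loop that samples the frame time and nudges expensive settings such as draw distance up or down to hold a target frame rate. A minimal Python sketch of that loop follows; the variable names, ranges, and step sizes are illustrative assumptions, not anything Morrowind or the FPS Optimizer actually exposes.

TARGET_FPS = 60
TARGET_FRAME_TIME = 1.0 / TARGET_FPS  # ~16.67 ms budget per frame

draw_distance = 7.0  # hypothetical multiplier in a 1.0..7.0 range

def adjust_draw_distance(last_frame_time: float) -> float:
    """Shrink the view distance when frames run over budget, and grow
    it back slowly when there is headroom, to avoid visible popping."""
    global draw_distance
    if last_frame_time > TARGET_FRAME_TIME * 1.10:    # >10% over budget
        draw_distance = max(1.0, draw_distance - 0.10)
    elif last_frame_time < TARGET_FRAME_TIME * 0.80:  # clear headroom
        draw_distance = min(7.0, draw_distance + 0.05)
    return draw_distance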
EgoMaster Aug 5, 2015 @ 8:37pm 
I'm guessing you don't watch films, which run at 24 fps.

My only suggestion is turning off power saving options if you're gaming on a laptop. Morrowind is an old game that doesn't take advantage of new hardware and relies on raw power. And hope OpenMW comes out soon.
Lily Aug 6, 2015 @ 3:20am 
Originally posted by EgoMaster:
I'm guessing you don't watch films, which run at 24 fps.

I'm speechless.

Now to the OP, I don't know what could be causing your bad performance, especially considering Witcher 3 runs better. Personally, I play with a mod that lets me increase the draw distance even further, and I always manage to maintain a stable ~150 FPS even with that maxed. Are you running any graphics mods, perhaps?
Originally posted by 笹瀬川佐々美:
Originally posted by EgoMaster:
I'm guessing you don't watch films, which run at 24 fps.

I'm speechless.

Now to the OP, I don't know what could be causing your bad performance, especially considering Witcher 3 runs better. Personally, I play with a mod that lets me increase the draw distance even further, and I always manage to maintain a stable ~150 FPS even with that maxed. Are you running any graphics mods, perhaps?
Nope, just vanilla Morrowind. In cities, the frame rate goes below 50 in some areas. Maxed out, the frame rate in forest areas hovers around the 100-140 range.
kmk988 Aug 7, 2015 @ 5:04am 
Originally posted by Shrex:
Originally posted by 笹瀬川佐々美:

I'm speechless.

Now to the OP, I don't know what could be causing your bad performance, especially considering Witcher 3 runs better. Personally, I play with a mod that lets me increase the draw distance even further, and I always manage to maintain a stable ~150 FPS even with that maxed. Are you running any graphics mods, perhaps?
Nope, just vanilla Morrowind. In cities, the frame rate goes below 50 in some areas. Maxed out, the frame rate in forest areas hovers around the 100-140 range.
Morrowind is an old game, so the engine has some issues. Do you use a 4GB patch? That will probably help you some on a newer system. There's also an .exe optimizer that will help as well. Also get the Morrowind Code Patch, which fixes a lot of bugs.
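For context on the 4GB patch mentioned above: such patches flip the IMAGE_FILE_LARGE_ADDRESS_AWARE bit in the executable's PE header, letting a 32-bit process address up to 4 GB on 64-bit Windows. Below is a hedged Python sketch of that operation based on the documented PE layout; back up Morrowind.exe before experimenting with anything like it.

import struct

LAA_FLAG = 0x0020  # IMAGE_FILE_LARGE_ADDRESS_AWARE

def set_large_address_aware(path: str) -> None:
    """Set the LAA bit in a 32-bit PE executable's COFF header."""
    with open(path, "r+b") as f:
        f.seek(0x3C)                       # e_lfanew: offset of the PE header
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        assert f.read(4) == b"PE\x00\x00"  # sanity-check the PE signature
        f.seek(pe_offset + 22)             # Characteristics field of the COFF header
        flags = struct.unpack("<H", f.read(2))[0]
        f.seek(pe_offset + 22)
        f.write(struct.pack("<H", flags | LAA_FLAG))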
Brandybuck Aug 10, 2015 @ 9:54am 
Originally posted by EgoMaster:
I'm guessing you don't watch films, which run at 24 fps.
You can't compare cinema film to games on a digital monitor. It's apples to oranges. Personally, I have no problem with only 30 FPS in a game, but to some people that's headache-inducing. 60 FPS should be good enough for most people, but even that might not be enough for a busy, high-motion scene on some monitors (which I attribute to the monitor's very low response time).

But that said, any reasonably recent video card should be able to keep up with Morrowind without breaking a sweat. If the OP is getting better performance with Witcher 3, then something weird is going on. Since the OP says it's vanilla, I suspect it's CPU-related. Perhaps there are just too many objects, clutter, and people in cities to keep track of.
Last edited by Brandybuck; Aug 10, 2015 @ 9:56am
EgoMaster Aug 10, 2015 @ 1:32pm 
Originally posted by Brandybuck:
Originally posted by EgoMaster:
I'm guessing you don't watch films, which run at 24 fps.
You can't compare cinema film to games on a digital monitor. It's apples to oranges. Personally, I have no problem with only 30 FPS in a game, but to some people that's headache-inducing. 60 FPS should be good enough for most people, but even that might not be enough for a busy, high-motion scene on some monitors (which I attribute to the monitor's very low response time).
You can compare films and games, because you can watch films on a digital monitor and they're still 24 fps (23.976 in the US, to be exact; 25 if you're in Europe). Also, I think headaches have more to do with 60 Hz than 30 fps. In the olden days, when we had CRT monitors, refresh rates started at 75 Hz and went up to 100. In fact, 100 Hz was the standard in the late days of CRT monitors. Higher refresh rates cause less eye strain, and in turn fewer headaches. I'm not a medical expert, but as an ex-broadcast operator, I know a few things about the broadcast side of things. So if there's a medical study that shows lower frame rates do indeed induce headaches (instead of lower refresh rates), please point me to it.
Clover Aug 10, 2015 @ 7:40pm 
Originally posted by EgoMaster:
Originally posted by Brandybuck:
You can't compare cinema film to games on a digital monitor. It's apples to oranges. Personally, I have no problem with only 30 FPS in a game, but to some people that's headache-inducing. 60 FPS should be good enough for most people, but even that might not be enough for a busy, high-motion scene on some monitors (which I attribute to the monitor's very low response time).
You can compare films and games, because you can watch films on a digital monitor and they're still 24 fps (23.976 in the US, to be exact; 25 if you're in Europe). Also, I think headaches have more to do with 60 Hz than 30 fps. In the olden days, when we had CRT monitors, refresh rates started at 75 Hz and went up to 100. In fact, 100 Hz was the standard in the late days of CRT monitors. Higher refresh rates cause less eye strain, and in turn fewer headaches. I'm not a medical expert, but as an ex-broadcast operator, I know a few things about the broadcast side of things. So if there's a medical study that shows lower frame rates do indeed induce headaches (instead of lower refresh rates), please point me to it.

Games and film CANNOT be compared. Movies use extremely high-quality motion blur, which reduces the effects of low frame rates and makes the images appear smoother. Most games use nothing to minimize the effects of low frame rates, and some use low-quality motion blur. The best implementation is "per-object motion blur", which closely mimics the blur we experience from fast-moving objects in real life (moving your hand in front of your face, a fast-moving car, etc.).

Refresh rates DO have an effect on eye strain. The higher the refresh rate, the better for your eyes. However, low frame rates do too. The most important thing is consistent frame delivery: 60 Hz = 60 fps = 16.67 ms per frame. Inconsistent frame delivery can cause headaches and eye strain. Stuttering and screen tearing can also cause this, as can poor-quality motion blur.

Hopefully this is helpful.
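The "60 Hz = 60 fps = 16.67 ms" point above is simple arithmetic: frame time in milliseconds is 1000 divided by the frame rate, which is why a drop from 60 to 50 fps adds over 3 ms to every frame. A tiny Python illustration:

def frame_time_ms(fps: float) -> float:
    # Time budget per frame at a given frame rate.
    return 1000.0 / fps

for fps in (24, 30, 50, 60, 100, 144):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):6.2f} ms per frame")
# 24 fps -> 41.67 ms; 50 fps -> 20.00 ms; 60 fps -> 16.67 ms; 144 fps -> 6.94 ms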
Originally posted by Domhnall of Zena:
Originally posted by EgoMaster:
You can compare films and games, because you can watch films on a digital monitor and they're still 24 fps (23.976 in the US, to be exact; 25 if you're in Europe). Also, I think headaches have more to do with 60 Hz than 30 fps. In the olden days, when we had CRT monitors, refresh rates started at 75 Hz and went up to 100. In fact, 100 Hz was the standard in the late days of CRT monitors. Higher refresh rates cause less eye strain, and in turn fewer headaches. I'm not a medical expert, but as an ex-broadcast operator, I know a few things about the broadcast side of things. So if there's a medical study that shows lower frame rates do indeed induce headaches (instead of lower refresh rates), please point me to it.

Games and film CANNOT be compared. Movies use extremely high-quality motion blur, which reduces the effects of low frame rates and makes the images appear smoother. Most games use nothing to minimize the effects of low frame rates, and some use low-quality motion blur. The best implementation is "per-object motion blur", which closely mimics the blur we experience from fast-moving objects in real life (moving your hand in front of your face, a fast-moving car, etc.).

Refresh rates DO have an effect on eye strain. The higher the refresh rate, the better for your eyes. However, low frame rates do too. The most important thing is consistent frame delivery: 60 Hz = 60 fps = 16.67 ms per frame. Inconsistent frame delivery can cause headaches and eye strain. Stuttering and screen tearing can also cause this, as can poor-quality motion blur.

Hopefully this is helpful.

My dad always complains about 60 fps in games. He says they're "too fast" and that they give him headaches. Lol, dad.
EgoMaster Aug 12, 2015 @ 12:26am 
Originally posted by Domhnall of Zena:
Games and film CANNOT be compared. Movies use extremely high-quality motion blur, which reduces the effects of low frame rates and makes the images appear smoother. Most games use nothing to minimize the effects of low frame rates, and some use low-quality motion blur. The best implementation is "per-object motion blur", which closely mimics the blur we experience from fast-moving objects in real life (moving your hand in front of your face, a fast-moving car, etc.).

Refresh rates DO have an effect on eye strain. The higher the refresh rate, the better for your eyes. However, low frame rates do too. The most important thing is consistent frame delivery: 60 Hz = 60 fps = 16.67 ms per frame. Inconsistent frame delivery can cause headaches and eye strain. Stuttering and screen tearing can also cause this, as can poor-quality motion blur.

Hopefully this is helpful.
That was a bit more scientific, yes. And I will do some research about it. Still not totally convinced, though. Maybe because I always sacrifice frame rate for visual fidelity (I don't go below the stuttering limit, of course) and never had a problem until now. For instance, I played Far Cry 3 on an nVidia 750m in full detail at 1080p (only AA off) with an average of 25-30 fps. I completed my playthrough in 79 hours with no headaches.
Last edited by EgoMaster; Aug 12, 2015 @ 12:27am
Sunbow Aug 13, 2015 @ 12:16pm 
I normally have over 100 fps, and that's with MGSO installed.

In some areas where a lot is happening, it can drop right down into the 20s or 30s, but it's never been anything that has ruined the experience for me.

Morrowind, even with tons of add-ons, runs fine on a reasonably high-end PC. My PC is a 2.8 GHz Intel Core i7, with a modest GTX 660 video card and 8 GB memory.
Originally posted by Mr_Benn:
I normally have over 100 fps, and that's with MGSO installed.

In some areas where a lot is happening, it can drop right down into the 20s or 30s, but it's never been anything that has ruined the experience for me.

Morrowind, even with tons of add-ons, runs fine on a reasonably high-end PC. My PC is a 2.8 GHz Intel Core i7, with a modest GTX 660 video card and 8 GB memory.

The game itself is underusing the hardware. It only uses like 30% of my CPU max, and even less of my GPU. Imagine how well this game would run on modern systems if the developers took the time to optimize it. Of course, I'd assume the low CPU utilization is due to it having 8 cores, but the clock rate and cache are high enough, even for single-threaded applications. (AMD FX 8370, 4 GHz)
Last edited by Director of War Crimes; Aug 13, 2015 @ 3:09pm
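A note on that 30% figure: overall CPU usage is averaged across every core, so a mostly single-threaded game on an 8-core FX 8370 can peg one core flat out while the total reads only 12-30%. Here is a rough sketch of how to see the per-core picture while the game runs, assuming the third-party psutil package is installed:

import psutil  # third-party package: pip install psutil

def report_core_usage(interval: float = 1.0) -> None:
    """Print per-core load; a single-threaded bottleneck shows up as one
    core near 100% while the overall average stays near 100/core_count."""
    per_core = psutil.cpu_percent(interval=interval, percpu=True)
    overall = sum(per_core) / len(per_core)
    print(f"overall: {overall:5.1f}%  (roughly what Task Manager reports)")
    for i, pct in enumerate(per_core):
        print(f"core {i}: {pct:5.1f}%")

report_core_usage()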
Logorouge Aug 13, 2015 @ 8:10pm 
The game itself is 13 years old, so keeping the expression "diminishing returns" in mind would be wise when talking about performance. Also important to consider is that the developer, Bethesda, never goes back to its old games, not even to fix fully documented major bugs or glitches. Might as well forget about optimization and polishing work from them.
EgoMaster Aug 13, 2015 @ 11:16pm 
Originally posted by Shrex:
The game itself is underusing the hardware. It only uses like 30% of my CPU max, and even less of my GPU. Imagine how well this game would run on modern systems if the developers took the time to optimize it. Of course, I'd assume the low CPU utilization is due to it having 8 cores, but the clock rate and cache are high enough, even for single-threaded applications. (AMD FX 8370, 4 GHz)
The game doesn't support multi-core CPUs or modern GPU acceleration techniques. Unlike modern games, which lean specifically on the GPU, Morrowind distributes its load almost equally between the CPU and GPU. These are the reasons it runs slowly on modern machines.

There's a project called OpenMW[www.openmw.org], which will run Morrowind with the advantages of modern hardware.
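One practical consequence of that single-threaded design: pinning the process to a single core costs nothing and can sometimes steady frame times by keeping the scheduler from bouncing the game between cores. A hedged sketch using psutil again; Morrowind.exe is the game's default executable name, and whether pinning helps varies by system.

import psutil  # third-party package: pip install psutil

# Pin Morrowind to one logical core; may need to be run elevated.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Morrowind.exe":
        proc.cpu_affinity([0])  # restrict the process to the first core
        print(f"pinned PID {proc.pid} to core 0")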
Sunbow Aug 13, 2015 @ 11:23pm 
Originally posted by EgoMaster:
The game doesn't support multi-core CPUs or modern GPU acceleration techniques. Unlike modern games, which lean specifically on the GPU, Morrowind distributes its load almost equally between the CPU and GPU. These are the reasons it runs slowly on modern machines.

There's a project called OpenMW[www.openmw.org], which will run Morrowind with the advantages of modern hardware.

I haven't tried OpenMW yet, and I'm not even sure it's past the alpha stage. However, if you use Morrowind Graphics Extender, the game can utilise the GPU. Alternatively, MGSO includes a version of the Graphics Extender, and thus also utilises the GPU, as well as updating the graphics to almost Skyrim level.
Last edited by Sunbow; Aug 13, 2015 @ 11:24pm
Eru Aug 14, 2015 @ 6:04pm 
Didn't read everyone else's comments, but I believe Morrowind can only use 1 or 2 physical cores, severely decreasing performance on most hardware.

Date Posted: Aug 5, 2015 @ 4:35pm
Posts: 22