The Elder Scrolls Online

HNTR Mar 26, 2022 @ 12:17am
Why is this game locked at 100fps?
I get at least 144 FPS in all other games but this game is constantly locked. Eye cancer
Showing 1-10 of 10 comments
Thomas D. Mar 26, 2022 @ 1:12am 
I have heard that you can change it by modifying the usersettings.txt. I am just not sure which option it was.
Psychlapse Mar 26, 2022 @ 3:12am 
It's stupid that they don't give you an actual in game option to change it, but you can Google it to find a workaround. It just involves changing a value in a text file.

Eye cancer? Because games have always been 100fps + since the dawn of time?
Originally posted by Psychlapse:
Because games have always been 100fps + since the dawn of time?

Yep! I have quite a few good memories of playing games based on the Quake 1/2/3 engines, like RTCW and Wolfenstein: Enemy Territory (based on Q3's engine), on ridiculously low settings and stupidly high FPS counts with an FX5200 in the early 2000s to move faster and have my guns overheat less[web.archive.org].

You got quite a few extra benefits from playing early multiplayer 3D FPS games at high FPS, besides a better refresh rate (if your CRT could show it, which it probably couldn't) and lower input lag. That's why at least a few competitive players who are a bit older tend to play on low settings even in modern games on good setups. It's not just about mitigating that Min% FPS metric... they just got used to it.

Sure, those extra side effects were not intentional back then and shouldn't happen today, but it sure makes you wonder why most devs still use lazy workarounds like limiting and smoothing FPS for the lowest common denominator (consoles) instead of implementing code that works and syncs regardless of client FPS.
Blade12775 Mar 26, 2022 @ 5:59pm 
My buddy played on low settings in PvP shooters. Not for the FPS, but because less stuff on screen meant you could spot players more easily and take them out. He ranked 1st in gun kills and 2nd in knife kills on the leaderboards.
Kissing Fish Mar 26, 2022 @ 7:46pm 
"eye cancer"
lol
you know what they say about people who use hyperbole a lot...

L O L M A O
Axiata Mar 26, 2022 @ 8:46pm 
eye cancer. lol i remember someone saying human eyes can only see 30 fps
100 is more than enough.. stop complaining and if u wanna unlock it google!!
Capricorn Anomaly Mar 26, 2022 @ 10:05pm 
How to unlock FPS for ESO:

Go to Documents/Elder Scrolls Online/live
Find the UserSettings.txt file and open it with Notepad or WordPad
Change the SET MinFrameTime.2 directive to a specific number.
You can choose different values to unlock your FPS in ESO:

If you want 120 FPS, you calculate 1 divided by 120 = 0.00833333

60 FPS = 0.01666666
120 FPS = 0.00833333
144 FPS = 0.00694444
240 FPS = 0.00416667
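The arithmetic above (frame time in seconds = 1 divided by the target FPS) can be sketched like this; the helper name is made up for illustration, while SET MinFrameTime.2 is the setting named in this post:

```python
# Sketch: turn a target FPS cap into the frame-time value ESO's
# UserSettings.txt expects (frame time = 1 / FPS, in seconds).
def min_frame_time(target_fps: int) -> str:
    return f"{1 / target_fps:.8f}"

for fps in (60, 120, 144, 240):
    print(f'{fps} FPS -> SET MinFrameTime.2 "{min_frame_time(fps)}"')
```

Whether the last digit gets rounded or truncated (0.01666667 vs. 0.01666666 for 60 FPS) makes no practical difference.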
Thomas D. Mar 27, 2022 @ 2:53am 
Originally posted by Axiata:
eye cancer. lol i remember someone saying human eyes can only see 30 fps
100 is more than enough.. stop complaining and if u wanna unlock it google!!
That's an often misunderstood statement, and it's completely wrong.

The truth is somewhere around the 24/25 Hz range, and it's not the point where human eyes stop seeing any difference; it's the point where animations start to feel smooth.

But the human eye can register more than that. Most movies use 24, 25 or 30 Hz because it's enough to deliver a smooth experience. Anything more than that would be too expensive (storage and transmission) for too little value gained.

The 25 and 30 Hz standards are also based on power grid standards. In the US the power grid runs at 60 Hz, so 30 Hz (60 Hz interlaced) was used, and in the EU the grid usually runs at 50 Hz, so movies there usually used 25 Hz (50 Hz interlaced). Most screens in the early days were analogue and simple, and thus coupled directly to the frequency of the power grid.

All this has little to do with what the human eye perceives, other than that 24/25/30 Hz is enough to make animations fluid. The real reasons for these numbers are technical and economical, not human perception.

Also, movies are/were interlaced. This means the vertical resolution of a frame was halved and two fields were interwoven in time: one field carried the even lines and the other the odd lines. This leads to a perceived doubling of the frame rate. It was done this way to make things feel even smoother without having to store and transmit twice the data. As a side note, the "i" in 1080i stands for interlaced, while 1080p is progressive (not interlaced).
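The even/odd-line weaving can be illustrated with a tiny sketch (the field/frame representation here is invented for illustration, not from any real codec):

```python
# Illustration of interlacing: two half-height fields, one holding the
# even scanlines and one the odd scanlines, woven into one full frame.
def weave(even_field, odd_field):
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # lines 0, 2, 4, ...
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

# e.g. a 1080i signal stores 540-line fields but displays 1080-line frames
print(weave(["even 0", "even 1"], ["odd 0", "odd 1"]))
# → ['even 0', 'odd 0', 'even 1', 'odd 1']
```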

Another number is 14/15 Hz. This is the minimum needed to make something look animated, and it's often used by cartoons because it's enough (at the lowest end) and more economical, again not because it's the maximum of human perception. You can also directly tell whether a cartoon or animated movie uses 14, 30 or even 60 FPS.

The thing is that for movies, cartoons and animations it even feels odd at first if you see something at 60 FPS. People can directly tell you that something is weird. That's because they can perceive the difference but are used to standards like 15 for cartoons and 24/25/30 for movies.

So the real number of how much the human eye can perceive is even higher. As far as I know it's around 75-85 Hz where people stop being able to see the difference between something steady and something very quickly switching between black and white under normal circumstances. Drugs, btw., can probably extend it a little further. It's also the rate at which old CRT screens stopped visibly flickering (for the same reason). But it's not a completely fixed number, because in the periphery of the eye we can perceive more FPS than in the center, where we perceive details and colors better.

But the retina of the human eye can work even quicker and reacts even to several hundred FPS. In the end that doesn't change that >100 FPS is usually nonsense, because there are several steps between what the retina detects and what the brain is able to handle (which usually operates at 14-35 Hz; that's also why animations start feeling fluid at these low rates), and every step loses information. So in the end a human usually won't be able to tell the difference between 85 FPS and more than 85 FPS, and probably won't even notice any difference between 60 and 85 Hz without directly comparing both.

But there is one thing special about computer games compared to movies and other sources with exact frame times: computer games don't have constant FPS and don't produce constant frame times. A game that runs at about 85 FPS might sometimes produce frame times closer to those of 60 FPS or lower, and sometimes frame times closer to those of games running at more than 100 FPS. That, on the other hand, can be perceivable. It's also the reason why movies work fine at 24/25/30 FPS but games don't and need more than that to feel smooth.
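The point about averages hiding unsteady frame times can be shown with invented numbers: two runs over one second, where the "spiky" run has the higher average FPS but a far worse slowest frame:

```python
# Toy numbers (invented) showing why average FPS hides stutter.
def fps_stats(frame_times):
    """Average FPS and the slowest single frame, from per-frame seconds."""
    return len(frame_times) / sum(frame_times), max(frame_times)

steady = [1 / 85] * 85                 # a locked 85 FPS for one second
spiky = [1 / 120] * 80 + [1 / 20] * 5  # mostly fast, with a few stutters

print(fps_stats(steady))  # ~85 FPS average, worst frame ~11.8 ms
print(fps_stats(spiky))   # ~93 FPS average, but worst frame is 50 ms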

Another thing is that there are several types of sync technologies to prevent tearing (half-drawn frames overlapping other half-drawn frames). Tearing is perceivable even at high FPS. The problem with many sync technologies is that they often delay frames until the screen has finished drawing the previous one. That can increase frame times (up to double), effectively halving the FPS. That's where Adaptive-Sync and G-Sync come into play: they solve this better than previous sync technologies and can counter tearing without effectively halving the FPS. An alternative would be to increase the FPS even further (beyond 100), but that's not necessary if those technologies are used.
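The "up to double the frame time" effect of classic double-buffered VSync can be sketched like this (an illustrative model, assuming every frame waits for the next refresh, not any real driver's exact behavior):

```python
import math

# Sketch of double-buffered VSync: a frame that misses a refresh interval
# waits for the next one, so a game rendering barely below the refresh
# rate is effectively halved.
def displayed_fps(render_fps, refresh_hz=60):
    vblank = 1 / refresh_hz
    frame_time = 1 / render_fps
    # each frame occupies a whole number of refresh intervals
    intervals = math.ceil(frame_time / vblank - 1e-9)
    return 1 / (intervals * vblank)

print(round(displayed_fps(60)))  # 60: keeps up with every refresh
print(round(displayed_fps(55)))  # 30: barely missing vblank halves the rate
```

Adaptive-Sync/G-Sync avoid this cliff by letting the screen wait for the frame instead of the other way around.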

But in the end, a steady 100 FPS is still enough to surpass what humans are able to perceive, except in the few cases I explained before (and maybe drugs). These cases are primarily tearing and unsteady frame times. The first can and should be countered with new sync technologies, and the latter would have the same effect even if the game limited the FPS, because it happens due to lower performance in some situations, which won't change with an unlock.

All in all, it's silly to call anything below 100 FPS eye cancer (even if he might see a difference because of side effects like tearing, which can be handled in other ways). But it's also wrong to state that the eye can't see more than 30 FPS, because it can!

But most of today's screens still operate at 60 Hz and won't even be able to show 100 or more frames; without syncing, the extra frames would just cause more tearing, effectively dropping the frames that can't be displayed, meaning it would be a waste to render them.

Except for old engines like the mentioned quake 3 🤣.
Last edited by Thomas D.; Mar 27, 2022 @ 3:11am
Space is the Place Mar 27, 2022 @ 10:07am 
Like all multiplatform games, the PC version gets nerfed so "console gamers don't get a worse experience, even though they made the free choice of having worse hardware".

What's new?
Originally posted by Thomas D.:
So the real number of how much the human eye can perceive is even higher. As far as I know it's around 75-85 Hz where people stop being able to see the difference between something steady and something very quickly switching between black and white under normal circumstances.

Originally posted by Thomas D.:
But in the end, a steady 100 FPS is still enough to surpass what humans are able to perceive, except in the few cases I explained before (and maybe drugs).

It's pretty clear how much you thought about your post, so I'd like to make it clear I admire your effort even if I disagree on it almost entirely.

I'd start by arguing that it's way more complex than that.

On the capabilities of the human eye: there's a clear difference between 72 and 144 FPS on TestUFO with my monitor. I'm sure we still haven't 100% determined what we can "see" in easily quantifiable metrics, because it's so hard to understand how other beings see the world. We know what our eyes look like and the components that make them what they are, but it's a whole different beast to understand how they work and how their data is processed by our brains.

Originally posted by Thomas D.:
So in the end a human usually won't be able to tell the difference between 85 FPS and more than 85 FPS, and probably won't even notice any difference between 60 and 85 Hz without directly comparing both.

That's only true if you don't spend most of your day at a screen consuming content. It's about how used you are to something.

It's like driving in a densely populated city. You'll have plenty of roads with slightly different speed limits. If you're not used to them, you'll often forget them, since they're different but "feel" too similar. If you spend enough time on those roads, you'll easily notice the difference. The same goes for noticing slight differences in taste and smell between similar batches of coffee, wine and whisky: if you're an inexperienced beginner, you'll probably fail an A-B test even if you think you noticed something.

My free time is basically 10% food or coffee and 90% gameplay since I was 8 or something. I spent years playing on 30, then years playing on 60 and now the same's happening for 144. Sure, I won't notice much of a difference if my PC drops from 144 to 120, but it gets easier and easier to notice as it gets further away from one of these milestones and approaches another. I'd probably notice a 25 fps drop very easily if I was playing on 60 hz though, since I upgraded my main driver to 144 like less than two years ago.

You're probably not considering that we tend to notice things only once they pass a certain level of variation, while small changes go unnoticed. It happens with framerates, it happens with how you season your food or sweeten your drinks, it happens with interpersonal relationships of any intensity ("YOU'RE NOT THE SAME PERSON I FELL IN LOVE WITH YEARS AGOOOOOO"), it happens with pretty much anything.

That's why you have Standard Deviations in Statistics.

Originally posted by Thomas D.:
It's also the reason why movies work fine at 24/25/30 FPS but games don't and need more than that to feel smooth.

I get what you were saying with the whole argument about smoothness and animation, but I think it unfortunately falls flat pretty quickly. Yes, movies tend to be at sub-30, but that's a consequence of how many years society watched content recorded on pre-digital hardware. We were limited back then, and we kept true to that limitation.

There's also a psychological aspect to it. Movies are, by nature, more "passive". The audience is allowed — and encouraged — to "soak up" what they're watching and think about it while it's being shown. We have tons of recorded (heh) history about attempts to raise framerates in movies when it became possible, and how they failed because audiences were so used to lower framerates that raising them created feelings of restlessness, anxiety and even nausea.

It's like the adrenaline you get when you finally get the chance to stomp your pedal while driving on a high speed highway. You will feel a rush because you're not used to that. If you like it, you'll want to do it again. If not, you'll actively avoid it.

Just look up the news, reviews and popular opinions about watching 2013's The Hobbit at 48 FPS. It wasn't the first attempt, and it wasn't the last. We have plenty of digital content at 60+ FPS now, and I hope the bar keeps being pushed.

Originally posted by Thomas D.:
But most of today's screens still operate at 60 Hz and won't even be able to show 100 or more frames; without syncing, the extra frames would just cause more tearing, effectively dropping the frames that can't be displayed, meaning it would be a waste to render them.

Most, but not all. I've used plenty of monitors and TVs with my computers, and IMHO tearing is something I've experienced only on cheaper, lower-end screens, regardless of whether their maximum output was 30, 60 or 144 Hz. Most screens I've settled on as daily drivers over the last decade or so have been low-tier to mid-tier and don't show tearing even without VSync.

BTW, check your screen's control panel and manual and turn off post-processing. How? It depends on the screen. Look for things to disable (like interpolation), to enable (like Game Mode) or to adjust up or down (like Response Time and FreeSync/overclocking).

Date Posted: Mar 26, 2022 @ 12:17am
Posts: 10