I have a ROG Zephyrus. This laptop lets you run either in Optimus mode (switching between the integrated Intel UHD graphics and the dedicated NVIDIA card as needed) or on the dedicated NVIDIA card only.
Long story short: running Catherine in Optimus mode makes the cutscenes choppy. Running the game on the dedicated NVIDIA card only, the cutscenes play with no issues.
I would ask the devs to check which GPU their in-game "movie player" is using; it's possible it uses a different one than the 3D engine does.
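For what it's worth, the usual way a Windows executable opts into the dedicated GPU on switchable-graphics laptops is by exporting a couple of driver-recognized symbols; NvOptimusEnablement and AmdPowerXpressRequestHighPerformance are the names NVIDIA and AMD document for this. Whether Catherine's movie player lives in a module that misses this hint is just my guess, but a minimal sketch in C looks like this:

    #include <windows.h>

    /* Exporting these from the .exe (not from a DLL) hints the driver to
       prefer the dedicated GPU on Optimus / switchable-graphics systems.
       The symbol names are the documented ones; whether this game sets
       them is my assumption, not something I've verified. */
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;

    int main(void)
    {
        /* ...init the renderer and the movie player here... */
        return 0;
    }

If the video playback goes through a separate process or library that doesn't inherit this, it could end up on the integrated GPU while the 3D engine runs on the dedicated one.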
That's not an optimal workaround, though. If you have a high refresh rate monitor, you should be able to actually use it.
First thing though: anime is choppy in general. It's what, 24fps? Effectively it might be even less; regular film is 24fps, but anime is usually drawn on twos or threes, so the effective animation rate is naturally lower. I mean, it's intended.
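Some of that choppiness is also just arithmetic: 24fps content only paces evenly when the refresh rate is a whole multiple of 24. A quick sketch (plain C, standard rates, nothing game-specific):

    #include <stdio.h>

    /* How many display refreshes each 24fps source frame occupies.
       A non-integer result means frames alternate durations (judder). */
    int main(void)
    {
        const double source_fps = 24.0;
        const double refresh[] = { 60.0, 120.0, 144.0, 240.0 };

        for (int i = 0; i < 4; i++) {
            double per_frame = refresh[i] / source_fps;
            printf("%3.0f Hz: %.2f refreshes/frame -> %s\n",
                   refresh[i], per_frame,
                   per_frame == (double)(int)per_frame ? "even pacing"
                                                       : "uneven (judder)");
        }
        return 0;
    }

On 60Hz you get the classic 3:2 cadence; on 120, 144, or 240 it divides evenly, which is one more reason not to drop to 60Hz just for cutscenes.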
But if you're actually talking about FPS drops, not the intended look, do you have G-Sync?
When I bought my new PC, I got an extremely expensive G-Sync display to go with my 1080 Ti (which at the time was the best card, since the 20 series wasn't out yet), yet every time a game went under 60 FPS (I mean 58-59 FPS) it was unplayable. I can handle 30 FPS without going on a rant; there are plenty of amazing PS4 games, and since TVs have interpolation and you sit further back, it doesn't matter as much there. I just assumed I had gotten so used to fast refresh rates that anything less than 72 was bad; even 60 FPS is pretty rough once you've had a long session at anything over 120. However, switching over to the PS4 and playing a 30 FPS title, it was still playable (although noticeable). Hell, I even switched to my 1080p 60Hz display, and no issues.
Still, that didn't explain why hitting 58-59 would be so DRASTICALLY different from 60, ESPECIALLY with G-Sync. I tried everything but couldn't work out why. Most people would just say "you're seeing things, you can't tell the difference between 58 and 60" (well, actually, you CAN, especially if you're used to good frame pacing), but I did find a few people who had similar issues and could never work out why.
One day I decided to update my BIOS/UEFI, and wow, that fixed everything. I had never seen G-Sync before getting my monitor, and afterwards I was wondering what the fuss was about, because it was awful. After the BIOS update, though, now I get why people love G-Sync. It took a year before I did that, because nothing pointed to it despite months of searching. I can still notice when it goes lower than 60, but that's just normal; it's hardly a problem, and because of G-Sync it still looks and feels good. It made a huge difference: instead of going from 60 to 59 and instantly becoming choppy, it now just does what G-Sync is supposed to do. I haven't used AMD in a long time, but perhaps it's the same for FreeSync.
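My best guess at why 58-59 was such a cliff: if G-Sync silently fails to engage (which is what I assume my old BIOS was causing), the display falls back to fixed-refresh vsync, and a frame that misses the ~16.7ms deadline gets held on screen a whole extra refresh. Roughly:

    #include <math.h>
    #include <stdio.h>

    /* Under fixed-refresh vsync, a frame stays on screen for a whole number
       of refresh intervals; with working G-Sync it would stay ~render_ms.
       The fallback-to-vsync explanation is my assumption about the cause. */
    int main(void)
    {
        const double period_ms = 1000.0 / 60.0;                 /* 60 Hz */
        const double render_ms[] = { 16.0, 16.6, 17.0, 17.2 };  /* ~62..58 fps */

        for (int i = 0; i < 4; i++) {
            double shown = ceil(render_ms[i] / period_ms) * period_ms;
            printf("render %.1f ms -> on screen %.1f ms (%.0f fps effective)\n",
                   render_ms[i], shown, 1000.0 / shown);
        }
        return 0;
    }

So without working VRR, "one frame late" means an instant 60-to-30 stutter, which matches how drastic it felt.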
TLDR:
Update your BIOS; it might fix it. My 1080 Ti was a Strix OC edition, so it's possible whatever motherboard you have isn't working with FreeSync/G-Sync correctly.
You really shouldn't have to switch to 60Hz; it's not only a hassle, it defeats the point of a fast refresh rate!
This helped me get to the bottom of my issue! Catherine doesn't like the dedicated card and only runs the cutscenes smoothly when I use the integrated card.
I have an RTX 3090 and I'm running a 240Hz monitor.