The Evil Within games used a modified version of id Tech 5. The first game was closest to the base engine, albeit modified with a new renderer featuring dynamic lighting and tessellation, among other things. The Evil Within 2 continued this offshoot development to the point where it is treated as a custom engine, possibly referred to as the "STEM" engine.
Ghostwire: Tokyo represents a move to Unreal Engine 4 for Tango Gameworks, and it is the first major project the studio has made with it. Specifically, it uses Unreal Engine 4 version 4.27.2, which can be confirmed by right-clicking Steam\steamapps\common\Ghostwire Tokyo\GWT.exe, opening Properties, and checking the Details tab. The actual game process, located in Ghostwire Tokyo\Snowfall\Binaries\Win64, uses a custom version scheme, however.
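If you would rather not click through Explorer, a quick PowerShell check reads the same Details-tab fields. Adjust the path to wherever your Steam library lives; the one below assumes the default location:

# Reads the version metadata shown in Properties > Details.
(Get-Item 'C:\Program Files (x86)\Steam\steamapps\common\Ghostwire Tokyo\GWT.exe').VersionInfo |
    Format-List FileVersion, ProductVersion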
I tried setting max FPS to 30 and even turning on v-sync in-game, by the way. It didn't help with the cutscene issue. Only forcing DX11 gets rid of the cutscene desync, but it tanks the in-game FPS too much for comfort.
Edit: Setting max FPS to 30 and enabling v-sync in the Nvidia Control Panel works.
This post really deserves more visibility. I pre-ordered this game because I loved The Evil Within 1-2. I was curious to see where this company was headed with a title that departed from the survival-horror genre into action-adventure.
I struggled so much trying to get videos to play normally. The Windows Store VP9 codec simply did not work, as I kept getting no response when I clicked Install. I then tried to install the VP9 codec manually with PowerShell. I spent 8 hours troubleshooting via this method. Again, no results.
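For reference, the manual route generally amounts to a single PowerShell command along these lines, run against whatever .appx bundle you downloaded (the filename below is a placeholder, not an exact file I can vouch for):

# Run from an elevated PowerShell prompt after downloading the package;
# the filename is a placeholder for whatever .appx bundle you obtained.
Add-AppxPackage -Path '.\Microsoft.VP9VideoExtensions.appx'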
No other game has given me this much frustration in trying to install a video codec. It's absurd. I'm perfectly content updating my drivers, using DDU to perform a clean install when needed, scanning for viruses and malware, or keeping Windows up to date. But it's absurd to require the end user to install an operating-system video codec just so that cutscenes work normally. As Bioxid wonderfully states, this really should all work seamlessly out of the box.
I, too, refunded. With a heavy heart, might I add, as I really wanted to play this game.
I hope a developer reads this and treats implementing a proper fallback video player with some semblance of urgency. Not everyone has this codec, and it's a colossal pain in the rear to install.
For me, the game running at all (unless I happened to set things too high and induce a crash that way) was an achievement in itself compared to other titles. For the rest of you, I do not know why things are the way they are, but I wonder what else could be the cause.
As some suggestions for possible external factors in some cases: running a version of Windows 10 older than what the game targeted (seriously, manually update Windows 10 under Settings > Update & Security if need be; see the build check below), having deleted the game's Movies folder to skip the startup screens (which may break UE4 games like this one), some other bad tweak done in the past to save time or squeeze out more performance, or even something that seems fine on the system but is in fact a sign of corruption. That, or you can spam Tango's Twitter page about it until the entity cracks under pressure and attempts to fix it.
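If you are unsure what build you are on, something like this in PowerShell will tell you without digging through Settings:

# Shows the Windows version and build so you can compare against what the game targets.
Get-CimInstance Win32_OperatingSystem | Format-List Caption, Version, BuildNumber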
Oh, and about the Microsoft Store VP9 Codec Extensions: while having these successfully installed may be of some use, they do not seem to do anything for this game. At best, they probably apply to the Movies & TV app only. Nice to have, but that is it. No beneficial effect on this game has been reported in this thread.
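If you want to verify whether the extensions are even installed, a wildcard query like this should list them; an empty result means they are not there (the name pattern is my guess at matching the package name):

# Lists any installed VP9 extension packages; no output means not installed.
Get-AppxPackage -Name '*VP9*' | Select-Object Name, Version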
It was most notable in the Chapter 3 cutscene in KK's safehouse, where Rinko is first introduced and tells Akito that there is still a way to find KK, after plot events saw Akito and KK experience Severance for the first time. Witnessing the cutscene desynchronization there led to a realization about it.
It seemed to coincide with camera shot transitions, with textures being streamed in whenever the camera cut to a different shot in the scene.
On an HDD from about 3 years ago, the drive is simply too slow, even with disk write caching, to both stream assets into memory and queue up the audio, leading to a hitch whenever the camera cuts to a different angle. Should the hitch be drastic enough, the audio and facial animations fall out of sync for a noticeable number of milliseconds. I could tell this because the model and scene textures were popping in before my very eyes.
This brings me to another point. Like Death Stranding, most of Ghostwire: Tokyo's cutscenes are not pre-rendered. Instead, most of them use real-time, in-level rendering to serve as gameplay transitions, with only certain cutscenes—the initial opening one, for instance—possibly receiving the truly pre-rendered treatment if it is deemed necessary.
This blending of pre-rendered and real-time, in-level rendered cutscenes is something games using the DirectX 12 and Vulkan APIs seem to do a lot, owing to being able to afford it in the rendering pipeline. This is probably why the system requirements mandate a 6GB GPU and an SSD: to enable this to (hopefully) work smoothly. For users where it does not work smoothly, figuring out why is necessary, short of just hoping for a new patch.
———
TL;DR: real-time, in-level rendered cinematics hitch while streaming in assets and textures on every single camera shot switch, which leads to audio desync.
Are the cutscenes fixed in dx12?
What was done was to expose, as a manual option, behavior that was previously automatic, in the form of the Movie Display Mode toggle.
The game uses two different rendering resolutions for cinematics/cutscenes: 1080p (Full HD) and 2160p (4K). Prior to Patch 1.003, which one was picked depended on the Display Resolution setting: at 1080p or below, cutscenes rendered at 1080p; above 1080p, they rendered at 2160p.
This caused an issue at the 1440p middle ground and in similar cases: a system and GPU that could handle 1440p, but could not hope to truly handle 2160p, would struggle to render the 2160p cinematics, hitch, and desynchronize the audio with each camera shot transition.
Now, the ability to choose the render resolution for cutscenes is exposed through the Movie Display Mode setting: Quality Mode follows the original automatic behavior, whereas Performance Mode forces 1080p rendering for cinematics no matter what Display Resolution is set.
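To make the old automatic behavior concrete, here is a rough sketch of the selection rule in PowerShell. The function and parameter names are mine and purely illustrative; the game obviously does not expose anything like this:

# Illustrative only: approximates the cutscene resolution rule described above.
function Get-CutsceneRenderHeight {
    param(
        [int]$DisplayHeight,                    # vertical display resolution, e.g. 1440
        [string]$MovieDisplayMode = 'Quality'   # 'Quality' or 'Performance' (post-1.003)
    )
    # Performance Mode always forces 1080p cinematics.
    if ($MovieDisplayMode -eq 'Performance') { return 1080 }
    # Quality Mode keeps the pre-patch automatic behavior.
    if ($DisplayHeight -le 1080) { return 1080 } else { return 2160 }
}

Get-CutsceneRenderHeight -DisplayHeight 1440                                  # 2160 (the problem case)
Get-CutsceneRenderHeight -DisplayHeight 1440 -MovieDisplayMode 'Performance'  # 1080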
So I can play in 1440 and watch the cutscenes in 1080? That... that will work for me. Thanks for the info. I am finally going to play the game now!
The other known cause I could find prior to the new update was installing the game to an HDD. This game is not meant for those, and it has a habit of noticeably hitching when streaming in assets or when turning the camera as a result. In fact, such hitches during camera shot transitions in cutscenes (see the first Rinko cutscene in Chapter 3) would cause the reported desynchronization in my testing.
If you like, re-test with Quality Mode to see if the typical cutscene rendering issue comes up again, or if the game respects your Display Resolution setting as it should now.