This game uses 100 GB of installed data, and I'm not sure why it uses so much CPU power to decompress that data during game loading.
Definitely possible.
I'm having more GPU issues than CPU issues in those areas. I'm not claiming it's a super GPU... I keep the CPU Frame Rate and GPU Frame Rate counters active, so when it hits 30 FPS I can see what to tune. That's why I wanted to lower my settings.
Lowering graphics settings just seems to exacerbate the issue. Like VRAM isn't getting flushed.
Could be asset streaming too. I wonder if asset streaming is saturating the GPU pipeline, causing the GPU frame rate to drop?
VRAM microstutter... not what I'm experiencing. This isn't stutter; this is a 99% average of 30 FPS.
A memory leak is a suspect. I don't remember ever seeing the game max out 32 GB, but it may have an internal DRAM cap?
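If anyone wants to test the leak theory themselves, here's a rough way to log the game's RAM use over time. This is only a minimal sketch assuming the psutil package is installed; the process name used here is a placeholder, swap in whatever Task Manager actually shows for the game.

```python
# leak_watch.py - log a process's RAM use once a minute (rough sketch, not official tooling)
# Assumes: pip install psutil. PROCESS_NAME is hypothetical; check Task Manager for the real one.
import time
import psutil

PROCESS_NAME = "tlou-i.exe"  # placeholder executable name, replace with the actual one

def find_game():
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == PROCESS_NAME:
            return proc
    return None

if __name__ == "__main__":
    game = find_game()
    if game is None:
        print("Game process not found.")
    else:
        while game.is_running():
            rss_gb = game.memory_info().rss / (1024 ** 3)  # resident RAM in GB
            print(f"{time.strftime('%H:%M:%S')}  {rss_gb:.2f} GB")
            time.sleep(60)  # if this number only ever climbs chapter after chapter, a leak is likely
```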
Do you need glasses? I've got 32 GB of RAM. I don't need a bigger page file; I just need to exit to the desktop and relaunch the game after each chapter.
Also, the game is just CPU-heavy to begin with. Lowering graphics settings increases the frame rate, which actually puts more workload on the CPU, and that's why the issue gets worse at lower graphical settings. You want to take as much of the workload off the CPU as possible: max out the graphics settings your GPU can handle within spec, then turn down the game settings that pertain specifically to the CPU. Higher resolutions can help a lot with CPU performance, because forcing the GPU to work harder and produce fewer frames leaves more headroom for the CPU to handle CPU-bound tasks.
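To put some rough numbers on that (back-of-the-envelope math, not anything measured in this game): the CPU has to prepare every frame the GPU draws, so the per-frame CPU budget shrinks as the frame rate rises.

```python
# Rough frame-time budget math: more FPS = less CPU time available per frame.
for fps in (30, 60, 90, 120):
    budget_ms = 1000 / fps  # milliseconds the CPU has to prepare each frame
    print(f"{fps:>3} FPS -> {budget_ms:5.1f} ms of CPU work per frame")
# 30 FPS -> 33.3 ms, 120 FPS -> 8.3 ms: lowering settings until the GPU stops being
# the limiter simply asks the CPU to do its per-frame work two to four times as often.
```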
Agreed, but the CPU just wasn't at 100%, so I'm investigating other possible causes. The OP had the same issue, so I'm attempting to collaborate.
Basic The Last of Us Optimization. I'm past that.
The OP and I both have PCI-Express 3.0 bus speeds. I'm presuming that my RX 6600 discrete GPU is limited to that PCI-Express 3.0 x8 bus.
Desktop-class PCI-Express 3.0 x16 buses, and PCI-Express 4.0 x8, may not typically be seeing this issue.
The OP has a 4090 on a CPU with a PCI-Express 3.0 x16 bus, which could mean Ultra assets are saturating that bus...?
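For anyone curious what those bus configurations actually work out to, here's the quick math on theoretical per-direction throughput (per-lane transfer rate times the 128b/130b line encoding, ignoring other protocol overhead):

```python
# Theoretical per-direction PCIe bandwidth in GB/s.
GEN_GTS = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers per second, per lane

def pcie_gbps(gen: str, lanes: int) -> float:
    return GEN_GTS[gen] * (128 / 130) / 8 * lanes  # GB/s, ignoring protocol overhead

for gen, lanes in [("3.0", 8), ("3.0", 16), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{pcie_gbps(gen, lanes):.1f} GB/s")
# PCIe 3.0 x8 : ~7.9 GB/s   (an x8 card like the RX 6600 on an older board)
# PCIe 3.0 x16: ~15.8 GB/s  (the OP's 4090 slot)
# PCIe 4.0 x8 : ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s  (what the 4090 is built for)
```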
PCIe Gen 3 x8 is also a potential issue here: with how the game engine is designed, the CPU may not be able to load everything up to the GPU in a timely manner. Keep in mind that, as a port, some things will be heavily limited in terms of optimization unless the devs rebuild the entire PC client to change how the game engine pipeline works. So some otherwise reasonable optimization efforts won't be easy to introduce to fix the performance issues at lower game settings on these particular hardware configurations. All the work a PC does preparing everything the GPU needs takes time, while the PS5 just doesn't have those latency bottlenecks because of the architectural design of the console hardware as a whole. Short of shrinking the actual asset and file sizes, I'm not sure what can be done to reduce the impact of these particular bottlenecks.
Realistically, it's suboptimal to put a PCIe Gen 4 device in a PC that ends up running it at a lower PCIe version. This was pretty common with a lot of 30-series GPUs when people first upgraded, and it didn't necessarily show any real performance difference because of the way game workloads were built until quite recently. So from a technical standpoint, if these are the challenges you have, you'll either have to figure out which settings to sacrifice in favor of performance, or eventually upgrade your parts or build a new PC where the hardware pairing is optimal. Unfortunately, most prebuilt systems from around the end of PCIe Gen 3 and the beginning of PCIe Gen 4 don't actually take this into consideration.
My FPS is fine at 1440p @ ultra, but it pushes my 5950X and 3090 way too hard for the level of visual quality. The game lacks a crisp look, and on top of that the way it handles textures is really bad; I notice some odd texture flickering going on. It's on a 980 Pro NVMe and I have 64 GB of RAM.
On top of that, the way the camera moves with the mouse makes the game feel laggier than it actually is. Really strange. Glad I didn't pay for it ♥♥♥♥♥
Yeah, that is where my gut is sitting at the moment.
You technically have better hardware than I do, yet you're not getting as good performance as I am (I'm playing at 4K myself with really great frame rates and frame times: 5800X, 48 GB DDR4-3200 RAM, 12 GB 3060, WD Blue NVMe SSD storage). It sounds like there's an issue causing a significant slowdown in your performance, and it might actually be due to the SSD you have.
I'm also not denying that the game has optimization work left that could improve performance for many users. It's just that, in your specific situation, you might have the firmware bug with your SSD, or you might have the NVIDIA driver bug that is messing with how the GPU renders the game (which can also get bad enough to make the game crash randomly).
There are a lot of people still using a fair number of older Intel CPUs that are like that, with a single thread for each core, and those CPUs show a dramatic enough drop-off in performance compared to multithreaded CPUs, meaning CPUs with two threads for each core.
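If you're not sure which camp your own CPU falls into, one quick way to check is to compare physical cores to logical threads. A small sketch, again assuming the psutil package is installed:

```python
# Quick check: does this CPU expose one or two hardware threads per core?
import psutil

physical = psutil.cpu_count(logical=False)  # real cores
logical = psutil.cpu_count(logical=True)    # hardware threads the OS schedules on

print(f"{physical} cores / {logical} threads")
if physical and logical and logical > physical:
    print("SMT / Hyper-Threading is active (two threads per core).")
else:
    print("One thread per core - the older-CPU case described above.")
```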
There should be something else going on. I have a weaker CPU, a 5900X, and its usage stays under 5%. The thing is, I'm playing at 4K native, which pushes the GPU to its limits, so if there is a bottleneck it's the GPU, not the CPU.
About the "crisp look", did you remember to disable DOF in the settings?
BTW, I haven't seen a single instance of texture flickering so far (I'm in the middle of Bill's Town).
I'm not defending the game just because I don't have major issues; I guess it's a crappy port if so many people are complaining.
You can understand the confusion we both had. "Single-threaded CPU" sounds like a Pentium III.
They have confirmed potential memory leaks; the game is running on a DX12 wrapper rather than natively on DX12 (it's emulated DX12), and it has huge CPU overhead.
Hotfix coming tomorrow with a larger patch dropping afterwards.
This is something the OP cannot fix himself no matter what graphics options he changes. It can only be fixed via patching to make the game run natively on DX12, which means rebuilding the entire render pipeline, plus better memory management.
Stop making a fool of yourself; this is pathetic. Eight threads from 8 cores is much better than 8 threads from 4 cores (your understanding of multi-threading).
A few posts up you'll see a guy who has a CPU with 16 cores / 32 threads (a 5950X), and he is complaining... Do you think the PS4 and PS5 have better CPUs?