I would agree with you, but there is this:
https://www.techpowerup.com/img/sQfiw9fBdD0nVGSH.jpg
Recommended: 3060, Medium preset.
High: 3070, High preset.
While there is a 3060 with 12 GB VRAM, the 3070 only has 8 GB VRAM. :(
So if they advertise this, it means the game should work correctly on the High preset on a 3070 with 8 GB of VRAM.
The fact that performance degrades over time, or on fast travel / cutscenes, means there is a bug. Going from 60 FPS in one frame to 15 in the next when a cutscene starts is not what they advertised...
In any case, I am not sure why I even posted here. I used a customer support feedback ticket to explain the situation to the developer, with a few screenshots as proof, as well as a savegame and steps to reproduce the issue.
I took 3 screenshots, 30 minutes apart. The usage goes up 500 MB per 30 minutes and stops at 10.2 GB.
It stays at 10.2 GB VRAM no matter how long I play afterwards, what I do, where I go, or how many times I fast travel.
Maybe this helps with further investigation.
I have 24 GB VRAM, but I doubt that has anything to do with it.
Good luck.
https://imgur.com/a/2F8QFgi
It's RivaTuner, and thanks, I know what I'm doing.
I don't know what you're doing, but you're not measuring VRAM usage :)
Dude, just stop embarrassing yourself any further.
Thank you for the image. I will see if I can update my ticket as well.
In the meantime, I believe I will play the game on my desktop PC, where I also have an RTX 4090/13900KF/64 GB/NVMe, a 32:9 OLED, etc., and all is good.
(I was actually looking forward to grabbing my desktop-replacement "laptop" for the weekend and getting some gameplay going in the evenings ^_^. Guess I will stick with the Steam Deck OLED, where performance is around 30 FPS max, but the unified memory architecture there does not show this behaviour... if only the PC also used UMA ^_^)
You're not measuring VRAM usage and you never have been.
I don't understand the hostility, I'm just trying to correct your misconception that MSI Afterburner measures VRAM. Even "Dedicated GPU Memory Usage" is not VRAM.
Because I can't stand it when people who don't know what they are talking about act like you do. Block it is, then. Fine by me.
Yes, I am saying every GPU with 8 GB of VRAM is trash. And if you knew anything about architecture, you would know that swapping to physical memory isn't the issue per se; the bus connection is the underlying cause. Physical memory isn't THAT much slower than VRAM; it's locality. The VRAM swap happens on a per-frame basis, leading to bus overload during the actual swap and heavily degrading performance in this case. Maybe you shouldn't assume what people know, but please, if you know so much about architecture, make a suggestion about how the developers could better cater to your 6-year-old graphics card.
His lack of understanding comes from having a 24 GB GPU; the game never gets pressured into putting anything into shared memory. On 8-10 GB cards, the game easily gets pressured into spilling into shared memory. It is by design, but a bad design nonetheless.
https://www.gamedeveloper.com/marketing/gdc-how-marvel-s-spider-man-weaved-an-optimized-web-for-pc-gamers
"Roza confirmed that Nixxes built an entire system to accommodate players who ignore Spider-Man's recommendations for graphics cards (GPUs)—specifically, anyone who plays the game with settings that exceed the amount of video memory (VRAM) available in their GPU. Though Windows includes a memory management system that can shift memory-related calls from VRAM to general-purpose system RAM, this is unoptimized by default, and video games need quicker memory for specific tasks.
Nixxes' solution was to create its own non-unified memory management system, which ranked and tracked a hierarchy of calls' importance in the code, then measured them by "distance to 512 MB" in order to prioritize which large chunks of memory would make the most sense in a shift from VRAM to system RAM. So if you've ever decided to push your Spider-Man gameplay on a GPU that wasn't quite up to the task, you can thank Nixxes for implementing an invisible reduced-stutter option."
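The quoted description can be sketched roughly like this. This is purely an illustrative reconstruction, not Nixxes' actual code: all class and function names here are made up, and the two-key ranking (importance, then "distance to 512 MB") is just one plausible reading of the article.

```python
# Hypothetical sketch of the eviction scheme described above: rank resident
# allocations by importance, then prefer chunks whose size is closest to
# 512 MB when choosing what to demote from VRAM to system RAM.
# All names are illustrative; this is NOT Nixxes' actual implementation.

MB = 1024 * 1024
TARGET_CHUNK = 512 * MB

class Allocation:
    def __init__(self, name, size_bytes, importance):
        self.name = name
        self.size = size_bytes
        self.importance = importance  # lower = safer to demote

def pick_demotion_candidates(allocations, bytes_needed):
    # Sort by importance first, then by |size - 512 MB|, so low-priority,
    # "conveniently sized" chunks are moved out in as few transfers as possible.
    ranked = sorted(
        allocations,
        key=lambda a: (a.importance, abs(a.size - TARGET_CHUNK)),
    )
    demoted, freed = [], 0
    for alloc in ranked:
        if freed >= bytes_needed:
            break
        demoted.append(alloc.name)
        freed += alloc.size
    return demoted, freed

allocs = [
    Allocation("shadow_atlas",  256 * MB, importance=2),
    Allocation("streamed_mips", 512 * MB, importance=1),
    Allocation("frame_targets", 128 * MB, importance=9),  # effectively pinned
]
names, freed = pick_demotion_candidates(allocs, 600 * MB)
print(names, freed // MB)  # streamed_mips goes first: low importance, ~512 MB
```

The point of the size ranking is bus efficiency: demoting one 512 MB chunk costs one large sequential transfer instead of many small ones scattered across frames.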
The problem here is that people are having these issues with the recommended 3070 at settings lower than what they suggest. I really would like to see what kind of magical 3070 Nixxes has.
I also believe there's no leak, though. Almost every game in existence uses a bit more VRAM over time and stops at some point. It is just how most engines/games operate. Even the 6 GB 1060 had enough VRAM surplus to accommodate such behavior back in 2016. 8 GB in 2024 is just on the border of barely being enough. I'm not surprised problems keep happening.
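That grow-then-plateau curve is what a bounded streaming cache produces by design. A minimal sketch, assuming a simple LRU policy and a fixed budget (the class, names, and numbers are all hypothetical, not from any particular engine):

```python
# Toy model of the plateau behaviour: a streaming cache grows as new assets
# are touched, then holds steady at its budget by evicting the least
# recently used entry. Illustrative only; real engines are far more complex.
from collections import OrderedDict

MB = 1024 * 1024

class StreamingCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.entries = OrderedDict()  # asset id -> size in bytes
        self.used = 0

    def touch(self, asset_id, size):
        if asset_id in self.entries:
            self.entries.move_to_end(asset_id)  # mark as recently used
            return
        while self.used + size > self.budget and self.entries:
            _, evicted_size = self.entries.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.entries[asset_id] = size
        self.used += size

cache = StreamingCache(budget_bytes=10 * 1024 * MB)  # ~10 GB budget
for i in range(200):  # keep streaming distinct 100 MB assets
    cache.touch(f"texture_{i}", 100 * MB)
print(cache.used // MB)  # climbs, then sits at the budget ceiling forever
```

Usage keeps climbing while the world is being explored, then flatlines once the budget is hit, which matches the "500 MB per 30 minutes, stops at 10.2 GB and never moves again" observation above.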
I did for the Burning Shores DLC (on PS5). It was amazzzzingggg. The Vangelis-esque sci-fi music definitely made me peak!
You are actually correct!
Measuring the PCI-E bus utilization via GPU-Z, I can see that over time it skyrockets to 80-90% usage. It starts at 40%.
So yes, You are correct.
Sure, 8 GB VRAM cards are trash... the 3070 and the 4070 are not 6-year-old cards, though...
And the 3070 is the recommended card for the recommended Medium preset, with "only" 8 GB VRAM.
I guess Nixxes was telling us to use a TRASH GPU then for recommended settings :)
You see what I am getting at, right?
As for suggestions... you know that textures have mips, right? Powers of 2... use those if you don't have enough memory to allocate mip 0, or at least make the allocation less aggressive, so you don't end up in a scenario where you need to dump a lot of memory over the PCI-E bus or load from RAM...
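The mip suggestion above is easy to put numbers on: each successive mip level has a quarter the pixels of the one before it, so dropping mip 0 cuts a texture's footprint by roughly 75%. A small sketch for an uncompressed 4096x4096 RGBA8 texture (the helper function is mine, just for illustration):

```python
# Memory cost of a full mip chain vs. one with the top level(s) skipped.
# Assumes 4 bytes per pixel (RGBA8, uncompressed) and square power-of-2 sizes.

def mip_chain_bytes(width, height, bytes_per_pixel=4, skip_top_mips=0):
    total = 0
    level = 0
    w, h = width, height
    while True:
        if level >= skip_top_mips:  # optionally omit the largest level(s)
            total += w * h * bytes_per_pixel
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
        level += 1
    return total

full = mip_chain_bytes(4096, 4096)                      # all 13 levels
trimmed = mip_chain_bytes(4096, 4096, skip_top_mips=1)  # drop mip 0
print(full // (1024 * 1024), trimmed // (1024 * 1024))  # ~85 MiB vs ~21 MiB
```

This is why texture streamers drop the top mip under memory pressure first: one skipped level frees three quarters of the texture's budget while every smaller mip stays resident.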
It's quite ironic, since Ratchet and Clank has the exact same issue, and it was ported by Nixxes last year (different engine). So I guess the issue comes from how they translated the UMA architecture of the PS5 to the PC, where you have separate VRAM and RAM. Also, the Steam Deck doesn't exhibit this problem, as it has UMA...
*UMA = Unified Memory Architecture