https://www.youtube.com/watch?v=11VTtIwboe8
And if you want further proof, just read the damn patch notes:
Put that in your pipe and smoke it.
Honestly, dude, I'd just ignore that idiot. He sounds like the kind of person who would defend TLOU's absolutely atrocious VRAM and texture issues, and justify them taking up an insane amount of VRAM while looking worse than ♥♥♥♥♥♥♥. Luckily, braindead buffoons like him are the minority, and Naughty Dog fixed that game's VRAM consumption issues. This game needs the exact same treatment, because according to Nixxes an RTX 3070 can run this game at high settings, 1440p, with ray tracing enabled, at 60 FPS, which implies 8 GB is enough. They made the port, so they know it better than anyone; if their recommended hardware is underperforming despite what they listed, that's on THEM. They need to optimize VRAM consumption, and that's not up for debate.
That is just a wild guess, though. Based on my own experience of the game on a 16 GB RTX 4080 at the highest possible settings, including RT set to 10, VRAM usage is around 11-12 GB on average; I saw it hit 13 GB at one point. I'm playing at 1440p with DLSS Quality, though, not 4K.
No.
I see a website that claims RC: RA runs excellently with good VRAM management.
Someone else sees a video that claims issues, and that makes me an idiot. No it doesn't.
You clearly did not look at the source I quoted.
https://www.techpowerup.com/review/ratchet-clank-rift-apart-benchmark-test-performance-analysis/6.html
They say, quote,
"What will be challenging for older hardware is the VRAM requirements which are pretty high. Even at the lowest setting, with RT off, the game allocates around 8 GB VRAM. Our performance benchmarks clearly show 8 GB cards at a disadvantage, but surprisingly the game still runs at smooth FPS, without stutter, just lower FPS than you'd expect from a given card. For example, the RTX 4060 Ti 8 GB gets 75 FPS at highest setting, usually you'd expect RTX 2080 Ti 11 GB to run at roughly the same FPS, but here it gets 102 FPS, a 33% difference. The game is quite smart about the way it allocates VRAM, on cards with smaller VRAM sizes (which it detects correctly) it will preload fewer assets and generally do the right thing, to ensure things run without stuttering. Maybe Direct Storage helps in this case, because it can load assets from disk faster and with less latency."
Admittedly, I did not see the Digital Foundry breakdown video. That was deliberate, though; I'm avoiding any YouTube videos on this game because I don't want to see too much of it before I play it.
I take it DF got a pre-release version, unless they managed to play the game and make that entire video within three days.
As for Terepin's quote about patch notes, that said, "Resolved texture streaming issues that could result in certain textures remaining low resolution."
It doesn't mean everyone got it.
You can't expect a game to come out perfect for every PC config. It clearly depends on what PC is being used, as shown by Techpowerup vs Digital Foundry.
However, it's a damn sight better than almost every other game that has been ported to PC, with fixes coming fast.
... And no, I would not defend TLoU.
Who's the braindead buffoon now? Not me.
I'm out. Unsubbing another trolling thread. Waste of my time.
This is why I bought a 7900 XTX after owning a 3080. I wasn't giving Nvidia any more money after they gimped my last card on VRAM, because that's exactly why they did it. Even the 1080 Ti I had before it, back in 2017, had more VRAM. No regrets with the AMD card, and it smashes anything at 3440x1440.
The 4070 and 4070 Ti are going to age badly as well, and even the 4080 will at 4K; I have seen many games use more than 16 GB of VRAM at 4K. Even R&C is using 13 GB at 3440x1440 on max settings with no RT. I'm not sure about 4K, as my son is borrowing my long HDMI 2.1 cable, but I can't wait to see it run on an OLED TV at native resolution!
https://www.youtube.com/watch?v=_-j1vdMV1Cc&t=12s
There is a general misconception on the internet lately:
- If your GPU has 24 GB of VRAM, it will allocate and load assets into those 24 GB.
- If your GPU has 4 GB of VRAM, it will allocate and load assets into those 4 GB.
This does NOT mean the game REQUIRES 24 GB or 4 GB of VRAM to render a particular scene.
BUT having things in VRAM "ready to go" is faster and easier on the whole system. For example, if you only have 4 GB, the engine will have to allocate/deallocate a lot of resources on the fly and load from disk to RAM to VRAM (via the CPU, or now via DirectStorage). If the RAM is fast enough that's not an issue, but the traffic on the PCIe bus increases a lot, so if you aren't running PCIe 4.0 x16 you will get slow-downs (aka low FPS).
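To put rough numbers on the bus-traffic point, here's a quick back-of-the-envelope calc. The bandwidth figures are theoretical peaks (real throughput is lower), and the 2 GB asset burst is just an illustrative assumption, not a measured figure from this game:

```python
# Theoretical peak bandwidths per PCIe link, in GB/s (real-world is lower).
PCIE_BANDWIDTH_GBPS = {
    "PCIe 3.0 x16": 16.0,
    "PCIe 4.0 x8": 16.0,
    "PCIe 4.0 x16": 32.0,
}

def stream_time_ms(asset_gb: float, link: str) -> float:
    """Time to move `asset_gb` of asset data across the given PCIe link."""
    return asset_gb / PCIE_BANDWIDTH_GBPS[link] * 1000.0

if __name__ == "__main__":
    # Hypothetical 2 GB burst of textures streamed in during a rift jump.
    for link in PCIE_BANDWIDTH_GBPS:
        print(f"{link}: {stream_time_ms(2.0, link):.1f} ms for a 2 GB burst")
```

At 60 FPS a frame budget is about 16.7 ms, so even at peak bandwidth a multi-gigabyte burst spans several frames; on a narrower link it spans twice as many, which is where the slow-downs come from.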
The misconception is this: allocated VRAM is NOT the same as required VRAM. Hell, grab 64 GB of RAM and you can load the whole game into RAM. Grab a Quadro GPU and you can load all the assets into VRAM and remove your disk at that point :))
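The allocated-vs-required distinction can be sketched as a toy LRU asset cache. This is not how any real engine is implemented; the budgets and asset sizes are made-up numbers purely to show that a big card fills its VRAM as a cache while a small card renders the same frames, just with more streaming:

```python
from collections import OrderedDict

class VRAMCache:
    """Toy asset cache: the engine fills whatever VRAM budget it detects,
    but only a small working set is required to render any one frame."""

    def __init__(self, budget_gb: float):
        self.budget = budget_gb
        self.resident = OrderedDict()  # asset name -> size in GB, LRU order

    def used(self) -> float:
        return sum(self.resident.values())

    def request(self, asset: str, size_gb: float) -> bool:
        """Return True on a cache hit, False if the asset had to be streamed."""
        if asset in self.resident:
            self.resident.move_to_end(asset)  # mark as most recently used
            return True
        # Evict least-recently-used assets until the new one fits.
        while self.resident and self.used() + size_gb > self.budget:
            self.resident.popitem(last=False)
        self.resident[asset] = size_gb
        return False

if __name__ == "__main__":
    # Hypothetical frame sequence bouncing between two 3 GB scenes.
    frames = ["city", "rift", "city", "rift", "city"]
    for budget in (4.0, 24.0):
        cache = VRAMCache(budget)
        misses = sum(not cache.request(scene, 3.0) for scene in frames)
        print(f"{budget} GB card: {misses} streaming events, "
              f"{cache.used()} GB resident")
```

Both "cards" render every frame; the 24 GB one just keeps both scenes resident after the first pass, while the 4 GB one re-streams on every scene change. Measuring the resident number and calling it the game's requirement is exactly the misconception.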
Wow. I never thought about that. Neither did the 3,000 other people here.