Real PC gamers don't own computers to play games at 60 FPS; that's what consoles are made for.
Insert "This is fine, fire GIF here"...
https://i.giphy.com/media/QMHoU66sBXqqLqYvGO/giphy.webp
We've had these stupid problems ever since RTX cards came out.
So while, yes, nVidia's bad drivers are to blame for the DLSS crashing issue, there's also the stuttering, which is a completely separate hardware-level issue, because nVidia's VRAM only counts to potato. These cards have literally half the VRAM of comparable AMD cards, and that permanently cripples performance. You may be able to tame the crashes, but you can never get rid of the stuttering no matter how many optimizations and patches this game gets, because you're always going to be stuck running a game made for 16 GB of VRAM on a potato RTX 3070.

I know some people got hoodwinked into thinking the card was bright and shiny and powerful because it was new and expensive, but that's the same thing that happens to boomers buying MacBooks: they get conned because they don't know any better. Now you do know better. nVidia intentionally makes their products age out and get busted within 2 years so they can force customers to buy a replacement every 2 years, unless they bought an 80 Ti. AMD Radeon doesn't do this. That's one of the biggest reasons so many people, especially lately, switch to AMD: they know they can sort of futureproof with an AMD card, while their nVidia card is going to become a stuttering mess within a few years because it hasn't got enough VRAM despite costing ridiculously too much money.

Why can nVidia get away with this? By forcing "brand awareness" through mass marketing and by signing contracts with the major system integrators to ram their cards into every prebuilt and laptop, so gamers think of nVidia as the default no matter how bad the drivers are or how badly the hardware performs.
If I set additional details to ultra I get crashes, so just don't max out every setting. :)
NVIDIA has acknowledged that their latest drivers have a known bug causing crashes in The Last of Us, Forza, and other games.
It doesn't matter how long a game company had a bugged driver to test with. If the driver is bugged (which NVIDIA themselves confirm), it is not up to the game company to cobble together some BS Rube Goldbergian workaround for the video card company's f-up.
That's like a ladder company selling a bunch of ladders with defective rungs that break and fall off when you step on them, then blaming the construction workers for falling and breaking their necks because THEY had those faulty ladders for weeks and THEY could have added extra bolts and fixed the rungs themselves. It's bananas, man.
The faulty drivers need to be fixed by the faulty driver company. End of story.
What is it with all this NVIDIA apologism anyway? This company charges scalper prices for cards, doubling the price over previous generations that cost basically the same to make, and people bend over backwards to defend THEM?
Bro. Just no.
The problem running this game is that people with 8 GB potato cards like the 3060 Ti and 3070 are running way past their VRAM limit at higher settings and/or higher resolutions, and that's what causes these massive frametime spikes all over the place. Do note: this is separate from the DLSS crashing, as far as I know. It's a combination of nVidia's bad drivers with their crippled VRAM.

This was a game designed to run on an octa-core, 16 GB VRAM, high-RAM, AMD/Radeon-based console. This is going to be a problem 3070 owners keep seeing, because your 8 GB of VRAM can't be fixed; all you can do is turn settings down to high or medium. It's also going to be a problem for most RTX 4000 owners, since AMD hardware is in the Xbox and PS5, so AAA titles are easier to optimize for AMD hardware.

These problems were obvious in 2020 with Ampere. The problems with Lovelace are obvious right now too, like power draw, size, bursting into flames, etc. We'll see how bad the RTX 4000 drivers are, but the 4060 is probably dead in the water for not having enough VRAM.
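If you want to check whether you're actually hitting the VRAM ceiling (as opposed to just the DLSS/driver crash), here's a minimal sketch that polls VRAM usage while the game runs. It assumes you have Python plus NVIDIA's NVML bindings installed (the pynvml / nvidia-ml-py package); MSI Afterburner or the Task Manager GPU tab will show the same numbers if you'd rather not run a script. The 2-second interval and the 90% "close to the limit" threshold are just numbers I picked, nothing official:

[code]
# Minimal VRAM monitor sketch using NVIDIA's NVML Python bindings (pynvml / nvidia-ml-py).
# Run it in a second window while the game is running and watch whether "used"
# creeps up toward the card's total -- that's usually when the stutter starts.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)         # first GPU in the system
name = pynvml.nvmlDeviceGetName(handle)
driver = pynvml.nvmlSystemGetDriverVersion()
# older pynvml versions return bytes instead of str
name = name.decode() if isinstance(name, bytes) else name
driver = driver.decode() if isinstance(driver, bytes) else driver
print(f"GPU: {name}, driver: {driver}")

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values are in bytes
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        flag = "  <-- close to the limit" if used_gb > 0.9 * total_gb else ""
        print(f"VRAM: {used_gb:5.2f} / {total_gb:5.2f} GiB{flag}")
        time.sleep(2)                                  # arbitrary 2 s polling interval
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/code]

Note that NVML reports VRAM used by everything on the GPU, not just the game, which is actually what you want here: what matters is whether the whole pool is exhausted, because that's when the driver starts shuffling resources over PCIe and the frametimes go to pieces.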
Yeah, until now, when you finally noticed. Like I and others have said, nVidia deliberately designs their cards with exactly the gimped amount of VRAM needed to still perform decently for about two years, which is just long enough to start breaking down around the next release. They design this ♥♥♥♥ on purpose to have the bare minimum that avoids these problems for two years, for the most part. And even then, lots of us warned you that Doom Eternal, back in 2020, was already sucking up 11 GB worth of VRAM and that 8-10 GB wasn't going to be enough. Now it isn't.

This may be the first game where you've experienced this, but it is only the first. No doubt there is some other game in your backlog you haven't even tried yet that is going to have these problems. It will crash and stutter, and it will be a miserable experience on ultra. Your solution is going to be to turn down all those quality settings and content yourself with a 3070 or 3080 only being able to do medium to high, unless you want the stutters back.

This is why so many of us were in an uproar about nVidia's practices: we knew full well it wasn't enough. Imagine designing a plane with just enough fuel and thrust to take customers' money at boarding and then barely enough fuel to crash-land at the next airport. That is what they're doing. And it is going to keep happening to you as you discover more and more new games that need above 8-10 GB at your resolution.

This was always the problem with these cards, and some were worse than others: the 3 GB 1060, the 4 GB 6500 XT, the 2 GB GTX 770 and so on. IIRC the HD 7970 was only a 3 GB card too, wasn't it? So some cards might not be weak and might have the raw performance, but even if they never hit driver problems like API version support, they age far more poorly and rapidly if they don't have the VRAM. The 3070 didn't. The 3060 Ti and 3080 did not have enough VRAM to maintain their resolutions. The same is going to be true for the 4050 and 4060, which won't have the VRAM to keep up. This is why we thought it was nuts that the 3080 only had 10 GB, because it clearly needed at least 12 GB to keep going. It's also why people saw the 10 GB version as considerably less valuable.
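To put some very rough numbers on why 8 GB gets tight, here's a back-of-the-envelope sketch. Every figure in it (number of G-buffer targets, shadow map sizes, texture pool size and so on) is my own illustrative assumption, not data from this or any specific game; the point is only how the arithmetic stacks up at 4K:

[code]
# Back-of-the-envelope VRAM budget. Every number below is an assumption for
# illustration only, not a figure taken from any particular game.

def render_targets_mib(width, height, bytes_per_pixel, count=1):
    """Memory for `count` full-resolution render targets, in MiB."""
    return width * height * bytes_per_pixel * count / 1024**2

w, h = 3840, 2160                                  # assumed 4K output resolution

gbuffer   = render_targets_mib(w, h, 8, count=5)   # assume 5 G-buffer targets, 8 bytes/pixel
hdr_chain = render_targets_mib(w, h, 8, count=3)   # assume HDR color + history buffers for TAA/DLSS
shadows   = 4 * 4096 * 4096 * 4 / 1024**2          # assume 4 shadow cascades, 4096^2, 32-bit depth
textures  = 6.0 * 1024                             # assume a 6 GiB streamed texture/mesh pool on ultra
os_other  = 0.8 * 1024                             # desktop, browser, overlays, etc. (assumed)

total = gbuffer + hdr_chain + shadows + textures + os_other
for label, mib in [("G-buffer", gbuffer), ("HDR/TAA chain", hdr_chain),
                   ("Shadow maps", shadows), ("Texture pool", textures),
                   ("OS + overlays", os_other)]:
    print(f"{label:<15}{mib:7.0f} MiB")
print(f"{'Total':<15}{total / 1024:7.2f} GiB  vs. 8 GiB on the card")
[/code]

Under those made-up numbers you're already at roughly 7.5 GiB. The fixed costs scale with resolution and the texture pool scales with the quality slider, so one notch higher on textures (or enabling ray tracing) pushes past the physical 8 GiB, and once that happens the driver has to page resources in and out over PCIe, which is exactly the frametime-spike stutter being described here.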
If you didn't know this when you bought the card, I'm sorry you had to find out this way, but you guys need to realize VRAM limits are a real thing and hold the companies you buy from to a higher standard. Otherwise you're just going to replace it with something else that stutters two years later.