https://www.youtube.com/watch?v=xbvxohT032E
If you think your 3060 Ti will drop from a constant 70 fps to a constant 3 fps because of one fifth of one setting, then I have a rock to sell you.
Not to mention that this puts Nvidia at more of a disadvantage than AMD, because it's Nvidia that is known to cut corners on VRAM.
Just buy a better GPU for over a thousand dollars, what’s the problem? :P
Yeah, that's right, ♥♥♥♥ me
This game is not using the traditional memory management systems PCs use (it's a console port), and IMO it has hard cuts on RAM usage, forgoing the need to optimize for the PC memory structure (I don't think this game is using system RAM for video through a traditional pool; it is streaming a ton from disk). This is why, for the first time ever, you see much lesser GPUs outperforming greater ones. All because of cheap VRAM? Give me a break. Texture quality doesn't vary like that because of 4 GB of video memory. Your 3060 Ti doesn't drop 99% of its frames because of one fifth of one setting. EVER, until now.
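To make the "hard cuts" idea concrete, here is a minimal sketch of a hard-capped texture streamer. The budget, mip sizes, and function name are all made up for illustration; the point is the cliff: when the top mip no longer fits the cap, the streamer serves a much smaller one from disk instead of paging to system RAM.

```python
# Hypothetical sketch of a hard-capped texture streaming budget.
# All numbers and names are illustrative, not taken from the game.

VRAM_BUDGET_MB = 6144            # hard cap the engine refuses to exceed
MIP_SIZES_MB = [256, 64, 16, 4]  # full-res mip 0 down to low-res mip 3

def resident_mip(textures_in_view: int, budget_mb: int = VRAM_BUDGET_MB) -> int:
    """Pick the highest-quality mip level whose total cost fits the hard cap.

    A streamer like this never spills into system RAM: if mip 0 for every
    visible texture would blow the budget, it streams a smaller mip from
    disk instead. Quality falls off a cliff rather than degrading smoothly.
    """
    for mip, size in enumerate(MIP_SIZES_MB):
        if textures_in_view * size <= budget_mb:
            return mip
    return len(MIP_SIZES_MB) - 1  # worst case: lowest mip for everything

print(resident_mip(20))  # 20 * 256 MB = 5120 MB fits the cap -> mip 0
print(resident_mip(30))  # 30 * 256 MB = 7680 MB exceeds it  -> drops to mip 1
```

One visible texture over the line and every asset steps down a full mip level, which is exactly the "play or PowerPoint" behaviour described in this thread.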
I mean, this **** is on a 128-bit bus, outperforming a card with twice the bus width because of its VRAM total (the same memory on the same architecture!)? Where is the management? It's a 99 percent difference!
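For reference, peak memory bandwidth is just bus width times per-pin data rate. The 14 Gbps figure below is a typical GDDR6 speed and is only illustrative:

```python
# Back-of-the-envelope memory bandwidth: bus width x effective data rate.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8  # bits per transfer -> bytes per second

print(bandwidth_gbs(256, 14))  # 448.0 GB/s, e.g. a 3060 Ti-class 256-bit card
print(bandwidth_gbs(128, 14))  # 224.0 GB/s, half the throughput on 128-bit
```

So at the same memory speed, the 256-bit card moves data twice as fast; the only thing the 128-bit card can win on is total capacity.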
If you bump the setting up one notch, it expands the cache by 512 MB, so probably not. But in time it will, because it goes over 8 GB.
Actually, I know what I am talking about. Nice assumption, though. That doesn't call anything into question.
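As back-of-the-envelope arithmetic (the base cache size is an assumption; only the 512 MB step and the 8 GB card come from this thread):

```python
# Hypothetical arithmetic for the "512 MB per notch" claim above.

BASE_CACHE_MB = 7680   # assumed cache size at the current texture setting
STEP_MB = 512          # claimed growth per setting notch
CARD_VRAM_MB = 8192    # an 8 GB card

for notch in range(4):
    cache = BASE_CACHE_MB + notch * STEP_MB
    verdict = "fits" if cache <= CARD_VRAM_MB else "over budget"
    print(f"+{notch} notch(es): {cache} MB -> {verdict}")
```

Under those assumptions, one notch still fits exactly, and the notch after that tips the cache past the card's 8 GB.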
8 GB on a 256-bit bus >>>>> 16 GB on a 128-bit bus on the same architecture, until THIS game. That held for 40 years, until THIS GAME. This is a buffer, not an explicit pool, and this game is exploiting the fact that you don't understand what that means.
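Roughly, the distinction being drawn: an explicit pool hard-fails when it is full, while a buffer quietly evicts least-recently-used entries and keeps running at lower quality. A toy sketch with hypothetical classes:

```python
from collections import OrderedDict

class ExplicitPool:
    """Fixed allocation: when it's full, a request is refused outright."""
    def __init__(self, capacity_mb: int):
        self.capacity, self.used = capacity_mb, 0
    def alloc(self, size_mb: int) -> bool:
        if self.used + size_mb > self.capacity:
            return False          # hard failure the caller must handle
        self.used += size_mb
        return True

class StreamingBuffer:
    """Cache semantics: overflow silently evicts the oldest entries."""
    def __init__(self, capacity_mb: int):
        self.capacity, self.entries = capacity_mb, OrderedDict()
    def touch(self, tex: str, size_mb: int) -> None:
        self.entries[tex] = size_mb
        self.entries.move_to_end(tex)
        while sum(self.entries.values()) > self.capacity:
            self.entries.popitem(last=False)  # evict least-recently-used

pool = ExplicitPool(8192)
print(pool.alloc(6000), pool.alloc(4000))  # True False: second alloc refused

buf = StreamingBuffer(8192)
for tex, size in [("a", 6000), ("b", 4000)]:
    buf.touch(tex, size)
print(list(buf.entries))  # ['b']: 'a' was silently evicted to make room
```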
But saying it's because of the Nvidia sponsorship is silly, since it will affect Nvidia cards more than AMD ones; Nvidia is the one that's stingy with VRAM.
You have lesser 128-bit cards running texture settings 3x-4x higher than 256-bit cards. Same architecture. That has never happened. Not one time, ever.
I did. It was playable. Not a hard, constant 5 fps, because it was programmed like a PC game and not a console port with optimization amputations.
Also ran the Kingdom Come: Deliverance and Shadow of Mordor texture packs on a 970 at decent fps; both called for much more than the 4 GB of VRAM on that card. It had the power and bandwidth, like most of these cards do, to make that minute jump in quality here, but "strangely" can't for the first time. It's play or PowerPoint within one fifth of a single setting. Nothing else matters.
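That behaviour is consistent with how VRAM overflow has traditionally worked: the driver pages the overage across PCIe, which is slow but nowhere near a slideshow. Nominal peak rates, purely for illustration:

```python
# Rough numbers on why overflowing VRAM used to mean "slower", not "slideshow".
# Nominal peak rates; real sustained throughput is lower, but the ratio holds.

PCIE3_X16_GBS = 15.75      # PCIe 3.0 x16, one direction
GTX_970_VRAM_GBS = 224.0   # 256-bit GDDR5 at 7 Gbps

penalty = GTX_970_VRAM_GBS / PCIE3_X16_GBS
print(f"VRAM is ~{penalty:.0f}x faster than paging over PCIe")
```

Touching a small spilled-over working set each frame costs some frames, not the whole game, which is how an over-budget texture pack could still run at decent fps on a 4 GB card.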