They are optional, and not really meant for low or mid range systems; they usually even come with a warning saying as much.
As for you wanting options, you have them, it's just not free.
Anyway, I'll have DD2 downloaded shortly and I'll boot it up to check how it does at 4K. I mean, it'll run like poop, but not because of textures.
Edit: I mean, there is also the question of why people are trying to run new, demanding games at 4K native on low / mid tier hardware to begin with?
I mean, frankly, you have to be a bit daft to try and play at 4K maxed out on an old or mid range card, so it's not really on Nvidia now, is it?
It goes farther back than that even.
Take the 600 series. There are 2GB GTX 680s that were "flagships" for their time that struggle to play anything now due to VRAM.
There was a 4GB GTX 670 that today is still roughly equivalent to a power-hungry 1050 Ti and still serviceable for basic game use, because it has twice the VRAM on a weaker core...
NV has used VRAM as a paywall for decades now, and it always ends the exact same way. Anyone who goes for the lower-VRAM cards loses long term. Anyone who opts for either the higher-VRAM NV versions, or the higher-VRAM AMD versions, even when the card has comparable or slightly weaker compute, gets a longer serviceable life out of the card, both for games and non-game use (read: things other than 3D gaming, i.e. video rendering, AI or GPGPU compute loads, mining, SETI, etc.).
Unrelated to the rest of the convo, but dude didn't know what interlacing was and seemed confused that progressive scan would render lower fps, lolz.
Also, wtf is with a game dev adding an interlaced option in 2024? There are basically no displays still in frequent use that work that way...
Of course, the other option is to turn the graphics settings down to medium-low, but that just makes the games look awful too, and it's a constant reminder that we can't afford a better video card, which is depressing as heck and makes playing the games not even feel worth it.
It's part of why I made a terrible choice with my RTX 3070 Ti last time. With only 8GB of VRAM, there are quite a few games (about 15) I would love to be playing right now, but I just can't because of their VRAM requirements.
Meanwhile, Nintendo will pair 12GB of combined memory with just RTX 2050-level GPU performance in the new Switch 2, because Nintendo knows VRAM is the cheaper way to make games look better.
Again, I try games on both settings. It sounds more like you are expecting top tier performance for mid tier money, not that you are running out of VRAM. Again, share with me the games using 12GB at standard settings, not optional high-res texture packs or mods, and I'll go see for myself if I have the games. It's not like I can do anything else; my main rig is in pieces awaiting a 5090 + water block, and for me to stop being lazy and delid the new 14900KS.
8GB cards are actually a pain; memory compression algorithms are pushed to their limits, and without them we would need GPUs with huge amounts of VRAM.
Also, someone already linked a video in this thread that shows 10+ games that all require at least 10GB of VRAM @ 1080p. I think you didn't read the previous comments in this thread before jumping in to comment, so I'll re-link the video for you:
https://www.youtube.com/watch?v=dx4En-2PzOU
8GB is still enough for medium settings, but for powerful cards like the 3070 series it was definitely wasted potential, because those cards should've had at least 10GB.
As for Dragon's Dogma 2, I just got done testing: no ray tracing, but 4K native with everything else maxed was sitting at 7.9GB of usage, and dropping to 1080p brought that down to 7GB. This was in the market outskirts of the desert city, looking and running around.
And frankly, those are settings that 8GB cards will struggle with for reasons beyond the VRAM limit.
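For anyone who wants to repeat this kind of test, here's a minimal sketch of how you could log VRAM usage while a game runs, using NVIDIA's NVML bindings (the pynvml package); the GPU index and one-second poll interval are just assumptions for illustration:

```python
# Minimal VRAM usage logger using NVIDIA's NVML bindings (pip install pynvml).
# Assumes the game runs on GPU 0; adjust the index and poll rate to taste.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"VRAM: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)  # sample once per second
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Keep in mind this reports total memory allocated on the card, not what the game alone needs, so browsers and overlays inflate the number a bit; `nvidia-smi` from the command line reads the same counter.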
The 3070 Ti is not a replacement for a 1080 Ti; there's 5 years and 2 1/2 generations between them, and in the performance stack a 1080 Ti is decidedly low end now.
I could test it as well, but I really cannot be bothered to put it in my HTPC.
And the 3070 was very much a mid range card at release; in the overall performance stack now, I'd put it more towards the lower end of mid range, tbh.
Fact is, memory isn't cheap, and if you set your settings within reason for your hardware, 8GB is plenty of VRAM.
If someone wants to run settings really intended for the current high end cards on low or mid tier stuff, native at 4K instead of 1080p / 1440p medium with some high, that is on them; maybe they should buy a console if they cannot figure out sensible settings to run stuff at.
Anyway, my thought is that some form of GPU cache would be required, no? Otherwise the GPU will always be waiting. If we need 6-8GB for the game itself, plus some for a data cache streamed in from the NVMe (DirectStorage), then 12GB is going to be the absolute minimum needed, I think.
So because I don't watch one video by a guy who thought a Windows update got him 34% better performance, I cannot speak on the subject of VRAM usage, despite me just saying how I test this stuff myself and having JUST finished testing DD2 as asked, where I gave my results... from an 8GB card.
Yes, I clearly should not speak on a subject that I actually tested myself; I should instead rely on and parrot information from a YouTuber who is often wrong about performance. My bad.
It's often not even easy to tell if you are at the limit, since data comes in chunks; you can be using only 7.5GB and already be limited.
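To make that concrete, here's a toy sketch of the idea; the 256MB chunk size and the allocation list are made-up numbers, exaggerated for illustration (real drivers use much finer granularity):

```python
# Toy model: if VRAM is reserved in fixed-size chunks, the driver-side
# reservation can hit an 8GB card's ceiling while the amount actually
# requested still reads well under 8GB. All numbers are hypothetical.
CHUNK_MB = 256      # made-up allocation granularity
CAPACITY_MB = 8192  # an 8GB card

def reserved(size_mb: int) -> int:
    """Round an allocation up to whole chunks, as an allocator might."""
    chunks = -(-size_mb // CHUNK_MB)  # ceiling division
    return chunks * CHUNK_MB

# A hypothetical mix of textures, buffers and render targets (MB).
allocations = [1000, 900, 850, 700, 600, 520, 450, 400, 1024, 1000]

requested_mb = sum(allocations)                      # 7444 MB (~7.3 GB)
reserved_mb = sum(reserved(a) for a in allocations)  # 8448 MB (> 8 GB)

print(f"requested: {requested_mb} MB")
print(f"reserved:  {reserved_mb} MB of {CAPACITY_MB} MB capacity")
```

With these numbers the game has asked for about 7.3GB, but the rounded-up reservations already exceed the card's 8GB, so new allocations fail or spill to system RAM before the usage counter ever reads "full". Fragmentation makes the real effect worse.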