If you have enough VRAM to run it, run it. Like the old saying about RAM, unused VRAM is wasted VRAM. If you don't have enough for it, don't use it, or you'll cause a lot of unnecessary RAM-to-VRAM swapping and probably some hitching and bad 1% lows in framerate.
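If you want to put a number on your headroom before flipping the pack on, here's a rough sketch using NVIDIA's NVML bindings (the pynvml Python package). It assumes an NVIDIA card at index 0, and the 8 GB cutoff is just the rule of thumb from this thread, not anything the game itself publishes.

```python
# Rough VRAM headroom check via NVIDIA's NVML (pip install pynvml).
# Assumes an NVIDIA GPU at index 0; the 8 GiB threshold below is only the
# rule of thumb from this thread, not an official requirement.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes

total_gib = mem.total / 2**30
used_gib = mem.used / 2**30
print(f"VRAM: {used_gib:.1f} GiB used of {total_gib:.1f} GiB total")

if total_gib >= 8:
    print("Probably enough headroom for the HD texture pack.")
else:
    print("Expect RAM-to-VRAM swapping and hitching with the pack enabled.")

pynvml.nvmlShutdown()
```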
Meh.
For me, it introduced visual bugs I'd never seen before. It just wasn't worth using. VRAM and RAM are not the issue, nor is the processor, but I'm not sure they tested the hi-res pack all that well. But, w/e.
i9-10900K with a 2080 Super.
Awesome contribution...
Yes, well, my question is whether they're only 4K textures or whether there's also a 2K version, because without the pack the highest texture setting seems to contain a lot of fairly low-res textures.
I'm not sure how well I could handle 4K textures with just 6 GB of VRAM, but I might try anyway.
They don't disclose what resolution they are, and in most games that have an HD pack they're a mix of 8K/4K/2K anyway, so there's no simple answer for the actual texture resolutions. It's never as simple as the devs running one resolution for every texture across the board; that would be wasteful. Only stuff near the camera needs hi-res variants, and things that always sit moderately far away never need to be as high-res as, say, character textures in a TPS/TPARPG. So when you pick apart the textures of a game like Shadow of War or Monster Hunter World, what you find is that the packs contain a mix of sizes depending on the context each texture is used in.
But yeah, a 6 GB card won't cut it. The game routinely uses 7.5-7.8 GB of VRAM[i.imgur.com] on my 8 GB card, so stick with Very High, because you don't have the VRAM for the HD pack anyway. Ultra is for people with 8 GB+ of VRAM, IMO.
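To put rough numbers on why that mix matters for a VRAM budget, here's a back-of-the-envelope sketch. It assumes typical BC1/BC7 block compression and roughly a third extra for mipmaps; real packs vary per game, so treat it as illustration rather than anything this game documents.

```python
# Back-of-the-envelope texture memory estimate.
# Assumes block compression (BC1 = 0.5 bytes/pixel, BC7 = 1 byte/pixel)
# plus ~1/3 extra for the mip chain -- typical values, not game-specific data.

def texture_mib(width, height, bytes_per_pixel, mip_overhead=1.33):
    """Approximate VRAM footprint of one texture, in MiB."""
    return width * height * bytes_per_pixel * mip_overhead / 2**20

for res in (2048, 4096, 8192):
    bc7 = texture_mib(res, res, 1.0)   # BC7: albedo/detail maps
    bc1 = texture_mib(res, res, 0.5)   # BC1: simpler masks and albedo
    print(f"{res}x{res}: ~{bc7:5.1f} MiB (BC7), ~{bc1:5.1f} MiB (BC1)")

# 2048x2048: ~  5.3 MiB (BC7), ~  2.7 MiB (BC1)
# 4096x4096: ~ 21.3 MiB (BC7), ~ 10.6 MiB (BC1)
# 8192x8192: ~ 85.1 MiB (BC7), ~ 42.6 MiB (BC1)
```

Bump a few hundred near-camera textures from 2K to 4K and it's easy to see where an extra gigabyte or two of VRAM goes.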
Yet another reason to upgrade.
What a stupid time to have a board without PCIE 4 slots. =/
But going by how the normal textures look, I think my system might be able to handle the high-res ones, because I run other games with much sharper textures. I'm actually surprised how low-quality the default ones are.
Wait, what?
You mean PCIE 4.0?
That doesn't matter yet. Even a 2080 Ti has been shown to be unable to fully saturate a PCIe 3.0 x16 slot's bandwidth.
My GPU is only a GTX 1080 and is miles away from doing so. I doubt even an RTX 3080 will be able to do so.
Do keep in mind that PCIe standards are backwards compatible (a PCIe 4.0 card will work just fine on a PCIe 3.0 board), and the difference only matters when raw bandwidth becomes a concern. The bandwidth of PCIe 3.0 is unlikely to become outdated with this generation of GPUs, or even the next (RTX 4000 and Big Navi 3 or whatever it's called), because PCIe 3.0 x16 is already capable of some amazing bandwidth, far more than we've needed or expect to need in the immediate future. PCIe 3.0 x16 can handle roughly 15.75 GB/s (about 15,754 MB/s) per direction, and that is still a giant ceiling over what any existing GPU requires.
It's more likely that your CPU and RAM's memory bandwidth will become a performance bottleneck before the PCIe generation does.
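For anyone who wants the arithmetic behind that figure: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, and 4.0 doubles the transfer rate, so the per-direction ceilings work out as below. A quick sketch of the published link rates, nothing board- or game-specific.

```python
# PCIe per-direction bandwidth ceilings from the published per-lane rates.
# 3.0 and 4.0 use 128b/130b encoding; 2.0 uses 8b/10b.
GENERATIONS = {
    "PCIe 2.0": (5.0, 8 / 10),      # (GT/s per lane, encoding efficiency)
    "PCIe 3.0": (8.0, 128 / 130),
    "PCIe 4.0": (16.0, 128 / 130),
}

for name, (gt_per_s, efficiency) in GENERATIONS.items():
    per_lane_gbs = gt_per_s * efficiency / 8   # one transfer carries one bit
    x16_gbs = per_lane_gbs * 16
    print(f"{name}: {per_lane_gbs:.3f} GB/s per lane, ~{x16_gbs:.2f} GB/s at x16")

# PCIe 2.0: 0.500 GB/s per lane, ~8.00 GB/s at x16
# PCIe 3.0: 0.985 GB/s per lane, ~15.75 GB/s at x16
# PCIe 4.0: 1.969 GB/s per lane, ~31.51 GB/s at x16
```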
Are you sure? I seem to remember channels like LTT recommending the fastest PCIe slot available for the respective cards.
Yeah, in the latest generations you don't want to be running a higher-end Pascal or Turing GPU in a PCIe 2.0 x16 board, but we've yet to see a single benchmark showing that 3.0 vs. 4.0 matters at all. The math doesn't point to a problem for a while either: 3.0 to 4.0 doubles bandwidth, while GPU generations are far more incremental, nowhere near 100% gains, and even the big jumps tend to be more efficient with bandwidth, so their needs creep up a little per generation instead of doubling the way NVMe bandwidth usage has. I'd be willing to bet RTX 4000 cards will still commonly be run in PCIe 3.0 x16 slots by the average user come their release.
The main reason to have a PCIe 4.0 board revolves around NVMe SSDs, where we do see roughly 20-30% increases in usable read and write speeds, but with GPUs we're a healthy way off from needing 4.0 yet.
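Same arithmetic at x4, the link width NVMe drives use, which is where the 4.0 jump actually shows up. These are interface ceilings under the same assumptions as above, not what any particular drive sustains.

```python
# PCIe x4 ceilings -- the link width NVMe SSDs use.
# Interface limits only, not real-world drive throughput.
X4_CEILINGS_GBS = {
    "PCIe 3.0 x4": 8.0 * (128 / 130) / 8 * 4,    # ~3.94 GB/s
    "PCIe 4.0 x4": 16.0 * (128 / 130) / 8 * 4,   # ~7.88 GB/s
}

for name, ceiling in X4_CEILINGS_GBS.items():
    print(f"{name}: ~{ceiling:.2f} GB/s ceiling")
```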
What? Are we talking RTX 3000 series benchmarks??
Yeah, well, I wonder when those are becoming affordable. I heard the PS5 might use them, which should drop the price of PC ones too, but I haven't kept up with the news, and the last time I looked they were insanely expensive.
There really are none. RTX 3000 benchmarks are still under a semi-embargo (reviewers can't show FPS or frametimes in any video yet). I said Pascal and Turing, which are the codenames for the GTX 1000 and RTX 2000 series. My point was you don't want to be running 2016-2018 GPUs on PCIe 2.0 boards, which are from like... the pre-2010 era?
Nah... I mean, I bought a 1TB Samsung 960 EVO for $479 in 2017 and then replaced it with a 2TB Samsung 970 EVO Plus in 2019 for another $449. Handed the old 1TB down to my wife's PC.
Cost is subjective. I'm a senior network engineer at my company and make 230k a year, so some good PC hardware costing a little extra money isn't going to stop me from owning it.
I want to be on your side but what you said is both illegible and incomprehensible so I'm just standing here holding my ♥♥♥♥ and wondering where I should point it... I feel like a certain Adam Sandler movie meme video from 1995 might be appropriate here.
Yep, because you have 8 GB, which is enough for the HD pack. That was kind of the whole takeaway of this entire, ongoing thread. You must be a wizard.