Marvel's Avengers - The Definitive Edition

765 Sep 5, 2020 @ 9:44pm
High Res Texture Pack?
Anyone know whether it's only for 4K res or also 2K?
Ancient Sep 5, 2020 @ 9:47pm 
You can use it at any resolution, but its value is greatly diminished below 1440p. 1080p simply doesn't put enough pixels on screen for most people to even see the difference in quality (the textures just get mipped back down to lower-resolution versions unless you're right up against a wall, even at 4K).

If you have enough VRAM to run it, run it. Like the old saying about RAM: unused VRAM is wasted VRAM. If you don't have enough for it, don't use it, or you'll cause a lot of unnecessary RAM-to-VRAM swapping and probably some hitching and bad 1% lows in framerate.
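
If you want to check your actual headroom before flipping the pack on, something like this works. Rough sketch only: it assumes an NVIDIA card with nvidia-smi on the PATH (AMD needs a different tool).

```python
# Query current VRAM usage via nvidia-smi (ships with the NVIDIA driver).
import subprocess

def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # Output looks like "3412, 8192" (MiB used, MiB total) per GPU line.
    used, total = (int(v) for v in out.strip().splitlines()[0].split(", "))
    return used, total

used, total = vram_usage_mib()
print(f"VRAM: {used} / {total} MiB ({total - used} MiB free)")
```

Run it with the game open; if you're already near the ceiling on Very High, the HD pack will push you into swapping.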
Last edited by Ancient; Sep 5, 2020 @ 9:50pm
Moga CMDR Sep 5, 2020 @ 9:48pm 
None of that actually means a thing. Your question isn't really valid, literally; it oversimplifies concepts that most people don't understand.

Meh.

For me, I saw new visual bugs that never happened before. It just wasn't worth using. VRAM and RAM are not the issue, nor is the processor. But I'm not sure they tested the Hi-Res pack all that well. But, w/e.

i9-10900K with a 2080S.
Last edited by Moga CMDR; Sep 5, 2020 @ 9:51pm
765 Sep 5, 2020 @ 10:33pm 
Originally posted by Guild Sweetheart:
None of that actually means a thing. Your question isn't really valid, literally; it oversimplifies concepts that most people don't understand.

Meh.

For me, I saw new visual bugs that never happened before. It just wasn't worth using. VRAM and RAM are not the issue, nor is the processor. But I'm not sure they tested the Hi-Res pack all that well. But, w/e.

i9-10900K with a 2080S.

Awesome contribution...
765 Sep 5, 2020 @ 10:37pm 
Originally posted by Ancient:
You can use it at any resolution, but its value is greatly diminished below 1440p. 1080p simply doesn't put enough pixels on screen for most people to even see the difference in quality (the textures just get mipped back down to lower-resolution versions unless you're right up against a wall, even at 4K).

If you have enough VRAM to run it, run it. Like the old saying about RAM: unused VRAM is wasted VRAM. If you don't have enough for it, don't use it, or you'll cause a lot of unnecessary RAM-to-VRAM swapping and probably some hitching and bad 1% lows in framerate.

Yes, well, my question is whether they're only 4K textures or whether there's also a 2K version, because without the pack even the highest texture setting seems to contain a lot of quite low-res textures.
I'm not sure how well I could handle 4K textures with just 6 GB of VRAM, but I might try anyway.
Last edited by 765; Sep 5, 2020 @ 10:37pm
Ancient Sep 5, 2020 @ 10:43pm 
Originally posted by RuneGuard:
Originally posted by Ancient:
You can use it at any resolution, but its value is greatly diminished below 1440p. 1080p simply doesn't put enough pixels on screen for most people to even see the difference in quality (the textures just get mipped back down to lower-resolution versions unless you're right up against a wall, even at 4K).

If you have enough VRAM to run it, run it. Like the old saying about RAM: unused VRAM is wasted VRAM. If you don't have enough for it, don't use it, or you'll cause a lot of unnecessary RAM-to-VRAM swapping and probably some hitching and bad 1% lows in framerate.

Yes, well, my question is whether they're only 4K textures or whether there's also a 2K version, because without the pack even the highest texture setting seems to contain a lot of quite low-res textures.
I'm not sure how well I could handle 4K textures with just 6 GB of VRAM, but I might try anyway.

They don't disclose what resolution they are, and in most games that have an HD pack they're a mix of 8K/4K/2K anyway, so there's no simple answer on the actual texture resolutions. It's never as simple as the devs running one resolution for every texture across the board; that would be wasteful. Only things near the camera need hi-res variants, and assets that are always moderately far away never need to be as high-res as, say, character textures in a third-person game. So when you pick apart the textures of a game like Shadow of War or Monster Hunter: World, you find the packs contain a mix of all different sizes depending on the context each texture is used in.

But yeah, a 6GB card won't cut it. The game routinely uses 7.5-7.8GB of VRAM[i.imgur.com] on my 8GB card, so stick with Very High; you don't have the VRAM for the HD pack anyway. Ultra is for people with 8GB+ of VRAM, IMO.
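
For a rough sense of why this adds up so fast, here's some back-of-the-envelope math. Assumptions (mine, not the devs'): BC7-style block compression at 1 byte per texel and roughly a third extra for the full mip chain; real engines vary.

```python
# Approximate VRAM cost of a single color texture, including its mip chain.
def texture_mib(side, bytes_per_texel=1.0, mip_overhead=4/3):
    return side * side * bytes_per_texel * mip_overhead / (1024 ** 2)

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB")
# 2048x2048: ~5 MiB, 4096x4096: ~21 MiB, 8192x8192: ~85 MiB
```

Every step up in resolution quadruples the cost, which is why a pack that swaps lots of 2K textures for 4K/8K ones blows past a 6GB card so quickly.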
Last edited by Ancient; Sep 5, 2020 @ 10:57pm
765 Sep 5, 2020 @ 10:56pm 
Originally posted by Ancient:
Originally posted by RuneGuard:

Yes, well, my question is whether they're only 4K textures or whether there's also a 2K version, because without the pack even the highest texture setting seems to contain a lot of quite low-res textures.
I'm not sure how well I could handle 4K textures with just 6 GB of VRAM, but I might try anyway.

They don't disclose what resolution they are, and in most games that have an HD pack they're a mix of 8K/4K/2K anyway, so there's no simple answer on the actual texture resolutions.

But yeah, a 6GB card won't cut it. The game routinely uses 7.5-7.8GB of VRAM[i.imgur.com] on my 8GB card, so stick with Very High; you don't have the VRAM for the HD pack anyway. Ultra is for people with 8GB+ of VRAM, IMO.

Yet another reason to upgrade.
What a stupid time to have a board without PCIe 4.0 slots. =/
But going by how the normal textures look, I think my system might be able to handle the high-res ones, because I run other games with much sharper textures. I'm actually surprised how low quality the default ones are.
Last edited by 765; Sep 5, 2020 @ 10:57pm
Ancient Sep 5, 2020 @ 11:00pm 
Originally posted by RuneGuard:
Originally posted by Ancient:

They don't disclose what resolution they are, and in most games that have an HD pack they're a mix of 8K/4K/2K anyway, so there's no simple answer on the actual texture resolutions.

But yeah, a 6GB card won't cut it. The game routinely uses 7.5-7.8GB of VRAM[i.imgur.com] on my 8GB card, so stick with Very High; you don't have the VRAM for the HD pack anyway. Ultra is for people with 8GB+ of VRAM, IMO.

Yet another reason to upgrade.
What a stupid time to have a board without PCIe 4.0 slots. =/

Wait, what?

You mean PCIe 4.0?

That doesn't matter yet. Even a 2080 Ti has been shown to be unable to fully saturate a PCIe 3.0 x16 slot's bandwidth.

My GPU is only a GTX 1080 and is miles away from doing so. I doubt even an RTX 3080 will manage it.

Do keep in mind that PCIe standards are backwards compatible (a PCIe 4.0 card will work just fine in a PCIe 3.0 board), and the difference would only matter for raw bandwidth. PCIe 3.0's bandwidth is unlikely to become outdated with this generation of GPUs or even the next (RTX 4000 and Big Navi 3, or whatever it's called), because PCIe 3.0 x16 is already capable of some amazing bandwidth: about 15.75 GB/s, far more than we've needed or expect to need in the immediate future, and still a giant ceiling over what any existing GPU requires.
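
That 15.75 GB/s figure falls straight out of the spec numbers: 8 GT/s per lane, 128b/130b encoding, 16 lanes. A quick sanity-check sketch (back-of-the-envelope, not a benchmark):

```python
# PCIe 3.0 x16 theoretical bandwidth from the spec numbers.
transfer_rate = 8.0        # GT/s per lane (PCIe 3.0)
encoding = 128 / 130       # 128b/130b line encoding overhead
lanes = 16

per_lane_gbit = transfer_rate * encoding         # ~7.88 Gbit/s usable per lane
total_gbyte = per_lane_gbit * lanes / 8          # bits -> bytes
print(f"PCIe 3.0 x16: ~{total_gbyte:.2f} GB/s")  # ~15.75 GB/s
```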

It's more likely that your CPU and RAM's memory bandwidth will become a performance bottleneck before the PCIe generation does.
Last edited by Ancient; Sep 5, 2020 @ 11:15pm
765 Sep 5, 2020 @ 11:15pm 
Originally posted by Ancient:
Originally posted by RuneGuard:

Yet another reason to upgrade.
What a stupid time to have a board without PCIE 4 slots. =/

Wait, what?

You mean PCIe 4.0?

That doesn't matter yet. Even a 2080 Ti has been shown to be unable to fully saturate a PCIe 3.0 x16 slot's bandwidth.

My GPU is only a GTX 1080 and is miles away from doing so. I doubt even an RTX 3080 will manage it.

Do keep in mind that PCIe standards are backwards compatible (a PCIe 4.0 card will work just fine in a PCIe 3.0 board), and the difference would only matter for raw bandwidth. PCIe 3.0's bandwidth is unlikely to become outdated with this generation of GPUs or even the next (RTX 4000 and Big Navi 3, or whatever it's called), because PCIe 3.0 x16 is already capable of some amazing bandwidth: about 15.75 GB/s, far more than we've needed or expect to need in the immediate future, and still a giant ceiling over what any existing GPU requires.

Are you sure? I seem to remember channels like LTT recommending the fastest PCIe slot for each card.
Ancient Sep 5, 2020 @ 11:19pm 
Originally posted by RuneGuard:
Originally posted by Ancient:

Wait, what?

You mean PCIe 4.0?

That doesn't matter yet. Even a 2080 Ti has been shown to be unable to fully saturate a PCIe 3.0 x16 slot's bandwidth.

My GPU is only a GTX 1080 and is miles away from doing so. I doubt even an RTX 3080 will manage it.

Do keep in mind that PCIe standards are backwards compatible (a PCIe 4.0 card will work just fine in a PCIe 3.0 board), and the difference would only matter for raw bandwidth. PCIe 3.0's bandwidth is unlikely to become outdated with this generation of GPUs or even the next (RTX 4000 and Big Navi 3, or whatever it's called), because PCIe 3.0 x16 is already capable of some amazing bandwidth: about 15.75 GB/s, far more than we've needed or expect to need in the immediate future, and still a giant ceiling over what any existing GPU requires.

Are you sure? I seem to remember channels like LTT recommending the fastest PCIe slot for each card.

Yeah, in the latest generations you don't want to run a higher-end Pascal or Turing GPU in a PCIe 2.0 x16 board, but we've yet to see a single benchmark showing that 3.0 vs. 4.0 matters at all. The math doesn't point to a future issue for a while either: going from PCIe 3.0 to 4.0 doubles bandwidth, but GPU generations are much more incremental, nowhere near 100% gains, and their bandwidth needs creep up a little per generation instead of doubling the way NVMe bandwidth usage has. I'd be willing to bet that RTX 4000 cards will still commonly be run in PCIe 3.0 x16 slots by average users when they release.

The main reason to have a PCIe 4.0 board revolves around the NVMe SSD world, where we do see roughly 20-30% increases in usable read and write speeds; with GPUs, we're a healthy way off from needing 4.0 yet.
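
To put numbers on that, here are the theoretical link ceilings for an x4 NVMe drive on each generation (a rough sketch of the spec math; real drives land well below these once protocol overhead and the flash itself get involved):

```python
# Theoretical PCIe link ceiling for an x4 NVMe SSD, per generation.
def pcie_gbyte_s(gt_per_lane, lanes=4, encoding=128 / 130):
    return gt_per_lane * encoding * lanes / 8  # Gbit/s -> GB/s

gen3 = pcie_gbyte_s(8.0)    # ~3.94 GB/s
gen4 = pcie_gbyte_s(16.0)   # ~7.88 GB/s
print(f"PCIe 3.0 x4: ~{gen3:.2f} GB/s, PCIe 4.0 x4: ~{gen4:.2f} GB/s")
```

The ceiling doubles, but current Gen4 drives only use part of that extra headroom, which is roughly where the 20-30% usable gain comes from.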
Last edited by Ancient; Sep 5, 2020 @ 11:29pm
765 Sep 5, 2020 @ 11:29pm 
Originally posted by Ancient:
Originally posted by RuneGuard:

Are you sure? I seem to remember channels like LTT recommending the highest PCIE ports for the respective cards.

Yeah, in the latest generations you don't want to run a higher-end Pascal or Turing GPU in a PCIe 2.0 x16 board, but we've yet to see a single benchmark showing that 3.0 vs. 4.0 matters at all.

What? Are we talking RTX 3000 series benchmarks??

Originally posted by Ancient:
The main reason to have a PCIe 4.0 board revolves around the NVMe SSD world, where we do see roughly 20-30% increases in usable read and write speeds; with GPUs, we're a healthy way off from needing 4.0 yet.

Yeah, well, I wonder when those will become affordable. I heard the PS5 might use them, which should drop the price for PC ones too, but I haven't kept up with the news, and last time I checked they were insanely expensive.
Ancient Sep 5, 2020 @ 11:45pm 
Originally posted by RuneGuard:
Originally posted by Ancient:

Yeah, in the latest generations you don't want to run a higher-end Pascal or Turing GPU in a PCIe 2.0 x16 board, but we've yet to see a single benchmark showing that 3.0 vs. 4.0 matters at all.

What? Are we talking RTX 3000 series benchmarks??

There really are none. RTX 3000 benchmarks are still under a semi-embargo (reviewers can't show FPS or frametimes in any video yet). I said Pascal and Turing, which are the codenames for the GTX 1000 and RTX 2000 series. My point was that you don't want to be running 2015-2017 GPUs in PCIe 2.0 boards, which are from, like... the pre-2010 era?

Originally posted by RuneGuard:
Originally posted by Ancient:
The main reason to have a PCIe 4.0 board revolves around the NVMe SSD world, where we do see roughly 20-30% increases in usable read and write speeds; with GPUs, we're a healthy way off from needing 4.0 yet.

Yeah, well, I wonder when those will become affordable. I heard the PS5 might use them, which should drop the price for PC ones too, but I haven't kept up with the news, and last time I checked they were insanely expensive.

Nah... I mean, I bought a 1TB Samsung 960 EVO for $479 in 2017, then replaced it with a 2TB Samsung 970 EVO Plus in 2019 for another $449 and handed the old 1TB down to my wife's PC.

Cost is subjective. I'm a senior network engineer and am paid $230k a year, so good PC hardware costing a little extra won't keep me from owning it.
Last edited by Ancient; Sep 5, 2020 @ 11:52pm
alex69noone Sep 5, 2020 @ 11:49pm 
Yeah, and honestly the current speeds of NVMe M.2 drives on PCIe 3.0 x4 (I think it is) are truly fast enough for now. It's already recommended NOT to install Windows on an NVMe SSD anyway; the recommendation is a basic SATA SSD (I forget the exact spec name), since constant use of an NVMe SSD by Windows PLUS games would put a huge hole in your drive's lifespan.

They actually say to install Windows on a basic SSD, use other SSDs or NVMe drives for games, and give Windows the entire SSD to itself so the page file can be big. I HIGHLY recommend a STARTING page file of 150 GB, up to 220 GB or higher, so it begins at 150 GB as free raw swap space for games instead of Windows managing it and giving itself 16 GB or whatever the minimum was, which causes stutters every time a game or web browser needs more, since the system hitches for a second or more while the file grows. Start high, keep the drive to itself, and it won't die faster, since it's only running Windows and not massively intense games at the same time.
Ancient Sep 5, 2020 @ 11:59pm 
Originally posted by alex69noone:
Yeah, and honestly the current speeds of NVMe M.2 drives on PCIe 3.0 x4 (I think it is) are truly fast enough for now. It's already recommended NOT to install Windows on an NVMe SSD anyway; the recommendation is a basic SATA SSD (I forget the exact spec name), since constant use of an NVMe SSD by Windows PLUS games would put a huge hole in your drive's lifespan.

They actually say to install Windows on a basic SSD, use other SSDs or NVMe drives for games, and give Windows the entire SSD to itself so the page file can be big. I HIGHLY recommend a STARTING page file of 150 GB, up to 220 GB or higher, so it begins at 150 GB as free raw swap space for games instead of Windows managing it and giving itself 16 GB or whatever the minimum was, which causes stutters every time a game or web browser needs more, since the system hitches for a second or more while the file grows. Start high, keep the drive to itself, and it won't die faster, since it's only running Windows and not massively intense games at the same time.

I want to be on your side, but what you said is both illegible and incomprehensible, so I'm just standing here holding my ♥♥♥♥ and wondering where I should point it... I feel like a certain Adam Sandler movie meme from 1995 might be appropriate here.
[GER] SScRiBLe Sep 6, 2020 @ 12:04am 
Running the texture pack with my 1080 and it works just fine.
Ancient Sep 6, 2020 @ 12:08am 
Originally posted by GER SScRiBLe:
Running the texture pack with my 1080 and it works just fine.

Yep, 'cause you have 8GB, which is enough for the HD pack. That was kind of the whole takeaway of this entire, ongoing thread. You must be a wizard.