Shodan Sep 12, 2020 @ 1:23pm
10 GB is way too low for RTX 3080
I'm planning to get a 4K 60 Hz monitor next to my current 1440p 165 Hz one, but 10 GB will eventually become way too low. It's already barely enough for the most demanding games, and I won't upgrade my PC again for at least 6 years.

There are many rumors about a 20 GB version, but I have no idea when that will come out and I really want to upgrade before Cyberpunk 2077 comes out on November 19th.

What do you guys think about this and is it known when (if) the 20 GB RTX 3080 will come out?
Last edited by Shodan; Sep 12, 2020 @ 3:02pm
Showing 1-15 of 21 comments
Jelly Donut Sep 12, 2020 @ 1:32pm 
Couple of questions...

Why would you trade your 1440p 165 Hz monitor for 4K 60 Hz? You lose the higher refresh rate for higher resolution, and the difference between 1440p and 4K is meh unless you're using a really large monitor or TV and sitting close to it. It's not worth the trade-off.

Also, with regards to 10 GB being too low, you can tell that to Nvidia, but at the same time, you can't "future proof" your PC since technology evolves VERY fast, so there isn't much point in dumping a lot of money right now.

Andrius227 Sep 12, 2020 @ 1:40pm 
Spend a little extra and get a 3090 with 24 GB of VRAM. That thing could probably last 10 years. Definitely future-proof.
Jelly Donut Sep 12, 2020 @ 1:43pm 
Originally posted by Andrius227:
Spend a little extra and get a 3090 with 24 GB of VRAM. That thing could probably last 10 years. Definitely future-proof.

This is not good advice.

People were also saying that when the RTX 2080 Ti came out.

Technology changes very fast, and what will you say if Nvidia decides to release a new lineup with double the VRAM at the same price?

There is no such thing as “future proofing” in the world of technology.
Last edited by Jelly Donut; Sep 12, 2020 @ 1:43pm
Peterg Sep 12, 2020 @ 1:45pm 
Originally posted by UponTheSky:
Originally posted by Andrius227:
Spend a little extra and get a 3090 with 24 GB of VRAM. That thing could probably last 10 years. Definitely future-proof.

This is not good advice.

People were also saying that when the RTX 2080 Ti came out.

Technology changes very fast, and what will you say if Nvidia decides to release a new lineup with double the VRAM at the same price?

There is no such thing as “future proofing” in the world of technology.

Yep, there is no way to really future-proof!
Rumpelcrutchskin Sep 12, 2020 @ 1:46pm 
Also, "little extra" means twice the price of the 3080.
Bad 💀 Motha Sep 12, 2020 @ 1:51pm 
4K/60 is terrible and many games won't run well at it regardless of the hardware used. 1440p at 144 Hz or above is way better.
Andrius227 Sep 12, 2020 @ 1:51pm 

Originally posted by UponTheSky:
Originally posted by Andrius227:
Spend a little extra and get a 3090 with 24 GB of VRAM. That thing could probably last 10 years. Definitely future-proof.

This is not good advice.

People were also saying that when the RTX 2080 Ti came out.

Technology changes very fast, and what will you say if Nvidia decides to release a new lineup with double the VRAM at the same price?

Well, I mean, the 2080 Ti is still great considering it's 2 years old. And it's got more VRAM than the 3080, so... it would still last 2 or 3 more years, and that's the most you can expect from most GPUs. 5 years is future-proof for a GPU.


Originally posted by Rumpelcrutchskin:
Also, "little extra" means twice the price of the 3080.

How much did the 2080 Ti originally go for? £1,400? £1,500? About the same as the 3090.
Last edited by Andrius227; Sep 12, 2020 @ 1:52pm
Shodan Sep 12, 2020 @ 2:27pm 
If I got a 4K monitor right now, it would be this one:
https://www.dell.com/en-us/work/shop/dell-ultrasharp-27-4k-usb-c-monitor-u2720q/apd/210-avjv/

It has fantastic color reproduction, accuracy and calibration. At 27", 4K would look really crisp.

Here's the thing:

165 Hz is really smooth and mouse input is noticeably more responsive, I can easily see huge differences between refresh rates / frame rates. I would even dare to say that I can see a bigger difference than most people, but I would still much rather sacrifice that smoothness for image quality now that I've seen both.

A few months ago, I saw a 27" 5K iMac in a store and it looked absolutely mind-blowing in terms of resolution and color reproduction, and the Dell monitor should have even better colors. Refresh rate aside, it would be an understatement to say it looked a million times better than the monitor I'm using now. I still can't stop thinking about how fantastic that iMac looked, and I realized that I badly want a monitor like that.
Last edited by Shodan; Sep 12, 2020 @ 2:27pm
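For context on the refresh-rate side of this trade-off, the smoothness difference comes down to frame time. A quick, purely illustrative Python sketch of the numbers:

```python
# Frame time: how long each refresh holds a frame on screen.
# At 165 Hz a new frame arrives almost 3x as often as at 60 Hz,
# which is why the smoothness and responsiveness gap is so visible.
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds per refresh at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 165):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```

60 Hz works out to about 16.7 ms per frame versus roughly 6.1 ms at 165 Hz, and that gap is what's being weighed against the sharper 4K image.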
Jelly Donut Sep 12, 2020 @ 2:50pm 
Originally posted by Andrius227:

Well i mean, 2080ti is still great considering it's 2 years old. And its got more vram than 3080 so... It would still last 2 or 3 more years and thats the most you can expect from most gpu's. 5 years is futureproof for a gpu.

Again, bad advice. VRAM doesn't mean everything. By that logic, a GTX 1080 Ti would be better than a 3080 because it has more VRAM.

Don't use the term "future proof".

Originally posted by Rumpelcrutchskin:
Also, "little extra" means twice the price of the 3080.

That's still a lot of money, not a little.

There is no point in spending a lot of money right now to "future proof" your technology, considering the same extra money can be used to upgrade in the future with better, more efficient tech when it gets released.

Understand that technology moves super fast, and chances are Nvidia may already have an RTX 4000 series or an Ampere refresh/successor, just waiting for the right time to release it.
Last edited by Jelly Donut; Sep 12, 2020 @ 2:51pm
Jelly Donut Sep 12, 2020 @ 2:54pm 
Originally posted by unturorum:
If I got a 4K monitor right now, it would be this one:
https://www.dell.com/en-us/work/shop/dell-ultrasharp-27-4k-usb-c-monitor-u2720q/apd/210-avjv/

It has fantastic color reproduction, accuracy and calibration. At 27", 4K would look really crisp.

Here's the thing:

165 Hz is really smooth and mouse input is noticeably more responsive, I can easily see huge differences between refresh rates / frame rates. I would even dare to say that I can see a bigger difference than most people, but I would still much rather sacrifice that smoothness for image quality now that I've seen both.

A few months ago, I saw a 27" 5K iMac in a store and it looked absolutely mind-blowing in terms of resolution and color reproduction, and the Dell monitor should have even better colors. Refresh rate aside, it would be an understatement to say it looked a million times better than the monitor I'm using now. I still can't stop thinking about how fantastic that iMac looked, and I realized that I badly want a monitor like that.

Does your workload or usage benefit from color accuracy? Because if you're just going to be gaming, you're not going to benefit much from a color-accurate monitor versus a high refresh rate monitor.

Also, you can't compare your workload with an iMac, considering iMacs/Macs are designed more for the professional market and don't focus primarily on gaming.

Colors may look nice, but if you're not going to take full advantage of the color spectrum in your workflow, it's a luxury item, not something you need. If you upload a photo to Facebook, it's going to use sRGB and not Adobe RGB, even if your picture is perfectly calibrated for color accuracy.
vadim Sep 12, 2020 @ 2:59pm 
Originally posted by unturorum:
I'm planning to get a 4K 60 Hz monitor next to my current 1440p 165 Hz one, but 10 GB will eventually become way too low.
If you think 10 GB isn't enough for you, do not buy this card. No one cares. It will be bought by people with more realistic ideas about game requirements. Problem solved.
xSOSxHawkens Sep 12, 2020 @ 3:04pm 
10 GB will do OK, though I agree it would have been nice to see 12...

For now you will be fine; even at 4K, few games use or need more than 8 GB... But there are *some* that are already pushing that limit hard (looking at you, MSFS 2020), and I would expect that limit to keep getting pushed.

The only real option would be to toss down on the 3090, or wait and see what AMD brings to the table. If their performance is competitive/comparable, which it's sounding like it will be up through the 3080, then you might find them a better option as they *tend* to have more VRAM.

Right now we're all playing the waiting game.
vadim Sep 12, 2020 @ 6:29pm 
Originally posted by xSOSxHawkens:
For now you will be fine, even at 4K few games use or need greater than 8GB... But there are *some* that are already pushing that limit hard (looking at you MSFS2020) and I would expect the limit to continue to get pushed.
You're right. Microsoft Flight Simulator really does benefit (not by much, though) from more than 8 GB of VRAM at 4K ultra settings. But at that quality and resolution, even an RTX 2080 Ti gives unplayable FPS.
The RTX 3080 will be faster, but not enough to average 60 FPS.
So you will need to lower either settings or resolution. And at 1440p, even 6 GB will run the game well.
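The reason 4K leans on VRAM so much harder than 1440p can be sketched with simple pixel-count math. The 32 bytes/pixel figure below is an illustrative assumption for combined color/depth/G-buffer render targets, not a number from any specific game:

```python
# Per-pixel render targets (color, depth, G-buffer, etc.) scale linearly
# with pixel count, so 4K costs 2.25x what 1440p does for the same buffers.
def render_target_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Total MiB for per-pixel render targets at a given resolution."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_mib(w, h, 32):.1f} MiB of render targets")
```

Render targets are only part of the budget (textures and geometry usually dominate), but everything that scales per-pixel grows by the same 2.25x factor going from 1440p to 4K.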
r.linder Sep 12, 2020 @ 6:37pm 
Even 8 GB is enough for pretty much every game at 4K. 10 GB is enough.
🦜Cloud Boy🦜 Sep 12, 2020 @ 11:23pm 
10 GB is not low. It is enough for at least the next 3 years (going by the trend).
After that, you can lower the TEXTURE setting to HIGH instead of MAX/Ultra. It looks almost the SAME anyway.
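There is simple math behind the texture-setting suggestion above: stepping a texture down one resolution tier quarters its memory. A minimal sketch, assuming uncompressed RGBA8 textures and ignoring mip chains and compression:

```python
# Halving a square texture's resolution quarters its memory footprint,
# which is why dropping from Ultra to High textures frees so much VRAM.
def texture_mib(size_px: int, bytes_per_texel: int = 4) -> float:
    """MiB for one uncompressed square RGBA8 texture."""
    return size_px * size_px * bytes_per_texel / (1024 ** 2)

print(texture_mib(4096))  # 64.0 MiB for a 4096x4096 texture
print(texture_mib(2048))  # 16.0 MiB one tier down
```

Real games use block compression and mip chains, so absolute numbers are smaller, but the 4x ratio per tier holds either way.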

Date Posted: Sep 12, 2020 @ 1:23pm
Posts: 21