Why would you trade your 1440p 165 Hz for 4K 60 Hz? You lose the higher refresh rate for higher resolution, and the difference between 1440p and 4K is meh unless you're using a really large monitor or TV and sitting close to it. It's not worth the trade-off.
Also, as for 10 GB being too low, you can tell that to Nvidia, but at the same time, you can't "future proof" your PC since technology evolves VERY fast, so there isn't much point dumping a lot of money into it right now.
This is not good advice.
People were saying the same thing when the RTX 2080 Ti came out.
Technology changes very fast; what will you say if Nvidia decides to release a new lineup with double the VRAM at the same price?
There is no such thing as “future proofing” in the world of technology.
Yep there is no way to really future proof!
Well, I mean, the 2080 Ti is still great considering it's two years old. And it's got more VRAM than the 3080, so it should still last two or three more years, and that's the most you can expect from most GPUs. Five years is future-proof for a GPU.
How much did the 2080 Ti originally go for? £1,400? £1,500? About the same as the 3090.
https://www.dell.com/en-us/work/shop/dell-ultrasharp-27-4k-usb-c-monitor-u2720q/apd/210-avjv/
It has fantastic color reproduction, accuracy and calibration. At 27", 4K would look really crisp.
Here's the thing:
165 Hz is really smooth and mouse input is noticeably more responsive; I can easily see huge differences between refresh rates and frame rates. I would even dare to say I can see a bigger difference than most people, but I would still much rather sacrifice that smoothness for image quality now that I've seen both.
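To put rough numbers on that smoothness difference, frame time is just the reciprocal of the refresh rate; this is plain arithmetic, not a claim about any specific monitor:

```python
# Frame time at each refresh rate: the gap between displayed frames.
# Plain arithmetic, no monitor- or game-specific assumptions.
def frame_time_ms(hz: float) -> float:
    """Milliseconds between frames at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 144, 165):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
# 60 Hz  -> ~16.67 ms, 165 Hz -> ~6.06 ms
```

So 165 Hz delivers a new frame roughly every 6 ms versus every 17 ms at 60 Hz, which is why the input feels so much more immediate.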
A few months ago, I saw a 27" 5K iMac in a store and it looked absolutely mind-blowing to me in terms of resolution and color reproduction, and the Dell monitor should have even better colors. It would be an understatement to say that it looked a million times better than the monitor I'm using now, except for refresh rate. I still can't stop thinking about how fantastic that iMac looked and I realized that I want a monitor like that badly.
Again, bad advice. VRAM does not mean everything. By that logic, a GTX 1080 Ti would be better than a 3080 because it has more VRAM.
Don't use the term "future proof".
That's still a lot of money.
There is no point spending a lot of money right now to "future proof" your technology, considering the same extra money could be used to upgrade in the future to better, more efficient tech when it's released.
Technology moves super fast, and chances are Nvidia already has an RTX 4000 series or an Ampere refresh/successor ready, just waiting for the right time to release it.
Does your workload or usage benefit from color accuracy? Because if you're just going to be gaming, you won't benefit much from a color-accurate monitor versus a high-refresh-rate one.
Also, you can't compare your workload with an iMac, considering iMacs/Macs are designed more for the professional market and don't focus primarily on gaming.
Colors may look nice, but if you're not going to take full advantage of the color spectrum in your workflow, it's a luxury, not something you need. For example, if you upload a photo to Facebook, it will be shown in sRGB, not AdobeRGB, even if the picture is perfectly calibrated for color accuracy.
For now you will be fine; even at 4K, few games use or need more than 8 GB. But there are *some* that are already pushing that limit hard (looking at you, MSFS 2020), and I would expect the limit to keep getting pushed.
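For a rough sense of where VRAM actually goes, here's a back-of-envelope sketch of render-target sizes at 4K. These are illustrative numbers of my own, not measurements from any game; real usage varies wildly:

```python
# Back-of-envelope buffer sizes at 4K (3840x2160). Illustrative only --
# assumed formats, not measured from any real title.
width, height = 3840, 2160
pixels = width * height            # 8,294,400 pixels

color_bytes = 4                    # assuming an RGBA8 color buffer
depth_bytes = 4                    # assuming 32-bit depth/stencil

target_mb = pixels * (color_bytes + depth_bytes) / 1024**2
print(f"One color+depth render target at 4K: ~{target_mb:.0f} MB")
# Even a deferred renderer with several such targets stays in the
# hundreds of MB; the bulk of a 10 GB budget goes to textures and assets.
```

That's why texture quality, not resolution itself, is usually what blows past an 8-10 GB limit.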
The only real option would be to toss down on the 3090, or wait and see what AMD brings to the table. If their performance is competitive/comparable, which it's sounding like it will be up through the 3080, then you might find them a better option, as they *tend* to offer more VRAM.
Right now we're all playing the waiting game.
RTX 3080 will be faster, but not enough to get average 60 FPS.
So, you will need to lower either settings or resolution. And at 1440p even 6 GB will run the game well.
After that, you can lower the texture setting to High instead of Ultra; it looks almost the same anyway.