But 6 years on, 8GB is what I get on TWO tiers of GPUs, mid-range included. The higher VRAM only exists in the top-tier cards, which is just BS. You know why? Because they don't have to do it that way. They do it just because.
would you share the tweaks?
The VRAM capacity isn't that important; capacity only really matters based on how much you're actually using, and unused memory is effectively wasted memory. The fact that flagships have even more than 16GB is stupid and adds extra cost for nothing when it comes to gaming GPUs, but it's something AMD started because they don't really have any other selling points aside from trying to offer more FPS per dollar and throwing in extra VRAM to entice buyers. But very often these cards can't actually perform well enough in games at settings that use 100% of the available VRAM to make that viable; the 3060 12GB was criticised for that, as was the 4060 Ti 16GB.
I have a 3080 10GB and rarely go over 8GB usage at 1440p. The only time I've seen it use up all 10GB was running Cyberpunk 2077 at native (no DLSS) at max settings, because it's an extremely detailed game; with DLSS it's closer to 8GB, but few people actually run the game at max settings due to performance issues.
I haven't noticed any of my other games go above 8GB, so it's fine for most titles, especially for low-end GPUs that really have no business having excess memory. Flagships always had extra VRAM and it never really mattered; for the longest time even 2160p was perfectly fine with 8GB, but they still gave the 1080 Ti 11GB, and it's only a few percent faster than the 3060 12GB. Yet I don't recall the 1080 Ti ever being criticised like the 3060 was, because it was the GTX 10 series flagship, even though the extra 3GB wasn't useful.
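If anyone wants to check their own numbers, this is roughly how I'd log peak usage during a session (just a rough sketch, assuming an NVIDIA card with nvidia-smi on the PATH and a single GPU; the 2-second interval is arbitrary):

# poll nvidia-smi and track the highest VRAM allocation seen; stop with Ctrl+C
import subprocess
import time

peak = 0
while True:
    line = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]  # first (only) GPU
    used, total = (int(v) for v in line.split(","))
    peak = max(peak, used)
    print(f"now {used} MiB, peak {peak} MiB of {total} MiB")
    time.sleep(2)

Keep in mind this reports what the game has allocated, not what it strictly needs, so a card sitting near its limit may just be caching aggressively.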
a 4GB GT 730 is still weaker than the 1GB GTX 750
the only oddball was the 1060, with 3GB and 6GB versions
they used the same GPU 'core', but the 6GB had other parts enabled to allow it to use more RAM
https://www.hwcompare.com/31385/geforce-gtx-1060-vs-geforce-gtx-1060-3gb/
the 6GB has more cores and TMUs
there is always one thing that is your bottleneck.
with your regular CPU and RAM, one of them will always be the limit while the other has room to spare.
same with your GPU and VRAM.
but just as not all programs use the same balance of CPU and RAM (some tend to use more RAM but less CPU, and vice versa),
the same is true for programs' demands on the GPU.
adding more VRAM to the same GPU does help in such cases.
generally, programs' demand for RAM/VRAM goes up faster than their demand for FLOPS/GFLOPS.
it would be great if VRAM became modular like normal RAM.
but lacking that, it makes sense to buy some overkill of VRAM. it may not help you now, but it can help in the future..
by shipping GPUs as standard with only a relative minimum of VRAM (except the top models), the manufacturers practice planned obsolescence.
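a rough way to see which side is the limit on a given system (just a sketch, assuming an NVIDIA card with nvidia-smi installed and a single GPU; the thresholds are arbitrary guesses, not anything official):

# take one sample of core utilization vs VRAM headroom and guess the current limit
import subprocess

line = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=utilization.gpu,memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()[0]
util, used, total = (int(v) for v in line.split(","))

if util > 95 and used < 0.9 * total:
    print(f"core-bound: GPU at {util}%, only {used}/{total} MiB of VRAM in use")
elif used > 0.95 * total:
    print(f"VRAM-bound (or close to it): {used}/{total} MiB used, GPU at {util}%")
else:
    print(f"no clear GPU-side limit: {util}% core, {used}/{total} MiB VRAM")

run it while the game is loaded; a single sample is noisy, so take a few.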
just more VRAM does not make it a better GPU
you can allocate 8GB or more of system RAM to an Intel UHD iGPU; it does not make it any stronger
VRAM on a dedicated GPU will never be modular, since the GPU's core needs to talk directly to it
each GPU gen/tier has its core designed to talk to a specific RAM type, freq/timings/bus width/voltage etc..
they balance what the GPU can effectively use against price to make GPUs for each tier
I have a 3060 laptop with 6GB VRAM that can run Control on max settings at 1080p at 60fps, but textures have noticeable pop-in.
On my GTX 1080 desktop I need to turn most settings down to Medium, except texture resolution, which is Ultra, and I've never seen any pop-in there.
Funny thing is, Control looks so good I'd never have guessed most of its effects were on medium on my machine, but texture pop-in? THAT'S noticeable even on Ultra settings.
Same with Resident Evil 4, which already wants more than 8GB to properly pre-load all high-res textures. The difference isn't noticeable unless you really zoom into some of them, but it's there for sure.
You mentioned Cyberpunk, which is super detailed, but it's a 3-year-old game.
The list will only keep growing.
but i have seen that video; the GTX 1080 Ti was the best value GPU to ever exist
we didn't know that at the time of its release tho
if you don't mind 1080p/60fps, the 1080 Ti probably will still be relevant 3 years from now.
also, as an answer to what to upgrade the GPU to, i usually recommend people at least double their performance for about what they paid for their last GPU
example: i paid 400 CAD for my 1070 back in 2017 and i'm looking at the 4060 as an upgrade option as it's about 400 CAD rn.
that said, i would really focus on upgrading the CPU/mobo and RAM first; that i7 is really closing in on its last year or so. Games are becoming a lot more CPU demanding as CPUs have been getting stupidly fast lately. I'm even considering upgrading my i5 8600K soon.
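the math behind that doubling rule of thumb is simple enough to sketch (all the numbers below are placeholders, not benchmark results; plug in a relative-performance figure from whatever reviews you trust):

# rule of thumb: upgrade when you get roughly 2x the performance for about the old price
def worth_upgrading(old_price, new_price, perf_ratio, target=2.0, price_slack=1.1):
    # perf_ratio: candidate card's performance as a multiple of the current card
    # price_slack: allow roughly 10% over what you paid last time
    return perf_ratio >= target and new_price <= old_price * price_slack

# placeholder example: 400 CAD paid for the old card, candidate is ~400 CAD and ~2x faster in reviews
print(worth_upgrading(old_price=400, new_price=400, perf_ratio=2.0))  # True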
The GTX 1080 Ti was a good value for a high-end card specifically, but I'd definitely put the GeForce4 Ti 4200, or especially the 8800 GT, above it if we're talking anything of all time, and that's only counting nVidia's own stuff. AMD's had some exceptional values. I think the GTX 1080 Ti is getting the benefit of recency bias in people's minds. It happened on the tail end of GPU generations going from every one year to every two, and before prices started rising, so it seems better due to those factors. It was still very good regardless, but I'd say it was the best flagship value only, not the best of all time.
The 8800 GT (G92, so it was really a 9000 series chip) was almost all of the flagship (8800 GTX/Ultra) performance for less than half the cost. And that was when flagships cost closer to half a grand. Imagine nVidia releasing an RTX 4080 Ti that stops just shy of matching the RTX 4090 but costs only one third as much (and I mean one third of the MSRP, not one third of the nonsense currently going on with its pricing). It would be insane, and that's exactly what the 8800 GT was. Actually, the 8800 GT would still be better, because its $200 to $250 price point was in reach of most consumers, whereas a card even at one third the price of the RTX 4090 still would not be. Most people buy x50s and x60s, and that price point is like a quarter or less of the price of the RTX 4090.
It was flagship performance at midrange pricing, and we took it so much for granted that we overlook it even today. We'll never see it again. Never. nVidia seemingly cannibalized their own lineup just to begin burying AMD, and it eventually worked. The 8800 GT was practically the only card that mattered for a while after it released.
i guess since i didn't get into PC gaming until 2009, i'm not too familiar with those cards
but i do agree that the 1080 Ti is the best value flagship, just not the best value GPU of all time
also, my bias from cards being more expensive now def plays a part in it
Yeah, the GTX 1080 Ti is like the Core i5 2500K (or 2600K if you prefer) of GPUs. Part of what made it so good for so long was that things slowed down (and/or got more costly) after it, which helped elongate the GTX 1080 Ti's longevity. The 8800 GT actually probably had a shorter life than it, but all told, I feel like it represented the best value we've ever seen from any GPU product, since the GTX 1080 Ti was merely a value for flagship buyers.
The 8800 GT, by contrast, actually removed just about any reason to buy the flagship, and was at a price accessible to the mainstream segment. It was like the perfect storm. Unfortunately we'll never see it again with the way the market is going.
really only upgraded because the Phenom line stopped supporting certain hardware features
for certain games, performance was actually fairly solid up until then
especially when i was only aiming for 60fps at the time
I used a 2500K until mid-2020 and it was alright for me until then. But I only go for 60 FPS, and I imagine playing stuff like the Sims series or Minecraft has given me a mentality of not sharing the "I can't live with 140 FPS dropping to 80 FPS" mindset the modern crowd seems to throw out so often. Almost makes me want to just stick with 60 FPS so I don't ruin that, haha. The bigger reason I want higher isn't even for games (although it might be nice there too), but for the desktop, to be honest.
even 60 is okay since i have a G-Sync monitor. but yeah, i kinda agree that i don't really understand the people who will only play at 144. it's nice for sure, especially for multiplayer games
it's def not needed to enjoy a game tho imo
Oh gosh, you just made a case for holding onto a GTX 1080 (or any replaced GPU) just a bit longer. Sorry you had to RMA your new card already. Good thing you held onto your older one; at least you're in familiar territory again.
Even though I have an acct at HeatWare, I'm in no rush to sell my GTX 1080. Sometime first quarter next year, maybe.