For me, the biggest arguments are that the 3060 came with 12GB, while the 3070 had only 8GB and the 3080's 20GB version was cancelled. Also, there was only a tiny price difference between the 7900XTX and the XT, even though there was a substantial performance difference and an additional 4GB of VRAM.
Think about this: If VRAM were expensive, AMD would use less VRAM and a larger die instead, or lower the prices even more with less VRAM to sell their GPUs at half the price of NVIDIA's.
I'm not trying to be a jerk, but it sounded like you were well educated on the cost of VRAM, when really you don't know, and what you've read is just bits and pieces from other random people on here?
I mean, if you had a link to someone credible, I suppose I wouldn't be typing this. If it really did cost only $20 to add, say, 8 GB of VRAM, then yeah, I suppose that's messed up (kind of).
But I would still say, in Nvidia's defense, they're a business. If they CAN give less and make more, that's what they're here for: to make money. Every business does it. They're LITERALLY doing AMD a favor, because they would put AMD out of business if they offered the same amount of VRAM.
The use of the word "just" was to denote that it's a game that's already causing me VRAM issues at 1920 x 1200 (and at quite tame render distances), as opposed to being a recent triple A title at maximum settings at 4K or whatever.
Maybe so. I was just listing one that I play that didn't fall under your criteria of being a triple A title at maximum settings at 4K. Pretty far from it, in fact.
And there's no end to the supply of reviews and articles finding places where it can't be enough. So as someone "in the buying market", 8 GB isn't even a consideration for me. I'm even questioning the longevity of 12 GB, but that's only because I want the option to NOT have it be a potential concern 3 or so years from now.
My current card had a generous amount of VRAM for the time, and I believe that's what helped it last as long as it did for me. So I'm sticking with the experience that worked for me and looking for something that isn't borderline on VRAM.
One of the main things I was getting at with that question is this: do you think it's fair to say that if "vanilla" Minecraft works fine at 1080p/1440p, then the developers of Minecraft, as well as Nvidia, provided exactly what was needed for that game?
Yet, as per the stock market as of yesterday, NVIDIA's estimated net worth is 669 billion dollars. They've earned almost 27 billion in revenue so far this year, and we're not even halfway through the year yet, with a net income (income after all taxes, deductions, and costs are taken out of revenue) of 4.3 billion. So it's safe to say that they really could have kept their pricing more competitive, or made better cards that would actually be closer to their MSRP in price/performance value. Instead, they decided to be as lazy as freaking possible, making the tiniest of changes with each generation, and then wondering why their sales aren't meeting expectations, as if they thought we'd be that ♥♥♥♥♥♥♥♥ stupid. But we're not; hardly anyone who knows what they're looking at would want the 4070, which is basically just a 3070 Ti, and 40%+ slower than the 4080 on numerous occasions for half the price, when you can already get better for the same price or less.
It's a damn good thing that they weren't allowed to acquire ARM Holdings from SoftBank, one can only imagine the damage they'd have caused to the industry if they had the same practices there as they do here.
One thing people don't factor in is the fact that Nvidia uses GDDR6X, unlike AMD's GDDR6.
The cost is a tad higher because of that, of course. But overall the AMD cards are cheaper (they always have been), while overall performance and stability are just much better on the Nvidia cards, along with better software (in my opinion).
We could talk about the heat issues on AMD cards at the moment as well.
Prices will go higher still, due to inflation and current world conflicts. That is how it is.
Most people don't need a high-end GPU, however; the vast majority will be fine with an entry-level or mid-tier GPU.
It would be silly to relaunch a 30-series card with more VRAM; if anything, it should be a relaunch later on for the 40 series (but tbh, 12/16 GB of VRAM is more than enough for the mid/high tier, and entry level with 8 GB is just fine for 1080p).
But nVidia doesn't provide hardware for games on a per-game basis. They just sell hardware to consumers (and right now, one of the demands of many consumers is "you're providing too little VRAM"). And stock games aren't the only things driving users to want more resources. So I don't think it's fair to say other uses "don't count", if that's something you're trying to suggest?
Because if so, that's not how it works. If, say, a current configuration I set up of Minecraft needs X amount of CPU, X amount of RAM, X amount of GPU processing, and X amount of VRAM, well... that's what I need if I want to play it that way, right? Whose fault it is, is neither here nor there. It doesn't matter. The only thing that matters is that if I want to play it that way, then that is what I need. So it becomes a justifiable need for whatever it is I am seeking. So I either get it, or I don't play it that way. Same as with everything. There are no alternatives.
You are right, you didn't pick a side to blame. You were discussing the hardware aspect of this. I messed up.
Yeah, to me, there's little point in blaming things when it pertains to something I'm not completely sure of (which is why I posed the question I did to the people who were placing blame).
I just take a more practical approach. Is something I'm doing showing that it needs more of a certain something? Well, then I guess I need more of that certain something. It doesn't matter why. If I apparently need it, then I just need it.
In this particular case, I can not and will not go without my bushy Minecraft leaves! So if I have to feed the game VRAM for that, well, I guess I will, haha. Also, I'm really hoping I can turn the render distance up, turn a few more settings on or up higher, stay at 60 FPS a little more often, and maybe even do 1440p in case it comes to that later. And yes, I'm even wondering if a 6800 XT would be enough for that. I wasn't exaggerating when I said I don't think even an RTX 4090 could max this. And I'm not even looking to max it, nor am I using the most demanding shaders (or very high resolution resource packs). This game can apparently get silly with what it needs at times.
Apparently it makes some games load more data into VRAM to improve performance and reduce stutters.
I personally can't test it out with my 8GB GPU, as Returnal, for example, takes 20GB of VRAM with that option on.
5 mainstream GPUs with that amount of VRAM: 3090, 3090 Ti, 4090, 7900 XT, 7900 XTX
EDIT: This option is for UE4 games and makes titles like Returnal, Lost Ark, Fortnite, and Borderlands run better. That includes reducing texture pop-in.
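I'm not certain which exact option is being referred to here, but assuming it works along the lines of UE4's texture-streaming budget, you can get a similar effect yourself in many UE4 games by raising the streaming pool in the game's Engine.ini. The cvar names below are standard UE4 ones; the values are just example numbers I picked, and not every title honours user overrides:

[SystemSettings]
; Example only: raise the texture streaming pool (in MB) so more texture data
; stays resident in VRAM, which tends to reduce pop-in and stutter.
r.Streaming.PoolSize=4096
; Optional: stop the engine from clamping the pool to its own VRAM estimate.
r.Streaming.LimitPoolSizeToVRAM=0

Obviously only do this if you actually have the VRAM to spare; on an 8GB card a pool that large can make things worse, not better.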
Those are not mainstream GPUs; those are niche GPUs, all of them. Fewer than 5% of all gamers use one of the cards you listed.
The mainstream GPU at the moment is the 3060 (on Steam alone, around 10.5% of all users have one; that means roughly one in every ten users on Steam has a 3060 specifically). This used to be the 1060, but after a few generations the average shifts to a newer entry card.
The 3060 has 8 GB of VRAM, something that 32% of Steam users have (it is the biggest VRAM bracket and it is above the average VRAM as well; a rough check of that is below). 22% have 6 GB, and then we have the 12 GB users at around 16%.
Interestingly enough, 8% still use 4 GB of VRAM, only 1% have 24 GB, and only 0.6% have other amounts (in between).
The 2060 and 1060 both have around 8% of users.
The 3060 Ti and the 3070 have alright representation, at around 5% each.
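Just to put a number on the "average VRAM" point, here is a rough back-of-envelope check using only the bucket shares quoted above. They don't add up to 100% (the survey has more buckets than are listed here), so this is an approximation, not the survey's own figure:

# Rough check of the average VRAM using only the shares quoted above.
# These buckets cover ~79% of users, so we normalize; the real survey
# average would be pulled somewhat lower by the missing 2-3 GB buckets.
shares = {8: 0.32, 6: 0.22, 12: 0.16, 4: 0.08, 24: 0.01}  # GB -> share of users
covered = sum(shares.values())
avg_vram = sum(gb * share for gb, share in shares.items()) / covered
print(f"Covered share: {covered:.0%}, approx. average VRAM: {avg_vram:.1f} GB")
# Prints roughly: Covered share: 79%, approx. average VRAM: 8.1 GB

On those buckets alone, 8 GB comes out right around the average; the smaller buckets not listed would pull the full-survey average a bit lower.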
Let's look at the cards you call mainstream...
A 4090 has a representation of 0.25% of users.
A 3090 is at 0.43%.
The AMD cards you mention are under 0.10% here, with the older ones being a tad higher (AMD is not as popular as some people think, let alone for gaming).
E.g. the AMD Radeon RX 580 is at 0.74%.
High-end and top-tier GPUs are a tiny, tiny minority; they are not mainstream. Entry-level and average GPUs are.
What do I mean by entry level?
xx60 cards <-- that is entry level; average cards are the likes of the xx60 Ti and xx70.
We also have many people with budget cards, like the xx50 (Ti) series. These are almost not even gaming cards in my opinion, but they will play many things anyway (if your teenager just plays Roblox and Minecraft, they are perfectly fine).
My point here is: talking about high-tier GPUs like they are mainstream is misleading, and talking about more than 8 GB of VRAM like it's a must-have is silly in the vast majority of cases.
As I said earlier, I have yet to see a game that demands more than 8 GB of VRAM to function and run at high/mid graphical settings (that is with a now entry-level tier card), and at 1440p!
If you wanna play upcoming graphically heavy AAA releases in 4K with (in my opinion silly) ultra settings, then you surely will benefit from having a tad more VRAM; 12 GB would most likely be enough for that, maybe 16 (a lot of the high-tier Nvidia GPUs in the current gen have at least 12 GB, and some have 16 or 24).
But for the average user playing Dota, Counter-Strike, Roblox, Fortnite, Lost Ark, GTA or Rust, I reckon that even an ancient machine, built around an ancient GPU (tech-wise), would run them perfectly fine. A 1060 will run all of these games just fine, and a 3060 would handle all of them on higher settings at 1080p without issue.
My point here is that there is no issue for the vast majority of people; this is a niche issue. Even for someone like me, who sometimes buys AAA titles, there's no issue at 1440p with an entry-level card. Granted, I understand the argument about getting something better, and I am building a new machine as well. I understand I am part of the minority and that I will get a high-mid or high-tier GPU to build around. But that has more to do with me being able to lavish a higher investment than with actual needs.
In the vast majority of games, I would be perfectly fine with what I have now, even in upcoming AA and AAA titles.
At the end here, let me agree with one thing, however. The 40 series is not a good generation; I reckon it is average, maybe even below, in terms of performance per euro, but maybe we underestimate all the software advances we have seen as well.
The 700 and 10 series (to name some modern ones and not go all the way back to the 2nd generation) were awesome!! My old mid-tier card from the 700 series lasted a long, long time. Sometimes bigger changes happen, sometimes not.
If we look at the 20 series, it was below average but still fine enough to merit a buy (I had to upgrade around that point, so I took a chance and was happy with my card). But let's be honest, it was lower performance because you paid for new tech (RT) capability. And none of those 20-series cards can pull off acceptable 1440p/4K RT compared to raster.
RT is the future, of course (even if it's been used for ages in e.g. animation); it is so much easier to work with, and once it catches up with raster, it will be used much more.
The thing about raster (rasterization) is that it's been in the game (pun intended) for so long that developers have come up with so many ways to "cheat" with e.g. lighting, shadows, etc. at a low performance cost, that most people would not notice the difference yet, unless the game leans heavily on RT.
At the end of the day, the average gamer needs no more than 8 GB of VRAM, and won't for many years. If you play many different games and do get AAA titles for 1440p or 4K, then obviously I am not gonna advise you to buy the cheapest; I will advise you to get whatever you can at a higher tier that still balances the machine... My point, however, is that we are a minority, and the VRAM talk I see has blown up.
I reckon that is because some games are awfully optimized. I've seen some people claim that the only issue comes with sloppy console ports (I don't play many of those), but in general, that is not the GPU's fault then; it's the developers' fault. Don't support a practice like that.
Steam hardware survey
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
I think what confused me was (if I'm not mistaken) that you were defending NOT blaming developers, but I don't remember you vocally defending NOT blaming Nvidia.
If you have said "don't blame Nvidia", then ok. But I just thought it was one-sided to say "not fair to blame developers" while not saying "don't blame Nvidia".
If I'm wrong on any of this, I do apologize.
Then you should not use the word "mainstream", as it kind of implies "common" and "what the majority wants and has".
It is true that these cards are available to the public and that they are priced within a range where somewhat well-off tech enthusiasts and gamers can obtain them.
Mainstream they are not. Mainstream cards have between 6 and 12 GB of VRAM currently.