C1REX 9 JAN at 13:56
Why is Nvidia not giving enough VRAM? - rant.
New GPUs were announced.

16GB for $1000+ just to match PS5 VRAM.
12GB for $550+ just to match the incoming Switch 2.

If the 5060 has 8GB, it will match the PS4.

I understand and agree that many devs have been doing a very bad job recently, but this tiny amount of memory for ridiculous money only makes it worse. Sure, games can be better optimized for VRAM, but that often comes at the cost of worse-looking textures, stutters, or increased CPU demand for data streaming and decompression.

Nvidia tries to condition us into believing that VRAM is made of gold and ultra expensive, same as Apple does. Even Nintendo, super stingy as they are, understand that VRAM gives you more than computing power for less money. It would be cheaper for us to have games optimized for more memory but weaker GPUs: Nintendo gives 12GB to a GPU similar to the RTX 2050.

BTW: Something odd is happening on the forum. My previous topic was closed for being too old when it was made just before Christmas. Other topics commenting on Nvidia were also closed.
Last edited by C1REX; 9 JAN at 13:58
Showing 136-142 of 142 comments
C1REX 23 MAR at 13:50
Originally posted by xSOSxHawkens:
But please guys, tell me how I (and all the professional reviewers) are totally wrong here...
The 9070 XT is better value at MSRP ($599), but you can't currently buy the card at that price. Reviews are based on MSRP, and the 9070 XT would be destroyed in them if they used its current real-life price of about $750-$800.
If it ever comes back in stock at MSRP it will make the 5070 12GB look stupid and pointless, but it's not known when, or if, that will happen.
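To make the MSRP-vs-street-price value argument concrete, here is a minimal cost-per-frame sketch. The prices come from the thread ($599 MSRP vs roughly $750-$800 street); the fps figure is a hypothetical placeholder, not a benchmark result.

```python
# Sketch of the price-to-performance argument above.
# Prices are from the thread; the fps number is a hypothetical
# placeholder used only to isolate the effect of price.

def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    """Cost per frame of average performance; lower is better value."""
    return price_usd / avg_fps

HYPOTHETICAL_FPS = 100.0  # assume identical performance at both prices

msrp_value   = dollars_per_frame(599.0, HYPOTHETICAL_FPS)
street_value = dollars_per_frame(775.0, HYPOTHETICAL_FPS)  # midpoint of $750-$800

# Same card, ~29% more per frame at street price than at MSRP.
print(f"MSRP: ${msrp_value:.2f}/fps, street: ${street_value:.2f}/fps")
print(f"Street premium: {street_value / msrp_value - 1:.0%}")
```

The point the sketch illustrates: a review verdict computed at $599 simply does not describe the value of a card selling at $775.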


- FSR4 is worse than DLSS4's transformer model: worse quality, worse performance, and very few games officially support it ("officially" meaning via driver override, as I don't know a single game with fully native FSR4 support).

- consumes more energy

- still worse RT performance and worse quality due to no ray reconstruction.

I personally like AMD due to the 2x HDMI options from Sapphire. My Sapphire 7900 XTX 24GB with 2 HDMI ports cost about the same as the Asus TUF 4070 Ti 12GB. Only ASUS offers 2 HDMI ports on the Nvidia side, and not at MSRP. I also don't care about ray tracing.

But I do care about FSR a lot, and it's not better than DLSS4 upscaling. FSR3 is sometimes acceptable at 4K but often really bad. FSR4 got a lot better, but it's heavy/slow, has close to no official support, and still looks worse than the DLSS4 transformer model.

I personally also like to buy AMD just to not support Nvidia and their anti-gamer practices, but it's not a strong argument. You can make such a decision only when there is no other difference between the products; otherwise you are risking being a fanboy. However, there are more differences, and if AMD doesn't solve their upscaling, I will pay extra for an Nvidia card next time. I'm skipping this generation.
Rod 23 MAR at 14:12
Originally posted by xSOSxHawkens:
Originally posted by Rod:

You are being a fanboy. FreeSync has VRR flicker, which the G-Sync module greatly reduces. The Nvidia app trounces anything AMD has, injecting DLAA and updating all games to the latest DLSS. Then there's the AIB support: Sapphire is the best AMD has, yet the Nitro XT looks like a wish.com Aorus 5080, but uglier. Name one thing apart from price that AMD does better; I think that's an easy question to ask.
I mean, they have had better raw raster for what, 3 generations now? Considering that is the be-all and end-all of actual performance, it's probably a great place to start.

FreeSync has been just as good as G-Sync for ages; if you think otherwise, it's only because you're too much of a fanboi to bother even trying the competition. But most people, including NV themselves, and most monitor makers themselves, have been off the G-Sync hardware bandwagon for about 4-5 generations of GPU and monitor tech now, bud...

Open source and FreeSync won. Go cry about that Nvidia tax that no one wants to pay except you, I guess...

AMD's software suite UI, and its ability to inject upscaling, AA, and, if desired, FSR (either as a scaling tech or as a native-res AA implementation), has been a thing just as long as NV has had it, some of the features even longer than NV had them. And the UI is, by most accounts, far better and more modern, with a far more usable software suite than NV Control Panel + GeForce App + MSI Afterburner...

Again, as I said above, I like how (as per usual) I back up my claims with links to reputable sources, get called a fanboi, and then have people spew things they don't bother to back up (because, pro-tip, most of the things you tossed on the table can't be backed up anymore by any reputable articles or reviews that aren't 5+ years old).

But you keep beating that dead horse...


OK, they win on raster, but raster perf is dead, like teraflops and Moore's Law. But mocking G-Sync modules says you never even owned one. You want articles? Well, that's like demanding an article about flat earth. Had you owned the different modules and gotten used to VRR flicker, you would see how much it is reduced on the AW3423DW. You said it was dead; sure, on all non-premium models it is. And there's more to it than the module: panels are graded, and the FreeSync panels are worse panels, hence why the FreeSync version, the AW3423DWF, is 165Hz and not 175Hz, hence the G-Sync premium. They can also do ULMB, unlike FreeSync, and the VRR range tends to be greater too.
Monk 23 MAR at 14:34
Originally posted by xSOSxHawkens:
Originally posted by Monk:
Everything you just linked said it's close or nearly there; heck, you even highlighted a section clearly stating the 5070 Ti is 15 to 20% faster than the 9070 XT with ray tracing!!!

Yes AMD has made huge gains, but they are still behind.

You are the darn definition of a fanboy, you are literally linking stuff that disagrees with you, then entering your own interpretation.

I'll say it again: if AMD can stay 100 to 150 bucks below the equivalent Nvidia card, they are onto a winner at the 9070/XT level. They are not quite equal on performance yet, but that is fine as long as they are priced correctly. Why is that so hard to understand?

I will buy an AMD GPU the moment its features and performance beat the Nvidia equivalent I would be looking to buy, as performance and experience are all I care about; I don't care whose name is on the card.

Same goes for AMD CPUs vs Intel.
No, it didn't disagree at all. The article clearly stated that they are on par in RT.

The 5070 Ti being a bit faster *is still on par, as it's more expensive*, a fact that the reviewer clearly accounted for and you are ignoring...

FFS my man... Can you not bother to read more than one sentence?


You cannot argue it is better when it is factually not; price does not change facts.

You could argue better value, but I've long admitted that if it stays decent but cheaper once supplies are back to normal, then it is a great choice.

But on raw hardware and technical ability AMD IS still behind.

I've no need to read articles that tell me the same thing all the others do, and what I have said myself since launch: the 9070 XT at £600 is great. Good luck getting one at that price though, and if it can remain a good chunk cheaper than the 5070 Ti, it will be a good choice. That does not make it better though, nor does it magically make AMD's tech the best.

Honestly, I am yet to see a fair review comparing all aspects of each card under ideal conditions; most seem to skip upscaling, frame gen, and ray tracing entirely.

I also do not expect to see one for months, until availability is at a level where you can just go and buy a card; that is when the truly best option will show up.

If (and it's unlikely) the 9070 XT remains a £600 card vs the £750 5070 Ti, AMD wins on value; if it costs £700, Nvidia wins on spec.
C1REX 23 MAR at 15:34
Originally posted by Monk:
If (and it's unlikely) the 9070 XT remains a £600 card vs the £750 5070 Ti, AMD wins on value; if it costs £700, Nvidia wins on spec.
The AMD card only makes sense at MSRP, and only when compared to the 5070 12GB: stronger, with more VRAM, for just $50 more.
The 5070 Ti 16GB is just a better card, however. It's stronger overall, even in pure raster, while using less energy, and it pulls further ahead when considering heavy RT and the new upscaling. These cards shouldn't be compared in my opinion, as the 5070 Ti is realistically a tier better card, but unfortunately at a stupid price right now. Current 9070 XT prices are also stupid, though, so it's a choice between stupid options right now.
Monk 23 MAR at 15:48
Originally posted by C1REX:
Originally posted by Monk:
If (and it's unlikely) the 9070 XT remains a £600 card vs the £750 5070 Ti, AMD wins on value; if it costs £700, Nvidia wins on spec.
The AMD card only makes sense at MSRP, and only when compared to the 5070 12GB: stronger, with more VRAM, for just $50 more.
The 5070 Ti 16GB is just a better card, however. It's stronger overall, even in pure raster, while using less energy, and it pulls further ahead when considering heavy RT and the new upscaling. These cards shouldn't be compared in my opinion, as the 5070 Ti is realistically a tier better card, but unfortunately at a stupid price right now. Current 9070 XT prices are also stupid, though, so it's a choice between stupid options right now.

I 100% agree. I really hope AMD can keep the 9070 XT at 600, as it's an awesome card at that price.

Until prices and availability settle down, it's very hard to say which is best without a big ol' asterisk next to the conclusion saying "at this time, with inflated prices". My advice to everyone is to just wait; it's what I am doing with the 5090.

The 5070 vs 9070 will come down to price too. They seem to trade blows well: one has more VRAM, one is faster. Your resolution and the type of games you play will dictate the choice for many. If they are the same price, I'd go with Nvidia; I feel 12GB of VRAM is plenty for pretty much any game, and it has the edge with DLSS and ray tracing. But having 16GB of VRAM will give peace of mind if you're looking to buy every 2 or 3 generations rather than upgrade each time.
smokerob79 26 MAR at 0:15
It really is a mixed bag this time around..... we have both the 9070 XT and the 5070 Ti going toe to toe in most games, with some running the same frame rates, and things like Red Dead 2 showing favor to the AMD card for the first time ever!!!

Real world, this gen, NgreedyA can NOT justify the price of what they're selling, and depending on AMD's GPU allocation, they could lose a massive part of the market segment.....

IF AMD can keep the cards rolling in close to MSRP, they're going to move every card they produce....
Rod 26 MAR at 5:46
Looks like the Nvidia feature set got better: you can now dial in, by percentage, the base resolution used by the upscaler, from I guess 33-100%? 100% being native, i.e. DLAA. So you can really refine what the AI is doing now.
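For anyone wondering what a base-resolution percentage actually means in pixels, here is a small sketch mapping a scale percentage to the internal render resolution. The preset ratios below are the commonly cited DLSS scale factors (Quality ~66.7%, Balanced ~58%, Performance 50%, Ultra Performance ~33%); treat them as illustrative, not as Nvidia's official numbers.

```python
# Sketch: map an upscaler's base-resolution percentage to the internal
# resolution the GPU actually renders before upscaling to the output.
# Preset percentages are the commonly cited DLSS ratios (assumption).

def render_resolution(out_w: int, out_h: int, scale_pct: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and scale %."""
    s = scale_pct / 100.0
    return int(out_w * s), int(out_h * s)

PRESETS = {
    "Ultra Performance": 33.3,
    "Performance":       50.0,
    "Balanced":          58.0,
    "Quality":           66.7,
    "DLAA (native)":    100.0,
}

for name, pct in PRESETS.items():
    w, h = render_resolution(3840, 2160, pct)  # 4K output
    print(f"{name:>17}: renders {w}x{h} ({pct}%)")
```

So at a 4K output, 50% renders internally at 1920x1080, and 100% is native resolution with DLAA applied; a finer percentage dial just lets you pick points between those presets.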

Posted on: 9 JAN at 13:56
Messages: 142