xSOSxHawkens 20 Apr 2023 @ 10:45
Time is not being kind to nVidia and its VRAM hoarding...
https://www.techspot.com/article/2661-vram-8gb-vs-16gb/


Escrito originalmente por "At a 1080p Rez...":

As we've seen a number of times now, the issues for the RTX 3070 start when we enable ray tracing. You could claim that the ultra quality preset is too much, but you'd also have to ignore the fact that the Radeon 6800 is good for 73 fps, while the 3070 is unusable thanks to its 8GB VRAM buffer. Worse still, the 6800 is still able to deliver highly playable performance at 1440p.

Escrito originalmente por "In Hogwarts Legacy":
With the RTX 3070 we could notice the occasional jarring frame stutter, but the bigger issue is the image quality. The 3070 is constantly running out of VRAM, and when it does, the game no longer runs into crippling stuttering that lasts multiple seconds; instead, all the textures magically disappear and then reappear at random. What you're left with is a flat, horrible-looking image, and this occurs every few seconds. Even when standing completely still, the RTX 3070 keeps cycling textures in and out of VRAM as it simply can't store everything required for a given scene.

But what about DLSS?

If we enable upscaling (DLSS and FSR quality modes) for the RTX 3070 and the Radeon 6800, frame rate performance certainly improves, but unfortunately the RTX 3070 just doesn't have enough VRAM even when upscaling from a 720p image, and we see the constant popping in and out of textures. Performance is also much worse on the 3070 compared to the 6800, and we're still seeing stuttering; it's just a mess.

The next step is to dial down the quality preset to high, with high ray tracing; after all, we don't typically recommend ultra settings in most games. But even here the RTX 3070 suffers from regular frame stutter, and in some instances severe frame stuttering.

Dropping down to the medium preset with ray tracing still set to high is a much improved experience with the RTX 3070, and texture quality is no longer an issue, though there is still the occasional frame stutter as VRAM usage is right on the edge and occasionally spills over.

__________________________________________________________________________

Personal take:

I have said for a long time that, with relative performance being close, two GPUs with differing VRAM capacities will have their usable life determined by that capacity, and thus more VRAM is worth it, within reason.

Since the RTX3k/RX6k release I have been critical of nvidia and its choice to be stingy, hoard VRAM, and not put it on their cards. I regularly posited that the 16GB cards from AMD, starting at the 6800 non-XT, would offer better long-term value than the 8 and 10 GB cards from the NV lineup. I thought, both back then and now (and rightfully so, it would seem), that spending money on something like a 3070 was an absolute waste. Seems it was indeed.

When the 6500 XT was out, I pointed out that the card itself was OK if you stayed under its VRAM limit, but I was criticized on the grounds that no one, even on an *entry* level card, should have to manage VRAM usage like that and optimize settings just to get playable frames (again, even though it was entry level). I pointed out that it would not be long before the same had to be done with anything that has less VRAM than its usable core power warrants (looking at you, modern 8/10GB cards from NV).

Now upper mid-range, $500-class GPUs are having to compromise hard on settings, or simply live with literal texture pop-in, due to a lack of VRAM.

2023 is here, and the 8GB 3070 can't even keep up with a 6800 non-XT in multiple ray-traced titles anymore due to VRAM limits. It can't run quality 1440p at reasonably high refresh anymore, and at 1080p it's so bad at times that the 6800's* 1% lows match the 3070's average.

And the smoking gun in all of this? The reason the 3070 is (by comparison) now* trash?

VRAM and nvidia being unwilling to put it on the card without pushing costs up.

To be clear, both companies have cards with 8GB of VRAM, but one positions its 8GB cards as high-end or upper mid-range, both in price and in the product stack, while the other lists them as mid-range or lower and prices them accordingly.

At the end of the day, the only people who really lose out here are NV owners of 8GB (and, soon enough, 10GB) cards... while the only actual winner is NV.

Take a bow Jensen, you earned it.

* : Edits
Last edited by xSOSxHawkens; 20 Apr 2023 @ 11:04
AdahnGorion 21 Apr 2023 @ 4:54
Originally posted by C1REX-PL:
Originally posted by Darkie:

The article specifically mentions that 1440p is the issue, and that unbalanced, bottlenecked PCs that want to run ultra with an entry-level or lower mid-range card are in trouble... well, of course!!!! You are trying to gain 3% extra visual quality at the cost of 50% of your resources.

Anyone who plays at 1080p and mixes medium/high/very high settings will have no issue in 99% of all games released in the next 4 years with a current-gen entry-level card (i.e. a 4060).
Why so emotional?
If the card is able to do 1440p and the only limiting factor is VRAM, then the argument about imbalance sounds valid.

My point is that if you wanna play at 1080p, then you don't buy or look for a card that runs 1440p at the same level.
A real problem is also that benchmarks and reviews often use ultra settings, which is a faulty approach in my opinion, since ultra, from a cost-benefit point of view, is silly to use unless the game is 5 years old or you have an extravagant setup that can justify the huge resource hit for the little visual gain.

Most games run 1080p perfectly fine on higher settings, even with old 6GB GPUs.
Obviously you can't future-proof anything, but if you wanna do 1440p, then you don't buy a 4060 and build around that... you would most likely build a PC around a 3070 Ti or even a 3080, if you have the cash to burn.

For a 1080p system however... it is just a whole other world.
AmaiAmai 21 Apr 2023 @ 9:40
I don't think it will be an issue for me to change how memory is allocated if a game won't run.

I have many GPUs from both brands, but I mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.

Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.

I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field, not cleaning up the mess of game devs in drivers. Can't wait to lean back and watch the show soon.
Holografix 21 Apr 2023 @ 9:43
Originally posted by AmaiAmai:
I don't think it will be an issue for me to change how memory is allocated if a game won't run.

I have many GPUs from both brands, but I mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.

Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.

I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field, not cleaning up the mess of game devs in drivers. Can't wait to lean back and watch the show soon.
I agree with the underlined. Game devs are not optimizing their games, and that's why there's an issue with VRAM.
AdahnGorion 21 Apr 2023 @ 9:48
Originally posted by Holografix:
Originally posted by AmaiAmai:
I don't think it will be an issue for me to change how memory is allocated if a game won't run.

I have many GPUs from both brands, but I mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.

Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.

I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field, not cleaning up the mess of game devs in drivers. Can't wait to lean back and watch the show soon.
I agree with the underlined. Game devs are not optimizing their games, and that's why there's an issue with VRAM.

Most of the dire examples of games not being able to run well are either console ports or cross-platform developments.
A lot of games actually do have sufficient and acceptable optimization; it is often just not the AAA titles.
Originally posted by C1REX-PL:
The Silent Hill spec requirements are substantially higher than those for The Last of Us.
Well that's just great!

The CPU and RAM requirements aren't bad at all.

What I'm concerned by is the video card.

Not so much the minimum requirements, as I guess something on the level of a GTX 1080/RTX 3060 isn't super high anymore compared to a next-generation game (but has a minimum video card requirement been this high before?).

But the recommended requirements ask for x80/x800 GPUs from the last couple of generations, and the description says that is good for "medium settings at 60 FPS (or 30 FPS at high)" at 1080p? So you're choosing between high settings and 60 FPS even on that hardware? And just at 1080p? And needing DLSS for 4K? Yikes.
Originally posted by AmaiAmai:
I don't think it will be an issue for me to change how memory is allocated if a game won't run.

I have many GPUs from both brands, but I mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.

Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.

I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field, not cleaning up the mess of game devs in drivers. Can't wait to lean back and watch the show soon.
I'm not going to say it ISN'T bad development work, because I won't speak to something I don't know, and I have little doubt that "take the quickest and easiest route even if it's not the most efficient" is the way it sometimes goes (especially with how much crunch time many developers are under). But I do want to pose a question for consideration.

Why are some people quick to say it IS their fault?

Like with this example, VRAM needs have a history of jumping up when new console/game engine generations start coming around, and that is part of what we're seeing, no? These new games are exposing the current GPU market for its lack of VRAM, where 8 GB is the most common but almost everyone else has less (with "only" 12 GB starting to go up just now, and mostly because of the RTX 3060). I have to wonder if the poor market state in regards to VRAM amount is being exposed and making the launch situations for many of these titles worse. The new consoles have a shared 16 GB of very fast RAM. Half of that amount might not be enough VRAM for some titles, especially with higher resolution textures. It's a consideration.
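Just to put rough numbers behind that (the figures below are made up for illustration, not measured from any real game or engine), here's a quick back-of-the-envelope sketch in Python of how fast textures alone can eat into 8 GB:

MIP_OVERHEAD = 4 / 3                         # a full mip chain adds roughly 33%
BYTES_PER_TEXEL = {"BC1": 0.5, "BC7": 1.0}   # common block-compressed formats

def texture_mib(width, height, fmt="BC7"):
    """Approximate size of one texture plus its mip chain, in MiB."""
    return width * height * BYTES_PER_TEXEL[fmt] * MIP_OVERHEAD / (1024 ** 2)

# Hypothetical scene: 250 materials, each with an albedo and a normal map in
# BC7 plus a packed roughness/metalness map in BC1, all authored at 4K.
materials = 250
per_material = 2 * texture_mib(4096, 4096, "BC7") + texture_mib(4096, 4096, "BC1")
print(f"Texture pool alone: ~{materials * per_material / 1024:.1f} GiB")  # ~13.0 GiB

Even if only a fraction of that is ever resident at once, render targets, meshes, and ray tracing BVH data all want a slice of the same 8 GB, so the headroom disappears quickly.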

You also see a lot of complaining about shader compilation times or shader stutter. I saw this video and gave it a watch, because I had always wondered about it.

https://www.youtube.com/watch?v=f7yml1y3fDE

From what I gather, modern games are just so numerous/heavy on shaders that it's caught up, and this is the result? Consoles get around the issue because the shaders come pre-compiled for them, and a lot of people say this could be done on PC and that it's the failure of the developer if they don't, but apparently this isn't realistic on PCs due to them not being fixed platforms. So many different GPUs and drivers make it impossible (?), so it looks like lengthy sessions of pre-compiling them before game start, or stutters during play, are what we will have to deal with.

Higher frame rates will likely intensify this too.
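A toy way to picture the "not a fixed platform" problem (illustrative Python, not any engine's or driver's actual code): a compiled pipeline is only valid for one exact GPU + driver + shader combination, so the cache key looks something like this:

import hashlib

def pipeline_cache_key(gpu_model: str, driver_version: str, shader_source: str) -> str:
    """A compiled pipeline is only reusable when all three inputs match exactly."""
    return hashlib.sha256(f"{gpu_model}|{driver_version}|{shader_source}".encode()).hexdigest()

def compile_shader_stub(source: str) -> bytes:
    return source.encode()  # stand-in for the expensive driver-side compile

cache: dict[str, bytes] = {}

def get_pipeline(gpu_model: str, driver_version: str, shader_source: str) -> bytes:
    key = pipeline_cache_key(gpu_model, driver_version, shader_source)
    if key not in cache:
        # A console has exactly one (gpu, driver) pair, so this step can be done
        # once by the studio and shipped with the game. On PC it has to happen on
        # the player's machine: either up front (the long "compiling shaders"
        # screen) or mid-game (the stutter people complain about).
        cache[key] = compile_shader_stub(shader_source)
    return cache[key]

Every new driver version or different GPU changes the key, which is why the work can't be shipped pre-done the way it is on consoles.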

I've seen a lot of people blaming this on bad developers too.

But I have to wonder how much of it is their fault, and how much of it isn't.
Last edited by Illusion of Progress; 21 Apr 2023 @ 11:10
Tomi Montana 21 Apr 2023 @ 11:28
Originally posted by Illusion of Progress:
Originally posted by smallcat:
Guys, just read about Immortals of Aveum, min sys req

1080p, 60 FPS, low to med settings:

OS: Windows 10 (64bit);
CPU: Intel Core i7-9700 or AMD Ryzen 7 3700X;
GPU: NVIDIA GeForce RTX 2080 Super (8GB) or AMD Radeon RX 5800 XT (8GB);
16GB RAM;
DirectX: 12;
Storage: 110GB (SSD recommended).

What?
Glad I don't like it much.
Look at this, not much better.

https://store.steampowered.com/app/2124490/SILENT_HILL_2/

Minimum:
Requires a 64-bit processor and operating system
OS: Windows 10 x64
Processor: Intel Core i5-8400 | AMD Ryzen 3 3300X
Memory: 12 GB RAM
Graphics: AMD Radeon™ RX 5700 / NVIDIA® GeForce® GTX 1080
DirectX: Version 12
Storage: 50 GB available space
Sound Card: Windows Compatible Audio Device.
Additional Notes: Playing on minimum requirements should enable to play on
Low/Medium quality settings in FullHD (1080p) in stable 30 FPS.

Recommended:
Requires a 64-bit processor and operating system
OS: Windows 11 x64
Processor: Intel Core i7-8700K | AMD Ryzen 5 3600X
Memory: 16 GB RAM
Graphics: NVIDIA® GeForce® 2080RTX or AMD Radeon™ 6800XT
DirectX: Version 12
Storage: 50 GB available space
Sound Card: Windows Compatible Audio Device.
Additional Notes: Playing on recommended requirements should enable to play on
Medium quality settings in 60 FPS or High quality settings in 30 FPS, in FullHD
(or 4k using DLSS or similar technology).
And I do want this one, so it matters to me.
People will find ways to play it with specifications that are well below the minimum requirements. There are people that got AC Odyssey running on a GT 710, almost hitting 30 FPS (albeit at really low resolution and settings). That game had a GTX 660 as a minimum, and the GT 710 is like 1/5th of that card in terms of performance.
I imagine that a GTX 1060 6GB will be fine at 720p.
Liquid Inc 21 Apr 2023 @ 12:03
Minimum req of a 1080? I know my card's old, but... I haven't been that "low" on the hardware side for many years...

It can run FH5 on ultra at 130+ FPS, but this game is low/mid at 30 FPS at the same res?

What did they optimise it with? A brick?
C1REX 21 Apr 2023 @ 12:22
Originally posted by Illusion of Progress:
Well that's just great!
The CPU and RAM requirements aren't bad at all.
The CPU requirement is quite high, actually. Bear in mind it's a console port showing you specs for 30 fps, not 60 fps. Just double the requirements if you want a playable framerate. The recommended spec is for 1080p 60 fps at medium settings; 1080p high gives 30 fps on a 2080.
We are in a transition period where devs are dropping support for the potato PS4 and games won't scale down very well any more.
This topic is a strange one to me. I don't know why everyone else needs so much VRAM. I'm able to run everything that I own with a 3050, quite easily.

It only has 8GB of VRAM 👀 how come you guys need so much!? Even my 1660 Super has no trouble, and that only has 6GB.

Oof!
emoticorpse 21 Apr 2023 @ 12:42
Originally posted by Illusion of Progress:
Originally posted by AmaiAmai:
I don't think it will be an issue for me to change how memory is allocated if a game won't run.

I have many GPUs from both brands, but I mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.

Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.

I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field, not cleaning up the mess of game devs in drivers. Can't wait to lean back and watch the show soon.
I'm not going to say it ISN'T bad development work, because I won't speak to something I don't know, and I have little doubt that "take the quickest and easiest route even if it's not the most efficient" is the way it sometimes goes (especially with how much crunch time many developers are under). But I do want to pose a question for consideration.

Why are some people quick to say it IS their fault?

Like with this example, VRAM needs have a history of jumping up when new console/game engine generations start coming around, and that is part of what we're seeing, no? These new games are exposing the current GPU market for its lack of VRAM, where 8 GB is the most common but almost everyone else has less (with "only" 12 GB starting to go up just now, and mostly because of the RTX 3060). I have to wonder if the poor market state in regards to VRAM amount is being exposed and making the launch situations for many of these titles worse. The new consoles have a shared 16 GB of very fast RAM. Half of that amount might not be enough VRAM for some titles, especially with higher resolution textures. It's a consideration.

You also see a lot of complaining about shader compilation times or shader stutter. I saw this video and gave it a watch, because I had always wondered about it.

https://www.youtube.com/watch?v=f7yml1y3fDE

From what I gather, modern games are just so numerous/heavy on shaders that it's caught up, and this is the result? Consoles get around the issue because the shaders come pre-compiled for them, and a lot of people say this could be done on PC and that it's the failure of the developer if they don't, but apparently this isn't realistic on PCs due to them not being fixed platforms. So many different GPUs and drivers make it impossible (?), so it looks like lengthy sessions of pre-compiling them before game start, or stutters during play, are what we will have to deal with.

Higher frame rates will likely intensify this too.

I've seen a lot of people blaming this on bad developers too.

But I have to wonder how much of it is their fault, and how much of it isn't.

I would blame developers if I were to blame anyone (at least give them the majority of the blame as opposed to blaming Nvidia or even gamers for buying whatever is being put out to begin with).

My simple reason is this: if a game like BioShock Infinite could be crafted to run great on old consoles and old PCs (it was released 10 years ago), then logically I don't see why it would be so much trouble to make a game that runs great now.

From that game to now, everything has gotten worse. Games are much larger, much more performance-taxing, and, in my opinion, have crappier storylines. At least for AAA games. And for what reason? Negligibly better lighting, and being able to see more freckles and texture detail, at a cost of 150 fps?

I can't see how we can't blame developers. They're the ones who made the game?

Now, as far as breaking down "developers" even further, I'm not sure how it works, but maybe the people actually coding aren't to blame, since they're probably just taking orders from whoever makes the decisions on how the game will be made. But I do think somewhere in that group is where most of the blame should go.

Unless it's possibly looked at like "it's just not viable or feasible to code a game efficiently anymore" due to all sorts of factors in the game development environment. Maybe if they did code it properly, it would take too long, cost too much, or whatever, and so it wouldn't get made to begin with.

But I would still end up coming back to BioShock Infinite. If they made that game, I'm not sure why they wouldn't be able to make it again. IIRC, it's officially been confirmed that they are making a BioShock 4 as well.
ZeekAncient 21 Apr 2023 @ 14:49
Originally posted by Illusion of Progress:
Originally posted by ZeekAncient:
Unfortunately, during the GPU shortage, when I was in the process of building a new PC, Vram amount was never really a concern. It felt at the time that hardly any games were really exceeding 8GB that much, and even at 4K, it felt that 8GB wasn't going to be a limiting factor. In fact, it almost felt like AMD's GPUs had way more than they would ever need, and it was just going to go to waste, lol. Sure, lmao.
I seriously wonder how much less of an issue some of these modern games would be if the amounts of VRAM on the market weren't so low. I think these games are just exposing that factor. Obviously there are other issues with some of these games (though PC gamers might want to get used to dealing with either shader compilation times at startup or stutter during gameplay, because that's not necessarily one of them, and is instead a result of PCs not being fixed platforms like the consoles). So I'm not saying the games didn't have their issues, but I do feel VRAM is just one of them, making things worse.

The recent games we've been seeing have only been the first of these. Obviously the RTX 3070 series is a bit older now, but it has only JUST been replaced, and there were instances of an RTX 3060 outperforming not just an RTX 3070 but an RTX 3080 (!) due to VRAM. For that to ever happen at all, before a card is even formally replaced, is shocking. It's not going to get better moving forward, unfortunately. My heart really does go out to RTX 3070/Ti buyers (and even 3060 Ti and 3080 buyers, to a point). Those products warranted more and the buyers deserved more, even if they chose to buy them as they were.

It's one thing to say a card not maxing everything out five years later is expected, but to have it severely impacted at times while it's still relevant in the current selling market, and has only recently been formally replaced (if you want to call it that, given the severe price hike the RTX 4070 Ti represented)? That's a different story altogether.

I really would have liked to see a 3070 Ti 16GB. I know Nvidia was thinking of releasing one at the time that the 3080 12GB got released, and I was almost certain that it was going to happen but it never did.

Now, I wouldn't have replaced my 3070 Ti 8GB with a 3070 Ti 16GB, but I would have liked to see how it compared to the 8GB card. It is clear from what we are seeing in these games that more Vram would have prevented the issues the 8GB card is having, and the 16GB card would have surpassed the 6800 by a great deal in recent games and given the 6800 XT a run for its money. There would have been a huge discrepancy in performance between the 8GB card and the 16GB one. Something not seen before with cards having the exact same architecture except for the Vram amount.

I am sure Nvidia would not have liked this. Perhaps that is why they decided not to release it after all, knowing the future that the 3070 and 3070 Ti faced. And honestly, I have never seen a situation quite like this. I mean sure, a GPU 3 to 5 years after release is not going to be on the same performance tier it was on at release, but even just a couple of years after release it should still be a great performer and a viable card. I'm not saying that the 3070 and 3070 Ti are now NOT viable cards, but the type of problems these cards are facing are not due to their rasterized performance. They simply run out of Vram, and games begin to experience stuttering, texture pop-in and pop-out issues, and crashes, not to mention reduced FPS.

I have had a tendency over the last decade or so to start looking for my next GPU upgrade a couple of years after I bought the last one. But that has always been due to wanting straight-up more performance, never because I was thinking that my card was limited in Vram. Performance-wise, the 3070 Ti is still a great card. And the problems we are seeing are from a crop of very recent games that have been criticized for being poor console ports. But the problems occurring do still raise an alarm. And I am seeing it myself in games that require more Vram. Either I am simply not allowed to max out a couple of settings because my 3070 Ti doesn't have enough Vram, or the game will encounter issues if I do max out those settings.

Games will only use more and more Vram. So I am sure there will be some games released that I really want to play but will think twice about because of the GPU I have. So even before the 3070 Ti's overall performance comes into question, I am held back by the card's 8GB of Vram.

Would have loved to see a 16GB version of this card. Heck, how about a 20GB 3080!
C1REX 21 Apr 2023 @ 15:07
Originally posted by W O K E I S M:
This topic is a strange one to me. I don't know why everyone else needs so much VRAM. I'm able to run everything that I own with a 3050, quite easily.

It only has 8GB of VRAM 👀 how come you guys need so much!? Even my 1660 Super has no trouble, and that only has 6GB.

Oof!
Low VRAM can make life difficult for developers. While 8GB VRAM may feel like enough, the reality is that developers often have limited options when it comes to optimizing and squeezing games into less memory. Unfortunately, slimy NVIDIA has conditioned the PC market to believe that more memory is unnecessary or prohibitively expensive, when in reality VRAM is one of the most cost-effective ways to improve game performance and visuals.
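To give that "limited options" point a concrete (if very simplified) shape: pretty much the main lever left when VRAM is tight is to drop texture mip levels until things fit, which is exactly the blurry, popping textures the TechSpot article describes. A made-up sketch in Python, not any engine's real streaming code:

def fit_to_budget(texture_sizes_mib, budget_mib):
    """Drop top mip levels across the whole set until it fits the VRAM budget."""
    dropped = 0
    resident = sum(texture_sizes_mib)
    while resident > budget_mib and dropped < 4:
        dropped += 1
        # dropping the top mip of a full chain cuts a texture's cost to ~1/4
        resident = sum(size / (4 ** dropped) for size in texture_sizes_mib)
    return dropped, resident

# Hypothetical: ~10 GiB of 4K textures squeezed into ~6 GiB of free VRAM.
sizes = [21.3] * 480                       # ~480 4K BC7 textures with mips
drops, resident = fit_to_budget(sizes, budget_mib=6 * 1024)
print(drops, f"~{resident / 1024:.1f} GiB resident")   # 1 mip dropped, ~2.5 GiB

More VRAM simply pushes that trade-off further out, which is the whole argument of this thread.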
Originally posted by Tomi Montana:
People will find ways to play it with specifications that are well below the minimum requirements. There are people that got AC Odyssey running on a GT 710, almost hitting 30 FPS (albeit at really low resolution and settings). That game had a GTX 660 as a minimum, and the GT 710 is like 1/5th of that card in terms of performance.
I imagine that a GTX 1060 6GB will be fine at 720p.
Wait you knew what I had? Haha.

But I'm actually planning on upgrading from it soon anyway, so I'm not so much worried about not being able to play it due to being below minimum. I just find it thought-provoking and wonder if it might be yet another sign of the sorts of asks that games will have going forward.
Originally posted by emoticorpse:
I would blame developers if I were to blame anyone (at least give them the majority of the blame as opposed to blaming Nvidia or even gamers for buying whatever is being put out to begin with).
I want to clarify that I'm merely presenting a question for curiosity's sake (or food for thought, as some say), rather than making a declaration of where the blame does or doesn't lie.

I'm just asking those who are eager to blame developers: do you have extensive knowledge of those sorts of things to know it is their fault, or are the complaints just coming from a place of "I think it should be better"? Because I think the latter happens a lot. That doesn't mean I'm saying developers never make mistakes, either. They're not mutually exclusive things.
Originally posted by emoticorpse:
I can't see how we can't blame developers. They're the ones who made the game?
That's too simplistic of a view.

The developers don't control everything about the platforms (plural) they're making the games for. They don't control their time and schedule limits.

Do you actually KNOW and not just THINK the issues could be fixed?

Do these solutions make sense in our capitalistic system? If not, at what point do you realize it's perhaps the system, and not the developers, that is more at fault?
Last edited by Illusion of Progress; 21 Apr 2023 @ 15:09
antoniobennett72 21 Apr 2023 @ 15:09
Originally posted by ZeekAncient:
That article was a pretty good read. I like it when Techspot revisits comparisons of GPUs a few years after release, so you can see how the GPUs aged. AMD GPUs tend to age better than Nvidia GPUs.

That never bothered me too much in the past, even though I have bought Nvidia, because I tend to upgrade my GPU every couple of years anyway.

But I am one who purchased a 3070 Ti not long after it was released in 2021, during the GPU shortage, and I paid a little over $800 for it. An EVGA FTW3 Ultra, and I have to say that I have never felt such buyer's remorse with a GPU as I have recently with the 3070 Ti. And it all has to do with the Vram amount.

Thankfully, I am not yet playing most of the games that have had issues with the 3070 Ti, but I am currently playing some where it does appear that the 3070 Ti's 8GB limit is becoming a problem. And I feel it is only going to get worse. Sure, DLSS alleviates a lot of this, but it is still a bummer to have to tweak a texture setting here or there to maintain stability when it feels that the 3070 Ti would otherwise cruise through these games if it had 16GB of Vram. Both the 3070 and 3070 Ti would most likely outperform the 6800 in every scenario if they had more Vram.

Unfortunately, during the GPU shortage, when I was in the process of building a new PC, Vram amount was never really a concern. It felt at the time that hardly any games were really exceeding 8GB that much, and even at 4K, it felt that 8GB wasn't going to be a limiting factor. In fact, it almost felt like AMD's GPUs had way more than they would ever need, and it was just going to go to waste, lol. Sure, lmao.

I mean, I was looking to get a 3080. That is what I really had my eyes on. But with the way prices were then, and still are, and with the shortage, getting one was almost impossible. So I ended up settling for a 3070 Ti when I was able to get one at, or slightly above, MSRP. But even if I had gotten a 3080, it still only has 10GB. I think the 3080 12GB would have been a better buy.

Don't get me wrong, the 3070 Ti has performed well and has allowed me to have a great experience, but recently seeing the direction that games are taking with Vram, and seeing first-hand how only having 8GB has turned the 3070 and 3070 Ti from upper mid-range GPUs that should still perform admirably into entry-level GPUs, is pretty alarming and eye-opening.

I am thinking that my 3070 Ti will be lucky to make it to the 2-year mark before I upgrade again. This time I will take Vram a lot more seriously when making my decision. 16GB is probably the lowest I will go. Nvidia's 40 series GPUs that only come with 12GB will probably be facing the same situation that the 3070 and 3070 Ti are facing now in a couple of years.

Similar for me. I was shooting for the 3080 but had better luck on cost and was able to score a 3060 Ti at retail (can't say MSRP because it was an AIB card for like 79 above or so).

Gotta have that 16GB. I am not trying to buy a 500 or 600 or 700 dollar card every generation.