AdahnGorion 16 Aug 2020 @ 2:56
The big GPU/Tech Rumours thread - Nvidia/AMD/Others - Is it worth investing now? What's the latest news? Feel free to debate
This thread has evolved a lot and has now become a general tech rumours thread - it is primarily focused on GPUs and it should stay that way. I will update the thread whenever we have new information about upcoming GPU releases. Feel free to debate some of the more catchy tech rumours as well.

I have reserved this space in the thread starter to keep current debates and new information up to date.


2024/2025

Here are the official announcements
https://www.youtube.com/watch?v=k82RwXqZHY8&t=1496s
https://www.nvidia.com/en-us/geforce/graphics-cards/50-series/
------

Various information about the new 5000 series:
https://www.techradar.com/computing/gpu/nvidia-unveils-new-geforce-rtx-5090-rtx-5080-rtx-5070-ti-and-rtx-5070-graphics-cards-at-ces-2025


We have started to see more information about the new Nvidia Blackwell GPUs (5000 series).

Here is a leak, not very interesting tbh.
https://www.pcgamesn.com/nvidia/geforce-rtx-5000-specs-boring-leak
https://www.pcgamer.com/nvidia-blackwell/



2023

We debate the 4000 series cards that have released and the ones still to come, along with the rumours about the 5000 series, which is claimed to be the biggest performance leap in GPU history (like every time, of course).

https://wccftech.com/nvidia-preps-mass-production-two-ad104-ada-gpus-possibly-geforce-rtx-4070-rtx-4060-ti/amp/


2021

https://www.tomshardware.com/news/amd-increase-efficiency-of-chips-thirtyfold-by-2025
https://www.tomshardware.com/news/asus-shows-off-geforce-rtx-with-noctua-cooler
https://winfuture.de/news,125475.html



So I figured we should have a thread about the new RTX 3xxx series, which is soon to release.
What is your feeling about it? Do you think some of the rumours are true? Are you upgrading?


The flagship seems to be a 3090 (I like them going back to the x90 tier)
https://videocardz.com/newz/micron-confirms-nvidia-geforce-rtx-3090-gets-21gbps-gddr6x-memory

Personally I will not upgrade yet, and even if I had to (I do not), I would still wait for prices to go down; early adoption is expensive and often not worth it. But I am interested in which rumours might turn out to be true.

Some of the earlier rumours have suggested even higher memory amounts (16 GB and up), and performance claims have ranged from 10-50% in effective speed and up to 60% improvement in ray tracing.


I think it will be 10-15% at most in effective speed on each tier, but I do think we will see significant gains in ray tracing.


Other sources (remember to take sources and numbers with a grain of salt)
https://www.techradar.com/news/nvidia-rtx-3000-launch-details-leak-and-amd-could-be-in-big-trouble
https://www.tweaktown.com/news/69629/nvidia-ampere-gpu-50-faster-turing-half-power/index.html
https://www.gpumag.com/nvidia-geforce-rtx-3000-series/
https://www.digitaltrends.com/computing/nvidia-rtx-3080-rumors-news-release-date-performance/
https://www.rockpapershotgun.com/2020/08/10/nvidia-ampere-rtx-3000-everything-we-know-so-far/
https://videocardz.com/newz/nvidia-geforce-rtx-3090-graphics-card-pictured
https://videocardz.com/newz/nvidia-allegedly-cancels-geforce-rtx-3080-20gb-and-rtx-3070-16gb
Pricing
https://www.techpowerup.com/271081/rumor-geforce-rtx-3090-pricing-to-arrive-around-the-usd-2-000-mark
https://www.neowin.net/news/alleged-rtx-3080-ti-rtx-3070-ti-launch-dates-have-leaked/

AMD - Big Navi

It will be interesting to see what AMD brings to the table. Personally I think they will go for the mid tier market, and if they succeed there, we might see battles for the top tier next generation.

Rumours

https://www.eteknix.com/amd-navi-21-xt-graphics-card-specs-leak/
https://www.kitguru.net/components/graphic-cards/joao-silva/amd-navi-21-xt-xl-leak-suggests-an-over-2-0ghz-boost-clock/
https://www.guru3d.com/news-story/rumor-amds-navi-21-gpu-has-255-watts-tgp-and-boost-up-to-2-4-ghz.html
https://videocardz.com/newz/amd-navi-21-xt-to-feature-2-3-2-4-ghz-game-clock-250w-tgp-and-16-gb-gddr6-memory
https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/
https://videocardz.com/newz/amd-radeon-rx-6800xt-alleged-3dmark-scores-hit-the-web
https://wccftech.com/intel-first-high-end-xe-hpg-dg2-gaming-graphics-pictured-rumored-specs-performance-rtx-3080-performance/amp/


Interesting stats about Big Navi

It seems like all of them will ship with at least 16 GB; if that is true, it is surprising, tbh.
Rumours talk about a 2.4 GHz clock on the Navi 21 XT. I think it will be pretty exciting to watch. We will know more in a week, once it gets revealed.


It seems that Nvidia is releasing a new series of cards for miners and is changing the 3060, which will release on 25 February. Links below.
https://www.pcgamer.com/nvidia-cmp-mining-cards-rtx-3060-half-hash-rate/
https://hothardware.com/news/nvidia-geforce-rtx-3060-availability-crypto-mining-cmp-gpu
https://www.nvidia.com/en-us/cmp/

Various Rumours about hardware

https://videocardz.com/newz/bitmain-antminer-e9-ethereum-asic-is-as-powerful-as-32-geforce-rtx-3080-graphics-cards
https://wccftech.com/mainstream-ddr5-memory-modules-pictured-rolling-out-for-mass-production/amp/



Do you want to show off your new build, or even just your old one? Then you can do so at
https://steamcommunity.com/discussions/forum/11/5413843407449992305/

That is our benchmark thread. You are still free to debate hardware related stuff and new builds here ofc.
Last edited by AdahnGorion; 8 Jan @ 0:36
Originally posted by AdahnGorion:
The argument I was trying to make is that it makes little sense to compare an x50 with 6 and 8 GB VRAM on higher end systems (good CPU, MB, RAM, etc.) and then judge it all from benchmarks and games... in reality, the people that use this tier will not see a difference in most of its usage, hence why I think it's a silly bash of the 3050.
Yes, I do agree, but on the other hand, if it's going to lose 25% of the bus width and VRAM and also lose performance, calling it the same chip deserves criticism.

I'm not sure if the criticism was being given to it only because it's a low end chip. It was because it was carrying the x50 branding while barely being faster than an x50 two generations ago.
Originally posted by AdahnGorion:
The leap from an x50 budget card to an x60 entry level card is huge...
It is, yes. The biggest gaps in the lineup tend to be between x50 and x60 (and, when it exists, x30 to x50). Those two gaps are each a doubling of performance. Nowhere else is there a doubling of performance.

x60 used to be a bit over half the performance of the x80, so even across two tiers there isn't a doubling. And then x50 is half of the x60. And then x30 is again half of the x50.

GTX 1080 - 100%

GTX 1060 - Almost two thirds the performance of the GTX 1080 (while costing less than half; it's easy to see why most used to choose the x60 tier.)

GTX 1050 - Half of GTX 1060

GT 1030 - Half of GTX 1050
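
To make those halving steps concrete, here is a minimal Python sketch using the rough ratios stated above (the numbers are only this post's approximations, not benchmark data):

[code]
# Rough relative-performance chain from the ratios stated above
# (illustrative approximations from the post, not benchmark data).
relative = {"GTX 1080": 1.00}
relative["GTX 1060"] = relative["GTX 1080"] * 0.65   # "almost two thirds" of the 1080
relative["GTX 1050"] = relative["GTX 1060"] * 0.50   # half of the 1060
relative["GT 1030"] = relative["GTX 1050"] * 0.50    # half of the 1050

for card, perf in relative.items():
    print(f"{card}: ~{perf:.0%} of a GTX 1080")
[/code]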

x60 isn't (well, wasn't) entry level though. Calling something "budget" makes little sense to me because someone can have a "budget" but still choose an x70. What is "budget" is relative.

nVidia seems to call x60 "mainstream" and x70 "performance mainstream" and those make more sense to me. x80 is "enthusiast". I've heard people refer to x50 as "entry level gaming tier". This sort of predates x90, but the x90 existing isn't going to change that most people will probably be in the market for x50 and x60. I do expect the RTX 4070 to make the x70 tier a bit more popular this specific generation though (similar to how the GTX 970 was) since the x60 is a bit disappointing this generation. That's because all cards are cut down, but it "catches up" by the time you get down to the x60 tier. The x70 (Ti in particular) is what an x60 used to be, so people find it acceptable enough since the x60 always gave good enough performance before.
APUs will probably never catch up to the low end discrete graphics market unless manufacturers make literally zero improvement in the low end for the next few generations or cut it off altogether, neither of which are likely scenarios to happen.

APUs relying on system memory is also a key bottleneck, their memory performance is always going to be terrible and will always make a difference even if the core performance were to be the same; the GT 1030 was still beating some decent iGPs in the past because of memory bandwidth constraints.
https://www.tomshardware.com/pc-components/gpus/nvidias-new-trimmed-down-entry-level-gaming-gpu-fails-to-outperform-rivals

The RTX 3050 6GB is so cut down from the original 8GB model that it's slower than a GTX 1660 Ti. Literally the only use case this card has is in systems with garbage proprietary design power supplies that can't carry the load of a card that draws 100+ watts.
Originally posted by r.linder:
APUs will probably never catch up to the low end discrete graphics market unless manufacturers make literally zero improvement in the low end for the next few generations or cut it off altogether, neither of which are likely scenarios to happen.

APUs relying on system memory is also a key bottleneck, their memory performance is always going to be terrible and will always make a difference even if the core performance were to be the same; the GT 1030 was still beating some decent iGPs in the past because of memory bandwidth constraints.
There seems to be some toxic redefining of "low end discrete graphics market", which made me surprised when you bothered to mention the GT 1030, as swaths of today's gamers would not even have acknowledged the existence of anything below the GTX 1060.

With NVIDIA not releasing any 40 series cards at the 30 bracket, there essentially isn't a low end discrete graphics market for team green. That is not really where an RTX 3050 6GB is marketed to sit, and I cannot find a "Low Profile" variant.

AMD's 780M APU seems to offer a compelling alternative to an RX 6400 discrete GPU, so as far as AMD is concerned, APUs have already caught up with the low end discrete graphics market. Those with older PCs running Windows 10, could install an RX 6400 Low Profile GPU for their spreadsheeting and 4k video playback needs.

The 780M seems to completely destroy the GT 1030 and the Iris Xe seems to have surpassed it, which is unsurprising. The GT 1030 has aged out of relevance. 1030 -> 2020 -> 3010 -> iGP/APU.
Originally posted by CJM:
Originally posted by r.linder:
APUs will probably never catch up to the low end discrete graphics market unless manufacturers make literally zero improvement in the low end for the next few generations or cut it off altogether, neither of which are likely scenarios to happen.

APUs relying on system memory is also a key bottleneck, their memory performance is always going to be terrible and will always make a difference even if the core performance were to be the same; the GT 1030 was still beating some decent iGPs in the past because of memory bandwidth constraints.
There seems to be some toxic redefining of "low end discrete graphics market", which made me surprised when you bothered to mention the GT 1030, as swaths of today's gamers would not even have acknowledged the existence of anything below the GTX 1060.

With NVIDIA not releasing any 40 series cards at the 30 bracket, there essentially isn't a low end discrete graphics market for team green. That is not really where an RTX 3050 6GB is marketed to sit, and I cannot find a "Low Profile" variant.

AMD's 780M APU seems to offer a compelling alternative to an RX 6400 discrete GPU, so as far as AMD is concerned, APUs have already caught up with the low end discrete graphics market. Those with older PCs running Windows 10, could install an RX 6400 Low Profile GPU for their spreadsheeting and 4k video playback needs.

The 780M seems to completely destroy the GT 1030 and the Iris Xe seems to have surpassed it, which is unsurprising. The GT 1030 has aged out of relevance. 1030 -> 2020 -> 3010 -> iGP/APU.
I only mentioned the GT 1030 because older integrated graphics had similar performance but fell behind due to memory limitations from using DDR instead of GDDR. You would've found that out if you read what I said.

GT series existed as a replacement for integrated graphics and it did that job for years, iGPs will never catch up with actual discrete graphics that performs well for gaming, it'll always lag behind. Even though it's getting into competing with the 1050 series, that's not even special anymore, and below entry level.
Last edited by r.linder; 17 Feb 2024 @ 18:11
Originally posted by r.linder:
GT series existed as a replacement for integrated graphics and it did that job for years, iGPs will never catch up with actual discrete graphics that performs well for gaming, it'll always lag behind. Even though it's getting into competing with the 1050 series, that's not even special anymore, and below entry level.

I don't know about the GT 1050 being below entry level. There is the Steam Deck to consider.

Admittedly, it seems the GT 1050 only has 2GB of VRAM, which is paltry by today's standards. The rasterization performance, however, is still good enough to at least qualify for entry level. Also, it qualifies as a GTX and not merely a GT class potato GPU.

1050 -> 2040 -> 3030 -> 4020

The GTX 1050 and GTX 1650 are entry level. The GT 650 is a potato.

The GT series was at its prime back between the GeForce 8000 series, and the 700 series. With the GT 750 Ti being a very solid mid-range GPU. I purchased a GT 240 back when it was considered mid-range, as a discrete GPU over the Intel Graphics Media Accelerator (GMA) 3000 series iGPU on my Socket 775, Q33, motherboard.
Originally posted by CJM:
Originally posted by r.linder:
GT series existed as a replacement for integrated graphics and it did that job for years, iGPs will never catch up with actual discrete graphics that performs well for gaming, it'll always lag behind. Even though it's getting into competing with the 1050 series, that's not even special anymore, and below entry level.

I don't know about the GT 1050 being below entry level. There is the Steam Deck to consider.

Admittedly, it seems the GT 1050 only has 2GB of VRAM, which is paltry by today's standards. The rasterization performance, however, is still good enough to at least qualify for entry level. Also, it qualifies as a GTX and not merely a GT class potato GPU.

1050 -> 2040 -> 3030 -> 4020

The GTX 1050 and GTX 1650 are entry level. The GT 650 is a potato.

The GT series was at its prime back between the GeForce 8000 series, and the 700 series. With the GT 750 Ti being a very solid mid-range GPU. I purchased a GT 240 back when it was considered mid-range, as a discrete GPU over the Intel Graphics Media Accelerator (GMA) 3000 series iGPU on my Socket 775, Q33, motherboard.
The hardware in a very small form factor console isn't a fair comparison to full size desktop components, there are far greater limitations on what can be used for that whereas desktop systems are completely different.

Compared to newer low end GPUs like the 2060, 3060, etc. the 1050 is nothing. It struggles to reach 60 FPS in a lot of titles nowadays which shouldn't be acceptable, people shouldn't have to convince themselves that less is fine when 60 is so easily obtainable.

The GTX 1050 was entry level years ago; it stopped being that ages ago. It's bordering on obsolete at this point. I would say that anyone that would continue trying to argue against that in 2024 is not very up to date on GPU hierarchy.
Last edited by r.linder; 17 Feb 2024 @ 19:17
Originally posted by r.linder:
The hardware in a very small form factor console isn't a fair comparison to full size desktop components, there are far greater limitations on what can be used for that whereas desktop systems are completely different.

Compared to newer low end GPUs like the 2060, 3060, etc. the 1050 is nothing. It struggles to reach 60 FPS in a lot of titles nowadays which shouldn't be acceptable, people shouldn't have to convince themselves that less is fine when 60 is so easily obtainable.

The GTX 1050 was entry level years ago; it stopped being that ages ago. It's bordering on obsolete at this point. I would say that anyone that would continue trying to argue against that in 2024 is not very up to date on GPU hierarchy.

Entry Level has always been the domain of 30 FPS gaming. Which the GTX 1050's rasterization performance is well suited for. For a time, Entry Level was 30 FPS at 480p. Eventually, it moved to 30 FPS at 720p.

What makes it obsolete is the 2GB of VRAM. Modern DDR5 might be fast enough to support 30 FPS, and so the VRAM performance might not even matter anymore on an entry level GPU. Though, the GPU is probably not paired with a DDR5 machine.

60 FPS is not so easily obtainable these days either. More and more games are CPU bottlenecked, not being able to utilize more than four cores effectively. Jedi: Survivor is notorious for not supporting a locked 60 FPS.

A GT 650 is nothing as I said, that is the definition of a potato. A GTX 1050 is a smidge more than nothing. It was mid-range back in 2016, which means that between 2019 and 2022 it dropped to Entry Level.
Originally posted by CJM:
Originally posted by r.linder:
The hardware in a very small form factor console isn't a fair comparison to full size desktop components, there are far greater limitations on what can be used for that whereas desktop systems are completely different.

Compared to newer low end GPUs like the 2060, 3060, etc. the 1050 is nothing. It struggles to reach 60 FPS in a lot of titles nowadays which shouldn't be acceptable, people shouldn't have to convince themselves that less is fine when 60 is so easily obtainable.

The GTX 1050 was entry level years ago; it stopped being that ages ago. It's bordering on obsolete at this point. I would say that anyone that would continue trying to argue against that in 2024 is not very up to date on GPU hierarchy.

Entry Level has always been the domain of 30 FPS gaming. Which the GTX 1050's rasterization performance is well suited for. For a time, Entry Level was 30 FPS at 480p. Eventually, it moved to 30 FPS at 720p.

What makes it obsolete is the 2GB of VRAM. Modern DDR5 might be fast enough to support 30 FPS, and so the VRAM performance might not even matter anymore on an entry level GPU. Though, the GPU is probably not paired with a DDR5 machine.

60 FPS is not so easily obtainable these days either. More and more games are CPU bottlenecked, not being able to utilize more than four cores effectively. Jedi: Survivor is notorious for not supporting a locked 60 FPS.

A GT 650 is nothing as I said, that is the definition of a potato. A GTX 1050 is a smidge more than nothing. It was mid-range back in 2016, which means that between 2019 and 2022 it dropped to Entry Level.
30 FPS is a standard that died a long time ago, it's archaic and just silly to aim for that when there are still old GPUs like the RX 580 and GTX 1060-6G that can easily do 60, and can be found for less than $100. Anyone still aiming for just 30 FPS and thinking that's fine is in denial.

x50 tier is never mid range, it was always entry level for the generation, but they don't maintain their status with time because the hierarchy always changes; even a 1080 Ti is low end and the 2080 Ti is mid-range at best. 3090 is arguably mid-range, maybe upper mid range now. GTX 16 series is often referred to as entry level, even the 1660 Ti, because it's a 5 year old low end card.

x50 = entry level
x60 = low end
x70 = mid range
x80 = high end
x80 Ti/x90 = flagship

That is how NVIDIA's stack has been for a long time.
Last edited by r.linder; 17 Feb 2024 @ 19:59
Originally posted by r.linder:
30 FPS is a standard that died a long time ago, it's archaic and just silly to aim for that when there are still old GPUs like the RX 580 and GTX 1060-6G that can easily do 60, and can be found for less than $100. Anyone still aiming for just 30 FPS and thinking that's fine is in denial.

x50 tier is never mid range, it was always entry level for the generation, but they don't maintain their status with time because the hierarchy always changes; even a 1080 Ti is low end and the 2080 Ti is mid-range at best. 3090 is arguably mid-range, maybe upper mid range now. GTX 16 series is often referred to as entry level, even the 1660 Ti, because it's a 5 year old low end card.

x50 = entry level
x60 = low end
x70 = mid range
x80 = high end
x80 Ti/x90 = flagship

That is how NVIDIA's stack has been for a long time.
Check Wikipedia. The x50 tier is always mid-range, that is what x50 means. On a scale of 1-9, it is in the middle.

The Flagship GPU for NVIDIA is usually the x80, but occasionally it is the x70. Their x90 GPU is almost never the flagship.

The x60 is generally the mid-range standard.

---

I don't think you understand what Entry Level is. Entry Level is the Core i7-3770, 12GB DDR3, GTX 970 4GB I just upgraded from. It is the $50-$100 range for children, and current gen iGPUs and APUs.

It is what you upgrade from, to get a mid-range computer. If you are on the high-end you upgrade from your old mid-range PC, or you ride your mid-range PC out another console generation until it becomes Entry Level.
Originally posted by CJM:
Originally posted by r.linder:
30 FPS is a standard that died a long time ago, it's archaic and just silly to aim for that when there are still old GPUs like the RX 580 and GTX 1060-6G that can easily do 60, and can be found for less than $100. Anyone still aiming for just 30 FPS and thinking that's fine is in denial.

x50 tier is never mid range, it was always entry level for the generation, but they don't maintain their status with time because the hierarchy always changes; even a 1080 Ti is low end and the 2080 Ti is mid-range at best. 3090 is arguably mid-range, maybe upper mid range now. GTX 16 series is often referred to as entry level, even the 1660 Ti, because it's a 5 year old low end card.

x50 = entry level
x60 = low end
x70 = mid range
x80 = high end
x80 Ti/x90 = flagship

That is how NVIDIA's stack has been for a long time.
Check Wikipedia. The x50 tier is always mid-range, that is what x50 means. On a scale of 1-9, it is in the middle.

The Flagship GPU for NVIDIA is usually the x80, but occasionally it is the x70. Their x90 GPU is almost never the flagship.

The x60 is generally the mid-range standard.

---

I don't think you understand what Entry Level is. Entry Level is the Core i7-3770, 12GB DDR3, GTX 970 4GB I just upgraded from. It is the $50-$100 range for children, and current gen iGPUs and APUs.

It is what you upgrade from, to get a mid-range computer. If you are on the high-end you upgrade from your old mid-range PC, or you ride your mid-range PC out another console generation until it becomes Entry Level.
Going by a 1-9 scale doesn't make sense because they don't make models for every single number. GTX started with x50 and ended in x80 or x90 depending on the generation, and x90 is probably going to stick around as the flagship now. Just seems like a pointless argument over semantics.

The 970 is almost double the performance of the 1050, so if that's considered entry level, then the 1050 obviously isn't.

https://www.gpucheck.com/compare/nvidia-geforce-gtx-970-vs-nvidia-geforce-gtx-1050/intel-core-i7-4770k-3-50ghz-vs-intel-core-i7-8700k-3-70ghz/
Last edited by r.linder; 17 Feb 2024 @ 20:11
Originally posted by r.linder:
The 970 is almost double the performance of the 1050, so if that's considered entry level, then the 1050 obviously isn't.

https://www.gpucheck.com/compare/nvidia-geforce-gtx-970-vs-nvidia-geforce-gtx-1050/intel-core-i7-4770k-3-50ghz-vs-intel-core-i7-8700k-3-70ghz/
Not quite.

Entry Level is a range, between the GTX 1050 at the bottom, and the GTX 970 at the top.

The RTX 3050 6GB is straddling somewhere between mid-range and entry level, landing more on the side of Mid-Range. x40 cards were generally never a good value anyway, and x30 cards were generally abysmal value outside of Low Profile office use cases. Most of the Entry Level cards are second hand used cards that are several generations out of date.

The range between the GTX 1050 and the GT 650 is a no man's land where it does not quite qualify as nothing, but it does not qualify as potato either. Yet it is pretty much worthless. The GT 750 Ti being the worthless non-potato/non-nothing card of the day. I suppose this region between the GTX 1050 and GT 650 is subjective.

A Core 2 Quad Q6600 does not support AVX extensions, but that does not mean the computer is nothing. There are still use cases for such a workstation. There are reasons why a Raspberry Pi is popular. A Pentium 4 is nothing. Single core CPUs are potatoes, and most would agree that a dual core CPU is a potato.
Originally posted by CJM:
Originally posted by r.linder:
The 970 is almost double the performance of the 1050, so if that's considered entry level, then the 1050 obviously isn't.

https://www.gpucheck.com/compare/nvidia-geforce-gtx-970-vs-nvidia-geforce-gtx-1050/intel-core-i7-4770k-3-50ghz-vs-intel-core-i7-8700k-3-70ghz/
Not quite.

Entry Level is a range, between the GTX 1050 at the bottom, and the GTX 970 at the top.

The RTX 3050 6GB is straddling somewhere between mid-range and entry level, landing more on the side of Mid-Range. x40 cards were generally never a good value anyway, and x30 cards were generally abysmal value outside of Low Profile office use cases. Most of the Entry Level cards are second hand used cards that are several generations out of date.

The range between the GTX 1050 and the GT 650 is a no man's land where it does not quite qualify as nothing, but it does not qualify as potato either. Yet it is pretty much worthless. The GT 750 Ti being the worthless non-potato/non-nothing card of the day. I suppose this region between the GTX 1050 and GT 650 is subjective.

A Core 2 Quad Q6600 does not support AVX extensions, but that does not mean the computer is nothing. There are still use cases for such a workstation. There are reasons why a Raspberry Pi is popular. A Pentium 4 is nothing. Single core CPUs are potatoes, and most would agree that a dual core CPU is a potato.
The GTX 1050 isn't entry level anymore, that isn't debatable, you said it yourself that the VRAM kills it. 2GB isn't enough anymore, the bare minimum for modern games is between 4 and 8 depending on the game. There is no argument to make for it, it's too old and slow, and a lot of people are trying to get away from it.
Last edited by r.linder; 17 Feb 2024 @ 20:36
Originally posted by r.linder:
The GTX 1050 isn't entry level anymore, that isn't debatable, you said it yourself that the VRAM kills it. 2GB isn't enough anymore, the bare minimum for modern games is between 4 and 8 depending on the game.
Entry Level isn't always supported by modern games. The GTX 970 doesn't even have a proper 4GB, it is 3.5GB.

8GB is what NVIDIA is still putting on RTX 4060 series GPUs. AMD's RX 6600 is an 8GB card. These are mid-range cards, with mid-range VRAM capacities.

The GTX 1050 still classifies as entry level. Horizon: Zero Dawn requires a GTX 780 3GB. The Witcher 3 requires a GeForce GTX 660 2GB. Red Dead Redemption 2 requires NVIDIA GeForce GTX 770 2GB.

Red Dead Redemption 2 was released in 2018. Its requirements were built around the PlayStation 4 and Xbox One. EA's Formula 1 2022 (F1 22) was released on PS4. Meaning that even up to two years ago the PlayStation 4 was still "Entry Level".

The Nintendo Switch still exists and has not been replaced, as another argument for Entry Level. Several of its games are remasters of XBox 360 and PlayStation 3 games. Such games still have a market in 2024.

The Entry Level status of the GTX 1050 is debatable.
Last edited by CJM; 18 Feb 2024 @ 7:01