AdahnGorion 16 Aug 2020 @ 2:56am
The big GPU/Tech Rumours thread - Nvidia/AMD/Others - Is it worth investing now? What's the latest news? Feel free to debate
This thread has evolved a lot and has now become a general tech rumours thread. It is primarily focused on GPUs, and it should stay that way. I will update the thread whenever we have new information about upcoming GPU releases. Feel free to debate some of the catchier tech rumours as well.

I have reserved this space in the thread starter to keep current debates and new information up to date. At the moment:


2024/2025

Here are the official announcements
https://www.youtube.com/watch?v=k82RwXqZHY8&t=1496s
https://www.nvidia.com/en-us/geforce/graphics-cards/50-series/
------

Various information about the new 5000 series.
https://www.techradar.com/computing/gpu/nvidia-unveils-new-geforce-rtx-5090-rtx-5080-rtx-5070-ti-and-rtx-5070-graphics-cards-at-ces-2025


We have started to see more information about the new Nvidia Blackwell GPUs (the 5000 series).

Here is a leak, not very interesting tbh.
https://www.pcgamesn.com/nvidia/geforce-rtx-5000-specs-boring-leak
https://www.pcgamer.com/nvidia-blackwell/



2023

We debate the 4000-series cards already released and the ones still to come, along with rumours about the 5000 series, which is being showcased as the biggest performance leap in GPU history (like every time, of course).

https://wccftech.com/nvidia-preps-mass-production-two-ad104-ada-gpus-possibly-geforce-rtx-4070-rtx-4060-ti/amp/


2021

https://www.tomshardware.com/news/amd-increase-efficiency-of-chips-thirtyfold-by-2025
https://www.tomshardware.com/news/asus-shows-off-geforce-rtx-with-noctua-cooler
https://winfuture.de/news,125475.html



So I figured we should have a thread about the new RTX 3xxx series, which is soon to release.
What is your feeling about it? Do you think some of the rumours are true? Are you upgrading?


The flagship seems to be a 3090 (I like them going back to the x90 tier)
https://videocardz.com/newz/micron-confirms-nvidia-geforce-rtx-3090-gets-21gbps-gddr6x-memory

Personally I will not upgrade yet, and even if I had to (I do not), I would still wait for prices to go down; early adoption is expensive and often not worth it. But I am interested in which rumours might be true.

Some of the earlier rumours have suggested even higher VRAM amounts (16 GB and up), and performance claims have ranged from 10-50% in effective speed and up to 60% improvement in ray tracing.


I think it will be 10-15% at most in effective speed on each tier, but I do think we will see significant increases on the ray tracing side.


Other sources (remember to take sources and numbers with a grain of salt)
https://www.techradar.com/news/nvidia-rtx-3000-launch-details-leak-and-amd-could-be-in-big-trouble
https://www.tweaktown.com/news/69629/nvidia-ampere-gpu-50-faster-turing-half-power/index.html
https://www.gpumag.com/nvidia-geforce-rtx-3000-series/
https://www.digitaltrends.com/computing/nvidia-rtx-3080-rumors-news-release-date-performance/
https://www.rockpapershotgun.com/2020/08/10/nvidia-ampere-rtx-3000-everything-we-know-so-far/
https://videocardz.com/newz/nvidia-geforce-rtx-3090-graphics-card-pictured
https://videocardz.com/newz/nvidia-allegedly-cancels-geforce-rtx-3080-20gb-and-rtx-3070-16gb
Pricing
https://www.techpowerup.com/271081/rumor-geforce-rtx-3090-pricing-to-arrive-around-the-usd-2-000-mark
https://www.neowin.net/news/alleged-rtx-3080-ti-rtx-3070-ti-launch-dates-have-leaked/

AMD - Big Navi

It will be interesting to see what AMD brings to the table. Personally I think they will go for the mid-tier market, and if they succeed there, we might see battles for the top tier next generation.

Rumours

https://www.eteknix.com/amd-navi-21-xt-graphics-card-specs-leak/
https://www.kitguru.net/components/graphic-cards/joao-silva/amd-navi-21-xt-xl-leak-suggests-an-over-2-0ghz-boost-clock/
https://www.guru3d.com/news-story/rumor-amds-navi-21-gpu-has-255-watts-tgp-and-boost-up-to-2-4-ghz.html
https://videocardz.com/newz/amd-navi-21-xt-to-feature-2-3-2-4-ghz-game-clock-250w-tgp-and-16-gb-gddr6-memory
https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/
https://videocardz.com/newz/amd-radeon-rx-6800xt-alleged-3dmark-scores-hit-the-web
https://wccftech.com/intel-first-high-end-xe-hpg-dg2-gaming-graphics-pictured-rumored-specs-performance-rtx-3080-performance/amp/


Interesting stats about Big Navi

It seems like all of them will ship with at least 16 GB; if that is true, that is surprising tbh.
Rumours talk about a 2.4 GHz clock on the Navi 21 XT. I think it will be pretty exciting to watch. We will know more in a week, once it gets revealed.


It seems that Nvidia is releasing a new series for miners and is changing the 3060, which will release on 25 February. Links below.
https://www.pcgamer.com/nvidia-cmp-mining-cards-rtx-3060-half-hash-rate/
https://hothardware.com/news/nvidia-geforce-rtx-3060-availability-crypto-mining-cmp-gpu
https://www.nvidia.com/en-us/cmp/

Various Rumours about hardware

https://videocardz.com/newz/bitmain-antminer-e9-ethereum-asic-is-as-powerful-as-32-geforce-rtx-3080-graphics-cards
https://wccftech.com/mainstream-ddr5-memory-modules-pictured-rolling-out-for-mass-production/amp/



Do you want to show off your new build, or even just your old one? Then you can do so at
https://steamcommunity.com/discussions/forum/11/5413843407449992305/

That is our benchmark thread. You are still free to debate hardware-related stuff and new builds here, ofc.
Last edited by AdahnGorion; 8 Jan @ 12:36am
Originally posted by C1REX:
I also think about this. I was planning to skip this generation as my old GPU was plenty for me, but the rumours about what the next gen can look like (no high end AMD) convinced me to buy this gen's GPU.

I listened to a recent "Moore's Law is Dead" podcast, and I really like the idea from his guest (an AI expert) that Intel and AMD could really mess up Nvidia's plans if they released relatively cheap "gaming" GPUs with an optional 32GB of VRAM. As the guest stated, VRAM = performance for AI.

I have a huge backlog of great but old games with easy, potato graphics. If the next gen or two will be a ♥♥♥♥ show, then I will enjoy the spectacle.
I'm not at all informed on the HPC/AI industry so I don't know enough to have an opinion here. But one thing I would find myself wondering/asking is "would AMD do this if it might also lower the profit they can get on their own HPC/AI segment chips?". The modern market, in its never ending chase for higher profits, seems (key words, mind you) to be leaning towards margins over volume these days, so I'm not sure if AMD would hurt itself in more important sales even if it allowed them to gain market share in the gaming segment. That might make sense if all HPC driving demand was expected to dry up soon and they would have to rely entirely on gaming, but that doesn't seem likely either.

Again, all thoughts from someone who doesn't know enough to have a formal opinion.

My 7800 XT (presuming it works out in the end; it's on its way back from RMA now) is pretty high end compared to what I typically go for, even if it really only falls around half the performance of nVidia's best, so I would definitely be able to sit with it for a while, I think (I mean my GTX 1060 lasted me around seven years, and while it is showing serious age now, I can't complain). Especially because most games don't need anywhere near the level of hardware most enthusiasts have. Not playing on the highest resolutions and not demanding the highest frame rates works in my favor too.
C1REX 14 Dec 2023 @ 2:01pm
Originally posted by Illusion of Progress:
I'm not at all informed on the HPC/AI industry so I don't know enough to have an opinion here. But one thing I would find myself wondering/asking is "would AMD do this if it might also lower the profit they can get on their own HPC/AI segment chips?". The modern market, in its never ending chase for higher profits, seems (key words, mind you) to be leaning towards margins over volume these days, so I'm not sure if AMD would hurt itself in more important sales even if it allowed them to gain market share in the gaming segment. That might make sense if all HPC driving demand was expected to dry up soon and they would have to rely entirely on gaming, but that doesn't seem likely either.

They were talking about this as well. Intel is suspected to potentially make such a move first, and then AMD would need to decide if they want to respond by doing the same or not. I'm not an expert but found this part of the discussion fascinating. I like such theory crafting.
Jamebonds1 14 Dec 2023 @ 8:35pm
Originally posted by C1REX:
Originally posted by Jamebonds1:
I think what we're dealing with is the after-effects of the gpu shortage. More people are buying a higher level, expecting to keep it longer, instead of buying a lower tier and planning to upgrade again in a year or two. They're spending the money now on a 4090 because if there's another shortage they're going to spend the same amount of $$$ for something that performs a lot worse. Might as well spend the money on a 4090 now and be set for a long time.

If my theory is right, then there's going to be a massive drop-off in consumer high-end sales next-gen compared to current-gen.

Maybe. Your theory makes sense. In my opinion, however, we are dealing with a very bad 4060 and 4060 Ti, where people see the 3060 12GB as a better deal. It's the most popular GPU in the world now. Also, there is the exceptionally good 4090, which is uniquely better value than the 4080.
Umm... wrong quote?
C1REX 15 Dec 2023 @ 4:07am
Originally posted by Jamebonds1:
Umm... wrong quote?
Yeah. My bad. I’m sorry.
Look, it has always been this way for every Nvidia generation:

the xx70 was the most bang for your buck;
the xx60 usually cost 60% of what an xx70 cost, but performed only 40% as well;
the xx80 usually cost 150% of what an xx70 cost, but performed only 120% as well;
the xx80 Ti usually cost 200% of what an xx70 cost, but performed only 140% as well;
the titan cards (what the xx90 actually is) cost 300% of what an xx70 cost, but performed only 150% as well.

If we look from the 9xx series up to the 40xx series, we see that the xx90 and xx80 Ti have increased in price relatively far more than the xx70,
but prices have spiked massively across the board.

A 4060 now costs around 300 euro; that's a 50% increase vs the 200 euro an xx60 used to cost.
Its price point is now 50% of a 4070,
which means it costs about 20% less than it historically should.

A 4070 now costs around 600 euro; that's a 70% increase vs the 350 euro an xx70 used to cost.

A 4080 now costs around 1200 euro; that's a 140% increase vs the 500 euro an xx80 used to cost.
Its price point is now 200% of a 4070,
which means it costs proportionally 34% more than it historically should.

A 4090 now costs around 2100 euro; that's a 100% increase vs the 1050 euro a titan used to cost.
Its price point is now 350% of a 4070,
which means it costs proportionally 17% more than it historically should.

So we have an average price increase of roughly 70% for GPUs in just 5 years, while before that prices had stayed stable for nearly 15 years (see the sketch below).
But at the same time the xx60 has been made a bit more affordable, while the xx80 and xx90 have been made less attractive.

We all know the hated crypto miners are to blame for the first one. Drop GPU prices back too much, or bump performance too much without a price increase, and crypto craze 3 will start. It would easily take 15 years to triple GPU production capacity to meet that new demand, so until then, all GPU producers can do is keep price per performance roughly the same, meaning each generation needs to get proportionally more expensive, to keep those cards from being too attractive to miners.

The second one, however, is likely due to the death of SLI.
In the days of SLI, a person could always pick two of the lesser cards rather than one more expensive one.
It did not scale perfectly (two cards performed equal to about 1.6x of one card), but prices of the better card still had to stay low enough that just going SLI wasn't the cheaper option.

If the 5xxx series gives the kind of performance boost that is rumoured, it will likely mean either a massive price increase or another crypto craze, meaning no GPUs for us gamers at all.
Neither will be good.
Last edited by De Hollandse Ezel; 15 Dec 2023 @ 6:06am
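Below is a minimal Python sketch of the tier-ratio arithmetic from the post above. The euro prices and the historic tier ratios (relative to the xx70) are taken from the post itself and should be treated as the poster's rough figures, not verified MSRPs.

```python
# Rough sketch of the tier-ratio arithmetic from the post above.
# All prices (euro) and historic ratios are the poster's figures, not verified MSRPs.

historic_ratio = {       # price relative to the xx70 "back then"
    "xx60": 0.60,
    "xx70": 1.00,
    "xx80": 1.50,
    "xx90 / titan": 3.00,
}
historic_price = {"xx60": 200, "xx70": 350, "xx80": 500, "xx90 / titan": 1050}
current_price = {"xx60": 300, "xx70": 600, "xx80": 1200, "xx90 / titan": 2100}

for tier in historic_ratio:
    ratio_now = current_price[tier] / current_price["xx70"]    # e.g. 4060: 300/600 = 0.50
    vs_history = ratio_now / historic_ratio[tier] - 1          # deviation from the old tier ratio
    increase = current_price[tier] / historic_price[tier] - 1  # raw price increase vs "back then"
    print(f"{tier:>13}: {ratio_now:.0%} of the xx70 "
          f"({vs_history:+.0%} vs the historic ratio), "
          f"{increase:+.0%} vs its old price")
```

Run as-is, it reproduces roughly the -17% / +33% / +17% deviations the post quotes for the 4060, 4080 and 4090, plus each card's raw price increase versus the old tier price.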
Originally posted by De Hollandse Ezel:
the xx60 usually cost 60% of what an xx70 cost, but performed only 40% as well;
the xx80 usually cost 150% of what an xx70 cost, but performed only 120% as well;
the xx80 Ti usually cost 200% of what an xx70 cost, but performed only 140% as well;
Wha... what!?

This is wrong on pretty much everything. I'm willing to state maybe pricing differed that wildly wherever you're from (though I'm questioning even this...), but the performance comparisons are still way off base then.

In what generation did an x70 add 100%+ performance to an x60? In what generation did an x80 add 100% performance to an x70?

Using Pascal and US pricing as an example here...

The GTX 1060 had over half (and really it's nearly two thirds at ~63%) of the performance of the GTX 1080 and cost a bit less than half ($250 to $600). The GTX 1060 had half of the performance of even the later GTX 1080 Ti which was an additional $100 more on top of the non-Ti pricing (so double the performance for well more than double the cost).

Even if pricing in your region differed, you didn't get different versions of these chips that had larger performance gaps unless we lived in different realities or something.

Sources...

TechPowerUp[www.techpowerup.com] (use the "relative performance" to compare it to the GTX 1060, GTX 1070, and GTX 1080 Ti).

Tom's Hardware[www.tomshardware.com]

Even if you don't like those sources, you have to find numerous better ones that suggest those two sources are just sooo far off the mark and that the real gap between tiers was over twice what they are shown here. I'm doubting you'll be able to find this, for obvious reasons.

Where are you seeing 100%+ performance gaps between x60s and x70s, and then 100%+ performance gaps from x70s to x80s!? You're lucky if that's the total difference from x60 to x80, Ti or not! Not in Pascal. Not in Turing. Not in Ada. Maxwell was an outlier where the x70 was at its cheapest and the x60 was relatively mediocre (and even then the GTX 970 had its issues, like the whole VRAM situation), but Maxwell was an outlier, not the "typical", in your words, trend. Go back further to generations before Maxwell, like Kepler, Fermi, or even further back, and the gaps are similar to the ones I'm describing.

People will do anything including straight up lie about factual performance to claim this current generation isn't an outlier and it's wild...
Originally posted by Illusion of Progress:
Originally posted by De Hollandse Ezel:
the xx60 usually cost 60% of what an xx70 cost, but performed only 40% as well;
the xx80 usually cost 150% of what an xx70 cost, but performed only 120% as well;
the xx80 Ti usually cost 200% of what an xx70 cost, but performed only 140% as well;
Wha... what!?

This is wrong on pretty much everything. I'm willing to state maybe pricing differed that wildly wherever you're from (though I'm questioning even this...), but the performance comparisons are still way off base then.

In what generation did an x70 add 100%+ performance to an x60? In what generation did an x80 add 100% performance to an x70?

Using Pascal and US pricing as an example here...

The GTX 1060 had over half (and really it's nearly two thirds at ~63%) of the performance of the GTX 1080 and cost a bit less than half ($250 to $600). The GTX 1060 had half of the performance of even the later GTX 1080 Ti which was an additional $100 more on top of the non-Ti pricing (so double the performance for well more than double the cost).

Even if pricing in your region differed, you didn't get different versions of these chips that had larger performance gaps unless we lived in different realities or something.

Sources...

TechPowerUp[www.techpowerup.com] (use the "relative performance" to compare it to the GTX 1060, GTX 1070, and GTX 1080 Ti).

Tom's Hardware[www.tomshardware.com]

Even if you don't like those sources, you have to find numerous better ones that suggest those two sources are just sooo far off the mark and that the real gap between tiers was over twice what they are shown here. I'm doubting you'll be able to find this, for obvious reasons.

Where are you seeing 100%+ performance gaps between x60s and x70s, and then 100%+ performance gaps from x70s to x80s!? You're lucky if that's the total difference from x60 to x80, Ti or not! Not in Pascal. Not in Turing. Not in Ada. Maxwell was an outlier where the x70 was at its cheapest and the x60 was relatively mediocre (and even then the GTX 970 had its issues, like the whole VRAM situation), but Maxwell was an outlier, not the "typical", in your words, trend. Go back further to generations before Maxwell, like Kepler, Fermi, or even further back, and the gaps are similar to the ones I'm describing.

People will do anything including straight up lie about factual performance to claim this current generation isn't an outlier and it's wild...


The "normal pricing" started around the 300 or 400 series and ended with the 900 series (prices were already a bit increased with the 10 series).

And in EVERY ONE of those series:

a titan was 1050 euro
an xx80 Ti was 700 euro
an xx80 was 500 euro
an xx70 was 350 euro
an xx60 was 200 euro

There was of course a little deviation; in one gen the 80 Ti was more like 680, in another more like 720, so add +/- 10% deviation per gen on these prices, but they were pretty much the same for a good decade.

And while I do think it is normal that an xx60, as the budget card, performs at less than half of the xx70 (the mid-range card) while costing more than half of it,
in fact in this 4xxx generation it is an outlier:
the 4060 costs only 50% of a 4070, not 60%,
and it performs 50% (at 1440p) to 60% (at 1080p) of a 4070, which is higher than the 40% it "should".

This gen's 4060 is an outlier in that it is cheaper and performs better than it "should".
Last edited by De Hollandse Ezel; 15 Dec 2023 @ 8:08am
Sorry De Hollandse Ezel, but that does not sound correct. I will have to point it out once again: it is a bad idea to buy a new GPU every generation, because it could mean an $1,800 loss. So it is not an issue with the GPUs' value.
Last edited by Jamebonds1; 15 Dec 2023 @ 9:34am
So pricing (roughly) matched that of the US it seems? That just makes me wonder where your earlier numbers came from even more.

Even if you add 10% deviation to your pricing, your earlier gaps make no sense. I wasn't nitpicking about 10% deviations here. I was asking why you were claiming there was a doubling of performance and price where there was typically half of that at best. That's not nitpicking; that's a "what planet were you on?" situation.

You're quoting 100% gaps in pricing and performance going from the x60 to x70, and again from x70 to base x80?

The only tiers where there was traditionally a roughly doubling of performance (plus or minus) was between the x50 to x60. That's it. To get another doubling of performance, you would have had to compare the x60 to x80 (and often times the x80 Ti), not the x70.

The x70 was usually a worse value than the x60, offered less performance, and was sometimes also a worse value than the x80. So it usually made sense to just skip it and go for the x80 if you wanted more than the x70, but yes, sometimes budgets force us to accept the lower value. And that's fine; the x70 serves its role, but I'm not sure why you're claiming that role was better value than the x60. That was almost never true.

See Ampere for reference. The RTX 3060 Ti and RTX 3080 were good values. The RTX 3070 and RTX 3070 Ti were not (and they had 8 GB VRAM too, which was seen as too little for the times and for the level of performance they offer).

Same with Pascal. The GTX 1060 offered 75% of the performance of the GTX 1070 but cost 65% of the price (see the sketch after this post).

Maxwell is the only likely exception that comes to mind, but both the x60 and x70 were compromised here (again, since we're discussing past generations so the current one being another exception is out of the picture here).
Originally posted by De Hollandse Ezel:
This gen's 4060 is an outlier in that it is cheaper and performs better than it "should".
I interestingly have the opposite opinion, and I used to buy precisely the x60 products and I used to like them. But the more I look at modern x60 products, the more they appear to be less than they used to be.

They used to be good, but the RTX 4060 in particular is somewhat mediocre. It's cheaper specifically because it has to be; because it has a low generational performance uplift, and it loses VRAM and bus width. It's too compromised. It's a glorified x50 product.

I know the RTX 4070 isn't a sufficient answer for everyone, since many people have budgets below this point, but it really feels like the lowest you can "blanket recommend" among nVidia's current offerings. It fills the spot the x60 used to. You used to be able to blanket recommend any x60 product and never feel wrong. Now the x60 feels bad to recommend. It's an "only if you can't save for an RTX 4070" solution, and that is by design. nVidia designed this lineup to push mainstream users up to the x70, and enthusiasts up to the x90. It's an upsell generation because they couldn't offer the traditional uplifts without cannibalizing the previous generation, of which they had too much of an oversupply. Maybe the RTX 50 series will be better (and I hope it is!), but this generation is certainly an outlier compared to past norms.

The RTX 4060, RTX 4070 Ti, and RTX 4080 are all relatively mediocre values to me. With the RTX 4090 pricing going up, the RTX 4080 starts to look a bit better, but I'm hoping the SUPER refreshes fix both the RTX 4070 Ti's and RTX 4080's bad offerings. Unfortunately, the x60 tier will not get a SUPER refresh (yet?), so the RTX 4060 will remain somewhat mediocre (at least for now).
Last edited by Illusion of Progress; 15 Dec 2023 @ 9:42am
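A minimal sketch of the value comparison being made in the post above, using the percentages quoted there (the GTX 1060 at roughly 75% of a 1070's performance for roughly 65% of its price). The dollar prices and the 1080's relative-performance figure are illustrative assumptions added for the example, not sourced benchmarks.

```python
# Sketch of the "performance per dollar" comparison from the post above.
# rel_perf is performance relative to the GTX 1070; prices are illustrative US MSRPs.
cards = {
    "GTX 1060": {"price": 249, "rel_perf": 0.75},  # ~75% of a 1070, per the post
    "GTX 1070": {"price": 379, "rel_perf": 1.00},
    "GTX 1080": {"price": 599, "rel_perf": 1.30},  # assumed ~30% faster than a 1070
}

baseline = cards["GTX 1070"]["rel_perf"] / cards["GTX 1070"]["price"]
for name, c in cards.items():
    perf_per_dollar = c["rel_perf"] / c["price"]
    print(f"{name}: {perf_per_dollar / baseline:.2f}x the 1070's performance per dollar")
```

With these assumed numbers the x60 comes out ahead of the x70 on performance per dollar, which is the pattern the post describes for most generations.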
Originally posted by Jamebonds1:
Sorry De Hollandse Ezel, but that does not sound correct. I will have to point it out once again: it is a bad idea to buy a new GPU every generation, because it could mean an $1,800 loss. So it is not an issue with the GPUs' value.

As a general rule, one replaces one's GPU once every 2 generations, in the same quality bracket.

So: a 680 Ti, then skip the 700 series, buy a 780 Ti, skip the 1xxx series, buy a 2080 Ti, skip the 3xxx series, buy a 4080 Ti.

Or: a 570, skip the 6xx series, buy a 770, skip the 9xx series, buy a 1070, skip the 2xxx series, buy a 3070, skip the 4xxx series.

And the low-end card is the xx60.

So each generation, 50% of all gamers should be looking to replace their card.

And nobody should still be gaming on a 2xxx series or older
(except people on a 2080 Ti, as the 4080 Ti has not come out yet),
nor should anybody be using an xx50 or xx50 Ti.

We should only see 3060 and up, and 4060 and up.

But yeah, the price spikes have made it expensive to stay in your bracket.

Replacing a 700 euro 980 Ti with a 1600 euro 2080 Ti did hurt.
And stepping down to a mere 2080 and still paying 1200? Never! I so hope that when the 4080 Ti releases it is priced better.

And yeah, I liked it much better when the xx80 Ti cost 600-700 euro, and not more than what a titan used to cost.
Last edited by De Hollandse Ezel; 15 Dec 2023 @ 10:04am
Originally posted by De Hollandse Ezel:
Originally posted by Jamebonds1:
Sorry De Hollandse Ezel, but that does not sound correct. I will have to point it out once again: it is a bad idea to buy a new GPU every generation, because it could mean an $1,800 loss. So it is not an issue with the GPUs' value.

As a general rule, one replaces one's GPU once every 2 generations, in the same quality bracket.

So: a 680 Ti, then skip the 700 series, buy a 780 Ti, skip the 1xxx series, buy a 2080 Ti, skip the 3xxx series, buy a 4080 Ti.

Or: a 570, skip the 6xx series, buy a 770, skip the 9xx series, buy a 1070, skip the 2xxx series, buy a 3070, skip the 4xxx series.

And the low-end card is the xx60.

So each generation, 50% of all gamers should be looking to replace their card.

And nobody should still be gaming on a 2xxx series or older,
nor should anybody be using an xx50 or xx50 Ti.

We should only see 3060 and up, and 4060 and up.

But yeah, the price spikes have made it expensive to stay in your bracket.

Replacing a 700 euro 980 Ti with a 1600 euro 2080 Ti did hurt.
And stepping down to a mere 2080 and still paying 1200? Never!
Actually, you don't have to wait two generations before buying a new video card. That is still a bad idea, because you will lose more money too. I jumped from a GTX 680 to an RTX 4080 when 2023 AAA games no longer worked and security support was out of date.
Originally posted by De Hollandse Ezel:
As a general rule, one replaces one's GPU once every 2 generations, in the same quality bracket.

So each generation, 50% of all gamers should be looking to replace their card.
Why lock yourself to some strangely arbitrary things like these though?

You should replace something only when you want more performance.

And you should replace it with the best thing within your budget.

It doesn't need to be harder than that.
Originally posted by De Hollandse Ezel:
And nobody should still be gaming on a 2xxx series or older
(except people on a 2080 Ti, as the 4080 Ti has not come out yet),
nor should anybody be using an xx50 or xx50 Ti.

We should only see 3060 and up, and 4060 and up.
That's nonsense.

People should use things as long as they prove sufficient, and there's no need to buy more than they need, nor sooner than they need to. Using an x50 is fine, and sticking with a card more than every other generation is fine. Why waste something if it's still proving useful?

Conversely, this goes the other way. Maybe you have something only one generation old and already want more. Are those people wrong, too? No, they're not.

Get what you want, when you want it, and only then. Again, it's not harder than that.
Originally posted by Illusion of Progress:
Originally posted by De Hollandse Ezel:
And nobody should still be gaming on a 2xxx series or older
(except people on a 2080 Ti, as the 4080 Ti has not come out yet),
nor should anybody be using an xx50 or xx50 Ti.

We should only see 3060 and up, and 4060 and up.
That's nonsense.

I thought the same. I have kept my GTX 680 since 2013 with no issues, until this year when I got Battlefield 2042 and the security updates had expired. Nothing wrong with that, not even for my PC with the games I have.
Nope, nothing wrong with using something that works.

I kept my GTX 1060 until this year. I'm actually using it right now as I wait on the other to come back from RMA.
Originally posted by Illusion of Progress:
Originally posted by De Hollandse Ezel:
As a general rule, one replaces one's GPU once every 2 generations, in the same quality bracket.

So each generation, 50% of all gamers should be looking to replace their card.
Why lock yourself to some strangely arbitrary things like these though?

You should replace something only when you want more performance.

And you should replace it with the best thing within your budget.

It doesn't need to be harder than that.
Originally posted by De Hollandse Ezel:
And nobody should still be gaming on a 2xxx series or older
(except people on a 2080 Ti, as the 4080 Ti has not come out yet),
nor should anybody be using an xx50 or xx50 Ti.

We should only see 3060 and up, and 4060 and up.
That's nonsense.

People should use things as long as they prove sufficient, and there's no need to buy more than they need, nor sooner than they need to. Using an x50 is fine, and sticking with a card more than every other generation is fine. Why waste something if it's still proving useful?

Conversely, this goes other way. Maybe you have something only one generation old and already want more. Are those people wrong, too? No, they're not.

Get what you want, when you want it, and only then. Again, it's not harder than that.

these rules are what defunes?agamer have done so since the 80s.
sure you have the "filthy casuals" but those aint gamers ....;)

1 you deciede whether gaming is your main hobby. if no.
dont invest in pc get a console.

2 you look at what your montly budget allows.
(45, 75 or 100+ euro per month)

3 you start setting apart that amount each month consuder it a fixed expense. like rentdo not touch what you set aside for pc for anything else.

4 after you reached your start amount you buy yoir forst system.. from than on you follow your ubdate scedule loyally. and keep setting asude the same nonthly amount.
you are allowed to delay a replacement or upgrade 3 months if a launch of a new series is near thats only logical but not 6..

that way you keep costs in your budget and keep gaming nicely.

better be a permament low end gamer with often replacemnts that but 1 mid end pc and not replace it for a decade.
Last edited by De Hollandse Ezel; 15 Dec 2023 @ 3:56pm
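A minimal sketch of the saving plan described above, using the monthly amounts the poster names and, as an assumed target, the rough euro card prices quoted earlier in the thread; both sets of numbers are the poster's figures, not recommendations.

```python
# How many months of saving it takes to afford a card at the poster's monthly budgets.
# Card prices (euro) are the rough current figures quoted earlier in the thread.
import math

monthly_budgets = [45, 75, 100]  # euro per month, as named in the post
card_prices = {"4060": 300, "4070": 600, "4080": 1200, "4090": 2100}

for budget in monthly_budgets:
    for card, price in card_prices.items():
        months = math.ceil(price / budget)
        print(f"{budget} euro/month -> {card} ({price} euro) after ~{months} months")
    print()
```

For example, at 45 euro a month the quoted 600 euro 4070 price works out to about 14 months of saving.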
