Is Nvidia killing PC gaming?
With the ludicrous pricing of the 4000-series GPUs, is Nvidia driving gamers away from PC gaming? I ask this because Sony recently said that 50% of PS5 sales are to people who never owned a PS4. So where is this incredible growth coming from? I suspect it might be PC gamers who have given up trying to buy a reasonably priced graphics card and have simply decided to buy a (relatively) cheap console instead.
My theory is that Nvidia saw that people were willing to pay hundreds of dollars more for 30-series cards a few years ago and decided to 'scalp direct' to the consumer. What I think they failed to understand is that the people willing to pay those exorbitant prices on sites like eBay weren't gamers at all but rather cryptominers. Now that the cryptomining bubble has burst, that market has evaporated as well.
Of course, there will always be outliers, people who will choose to pay Nvidia's markups, but the majority seem to be sitting tight. How will Nvidia respond? Will they cut prices, or are they banking on the low willpower of the average PC gamer to ride this out? What will you do if prices don't return to normal: switch to console gaming, or take out a second mortgage?
Originally posted by ΜΣ†ΛĿ:
I have been a PC gamer for more than 20 years, and I have always built top-end gaming rigs with the highest-tier GPU (up to two 980 Tis in SLI) to ensure I could play the latest games with maxed-out graphics, no matter the game. I was about to upgrade my PC with a 4090 but could not locate one in stock at MSRP; I even just missed it a few times when Microcenter showed stock. PC gaming has become very expensive, and I would spend roughly $3-4K every few years on a whole new build.

There is no doubt that PC can achieve better graphics than console. However, with the PS5 and Xbox Series X that gap is smaller than ever, and most games have a performance mode for 60/120 FPS on console. In the end, playing on a giant TV from the couch, the graphics difference overall is not that noticeable. Most of the time, when you're actually playing the game, you wouldn't even notice because of how close it is to maxed-out PC graphics. I mean, put me in front of a monitor and I can nitpick and point out the differences all day.

So recently, after playing some of the same games I have on PC on a PS5 at a friend's house and seeing how good the graphics are, I got a PS5 and the PS+ Extra subscription and have been playing games on the PS5 exclusively since then. I'm actually enjoying gaming more than on the PC, and I don't have to lug the PC around to different rooms to play with a controller on the couch, with some games not even having PS button prompts. Anyway, I have an RTX 3090 & 5800X build that I haven't used for gaming since I got the PS5, and I only paid $399 for it + $100 for a year's sub to PS Plus.

More and more games are just bad console ports, and even FPS games are made for controller first. The optimization of ports has gotten out of hand, and a lot of new games rely on DLSS or some sort of upscaling to hit high FPS.
Up close, PC has the edge on graphics; on the couch with a big TV, it looks almost the same, and at a good framerate.

I was actually contemplating getting Hogwarts Legacy on PC, but decided to get it on PS5.
Just look at the comparison (video link below): the PS5 version seems to run smoother, and there is a visible frame pacing/stutter issue on the PC in some scenes.

I can easily afford to keep upgrading my PC and have the latest and greatest; I just don't think it's worth the price, and I feel ripped off when consoles can provide almost the same gaming experience for a tiny fraction of the price, and I seem to enjoy it more. I don't think I will be building a gaming PC anytime soon, at least as long as consoles keep providing 60 FPS games (I know of two games that came out recently that only have a 30 FPS mode). That PS5 Pro needs to come out soon...
https://www.youtube.com/watch?v=esMKAOtLirw&t=639s
Originally posted by Komarimaru:
Originally posted by kingjames488:
I mean, there's also the thing about Nvidia actually controlling the GPU market...

idk if this is still something people talk about, but years ago there were rumours in PC magazines that Nvidia's R&D was actually ahead of what they release, and that they only release something strong enough to beat AMD...

also the thing about how Nvidia just performs better with the same specs because things are programmed to run better on Nvidia hardware... though the same has been said about Intel, and I think that comes down to that being the hardware the people who developed the benchmarks happened to use. So ya, there is probably some advantage there, but it's not like it's planned.
Every company does this. R&D starts years and years before any final product ships. Even after they complete something, sometimes the technology or the cost just isn't worth releasing it.

This is one reason why the 40 series is at such a price point. Nvidia not only went from 8nm to 4nm, a substantial tech and price advance; they did it with a ~600mm^2 die size, nearly the same as the 30 series, while still using 300mm wafers...

300mm wafers, 4nm process, 600+mm^2 dies. The loss on each wafer would be insane, not including the simply bad dies that can happen anyway.

I blame it on the silicon giants refusing to shift to 450mm wafers a decade ago like they should have. Prices would be far cheaper overall, but the initial cost to change from 300mm to 450mm is in the tens of billions, and they don't want to eat the cost of replacing all that manufacturing.
wow! 4nm now? TBH last time I looked into this in any detail they were back at like 16nm I think lol... but from what I've seen there haven't been a lot of changes since then, and they've just been pushing out whatever they can sell to bitcoin miners.

the general economy of GPUs used to mostly involve Nvidia paying companies to make games like Crysis, to sell more powerful GPUs than people actually needed.

cuz as I said before, devs aren't going to make games that the majority of people can't play. It makes no sense unless someone else is funding it.
Komarimaru 17 Feb 2023 @ 12:19
Originally posted by kingjames488:
Originally posted by Komarimaru:
Every company does this. R&D starts years and years before any final product ships. Even after they complete something, sometimes the technology or the cost just isn't worth releasing it.

This is one reason why the 40 series is at such a price point. Nvidia not only went from 8nm to 4nm, a substantial tech and price advance; they did it with a ~600mm^2 die size, nearly the same as the 30 series, while still using 300mm wafers...

300mm wafers, 4nm process, 600+mm^2 dies. The loss on each wafer would be insane, not including the simply bad dies that can happen anyway.

I blame it on the silicon giants refusing to shift to 450mm wafers a decade ago like they should have. Prices would be far cheaper overall, but the initial cost to change from 300mm to 450mm is in the tens of billions, and they don't want to eat the cost of replacing all that manufacturing.
wow! 4nm now? TBH last time I looked into this in any detail they were back at like 16nm I think lol... but from what I've seen there haven't been a lot of changes since then, and they've just been pushing out whatever they can sell to bitcoin miners.

the general economy of GPUs used to mostly involve Nvidia paying companies to make games like Crysis, to sell more powerful GPUs than people actually needed.

cuz as I said before, devs aren't going to make games that the majority of people can't play. It makes no sense unless someone else is funding it.
Ya, 4nm process for Nvidia. 608mm^2 die size.

Compare that to AMD doing chiplets: a 300mm^2 die at 5nm, with six 36.6mm^2 chiplets at 6nm. That alone drops the cost to below half that of a monolithic 608mm^2 die, with better yield.

To put it in perspective (silicon manufacturers are still on 300mm wafers):

300mm wafer, 1.95-2mm edge exclusion (all waste, but it lets dies sit as close to the edge as possible).
2-3 µm scribe lines, the cuts between each die. That's 0.002-0.003mm.
Nvidia can get at most 84 dies per wafer, assuming each one is perfect.

AMD, on the other hand, would get 209 of their main die from a 300mm wafer, and well over 900 of the small MCM chiplets per wafer. Or they could fill in the edge exclusion with MCMs, since they're so small, saving even more money.
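(To sanity-check those die counts, here's a rough sketch using the standard first-order dies-per-wafer approximation: usable wafer area divided by die area, minus an edge-loss correction. The edge-exclusion value matches the figure above; exact counts also depend on die aspect ratio and scribe width, so treat the output as ballpark.)

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2, edge_exclusion_mm=2.0):
    """First-order approximation: usable wafer area / die area,
    minus a correction for partial dies lost around the edge."""
    d = wafer_diameter_mm - 2 * edge_exclusion_mm  # usable diameter
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(300, 608))  # monolithic 608 mm^2 die: ~86 candidates
print(dies_per_wafer(300, 300))  # chiplet-style 300 mm^2 die: ~191
```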

Nvidia is on 4nm; AMD is on 5nm, with the MCMs (the little chiplets) at 6nm.

The cost in manufacturing between the two is MASSIVE.
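(The "better yield" half of that claim can be sketched with the common Poisson yield model, Y = exp(-D*A), where D is defect density and A is die area. The 0.1 defects/cm^2 figure below is an assumption for illustration; real defect densities aren't public.)

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Poisson yield model: probability that a die catches zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

for area in (608, 300, 37):  # monolithic die, chiplet main die, tiny MCD
    print(f"{area:>4} mm^2 die: ~{poisson_yield(area):.0%} of dies are good")
# 608 mm^2 -> ~54%, 300 mm^2 -> ~74%, 37 mm^2 -> ~96% (at the assumed D)
```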


And games will always exist that current hardware can't run maxed. It's been that way forever. When Doom first came out, people struggled to play it until they swapped to a 486 or even faster. Same for Quake, Unreal, etc.

Games will always push beyond current hardware limits for PC.
Originally posted by kingjames488:
idk if this is still something people talk about, but years ago there were rumours in PC magazines that Nvidia's R&D was actually ahead of what they release, and that they only release something strong enough to beat AMD...
Talked about or not, something like that is less of a rumor and more of an apparent reality (and not just for nVidia, and not just in this industry). After all, why release everything you've got when, say, two thirds of that will suffice, and you can milk the rest of it in the coming years? Not only that, but in a market of only two players where one is struggling, this is even more likely to happen, because if you release everything, you risk driving your sole competitor towards elimination, which means you become a monopoly. And you want to avoid that.

The same thing happened with CPUs not long ago. Intel did little more than sell the same product for half a dozen generations when AMD wasn't competing well. Why do you think that only stopped shortly after Ryzen, which is when AMD started competing well (specifically with Zen 2 and later)?

This is part of why only the RTX 4090 is a big jump and everything else in nVidia's lineup this generation is cut down basically a full tier. Relative to what they would have been in the past, the RTX 4080 is an RTX 4070, the RTX 4070 Ti is an RTX 4060 Ti, and the upcoming RTX 4060 is possibly going to be a mobile chip, one that is a quarter of the full flagship chip when the x60 used to be half. That's how cut down everything besides the RTX 4090 is. It's almost worse with GPUs because, even with the RX 6000 series competing well, nVidia just dominates in market share (and mind share). It's rather apparent that most of the Ada Lovelace products are cut down (yet still marked up in price) specifically because nVidia anticipates AMD can't (or won't) compete as well with it. They can afford to cut it down and STILL have a competitive product, even with the price mark-ups, because AMD just follows their lead in that area. The GPU market in the last half a decade is in the poorest shape I've ever seen. It really does remind me of the CPU market right before it, only worse (because prices are higher, and because unlike CPUs, GPUs don't age as well or last as long).

And of course they are often farther ahead on what they're working on than what's publicly available. Most of this stuff takes time, a lot of time, to develop.
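(To put rough numbers behind that "cut down a full tier" claim, here's a quick sketch comparing approximate die areas across the Ada lineup, with last generation's x60 part for contrast. Areas are rounded figures from public reporting; treat the tier mapping as illustrative.)

```python
# Approximate die areas (mm^2) from public reporting; rounded.
ada = {
    "AD102 (RTX 4090)": 608,
    "AD103 (RTX 4080)": 379,
    "AD104 (RTX 4070 Ti)": 295,
    "AD107 (RTX 4060, mobile-class)": 159,
}
flagship = ada["AD102 (RTX 4090)"]

for chip, area in ada.items():
    print(f"{chip}: {area} mm^2 = {area / flagship:.0%} of the flagship die")
# Last gen, GA106 (RTX 3060) was ~276 of GA102's ~628 mm^2, i.e. ~44%:
# roughly half the flagship, where AD107 comes in at about a quarter.
```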
I mean, it makes sense... and the barrier to entry is getting so high that there's not really anything anyone can do about it. there's really only one company in the world that makes the lithography machines capable of producing these tiny new process nodes, and only a couple of companies that can afford them and know how to use them...

it's pretty much just become a monopoly due to the specialized nature of it.
Komarimaru 17 Feb 2023 @ 12:28
Originally posted by kingjames488:
I mean, it makes sense... and the barrier to entry is getting so high that there's not really anything anyone can do about it. there's really only one company in the world that makes the lithography machines capable of producing these tiny new process nodes, and only a couple of companies that can afford them and know how to use them...

it's pretty much just become a monopoly due to the specialized nature of it.
This is exactly why Nvidia is moving to chiplets for the 50 series. I think the 40 series was the last of their massive monolithic dies.
Originally posted by kingjames488:
wow! 4nm now? TBH last time I looked into this in any detail they were back at like 16nm I think lol...
Process nodes haven't been shrinking at the rate their given names suggest for a long time now. It's been more of an "effective" size since around 2000 or so.

In addition, certain things aren't scaling down at the same rate as others. SRAM is one that is scaling much slower. This is one reason why, on modern CPUs, cache takes up so much more die area than it used to.
Last edited by Illusion of Progress; 17 Feb 2023 @ 12:30
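(Rough numbers for the SRAM point: TSMC's reported high-density SRAM bit-cell sizes have barely moved on recent nodes while logic kept shrinking. The figures below are approximate values from public TSMC disclosures as covered in the trade press.)

```python
# Reported high-density SRAM bit-cell sizes (um^2) by TSMC node; approximate.
sram_bitcell = {"N7": 0.027, "N5": 0.021, "N3E": 0.021}

nodes = list(sram_bitcell)
for prev, cur in zip(nodes, nodes[1:]):
    shrink = 1 - sram_bitcell[cur] / sram_bitcell[prev]
    print(f"{prev} -> {cur}: SRAM cell shrinks by ~{shrink:.0%}")
# N7 -> N5: ~22% smaller; N5 -> N3E: ~0%. SRAM has nearly stopped
# scaling, so cache eats an ever-larger share of each new die.
```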
Originally posted by Illusion of Progress:
Originally posted by kingjames488:
wow! 4nm now? TBH last time I looked into this in any detail they were back at like 16nm I think lol...
Process nodes haven't been shrinking at the rate their given names suggest for a long time now. It's been more of an "effective" size since around 2000 or so.

In addition, certain things aren't scaling down at the same rate as others. SRAM is one that is scaling much slower. This is one reason why, on modern CPUs, cache takes up so much more die area than it used to.
I mean, I remember them saying they couldn't scale down past like 8-16nm because of something at the particle level, where the electrons would just interfere with each other or some such thing... but idk, maybe they solved that? or maybe it's just some marketing?
I can't speak much to the specifics, but the general idea is that, yes, there have been increasing hurdles along the way (and they've come up with some ingenious things to overcome some of them). It's not just marketing; the struggles and accomplishments are real. But the process sizes themselves stopped being literal a couple of decades ago, around the late Pentium III/180nm time frame, I think?
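(For a sense of how non-literal the names are: the smallest repeating features on modern chips are many times larger than the node name. The gate-pitch figures below are approximate published values for TSMC processes, used purely for illustration.)

```python
# Marketing node name vs. approximate contacted gate pitch (nm),
# per public reporting on TSMC processes; values are approximate.
gate_pitch_nm = {"16nm": 90, "7nm": 57, "5nm": 51}

for node, pitch in gate_pitch_nm.items():
    ratio = pitch / float(node.rstrip("nm"))
    print(f'"{node}" node: ~{pitch} nm gate pitch ({ratio:.0f}x the name)')
# Nothing on a "5nm" chip is 5 nm wide; the name tracks an idealized
# density trend rather than any physical dimension.
```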
Originally posted by Illusion of Progress:
I can't speak much to the specifics, but the general idea is that, yes, there have been increasing hurdles along the way (and they've come up with some ingenious things to overcome some of them). It's not just marketing; the struggles and accomplishments are real. But the process sizes themselves stopped being literal a couple of decades ago, around the late Pentium III/180nm time frame, I think?
u lost me...
Komarimaru 17 Feb 2023 @ 12:41
Originally posted by Illusion of Progress:
Originally posted by kingjames488:
wow! 4nm now? TBH last time I looked into this in any detail they were back at like 16nm I think lol...
Process nodes haven't been shrinking at the rate their given names suggest for a long time now. It's been more of an "effective" size since around 2000 or so.

In addition, certain things aren't scaling down at the same rate as others. SRAM is one that is scaling much slower. This is one reason why, on modern CPUs, cache takes up so much more die area than it used to.
They are very much shrinking that far. Shrinking past a certain point, though, is getting hectic.

Just for a wafer, not including all the etching, which is so precise it's insane...

5nm process: $12,000-16,000 USD per wafer
4nm: $18,000-21,000
3nm: $20,000-35,000

Those are today's prices.

Yes, process nodes are shrinking that low.

Back in 2018, 7nm was $10,000. 10nm was $6,000 in 2016.

SRAM is slower to keep up though, yes. Mainly due to cost-effectiveness.
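(Tying those wafer prices back to the earlier die-count math: a rough cost-per-good-die sketch combining the quoted wafer prices, the first-order dies-per-wafer estimate, and a Poisson yield model. All inputs are this thread's approximate figures plus an assumed 0.1 defects/cm^2; none of this is official data.)

```python
import math

def dies_per_wafer(wafer_mm, die_mm2, edge_mm=2.0):
    """First-order dies-per-wafer approximation with edge-loss correction."""
    d = wafer_mm - 2 * edge_mm
    return int(math.pi * (d / 2) ** 2 / die_mm2
               - math.pi * d / math.sqrt(2 * die_mm2))

def cost_per_good_die(wafer_price_usd, die_mm2, defects_per_cm2=0.1):
    """Spread the wafer price over dies that survive a Poisson yield model."""
    good = dies_per_wafer(300, die_mm2) * math.exp(-defects_per_cm2 * die_mm2 / 100)
    return wafer_price_usd / good

# ~$18k for a 4nm wafer (quoted above), 608 mm^2 monolithic die:
print(f"monolithic 608 mm^2: ~${cost_per_good_die(18_000, 608):,.0f} per good die")
# ~$14k for a 5nm wafer, 300 mm^2 chiplet-style main die:
print(f"chiplet 300 mm^2:    ~${cost_per_good_die(14_000, 300):,.0f} per good die")
```

At these assumptions the monolithic die comes out around 4x the per-die cost of the chiplet one, which lines up with the "below half" claim earlier in the thread.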
guys, no offense, but this doom-saying about PC gaming dying because Nvidia and AMD decided to increase prices for one generation is exhausting, to say the least.

it's good to be vocal about your consumer rights. however, in this case we already know that Nvidia is sitting on a huge pile of Ampere cards that they need to get rid of first.

in Europe you can find the 4090 for 1800€ and, if I had to hazard a guess, the price will drop further as more lower-end Ada Lovelace GPUs release.

in the end, PC gaming is not going to die, and please stop promoting the PS5. you can't find one at MSRP in Europe; the digital edition is pretty much non-existent at MSRP, and the only disc versions you can find are horrendously overpriced bundles at 600€+ at retail.
Heretic 17 Feb 2023 @ 16:33
Originally posted by Pelopidas:
guys, no offense, but this doom-saying about PC gaming dying because Nvidia and AMD decided to increase prices for one generation is exhausting, to say the least.

it's good to be vocal about your consumer rights. however, in this case we already know that Nvidia is sitting on a huge pile of Ampere cards that they need to get rid of first.

in Europe you can find the 4090 for 1800€ and, if I had to hazard a guess, the price will drop further as more lower-end Ada Lovelace GPUs release.

in the end, PC gaming is not going to die, and please stop promoting the PS5. you can't find one at MSRP in Europe; the digital edition is pretty much non-existent at MSRP, and the only disc versions you can find are horrendously overpriced bundles at 600€+ at retail.
It isn't just Nvidia or AMD pricing. The industry is a joke, where AAA games are mostly in the hands of a few greedy and highly politicised corporations, who literally teach developers to dumb everything down to meet mass-market appeal.
_I_ 17 Feb 2023 @ 17:15
people have been saying everything is killing pc gaming for the last 20+ years

and there are still pc games and gamers
Originally posted by _I_:
people have been saying everything is killing pc gaming for the last 20+ years

and there are still pc games and gamers
wouldn't be a proper online forum if there weren't at least some people predicting the death of PC gaming and the dedicated-hardware market.

Originally posted by Heretic:
It isn't just Nvidia or AMD pricing. The industry is a joke, where AAA games are mostly in the hands of a few greedy and highly politicised corporations, who literally teach developers to dumb everything down to meet mass-market appeal.
brother, we are talking about the hardware market...

like I said before, let's wait for a year. I know some people like to get angry when they don't get their new RTX 90-tier GPUs at MSRP at release, but now they are on offer again and far cheaper than before.

currently in Europe they sell the 4090 for 1800€ (the entire cost of my 3080 12GB setup), the 4080 for 1260€ (the entire 3060 Ti + 5600X build I made for my father), and the 4070 Ti for 900€ (apparently really popular in Germany, according to Mindfactory).

are they good value? no, but again, let's wait until Nvidia has sold all their sunset Ampere GPUs, and we might then have a discussion about the price again
ZeekAncient 17 Feb 2023 @ 19:35
Can this thread die already?

Is Nvidia killing PC gaming? Give me a break. Sure, Nvidia's pricing of their GPUs is outrageous, but AMD is doing the same thing. And how about the outrageous pricing on other hardware, like gaming keyboards? Does anyone need a $400+ keyboard?

But how about we look at the game developers releasing really crappy PC ports? Games launching with massive stuttering and other performance issues, and limited features and settings. Sure, they sometimes get fixed with patches after launch, but there is really no need for games to be rushed to market so that early adopters struggle with "unoptimized" games.

This is what makes people think their hardware is not good enough and that they have to rush out and dish out hard-earned cash on overpriced hardware. You shouldn't need top-of-the-line hardware to enjoy newer games; that is why there are graphics settings. Not everyone has to play at Ultra with every setting maxed. But when even decent hardware from the last few years struggles to get decent frames at lower settings, there is a problem.

This is what makes people think they need to upgrade and rush out to spend on the latest hardware. You shouldn't need the latest hardware, especially if you have something from the last couple of generations. You don't need an overpriced 40-series GPU if you purchased a decent 30-series GPU; sh1t, a decent 20-series GPU should still be adequate in most circumstances, especially considering those GPUs weren't cheap either. But if developers would stop rushing games out to market and release them in much better states, I am sure a lot of people would realize that their current hardware is good enough and there isn't a need to upgrade.

This is what is killing PC gaming. PC gaming is not just about having better visuals and more FPS than consoles. Sure, there is that. But PC gaming has always been about options: people with varying degrees of hardware being able to enjoy the same game with varying degrees of visual fidelity. Most PC gamers never went out to buy the latest, most expensive hardware; that was left to the "ultra enthusiasts". PC gamers used to be able to build decent rigs without breaking the bank and enjoy all the games they wanted.

Now, with the value of sub-high-end GPUs diminishing and the state these games are being released in, it seems that priority is being placed on the ultra-expensive enthusiast hardware, making PC gamers feel that the mid-range stuff will not be adequate anymore. So yes, Nvidia is killing PC gaming. But so is AMD. So are all these other hardware and component manufacturers. And most importantly, it is being fueled by lazy game developers who are forced by their corporate overlords to release rushed, unoptimized games that should otherwise run great on the hardware you currently own, but don't.
Last edited by ZeekAncient; 17 Feb 2023 @ 22:22

Date Posted: 10 Feb 2023 @ 0:48
Posts: 98