Doesn't the Steam Deck use cloud gaming? I wonder how fast your internet would need to be to send 60+ frames per second of highly graphical information down your line without massive input lag.
No, but you can stream/Remote Play from your devices to your own or your friends' devices:
using Steam Link hardware or the Steam Link apps (PC/Mac/Linux/Android/TV, etc.), or the Steam desktop client with both ends signed into the same account, or Remote Play Together when you invite a friend's Steam account.
A good quality 1080p60 stream takes 30-40 Mb/s. On LAN that's fine with pretty much any router, but across the internet it would require at least that much upload speed at the host's location and download speed at the client's location; with low ping (<50 ms) it would be playable for most games.
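For a rough sense of where a figure like 30-40 Mb/s comes from, here is a back-of-the-envelope sketch in Python. The bits-per-pixel value is an assumption (a plausible figure for a fast hardware encoder at streaming quality), chosen to line up with the estimate above, not a measured number:

# Back-of-the-envelope estimate of game-streaming bandwidth.
# bits_per_pixel is an assumed compression factor, tuned to
# match the 30-40 Mb/s figure quoted above.

def stream_bitrate_mbps(width, height, fps, bits_per_pixel=0.25):
    """Estimated encoded bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw = 1920 * 1080 * 60 * 24 / 1e6           # uncompressed 24-bit RGB
encoded = stream_bitrate_mbps(1920, 1080, 60)

print(f"raw 1080p60: {raw:.0f} Mb/s, encoded estimate: {encoded:.0f} Mb/s")
# -> raw 1080p60: 2986 Mb/s, encoded estimate: 31 Mb/s

The point being that the encoder does almost all of the work; the 30-40 Mb/s only has to carry roughly 1% of the raw pixel data.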
This is not like, say, a single company making food, branching into machinery and having that be very successful, and then deciding the original food venture is just not worth it for them anymore. That would be more likely as they are two very different things insofar as production and maintenance and all that. By contrast, while consumer CPUs and AI accelerators differ in use case, they share quite a lot in development. So it's sort of "free" to continue the lesser one, especially since they already have a near monopoly in that market and because the AI segment, while very lucrative right now, is still not as proven long term.
It's possible we'll see generations that improve less in gaming than they otherwise would have, either because generational changes focus on advancements for AI use cases, or because the consumer chips get cut down further in order to focus finite production time on the AI segment (the latter, at least, is already very real and happening right now, with the RTX 5080 representing a smaller slice of the flagship than the x60 tier used to).
7 to 10 years can be a long time, but unless the consumer GPU segment collapses hard between now and then, I imagine nVidia will most likely still be developing consumer GPUs at that point. If anything, there's a bigger chance that either Intel or AMD won't be (this isn't saying I think they won't be, just that the chance for nVidia not to be is probably far smaller).
And for the people talking about the "opportunity cost" because there are only so many resources to produce those chips: NVIDIA is the largest consumer of fabrication capacity, and they aren't "losing" Data Center chip production capacity in order to produce consumer GPU chips. They typically design and book production of their Data Center chips on a different fabrication node than the one they use for their consumer GPUs.
For example:
The 40-series AD102 is fabricated on TSMC's 4N process node.
The first-generation Blackwell chips, the Data Center architecture during the 40-series generation, were fabricated on TSMC's 4NP process node.
NVIDIA is currently TSMC's largest customer, booking more than 60% of its production capacity. They could easily secure more capacity with other chip fabrication companies, such as Samsung (as they did with the 30-series), if they needed more to meet demand in both market segments.
Nvidia has turned into a scumbag. Missing ROPs, power delivery issues, power connector issues. Inflated prices, bogus features, pathetic uplift, poor value.
These companies need to pull an Elon Musk and fire all the freaking losers that work there and come up with these anti-consumer ideas. Why is this company so lame? Why??
15% of a trillion dollars is still $150 billion, FYI.
Also, GPU revenue might be shrinking as a share of their overall revenue, but that doesn't mean it will continue to shrink perpetually.
I'm not saying you're wrong that Nvidia will eventually drop out of the GPU market; anything is possible. But I think some of your arguments have weaknesses and rest on assumptions. Besides, Nvidia has such a large market share that I think it would have to dwindle to nothing before abandoning the market became reasonable. There are lots of ways Nvidia could cause that to happen by neglecting to innovate and develop.
I also think you overestimate the impact of some consumers complaining incessantly, while on the other hand Nvidia dominates the market and makes hundreds of billions of dollars from GPUs. I dunno, all the cons you can dredge up kinda pale in the face of that.
If you control a GPU that's making 3 fake frames for every real one, you control that person's perception of the internet. They may think they're doing online banking, but they aren't.
It could suck up the planet's wealth in 1 day. Do you trust the AI?
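For context on the 3:1 ratio mentioned above (Nvidia's 4x multi-frame generation mode), here is a quick illustrative sketch of the arithmetic in Python; the 60 fps render rate is just an example figure, not a benchmark:

# Illustrative arithmetic for multi-frame generation: with N
# generated ("fake") frames per rendered frame, the displayed
# rate is (N + 1) x the render rate, but input is only sampled
# once per rendered frame. Example numbers, not measurements.

def frame_gen(rendered_fps, generated_per_real=3):
    displayed_fps = rendered_fps * (generated_per_real + 1)
    input_interval_ms = 1000 / rendered_fps  # input sampled per real frame
    return displayed_fps, input_interval_ms

displayed, interval = frame_gen(60)  # 60 real fps, 3 fake per real
print(f"displayed: {displayed} fps, input sampled every {interval:.1f} ms")
# -> displayed: 240 fps, input sampled every 16.7 ms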
I was trying to make a different analogy; one that represented a situation where it would make sense for a company to abandon a smaller venture after coming across a bigger one. That did happen for nVidia (and Amazon), but those original things "fall under" the newer bigger thing, so it makes sense for them to continue doing them despite them now being a smaller slice of their revenue pie.
I mean people like me already refuse their rehashed junk.
Nvidia basically halted R&D funding after the 2000 series and just went "let's use the same chip but add more watts".
Now they no longer even use the prime samples (look at the chip number: it starts with a 2 now, where it used to start with 1). This means not only are AI wafers getting preferential treatment, so far too little production capacity is allocated to GPUs...
it also means that of the wafers that do get baked, the best parts now go to AI and the lesser chips are used for GPUs.
So you now get the crusts of an architecture that hasn't been updated, overclocked and with a few AI cores tagged along, to mask the fact that Nvidia hasn't designed a properly new GPU since the 2000 series came out.
And if you slack and bluff, basically not doing your homework, it gets ever harder to bluff your way out of it (as shown by the VERY meager performance gains of the 5000 series, even with all the junk they've been pulling).
Sure, by creating artificial scarcity and inflated prices they can still pump out some profit, and 15% of total revenue at 99% margins instead of 20% margins still makes it worth tossing a bone for now...
But there comes a point when consumers say NO MORE:
- when the gain in actual performance gets just too small
- when prices climb ever higher (due to these shortages, plus Nvidia wanting ever-higher margins to even consider allocating ANY production capacity to GPUs)
- when the rise in TDP just gets too high for users to accept, or reaches its physical ceiling
- when the fake frames get so numerous they can no longer mask the fact that the emperor is naked
and at that point, indeed, Nvidia will just drop the GPU market completely.
But I see a far darker future too, where the GPU as we know it is dead, owning games is dead, everything is planned obsolescence, and we can only play games by subscription: all graphics fully AI generated, not on your PC but by an AI cloud service, where you PAY for how many frames you want.
Any physical device you still have, like your phone, will only last 3-5 years tops before it is no longer supported to run this service.
You own nothing...
Retro gaming is the future of gaming. All hail the old pre-Steam physical discs, and the old hardware to run DOS, Win 95, 98, and XP with no internet or service needed.
At least that can still be owned.
Yeah I got you. You were making the inverse hypothetical which also made sense. I was just reinforcing the point you were making.
Again, I think it's far more likely to swing the other direction from what the OP is thinking. Given their Project DIGITS [nvidianews.nvidia.com], I could see them continuing to drive down cost to the point where they could enter the consumer PC space with a range of ARM CPUs.
https://support.microsoft.com/en-us/windows/windows-arm-based-pcs-faq-477f51df-2e3b-f68f-31b0-06f5e4f8ebb5
In regards to Windows: after failing with Windows RT, they realized they needed to follow the method Apple used for its CPU architecture transitions, from PowerPC -> Intel and subsequently from Intel -> ARM. Windows 10/11 on ARM natively supports universal (UWP) apps, and other applications are handled via emulation.
Most of the top Linux distributions already natively support ARM CPUs. Valve has accelerated development of Proton for ARM (suspected to be for Deckard), so there is very likely going to be a SteamOS for ARM.
I don't think there will be much of a barrier to entry on the software side of the equation for NVIDIA in either case, if they did decide to start competing against x86 CPUs for consumer PCs.
Still, current Windows ARM PCs have x86 emulation; it's functional, but most resource-intensive things don't really run well. I hope x86 will always stay an option; I have zero interest in an ARM PC.
Apple did it better than MS could, but there too there have been tons of compatibility sacrifices; they're just less noticeable on Mac because Apple already got people used to OS updates arbitrarily breaking stuff.