Nvidia is making billions off AI-oriented cards.
Those cards run almost exclusively on GNU/Linux servers.

Still, they won't give gamers and personal PCs proper support. Why?

The sooner we all move to gaming on GNU/Linux, the sooner everyone will enjoy a better, richer experience.
Omega 16 Oct 2023 at 21:09
One word: CUDA.

I am sorry, I keep buying $15,000 Nvidia cards; my ML engineers are spoiled and refuse to use anything else.
IDK WTF this even is. The AI cards you speak of aren't for gaming.
Omega 16 Oct 2023 at 21:59
Bad 💀 Motha originally posted:
IDK WTF this even is. The AI cards you speak of aren't for gaming.
This post is about Nvidia moving R&D away from the consumer side and over to "AI"; the quality and functionality of consumer hardware and software will suffer as a result.

Also, high-end consumer cards are fairly common for machine learning; we, for example, use 2080 Tis in a couple of our machines.
NVidia doesn’t care about anything but money.

AI-oriented cards make them far more money than consumer-oriented ones, and they know that a majority of OEM computer manufacturers and consumers will blindly buy whatever they crap out as long as the price tag isn't TOO absurd.

Unless something changes the landscape, expect NVidia to keep milking their reputation for stupid amounts of cash, consumer be damned.
ZAP 17 Oct 2023 at 5:56
AI is the new fidget spinner. Seen a few adverts for "have AI make money for you with one weird trick"...

People are dumb.
Meanwhile, their already overpriced gaming GPUs have gotten even more expensive lol

https://www.tomshardware.com/news/nvidia-rtx-4090-prices-have-been-creeping-upward
ZAP originally posted:
AI is the new fidget spinner. Seen a few adverts for "have AI make money for you with one weird trick"...

People are dumb.
It’s just the next NFT/Crypto, at least the broader landscape of it is. A lot of the same people are even dipping their fingers into it.

I don’t think AI tech itself is going to go away, but I doubt most of the industry in its current form will last.
Omega 17 Oct 2023 at 8:30
Slav Mcgopnik originally posted:
ZAP originally posted:
AI is the new fidget spinner. Seen a few adverts for "have AI make money for you with one weird trick"...

People are dumb.
It’s just the next NFT/Crypto, at least the broader landscape of it is. A lot of the same people are even dipping their fingers into it.

I don’t think AI tech itself is going to go away, but I doubt most of the industry in its current form will last.
The only issue with ML is that most people do not understand the technology. I wouldn't compare it to NFTs and crypto; those two do not generate value. Those pyramid schemes only subtract from it, unlike this technology. If these crypto-bro parasites want to invest in something that generates value, great, let them; it will be the first time in a while these people did something useful.

You as a consumer have been interacting with these systems for 10-15 years now; previously they were simply called algorithms.

But in the last three to four years a couple of things changed: the technology and methods advanced very rapidly, and marketing figured out they could sell machine learning as "AI", implying Skynet-like capabilities, which is utterly ridiculous.


ML has been here for a while, and as you said, it is not going anywhere. "Facebook for cats, but with AI" indeed has no future.
You mean CUDA. "AI-oriented" doesn't really mean anything. AMD cards can run AI too, but they are nowhere near as plug-and-play as Nvidia. With AMD you either have to rewrite Nvidia's CUDA back-end code completely in something like OpenCL, or use a translation layer like HIP/ROCm. Ain't nobody got time for that.
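
For illustration, a minimal sketch of what that translation layer means in practice (assuming a PyTorch install, either the CUDA or the ROCm build; on ROCm builds, HIP exposes the AMD card through the same torch.cuda API, so the user-level code is identical on both vendors):

# Minimal sketch: the same torch.cuda code path drives either vendor's GPU.
import torch

if torch.cuda.is_available():
    # torch.version.hip is set on ROCm builds and None on CUDA builds.
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"backend: {backend}, device: {torch.cuda.get_device_name(0)}")

    # Identical code either way: allocate on the GPU and multiply.
    a = torch.randn(1024, 1024, device="cuda")
    b = torch.randn(1024, 1024, device="cuda")
    c = a @ b
    print("matmul OK:", c.shape)
else:
    print("no supported GPU found by this PyTorch build")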
Omega originally posted:
my ML engineers are spoiled and refuse


This is pretty much the issue here: people being spoiled and people refusing to try alternatives. Businesses and managers need to quit accepting it when there is little to no reason for it now.

Proof there is little to no reason:

https://ts2.space/en/the-lamini-ceo-pokes-fun-at-nvidia-gpu-shortage-highlights-the-advantage-of-amd-gpus/

Linked Article originally posted:
Lamini claims that AMD GPUs using ROCm have reached “software parity” with the previously dominant Nvidia CUDA platform. This means that Lamini’s performance isn’t overshadowed by Nvidia solutions, making AMD GPUs a viable choice for LLMs.

As the article points out, the claims should come with scrutiny since they are made by the company itself. But no more or less than any other company that has previously said CUDA is better because *they* use it. The whole "they use it, so they are biased" argument has to go both ways or no ways.

In the end there are plenty of viable alternatives to CUDA, including, very arguably, at least one dead-on-equal option. Depending on use case and workload, there have been viable and/or equal alternatives for years. But at least now there is little argument against the existence of parity products, often with differing advantages that can offset any minor differences.

And yes, to be clear, I know that you (Omega) have more first-hand experience here than I do. But I also know that Lamini and their CEO have more experience than you. So when weighing who I think is more correct, I will land on the CEO of a major ML/LLM company who says they can use AMD alternatives just as well as CUDA for ML/LLM compute.


Heitor Villa-Lobos originally posted:
Still, they won't give gamers and personal PCs proper support. Why?

Money. That simple. Money. They have people (read: companies and managers) who will gladly spend $15,000.00 per card simply because of the name on it, despite there being alternatives that do just as well or comparably well at a fraction of the cost. They will spend the money on the name, not on any actual tech advantage, and then they will claim they had to because their engineers demanded it. Imagine if auto techs got to demand Snap-on or bust lmao. Sorry but not sorry here: devs and engineers demanding a specific tool out of being spoiled is just flat-out childish. Especially if there are multiple tools that do the job more or less the same.

To be clear, I am not talking about people who *code* in CUDA. I am talking about people who use CUDA code. If someone has a working knowledge base of coding in CUDA, asking them to use alternatives is not fair. But if we are talking about software that runs *on* CUDA or other compute systems, where CUDA, ROCm, or something else is used as a form of compute acceleration, there is often little reason not to use the alternatives.
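
For illustration, a minimal sketch of that distinction, using PyTorch as a hedged stand-in for "software that runs on CUDA" (the tiny model and training step below are placeholders, not anything from the thread):

# Minimal sketch: framework-level code that never names the vendor.
import torch
import torch.nn as nn

# Pick whatever accelerator the installed build exposes; on a ROCm build
# of PyTorch, "cuda" transparently means the AMD GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(64, 8).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 64, device=device)   # dummy batch
y = torch.randn(32, 8, device=device)    # dummy targets

loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
print(f"one step on {device}, loss={loss.item():.4f}")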

When ROCm can be used just as well as CUDA, an ML engineer refusing to use one over the other is like an automotive engineer being unwilling to work because the tools are Bosch and not Snap-on. A tool is a tool. They need to use it and do their job, not refuse a specific brand of tool.
Last edited by xSOSxHawkens; 17 Oct 2023 at 10:12
Not really news; Nvidia's been doing this for a while now. They have their own higher-priced segment of GPUs, far more capable and built specifically for machine learning, which sell for tens of thousands of dollars apiece but cost a tenth of that to actually produce.

That's why Nvidia can afford to not be very competitive with AMD and Intel, yet they still go out of their way to try to inhibit them anyway because Jensen just wants to be another Steve Jobs at this point.
Omega 17 Oct 2023 at 10:48
xSOSxHawkens originally posted:
snip
ROCm has so far lacked two things:
1. Full-range day-1 hardware support
2. Support by ML frameworks and models

ROCm is only functional on a select set of hardware, and support is slow to appear. It works on the 7900 XTX, for example, but not on the 7800 XT, and even that 7900 XTX support only landed a few days ago. (A quick way to check what a given build actually targets is sketched after this post.)

Many models will refuse to officially support AMD/ROCm-related issues; they will simply tell you to switch to Nvidia because that is all the model was tested on.

ROCm is gaining traction, but it still carries all the negative connotations of yesteryear. AMD is fighting an uphill battle against a fortified monopoly, and these mathematicians who know Python don't give a crap about open standards and open source.
Last edited by Omega; 17 Oct 2023 at 10:52
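
For illustration, a minimal sketch of the kind of support check Omega describes (assumptions: a ROCm build of PyTorch; the commented-out HSA_OVERRIDE_GFX_VERSION value is a commonly reported community workaround for officially unsupported RDNA3 cards, not an official AMD guarantee):

# Minimal sketch: probe what the installed build was compiled against.
import os

# Community-reported workaround for consumer RDNA3 cards that lack
# official ROCm support (spoofing the GPU as gfx1100). Must be set
# before torch initializes HIP -- and it may simply not work.
# os.environ["HSA_OVERRIDE_GFX_VERSION"] = "11.0.0"

import torch

print("torch:", torch.__version__)
print("compiled for HIP/ROCm:", torch.version.hip)   # None on CUDA builds
print("compiled for CUDA:", torch.version.cuda)      # None on ROCm builds

if torch.cuda.is_available():
    print("usable device:", torch.cuda.get_device_name(0))
else:
    print("build loaded, but no officially supported GPU was found")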
xSOSxHawkens originally posted:

Heitor Villa-Lobos originally posted:
Still, they won't give gamers and personal PCs proper support. Why?

Money. That simple. Money.

I don't get it. They are in the gaming card business, so why not give gamers open-source drivers and use the community to improve their very product?

That'd be money for them.
Last edited by Heitor Villa-Lobos; 17 Oct 2023 at 12:32
Crawl 17 Oct 2023 at 13:19
A company that was started to make money actually wants to make money, shocker.

I don't even know what you are talking about with "giving gamers their support". Are you trying to insinuate that there is some kind of gaming GPU shortage because Nvidia is selling cards for AI? Last I checked, every current- and last-gen card is widely available.

Date posted: 16 Oct 2023 at 21:03
Posts: 56