DLSS vs FSR Explained
So here's the thing:

AMD has been working closely with system manufacturers and dominating the lower-end GPU/APU market, so it worked out better for them not to invest in machine learning and to just use FSR to handle upscaling on systems that weren't running games at high settings or resolutions to begin with. The goal was to get shader-heavy games running at decent frame rates, and that's what FSR does; its shortcomings are a lot less noticeable on a handheld or a laptop. It was a business strategy. DLSS was going to be better out of the gate, and Nvidia is always going to cater to the whales in hardware. It made sense for AMD to save all that money and continue developing for the market they dominate. Nvidia doesn't make CPUs, nor do they help Xbox, PlayStation, Asus, Lenovo, Valve, etc. with the hardware in their consoles.

Nvidia, meanwhile, dominates the enthusiast market and corporate systems, and their ARM-based Tegra processor is in the Switch (which is weak by today's standards, but we'll see what Gen 2 will do).

You have to see this from a market perspective and ask what a corporation would do. Machine learning is not cheap. AMD's footprint is much larger now, so introducing it at this stage of the game will further benefit devices that people already consider usable, and close the performance gap at a time when Nvidia is meeting indifference among consumers.

I think it was a smart move.

(Screw Windows auto-correct)
Last edited by 𝔈𝔩𝔊𝔯𝔦𝔫𝔤𝔬; 10 Mar 2024 @ 9:50am
Showing 1-5 of 5 comments
To hell with FSR 1.0; it causes more issues than it solves. FSR 2.0 is better, but not perfect, and FSR 3.0 I haven't used, as I have older hardware.
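The practical difference between those FSR versions is what inputs each one consumes. Here is a minimal sketch of that split; the function names and signatures are purely illustrative, not the real FidelityFX API:

```python
# Illustrative signatures only -- NOT the real FidelityFX API.
# What separates the FSR generations is the data each one needs.

def fsr1_upscale(color):
    """FSR 1.0: purely spatial. One rendered frame in, one upscaled
    frame out (edge-adaptive upsample, then sharpen). No frame
    history, so fine detail shimmers -- the issues noted above."""
    ...

def fsr2_upscale(color, depth, motion_vectors, history, jitter):
    """FSR 2.x: temporal. Accumulates detail across frames using
    motion vectors and depth; fixes most shimmer, but can ghost
    when the motion data is wrong."""
    ...

def fsr3_present(frame_n, frame_n_plus_1, optical_flow):
    """FSR 3.x: FSR 2-style upscaling plus frame generation,
    interpolating an extra frame between two rendered ones."""
    ...
```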
AbedsBrother 10 Mar 2024 @ 11:33am 
Originally posted by Wynters:
To hell with FSR 1.0 as it causes more issues than not.
FSR1 is still a better option than just playing at non-upscaled 720p on a 1080p monitor.
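720p to 1080p is a 1.5x upscale on each axis. Below is a runnable sketch of that resize in NumPy, using plain bilinear filtering as a stand-in for FSR 1.0's actual edge-adaptive, Lanczos-style EASU kernel (illustrative only):

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Naive bilinear upscale. FSR 1.0's EASU pass does the same job
    with an edge-adaptive Lanczos-style kernel plus a sharpening
    pass (RCAS); bilinear is just the simplest stand-in."""
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]   # vertical blend weights
    wx = (xs - x0)[None, :, None]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame_720p = np.random.rand(720, 1280, 3).astype(np.float32)
frame_1080p = bilinear_upscale(frame_720p, 1080, 1920)
print(frame_1080p.shape)  # (1080, 1920, 3) -- 1.5x per axis
```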
PopinFRESH 10 Mar 2024 @ 12:00pm 
None of this post explains anything about either FSR or DLSS, of any variation, much less the actual differences between them.
A&A 10 Mar 2024 @ 3:19pm 
NIS (Nvidia Image Scaling) stares at you.

I'm really starting to wonder why, these past few weeks, all I see is "AMD this, Nvidia that", with the upscaler debate fixated on something or somebody, while Nvidia has stopped producing ANY GTX cards. Why the hell not use tensor cores, which can achieve better accuracy? The RTX 2000 series was released five years ago. Just let them cook in the labs.
Last edited by A&A; 10 Mar 2024 @ 3:26pm
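For reference, the basic tensor-core operation is a fused multiply-accumulate over a small matrix tile in mixed precision. Here is a NumPy emulation of one such tile; the 16x16 FP16-in, FP32-accumulate shape matches the classic RTX configuration, but this illustrates the math only, not how the hardware is actually programmed:

```python
import numpy as np

def mma_tile(a_fp16, b_fp16, c_fp32):
    """Emulate one tensor-core-style op: D = A @ B + C on a 16x16
    tile, FP16 inputs with FP32 accumulation. The hardware retires
    a whole tile like this in a handful of cycles, which is what
    makes per-frame neural-network inference cheap enough for
    DLSS-style upscaling."""
    return a_fp16.astype(np.float32) @ b_fp16.astype(np.float32) + c_fp32

A = np.random.rand(16, 16).astype(np.float16)
B = np.random.rand(16, 16).astype(np.float16)
C = np.zeros((16, 16), dtype=np.float32)
D = mma_tile(A, B, C)
print(D.shape, D.dtype)  # (16, 16) float32
```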
max 12 Mar 2024 @ 7:56pm 
Originally posted by 𝖠̶̶̅𝖨̶̶̅𝖬̶̶̅:
Nvidia doesn't make CPUs
And then
and their ARM-based Tegra
Why write something and then contradict yourself?

Besides Tegra, there's also: https://www.nvidia.com/en-us/data-center/grace-cpu
NVIDIA obviously makes CPUs, just not for desktops.

Originally posted by 𝖠̶̶̅𝖨̶̶̅𝖬̶̶̅:
It was a business strategy
Honestly, AMD's business strategy sucked, and they should fire whoever came up with it (if they didn't already).
The "strategy" of stripping RDNA from all matrix multiplication acceleration and leave it only in CDNA was a critical mistake on AMD's part. It's that kind of mistakes that makes companies go bankrupt. The only reason that AMD haven't gone under is because their competition on the CPU side, Intel, has made even worse decisions some years before. While the only reason that Intel didn't go under is that they had enough money in the coffers to survive their numerous blunders.
Now AMD are acknowledging and walking back on their mistake, and only time will tell if it's not too late.
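Every number in this sketch is an assumed, illustrative figure, not a published spec for DLSS, RDNA, or CDNA:

```python
# Why dedicated matrix hardware matters for ML upscaling.
# All figures below are assumptions for illustration only.

flops_per_frame = 100e9      # assume the upscaler network costs ~100 GFLOPs/frame
frame_budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

rates = {
    "generic shader ALUs":     5e12,   # assumed FP16 matmul throughput
    "dedicated matmul units": 100e12,  # assumed throughput with matrix hardware
}

for name, rate in rates.items():
    ms = flops_per_frame / rate * 1000
    print(f"{name}: {ms:4.1f} ms of a {frame_budget_ms:.1f} ms frame budget")

# generic shader ALUs:    20.0 ms -> the network alone blows the whole frame budget
# dedicated matmul units:  1.0 ms -> leaves ~15 ms to actually render the game
```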

Date Posted: 10 Mar 2024 @ 9:39am
Posts: 5