Lossless Scaling

What if Lossless Scaling used an NPU?

This thread is a discussion of whether Lossless Scaling could use an NPU, because Windows 11 will open NPU access to non-Snapdragon processors and will probably provide an API for the new NPUs (I don't know whether one is required to use them) as part of Copilot+ certification. That would bring some advantages and some inconveniences, so this discussion is open.
For me, using the NPU:
Advantages:
- Would reduce some LSFG artifacts, because frame generation would run on a dedicated unit; that would be a win for quality.
- Would increase base performance, because the AI processing would no longer be assigned to the GPU or iGPU but to the NPU.
- Could bring XeSS/DLSS-style upscaling, or an AI-based LS2 (if LS doesn't already use AI), to old GPUs like the RX 580.
Inconveniences:
- Needs fairly recent hardware if you want to avoid buying new hardware: probably a minimum of Ryzen 7xxx/AI 3xx, or an RX 7x00/RTX 20-series, possibly a GTX 16-series if the RT or DL cores can be re-enabled through some unofficial method.
- If you don't have recent hardware, you'll need to buy a recent PCIe NPU card (see this link for reference): https://www.reddit.com/r/LocalLLaMA/comments/1cx5jvc/overview_of_m2_pcie_npus/
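A rough way to picture the "increase base performance" advantage claimed above: when frame generation shares the GPU with rendering, its cost adds to every frame time, whereas on an NPU it could in principle run in parallel with rendering. A minimal sketch of that arithmetic (the function names and millisecond figures are illustrative assumptions, not measured LSFG costs):

```python
def base_fps_shared(render_ms, fg_ms):
    """Frame generation runs on the same GPU: its cost adds to each frame."""
    return 1000.0 / (render_ms + fg_ms)

def base_fps_offloaded(render_ms, fg_ms):
    """Frame generation runs on an NPU in parallel with rendering:
    the slower of the two units sets the pace."""
    return 1000.0 / max(render_ms, fg_ms)

# Illustrative numbers: 12 ms to render a frame, 3 ms of frame-gen work.
print(round(base_fps_shared(12.0, 3.0), 1))     # GPU does both -> 66.7
print(round(base_fps_offloaded(12.0, 3.0), 1))  # NPU takes the AI work -> 83.3
```

In this toy model the base frame rate only improves while the NPU's work finishes inside the render time; a slow NPU would become the new bottleneck instead.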
Last edited by clauteur; Aug 18, 2024 @ 9:18pm
Showing 1-13 of 13 comments
Spook Aug 18, 2024 @ 1:02pm 
Hey, your Reddit link is translated to French. If you could cut the "/?tl=fr" off the end of the link, it would be more easily understood by more people.

Regarding the topic: I think we will be seeing frame interpolation and upscaling go that way. Microsoft even says so in their Auto SR sales pitch: "Utilizing the NPU for scaling further reduces the GPU's load, enabling it to focus on rendering frames even more quickly."

How effective it will be depends on how powerful one can make a relatively small NPU. Don't underestimate the power of even a relatively weak GPU in AI workloads. As for allowing LSFG to make use of one, that's up to the dev to implement, if it isn't made available for selection at the OS level, that is.
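If a dev did implement such a selection, one plausible shape (purely a sketch, not anything Lossless Scaling actually does) is a backend-preference list. The provider strings below follow ONNX Runtime's execution-provider naming for NPU, GPU, and CPU backends, but treat the exact set and ordering as assumptions:

```python
# Hypothetical backend preference for an app that wants its frame-generation
# model on an NPU when one is exposed, falling back to GPU, then CPU.
PREFERENCE = [
    "QNNExecutionProvider",       # Qualcomm Hexagon NPUs
    "VitisAIExecutionProvider",   # AMD Ryzen AI NPUs
    "OpenVINOExecutionProvider",  # Intel NPUs / iGPUs
    "DmlExecutionProvider",       # DirectML (GPU)
    "CPUExecutionProvider",       # last resort
]

def pick_provider(available):
    """Return the most preferred backend that is actually available."""
    for name in PREFERENCE:
        if name in available:
            return name
    return "CPUExecutionProvider"

# e.g. a machine reporting only CPU and DirectML would get the GPU path:
print(pick_provider(["CPUExecutionProvider", "DmlExecutionProvider"]))
```

The point of the fallback chain is exactly the OS-level question raised above: the app degrades gracefully on hardware without an NPU instead of requiring one.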
CaramelTurbine Aug 18, 2024 @ 1:34pm 
Unlikely.
It would require a complete rewrite of the LSFG algorithm to support this specific type of processor. It won't be supported, for the same reason LSFG doesn't utilise Nvidia's "RTX" hardware.
Xavvy Aug 18, 2024 @ 2:11pm 
Originally posted by CaramelTurbine:
Unlikely.
It would require a complete rewrite of the LSFG algorithm to support this specific type of processor. It won't be supported, for the same reason LSFG doesn't utilise Nvidia's "RTX" hardware.

Exactly. There's no way an NPU will be in most people's PCs for at least another decade or more.
Spook Aug 18, 2024 @ 2:26pm 
Originally posted by Xavvy:
There's no way an NPU will be in most people PCs for at least another decade or more.
I wouldn't put my money on that. AMD are putting them in their mobile chips, and it seems Intel's Meteor Lake desktop chips will feature them. Dedicated add-in NPUs may take a bit longer to become mainstream, but I doubt it will be 10+ years, provided this is not a fad/bubble.
Xavvy Aug 19, 2024 @ 6:59am 
Originally posted by Spook:
Originally posted by Xavvy:
There's no way an NPU will be in most people PCs for at least another decade or more.
I wouldn't put my money on that. AMD are putting them in their mobile chips and it seems Intel's Meteor Lake desktop chips will feature them. Dedicated add-in NPUs may take a bit longer to become mainstream, but i doubt it will be 10yrs+. Provided this is not a fad/bubble.

Okay, let me be more clear. I didn't give context or anything, I am dumb. I'm fully aware that NPUs will be in literally everything; that isn't my point. My point is that if you look at the Steam hardware survey and other sources of average PC hardware stats, the most-used PCs for gaming and everyday use still have CPUs that are more than half a decade old and GPUs that are 6-8 years old. That's the average PC user.

Most people aren't like the 1-15% of us who upgrade once every 1-5 years; MOST people upgrade every decade or so, according to worldwide hardware stats. The latest figures covered 2022, and a large portion of those people upgraded in 2023-2024, which means the majority have done their once-a-decade upgrade fairly recently. Unfortunately, it also means the average user is unlikely to upgrade again for another 7-10 years.

So no, the average user still most likely won't have an NPU for at least 5-6 years at the earliest, and more realistically 7-10 years for most. That's the point I meant to make, not that the hardware itself wouldn't be manufactured with one. Honestly, NPUs are amazing; they have existed in smartphones, tablets, and some thin-form-factor laptops/2-in-1s for a number of years now.

I'm just talking about the rate at which most people actually move up to the next level of tech when something groundbreaking happens, in the case of PCs. I mean, smartphones were technically around since 2003/04, but we didn't see them in the average user's hands in any sizeable number until almost 2011.

Then the iPhone and Android dominated the market and nearly everyone had one, yet the average person was still using Razr flip phones, Nokias, and BlackBerrys until the very end of that decade, LOL. I remember back in 2005/06 having a Windows CE smartphone (HTC made the first full Windows smartphone, but there were rudimentary CE versions with a more exclusive OS a full 4-5 years before that) two years before the iPhone, and people thought it was sorcery (LOL).

I'm not saying it's impossible or that the dev of this app won't do it. But I do think they would only be motivated to do so once most of their users have access to this. I mean, when tensor cores and optical-flow hardware become the industry norm for all manufacturers and are found in every card, they'll probably add full support for that too.

I'm just saying that at this time I highly doubt it, and I feel something like that would actually be DLC, because they are going to reach a point where this app isn't profitable anymore (everyone already has it). So what motivation do they have to support previously proprietary tech that is only now becoming industry standard, after the support of this app has already been milked over that many years?

They have to make a living, so I couldn't see stuff like this being added outside of a DLC situation in the allotted time, but definitely eventually. I mean, it's in their best interest to keep this app going with new paid features down the road, tbh.
Spook Aug 19, 2024 @ 10:26am 
Originally posted by Xavvy:
Originally posted by Spook:
I wouldn't put my money on that. AMD are putting them in their mobile chips and it seems Intel's Meteor Lake desktop chips will feature them. Dedicated add-in NPUs may take a bit longer to become mainstream, but i doubt it will be 10yrs+. Provided this is not a fad/bubble.

Okay let me be more clear. [...]
I understand, and I think you are right in your assumption that more than 50% of gamers won't have an NPU in their PC for the foreseeable future, though I do think a decade is at the far end of a reasonable estimate.

I am dumb
rubbish
76561199107024205 Aug 19, 2024 @ 1:27pm 
Originally posted by author:
I'm just saying that at this time I highly doubt it, and I feel something like that would actually be DLC, because they are going to reach a point where this app isn't profitable anymore (everyone already has it). So what motivation do they have to support previously proprietary tech that is only now becoming industry standard, after the support of this app has already been milked over that many years?
The motivation for supporting previously proprietary tech is to be the first name consumers think of. For example: what's the most powerful handheld device? You think first and say the Steam Deck. And an optimized OS? You say SteamOS, because it's the only device using its own OS. Soon the Steam Deck won't be alone, but it will still be one of your first choices for a handheld. Now, if three years ago (a year before the Steam Deck) I had told you about a handheld with an iGPU roughly as powerful as a mobile GTX 1650, on Windows and even better on Linux (a year before AMD's 680M, which beats a GTX 1050, on the R7 6800, though AMD wasn't so popular at that moment), you'd have said I was crazy, and now it's reality. So the advantage of supporting previously proprietary tech isn't in sales now but in the future: being the reference.
Last edited by clauteur; Aug 19, 2024 @ 1:30pm
Xavvy Aug 20, 2024 @ 9:56pm 
Originally posted by clauteur:
Originally posted by author:
I'm just saying that at this time I highly doubt it, and I feel something like that would actually be DLC, because they are going to reach a point where this app isn't profitable anymore (everyone already has it). So what motivation do they have to support previously proprietary tech that is only now becoming industry standard, after the support of this app has already been milked over that many years?
the motivation of support a previous proprietary tech is to be the first name of consumer will say for the exemple what's the handled device most powerfull you will think first and say me ? you'll say is steam deck and now an optimize OS you will say me steam OS because it's the only devices who use a proprietary OS and shortly not the steam deck will not be the alone it will be one of you're first choice for handled now if I say you 3 years before(1 years before the steam deck) , an handeled devices with an igpu with approximately the same power of an GTX1650 mobile (1years before amd do the 680M better than GTX1050 on R7 6800 but AMD aren't so popular at this moment) on windows and it's better on linux you'll say me i'm crazy and now it's the reality so the of support previous proprietary tech it's advantage is not actually for sale now but for the future to be the refenrence

I'm really sorry. Not to be rude, but I literally understood none of this. The language looks like it was spat out by 2013 Google Translate. I'm really having a hard time understanding what you are trying to say here.
Xavvy Aug 20, 2024 @ 10:18pm 
Originally posted by Spook:
Originally posted by Xavvy:

Okay let me be more clear. [...]
I understand, and I think you are right in your assumption that more than 50% of gamers won't have an NPU in their PC for the foreseeable future, though I do think a decade is at the far end of a reasonable estimate.

I am dumb
rubbish

Yeah, 10 years might be a stretch now. It seems more people are upgrading more often than they used to, now that tech is actually becoming impressive enough to bother; I think in 4-5 years most people will have NPUs. I remember a GTX 980 being enough for the longest bloody time. Then the GTX 1070-1080 Ti came out and it was incredible, and to this day people are basically able to run everything at least at high-quality 1080p on a GTX 1070-1080 Ti, with the exception of stupid Starfield (the worst game optimisation I have ever witnessed).

To this day Starfield runs like absolute dog crap. I'm pretty sure Starfield is heavily modded Creation Kit; calling it Creation Engine 2 is just hilarious to me. It's Creation Engine 1 with a graphical facelift, LOL. It's like they spent zero time trying to improve the game engine; I mean, it's essentially still Gamebryo from 2008 with a facelift. Bethesda sucks at game engines. They should have hung up their hat and learned Unreal Engine 15 years ago.

I'm not going to say Starfield is bad; I enjoy the game. I think it's exactly what I expected space Skyrim/Fallout to be, and I thoroughly enjoy playing it all the time. That said, the engine is bad. If you look at its system requirements, it's almost laughable. I think requirements should reflect what we expected from gaming in 2023 (when it released): decent fidelity with a 60 fps baseline. With those minimums, you're going to get potato Starfield at an intermittent 40 fps without frame gen.

What's more hilarious is the recommended requirements: with those, you'd be lucky to get 60 fps at 1080p with medium settings, LOL. Truly funny af sh!t to me; whoever makes the requirements for these games is right out to lunch. The minimum requirements for Starfield should realistically be an 11th-gen i5/Ryzen 5 and an RTX 2070, if you want a solid 60 fps with medium settings. And recommended settings imo should reflect the game's three-quarter-highest settings, not bloody Ultra but High, and they never do; it's always middle-of-the-road, borderline-potato graphics for recommended.

The fact that the minimum requirements, before they added frame gen, were a GTX 1070 Ti and a processor that basically can't run the game at any acceptable frame rate is hilarious to me. The best you are going to get with the proposed minimums is barely more than 30 fps at the lowest settings. LOL, like, what are we doing here? I hope Starfield doesn't set a new precedent where devs think they can get away with this level of even worse optimisation than we've already seen.

This is getting ridiculous. That said, I don't agree with the critics/people bashing Starfield; it's a typical Bethesda game to me. The big selling point of Bethesda games is the ability to mod them into the levels of amazing we saw in every other game they've made, and they are very transparent about their plan to always keep modding support paramount.

To me personally, Starfield was a decent 6/10 game with 7-8/10 gameplay. You can definitely tell the makers of DOOM had a lot to do with the gun play behind the scenes; it's one of its saving graces, and the melee isn't bad either for this kind of game. Yeah, it's got typical Bethesda jank, which I often love because it's just funny af. To me, though, Starfield is a 7/10 game: it's not amazing, it's just good. What I really like is that it doesn't need 1000 graphics mods to look good; it already looks good.

So I really like that they focused on graphics a little more this time around. I always miss the days of Oblivion: back when that game released, it was absolutely gorgeous. To me, when Skyrim released it was a major graphical letdown, and it took nearly half a decade of modding to make it look good.
Spook Aug 21, 2024 @ 1:30am 
Originally posted by Xavvy:
I'm really sorry. Not to be rude but I literally understood none of this. The language looks like it was spat out by 2013 google translate. I am really having a hard time understanding what you are trying to say here.
First-mover advantage as a motivation for adopting a technology early, using the Steam Deck and its use of Linux as an example, is what I can make of it.

OP's mother tongue seems to be French; the French, among Europeans, are known for being stubbornly un-English-speaking, and French sentence structure probably brain-translates to English very poorly.


Originally posted by Xavvy:
Originally posted by Spook:
I understand, and i think you are right in your assumption that more than 50% of gamers won't have an NPU in their PC for the foreseeable future, though i do think a decade is at the far end of a reasonably estimate.

Yeah 10 years might be a stretch now. Seems more people are upgrading more often than they used to now that tech is actually becoming impressive enough to bother. I think 4-5 years and most people will have NPUs..
This is an important factor, though allocating more chips to AI and the related price increases may diminish this effect. I'm somewhere between 5 and 10 years.

To this day Starfield runs like absolute dog crap.. [...]
It's about equal parts funny and sad. And there's no way that Creation Engine 2's roots lying with Morrowind don't have anything to do with it. But the worst thing about it, to me, is that it occasionally still clearly looks like a Creation Engine game. I count myself "lucky" that I lost faith in Bethesda a while ago and don't have a hankering for Starfield, but it's a bummer that TES6 will most likely use the same engine.

I hope Starfield doesn't set a new precedent where devs think they can get away with this level of even worse optimisation than we've already seen.
Yeah, they'd be the lead horse pulling that cart, but not the only one. As long as people keep paying for unfinished products, studios will keep delivering them.

You can definitely tell the makers of DOOM had a lot to do with the gun play behind the scenes.
That has roots in FO4; BGS stated there was no Id involvement in Starfield's gunplay. But agreed, BGS has seen a big improvement in gunplay since Id consulted.

In an IGN interview with Todd Howard, he sort of boils down Id's involvement to adding motion blur, which is hilarious. I can imagine someone from BGS going, "Help, slideshow!" and Id, after having had a good look and having deemed performance unsalvageable, going, "Only motion blur can fix this."
Last edited by Spook; Aug 21, 2024 @ 1:51am
Xavvy Aug 21, 2024 @ 2:46am 
Originally posted by Spook:
In an IGN interview with Todd Howard, he sort of boils down Id's involvement to adding motion blur, which is hilarious. [...]

LOL the motion blur thing legit made me snort laugh hahaha. So true though.
Spook Aug 21, 2024 @ 2:51am 
Originally posted by Xavvy:
LOL the motion blur thing legit made me snort laugh hahaha. So true though.
It's amazing what it will hide, and it's practically necessary at lower or unsteady framerates on low-persistence displays with fast pixel response times.
Last edited by Spook; Aug 21, 2024 @ 3:03am
Xavvy Aug 21, 2024 @ 3:51am 
Originally posted by Spook:
Originally posted by Xavvy:
LOL the motion blur thing legit made me snort laugh hahaha. So true though.
It's amazing in what it will hide, and practically necessary on low-persistence displays with a fast pixel-response time at lower/unsteady framerates.

Very true. I won't use it in competitive games, because I'd rather see the ugliness and keep the motion-clarity advantage. But I have very high-end OLED gaming monitors, and the better the monitor the worse it gets: the motion clarity is so insane that it shows up every frame skip, hitch, inconsistency, and frame-pacing misstep.

Date Posted: Aug 18, 2024 @ 12:01pm
Posts: 13