Regarding the topic: I think we will be seeing frame interpolation and upscaling go that way. Microsoft even says as much in their Auto SR sales pitch: "Utilizing the NPU for scaling further reduces the GPU's load, enabling it to focus on rendering frames even more quickly."
How effective it will be depends on how powerful one can make a relatively small NPU. Don't underestimate the power of even a relatively weak GPU in AI workloads. As for allowing LSFG to make use of one, that's up to the dev to implement, if it isn't made available for selection at the OS level, that is.
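Just to illustrate what "up to the dev to implement" could look like in practice, here's a minimal sketch using ONNX Runtime's execution-provider selection. The model file upscaler.onnx, the input shape, and the provider preference order are all assumptions for illustration, not anything from LSFG:

```python
# Illustrative sketch only: route an upscaling model to an NPU-capable
# ONNX Runtime execution provider when one exists, otherwise fall back.
# "upscaler.onnx" is a hypothetical model file, not part of LSFG.
import numpy as np
import onnxruntime as ort

# Preference order is an assumption: Qualcomm NPU (QNN), then DirectML, then CPU.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("upscaler.onnx", providers=providers)
print("Running on:", session.get_providers()[0])

# Feed one 720p RGB frame (NCHW, float32; the real layout depends on the model)
# and read back the upscaled result.
frame = np.zeros((1, 3, 720, 1280), dtype=np.float32)  # placeholder input
outputs = session.run(None, {session.get_inputs()[0].name: frame})
print("Output shape:", outputs[0].shape)
```

Whether the OS eventually exposes the NPU generically or every app has to ship its own provider-selection logic like this is exactly the open question here.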
It would require a complete rewrite of the LSFG algorithm to support this specific type of processor. It won't be supported, for the same reason LSFG doesn't utilise Nvidia's "RTX" hardware.
Exactly. There's no way an NPU will be in most people's PCs for at least another decade or more.
Okay, let me be more clear. I didn't give context or anything, I am dumb. I am fully aware that NPUs will be in literally everything. That isn't my point. My point is, if you look at Steam's stats for gaming and other sources of average PC hardware stats for the rest of average users, the most-used PCs for gaming/everyday use still have CPUs that are already more than half a decade old and GPUs that are 6-8 years old. That's the average PC user.
Most people aren't like the 1-15% of us who upgrade once every 1-5 years. MOST people upgrade every decade or so according to worldwide hardware stats; the latest figures covered 2022. A large portion of those people upgraded recently, in 2023-2024, which means the majority of people have done their once-in-a-decade upgrade rather recently. Unfortunately, it means the average user is unlikely to upgrade for another 7-10 years.
So no, the average user most likely still won't have an NPU for at least 5-6 years at the earliest, and more likely 7-10 years for most, if we're being realistic in our expectations. This is the point I meant to make, not that the hardware itself wouldn't be manufactured with one. Honestly, NPUs are amazing. They have existed in smartphones, tablets, and some thin-form-factor laptops/2-in-1s etc. for a number of years now.
I'm just talking about the rate at which most people actually move up to the next level of tech when something groundbreaking happens in the tech world, in the case of PCs. I mean, smartphones have technically been around since 2003/4, but we didn't see them in the average user's hands in any sizeable numbers until almost 2011.
Then iPhone and Android dominated the market and nearly everyone had one. The average person was still using Razr flip phones, Nokias, and BlackBerrys until the very end of that decade, LOL. Like, I remember back in 2005/6 having a Windows CE smartphone (HTC made the first actual full Windows smartphone, but there were rudimentary CE versions with a more exclusive OS a full 4-5 years before that) a full two years before the iPhone, and people thought it was just sorcery (LOL).
I'm not saying it's impossible or that the dev of this app won't do it. But I do think they would only be motivated to do so once most of their users have access to this. I mean, when tensor cores and optical-flow hardware become the industry norm for all manufacturers and are found in every card, they'll probably add in full support for that too.
I'm just saying that at this time I highly doubt it, and I feel like something like that would actually be DLC, because they are going to reach a point where this app isn't profitable anymore because everyone already has it. So what motivation do they have to support previously proprietary tech that is only now industry standard, when the support of this app has already been milked over that many years?
They have to make a living. So I couldn't see stuff like this being added outside of a DLC situation in the allotted time, but definitely eventually. I mean, it's in their best interest to keep this app going with new paid features down the road, tbh.
rubbish
I'm really sorry, and not to be rude, but I literally understood none of this. The language looks like it was spat out by 2013 Google Translate. I'm really having a hard time understanding what you are trying to say here.
Yeah, 10 years might be a stretch now. It seems more people are upgrading more often than they used to, now that tech is actually becoming impressive enough to bother. I think in 4-5 years most people will have NPUs. I remember that for the longest bloody time a GTX 980 was enough, for soooo LONG. Then the GTX 1070-1080 Ti came out and it was incredible, and to this day people are basically able to run everything on a GTX 1070-1080 Ti at at least high-quality 1080p, with the exception of stupid Starfield (the worst game optimisation I have ever witnessed).
To this day Starfield runs like absolute dog crap. I'm pretty sure Starfield is heavily modded Creation Kit. Calling it Creation Engine 2 is just hilarious to me; it's Creation Engine 1 with a graphical facelift, LOL. It's like they spent zero time trying to improve the game engine. I mean, it's essentially still Gamebryo from 2008 with a facelift. Bethesda sucks at game engines. They should have just hung up their hat and learned Unreal Engine like 15 years ago.
I'm not going to say Starfield is bad. I enjoy the game; I think it's exactly what I expected space Skyrim/Fallout to be, and I thoroughly enjoy playing it all the time. That said, the engine is bad. If you look at the tech requirements for it, it's almost laughable. I think tech requirements should reflect what we expected from gaming in 2023 (when it released). In 2023 we expected decent fidelity with a 60fps baseline. With those minimums, you're going to get potato Starfield at an intermittent 40fps without frame gen.
What's more hilarious is the recommended requirements. With the recommended specs you'd be lucky to get 60fps at 1080p with medium settings, LOL. Truly funny af to me; not sure who makes the requirements for these games, but they are right out to lunch. The minimum requirements for Starfield should realistically be an 11th-gen i5/Ryzen 5 and an RTX 2070 if you want a solid 60fps with medium settings. And recommended settings, imo, should reflect the game's 3/4-of-max settings: not bloody Ultra, but High, and they never do. It's always middle-of-the-road, borderline-potato graphics for recommended.
The fact that the minimum requirements, before they added frame gen, were a GTX 1070 Ti and a processor that basically can't run the game at any acceptable frame rate is hilarious to me. The best you are going to get with the proposed minimum requirements will barely net you more than 30fps at the lowest settings. LOL, like, what are we doing here? I hope Starfield doesn't set a new precedent where devs think they can get away with an even worse level of optimisation than we've already seen.
This is getting ridiculous. That said, I don't agree with the critics/people bashing Starfield; it's a typical Bethesda game to me. The big selling point of Bethesda games is the ability to mod the game into the levels of amazing we've seen in every other game they've made. They are very transparent about their plan to always make sure that modding support is paramount.
To me personally, Starfield was a decent 6/10 game with 7-8/10 gameplay. You can definitely tell the makers of DOOM had a lot to do with the gunplay behind the scenes; it's one of its saving graces, and the melee isn't bad either for what kind of game it is. Yeah, it's got Bethesda's typical jank, which I often love because it's just funny af. To me though, Starfield is a 7/10 game. It's not amazing, it's just good. What I really like is that it doesn't need 1000 graphics mods to look good. It already looks good.
So I really like that they focused on graphics a little more this time around. I always miss the days of Oblivion; back when that game released, it was absolutely gorgeous. To me, Skyrim was a major graphical letdown when it released, and it took nearly half a decade of modding to make it look good.
OP's mother tongue seems to be French. The French, among Europeans, are known for being stubbornly un-English-speaking, and French sentence structure probably brain-translates to English very poorly.
This is an important factor, though allocating more chips to AI, and the related price increases, may diminish this effect. I'm somewhere between 5 and 10 years.
It's about equal parts funny and sad. And there's no way that Creation Engine 2's roots lying with Morrowind don't have something to do with it. But the worst thing about it, to me, is that it occasionally still clearly looks like a Creation Engine game. I count myself "lucky" that I lost faith in Bethesda a while ago and don't have a hankering for Starfield, but it's a bummer that TES6 will most likely be using the same engine.
Yeah, they'd be the lead horse pulling that cart, but not the only one. As long as people keep paying for unfinished products, studios will keep delivering them.
That has its roots in FO4; BGS stated there was no id involvement in Starfield's gunplay. But agreed, BGS has seen a big improvement in gunplay since id consulted.
In an IGN interview with Todd Howard, he sort of boils id's involvement down to adding motion blur, which is hilarious. I can imagine someone from BGS going, "Help, slideshow!" and id, after having had a good look and deemed the performance unsalvageable, going, "Only motion blur can fix this."
LOL, the motion blur thing legit made me snort-laugh hahaha. So true though.
Very true. I won't use it in competitive games because I'd rather see the ugly and have the motion-clarity advantage. But I have very high-end gaming OLED monitors, and the better the monitor the worse it is, because your motion clarity is so insane it shows up every frame skip, hitch, inconsistency, and frame-pacing misstep.