I see, yeah, I figured it was something like that. I guess Lossless Scaling it is, then.
Also wondering this
I am not sure, I believe it is too early to know. What I do know is that it is not Iron Galaxy doing the port, so there is probably a bigger chance that we will see Nvidia's tech in Part 2, but again, too early to tell. We will probably have to wait until the system requirements become officially available.
Ok tbf (not excusing it), but FSR 3 FG is very good. In some games it runs better with no real quality loss. It's not like the DLSS-vs-FSR upscaling situation, where DLSS is flat out better (if you have a DLSS 3 supported card).
Since Nixxes is doing the port, the likelihood is very high. They usually include all upscaling and frame generation options, plus other things like DirectStorage. Like the others have said, it's too early for a definitive answer, but Nixxes usually releases all of that info with the full system requirements chart a few weeks before the game launches.
overall... it runs pretty dang well on whatever hardware, if you skip all the AI fluff and the extra computation it needs. i just did a native pure compute/rendering run on my nvidia craptop, and... it runs cool and fine. generally well optimized.
It's not really because the PlayStation uses AMD hardware, and it's not because Iron Galaxy or Naughty Dog use Radeon graphics cards. That wouldn't make sense (other than for compatibility testing), because developers are going to target the vendor with the largest market share, which is Nvidia, by a mile.
It's just a business agreement on a per-title basis. There are other PlayStation-published games that are not AMD sponsored and have all the Nvidia features in them. For example, Insomniac's games use a fair bit of ray tracing, and Ratchet & Clank: Rift Apart was Nvidia sponsored, and its ray tracing effects didn't function on AMD hardware AT ALL when the game launched. Publishers just make these deals with AMD or Nvidia, without rhyme or reason in a lot of cases, but it probably comes down to who wants to pay them more and whether the game's technology lends itself to one brand's strengths or the other's.
Luckily, these sponsorship deals that exclude certain features have fallen out of favor, thanks to public awareness that AMD was possibly/probably stipulating that games it sponsored omit Nvidia features (unlike Nvidia-sponsored titles, which don't impose such restrictions).
sponsored titles usually have some form of sponsor screen in the game tho. not sure if that changed, but any regular title just uses what the developer has in house. and this game doesn't have a sponsor screen nor any restrictions. dlss was always available, which would not be the case if it was "sponsored by amd". *looks at jedi survivor, which had ray tracing but only patched dlss in later, after some backlash perhaps, since the plugin was available at the time*
also, using particular tech doesn't mean a game is sponsored. rift apart is an example of a tech (ray tracing) that was not possible, or only a beta feature, on amd gpus for a while. it probably just wasn't available at the time they started development of the game or the engine. that simple.
I didn't say using particular technology (like ray tracing, which is going to make an Nvidia GPU look better than AMD) meant it was sponsored; I said that it *could* be one reason why a publisher might choose one vendor or another. Rift Apart launched on PC in 2023; AMD had RT-capable GPUs on the market for over two full years by then, and ray tracing was never a "beta feature" on AMD GPUs. Rift Apart is an Nvidia-sponsored game, and it uses the RTX I/O feature. I'm not saying the lack of RT support on AMD meant it was sponsored; it's just evidence that it was developed with some help from Nvidia engineers and that RT support for AMD GPUs wasn't a priority. They did eventually patch it in to work on AMD GPUs.
And TLOUP2 launched before frame generation was very common at all, yet FSR frame generation was still patched in, and FSR 3.1 just recently, while the developer never patched in DLSS frame generation. There is likely some agreement with AMD preventing them from doing so; the exact details of what AMD prohibits in its sponsored games aren't fully known. Even though it launched with DLSS upscaling, that isn't what we're referring to; we're talking about frame generation. And this game IS a sponsored game. I got it for free with the purchase of a 6700 XT, and they don't give away non-sponsored games with their GPUs.
https://www.amd.com/en/gaming/featured-games.html
Here's the announcement of Rift Apart from Nvidia:
https://www.nvidia.com/en-us/geforce/news/ratchet-and-clank-rift-apart-dlss3-rtx-io/
ugh... why do you care about framegen so much? because it's new and buzzwordy? it only works on 40 and 50 series cards. are you an nvidia shill? at face value it still just adds smear frames and fake fps numbers. it will not help underperforming cards, or games for that matter. it's ai fluff for gamer ego. "ma framerate is huge," you know?!? and it still has the same damn input latency as playing at native. absolutely terrible for shooting in games, for real. this is the wrong game and genre to ask for that feature, tbh. i don't get it, honestly.
lol. i'm outta here. i can't...
Dude, you were basically guessing off the top of your head about the reasons it wouldn't be included, and you were wrong, so I corrected you. So it's good that you're leaving before you make more incorrect statements. And if you don't care about frame generation, then why bother commenting at all? Frame generation is the topic of the thread. Just don't make uneducated guesses about things that don't even matter to you, and then no one will need to correct you.
how can you correct me if you don't really know either? you're guessing and speculating as well. lol. unless you work as an accountant at the developer's studio, there's no proof they received money or anything from amd. nor can you prove they intentionally blocked tech and apis from being integrated into the engine or games.
maybe it's coming later because it takes time to integrate into an engine. maybe they tested it and it didn't work or doesn't look good. quality standards on visuals differ. certain cinematics don't look great, or look uncanny, when the framerate is too high. you're maybe not that much of a movie watcher. i guess when you game all day, only fps matters. hmm...