In RE 4, an RX 570 gets 80 fps at native 1080p. In Remnant 2, it gets 30 fps at native 1080p.
https://www.youtube.com/watch?v=C26zXMWXwhA
https://www.youtube.com/watch?v=ZW9Xwdb72NQ
From PC Gamer:
"But the bigger issue here is the way this is being perceived as the beginning of a negative trend for PC gaming.
Upscaling is a crowning achievement for all involved: it represents (mostly) free performance, doesn't cost a penny, and huge swathes of PC gamers can utilise it on their machines. It's really a big deal for gaming, has negated a lot of the performance penalties of graphical techniques such as ray tracing, and is only set to become even more impressive. The thing is, if it becomes an accepted part of the baseline performance, it no longer acts as that performance boost players feel it to be.
Then there's the threat that upscaling becomes a way to cut corners for optimisation. That's already a sore subject due to the state of many PC games at release as of late. Looking back at some of those game releases, it's safe to say that there have been a few that essentially used upscaling as a crutch for performance prior to Remnant 2, especially at more demanding settings, but we've never had a developer come out and say it so plainly, which is why I think Remnant 2's devs are taking the brunt of backlash.
We've already begun to see upscaling technologies as a keystone of a game's settings menu, and with that sort of ubiquity there are going to be examples of games that run poorly without some form of upscaling enabled. Whether that's how every game will function in the future, I don't believe Remnant 2 alone is setting this precedent, but I'm also hoping that it doesn't become the inevitable future of PC gaming."
https://www.pcgamer.com/remnant-2s-upscaling-settings-are-more-necessity-than-nicety-and-players-arent-happy-about-it/
I can't fathom how cards that are supposedly much faster can't manage it without upscaling though, unless the complaints are coming from people who think anything under 120 fps is "unplayable" but absolutely refuse to tinker with settings. Most of the time, a lot of the settings can be turned down for big fps boosts without making a huge visual difference.
Those things use a little bit of CPU and RAM, and I guess you're playing at sub-HD resolution?
I know Remnant 2 uses a high-tech engine, but when people see a game like RDR 2 running at double the framerate, it leads them to believe the game isn't well optimized.
Not even close: a 1070 Ti at 1080p ultra only gets around 40 fps average. That's very similar to Remnant 2 without upscaling.
https://youtu.be/1k6Ehs89GJs?si=JQbUraWNdYLVNzdg&t=754
At low settings, RDR 2 gets more than double the fps of Remnant 2 at low settings. Plus you can turn a few things down in RDR 2 for huge fps gains; the only way to get a big gain in Remnant 2 is to use upscaling.
RDR 2 with optimized settings gets a locked 60 fps on a 1070 Ti.
https://www.youtube.com/watch?v=eS_qRPjkYKE
A 4090 averages 95 fps in RDR 2 and 50 in Remnant 2, so it's quite close to double.
https://www.youtube.com/watch?v=xlswicxMCkA&t=752s
Besides, you're moving the goalposts. Of course lowering settings increases fps; I can lower settings in Remnant 2 and get more fps too. That's not the point. The point is that RDR 2 gets similar performance on the SAME card, yet somehow Remnant 2 is "unoptimized."
However, the game looks only marginally better than part 1 (which also ran terribly on top-end hardware at the time).
The video you linked shows 100 fps for the 1070 Ti. In this video, the 1070 Ti gets 43 fps on ultra and 48 on low.
https://www.youtube.com/watch?v=WVz9aRZGy78&t=23s
RE 4 is a more similar game: a 3060 gets 100 fps at max settings, while Remnant 2 gets 50 fps on low.
https://www.youtube.com/watch?v=2gpGDssXVeA&t=329s
https://www.youtube.com/watch?v=uOgWPKxL00Y&t=1051s
As for the second point: it "leads people to think" it is? Why cater to someone who doesn't know what they're talking about?
Different expectations. When a 4090 user is getting 130 fps at native 4K with ray tracing in RE 4, they're surprised when Remnant 2 is sub-60 fps at native 4K.
I think it's not so much that the performance isn't as good as they want, but that it's not as good as they expect, given what they've seen from other games.