frame gen off: 64.58 fps
frame gen off: 30 fps
Personally, if I can play Rise on my Switch OLED at 30fps, I can play Wilds at 30fps, so I'm going to say: how about they optimize the game better? That's my recommendation.
Was on max settings with motion blur off.
Frame gen was off.
I've uninstalled it since then, and I'm not reinstalling it for a specific type of ss.
Edit: I've updated my drivers since then, so maybe I'd get a better score, idk.
https://steamcommunity.com/sharedfiles/filedetails/?id=3429054513
https://steamcommunity.com/sharedfiles/filedetails/?id=3429054499
I tried it on three different settings.
I might try another test without any ray tracing, but I liked how the lake looked during the benchmark.
Notably, the cutscenes were always set to the highest settings and still had a more stable frame rate than the gameplay section of the benchmark, so feel free to go ham with that setting.
You disabled one of the main optimisations.
Hardware:
- CPU: AMD Ryzen 9 5950X
- GPU: EVGA Nvidia RTX 3090 Ti
- RAM: 4x32GB G.Skill Trident Z DDR4-3600 C16
- MB: Gigabyte X570 Aorus Master
- SSD: Samsung 970 Pro 500GB
- Screens: Acer K282HUL (16:9, 1440p) + Acer XR342CK (21:9, 1440p)
Tests were done on a clean install, on the 21:9 screen, with the same video playing on YouTube under Vivaldi, fullscreen 1440p, on the 16:9 screen (to simulate my habits). No OS-level optimization, just a full clean install without any other setup.
Benchmarks used the default ultra settings + motion blur off + DLSS disabled, and were run twice for each OS.
Linux:
- Fedora 41 KDE
- Nvidia 565.35 from akmods-nvidia
Results:
- avg 60.69fps
- 25 minutes of shader compilation
- benchmark started instantly
- some floating triangle textures visible at the center of the screen in the desert for ~20s before entering the bivouac
- results shown instantly after the benchmark
Windows:
- Windows 11
- Nvidia 566.36 from windows update
Results:
- avg 55.09fps
- 10 minutes of shader compilation
- benchmark started after 20s
- persistent tearing in the middle of the screen
- cloud texture bug in the sky during weather changes
- 10s black screen before the benchmark results
And a recording of the run:
(Recorded with OBS, which lowers the minimum FPS by an average of 2-3fps)
https://youtu.be/0z6tdFXsLqc?si=91LWJa67uw09fmx_
I'm heavily CPU-bottlenecked and losing about 20% of my GPU performance.
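For anyone who wants to put a number on the Linux vs. Windows gap above, the relative difference works out like this (a minimal sketch; the fps values are just the two averages reported above):

```python
# Average fps from the two benchmark runs reported above.
linux_avg = 60.69    # Fedora 41 KDE, Nvidia 565.35
windows_avg = 55.09  # Windows 11, Nvidia 566.36

# Percentage advantage of the Linux run over the Windows run.
advantage_pct = (linux_avg - windows_avg) / windows_avg * 100
print(f"Linux run was {advantage_pct:.1f}% faster on average")  # ~10.2%
```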
Those are my custom settings. I went through each setting individually and adjusted it based on a combination of how noticeable it was and how much I was likely to care.
For example, the sky setting wasn't very visible, so I turned it right down; the trees were very visible, so I turned them up; and a lot of the textures could stay at medium settings outside of cutscenes because I won't really notice them while moving anyway.
I also have separate settings for cutscenes, but you can get away with setting those to the highest quality and your machine should still be fairly stable during those sections, because my PC actually performed better during those portions of the benchmark than it did in the gameplay section with the wandering Seikret.
lowest settings
https://steamuserimages-a.akamaihd.net/ugc/16428277777366568/EE93400171D25C63D6D2E90FA12E8130225F59A7/
1080p DLSS ultra performance
ultra settings
https://steamuserimages-a.akamaihd.net/ugc/16428277777366844/EDABAB71EDEA050D675C640BAF522AF0A2729C52/
1080p native
high settings
high textures
https://steamuserimages-a.akamaihd.net/ugc/16428277777203169/E52B3E0D18A9B2EFC15FBE9A0A0DACA5508A134C/
1440p native
Lowest Settings (not sure why it says custom on this one though)
https://steamuserimages-a.akamaihd.net/ugc/16428277777081030/E2EC1090FA237A0CC0D26DCE155F6500ECD21024/
1440p DLSS ultra performance
Lowest Settings
https://steamuserimages-a.akamaihd.net/ugc/16428277777092354/21869E37A28113D2880BCD8E0BD4F3254E44777B/
1440p DLSS ultra performance + frame generation
Lowest Settings
https://steamuserimages-a.akamaihd.net/ugc/16428277777109546/04D7630E6BFCCF7F1E7EC1EB8824B20FF72C0A9D/
1440p native
medium settings
high textures
https://steamuserimages-a.akamaihd.net/ugc/16428277777166325/6D57B33C8E3C45FC24CE5E80ABC634F0AA1E9D35/
1440p native
high settings
high textures
https://steamuserimages-a.akamaihd.net/ugc/16428277777203395/B1E12B0AA3122FA7D0165FA1EAB029D53A67DEB5/
If you compare my benchmarks, I only lost 10 fps (1440p native) changing from lowest settings to high settings - that's definitely odd.
And from medium settings (1440p native) to high settings I lost like 3 fps.
I lost roughly 50% of my average fps going from lowest settings to ultra (both at 1080p + DLSS ultra performance), which doesn't sound that bad. The actual issue was the latency: perhaps it was a VRAM limit or something else, but it went up to nearly 70ms (usually I'm at around 15ms), and it felt much slower than the fps figure suggested.
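To put those latency numbers in perspective, the time budget for one frame is just 1000/fps milliseconds (a quick sketch; the 15ms and 70ms figures are the ones reported above, 60fps is an assumed reference rate):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget for a single frame at a given frame rate."""
    return 1000.0 / fps

# At 60fps each frame takes ~16.7ms, so a reported 70ms of input latency
# is roughly four frames' worth of lag, vs. about one frame at 15ms.
print(round(frame_time_ms(60), 1))   # 16.7 ms per frame
print(70 / frame_time_ms(60))        # 4.2 frames of latency
```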
Seems like people on 40 series cards are also able to hit 60fps comfortably, as long as their CPU isn't a bottleneck.
People on 30 series cards, however... those are 5-year-old GPUs; probably time for an upgrade.
Having less than 12GB of VRAM also seems to matter quite a bit, and may likewise mean it's time for an upgrade.
https://www.youtube.com/watch?v=i9QsAnV0XOM
https://www.youtube.com/watch?v=srrJc982b3Q
https://www.youtube.com/watch?v=K0lSEUjzKrQ
https://www.youtube.com/watch?v=mW2z3BJAm5Q