STAR WARS Jedi: Survivor™

Frame gen is a mess
DLSS 3 is by and large fantastic, and if I see that option it's getting turned on. But in this, a game with abysmal performance, a game which desperately needs anything to salvage it from the mess that is Unreal Engine 4, they messed up DLSS 3.

When you implement DLSS 3 correctly you exclude the UI from frame generation; otherwise the generated frames apply the scene's motion vectors to the UI as well, and you get a HUD that jitters and wobbles with the camera. The very first version of DLSS 3 had this problem, but that was fixed long ago.
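For anyone wondering why the HUD wobbles specifically, here's a toy sketch (plain Python, NOT the actual DLSS/Streamline pipeline — all numbers and names are made up for illustration) of what happens when the UI is composited before frame generation instead of after:

```python
# Toy model (assumption: heavily simplified, not real DLSS 3 code) of why
# generating frames from an image that still contains the UI makes the HUD
# wobble, while excluding the UI keeps it stable.

def warp(x, motion_vector, t=0.5):
    """Shift a pixel column partway along its motion vector, standing in
    for the motion-vector-driven warp that builds a generated frame."""
    return x + motion_vector * t

scene_motion = 8     # camera pans 8 px between two real frames
scene_prev = 100     # a scene feature's x position in real frame N
hud_x = 10           # the HUD is static in screen space

# Wrong: UI composited BEFORE generation. The HUD pixels sit on top of the
# scene, so the warp drags them along with the scene's motion vector.
bad_hud = warp(hud_x, scene_motion)        # 14.0 in the generated frame

# On the next real frame the HUD snaps back to x=10, so across the sequence
# it oscillates 10 -> 14 -> 10 -> 14: the jitter described above.

# Right: warp the scene only, then composite the untouched UI on top.
gen_scene = warp(scene_prev, scene_motion)  # 104.0, scene moves smoothly
good_hud = hud_x                            # 10, HUD stays put
```

Real frames keep the HUD at the same spot, generated frames push it half a camera-pan away, and alternating between the two every other frame is exactly the wobble you see.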

I wish I could say I was surprised that Respawn, a studio now notorious for poorly performing games, would mess this up, but sadly it's par for the course. Amazing that the same devs made the masterpiece that was Titanfall 2.

I don't imagine this will ever get fixed, but still, Respawn, do better. Your art team is great; your tech guys, not so much. And please, for the love of PC gaming, never, ever, ever go anywhere near UE4 again. Just use Frostbite like any sensible EA studio would.
Showing 1-1 of 1 comments
Freakshow May 5, 2024 @ 2:20am 
I have no idea what GPU or CPU you have, but this game may say it "recommends 16GB RAM", and I can tell you it pushes my 32GB rig to 28+GB of RAM usage.

How much RAM do you have? Maybe that explains the poor performance. I was able to show this to my mom on her nearly 70" 4K HDR TV lol. Don't ask, I'd never shown her a game before, and I loaded up my rig and brought it to her house. TBH, I needed to use her internet to download this game since she has fiber and lives a quarter mile away. I haven't gotten it myself yet (it hasn't been around long), but yeah, speedtest gives it a 3-4ms ping. I need this in my life lol.

ANYWAYS, since you're talking about DLSS 3, I reckon you have a better card than I do. I'm on a 5900X + 6700 XT 12GB + 2x16GB DDR4-3600 C16 dual-rank RAM.

But I know Nvidia likes to cheap out on VRAM, and lots of gamers tend to have 16GB of system RAM too, so I'm just wondering if any of that is causing your issues. The game ran at like 40fps at 4K with everything maxed out for me. Mind you, I don't normally play at 4K; I have a 2560x1440 144Hz HDR monitor. The 6700 XT is a 1440p card, defo not a 4K card. I played around with FSR 2 and it'd run super smooth.

Off topic (as if I haven't been off topic, but hey, it's a reply)... I had no idea 4K would actually look so freakin' good. I always assumed it was more of a DPI thing. I mean, what's the diff between 1440p @ 27" vs 4K @ 68" or whatever (her TV is a giant Samsung, I don't remember the size... and I'm a believer in Samsung TVs being #1, too). The clothes, the skin, the hair... things just POPPED that I don't really notice here on my wittle 27" 2560x1440 monitor.

Looks like I've got a lot of upgrades to do. A card that can handle 4K @ at least 80fps, and a huge Samsung TV. The thing went to 1400-something nits, too. God, the game looked so much better on that TV.

But, yeah, I was A) surprised it ran decently at 4K (40fps) and B) astonished it was gobblin' down 28+ GB of RAM (total system consumption). Of course it was using all of my VRAM, too.

Long way to ask if you have 32GB RAM, ain't it? I ask because you're talking about abysmal performance while I'm on an AMD GPU, in a game with ray tracing, talking about smooth gameplay at 4K native w/ HDR. I mention HDR because it does incur a slight performance hit. Probably doubly so @ 4K vs 1440p.

And you're on an Nvidia card, which does like 3 times better in 90% of ray-tracing games.

Date Posted: May 5, 2024 @ 1:16am
Posts: 1