Alan Wake

Bozz Collin Oct 26, 2023 @ 10:41pm
Alan Wake 2 performance thread
RTX 3080 Ti here. In the intro area I get 15-18 fps maxed out at 1920p with RT/PT on max and DLSS on Quality, 20 fps on Balanced. At 1440p I get 25-30 fps on Quality. With RT/PT off, I get ~35-40 fps at 1920p, and 1440p (my native resolution) gets me ~50 fps. All of this uses Quality DLSS except where I specified Balanced. Don't even think about running anything at native resolution. 1440p output is extremely blurry and foliage is pixelated with DLSS, even on Quality; 1920p was the best middle ground for picture clarity and performance. Don't even think about 2160p.

All that said, the game looks incredible with or without RT/PT. Of course it looks better WITH, but it still looks like a very top-tier game without. It's just a shame that performance is very poor no matter what settings you use. At best, I was able to get 60-65 fps with everything on low-medium and RT/PT disabled.
Showing 121-135 of 212 comments
Barf Nov 3, 2023 @ 7:01am 
Originally posted by Lost Drunk:
Originally posted by SHREDDER:
But to run all new games maxed at 1440p 60 fps, and I mean ALL of them, you need an RX 6700 XT or better. A GTX 1080 can't run most of them, while my RX 6700 XT so far runs all games maxed at 1440p 60 fps, even with ray tracing.
I'm talking about 1080p... A GTX 1080 Ti that can't run a game even at 540p on very low today is a pure joke.
I don't care about your GPU or 1440p; this is about how the developers just don't want to optimize their games.
The OP just said that an RTX 3080 already gets poor results in Alan Wake 2, when it was supposed to deliver a flawless experience.

Mesh shaders. Simple as that. Your 1080 Ti is going to struggle in games that start implementing them. There are two solutions: either don't buy a game that has mesh shaders, or go out and purchase a card with similar performance to the 6-year-old card you have. A 4060 is less than $300 US and performs roughly 10% better in most games.
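If anyone wants to check whether their card and driver actually expose mesh shaders, here's a minimal standalone D3D12 sketch (nothing from the game, just the stock feature query; build it against the Windows SDK):

// check_mesh_shaders.cpp - does this GPU/driver expose D3D12 mesh shaders?
// Build (MSVC): cl check_mesh_shaders.cpp d3d12.lib
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter at the baseline feature level.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }
    // OPTIONS7 carries the mesh shader tier (requires a recent Windows 10 SDK).
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    bool supported =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7)))
        && opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;
    // Pascal cards like the GTX 1080 Ti report "not supported" here, so a game built
    // around mesh shaders has to fall back to a much slower geometry path on them.
    std::printf("Mesh shaders %s\n", supported ? "supported" : "NOT supported");
    return 0;
}

Pascal reports the tier as not supported, which is why the 1080 Ti falls off a cliff here while RTX and RDNA2 cards with similar raw compute don't.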
Lost Drunk Nov 3, 2023 @ 7:56am 
Originally posted by patrick68794:
Originally posted by Lost Drunk:
I'm talking about 1080p... A GTX 1080 Ti that can't run a game even at 540p on very low today is a pure joke.
I don't care about your GPU or 1440p; this is about how the developers just don't want to optimize their games.
The OP just said that an RTX 3080 already gets poor results in Alan Wake 2, when it was supposed to deliver a flawless experience.
The 1080 Ti runs the game so poorly only because it doesn't support the mesh shaders used by the game. GPUs with equivalent compute that do support them run the game fine at 1080p with a couple of settings turned down.

Originally posted by Giant Feeble Lard Baby:
Originally posted by Lost Drunk:
I'm talking about 1080p... A GTX 1080 Ti that can't run a game even at 540p on very low today is a pure joke.
I don't care about your GPU or 1440p; this is about how the developers just don't want to optimize their games.
The OP just said that an RTX 3080 already gets poor results in Alan Wake 2, when it was supposed to deliver a flawless experience.

Mesh shaders. Simple as that. Your 1080 Ti is going to struggle in games that start implementing them. There are two solutions: either don't buy a game that has mesh shaders, or go out and purchase a card with similar performance to the 6-year-old card you have. A 4060 is less than $300 US and performs roughly 10% better in most games.

Anyway, there is a mod that makes the game run on the Vulkan API and gives a good FPS boost even without mesh shader support. The developers just don't want to optimize the game when they could.
Developers are very lazy now because of so many technologies like RT and DLSS/FSR, and they just make products to run under specific conditions. (30 FPS in 2023 is amazing.)
Even the minimum-spec RTX 2060 for this game can hardly hold 30 FPS at native 1080p/low, and it does have mesh shader support. (The 1080 Ti is stronger than the 2060 in several games...)
My GPU is a GTX 1070, and I didn't mention it before because this isn't about defending my setup.
GodisanAtheist Nov 3, 2023 @ 6:00pm 
Crazy idea, people: just wait to buy the game until your hardware can play it at reasonable settings. Not rocket surgery here.

It'll be $5 with all DLC included, and you can finally go "I waited all this time for this crap?"
Mingdug Nov 3, 2023 @ 7:02pm 
Originally posted by Lost Drunk:
Originally posted by SHREDDER:
So it is true that it is the first game that only works on cards that support ray tracing! Then it's fine, as I have an RX 6700XT RED DEVIL, a Ryzen 7 1700, and 16 GB DDR4-3200 CL15, and I run all games maxed at 1440p. The GTX 1080 is 7 years old, so it's normal that it can't run a 2023 game.
A GTX 1080 can still run almost every game on ultra at 1080p. Even a GTX 750 Ti can run some modern games at 720p on low.
This is not a hardware issue.
Performance issues for people with 30- and 40-series Nvidia cards are fair criticism, and the same goes for 6000- and 7000-series AMD cards. GTX 1080 performance issues are expected; it's a great card, but the game recommends mesh shaders for a reason, among a few other things. Developers shouldn't have to make sure modern games run on a 7-year-old GPU, because if they did, we couldn't move forward with new tech. Still, 7 years is a VERY good lifespan for a GPU; it outlasted three generations, and only now is it unable to run some games comfortably.
Cephei Nov 3, 2023 @ 7:04pm 
Originally posted by Tereru Ooensu:
Originally posted by Bozz Collin:
Guys, calm your asses down. This isn’t meant for shade throwing. I gave a general impression of performance on a 3080ti, with AND without RT features. Performance is rough either way, but it also looks fantastic either way. Does anybody read? I never complained about RT/PT being demanding. We all know that already.

This is simply a reference point for those who can't use Frame Generation. We're well aware that FG doubles your DLSS frame rate. FG is pretty much mandatory these days if you want good frame rates. Nvidia has had a way of rendering their previous series of cards obsolete ever since the 2000 series.

It’s probably best to hold out for the 5000 series if you don’t own a 4000 series yet.
I got a 3080, and no matter how low I turn the settings I still see stutters, sadly. I'm pretty new to PC gaming, so I don't know if I'm doing something wrong, but it's depressing.

Got a 3080 too; sadly the card is aging rapidly. I have to play at medium/high settings at 1440p with RT off.
I will skip the 40-series and upgrade my PC with a 5080.
Last edited by Cephei; Nov 3, 2023 @ 7:05pm
Mingdug Nov 3, 2023 @ 8:17pm 
Originally posted by Lost Drunk:
Originally posted by patrick68794:
The 1080 Ti runs the game so poorly only because it doesn't support the mesh shaders used by the game. GPUs with equivalent compute that do support them run the game fine at 1080p with a couple of settings turned down.

Originally posted by Giant Feeble Lard Baby:

Mesh shaders. Simple as that. Your 1080 Ti is going to struggle in games that start implementing them. There are two solutions: either don't buy a game that has mesh shaders, or go out and purchase a card with similar performance to the 6-year-old card you have. A 4060 is less than $300 US and performs roughly 10% better in most games.

Anyway, there is a mod that makes the game run on the Vulkan API and gives a good FPS boost even without mesh shader support. The developers just don't want to optimize the game when they could.
Developers are very lazy now because of so many technologies like RT and DLSS/FSR, and they just make products to run under specific conditions. (30 FPS in 2023 is amazing.)
Even the minimum-spec RTX 2060 for this game can hardly hold 30 FPS at native 1080p/low, and it does have mesh shader support. (The 1080 Ti is stronger than the 2060 in several games...)
My GPU is a GTX 1070, and I didn't mention it before because this isn't about defending my setup.
I hate how people say "just optimize it". OK, HOW then? Upscaling, along with other tech that makes ray tracing more performant, has led developers to push graphical fidelity further, because with upscaling they now have the headroom for even more improvements. If devs make a game that's extremely optimized, like Metal Gear Solid 5, they're berated for the lower graphical fidelity and insulted for catering to lower-end hardware; then we have something like Alan Wake 2, where they make it perform just well enough that they can push the graphics to insane levels at times, and they're insulted for not making it performant on lower-end hardware.

It just goes to show you'll never appease everyone, because the majority of gamers in echo chambers don't really know what they want.
Fahrenheit Nov 3, 2023 @ 8:30pm 
Originally posted by Ryrin:
There is a persistent microstutter that apparently almost no one else notices, since I haven't seen anyone else mention it, but it's there no matter how many settings I tweak. It makes even 90+ fps not feel smooth.
Yes, me too. Render latency is really high, and it's the only game doing this. It doesn't change no matter what settings I run.
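If you want to see that stutter in numbers instead of just feeling it, here's a tiny generic C++ timing sketch (nothing game-specific; render_frame() is just a stand-in for whatever per-frame work you measure). The point is that the average hides the spikes that make 90+ fps feel rough:

// frametime_probe.cpp - average FPS can look fine while individual frames spike.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for real per-frame work: ~10 ms frames with an occasional 45 ms hitch.
static void render_frame(int i) {
    std::this_thread::sleep_for(std::chrono::milliseconds(i % 60 == 0 ? 45 : 10));
}

int main() {
    using Clock = std::chrono::steady_clock;
    std::vector<double> ms;
    for (int i = 0; i < 300; ++i) {
        auto t0 = Clock::now();
        render_frame(i);
        auto t1 = Clock::now();
        ms.push_back(std::chrono::duration<double, std::milli>(t1 - t0).count());
    }
    double sum = 0.0, worst = 0.0;
    for (double m : ms) { sum += m; if (m > worst) worst = m; }
    double avg = sum / ms.size();
    int spikes = 0;
    for (double m : ms) if (m > 1.5 * avg) ++spikes;  // frames >50% over the mean read as stutter
    std::printf("avg %.1f ms (%.0f fps) | worst %.1f ms | spiky frames: %d of %zu\n",
                avg, 1000.0 / avg, worst, spikes, ms.size());
    return 0;
}

A frame-time graph in an overlay like RTSS tells you the same thing: it's the 1% lows and the spikes, not the average, that decide whether the game feels smooth.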
ar2games Nov 3, 2023 @ 11:59pm 
Originally posted by Mingdug:
Originally posted by Lost Drunk:



Anyway, there is a mod that makes the game run on the Vulkan API and gives a good FPS boost even without mesh shader support. The developers just don't want to optimize the game when they could.
Developers are very lazy now because of so many technologies like RT and DLSS/FSR, and they just make products to run under specific conditions. (30 FPS in 2023 is amazing.)
Even the minimum-spec RTX 2060 for this game can hardly hold 30 FPS at native 1080p/low, and it does have mesh shader support. (The 1080 Ti is stronger than the 2060 in several games...)
My GPU is a GTX 1070, and I didn't mention it before because this isn't about defending my setup.
I hate how people say "just optimize it". OK, HOW then? Upscaling, along with other tech that makes ray tracing more performant, has led developers to push graphical fidelity further, because with upscaling they now have the headroom for even more improvements. If devs make a game that's extremely optimized, like Metal Gear Solid 5, they're berated for the lower graphical fidelity and insulted for catering to lower-end hardware; then we have something like Alan Wake 2, where they make it perform just well enough that they can push the graphics to insane levels at times, and they're insulted for not making it performant on lower-end hardware.

It just goes to show you'll never appease everyone, because the majority of gamers in echo chambers don't really know what they want.

How? They could start by making their texture streaming system better and giving users more quality options to ensure the supported hardware can run the game. Just looking at the file size, you can tell they went overboard with the texture resolution. This was also an issue with Control, so at this point there's no excuse.

And like you just said, AW2 pushes the graphics to unnecessarily insane levels. No one is going to stop and look at every single scattered object on the floor; that's another area that needs optimization to avoid clutter.
BEEP! Nov 4, 2023 @ 12:21am 
Originally posted by Cephei:
Originally posted by Tereru Ooensu:
I got a 3080, and no matter how low I turn the settings I still see stutters, sadly. I'm pretty new to PC gaming, so I don't know if I'm doing something wrong, but it's depressing.

Got a 3080 too; sadly the card is aging rapidly. I have to play at medium/high settings at 1440p with RT off.
I will skip the 40-series and upgrade my PC with a 5080.
I really think that limited VRAM is holding the 3080 back a lot, because I noticed I was hitting the VRAM limit in games at 1440p. Then my card died in a storm and I replaced it with an RX 6800. It was cheap, $430 final price counting tax and warranty, and even though my averages are about 10% lower, games across the board feel so much smoother I couldn't believe it. Then I realized that in a lot of games I was constantly going past 11-12 GB of VRAM. I think cards under 12 GB are going to have a rude awakening very soon, because in almost every big game of the last few years, 8-10 GB cards seem to have a lot of problems. I honestly wouldn't recommend anyone buy a card with under 12 GB of VRAM for gaming past 1080p.
Last edited by BEEP!; Nov 4, 2023 @ 12:21am
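On the VRAM point: if anyone wants to confirm they're actually bumping into the budget rather than guessing from stutter, here's a minimal Windows/DXGI sketch (standalone, not tied to the game; it reports the calling process, so for the game itself you'd run the same query from inside the app or just watch an overlay):

// vram_probe.cpp - read the OS-reported VRAM budget/usage for the primary GPU via DXGI.
// Build (MSVC): cl vram_probe.cpp dxgi.lib
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND) return 1;  // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;  // QueryVideoMemoryInfo needs Windows 10+

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget = how much local VRAM the OS will currently grant this process;
    // CurrentUsage = what this process is using right now (near zero for this little probe).
    std::printf("local VRAM budget: %.1f GB, this process currently using: %.2f GB\n",
                info.Budget / (1024.0 * 1024.0 * 1024.0),
                info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    return 0;
}

Once an app's usage starts riding its budget, the driver begins evicting resources over PCIe, which shows up as hitching even while the average fps still looks fine.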
SHREDDER Nov 4, 2023 @ 12:22am 
Originally posted by Lost Drunk:
Originally posted by SHREDDER:
But to run all new games maxed at 1440p 60 fps, and I mean ALL of them, you need an RX 6700 XT or better. A GTX 1080 can't run most of them, while my RX 6700 XT so far runs all games maxed at 1440p 60 fps, even with ray tracing.
I'm talking about 1080p... A GTX 1080 Ti that can't run a game even at 540p on very low today is a pure joke.
I don't care about your GPU or 1440p; this is about how the developers just don't want to optimize their games.
The OP just said that an RTX 3080 already gets poor results in Alan Wake 2, when it was supposed to deliver a flawless experience.
A GTX 1080 Ti can't? One of my friends got one in 2017 for 900 euros, then in 2019 he got a 27'' 4K monitor that he still has today, and he still runs all games at 4K. Not maxed at 60 fps, but he still runs 4K. I, with the RX 6700XT RED DEVIL I got in 2021 and a Dell P2416D 24'' 2560x1440 60 Hz IPS I got in 2016, run today's games with graphics 1000 times better than his, and most look almost real!
BEEP! Nov 4, 2023 @ 12:23am 
Originally posted by Hello Stranger:
Originally posted by Ryrin:
There is a persistent microstutter that apparently almost no one else notices, since I haven't seen anyone else mention it, but it's there no matter how many settings I tweak. It makes even 90+ fps not feel smooth.
Yes, me too. Render latency is really high, and it's the only game doing this. It doesn't change no matter what settings I run.
Yep, I noticed it too. Even when I was at 70+ fps it just never felt smooth, like a constant microstutter or like my FreeSync wasn't working. I ended up playing the game with a gamepad so I wouldn't notice it as much.
SHREDDER Nov 4, 2023 @ 12:25am 
Originally posted by Mingdug:
Originally posted by Lost Drunk:
A GTX 1080 can still run almost every game on ultra at 1080p. Even a GTX 750 Ti can run some modern games at 720p on low.
This is not a hardware issue.
Performance issues for people with 30- and 40-series Nvidia cards are fair criticism, and the same goes for 6000- and 7000-series AMD cards. GTX 1080 performance issues are expected; it's a great card, but the game recommends mesh shaders for a reason, among a few other things. Developers shouldn't have to make sure modern games run on a 7-year-old GPU, because if they did, we couldn't move forward with new tech. Still, 7 years is a VERY good lifespan for a GPU; it outlasted three generations, and only now is it unable to run some games comfortably.
7 years is also how old my previous GPU was when it died (November 2014 to November 2021), which means these old GPUs will die soon anyway, and then their owners will be forced to buy a much more powerful new GPU.
franzmauser Nov 4, 2023 @ 3:32am 
Originally posted by Iggy Wolf:
Originally posted by bRice153:
Fact is, DLSS Quality at 4K looks better than native, as proven by Digital Foundry. Stop acting like it makes games look worse when it's already been proven to enhance image quality. Now move on already.

"Better than native". Oh man, Nvidia sure knows how to pick their "marketing" terms. Funny. They didn't have that attitude when it came to previous versions of supersampling (known as SSAA). Maybe because it was understood that achieving a higher resolution by either bumping up a lower one to a higher one, or downscaling from a higher one to a lower one was the only way to RETAIN graphical fidelity without losing image quality and clarity. DLSS still continues to sometimes have the dreaded ghosting and blur that plagued TAA. Sorry, but at least native is ONE area in which it DOESN'T have that problem. And until DLSS can get rid of that, it will NOT be "better".
Don't argue with this ♥♥♥♥♥♥. He is completely braindead; he bought this worthless Nvidia crap because he likes to be f..... in the a.. by the green team. After the RTX 5000 release his card will be obsolete, because Nvidia will pull the same stunt once again.
Mingdug Nov 4, 2023 @ 3:50am 
Originally posted by ar2games:
Originally posted by Mingdug:
I hate how people say "just optimize it". OK, HOW then? Upscaling, along with other tech that makes ray tracing more performant, has led developers to push graphical fidelity further, because with upscaling they now have the headroom for even more improvements. If devs make a game that's extremely optimized, like Metal Gear Solid 5, they're berated for the lower graphical fidelity and insulted for catering to lower-end hardware; then we have something like Alan Wake 2, where they make it perform just well enough that they can push the graphics to insane levels at times, and they're insulted for not making it performant on lower-end hardware.

It just goes to show you'll never appease everyone, because the majority of gamers in echo chambers don't really know what they want.

How? They could start by making their texture streaming system better and giving users more quality options to ensure the supported hardware can run the game. Just looking at the file size, you can tell they went overboard with the texture resolution. This was also an issue with Control, so at this point there's no excuse.

And like you just said, AW2 pushes the graphics to unnecessarily insane levels. No one is going to stop and look at every single scattered object on the floor; that's another area that needs optimization to avoid clutter.
I hate when people use terminology that exists but that they don't understand at all. "Make their texture streaming systems better"... OK, how do you suggest fixing it? How did you figure out texture streaming is the issue?
Lost Drunk Nov 4, 2023 @ 9:48am 
Originally posted by Mingdug:
Originally posted by Lost Drunk:
A GTX 1080 can still run almost every game on ultra at 1080p. Even a GTX 750 Ti can run some modern games at 720p on low.
This is not a hardware issue.
Performance issues for people with 30- and 40-series Nvidia cards are fair criticism, and the same goes for 6000- and 7000-series AMD cards. GTX 1080 performance issues are expected; it's a great card, but the game recommends mesh shaders for a reason, among a few other things. Developers shouldn't have to make sure modern games run on a 7-year-old GPU, because if they did, we couldn't move forward with new tech. Still, 7 years is a VERY good lifespan for a GPU; it outlasted three generations, and only now is it unable to run some games comfortably.
The issue is how developers keep doing worse and worse work even with new hardware. Even on PS5/Xbox Series, Alan Wake 2 has questionable performance, not to mention that AW2 began development years ago, so it should at least run great and without any issues on Nvidia RTX 30-series cards.

People continue to look at the problem the wrong way and don't understand the full context. It is very easy to blame the hardware rather than accept that the software simply runs poorly on hardware released before the game went into production.
One point to clarify is that the developers themselves are not really to blame; it's the companies that force developers to rush projects and implement modern technologies just to sell the product in a flashy way, without caring about the real consumer experience. Anyone who defends a product like this is working against the consumer.