Battlefield™ 2042

Vlad Jun 6, 2023 @ 10:12am
Higher FPS - better gameplay
I know it sounds obvious, but it really made a difference for me in responsiveness when playing. I have a 6700 XT and a 5600X, running on a 1440p display; at the lowest settings I got around 150 FPS, maybe a bit more. My display is 165Hz.

I tried playing at 1080p resolution, which rocketed the FPS past 200, and I could tell a clear difference in the gameplay. More kills because of faster response from the controls. So now I play at 1080p on a 1440p screen, which doesn't look as good. Also, it seems no matter how much I lower the resolution, the game won't go past a little over 200 FPS.

If only there was a way to squeeze more FPS from the game while still playing in 1440p...
Last edited by Vlad; Jun 6, 2023 @ 10:15am
Showing 1-13 of 13 comments
Jez Jun 6, 2023 @ 11:36am 
have you monitored usage to see if your cpu is maxed at 200fps?
i run v-sync and GPU usage is 50-100%
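If you want to check this yourself, one way (a minimal, Linux-only sketch using /proc/stat, nothing game-specific) is to sample per-core load; one core pinned near 100% while the rest sit idle is the classic sign of a game's main thread capping your FPS:

```python
# Sketch: spot a CPU bottleneck by sampling per-core load from /proc/stat
# (Linux only). Counters are cumulative jiffies, so usage is the delta of
# busy time over the delta of total time between two samples.
import time

def read_core_times():
    """Return {core_name: (busy, total)} jiffy counters for each core."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            # per-core lines look like "cpu0 ...", skip the aggregate "cpu" line
            if line.startswith("cpu") and line[3].isdigit():
                name, *vals = line.split()
                vals = [int(v) for v in vals]
                idle = vals[3] + vals[4]          # idle + iowait columns
                total = sum(vals)
                cores[name] = (total - idle, total)
    return cores

def per_core_usage(interval=1.0):
    """Percent busy per core over `interval` seconds."""
    a = read_core_times()
    time.sleep(interval)
    b = read_core_times()
    usage = {}
    for name in a:
        busy = b[name][0] - a[name][0]
        total = b[name][1] - a[name][1]
        usage[name] = 100.0 * busy / total if total else 0.0
    return usage

if __name__ == "__main__":
    for core, pct in sorted(per_core_usage().items()):
        print(f"{core}: {pct:5.1f}%")
```

Run it while the game is up; overlays like MSI Afterburner/RTSS show the same per-core numbers in-game.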
Vlad Jun 6, 2023 @ 12:02pm 
Originally posted by Jez:
have you monitored usage to see if your cpu is maxed at 200fps?
i run v-sync and GPU usage is 50-100%

Yeah, I don't use VSync because apparently it adds input lag, and anyway my screen has FreeSync Premium, so within a range my monitor's Hz syncs with my in-game FPS. So I don't need VSync to avoid screen tearing.

Surprisingly this game doesn't run my GPU too hard; with the FPS uncapped it's like 60-70ºC GPU temp and around 90ºC hotspot. When a game uses my GPU really hard the junction temp can go past 110ºC. If I see it going above 100ºC I start capping FPS just in case.
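Capping FPS in software boils down to a frame limiter: do the frame's work, then sleep out whatever is left of the frame's time budget. A minimal sketch (illustrative only; in practice you'd use the in-game cap or the driver-level limiter):

```python
# Minimal frame-limiter sketch: cap a loop at `cap` iterations per second
# by sleeping out the remainder of each frame's time budget.
import time

def run_capped(frames, cap, work=lambda: None):
    """Run `work` `frames` times, at most `cap` times per second.
    Returns the achieved frames-per-second."""
    budget = 1.0 / cap
    start = time.perf_counter()
    next_deadline = start
    for _ in range(frames):
        work()                                   # stand-in for rendering a frame
        next_deadline += budget                  # absolute deadlines avoid drift
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    elapsed = time.perf_counter() - start
    return frames / elapsed

if __name__ == "__main__":
    print(f"achieved ~{run_capped(30, cap=60):.0f} FPS with a 60 FPS cap")
```

Using absolute deadlines (rather than sleeping a fixed amount each frame) keeps slow frames from dragging the average below the cap.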

The CPU doesn't like this game though. I've seen it peak at 94ºC, which is really bad; 95ºC is when the 5600X will throttle. This is the only game that has ever run it this hard. I'm using the stock cooler. I'm not happy about it, but the CPU is supposed to be smart and throttle down before any damage happens. I haven't noticed any lag or throttling though.

Only rarely, when I spawn into a match for the first time, the game drops to like 2 FPS, or the textures haven't loaded and you can't see your gun or your character, but I don't think that's a CPU problem.
Jez Jun 6, 2023 @ 12:51pm 
Okay, what I mean is: if you are only getting 200ish fps max, then maybe your cpu or gpu usage is at maximum.

i see you run an AMD gpu; my first gpu was AMD, then I've had several nVIDIAs.

nVIDIA has an NIS option (nVIDIA Image Scaling), so you can run lower resolutions and it upscales the rendered frames to your native monitor resolution. It looks really good, but when lowering resolutions also lower the Texture Quality, lower Texture Filtering, clamp (0) the LOD Bias, and run every anti-aliasing option available. And if you run nVIDIA's NIS, turn OFF sharpening.

I run 2K; my RTX 3070 Ti runs 75 FPS with V-Sync (50-100% usage), all Ultra, RTAO on, but with DLSS Quality. High-Hz monitors are expensive to run properly for AAA games, but it's good you have FreeSync. Try those settings I mentioned (lowering Texture Quality also reduces input delay). I'm actually running minimal NIS atm, but with the added anti-aliasing options running (5 in total) I still have to run DLSS Quality.
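The FPS headroom from NIS/DLSS-style upscaling comes from rendering fewer pixels and letting the scaler fill in the rest. A rough illustration of the pixel budget at 1440p (the 0.667 factor matches DLSS Quality's per-axis scale; the other factors are just illustrative, not official presets):

```python
# Rough pixel-budget math for render-scale upscaling (NIS/FSR/DLSS style).
def render_res(native_w, native_h, scale):
    """Per-axis render resolution for a given scale factor."""
    return round(native_w * scale), round(native_h * scale)

native = (2560, 1440)   # 1440p, as in the thread
for label, s in [("native", 1.0), ("~DLSS Quality", 0.667), ("half-res", 0.5)]:
    w, h = render_res(*native, s)
    pct = 100 * (w * h) / (native[0] * native[1])
    print(f"{label:>14}: {w}x{h} = {w*h/1e6:.1f} MP ({pct:.0f}% of native pixels)")
```

Half the per-axis resolution means a quarter of the pixels, which is why dropping from 1440p toward 1080p-and-below buys so much GPU headroom.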

I love 2042. Once nVIDIA makes a good enough GPU to run it with high FPS @ 4K Ultra I may upgrade. I was late to play BF1, but I could run it at UWQHD or 4K with my GPU. It's tempting, but nah :D
Originally posted by Vlad:
Only rarely, when I spawn into a match for the first time, the game drops to like 2 FPS, or the textures haven't loaded and you can't see your gun or your character, but I don't think that's a CPU problem.
I had this when I ran 2042 on an HDD

edit: Actually, when lowering resolution you can just turn OFF Anisotropic Filtering; meanwhile you can increase Digital Vibrance & Contrast because you lose colour depth (i'm just going by nVIDIA options)
Last edited by Jez; Jun 6, 2023 @ 1:21pm
Die Zahl Jun 6, 2023 @ 1:10pm 
the human eye can't see more than 30fps, so you don't have an advantage if you go higher.
Jez Jun 6, 2023 @ 1:15pm 
Originally posted by Die Zahl:
the human eye can't see more than 30fps, so you don't have an advantage if you go higher.
30fps = speed of light
Master Paids' Lot Jun 13, 2023 @ 7:22pm 
Originally posted by Vlad:
Originally posted by Jez:
have you monitored usage to see if your cpu is maxed at 200fps?
i run v-sync and GPU usage is 50-100%

Yeah, I don't use VSync because apparently it adds input lag, and anyway my screen has FreeSync Premium, so within a range my monitor's Hz syncs with my in-game FPS. So I don't need VSync to avoid screen tearing.

Surprisingly this game doesn't run my GPU too hard; with the FPS uncapped it's like 60-70ºC GPU temp and around 90ºC hotspot. When a game uses my GPU really hard the junction temp can go past 110ºC. If I see it going above 100ºC I start capping FPS just in case.

The CPU doesn't like this game though. I've seen it peak at 94ºC, which is really bad; 95ºC is when the 5600X will throttle. This is the only game that has ever run it this hard. I'm using the stock cooler. I'm not happy about it, but the CPU is supposed to be smart and throttle down before any damage happens. I haven't noticed any lag or throttling though.

Only rarely, when I spawn into a match for the first time, the game drops to like 2 FPS, or the textures haven't loaded and you can't see your gun or your character, but I don't think that's a CPU problem.

Dude, you're kidding, right? Those temps are crazzzzyyy.
Crunchy[Daz] Jun 14, 2023 @ 1:12am 
Originally posted by Master Paids' Lot:
Originally posted by Vlad:

Yeah, I don't use VSync because apparently it adds input lag, and anyway my screen has FreeSync Premium, so within a range my monitor's Hz syncs with my in-game FPS. So I don't need VSync to avoid screen tearing.

Surprisingly this game doesn't run my GPU too hard; with the FPS uncapped it's like 60-70ºC GPU temp and around 90ºC hotspot. When a game uses my GPU really hard the junction temp can go past 110ºC. If I see it going above 100ºC I start capping FPS just in case.

The CPU doesn't like this game though. I've seen it peak at 94ºC, which is really bad; 95ºC is when the 5600X will throttle. This is the only game that has ever run it this hard. I'm using the stock cooler. I'm not happy about it, but the CPU is supposed to be smart and throttle down before any damage happens. I haven't noticed any lag or throttling though.

Only rarely, when I spawn into a match for the first time, the game drops to like 2 FPS, or the textures haven't loaded and you can't see your gun or your character, but I don't think that's a CPU problem.

Dude, you're kidding, right? Those temps are crazzzzyyy.

Newer GPUs run hot. IIRC, the 6xxx series from AMD are rated to go up to 110-115 on the hotspot. But it probably shouldn't come close to that if you have decent airflow. Also kind of depends on where in the world you live.
Vlad Jun 14, 2023 @ 3:13am 
Originally posted by Master Paids' Lot:
Originally posted by Vlad:

Yeah, I don't use VSync because apparently it adds input lag, and anyway my screen has FreeSync Premium, so within a range my monitor's Hz syncs with my in-game FPS. So I don't need VSync to avoid screen tearing.

Surprisingly this game doesn't run my GPU too hard; with the FPS uncapped it's like 60-70ºC GPU temp and around 90ºC hotspot. When a game uses my GPU really hard the junction temp can go past 110ºC. If I see it going above 100ºC I start capping FPS just in case.

The CPU doesn't like this game though. I've seen it peak at 94ºC, which is really bad; 95ºC is when the 5600X will throttle. This is the only game that has ever run it this hard. I'm using the stock cooler. I'm not happy about it, but the CPU is supposed to be smart and throttle down before any damage happens. I haven't noticed any lag or throttling though.

Only rarely, when I spawn into a match for the first time, the game drops to like 2 FPS, or the textures haven't loaded and you can't see your gun or your character, but I don't think that's a CPU problem.

Dude, you're kidding, right? Those temps are crazzzzyyy.

Yeah, I wasn't too happy about it, so the other day I bought and installed a new CPU cooler for 30€; now the CPU doesn't go past 80ºC.

The GPU is what it is. It's a Red Devil 6700 XT, and the card temp and hotspot temp can differ a lot from each other. So maybe it could use a repaste, but I don't want to void the warranty. Honestly it works fine as it is. If a particular game makes the hotspot temps go crazy, I just limit the FPS and that fixes the problem.

My airflow might not be too good. My case is a NOX Hummer TGX Rainbox; you can find images online. The case is freaking huge and has 3 intake fans and 1 exhaust fan, all 140mm. But the problem is that the 3 intake fans are millimetres away from the glass front panel, so it takes in air through a really small opening in the sides of the front panel. And then there's the room temperature itself. I live in Spain next to the sea, so hot summers, lots of humidity, no AC in the room... Also non-grounded power outlets, living in an old house.

But still, both the CPU and the GPU are supposed to throttle if they get too hot, to prevent damage. And so far, in the 2 years I've had the computer, even with the hot temps I've never noticed a lag, a throttle, nothing. Not even a blue screen. So far so good.
Waldherz Jun 14, 2023 @ 5:41am 
Originally posted by Jez:
nVIDIA has NIS option (nVIDIA Image Scaling), so you can run lower resolutions and it upscales the rendered frames to your native monitor resolution, it looks really good but mean-while when lowering resolutions also lower the Texture Quality, lower Texture Filtering, and Clamp (0) the LOD Bias, and run every AntiAliasing option available, and if you run nVIDIA's NIS turn OFF sharpening.

I run 2K

FYI: NIS is just the same thing as FSR. Why would you use NIS if you can use DLSS? NIS is literally inferior and your worst option.

Also, 2K is 1080p, since the K number refers to the horizontal pixel count of a resolution. Look it up.
Jez Jun 14, 2023 @ 5:49am 
NIS is different; it affects the whole image, whereas FSR is distance/anisotropic biased
Originally posted by Waldherz彡:
Also 2K is 1080p since the XK refers to the horizontal pixel count of a resolution. Look it up.
well i've read 2K is twice HD (720 X2 = 1440) & 4K is 4x 720
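For reference, the "K" labels trace back to DCI cinema widths (2K = 2048 pixels wide, 4K = 4096), which is why 1920x1080 is the resolution closest to "2K" and 3840x2160 gets called 4K, while 2560x1440 is properly QHD. A quick tally of the horizontal counts:

```python
# Horizontal pixel counts behind the "K" naming: DCI cinema reference
# widths next to common PC resolutions.
resolutions = {
    "DCI 2K": (2048, 1080),
    "1080p":  (1920, 1080),
    "QHD":    (2560, 1440),
    "UHD 4K": (3840, 2160),
    "DCI 4K": (4096, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:>7}: {w}x{h}  ~{w/1000:.1f}K wide, {w*h/1e6:.2f} MP")
```

So "K" counts horizontal pixels, not multiples of 720p; QHD sits at roughly 2.5K even though marketing often calls it 2K.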
Master Paids' Lot Jun 14, 2023 @ 3:10pm 
Originally posted by CrunchyDaz:
Originally posted by Master Paids' Lot:

Dude, you're kidding, right? Those temps are crazzzzyyy.

Newer GPUs run hot. IIRC, the 6xxx series from AMD are rated to go up to 110-115 on the hotspot. But it probably shouldn't come close to that if you have decent airflow. Also kind of depends on where in the world you live.

Not this hot. The hottest my 4080 gets is around 65 degrees in Cyberpunk, 2K + path tracing, where it's utilized 99%, around 2900 MHz mem clock speed. It sits around 50-55 degrees in BF, 2K, low-med settings, utilized around 50%. So yeah, he has to watch that airflow, or clean/repaste the gpu.
Khamal Jun 14, 2023 @ 3:38pm 
AMD's and nvidia's newest cards have different heat junction limits.
40 series are 92 I believe and 6000 series are 115.

AMD ones are supposed to run at 70 or more,
whereas the 40 series seems to run under 70.

Even on max settings at 5120x1440 my 4070 stays at 63-65.
Last edited by Khamal; Jun 14, 2023 @ 3:41pm
Vlad Jun 15, 2023 @ 4:03am 
Originally posted by Master Paids' Lot:
Originally posted by CrunchyDaz:

Newer GPUs run hot. IIRC, the 6xxx series from AMD are rated to go up to 110-115 on the hotspot. But it probably shouldn't come close to that if you have decent airflow. Also kind of depends on where in the world you live.

Not this hot. The hottest my 4080 gets is around 65 degrees in Cyberpunk, 2K + path tracing, where it's utilized 99%, around 2900 MHz mem clock speed. It sits around 50-55 degrees in BF, 2K, low-med settings, utilized around 50%. So yeah, he has to watch that airflow, or clean/repaste the gpu.

Ehhh, I don't really worry about it. What do you think will happen otherwise? In my previous computer I had an R9 380, and that card would sometimes reach 90ºC playing some games. Not hotspot temperature (that card didn't have a hotspot sensor), 90ºC CARD temperature. I got that card in 2016. 7 years later it's still going strong in my other computer, no problems either.

Date Posted: Jun 6, 2023 @ 10:12am
Posts: 13