Remnant II

Emma_Z Mar 7, 2024 @ 8:19am
Very low GPU and CPU usage
I have a good PC and it runs every other game very well, but with Remnant 2 I constantly see that my GPU and CPU usage don't go above 50% (mostly just 40%), and when I'm not in the Ward the game can't hit more than 60 fps; in areas like the new DLC it runs at 30 fps. No matter the settings, and no matter the map, the hardware usage is exactly the same. I had this problem at launch, and now that I'm back for the DLC it's still not fixed. Does anyone know how to contact official support? I'm not the only one experiencing this.
David Davidson Mar 7, 2024 @ 8:38am 
Seems Rem2 has some optimization issues. What are your system specs?
Do you have VSync on?

Tangent:
Though 30-60 fps should honestly be fine. 30 fps was the "aim for this; if you can get higher, you're doing well" framerate for medium-budget gaming PCs until about 2010-2012. A movie runs at 24 fps, so if the framerate is consistent it shouldn't be a massive issue. If I could consistently place top 3 in Battlefield 2 and COD4 at 30 and 25 fps respectively, it shouldn't be a massive issue in a moderately paced soulslike shooter.

Not saying a higher framerate wouldn't be more desirable, but after around 25 fps you get diminishing returns, and running your GPU and CPU at 50%, while probably an optimization issue, is still better than maxing them out at 100% for the same framerate, which would be even worse optimization.
Last edited by David Davidson; Mar 7, 2024 @ 8:39am
Emma_Z Mar 7, 2024 @ 8:47am 
Originally posted by David Davidson:
Seems Rem2 has some optimization issues. What are your system specs?
Do you have VSync on?

Tangent:
Though 30-60 fps should honestly be fine. 30 fps was the "aim for this; if you can get higher, you're doing well" framerate for medium-budget gaming PCs until about 2010-2012. A movie runs at 24 fps, so if the framerate is consistent it shouldn't be a massive issue. If I could consistently place top 3 in Battlefield 2 and COD4 at 30 and 25 fps respectively, it shouldn't be a massive issue in a moderately paced soulslike shooter.

Not saying a higher framerate wouldn't be more desirable, but after around 25 fps you get diminishing returns, and running your GPU and CPU at 50%, while probably an optimization issue, is still better than maxing them out at 100% for the same framerate, which would be even worse optimization.
A 6800 XT and an i5-12600K. No, 30 fps or 60 fps is not acceptable, especially for a game like this. The optimization is ♥♥♥♥; it's insane that they didn't fix anything in all this time.
Last edited by Emma_Z; Mar 7, 2024 @ 8:49am
antagonist Mar 7, 2024 @ 9:19am 
Low CPU usage seems rather normal in Remnant 2, while the GPU should easily be able to reach 90+% when you want to max out every setting.
Do you perhaps have an alternative GPU BIOS active, or some power-saving options/FPS limiter switched on in Adrenalin?
David Davidson Mar 7, 2024 @ 10:01am 
Originally posted by GORDO COMUNISTA CAZALIBERTARIOS:
Originally posted by David Davidson:
Seems Rem2 has some optimization issues. What are your system specs?
Do you have VSync on?

Tangent:
Though 30-60 fps should honestly be fine. 30 fps was the "aim for this; if you can get higher, you're doing well" framerate for medium-budget gaming PCs until about 2010-2012. A movie runs at 24 fps, so if the framerate is consistent it shouldn't be a massive issue. If I could consistently place top 3 in Battlefield 2 and COD4 at 30 and 25 fps respectively, it shouldn't be a massive issue in a moderately paced soulslike shooter.

Not saying a higher framerate wouldn't be more desirable, but after around 25 fps you get diminishing returns, and running your GPU and CPU at 50%, while probably an optimization issue, is still better than maxing them out at 100% for the same framerate, which would be even worse optimization.
A 6800 XT and an i5-12600K. No, 30 fps or 60 fps is not acceptable, especially for a game like this. The optimization is ♥♥♥♥; it's insane that they didn't fix anything in all this time.

Do you have V-Sync turned on? V-Sync usually caps your framerate, which would explain things.

It's been out for less than a year and they can't test against every graphics card. Is the framerate actually stuttering, or are you just displeased that you're not getting the highest framerate possible?

Humans see at a rate of about 30-60 Hz except under extremely specific circumstances, so unless you're experiencing frame drops below 30 fps (really below 25 fps, given that many generations of console gamers played at 25 fps in PAL regions and never noticed the games they were playing being choppy, right up to the Xbox 360/PS3 era if they used a CRT TV), you shouldn't notice much of a difference between 30 and 60, or 60 and 120, and so on. Or the difference will be so minuscule that it will only matter in highly competitive first-person shooters, and even then only by a few percent. 30-60 frames/s without stuttering sounds fine to me; it would have been better than good when I was a teenager, and it's just fine for me now.


Originally posted by antagonist:
Low CPU usage seems rather normal in Remnant 2, while the GPU should easily be able to reach 90+% when you want to max out every setting.
Do you perhaps have an alternative GPU BIOS active, or some power-saving options/FPS limiter switched on in Adrenalin?
This is also a possibility, though I'd imagine the issue would be across the board rather than limited to just this game.
Fat Cat Mar 7, 2024 @ 10:34am 
Are your GPU drivers up to date? Do you have SAM enabled? What resolution are you playing at? Are you using upscaling? AFMF? Multiple monitors? Overclocking/undervolting? Much more information is needed before we can give any suggestions on how to improve your performance.
Emma_Z Mar 7, 2024 @ 2:28pm 
Originally posted by antagonist:
Low CPU usage seems rather normal in Remnant 2, while the GPU should easily be able to reach 90+% when you want to max out every setting.
Do you perhaps have an alternative GPU BIOS active, or some power-saving options/FPS limiter switched on in Adrenalin?
Nope; in fact I overclock my GPU. This is literally the only game I've had a problem with in the 2 years I've had this build.
Emma_Z Mar 7, 2024 @ 2:29pm 
Originally posted by David Davidson:
Originally posted by GORDO COMUNISTA CAZALIBERTARIOS:
A 6800 XT and an i5-12600K. No, 30 fps or 60 fps is not acceptable, especially for a game like this. The optimization is ♥♥♥♥; it's insane that they didn't fix anything in all this time.

Do you have V-Sync turned on? V-Sync usually caps your framerate, which would explain things.

It's been out for less than a year and they can't test against every graphics card. Is the framerate actually stuttering, or are you just displeased that you're not getting the highest framerate possible?

Humans see at a rate of about 30-60 Hz except under extremely specific circumstances, so unless you're experiencing frame drops below 30 fps (really below 25 fps, given that many generations of console gamers played at 25 fps in PAL regions and never noticed the games they were playing being choppy, right up to the Xbox 360/PS3 era if they used a CRT TV), you shouldn't notice much of a difference between 30 and 60, or 60 and 120, and so on. Or the difference will be so minuscule that it will only matter in highly competitive first-person shooters, and even then only by a few percent. 30-60 frames/s without stuttering sounds fine to me; it would have been better than good when I was a teenager, and it's just fine for me now.


Originally posted by antagonist:
Low CPU usage seems rather normal in Remnant 2, while the GPU should easily be able to reach 90+% when you want to max out every setting.
Do you perhaps have an alternative GPU BIOS active, or some power-saving options/FPS limiter switched on in Adrenalin?
This is also a possibility, though I'd imagine the issue would be across the board rather than limited to just this game.
No, you are literally insane if you can't see more than 30 fps. Check your eyes; you might have some brain damage too.
Last edited by Emma_Z; Mar 7, 2024 @ 2:35pm
Emma_Z Mar 7, 2024 @ 2:31pm 
Originally posted by lumiereeclair:
Are your GPU drivers up to date? Do you have SAM enabled? What resolution are you playing at? Are you using upscaling? AFMF? Multiple monitors? Overclocking/undervolting? Much more information is needed before we can give any suggestions on how to improve your performance.
Yes, up to date. I do have SAM enabled. I play at 1080p because of my monitor, which is 165 Hz. Two monitors. I do overclock and undervolt my GPU; I've had that profile for 2 years now, totally stable, via Adrenalin (I don't use Afterburner). This is literally the only game I've had a problem with before. This is an optimization problem; I only made this post to find a way to talk to official support, or to see if there's any mod out there that can fix whatever is limiting the hardware. I feel like it's an AI problem, because it only happens badly outside the Ward.
Last edited by Emma_Z; Mar 7, 2024 @ 2:34pm
Fat Cat Mar 7, 2024 @ 2:49pm 
Originally posted by GORDO COMUNISTA CAZALIBERTARIOS:
Originally posted by lumiereeclair:
Are your GPU drivers up to date? Do you have SAM enabled? What resolution are you playing at? Are you using upscaling? AFMF? Multiple monitors? Overclocking/undervolting? Much more information is needed before we can give any suggestions on how to improve your performance.
Yes, up to date. I do have SAM enabled. I play at 1080p because of my monitor, which is 165 Hz. Two monitors. I do overclock and undervolt my GPU; I've had that profile for 2 years now, totally stable, via Adrenalin (I don't use Afterburner). This is literally the only game I've had a problem with before. This is an optimization problem; I only made this post to find a way to talk to official support, or to see if there's any mod out there that can fix whatever is limiting the hardware. I feel like it's an AI problem, because it only happens badly outside the Ward.

Which upscaler are you using, and at which quality setting: performance, balanced, or quality? What is the voltage offset on your GPU? What are the rest of your PC's specs?
David Davidson Mar 7, 2024 @ 7:22pm 
Originally posted by GORDO COMUNISTA CAZALIBERTARIOS:
Originally posted by David Davidson:

Do you have V-Sync turned on? V-Sync usually caps your framerate, which would explain things.

It's been out for less than a year and they can't test against every graphics card. Is the framerate actually stuttering, or are you just displeased that you're not getting the highest framerate possible?

Humans see at a rate of about 30-60 Hz except under extremely specific circumstances, so unless you're experiencing frame drops below 30 fps (really below 25 fps, given that many generations of console gamers played at 25 fps in PAL regions and never noticed the games they were playing being choppy, right up to the Xbox 360/PS3 era if they used a CRT TV), you shouldn't notice much of a difference between 30 and 60, or 60 and 120, and so on. Or the difference will be so minuscule that it will only matter in highly competitive first-person shooters, and even then only by a few percent. 30-60 frames/s without stuttering sounds fine to me; it would have been better than good when I was a teenager, and it's just fine for me now.



This is also a possibility, though I'd imagine the issue would be across the board rather than limited to just this game.
No, you are literally insane if you can't see more than 30 fps. Check your eyes; you might have some brain damage too.
30-60 Hz is the standard perception rate of the human eye outside of extremely niche conditions (strobing lights, for example). This is backed up by multiple double-blind studies and scientific sources; the ones that claim you can see >100 Hz are usually websites trying to sell you high-refresh-rate monitors. I notice little (if any) difference between my two 144 Hz monitors (one capped at 120) with VSync/frame cap off versus on, and neither do friends of mine who play on extremely expensive gaming PCs and a 60 Hz TV; they see little or no difference. My GPU is a (laptop) 3070, though in benchmarks it'll usually pull >100 FPS, which is just unnecessary heat imho.
Do movies (24 frames per second) look like slideshows to you? Do you somehow have superhuman sight?
If the game isn't stuttering, maybe you're having problems because you're keeping your eyes on the framerate in the top left or top right and not on what you're doing? Unless you are so superhuman you can count the number of frames per second in your head.


Originally posted by GORDO COMUNISTA CAZALIBERTARIOS:
Yes, up to date. I do have SAM enabled. I play at 1080p because of my monitor, which is 165 Hz. Two monitors. I do overclock and undervolt my GPU; I've had that profile for 2 years now, totally stable, via Adrenalin (I don't use Afterburner). This is literally the only game I've had a problem with before. This is an optimization problem; I only made this post to find a way to talk to official support, or to see if there's any mod out there that can fix whatever is limiting the hardware. I feel like it's an AI problem, because it only happens badly outside the Ward.

Wait, you overclock and undervolt? Never been much of an overclocker, though shouldn't you be overvolting your GPU if you want to get higher clock rates?


You created a Steam thread to contact support. You can contact support via the link below; pick either "I need assistance for an issue within a game" or "I need technical assistance" and be as thorough as possible. I'd recommend reverting your GPU to defaults before doing so, as the developers probably don't have access to your PC config in its entirety, and they definitely don't have access to the software/firmware modifications you've made through overclocking and under(over?)volting.

https://support.arcgames.com/hc/en-us/requests/new

Or you can send an email to:

info@gunfiregames.com

Both of these were easily found by Google-searching "Remnant 2 customer support".


Edit:
One final question:
How much RAM (not VRAM, normal system RAM) do you have? RAM capacity seems to affect framerates, especially in the DLC, so if you're playing with 8 GB of DDR4 or something, that may be the issue: your GPU and CPU aren't running at full load because the RAM is bottlenecking your system. These areas may take a lot of RAM to load and may not fit completely into VRAM, so a small amount of system RAM would mean that areas have to be constantly reloaded, capping framerates at a certain level. This has been the fix for other people with this problem.
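If you want to check that before buying anything, here's a rough sketch (assuming Python 3 with the psutil package installed; the process name is a guess, so check the real one in Task Manager). If system RAM sits near 100% in the DLC areas, or the game's working set keeps climbing, RAM is the likely suspect:

# Rough sketch: log system RAM usage and the game's working set once per second.
# Assumes Python 3 with psutil installed (pip install psutil).
# "Remnant2.exe" is a guess at the process name; check the real one in Task Manager.
import time
import psutil

GAME_NAME = "Remnant2.exe"  # hypothetical name, adjust as needed

def find_game():
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == GAME_NAME:
            return proc
    return None

game = find_game()
while True:
    vm = psutil.virtual_memory()
    line = f"system RAM: {vm.percent:5.1f}% of {vm.total / 2**30:.1f} GiB used"
    if game is not None and game.is_running():
        rss = game.memory_info().rss / 2**30
        line += f" | game working set: {rss:.2f} GiB"
    print(line)
    time.sleep(1)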
Last edited by David Davidson; Mar 7, 2024 @ 7:28pm
Fat Cat Mar 7, 2024 @ 7:43pm 
Originally posted by David Davidson:

Wait, you overclock and undervolt? Never been much of an overclocker, though shouldn't you be overvolting your GPU if you want to get higher clock rates?

Edit:
One final question:
How much RAM (not VRAM, normal system RAM) do you have? RAM capacity seems to affect framerates, especially in the DLC, so if you're playing with 8 GB of DDR4 or something, that may be the issue: your GPU and CPU aren't running at full load because the RAM is bottlenecking your system. These areas may take a lot of RAM to load and may not fit completely into VRAM, so a small amount of system RAM would mean that areas have to be constantly reloaded, capping framerates at a certain level. This has been the fix for other people with this problem.

There are a few misconceptions here.

You most certainly can undervolt a GPU while also overclocking it (increasing its clock speed). Depending on the quality of the silicon, some GPUs can take a very aggressive voltage offset and still perform just as well, or even better. Most people who do this are trying to lower overall power usage and heat generation, and/or make their GPU as efficient as possible.

With that said, the opposite can also be done, where you increase a GPU's voltage, clock speed, and power draw. That gets you a relatively modest increase in performance while drastically increasing power usage and temps. In most cases the small performance gain isn't worth it, but some folks want their high scores in 3DMark.

RAM doesn't work the way you described. System RAM is very fast storage that holds information so the CPU can access it quickly. How that relates to gaming is a bit different: game textures are loaded into a GPU's VRAM. If your GPU has enough capacity to hold all the textures the game needs, all is good. In the event that the textures require more capacity, your PC will spill them into system memory. The problem is that system RAM is vastly slower than VRAM, which usually leads to textures popping in and out, "stutters," and a large gap between average FPS and the 1% lows.

Now for the OP: he is using an RX 6800 XT, which has 16 GB of VRAM. That is more than enough, since he plays at 1080p. Even without upscaling, the game only uses about 8 GB of VRAM at 1080p, so the issue with his performance most likely lies somewhere else.
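If anyone wants to put numbers on that average-FPS-vs-1%-lows gap, here's a rough sketch of the usual math. It assumes you've exported a frametime log in milliseconds (one value per line) from a capture tool like CapFrameX; the filename is made up:

# Rough sketch: average FPS and "1% low" FPS from a frametime log.
# Assumes frametimes.txt holds one frametime in milliseconds per line
# (the filename and format are assumptions, not any specific tool's output).
with open("frametimes.txt") as f:
    frametimes_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# "1% lows" taken here as the average FPS over the slowest 1% of frames.
slowest = sorted(frametimes_ms, reverse=True)
slowest_1pct = slowest[: max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 / (sum(slowest_1pct) / len(slowest_1pct))

print(f"average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {low_1pct_fps:.1f}")
# A large gap between these two is what shows up as visible stutter,
# e.g. when textures spill out of VRAM into system RAM.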
David Davidson Mar 7, 2024 @ 8:39pm 
Originally posted by lumiereeclair:
Originally posted by David Davidson:

Wait, you overclock and undervolt? Never been much of an overclocker, though shouldn't you be overvolting your GPU if you want to get higher clock rates?

Edit:
One final question:
How much RAM (not VRAM, normal system RAM) do you have? RAM capacity seems to affect framerates, especially in the DLC, so if you're playing with 8 GB of DDR4 or something, that may be the issue: your GPU and CPU aren't running at full load because the RAM is bottlenecking your system. These areas may take a lot of RAM to load and may not fit completely into VRAM, so a small amount of system RAM would mean that areas have to be constantly reloaded, capping framerates at a certain level. This has been the fix for other people with this problem.

There are a few misconceptions here.

You most certainly can undervolt a GPU while also overclocking it (increasing its clock speed). Depending on the quality of the silicon, some GPUs can take a very aggressive voltage offset and still perform just as well, or even better. Most people who do this are trying to lower overall power usage and heat generation, and/or make their GPU as efficient as possible.

With that said, the opposite can also be done, where you increase a GPU's voltage, clock speed, and power draw. That gets you a relatively modest increase in performance while drastically increasing power usage and temps. In most cases the small performance gain isn't worth it, but some folks want their high scores in 3DMark.

RAM doesn't work the way you described. System RAM is very fast storage that holds information so the CPU can access it quickly. How that relates to gaming is a bit different: game textures are loaded into a GPU's VRAM. If your GPU has enough capacity to hold all the textures the game needs, all is good. In the event that the textures require more capacity, your PC will spill them into system memory. The problem is that system RAM is vastly slower than VRAM, which usually leads to textures popping in and out, "stutters," and a large gap between average FPS and the 1% lows.

Now for the OP: he is using an RX 6800 XT, which has 16 GB of VRAM. That is more than enough, since he plays at 1080p. Even without upscaling, the game only uses about 8 GB of VRAM at 1080p, so the issue with his performance most likely lies somewhere else.

1st point:
Thanks. My overclocking experience is from a long time ago, before CPUs and GPUs had variable clock speeds, when you needed to overvolt to overclock because the older clocks required increased voltage to reach higher speeds. I never really got into overclocking either, so I'm definitely not experienced with modern overclocking. I appreciate you correcting me on this; as an electrical engineer I should know better, though I usually work (well, currently tinker more than anything) with 'fixed' clocks which drift up or down with voltage input.
I'd imagine a modern variable-clock CPU adjusts its voltage to adjust clock speed, unless an entirely new clocking scheme is used (if it is, please link me something I can learn from, I'd like that), though it may not be tied to CPU voltage the way it used to be if the tech is different.

2nd point:
Regarding RAM: depending on how a game is optimized, certain things will be loaded into system RAM and other things into VRAM, with system RAM acting as the staging area for data on its way to VRAM. If a game needs to compute a larger level, with potential spawns, their locations, and their AI systems, that may be kept in system RAM, since AI isn't (usually) processed on the GPU but by the CPU.
Also, anecdotally, people have said that simply upgrading their system RAM increased framerates in the Awakened King DLC areas of Rem2:

Originally posted by issam244:
So I just had payday yesterday n went out to grab a second stick of RAM today... totally fixed my fps issues in the forlorn coast xD
Thread

In this case RAM was the bottleneck, or at least the problem. If a game is poorly optimized, bottlenecks are more apparent, so it could be that OP is using a very modern GPU and CPU with only 4-12 GB of system RAM, assuming "GPU RAM will make up for my system RAM problems," maybe on top of other programs running on the second monitor that aren't GPU-intensive, like a web browser, which is almost necessary to play R2 without throwing hundreds of hours into solving somewhat obscure puzzles.

Though I do stand by 30-60 fps being a fine framerate unless you're doing esports, where 5% is the difference between a win and a loss.
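If anyone wants to see what else is competing with the game for memory while a browser is open on the second monitor, here's a rough sketch (again assuming Python 3 with psutil installed) that lists the biggest consumers:

# Rough sketch: list the ten processes using the most memory, to see whether
# a browser (or anything else) is competing with the game for system RAM.
# Assumes Python 3 with psutil installed.
import psutil

procs = []
for proc in psutil.process_iter(["name", "memory_info"]):
    mem = proc.info["memory_info"]
    if mem is not None:
        procs.append((mem.rss, proc.info["name"]))

procs.sort(key=lambda item: item[0], reverse=True)
for rss, name in procs[:10]:
    print(f"{rss / 2**30:6.2f} GiB  {name}")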
WESTG4M3R4L1F3 Mar 8, 2024 @ 6:15am 
CPU AT 250% AND GPU AT 300%
Fat Cat Mar 8, 2024 @ 2:26pm 
Originally posted by David Davidson:

1st point:
Thanks. My overclocking experience is from a long time ago, before CPUs and GPUs had variable clock speeds, when you needed to overvolt to overclock because the older clocks required increased voltage to reach higher speeds. I never really got into overclocking either, so I'm definitely not experienced with modern overclocking. I appreciate you correcting me on this; as an electrical engineer I should know better, though I usually work (well, currently tinker more than anything) with 'fixed' clocks which drift up or down with voltage input.
I'd imagine a modern variable-clock CPU adjusts its voltage to adjust clock speed, unless an entirely new clocking scheme is used (if it is, please link me something I can learn from, I'd like that), though it may not be tied to CPU voltage the way it used to be if the tech is different.

My overclocking/undervolting experience comes from doing it on my current GPU, an RX 6700 XT, and CPU, a Ryzen 5 5600X. Modern CPUs have a base clock speed and a variable boost clock speed. There is also an advertised max boost clock, but it's misleading, because that speed can only be reached when just a single core is being utilized. Many users get confused when they run an all-core workload and their clock speeds are much lower than the advertised max boost. When all cores are under load, they won't boost as high; otherwise the power consumption would generate way too much heat, and not even a 420 mm AIO liquid cooler could dissipate it fast enough to keep the CPU from thermal throttling.
Modern GPUs work the same way, with a base clock speed and a max boost clock speed that vary with GPU load and temps. The voltage also varies with the load, and that's how clock speeds get adjusted. It's also why the whole process of overclocking/undervolting can seem overwhelming to most users: measuring the maximum voltage under a 100% load and then experimenting in incremental steps is not for everyone.
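If you want to watch that boost behaviour yourself, something like this rough sketch will do it (assuming Python 3 with psutil; some platforms only report one package-wide frequency instead of per-core values):

# Rough sketch: print per-core clock speeds and load once per second, so you
# can watch boost clocks come down as more cores get loaded.
# Assumes Python 3 with psutil; per-core frequencies aren't reported on
# every platform, in which case only one value shows up.
import psutil

while True:
    loads = psutil.cpu_percent(percpu=True, interval=1)  # also paces the loop
    freqs = psutil.cpu_freq(percpu=True) or []
    print("load:  ", " ".join(f"{l:5.1f}%" for l in loads))
    print("clocks:", " ".join(f"{f.current:6.0f} MHz" for f in freqs))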
Stiopper Mar 22, 2024 @ 8:12am 
I have the same problem on my old PC. GPU usage around 50% and low (35-50) FPS.

Gigabyte X99-UD4
Core i7-5820K
16 GB RAM
RTX 2080 Ti
Stock firmware, latest drivers. No problems with other games (e.g. Cyberpunk 2077, Elden Ring): 98% GPU usage, max possible framerate.


But in Remnant II:

On the character select screen everything seems fine: FPS about 170, GPU usage 97%.

After I enter the game:

At 4K, high settings, DLSS balanced: FPS stays around 40, GPU usage around 50-60%.

At 4K, ultra settings, DLSS quality, detailed shadows: FPS stays the same, around 40; GPU usage rises to 85%.

At 720p, potato settings, DLSS performance: FPS rises a bit, up to 50; GPU usage drops to 20%.

No matter what I do, FPS stays low. GPU usage varies depending on graphics settings, but it's never full. DLSS/FSR/XeSS or none: almost no difference.

Unlimited framerate, VSync disabled. I also tried turning all the other, seemingly unrelated options on and off, like NVIDIA low latency, etc.

CPU usage stays around 50%. Can the CPU still be the bottleneck? By not having necessary instructions/features or whatever?

I can also try to switch GPUs between my old and my new PCs and check the results but I'm too lazy to do it atm.

P.S. I don't need a lecture on how the human eye can't see more than 30 FPS. I just want a technical problem fixed. :)
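P.P.S. To partly answer my own CPU question: on a 6-core/12-thread 5820K, one thread pegged at 100% only shows up as about 8% of total CPU usage, so ~50% overall can easily hide a main-thread or render-thread limit, which would look exactly like this (GPU underused, FPS capped, settings barely mattering). A rough way to check, assuming Python 3 with psutil installed:

# Rough sketch: sample per-core CPU usage while the game is running.
# If one or two cores sit near 100% while the rest are mostly idle, the game
# is probably limited by its main/render thread rather than by the GPU.
# Assumes Python 3 with psutil installed.
import psutil

for _ in range(60):  # roughly one minute of samples
    per_core = psutil.cpu_percent(percpu=True, interval=1)
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| busiest core: {max(per_core):.1f}%")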
Last edited by Stiopper; Mar 22, 2024 @ 8:31am

Date Posted: Mar 7, 2024 @ 8:19am
Posts: 24