RTX 3070 Ti not at 100% utilization when playing games.
I recently bought the EVGA XC3 Ultra RTX 3070 Ti, and honestly it is working just fine for the most part. My games reach 60 FPS at 4K on high settings (with the occasional dip to 50 or lower in some games). However, I noticed that the card sometimes sits at 50-80% utilization in certain games.
Halo Infinite uses about 80% on the main menu, but in-game it uses about 60%.
Fallout 4 and Elder Scrolls: Skyrim use 50-60%.
Halo Reach on the MCC uses about 30-40%.
The only game that actually uses 90-100% of the card is Rise of the Tomb Raider, and even then it dips to 80% in some areas.
The card wasn't overheating or anything; it reached about 70°C under full load. I suspect the power supply, because I have a Seasonic S12III-650 and the card's minimum power supply recommendation is 750 watts.
I'm not sure if the problem is that I swapped my previous GPU (a GT 1030) for this one without properly removing all of the leftover drivers, or if it's the 3070 Ti itself. I just want to know whether this is normal or not.
Here are my system specs:
i7-10700f
OLOy DDR4 RAM 32GB (4x8GB) Warhawk
Asus Z490M-Plus
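
(Editor's aside: if you want hard numbers rather than eyeballing an overlay, a small script like the sketch below can log utilization and temperature while a game runs. It uses the NVIDIA Management Library Python bindings; the package choice, device index 0, and one-second sampling are assumptions for illustration, not anything OP mentioned.)

```python
# Sketch: log GPU utilization and temperature once a second for a minute.
# Requires the NVML bindings: pip install nvidia-ml-py (imported as pynvml).
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetUtilizationRates, nvmlDeviceGetTemperature,
    NVML_TEMPERATURE_GPU,
)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)   # first GPU; adjust the index if needed
try:
    for _ in range(60):
        util = nvmlDeviceGetUtilizationRates(handle)  # .gpu and .memory are percentages
        temp = nvmlDeviceGetTemperature(handle, NVML_TEMPERATURE_GPU)
        print(f"GPU {util.gpu:3d}%  memory controller {util.memory:3d}%  {temp}°C")
        time.sleep(1)
finally:
    nvmlShutdown()
```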
In addition, while there are reasons you may want to disable v-sync (benchmarking, lower input latency/delay, or just a perception of increased smoothness), going above your refresh rate won't get you any actual additional frames, and you may experience screen tearing with no form of sync enabled.
Supafly 12 Feb 2022 at 4:40
Originally posted by Illusion of Progress:
In addition, while there are reasons you may want to disable v-sync (benchmarking, lower input latency/delay, or just a perception of increased smoothness), going above your refresh rate won't get you any actual additional frames, and you may experience screen tearing with no form of sync enabled.
You'd get extra frames rendered, just no extras displayed. For competitive play it can be good because the newer frame is more recent, whereas the previous frame is older and gets dropped.
ZeekAncient 12 Feb 2022 at 10:54
Originally posted by 🤪 Pocahawtness:
I don't know how it measures the percentage, but certainly much of the RTX series is dedicated to ray tracing, which may mean that normal games will never reach 100%. That would explain why Tomb Raider gets closer - that does use ray tracing, I believe.

Wow. This has literally nothing to do with GPU usage. RTX GPUs are capable of ray tracing, just like the AMD RX series, but you don't have to use ray tracing, and with ray tracing off the GPUs operate the same way GPUs have always operated. In fact, even with ray tracing on.

The reason he was seeing 100% GPU usage in Tomb Raider, even with Vsync on, is that it is a demanding game, especially at 4K. Getting 60 FPS in Tomb Raider at 4K meant needing everything the GPU had, hence 100% GPU usage. The other games were not as demanding, so getting and maintaining 60 FPS didn't require 100% of the GPU. However, remove Vsync, which was capping the games at 60, and voilà: 100% GPU usage. The FPS will now go as high as it can for that particular game.

Trust me, I don't care how demanding the game is. If you do not have a frame cap on and are allowing a game to give you the maximum frames it can, you should be getting 100% GPU usage. Unless, of course, you run into CPU limitations or bottlenecking. But at 4K, with a 10700F, that's not likely.
Last edited by ZeekAncient; 12 Feb 2022 at 11:43
ZeekAncient 12 Feb 2022 at 11:15
Originally posted by Supafly:
Originally posted by Illusion of Progress:
In addition, while there are reasons you may want to disable v-sync (benchmarking, lower input latency/delay, or just a perception of increased smoothness), going above your refresh rate won't get you any actual additional frames, and you may experience screen tearing with no form of sync enabled.
You'd get extra frames rendered, just no extras displayed. For competitive play it can be good because the newer frame is more recent, whereas the previous frame is older and gets dropped.

Exactly. While there is really no point in playing at 120 FPS on a 60Hz monitor, you can, and people do. You will still get 120 FPS, but you may experience screen tearing, and that 120 FPS will not be as smooth as on a 120Hz monitor. Even if the counter says 120 FPS, the experience is still essentially a 60 FPS experience. Some people just can't stand Vsync and need it off, even without Gsync or Freesync. Not me. On a 60Hz display, Vsync goes on and I have a constant 60 FPS experience. No more, no less, no tearing, and if the game holds a constant 60 FPS, input lag and stutter are minimized.

However, no matter what refresh rate you play at, you do not want Vsync or any type of frame cap on when running benchmarks. You want your GPU running at 100% so you see what it is capable of. So even if I play with Vsync in game, if the game has a benchmark, I will turn Vsync off before running it. Because if the game is capable of more than your monitor's refresh rate, all the benchmark will tell you for average frame rate is your monitor's refresh rate.

Also, adaptive sync technologies like Gsync and Freesync were created for high refresh rate monitors, 120Hz or above. The point was to avoid the stuttering and input lag you might get with Vsync, but also to eliminate tearing and make the experience smoother when the FPS drops below your monitor's refresh rate. With Gsync and a 144Hz monitor, for example, even if your GPU cannot render 144 FPS all the time in every game, dipping down to 100, 75, 60, or even 45 FPS will be a much smoother experience than without Gsync. And since not every GPU can maintain those high frame rates all the time, for a lot of people who want to play on a high refresh rate monitor, adaptive sync tech is a must and a godsend.

I know that if I am not playing on a Gsync- or Freesync-capable display, I need to have Vsync on. And I know people will play without Vsync even on displays without adaptive sync; I just can't do it. One, I hate screen tearing, and two, on that type of display, no matter the refresh rate, I will cap the FPS at 60. It keeps the experience constant and smooth with no tearing, and if the game can maintain 60 FPS all the time, stutter and input lag are minimized even with Vsync. But nowadays, if your display doesn't have adaptive sync like Gsync or Freesync, it is most likely 60Hz, not more. Most displays, even TVs, that run at 120Hz or more have Gsync and/or Freesync these days.
Last edited by ZeekAncient; 12 Feb 2022 at 11:18
Originally posted by Supafly:
Originally posted by Illusion of Progress:
In addition, while there are reasons you may want to disable v-sync (benchmarking, lower input latency/delay, or just a perception of increased smoothness), going above your refresh rate won't get you any actual additional frames, and you may experience screen tearing with no form of sync enabled.
You'd get extra frames rendered, just no extras displayed. For competitive play it can be good because the newer frame is more recent, whereas the previous frame is older and gets dropped.
Right, the frames are still processed by the GPU, but a display can only show up to its refresh rate.

OP seemingly thought the GPU wasn't "performing to capacity" and that v-sync was a "limiter" of sorts, so I wanted to explain what toggling it actually does, since OP didn't have a "problem" to begin with. Looking at utilization is the wrong way to look at it: a PC is a mix of parts, and the load on those parts varies and is never equal or constant, yet some people have the idea that any given part should always be at 100% whenever it is under some load or else something is wrong.

There are indeed reasons to disable syncing your frame rate, but it can also bring screen tearing as the one notable drawback, so I wanted to clarify that for OP.
ZeekAncient 12 Feb 2022 at 15:55
Originally posted by Illusion of Progress:
Right, the frames are still processed by the GPU, but a display can only show up to its refresh rate.

OP seemingly thought the GPU wasn't "performing to capacity" and that v-sync was a "limiter" of sorts, so I wanted to explain what toggling it actually does, since OP didn't have a "problem" to begin with. Looking at utilization is the wrong way to look at it: a PC is a mix of parts, and the load on those parts varies and is never equal or constant, yet some people have the idea that any given part should always be at 100% whenever it is under some load or else something is wrong.

There are indeed reasons to disable syncing your frame rate, but it can also bring screen tearing as the one notable drawback, so I wanted to clarify that for OP.

You are right but wrong at the same time. Vsync IS a limiter. If you have a 60Hz monitor and turn Vsync on, it will cap the FPS at 60. So, if it is not a demanding game, you will not see 100% GPU usage. Take Vsync off, no cap, and poof, GPU usage goes up to 100%.

I have been gaming a long time. I have had a lot of GPUs. Now you are right that usage is not a good way to judge all computer components, but it is a way to judge GPUs.

Every single GPU I have had: if there is no cap and Vsync is off, I get full 100% GPU usage and the most FPS I can for those games. Every single game.

Now, there are a few exceptions where GPU usage will dip below 100%: CPU-limited (CPU bottleneck) situations. But my systems are very well balanced and these are very rare. And like I said, it will go from about 99% to 93%, something like that, which is still pretty much full GPU usage.

That is the issue the OP was having. He had Vsync on and the games were capped at 60 FPS, so he wasn't getting full GPU usage and thought something was wrong. But nothing was wrong; those games just didn't need 100% (or 99%) GPU usage to maintain 60 FPS.

But like I said, I always monitor everything. GPU usage is one of the things I have on the LCD screen on one of my keyboards. When my FPS is not capped, the GPU will be at 99% usage all the time. If I cap the FPS or turn Vsync on, usage depends on how demanding the game is at that frame rate.

The only situation where GPU usage will not be at 100% with no cap and Vsync off is when CPU bottlenecks are involved. But for me that is very, very rare.

So if someone has a "real" CPU bottleneck, like an old Intel Core 2 Duo hooked up to a 3080, I don't care if you have an FPS cap on, Vsync on, no FPS cap, Vsync off, whatever: you will be lucky to get 50% GPU usage out of that card, but you will see CPU usage at 100% all the time. Probably fry the darn thing.
Last edited by ZeekAncient; 12 Feb 2022 at 18:59
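
(Editor's aside: the relationship ZeekAncient describes can be put in rough numbers. With a frame cap, steady-state GPU utilization is approximately the GPU's render time divided by the frame budget the cap allows; uncapped, the card renders back to back. The millisecond figures below are hypothetical, chosen only to mirror the thread's examples.)

```python
# Rough model of GPU utilization under a frame cap (illustrative, not measured).
def approx_gpu_utilization(render_ms, cap_fps=None):
    """Estimate steady-state GPU utilization in percent."""
    if cap_fps is None:                 # uncapped: frames render back to back
        return 100.0
    budget_ms = 1000.0 / cap_fps        # e.g. ~16.7 ms per frame at a 60 FPS cap
    return min(100.0, 100.0 * render_ms / budget_ms)

print(approx_gpu_utilization(8.0, 60))    # light game at a 60 FPS cap -> ~48%, looks "underused"
print(approx_gpu_utilization(16.0, 60))   # heavy game (Tomb Raider-like) -> ~96%, near full usage
print(approx_gpu_utilization(8.0, None))  # cap removed -> 100%, the FPS climbs instead
```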
hawkeye 12 Feb 2022 at 18:34
Max FPS is determined by the CPU. The GPU can only equal that FPS or reduce it. Settings like Vsync can limit FPS.

As modern CPUs are usually relatively stronger than GPUs, you probably have a limiter like Vsync enabled.

FO4 and SSE have built-in Vsync, which is enabled when iPresentInterval = 1 in the .ini files. Turn it off by setting it to 0, and cap the FPS in the control panel if necessary for better performance (not higher FPS). Uncapped, anything faster than 60 FPS needs a mod to stop the game from misbehaving.
Last edited by hawkeye; 12 Feb 2022 at 18:35
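
(Editor's aside: a sketch of flipping that .ini flag programmatically. The file path, section name, and key below are assumptions based on common Fallout 4 guides; for Skyrim SE the key is usually iVSyncPresentInterval. Check your own install and back the file up before writing to it.)

```python
# Sketch: set iPresentInterval = 0 in Fallout4Prefs.ini (assumed path and key).
import configparser
from pathlib import Path

INI_PATH = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4Prefs.ini"  # assumed
SECTION, KEY = "Display", "iPresentInterval"   # assumed section and key names

config = configparser.ConfigParser(strict=False, interpolation=None)  # tolerate duplicates and '%'
config.optionxform = str                       # keep Bethesda's mixed-case key names as-is
config.read(INI_PATH)

if not config.has_section(SECTION):
    config.add_section(SECTION)
config.set(SECTION, KEY, "0")                  # 0 = engine Vsync off, 1 = on

with open(INI_PATH, "w") as f:
    config.write(f)
print(f"{KEY} set to 0 in {INI_PATH}")
```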
ZeekAncient 12 Feb 2022 at 19:13
If you want to see some CPU bottlenecks, check out this video. In most of the games there is no issue: 99-100% GPU usage whether you are playing at 1080p, 1440p, or 4K. But in certain CPU-intensive games, like Mafia, Flight Simulator, GTA 5 to some extent, Far Cry 5, etc., even a 10700K bottlenecks the 3080 at 1080p, and you are not seeing 99-100% GPU usage, even at 1440p in some cases. At 4K, though, it's 99-100% GPU usage across the board.

https://www.youtube.com/watch?v=yH2nX4giIc4&t=1039s
Last edited by ZeekAncient; 12 Feb 2022 at 19:18
Pocahawtness 12 Feb 2022 at 22:34
Originally posted by ZeekAncient:

Wow. This has literally nothing to do with GPU usage. RTX GPUs are capable of ray tracing, just like the AMD RX series, but you don't have to use ray tracing, and with ray tracing off the GPUs operate the same way GPUs have always operated. In fact, even with ray tracing on.

The reason he was seeing 100% GPU usage in Tomb Raider, even with Vsync on, is that it is a demanding game, especially at 4K. Getting 60 FPS in Tomb Raider at 4K meant needing everything the GPU had, hence 100% GPU usage. The other games were not as demanding, so getting and maintaining 60 FPS didn't require 100% of the GPU. However, remove Vsync, which was capping the games at 60, and voilà: 100% GPU usage. The FPS will now go as high as it can for that particular game.

Trust me, I don't care how demanding the game is. If you do not have a frame cap on and are allowing a game to give you the maximum frames it can, you should be getting 100% GPU usage. Unless, of course, you run into CPU limitations or bottlenecking. But at 4K, with a 10700F, that's not likely.


I have a 3070 Ti and a 4K 144Hz monitor, and I can tell you ray tracing most certainly does have a lot to do with GPU usage. I see exactly the same thing as OP does: the GPU will go nowhere near 100% unless I play something like Quake RTX.
It's nothing to do with vsync either. Firstly, I never use vsync, and secondly, in graphics-intensive games it won't even reach the monitor's top refresh rate, yet the CPU and GPU are still way under 100%. Introduce RT, though, and that percentage starts to climb.
I am not saying I know all the answers, but the situation seems to be more complicated than it appears. So no, I don't trust you! Lol.
Last edited by Pocahawtness; 12 Feb 2022 at 22:38
ZeekAncient 12 Feb 2022 at 22:39
Originally posted by 🤪 Pocahawtness:
Originally posted by ZeekAncient:

Wow. This has literally nothing to do with GPU usage. RTX GPUs are capable of ray tracing, just like the AMD RX series, but you don't have to use ray tracing, and with ray tracing off the GPUs operate the same way GPUs have always operated. In fact, even with ray tracing on.

The reason he was seeing 100% GPU usage in Tomb Raider, even with Vsync on, is that it is a demanding game, especially at 4K. Getting 60 FPS in Tomb Raider at 4K meant needing everything the GPU had, hence 100% GPU usage. The other games were not as demanding, so getting and maintaining 60 FPS didn't require 100% of the GPU. However, remove Vsync, which was capping the games at 60, and voilà: 100% GPU usage. The FPS will now go as high as it can for that particular game.

Trust me, I don't care how demanding the game is. If you do not have a frame cap on and are allowing a game to give you the maximum frames it can, you should be getting 100% GPU usage. Unless, of course, you run into CPU limitations or bottlenecking. But at 4K, with a 10700F, that's not likely.


I have a 3070 Ti and a 4K 144Hz monitor, and I can tell you ray tracing most certainly does have a lot to do with GPU usage. I see exactly the same thing as OP does: the GPU will go nowhere near 100% unless I play something like Quake RTX.
It's nothing to do with vsync either. Firstly, I never use vsync, and secondly, in graphics-intensive games it won't even reach the monitor's top refresh rate, yet the CPU and GPU are still way under 100%. Introduce RT, though, and that percentage starts to climb.

I have a 3070 Ti also. What CPU are you using and what resolution are you playing at? Because I can guarantee that if Vsync is off and there is no cap, usage will be at 100%. It has nothing to do with ray tracing. If you are not getting 100% GPU usage, then you are running into a CPU bottleneck.

Watch this video:

https://www.youtube.com/watch?v=yH2nX4giIc4&t=1039s

Look at the GPU usage at the different resolutions. At 1080p, the 10700K bottlenecks the 3080 in some games.

So, no offense, whether you trust me or not, you are completely wrong.

I monitor usage every single second of my gameplay. Have been doing it for years. If I have Vsync off, then usage is at 100%. I use a 3070 Ti. If this is not the case for you, then you are at 1080p and have a CPU bottleneck.

What resolution do you play at and what CPU do you have?
Last edited by ZeekAncient; 12 Feb 2022 at 22:43
Pocahawtness 12 Feb 2022 at 22:41
Originally posted by ZeekAncient:
Originally posted by 🤪 Pocahawtness:


I have a 3070 Ti and a 4K 144Hz monitor, and I can tell you ray tracing most certainly does have a lot to do with GPU usage. I see exactly the same thing as OP does: the GPU will go nowhere near 100% unless I play something like Quake RTX.
It's nothing to do with vsync either. Firstly, I never use vsync, and secondly, in graphics-intensive games it won't even reach the monitor's top refresh rate, yet the CPU and GPU are still way under 100%. Introduce RT, though, and that percentage starts to climb.

I have a 3070 Ti also. What CPU are you using and what resolution are you playing at? Because I can guarantee that if Vsync is off and there is no cap, usage will be at 100%. It has nothing to do with ray tracing. If you are not getting 100% GPU usage, then you are running into a CPU bottleneck.

3840x1600.

I don't use vsync.

I have a 12700K and it is definitely not bottlenecking.
Last edited by Pocahawtness; 12 Feb 2022 at 22:42
ZeekAncient 12 Feb 2022 at 22:44
Originally posted by 🤪 Pocahawtness:
Originally posted by ZeekAncient:

I have a 3070 Ti also. What CPU are you using and what resolution are you playing at? Because I can guarantee that if Vsync is off and there is no cap, usage will be at 100%. It has nothing to do with ray tracing. If you are not getting 100% GPU usage, then you are running into a CPU bottleneck.

3840x1600.

I don't use vsync.

I have a 12700K and it is definitely not bottlenecking.

Then something else is wrong, because without Vsync or an FPS cap you should be getting 100% GPU usage. Trust me. Watch the video. Either you aren't being honest or something is wrong.

And the OP has already stated that once he turned Vsync off, he got 100% GPU usage. So you must have an FPS cap on or something.

Because, no I don't trust you! (That is what you told me, so, lol)

Also, you said you had a 4K monitor. 3840 x 1600 is not 4K.

Also, don't misunderstand what I'm saying. Ray tracing is demanding and definitely contributes to GPU usage, but what I mean is that a game doesn't have to have ray tracing, or have it enabled, to hit 100% GPU usage. I have a lot of games, like Halo Infinite, that don't use ray tracing, and when Vsync is off, GPU usage is at 100%. In fact, every single one of my games will shoot up to 100% GPU usage with Vsync off.

Go to the Nvidia Control Panel and make sure your power management mode is set to Prefer Maximum Performance. That might be what is causing the issue right there.

And just to be clear, CPU usage shouldn't be at 100%, but GPU usage should be. But if CPU usage is getting close to 100% and you aren't getting close to 100% GPU usage, then you are running into a bottleneck.
Last edited by ZeekAncient; 13 Feb 2022 at 0:43
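
(Editor's aside: one way to check whether the card is actually boosting, rather than sitting in a low power state, is to poll nvidia-smi, which ships with the NVIDIA driver, while the game is running. The query fields and the one-second interval here are just illustrative choices.)

```python
# Sketch: sample performance state, clocks, utilization and power draw ten times.
import subprocess
import time

QUERY = "pstate,clocks.gr,clocks.mem,utilization.gpu,power.draw"

for _ in range(10):
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())   # e.g. "P0, 1905 MHz, 9501 MHz, 97 %, 280.41 W"
    time.sleep(1)
```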
hawkeye 12 Feb 2022 at 23:38
Originally posted by 🤪 Pocahawtness:
Originally posted by ZeekAncient:

I have a 3070 Ti also. What CPU are you using and what resolution are you playing at? Because I can guarantee that if Vsync is off and there is no cap, usage will be at 100%. It has nothing to do with ray tracing. If you are not getting 100% GPU usage, then you are running into a CPU bottleneck.

3840x1600.

I don't use vsync.

I have a 12700K and it is definitely not bottlenecking.

How do you check to see if it is bottlenecking?
Originally posted by ZeekAncient:
You are right but wrong at the same time. Vsync IS a limiter. If you have a 60Hz monitor and turn Vsync on, it will cap the FPS at 60. So, if it is not a demanding game, you will not see 100% GPU usage. Take Vsync off, no cap, and poof, GPU usage goes up to 100%.
I'm well aware of that.

When I said "limiter" I meant it in the way I was describing. I got the feeling OP might have thought v-sync was a "reduce GPU potential" toggle, because they claimed turning it off fixed the issue, when v-sync is its own setting with its own purpose.
ZeekAncient 13 Feb 2022 at 0:21
Originally posted by Illusion of Progress:
Originally posted by ZeekAncient:
You are right but wrong at the same time. Vsync IS a limiter. If you have a 60Hz monitor and turn Vsync on, it will cap the FPS at 60. So, if it is not a demanding game, you will not see 100% GPU usage. Take Vsync off, no cap, and poof, GPU usage goes up to 100%.
I'm well aware of that.

When I said "limiter" I meant it in the way I was describing. I got the feeling OP might have thought v-sync was a "reduce GPU potential" toggle, because they claimed turning it off fixed the issue, when v-sync is its own setting with its own purpose.

Honestly, I kind of figured that was what you were trying to say. I was a little confused by how you wrote it, so I wasn't sure how to take it and just left what I wrote the way it was. At one point I had changed my post and said "my bad", but then I left it the way it was, lol. It's all good.

But what he meant by turning Vsync off "fixing" the issue is that he thought something was wrong when he wasn't getting 100% GPU usage. That was just because Vsync was limiting his FPS to 60; Tomb Raider was the only game demanding enough to need 100% usage at 60 FPS. When he turned Vsync off, it "fixed" his issue in that he was now getting full GPU usage.

But I think you know that, lol; again, I was just confused by how you wrote it.
Last edited by ZeekAncient; 13 Feb 2022 at 0:25

Date Posted: 11 Feb 2022 at 7:55
Posts: 30