The Witcher 3: Wild Hunt

sbally30 22 Dec 2022 at 10:46 a.m.
HOTFIX Pt-2 finally getting 99% GPU Usage.
When the NG update was released I had low GPU usage with high temps at 82°C. Hotfix 1 brought the temps down to 78°C with 56-86% GPU usage in DX12. Now with Hotfix 2 my GPU usage is 100%, but my temps are at 68-72°C. CPU usage with the new hotfix is 2%. I don't know what to make of this. FPS between Hotfix 1 and 2 is exactly the same, lower than before the hotfixes...
Showing 31-45 of 60 comments
בתאניה 25 Dec 2022 at 9:41 p.m.
Originally posted by wingback18:
Originally posted by בתאניה:
The second hotfix broke RT with my AMD GPU. It went from this: https://www.youtube.com/watch?v=JIJbBDCwCZ4 to unplayable at the same settings. Now I have to say goodbye to RT, lol. Fun while it lasted.

7700X
32 GB DDR5 6600
6950XT, stock, OCed with +20% on the power slider

At what resolution are you playing?
Mine doesn't break 30 without FSR.

3440x1440... but sadly, as I said, it hasn't let me play at those frames since the last update.
Tech Enthusiast 25 Dec 2022 at 9:56 p.m.
Originally posted by sbally30:
This is strange. I decided to test with RTX on and I'm getting a max of 80 FPS at 1440p with reflections off, 99% GPU usage, but with temps at 74°C now. It does seem this new hotfix has increased FPS with ray tracing on. With it off there is no difference. (Haven't tested DX11.)

To be expected, because the RT features pushed people's CPUs to their limits.
Now with slightly reduced calls (read: they dumbed down the settings a little), CPUs no longer hit the bottleneck, and GPUs can now be the bottleneck.

People seem to hate CPU bottlenecks and love GPU bottlenecks... no idea why, but that is basically most of the complaining on the forums. One of them WILL be the limiting factor; nothing anyone can do about that, unless you have enough performance to run into... the engine bottleneck. ;-)
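The limiting-factor argument can be sketched as a toy model (all timings below are hypothetical, just for illustration): per frame, the slower of the two stages sets the frame time, and the faster one sits partially idle, which is exactly what sub-100% utilization measures.

```python
# Toy model of the CPU-vs-GPU bottleneck argument (all timings hypothetical).
# Per frame, the slower stage dominates; the faster one waits, which shows
# up in monitoring tools as sub-100% utilization.

def frame_stats(cpu_ms: float, gpu_ms: float):
    """Return (fps, cpu_util, gpu_util) for one simulated frame."""
    frame_ms = max(cpu_ms, gpu_ms)      # slowest stage sets frame time
    return (1000.0 / frame_ms,
            cpu_ms / frame_ms,          # fraction of the frame the CPU is busy
            gpu_ms / frame_ms)          # fraction of the frame the GPU is busy

# CPU-bound frame: GPU finishes early and waits.
fps, cpu, gpu = frame_stats(cpu_ms=16.0, gpu_ms=8.0)
print(f"CPU-bound: {fps} fps, CPU {cpu:.0%}, GPU {gpu:.0%}")

# GPU-bound frame: CPU finishes early and waits.
fps, cpu, gpu = frame_stats(cpu_ms=4.0, gpu_ms=12.5)
print(f"GPU-bound: {fps} fps, CPU {cpu:.0%}, GPU {gpu:.0%}")
```

Either way one component waits for the other; the only question is which.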
sbally30 26 Dec 2022 at 2:06 p.m.
Originally posted by Tech Enthusiast:
Originally posted by sbally30:
This is strange. I decided to test with RTX on and I'm getting a max of 80 FPS at 1440p with reflections off, 99% GPU usage, but with temps at 74°C now. It does seem this new hotfix has increased FPS with ray tracing on. With it off there is no difference. (Haven't tested DX11.)

To be expected, because the RT features pushed people's CPUs to their limits.
Now with slightly reduced calls (read: they dumbed down the settings a little), CPUs no longer hit the bottleneck, and GPUs can now be the bottleneck.

People seem to hate CPU bottlenecks and love GPU bottlenecks... no idea why, but that is basically most of the complaining on the forums. One of them WILL be the limiting factor; nothing anyone can do about that, unless you have enough performance to run into... the engine bottleneck. ;-)

You don't get it? The game is poorly optimized. Instead of venting your issues on the community, why not let the developers know your frustration?
The game runs exactly the same as before, just with less FPS after the hotfix. GPU usage maxes at 86% and hovers in the low 70s; only in one strange field did the GPU hit 100%. Meanwhile CPU usage, from before the hotfix to now, has always been under 7%, hovering at 2-3% most of the time.

The reason people hate CPU bottlenecks is when the GPU is not being fully utilized. Example: a game using 100% CPU with only 50% GPU means something is wrong, especially if you can't hit a target FPS. Let's hope you can now understand your pointless post. LOL
Tech Enthusiast 26 Dec 2022 at 2:14 p.m.
Originally posted by sbally30:
snip
Are you sure you are quoting the correct person?
There is no frustration on my part. I'm running the game maxed out without any issues.

But why is a 50% bottlenecked GPU worse than a 50% bottlenecked CPU?
Both need to do their work for a target framerate. If the CPU holds the GPU back, you get fewer frames. If the GPU holds the CPU back, you also get fewer frames... you need both. *shrug*
Equation™ 26 Dec 2022 at 2:30 p.m.
Originally posted by Tech Enthusiast:
Originally posted by sbally30:
snip
Are you sure you are quoting the correct person?
There is no frustration on my part. I'm running the game maxed out without any issues.

But why is a 50% bottlenecked GPU worse than a 50% bottlenecked CPU?
Both need to do their work for a target framerate. If the CPU holds the GPU back, you get fewer frames. If the GPU holds the CPU back, you also get fewer frames... you need both. *shrug*

Huh...? But if the game is under-optimized and somehow only using 50-60% of your GPU, how is it the "GPU that needs to do its work," as you put it? The power of the GPU is there, but the poor optimization means the game isn't fully utilizing the graphics card (or CPU).

For example, I run a 3080 and an i9 10900KF. While playing this game it uses only 60% of my GPU and a tiny 2% of my CPU (I'll let that slide, as not many games need much of my CPU), but this in turn gives me much lower performance than I should be seeing.
You say you are running the game maxed out without any issues... So you must be running it at 4K, Ultra + RT, all ray-tracing settings maxed, etc., getting a constant v-synced 60 FPS with no stutters or crashes. What hardware are you running this on, good sir?
Tech Enthusiast 26 Dec 2022 at 2:41 p.m.
Originally posted by Equation™:
Originally posted by Tech Enthusiast:
Are you sure you are quoting the correct person?
There is no frustration on my part. I'm running the game maxed out without any issues.

But why is a 50% bottlenecked GPU worse than a 50% bottlenecked CPU?
Both need to do their work for a target framerate. If the CPU holds the GPU back, you get fewer frames. If the GPU holds the CPU back, you also get fewer frames... you need both. *shrug*

Huh...? But if the game is under-optimized and somehow only using 50-60% of your GPU, how is it the "GPU that needs to do its work," as you put it? The power of the GPU is there, but the poor optimization means the game isn't fully utilizing the graphics card (or CPU).

For example, I run a 3080 and an i9 10900KF. While playing this game it uses only 60% of my GPU and a tiny 2% of my CPU (I'll let that slide, as not many games need much of my CPU), but this in turn gives me much lower performance than I should be seeing.
You say you are running the game maxed out without any issues... So you must be running it at 4K, Ultra + RT, all ray-tracing settings maxed, etc., getting a constant v-synced 60 FPS with no stutters or crashes. What hardware are you running this on, good sir?

Your stated usage does not make sense, and frankly I would not buy it without seeing a screenshot.
The game does not "use" your hardware AT ALL. Game code is NOT built around using your hardware. You can adjust some calls to be easier on the hardware or not, but developers have zero direct control over your GPU running at 60%.

You can only have the following situations:
* GPU and CPU are sub-100% (CPU meaning EACH CORE, not the overall average!): only possible with an FPS cap that does not need all your hardware.
* GPU is at 100%, CPU not: only possible if the game wants more work from your GPU than from your CPU, and your CPU has to wait for your GPU to finish its work.
* CPU is at 100% (or at least one core is), GPU not: only possible if the game wants more work from your CPU than it can deliver, and your GPU has to wait for the CPU to finish.
* Bonus edge case: the game engine is at ITS limit. This can also lead to sub-100% usage, if you put more hardware to work than is needed to reach that limit. You can usually trigger it at something like 480p, or even 720p on mid-range GPUs/CPUs.

So if you really have 2% CPU usage and 60% GPU usage, I am sure the developers would LOVE a screenshot and an explanation, because that should not happen unless you set something like a 10 FPS limit or run a quantum PC from IBM.

As for my specs: 5950X, undervolted at 3400 base clock, and a 4090 undervolted to 0.85V@2400MHz. Both set up to save energy/heat/noise, since the game does not need full blast to reach 60 FPS.
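The four situations listed can be turned into a rough decision rule. A minimal sketch, assuming you sample the busiest single core rather than the overall CPU average (the 95% threshold is a hypothetical cutoff, not from any real tool):

```python
# Rough classifier for the bottleneck situations described above.
# Thresholds are hypothetical; real monitoring tools (HWiNFO, RTSS, etc.)
# report per-core CPU usage, which is what max_core_util should be.

def classify(gpu_util: float, max_core_util: float, fps_capped: bool) -> str:
    """gpu_util and max_core_util are fractions in [0, 1];
    max_core_util is the usage of the single busiest CPU core."""
    busy = 0.95                      # treat >= 95% as "pegged"
    if gpu_util >= busy:
        return "GPU-bound"
    if max_core_util >= busy:
        return "CPU-bound (one core pegged)"
    if fps_capped:
        return "frame-rate cap"
    return "engine/other limit (neither pegged, no cap)"

print(classify(0.99, 0.60, False))   # a clear GPU bottleneck
print(classify(0.60, 0.98, False))   # one core pegged, GPU waiting
```

This is also why "2% overall CPU" is misleading on a 16-thread part: one pegged core out of sixteen still averages out to a small overall number.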
sbally30 26 Dec 2022 at 4:17 p.m.
Originally posted by Tech Enthusiast:
Originally posted by sbally30:
snip
Are you sure you are quoting the correct person?
There is no frustration on my part. I'm running the game maxed out without any issues.

But why is a 50% bottlenecked GPU worse than a 50% bottlenecked CPU?
Both need to do their work for a target framerate. If the CPU holds the GPU back, you get fewer frames. If the GPU holds the CPU back, you also get fewer frames... you need both. *shrug*

GPUs process certain workloads faster than CPUs: a CPU core is a general-purpose computational unit that can execute dozens of different tasks, while a GPU was designed to accelerate graphics rendering. When you factor in how many applications and background tasks a CPU runs while you game, why would you ever want your CPU usage higher than your GPU's? Once a CPU reaches 100% usage everything slows down; it may even start paging system data to the HDD, slowing your PC further. But if your GPU reaches 100%, your PC is still responsive.
Consider the die size of a CPU vs. a GPU: GPUs are designed to withstand heat better than CPUs and to run at 100% usage for days, while a CPU would most likely see degradation under that much heat over time. Granted, it will most likely throttle itself before that happens, but the CPU is still more susceptible to degradation than a GPU.
Last edited by sbally30; 26 Dec 2022 at 6:13 p.m.
sbally30 26 Dec 2022 at 5:40 p.m.
Originally posted by Tech Enthusiast:
Your stated usage does not make sense, and frankly I would not buy it without seeing a screenshot.
The game does not "use" your hardware AT ALL. Game code is NOT built around using your hardware. You can adjust some calls to be easier on the hardware or not, but developers have zero direct control over your GPU running at 60%.

You can only have the following situations:
* GPU and CPU are sub-100% (CPU meaning EACH CORE, not the overall average!): only possible with an FPS cap that does not need all your hardware.
* GPU is at 100%, CPU not: only possible if the game wants more work from your GPU than from your CPU, and your CPU has to wait for your GPU to finish its work.
* CPU is at 100% (or at least one core is), GPU not: only possible if the game wants more work from your CPU than it can deliver, and your GPU has to wait for the CPU to finish.
* Bonus edge case: the game engine is at ITS limit. This can also lead to sub-100% usage, if you put more hardware to work than is needed to reach that limit. You can usually trigger it at something like 480p, or even 720p on mid-range GPUs/CPUs.

So if you really have 2% CPU usage and 60% GPU usage, I am sure the developers would LOVE a screenshot and an explanation, because that should not happen unless you set something like a 10 FPS limit or run a quantum PC from IBM.

As for my specs: 5950X, undervolted at 3400 base clock, and a 4090 undervolted to 0.85V@2400MHz. Both set up to save energy/heat/noise, since the game does not need full blast to reach 60 FPS.

Witcher 3 NG is only using 2 cores? Making all the CPUs of the past decade with 6, 12, etc. cores useless. How do you expect the community not to blame the developers when CDPR chose to use ray tracing + DX12 + DLSS + FSR in its remastered version of Witcher 3, yet didn't think to code the game to use more than 2 cores? They know full well that the GPU has to wait for the CPU, leaving everyone's GPU under-utilized: basically crippling the GPU in a waiting game with the CPU, while the byproduct tanks your frame rate.

This in turn helps promote frame generation as the solution, giving consumers the choice of either buying a 40-series GPU for a stable frame rate or living with mediocre performance.
So yes, I do blame the developers for promoting a stupidly priced 4090 GPU as the only way to play this game at higher FPS with RT on.

Cyberpunk's launch cost CDPR 75% of its share value. Look how many years were spent developing this free update; imagine the costs. Game developers, Nvidia, and AMD work together to optimize games; that's why you get a driver update for some of your games from Nvidia/AMD on release day.

The simple facts are: after updating to the latest driver, no matter what resolution I use (1080p, 1440p, 4K), I can't get a stable 60 FPS with ray tracing on. It's always 55-57 FPS. In some areas I do get a higher frame rate for a spell, but mostly it's meh. Now, was this deliberate? I think so. Because it's a free update, CDPR can get away with it, and they're counting on that. Most likely Nvidia made them an offer and the rest is history. Hopefully the game gets fixed in a few months, depending on how long the contract runs... LOL
Last edited by sbally30; 26 Dec 2022 at 6:17 p.m.
Tech Enthusiast 26 Dec 2022 at 6:46 p.m.
Originally posted by sbally30:
Witcher 3 NG is only using 2 cores? Making all the CPUs of the past decade with 6, 12, etc. cores useless. How do you expect the community not to blame the developers when CDPR chose to use ray tracing + DX12 + DLSS + FSR in its remastered version of Witcher 3, yet didn't think to code the game to use more than 2 cores?

Yet more false information.
The game uses all cores; it just uses a few of them (more than 2) much more heavily. Which is kind of an RT thing you can't just fix by waving a magic wand. Some work simply has to run serially, and nothing can be done about that. They can let everything else run on different threads, but if the one thing that needs to run serially isn't done, the rest has to wait.

But the key takeaway here:
Why do you keep making accusations that can be disproven with a 10-second Google search and tens of thousands of screenshots showing 6-, 8-, 12-, 16-, and even 32-core systems with all cores used at 20-60% and a few more nearing 100%?
I mean, if you want to mindlessly bash, at least put in the effort to make sure your "argument" isn't invalidated within 10 seconds by anyone who can type into Google.
Last edited by Tech Enthusiast; 26 Dec 2022 at 7:06 p.m.
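The "some work has to run serially" point is essentially Amdahl's law: the serial fraction of per-frame work caps how much adding cores can help, no matter how many you have. A quick back-of-the-envelope sketch (the 25% serial fraction is hypothetical, purely for illustration):

```python
# Amdahl's law: speedup from n cores when a fraction s of the per-frame
# work is inherently serial. The serial fraction used here is hypothetical.

def speedup(serial_fraction: float, cores: int) -> float:
    """Ideal speedup over a single core, given a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Suppose 25% of frame work (e.g. some RT setup step) must run serially:
for n in (2, 8, 16):
    print(f"{n:2d} cores -> {speedup(0.25, n):.2f}x")
# No core count can exceed 1 / 0.25 = 4x with a 25% serial fraction.
```

This is why "more cores would fix it" has a hard ceiling once any part of the frame is serial.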
n050up4u 26 Dec 2022 at 7:07 p.m.
Originally posted by Maverick81PL:
Originally posted by ScienceDiscoverer:
Guys, just mod your GPU to use a high-end CPU cooler. You will never get a temp above 55°C under 100% constant load.
Modern GPUs have good cooling; my 2070S hits max 75°C at 100% load ;-)

Well... modern Nvidia, maybe. I kept reading articles about Radeon 7000 GPUs running consistently above 90°C "by design". Sad stuff.

For the other folks: with HairWorks off (it looks messed up in certain lighting scenarios) I'm averaging 75% or more GPU usage, and running around in forests FPS is consistently above 90 with all the rays being traced.

The GPU isn't going above 67°C though. 4090 TUF OC; GPU clock seems to stay locked at 2730.
Playing on a Ryzen 5900X, and CPU temp isn't going above 40°C.
sbally30 26 Dec 2022 at 9:13 p.m.
Originally posted by Tech Enthusiast:
Originally posted by sbally30:
Witcher 3 NG is only using 2 cores? Making all the CPUs of the past decade with 6, 12, etc. cores useless. How do you expect the community not to blame the developers when CDPR chose to use ray tracing + DX12 + DLSS + FSR in its remastered version of Witcher 3, yet didn't think to code the game to use more than 2 cores?

Yet more false information.
The game uses all cores; it just uses a few of them (more than 2) much more heavily. Which is kind of an RT thing you can't just fix by waving a magic wand. Some work simply has to run serially, and nothing can be done about that. They can let everything else run on different threads, but if the one thing that needs to run serially isn't done, the rest has to wait.

But the key takeaway here:
Why do you keep making accusations that can be disproven with a 10-second Google search and tens of thousands of screenshots showing 6-, 8-, 12-, 16-, and even 32-core systems with all cores used at 20-60% and a few more nearing 100%?
I mean, if you want to mindlessly bash, at least put in the effort to make sure your "argument" isn't invalidated within 10 seconds by anyone who can type into Google.
The game is single-core intensive; it uses the other cores minimally. How is that false information? I can only speak from my own experience, in which I'm getting 3% CPU usage in this supposedly CPU-intensive game on my 5800X. Even without RT on it's still 3%. Do you want me to send a pic to CDPR too? Apparently they'd love it! LOL
All you're doing is nitpicking, and you still don't understand.

Cyberpunk has native DX12 support, and with RT enabled it uses all cores productively, because the RED Engine can do DX12 natively. Guess no wand is needed, just hard work, which CDPR failed at miserably. They chose the lazy route: over 2 years for this update, where DX12 was mashed on top of DX11.
When your GPU usage is below 90%, it's because of a CPU bottleneck, but at 70% GPU usage? Yes, that's a sign something is very wrong. The culprit is most likely poor CPU core utilization. In my case at 4K, as so many others have reported on other forums, CPU usage is very low in this NG update.

FPS at 4K and 1440p, DLSS Ultra Performance vs. Ultra Quality, stays the same, meaning the game isn't using the CPU correctly, which is the bottleneck.
GOOGLE THAT. LOL
Last edited by sbally30; 26 Dec 2022 at 9:35 p.m.
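The resolution test described here is a standard way to diagnose a CPU (or engine) limit: dropping the render resolution shrinks GPU work per frame but leaves CPU work roughly unchanged, so if FPS doesn't move, the GPU wasn't the limit. A sketch of that reasoning with hypothetical timings chosen to mirror the ~57 FPS figure in the thread:

```python
# Diagnose a CPU/engine limit by varying render resolution.
# Assumption: GPU time scales roughly with pixel count, CPU time does not.
# All timings are hypothetical, picked only to illustrate the reasoning.

def fps_at(pixels: int, cpu_ms: float, gpu_ms_per_mpix: float) -> float:
    """FPS given a fixed CPU cost and a GPU cost proportional to pixels."""
    gpu_ms = gpu_ms_per_mpix * pixels / 1e6
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 17.5   # fixed per-frame CPU cost (hypothetical)
for name, px in [("4K", 3840 * 2160), ("1440p", 2560 * 1440), ("1080p", 1920 * 1080)]:
    print(f"{name}: {fps_at(px, CPU_MS, 1.8):.0f} fps")
# If all three resolutions print the same ~57 fps, lowering resolution
# freed up GPU time but FPS didn't move: the CPU (or engine) is the limit.
```

The same test works with DLSS presets, since Ultra Performance vs. Quality is effectively a render-resolution change.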
Tech Enthusiast 27 Dec 2022 at 2:20 a.m.
Originally posted by sbally30:
The game is single-core intensive; it uses the other cores minimally. How is that false information?

Single-core intensive: not false. But that's true for every RT game, and it won't change (until we get quantum computing in our home systems).
"Uses the other cores minimally": flat-out wrong. 20-60% is not "minimal"; it's about as much as 9 out of 10 games manage at best, if at all.

W3 is doing an amazing multi-core job compared to the average game out there. Very few games do better.

Could they do better? Probably.
But your initial statement was "they only use two cores," which is flat-out false. Then you walked it back to "they only use the other cores minimally," which is also false.

If you really have ~2% usage, you should send in proof, since all the proof I have seen does not support your claims at all. No need to post it here; we can't do anything about it if you are hitting an edge-case bug or whatever. Send it to the developers and include all your specs and settings so they can try to reproduce it.
Robbiee 27 Dec 2022 at 2:36 a.m.
Originally posted by sbally30:
The 5800X has the worst temps even water-cooled. I get 76°C at PC startup, then it drops and stabilizes at 55°C, but once I open Google it shoots up to 80°C, then back down.
lol... you blame it on the CPU, but that doesn't mean it isn't user error. I've got a 5800X and my temps are not this bad. You should reseat the CPU, make sure the thermal paste is applied properly, and make sure your cooler is a good one.
sbally30 27 Dec 2022 at 3:24 a.m.
Originally posted by Robbiee:
Originally posted by sbally30:
The 5800X has the worst temps even water-cooled. I get 76°C at PC startup, then it drops and stabilizes at 55°C, but once I open Google it shoots up to 80°C, then back down.
lol... you blame it on the CPU, but that doesn't mean it isn't user error. I've got a 5800X and my temps are not this bad. You should reseat the CPU, make sure the thermal paste is applied properly, and make sure your cooler is a good one.
Blame what on the CPU? Also, this is normal behavior for 5800X CPUs.
sbally30 27 Dec 2022 at 3:58 a.m.
Originally posted by Tech Enthusiast:
Originally posted by sbally30:
The game is single-core intensive; it uses the other cores minimally. How is that false information?

Single-core intensive: not false. But that's true for every RT game, and it won't change (until we get quantum computing in our home systems).
"Uses the other cores minimally": flat-out wrong. 20-60% is not "minimal"; it's about as much as 9 out of 10 games manage at best, if at all.

W3 is doing an amazing multi-core job compared to the average game out there. Very few games do better.

Could they do better? Probably.
But your initial statement was "they only use two cores," which is flat-out false. Then you walked it back to "they only use the other cores minimally," which is also false.

If you really have ~2% usage, you should send in proof, since all the proof I have seen does not support your claims at all. No need to post it here; we can't do anything about it if you are hitting an edge-case bug or whatever. Send it to the developers and include all your specs and settings so they can try to reproduce it.

Witcher 3 is single-thread intensive, so where are you seeing 60% or even 100% CPU usage in Witcher 3 NG at 4K? I wouldn't mind seeing that data for myself.

In my case, and for many others on numerous forums, we are seeing less than 10% CPU usage, mostly hovering around 3-7%, with RT off or on. HWiNFO confirmed the CPU is only using 2 cores.
In case you missed the numerous comments in this thread as well as so many others, here are links to other posts:


https://steamcommunity.com/app/292030/discussions/0/3731826212312414463/

https://www.overclock.net/threads/witcher-3-next-gen-update-cpu-utilization-low.1802758/

https://www.reddit.com/r/Witcher3/comments/zllc1l/the_witcher_3_next_gen_patch_running_on_a_ryzen_9/

My video at 4K using RTSS. I don't make stuff up and I am not a liar...
https://www.youtube.com/watch?v=KutijilgxKI
Last edited by sbally30; 27 Dec 2022 at 7:26 a.m.

Posted on: 22 Dec 2022 at 10:46 a.m.
Posts: 60