3440x1440 ... but sadly, as I said, it's no longer letting me play at those frame rates since the last update.
To be expected, because the RT features pushed people's CPUs to their limits.
Now, with slightly reduced calls (read: they dumbed down the settings a little), CPUs no longer hit the bottleneck, and GPUs can be the bottleneck instead.
People seem to hate CPU bottlenecks and love GPU bottlenecks... no idea why, but that is basically most of the crying on the forums. One of them WILL be the limiting factor; nothing anyone can do about that, unless you have enough performance to run into... the engine bottleneck. ;-)
You don't get it? The game is poorly optimized. Instead of venting your issues at the community, why not let the developers know about your frustration?
The game runs exactly the same as before, just with lower FPS after the hotfix. GPU usage maxes out at 86% and hovers in the low 70s; only in one strange field did the GPU hit 100%. Meanwhile the CPU, both before the hotfix and now, has always stayed under 7%, hovering at 2-3% most of the time.
The reason people hate CPU bottlenecks is that the GPU isn't being fully utilized. For example, a game using 100% of the CPU with only 50% of the GPU means something is wrong, especially if you can't hit a target FPS. Let's hope you can now understand your pointless post. LOL
There is no frustration on my part. Running the game maxed out without any issues.
But why is a 50% bottlenecked GPU worse than a 50% bottlenecked CPU?
Both need to do their work for a target frame rate. If the CPU holds the GPU back, you get fewer frames. If the GPU holds the CPU back, you also get fewer frames... you need both. *shrug*
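As a toy illustration (my own sketch, not anything from the game's code): whichever unit takes longer per frame sets the frame rate, regardless of which one it is:

```python
# Toy model: each frame, the CPU prepares work and the GPU renders it.
# With the two pipelined, the slower per-frame time dictates the FPS.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU/GPU limits the pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound: CPU needs 20 ms/frame, GPU only 10 ms -> 50 fps, GPU half idle.
print(fps(20.0, 10.0))  # 50.0
# GPU-bound: GPU needs 20 ms/frame, CPU only 5 ms -> also 50 fps.
print(fps(5.0, 20.0))   # 50.0
```

Either way you end up at the same 50 fps; the only difference between the two bottlenecks is which component sits idle.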
Huh...? But if the game is under-optimized and somehow only uses 50-60% of your GPU, how is it then "the GPU that needs to do its work", as you put it? The power of the GPU is there, but the poor optimization means the game isn't fully utilizing the graphics card (or CPU).
For example, I run a 3080 and an i9-10900KF. While playing this game, it uses only 60% of my GPU and a tiny 2% of my CPU (I'll let that slide, as not many games need much of my CPU), but this in turn gives me much lower performance than I should be seeing.
You say you are running the game maxed out without any issues. So with that, you must be running it at 4K, Ultra + RT, all ray-tracing settings maxed, etc., and getting a constant v-synced 60 fps with no stutters or crashes. What hardware are you running this on, good sir?
Your stated usage does not make sense, and to be frank, I would not buy it without seeing a screenshot.
The game does not "use" your hardware AT ALL. Game code is NOT built around targeting utilization numbers. You can adjust some calls to be easier on hardware or not, but developers have zero direct control over your GPU running at 60%.
You can only have these following situations:
* GPU and CPU are both below 100% (CPU meaning EACH CORE, not overall!) - only possible with an FPS cap that does not need all your hardware.
* GPU is at 100%, CPU not: only possible if the game wants more work from your GPU than from your CPU, and your CPU has to wait for your GPU to finish its work.
* CPU is at 100% (or at least one core is), GPU not: only possible if the game wants more work from your CPU than it can deliver, and your GPU has to wait for the CPU to finish.
* Bonus edge case: the game engine is at ITS limit... this can lead to sub-100% usage as well, if you put more hardware to work than is needed to reach that limit. You can usually do this at something like 480p, or even 720p on mid-range GPUs/CPUs.
So if you really have 2% CPU usage and 60% GPU usage, I am sure the developers would LOVE a screenshot and an explanation, because that should not happen unless you set something like a 10 fps limit or run a quantum PC from IBM.
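The situations in that list can be condensed into a rough rule of thumb. The sketch below is my own (the 95% "saturated" threshold is an arbitrary choice, not anything official):

```python
def bottleneck(core_usages: list[float], gpu_usage: float,
               saturated: float = 95.0) -> str:
    """Rough classification of the four situations listed above.

    core_usages: per-core CPU usage in percent (EACH core, not overall).
    gpu_usage:   GPU usage in percent.
    """
    cpu_maxed = max(core_usages) >= saturated
    gpu_maxed = gpu_usage >= saturated
    if gpu_maxed and not cpu_maxed:
        return "GPU-bound: the CPU waits for the GPU"
    if cpu_maxed and not gpu_maxed:
        return "CPU-bound: the GPU waits for at least one pegged core"
    if not cpu_maxed and not gpu_maxed:
        return "FPS cap or engine limit: nothing is saturated"
    return "balanced: both near their limit"

# One pegged core is enough to stall the GPU,
# even though overall CPU usage here averages only ~50%:
print(bottleneck([40.0, 35.0, 98.0, 30.0], 60.0))
```

Note how the per-core view matters: an overall CPU-usage readout would average the pegged core away and make the system look idle.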
As for my specs: a 5950X undervolted at 3400 MHz base clock, and a 4090 undervolted to 0.85 V @ 2400 MHz. Both tuned to save energy/heat/noise, since the game does not need full blast to reach 60 fps.
GPUs process graphics work faster than CPUs. A CPU core is a general-purpose computational unit that can execute dozens of different tasks, while a GPU is designed to accelerate graphics rendering. When you factor in how many applications and background tasks a CPU is running while you game, why would you ever want your CPU usage to be higher than your GPU usage? Once a CPU reaches 100% usage, everything slows down; it may even start paging system data to the HDD, slowing your PC down further. But if your GPU reaches 100%, your PC is still operational.
Consider the die size of a CPU vs. a GPU. GPUs are designed to withstand heat better than CPUs; they are built to sit at 100% usage for days, while a CPU would most likely degrade under that much heat over time. Granted, it will most likely throttle itself before that happens, but the CPU is still more susceptible to degradation than a GPU.
Witcher 3 NG is only using 2 cores? That makes every CPU from the last decade, with 6, 12, etc. cores, useless. How do you expect the community not to blame the developers when CDPR chose to use ray tracing + DX12 + DLSS + FSR in its remastered version of Witcher 3, yet didn't think to code the game to use more than 2 cores? They knew full well that the GPU would have to wait for the CPU, leaving everyone's GPU below its full potential - basically crippling the GPU in a waiting game with the CPU while the byproduct tanks your frame rate.
This in turn helps promote frame generation as the solution, giving consumers the choice of either buying a 40-series GPU for a stable frame rate or living with mediocre performance.
So yes, I do blame the developers for promoting a stupidly priced 4090 GPU as the only way to play this game at higher FPS with RT on.
Cyberpunk cost CDPR 75% of its share value. Look how many years were spent developing this free update; imagine the costs. Game developers, Nvidia and AMD work together to optimize games; that's why you get a driver update for some of your games from Nvidia/AMD on release day.
The simple facts are: after updating to the latest driver, no matter what resolution I use - 1080p, 1440p, 4K - I can't get a stable 60 FPS with ray tracing on. It's always 55-57 FPS. In some areas I do get a higher frame rate for a limited spell, but mostly it's ehh. Now, was this deliberate? I think so. Because it's a free update, CDPR can get away with it, and they are counting on that. Most likely Nvidia made them an offer and the rest is history. Hopefully in a few months the game will get fixed, depending on how long the contract is... LOL
Yet more false information.
The game uses all cores. It just uses a few of them (more than 2) much more heavily, which is kind of an RT thing you can't fix by waving a magic wand. Some work simply runs serially, and nothing can be done about that. They can let everything else run on different threads, but if the one thing that has to run serially isn't done, the rest has to wait.
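That "one serial step holds everything else back" point is Amdahl's law. A quick sketch (the 25% serial fraction is a made-up number for illustration, not a measurement of W3):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup when part of the work must run serially."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# If 25% of the per-frame CPU work is serial, extra cores flatten out fast:
for n in (2, 8, 32):
    print(n, round(amdahl_speedup(0.25, n), 2))  # 1.6x, 2.91x, 3.66x
```

Even with infinitely many cores the speedup can never exceed 1/0.25 = 4x, which is why "a few cores loaded heavily, the rest lightly" is what heavily threaded games actually look like in a core-usage graph.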
But the key takeaway here:
Why do you keep making false accusations that can be disproven with a 10-second Google search and tens of thousands of screenshots showing 6-, 8-, 12-, 16- and even 32-core systems with all cores at 20-60% usage and a few more nearing 100%?
I mean, if you want to mindlessly bash, at least put in the effort to make sure your "argument" isn't invalidated within 10 seconds by anyone who can type into Google.
Well... modern Nvidia... I kept reading articles about Radeon 7000 GPUs running consistently above 90 °C "by design". Sad stuff...
For the other folks: with HairWorks off (it looks messed up in certain lighting scenarios), I'm averaging 75% or more GPU usage, and running around in forests the FPS is consistently above 90 with all the rays being traced.
The GPU isn't going above 67 °C though. 4090 TUF OC; the GPU clock seems to stay locked at 2730 MHz.
Playing on a Ryzen 5900X, and the CPU temp isn't going above 40 °C.
All you're doing is nitpicking, and you still don't understand.
Cyberpunk has native DX12 support, and doesn't Cyberpunk with RT enabled use all cores productively? So REDengine can do DX12 natively. Guess no wand is needed, just hard work, which CDPR failed miserably at. They chose the lazy route: over two years for this update, and DX12 was just mashed on top of DX11.
When your GPU usage is below 90%, it's because of a CPU bottleneck, but at 70% GPU usage? Yes, that's a sign something is very wrong. The culprit is most likely poor CPU core usage. In my case at 4K, as so many others have reported on other forums, CPU usage is very low when gaming at 4K in this NG update.
FPS at 4K vs. 1440p, and with DLSS Ultra Performance vs. Ultra, remains the same, meaning the game isn't using the CPU correctly, which is the bottleneck.
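That test, dropping the render resolution and seeing whether FPS changes, is the standard way to tell a GPU limit apart from everything else. A hedged sketch (the 5% tolerance is my own guess, not a standard value):

```python
def not_gpu_bound(fps_high_res: float, fps_low_res: float,
                  tolerance: float = 0.05) -> bool:
    """True if lowering the render resolution barely changed FPS.

    A GPU-bound game speeds up at lower resolution; if FPS stays flat,
    the limit is elsewhere (CPU, engine, or an FPS cap).
    """
    return abs(fps_low_res - fps_high_res) / fps_high_res <= tolerance

print(not_gpu_bound(57.0, 58.0))  # True: ~same FPS at 4K and 1440p
print(not_gpu_bound(45.0, 80.0))  # False: clearly GPU-bound
```

Note this only says the GPU is not the limiter; on its own it cannot distinguish a CPU bottleneck from an engine limit or a frame cap.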
GOOGLE THAT. LOL
Single-core intensive -> not false. But it's true for every RT game and won't change (until we get quantum computing in our home systems).
"Other cores minimally" -> flat-out lie. 20-60% is not "minimal"; it's pretty much as much as 9 out of 10 games manage at best, if at all.
W3 does an amazing multi-core job compared to the average game out there. Very few games do better.
Could they do better? Probably.
But your initial statement was "they only use two cores", which is flat-out false. Then you walked it back to "they only use the other cores minimally", which is also false.
If you really have something like 2% usage, you should send in proof, since none of the proof I have seen supports your claims at all. No need to post it here - we can't do ♥♥♥♥ about it if you are running into an edge-case bug or whatever. Send it to the developers and include all your specs and settings, so they can try to reproduce it.
Witcher 3 is single-thread intensive, so where are you seeing 60% or even 100% CPU usage in Witcher 3 NG at 4K? I wouldn't mind seeing this data for myself.
In my case, and for many others on numerous forums, we are seeing less than 10% CPU usage, mostly hovering around 3-7% with RT off or on. HWiNFO confirmed the CPU is only using 2 cores.
In case you missed the numerous comments on this thread, as well as so many others, here are links to other posts:
https://steamcommunity.com/app/292030/discussions/0/3731826212312414463/
https://www.overclock.net/threads/witcher-3-next-gen-update-cpu-utilization-low.1802758/
https://www.reddit.com/r/Witcher3/comments/zllc1l/the_witcher_3_next_gen_patch_running_on_a_ryzen_9/
My video at 4K using RTSS. I don't make sxxx up & I am not a liar...
https://www.youtube.com/watch?v=KutijilgxKI