Black Myth: Wukong

Getsu Fuma Aug 17, 2024 @ 5:23am
Well that benchmark made me realize one thing
And that is that the coil whine on my video card is annoying as hell. Makes me glad I use headphones so I don't have to hear how awful that noise is.

But what I have to ask is: why does the benchmark not report VRAM used? For me it says 0 VRAM used, and I know that is impossible.
Showing 1-15 of 18 comments
Naamtar Aug 17, 2024 @ 5:25am 
The benchmark isn't going to be accurate to the actual game anyway. It's a slowed-down pan over terrain; there's nothing busy going on in any of it.

I saw one reviewer say he got 70 in the benchmark but averaged 40 in the game.

It probably doesn't show VRAM due to an oversight when it was coded.
prusswan Aug 17, 2024 @ 5:35am 
It's just a gauge to help sales (hopefully). There have also been cases of benchmark tools frying GPUs (and that noise is not a good sign).

If you really want to check, there are tools like nvidia-smi to monitor VRAM usage.
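For anyone who wants to script the nvidia-smi suggestion above, here is a minimal sketch. It assumes an NVIDIA GPU with the proprietary driver installed (that is where nvidia-smi comes from); on any other system the hypothetical helper simply returns None.

```python
import shutil
import subprocess

def query_vram_mib():
    """Return (used_mib, total_mib) from nvidia-smi, or None if unavailable."""
    if shutil.which("nvidia-smi") is None:
        return None  # no NVIDIA driver tools on this machine
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    if result.returncode != 0 or not result.stdout.strip():
        return None
    # one line per GPU, e.g. "6123, 16384"; take the first GPU
    used, total = result.stdout.splitlines()[0].split(",")
    return int(used), int(total)

vram = query_vram_mib()
print("no NVIDIA telemetry available" if vram is None
      else f"{vram[0]} / {vram[1]} MiB VRAM in use")
```

Running this in a second terminal while the benchmark is up gives an independent VRAM reading to compare against the in-benchmark overlay.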
Getsu Fuma Aug 17, 2024 @ 5:54am 
Originally posted by Naamtar:
The benchmark isn't going to be accurate to the actual game, anyway. It's a slowed down pan over terrain. There's nothing busy going on in any of it.

I saw one reviewer say he got 70 in the benchmark but averaged 40 in the game.

Probably doesn't show VRAM as an oversight when it was coded.
Yeah, I already know benchmarks are usually not accurate, so there's no telling how well the game will run even if the benchmark claims you will be fine. But I just wanted to see if the benchmark for this game would even start, since most benchmarks don't even work on my computer for some reason.

But the VRAM reading does matter to me, because it tells me how much I can run in the background while playing a game.
Hogwarts Legacy Aug 17, 2024 @ 5:59am 
Originally posted by Getsu Fuma:
What I have to ask is: why does the benchmark not report VRAM used? For me it says 0 VRAM used, and I know that is impossible.
Based on the test video from PC Games Hardware, at 4K resolution the game used only 9.5 GB of video memory. All this raises questions about its optimisation.

https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575104839238

In my tests the game ate about 6 GB of my 16 GB of video memory.
https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575101635000

This is HIGH + TSR only, NO FG, at 100% resolution
https://www.imagebam.com/view/MEV9GZU
https://www.imagebam.com/view/MEV9H95

This is ALL HIGH settings + TSR only + FG, at 100% resolution
https://www.imagebam.com/view/MEV9GZV
https://www.imagebam.com/view/MEV9H6F

7-8 GB of video memory with full ray tracing on ULTRA
https://www.imagebam.com/view/MEVC2D6
Last edited by Hogwarts Legacy; Aug 17, 2024 @ 6:15am
Getsu Fuma Aug 17, 2024 @ 6:15am 
Originally posted by Hogwarts Legacy:
Originally posted by Getsu Fuma:
What I have to ask is: why does the benchmark not report VRAM used? For me it says 0 VRAM used, and I know that is impossible.
Based on the test video from PC Games Hardware, at 4K resolution the game used only 9.5 GB of video memory. All this raises questions about its optimisation.

https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575104839238

In my tests the game ate about 6 GB of my 16 GB of video memory.
https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575101635000

This is HIGH + TSR only, NO FG, at 100% resolution
https://www.imagebam.com/view/MEV9GZU
https://www.imagebam.com/view/MEV9H95

This is ALL HIGH settings + TSR only + FG, at 100% resolution
https://www.imagebam.com/view/MEV9GZV
https://www.imagebam.com/view/MEV9H6F
Thank you for this.
Last edited by Getsu Fuma; Aug 17, 2024 @ 6:15am
Hogwarts Legacy Aug 17, 2024 @ 6:17am 
Originally posted by Getsu Fuma:
Thank you for this.

7-8 GB of video memory with FULL ray tracing on ULTRA
https://www.imagebam.com/view/MEVC2D6

But FPS = LOL! 🤗🤣😁👍

By the way, the hot-spot temperature on my video card was about ~92 degrees with FULL ray tracing

I've never seen such a high temperature in games, because my video card has 5 big copper heat pipes and three big fans
ヽ༼ ಠ益ಠ ༽ノ
-----------------------------------------------
check this out!!!

This is the game AFOP from UBISOFT with FULL ray tracing:

In Avatar my video card runs at ~95-99% load but only 60°C, with a hot spot of just 73°C 🤡

https://www.imagebam.com/view/MEVC2GS
https://www.imagebam.com/view/MEVC2GU
https://www.imagebam.com/view/MEREDBL
https://www.imagebam.com/view/METXG8Z
https://www.imagebam.com/view/MEVC2QC

Last edited by Hogwarts Legacy; Aug 17, 2024 @ 6:33am
Getsu Fuma Aug 17, 2024 @ 6:29am 
Originally posted by Hogwarts Legacy:
Originally posted by Getsu Fuma:
Thank you for this.

7-8 GB of video memory with FULL ray tracing on ULTRA
https://www.imagebam.com/view/MEVC2D6

But FPS = LOL! 🤗🤣😁👍

By the way, the hot-spot temperature on my video card was about ~92 degrees with FULL ray tracing

I've never seen such a high temperature in games, because my video card has 5 big copper heat pipes and three big fans
ヽ༼ ಠ益ಠ ༽ノ
Yeah, from the benchmark I did notice that it heats up my video card more than any other game or software I use. And honestly, to me ray tracing is never worth it, because it always makes games run worse, sometimes to the point of being unplayable. And honestly, I kind of expect UE games to run poorly now, so I'm always hesitant about buying them.
chrcoluk Aug 17, 2024 @ 6:34am 
Game and bench to sell GPUs.

My 4080 Super can do 1440p with DLSS at 75% resolution and RT low/medium and sustain 60 fps in the benchmark. Boost to RT high and it drops to the mid-to-high 50s. If I turn off RT and then try native 1440p, it's in the low-to-mid 50s. Modern game engine optimisation seems so bad.

I didn't test frame generation, as HAGS is currently off and I can't be bothered to reboot at the moment.

CPU usage was also not low, so people with pre-X3D Ryzen (Zen 3 or older) would struggle; likewise with Intel, maybe before Alder Lake.
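The "75% resolution" figure above means the game renders internally below 1440p before the upscaler reconstructs the output image. A quick sketch of that arithmetic (the function name is just for illustration):

```python
def render_resolution(width, height, scale_pct):
    """Internal render size for a given resolution-scale percentage.

    Upscalers (DLSS/FSR/TSR) render at this size, then reconstruct
    the image up to the full output resolution.
    """
    factor = scale_pct / 100.0
    return round(width * factor), round(height * factor)

# 75% of 2560x1440, as in the post above
print(render_resolution(2560, 1440, 75))  # → (1920, 1080)
```

So "1440p at 75%" is really a 1080p render, which is why it runs so much faster than native 1440p.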
Hogwarts Legacy Aug 17, 2024 @ 6:38am 
Originally posted by Getsu Fuma:
And honestly, I kind of expect UE games to run poorly now, so I'm always hesitant about buying them.
Yep! But if the devs are competent, games on UNREAL 5 can be very beautiful

The first picture is from a recent new game on UNREAL 5.4 whose name I don't remember,
But pictures 2 and 3 are ROBOCOP with Unreal 5.2 🙌 🙌
https://www.imagebam.com/view/MEVC31F
Last edited by Hogwarts Legacy; Aug 17, 2024 @ 6:38am
Snake Aug 17, 2024 @ 7:03am 
Well, I run an AM4 rig for 1080p.
Having a 5800X3D and a 4070 Ti OC on an SSD gave me 30 fps on Cinematic + RTX with no DLSS in the benchmark. I think that in some tougher battle with a lot of effects this might drop to like 20 easily...

I kinda hate frame generation stuff, but I dunno what kind of GPU will run this at native 4K on Cinematic with a higher avg fps.

And it reported perfectly well that it used 7.4 GB.
Last edited by Snake; Aug 17, 2024 @ 7:04am
Hogwarts Legacy Aug 17, 2024 @ 7:14am 
Originally posted by Snake:
I dunno what kind of gpu will run this at native 4k on cinematic with higher avg fps.
Yep!
Did these tests yesterday:

i7 14700K for 500 USD + RX 7900 XTX 24 GB for 1300 USD + 32 GB RAM + SSD

and 100% NO FG
https://ibb.co/X36g0Nd

and 100% NO FG
https://ibb.co/H7WCJYr

this is EXTREME settings and 100% resolution 😹😭

Pay attention: NO FG!!!!! ( ͡ʘ ͜ʖ ͡ʘ) ▄︻̷̿┻̿═━一

I hate FAKE Generation (ง ͡• ͜ʖ ͡•)ง
Snake Aug 17, 2024 @ 9:38am 
Originally posted by Hogwarts Legacy:
Originally posted by Snake:
I dunno what kind of gpu will run this at native 4k on cinematic with higher avg fps.
Yep!
Did these tests yesterday:

i7 14700K for 500 USD + RX 7900 XTX 24 GB for 1300 USD + 32 GB RAM + SSD

and 100% NO FG
https://ibb.co/X36g0Nd

and 100% NO FG
https://ibb.co/H7WCJYr

this is EXTREME settings and 100% resolution 😹😭

Pay attention: NO FG!!!!! ( ͡ʘ ͜ʖ ͡ʘ) ▄︻̷̿┻̿═━一

I hate FAKE Generation (ง ͡• ͜ʖ ͡•)ง

lmao man

maybe some 5090 Super would get us to like 45 fps at 4K Ultra with no DLSS in the future
(and 60 fps for 2K)

for me it's not even about fake generation - I just see how weird those DLSS frames look, and because of CP2077 I feel traumatized about using that thing again, even with "super resolution".
Dev Aug 17, 2024 @ 9:50am 
Originally posted by Getsu Fuma:
And that is the coil whine on my video card is annoying as hell. Makes me glad I use headphones so I don't have to hear how awful that noise is.

But what I have to ask is why does the benchmark not say VRAM used? For me it says 0 VRAM used and I know that is impossible.

The ‘coil whine’ you hear at the start is because your GPU, at benchmark tool launch, will auto-rev to the highest configured boost core clock speed, resulting in a high current/peak draw that triggers what’s likely MLCC capacitor or inductor whine…

The VRAM used = 0 error is not some coding error; it could be some combination of your drivers, conflicting monitoring calls made by different monitoring applications, misconfigured GPU UV/OC settings, etc.

My point here is that every result I’ve seen, across a huge variety of GPUs & in-benchmark settings, shows VRAM used just fine. If your benchmark is showing VRAM used = 0, then it’s a user configuration issue or a mix of software/firmware issues on your end.

Your best bet is to run another game with something like MSI AB/RTSS running and check VRAM usage under load. If MSI AB reports VRAM usage, then try the benchmark tool again. Also try file verification and, if need be, redownload it.
Getsu Fuma Aug 17, 2024 @ 10:38am 
Originally posted by Dev:
Originally posted by Getsu Fuma:
And that is the coil whine on my video card is annoying as hell. Makes me glad I use headphones so I don't have to hear how awful that noise is.

But what I have to ask is why does the benchmark not say VRAM used? For me it says 0 VRAM used and I know that is impossible.

The ‘coil whine’ you hear at the start is because your GPU, at benchmark tool launch, will auto-rev to the highest configured boost core clock speed, resulting in a high current/peak draw that triggers what’s likely MLCC capacitor or inductor whine…

The VRAM used = 0 error is not some coding error; it could be some combination of your drivers, conflicting monitoring calls made by different monitoring applications, misconfigured GPU UV/OC settings, etc.

My point here is that every result I’ve seen, across a huge variety of GPUs & in-benchmark settings, shows VRAM used just fine. If your benchmark is showing VRAM used = 0, then it’s a user configuration issue or a mix of software/firmware issues on your end.

Your best bet is to run another game with something like MSI AB/RTSS running and check VRAM usage under load. If MSI AB reports VRAM usage, then try the benchmark tool again. Also try file verification and, if need be, redownload it.
The coil whine thing is good to know and I will look into it.

As for the VRAM being 0: it might be because I am on Linux and the benchmark might not work right on it. Other games and software pick up my VRAM fine, but the one thing that won't is this benchmark.
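On Linux, VRAM usage can also be read straight from the kernel rather than from an in-game overlay: the amdgpu driver exposes counters in sysfs, and NVIDIA users have nvidia-smi. A hedged sketch for the AMD case (the sysfs path below exists only on cards using the amdgpu driver, and the card index may differ from card0 on multi-GPU systems):

```python
from pathlib import Path

# The amdgpu driver publishes VRAM counters in sysfs, in bytes.
VRAM_USED = Path("/sys/class/drm/card0/device/mem_info_vram_used")
VRAM_TOTAL = Path("/sys/class/drm/card0/device/mem_info_vram_total")

def amdgpu_vram_mib():
    """Return (used_mib, total_mib) from sysfs, or None on non-amdgpu systems."""
    try:
        used = int(VRAM_USED.read_text())
        total = int(VRAM_TOTAL.read_text())
    except (FileNotFoundError, PermissionError, ValueError):
        return None
    return used // (1024 * 1024), total // (1024 * 1024)

vram = amdgpu_vram_mib()
print("amdgpu sysfs not available" if vram is None
      else f"{vram[0]} / {vram[1]} MiB VRAM in use")
```

A reading taken this way while the benchmark runs would show whether the "0 VRAM" figure is just the benchmark's overlay failing under Linux rather than the game actually allocating nothing.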
Hogwarts Legacy Aug 17, 2024 @ 3:23pm 
Originally posted by Getsu Fuma:
As for the VRAM being 0 thing it might be because I am on Linux
Yep, Linux is the reason for you 😄

Also, it could be a driver error.
I remember a game that, with one version of the drivers, consumed 8 gigabytes of video memory; when I installed other drivers, MSI RTSS showed me that the used video memory was 10 gigabytes (°ロ°)☝

In 4K resolution the game used 9.5 GB of video memory
https://www.youtube.com/watch?v=LpwsqWGINQI

NOT too much ✍(◔◡◔)

Watch this:
AFOP (Avatar: Frontiers of Pandora) eats 12-14 GB of video memory and 14-24 GB of RAM (°ロ°)☝
https://www.imagebam.com/view/MERLBD6
https://www.imagebam.com/view/MERLBD7

Cyberpunk 2077 + 4K Ultra Textures MOD = ~15GB Video memory
https://www.imagebam.com/view/MEV2LNA
https://www.imagebam.com/view/MEV2LLT
https://www.imagebam.com/view/MEVC9S8
https://www.imagebam.com/view/MEVC9S7
https://www.imagebam.com/view/MEVC9SC
Last edited by Hogwarts Legacy; Aug 17, 2024 @ 3:40pm

Date Posted: Aug 17, 2024 @ 5:23am
Posts: 18