I saw one reviewer say he got 70 in the benchmark but averaged 40 in the game.
The benchmark probably doesn't show VRAM due to an oversight when it was coded.
If you really want to, there are tools like nvidia-smi to check the VRAM usage.
But for me the VRAM readout does matter, because it lets me know what I can run in the background while playing a game.
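For reference, the nvidia-smi check is easy to script. A minimal sketch, assuming an NVIDIA GPU with `nvidia-smi` on the PATH; the helper names and the sample output line are just for illustration:

```python
import subprocess

def parse_vram_csv(line: str) -> tuple[int, int]:
    # Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    # --format=csv,noheader` output, e.g. "6144 MiB, 16384 MiB".
    used, total = (field.strip().split()[0] for field in line.split(","))
    return int(used), int(total)

def vram_usage_mib(gpu_index: int = 0) -> tuple[int, int]:
    # Raises FileNotFoundError if the NVIDIA driver/tool isn't installed.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        text=True)
    return parse_vram_csv(out.splitlines()[gpu_index])
```

Running this alongside the benchmark gives an independent used/total reading to compare against what the tool reports.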
https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575104839238
In my tests the game ate about 6 GB of my 16 GB of video memory
https://steamcommunity.com/app/2358720/discussions/0/4432191123103071440/?ctp=2#c6149188575101635000
This is HIGH settings + TSR only, NO FG, and 100% resolution:
https://www.imagebam.com/view/MEV9GZU
https://www.imagebam.com/view/MEV9H95
This is ALL HIGH settings + TSR only + FG, and 100% resolution:
https://www.imagebam.com/view/MEV9GZV
https://www.imagebam.com/view/MEV9H6F
7-8 GB of video memory with FULL ray tracing on ULTRA
https://www.imagebam.com/view/MEVC2D6
But FPS = LOL! 🤗🤣😁👍
By the way, the hot-spot temperature on my video card was about ~92 degrees with FULL ray tracing.
I've never seen such a high temperature in games, because my video card has 5 big copper heat pipes and three big fans for cooling.
ヽ༼ ಠ益ಠ ༽ノ
-----------------------------------------------
check this out!!!
This is the game AFOP (Avatar: Frontiers of Pandora) from UBISOFT with FULL ray tracing:
In Avatar my video card runs at ~95-99% load, but only 60°C core and a 73°C hot spot 🤡
https://www.imagebam.com/view/MEVC2GS
https://www.imagebam.com/view/MEVC2GU
https://www.imagebam.com/view/MEREDBL
https://www.imagebam.com/view/METXG8Z
https://www.imagebam.com/view/MEVC2QC
My 4080 Super can do 1440p DLSS at 75% resolution with RT low/medium and sustain 60 fps in the benchmark. Boost to RT high and it drops to the mid-to-high 50s. If I turn off RT and then try native 1440p, it is the low-to-mid 50s. Modern game engine optimisation seems so bad.
I didn't test frame generation as HAGS is currently off and I can't be bothered to reboot at the moment.
CPU usage was also not low, so people with pre-X3D Ryzen (Zen 3 or older) would struggle; likewise with Intel, maybe pre-Alder Lake.
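For context on what "75% resolution" costs in actual render load, here is a quick sketch. It assumes the slider scales each axis independently, as per-axis resolution-scale settings typically do; the function name is just for illustration:

```python
def internal_resolution(width: int, height: int, scale_pct: int) -> tuple[int, int]:
    # Per-axis scale: 75% of 2560x1440 renders internally at 1920x1080.
    return width * scale_pct // 100, height * scale_pct // 100

w, h = internal_resolution(2560, 1440, 75)  # (1920, 1080)
# The pixel count shrinks quadratically: only ~56% of native pixels.
pixel_ratio = (w * h) / (2560 * 1440)       # 0.5625
```

So a "75%" setting is already rendering barely more than half the native pixel count, which is worth remembering when comparing it against true native-resolution results.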
My first picture is from a recent new game built on Unreal 5.4 (I don't remember the name)...,
But pictures 2 and 3 are RoboCop on Unreal 5.2 🙌 🙌
https://www.imagebam.com/view/MEVC31F
A 5800X3D and a 4070 Ti OC (game on an SSD) gave me 30 fps on cinematic + RT with no DLSS in the benchmark. I think that in some tougher battle with a lot of effects this might drop to like 20 easily...
I kinda hate frame generation stuff, but I dunno what kind of gpu will run this at native 4k on cinematic with higher avg fps.
And it said perfectly well that it used 7.4 gigs
Did these tests yesterday:
i7 14700K for 500 USD + RX 7900 XTX 24 GB for 1300 USD + 32 GB RAM + SSD
4K and 100% resolution, NO FG
https://ibb.co/X36g0Nd
2K and 100% resolution, NO FG
https://ibb.co/H7WCJYr
This is EXTREME settings and 100% resolution 😹😭
Note: NO FG!!!!! ( ͡ʘ ͜ʖ ͡ʘ) ▄︻̷̿┻̿═━一
I hate FAKE Generation (ง ͡• ͜ʖ ͡•)ง
lmao man
maybe some 5090super would get us to like 45 fps on 4k ultra with no dlss in the future
(and 60fps for 2k)
For me it's not even about fake generation; I just see how weird those DLSS frames look, and because of CP2077 I feel traumatized about using that thing again, even with "super-resolution".
The "coil whine" you hear at start is because, at benchmark-tool launch, your GPU will auto-rev to its highest configured boost core clock, resulting in a high current/peak draw that triggers what's likely MLCC capacitor or inductor whine...
The VRAM used = 0 error is not necessarily a coding error; it could be some combination of your drivers, conflicting monitoring calls made by different monitoring applications, misconfigured GPU UV/OC settings, etc.
My point here is that every result I've seen, across a huge variety of GPUs and in-benchmark settings, shows VRAM used just fine. If your benchmark is showing VRAM used = 0, then it's a user configuration issue or a mix of software/firmware issues on your end.
Your best bet is to run another game with something like MSI AB/RTSS running and check VRAM usage under load. If MSI AB reports VRAM usage, then try the benchmark tool again. Also try file verification and, if need be, redownload it.
As for the VRAM being 0: it might be because I am on Linux and the benchmark might not work right on it. Other games and software pick up my VRAM fine, but the one thing that won't is this benchmark.
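If the benchmark can't read VRAM on Linux, you can still check the number yourself. A minimal sketch for AMD GPUs, which expose raw VRAM counters through the amdgpu sysfs interface; the `card0` index is an assumption (your card may be `card1`), and on NVIDIA you'd use nvidia-smi/NVML instead, since it doesn't publish these files:

```python
from pathlib import Path

def amdgpu_vram_used_bytes(card: str = "card0",
                           root: Path = Path("/sys/class/drm")) -> int:
    # amdgpu publishes byte counters like mem_info_vram_used /
    # mem_info_vram_total under /sys/class/drm/<card>/device/.
    return int((root / card / "device" / "mem_info_vram_used").read_text())
```

Polling this file while the benchmark runs would show whether the game is really allocating VRAM even when the tool reports 0.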
Also..... it could be a driver error.
I remember a game where, with one driver version, it consumed 8 gigabytes of video memory; when I installed other drivers, MSI RTSS showed me that the used video memory was 10 gigabytes (°ロ°)☝
In 4K resolution the game used 9.5 GB of video memory
https://www.youtube.com/watch?v=LpwsqWGINQI
NOT too much ✍(◔◡◔)
Watch this:
AFOP (Avatar: Frontiers of Pandora) eats 12-14 GB of video memory and 14-24 GB of RAM (°ロ°)☝
https://www.imagebam.com/view/MERLBD6
https://www.imagebam.com/view/MERLBD7
Cyberpunk 2077 + 4K Ultra Textures MOD = ~15GB Video memory
https://www.imagebam.com/view/MEV2LNA
https://www.imagebam.com/view/MEV2LLT
https://www.imagebam.com/view/MEVC9S8
https://www.imagebam.com/view/MEVC9S7
https://www.imagebam.com/view/MEVC9SC