I'll keep my eyes open for any additional issues, but if my GPU is dying, there's not much I can do but throw the laptop in the trash when that happens.
Serves me right for even buying a gaming laptop 🤣
Some things to consider via the NVIDIA Control Panel, but if you're not sure, I would leave them alone.
https://www.digitaltrends.com/computing/best-nvdia-control-panel-settings/
As for why it has only been noticed in some UE games, that could very well be due to whatever settings you have configured for those games. You may have some graphics settings that make more use of VRAM than in other games.
You could try to run some other graphically intense GPU benchmarks that will stress the GPU and VRAM and see if you have any crashes and/or artifacting.
Furmark [geeks3d.com] will usually fully load the GPU and VRAM, so that will likely show the issue.
https://store.steampowered.com/app/223850/3DMark/
3DMark's Time Spy Extreme should also load the GPU and VRAM fairly heavily in certain segments of the benchmark, so that should show the issue as well.
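On top of the benchmarks, if you want a quick programmatic sanity check of the VRAM itself, a rough sketch like the one below can fill most of the card with a known bit pattern and check that it reads back intact. This assumes you have Python with a CUDA build of PyTorch installed; it's a crude illustration, not a substitute for a dedicated memory tester.

```python
# Rough VRAM pattern check: fill most of the card with a known value,
# then read it back and count mismatches. Assumes a CUDA build of PyTorch.
import torch

PATTERN = 0x5A5A5A5A            # arbitrary bit pattern (fits in int32)
CHUNK_WORDS = 64 * 1024 * 1024  # 256 MiB per chunk (int32 = 4 bytes)

device = torch.device("cuda")
chunks = []

# Keep allocating until less than ~512 MiB is free, leaving headroom
# for the desktop/compositor so the display driver doesn't fall over.
while torch.cuda.mem_get_info(device)[0] > 512 * 1024 * 1024:
    chunks.append(torch.full((CHUNK_WORDS,), PATTERN,
                             dtype=torch.int32, device=device))

torch.cuda.synchronize()
mismatches = sum(int((c != PATTERN).sum()) for c in chunks)
print(f"allocated {len(chunks) * 256} MiB, mismatched words: {mismatches}")
```

Any non-zero mismatch count on a card that isn't being overclocked would point pretty strongly at bad VRAM.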
What brand & model is the laptop?
None of them showed any artifacts, even after prolonged tests. My temps don't go above 70°C in any of them, but I've already ruled out overheating.
It's an Acer Predator Helios 300 with a GTX 1060 6GB, an i7-8750, 16GB of RAM, and two SSDs.
It is around 3 years old, but I'm still hoping that the GPU isn't quite dead yet.
I'm not really suggesting it's an overheating issue. Did you happen to install and run GPU Shark alongside Furmark? What settings in Furmark did you use?
GPU Shark will show you GPU memory use and GPU memory controller use, so you can watch that while Furmark is running to make sure your settings are fully (or nearly fully) loading your VRAM and VRAM controller. You could also use GPU-Z to monitor/log while running Furmark, then look at the Memory Used data point during the burn-in to see whether you're using most of your VRAM or not.
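If you'd rather log this from a script than watch the GPU-Z window, nvidia-smi (which installs alongside the NVIDIA driver) can dump the same numbers once a second. A minimal sketch, assuming nvidia-smi is on your PATH:

```python
# Poll nvidia-smi once a second and print VRAM use + temperature,
# so you can confirm the burn-in is actually filling the card's memory.
import subprocess
import time

FIELDS = "timestamp,memory.used,memory.total,temperature.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())
    time.sleep(1.0)
```

Run it in a second window while Furmark or the game is going and see how close memory.used gets to the full 6GB when the artifacts appear.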
Same suggestions with Unigine. Valley isn't really going to be taxing on VRAM for a 1060 6GB card. Superposition could be, depending on the options you've configured; its default settings aren't going to use a substantial amount of VRAM. IIRC the defaults only use about 2GB - 3GB of VRAM.
Also, it could be memory clock stability. You can use GPU-Z to see what memory frequency and voltage it's hitting when running a game where you're seeing the artifacting, then use MSI Afterburner to slightly underclock the memory and see if it still has issues.
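If reading the GPU-Z sensor log gets tedious, the memory clock can also be pulled straight out of NVML; a rough sketch below, assuming the nvidia-ml-py package (pip install nvidia-ml-py). NVML doesn't expose voltage, so keep GPU-Z for that part.

```python
# Log the GPU memory clock (and core clock) once a second while a game runs,
# to see whether artifacting lines up with a particular memory frequency.
# Assumes the nvidia-ml-py package; NVML itself ships with the driver.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem_clk = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        core_clk = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"mem {mem_clk} MHz  core {core_clk} MHz  {temp} C")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```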
But really, it could still be anything. Lots of buggy interactions exist, especially when more than one program uses the GPU and during hardware acceleration. Lots of programmers also use shortcuts that get patched out of existence at some point because they were never intended to be supported.