Read the readmes and do what they say.
If you type dxdiag into the Start menu and run it, can you see the GPU?
It shows the chip type as the GTX 1070 in the Display tab of dxdiag.
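If you'd rather check from a script than open dxdiag, here's a minimal sketch (assuming Windows with Python installed, and the deprecated but still shipped wmic tool) that lists the video adapters Windows currently reports:

```python
import subprocess

# Ask Windows which video controllers it currently sees.
# (wmic is deprecated but still present on most Windows 10/11 installs.)
output = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True, check=True,
).stdout

# Print each reported adapter, e.g. "NVIDIA GeForce GTX 1070".
for line in output.splitlines():
    name = line.strip()
    if name and name.lower() != "name":
        print(name)
```

If the GTX 1070 shows up here and in dxdiag, Windows itself sees the card, and the detection problem is on the game's side.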
Not only were my games not detecting my graphics card, but I was also losing the monitor signal (screen crashing to black, basically). Turns out a failing cable between the GPU and the power supply was the culprit. Once it was replaced, that problem was resolved. But why were the games not reading the GPU?
Short answer: old games, new graphics card. I just replaced my old GPU 2 months ago (in mid-2022, for posterity). I found that some of my games were listed in the Nvidia Control Panel and many weren't. Skyrim, for example. Fallout 4, for another. Neither would detect my graphics card. Well, as it turns out, the simple truth is that these are older games and the GPU is new. My newer games, like The Outer Worlds, ARE listed in the Nvidia Control Panel and they DO detect the card.
If this is the case for anyone else, the simplest fix is: when you load an older game for the first time and it says "card not detected", just go into the game's settings and manually set the graphics to whichever preset you think your rig can handle. In my case I go with "ultra high."
So for me, mystery solved. Older games are, as it happens, sometimes unable to detect new hardware. Who knew?