Regardless, hardware auto-detect is probably a bit more low-tech than you might imagine.
Basically, a game needs supported hardware to auto-detect settings accurately. For example, I found a post from a guy with a 7900 GTX(?) card from ~2006 playing a game from ~2004 (Dawn of War 1). The game lists a GeForce 3 as the recommended NVIDIA card, while he uses a GeForce 7(xxx) card, and his settings were not properly auto-detected. So you'll want to make sure you use hardware listed in the minimum or recommended system reqs, or anything in between. Any older or newer hardware likely won't be recognized, and the game will either fail to detect entirely or fall back to a much simpler method, like roughly basing settings on VRAM (my guess). Updates to a game post-release may add support for newer hardware, but I can't say that with certainty, and it could well differ from game to game.
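If it helps to picture it, here's a minimal sketch of the two-step logic I'm guessing at: an exact lookup against a GPU database shipped with the game, with a crude VRAM heuristic as the fallback for unknown cards. All the names and thresholds here are made up for illustration, not taken from any real engine.

```python
# Hypothetical auto-detect sketch. KNOWN_GPUS, pick_preset, and the
# VRAM cutoffs are all invented for this example.

KNOWN_GPUS = {
    # Cards the developers actually profiled before release.
    "GeForce 3":       "low",
    "GeForce 4 Ti":    "medium",
    "Radeon 9700 Pro": "high",
}

def pick_preset(gpu_name: str, vram_mb: int) -> str:
    # Step 1: exact match against the shipped database. Accurate,
    # but only covers hardware that existed during development.
    if gpu_name in KNOWN_GPUS:
        return KNOWN_GPUS[gpu_name]
    # Step 2: unknown card (e.g. a GeForce 7900 GTX in a 2004 game),
    # so fall back to a rough VRAM heuristic.
    if vram_mb >= 256:
        return "high"
    if vram_mb >= 128:
        return "medium"
    return "low"

print(pick_preset("GeForce 3", 64))          # "low"  - database hit
print(pick_preset("GeForce 7900 GTX", 512))  # "high" - VRAM fallback
```

A dumber version of step 2 would be to skip the heuristic entirely and just default everything to the lowest preset, which would explain the behavior in that post.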
I noticed that recommended hardware is usually a few years older than a game's release date. Why? I would assume because development (and testing) for many games started years prior to release.
This auto-detection issue is actually quite disturbing, because it essentially means anyone playing games even slightly older than their hardware could well default to the lowest settings, resulting in a subpar experience. Many people still don't tweak their settings, or don't know how.
I'm not sure it was necessary to dig up a 4-year-old thread for this, but at least you are contributing to the conversation.
I think this is the crux of your concern. It's not the game's fault when an end-user doesn't possess basic knowledge of PC game settings. The solution I recommend for these people is to learn the basics. What is resolution? What is anti-aliasing? What is anisotropic filtering? How does raytracing affect things? What's VRAM? Etc.
https://www.youtube.com/watch?v=tqDa5bp6X88
If someone is unwilling or unable to learn these fundamentals, they can either remain content with their subpar experience, or they can purchase a console where they can play games without having to think about settings at all, save for maybe an option between "High Performance Mode" and "High Resolution Mode", depending on the game.
PC gaming requires just a little more knowledge and know-how to get the most out of it. It always has, and it always will.