If I were you, I'd make sure I had the latest drivers and google around for known problems. Having this problem would really annoy me, because it wouldn't make sense: I've been able to play every, and I do mean EVERY, game I've ever thrown at this computer. The most I've ever had to do is force it to use the dedicated chipset for some older games.
One tip: if your computer is like my build, you could always try going HDMI out to your TV. That's how my Dell gaming laptop handles dual monitors, and HDMI out automatically uses the Nvidia chipset. It doesn't matter if it's the desktop, Word, a 3D game, or whatever; anything displayed on the HDMI-out monitor is rendered by the gaming chipset. If nothing else, doing this may increase your framerate/performance enough that you can enjoy the game you paid for. Let me know if that helps.