Just get an NVIDIA card.
That's not the case anymore. You're still stuck in the 2013 / old-gen era. The next-gen consoles are running on the x86 architecture, and this is a huge win for PC gamers.
Games will never run better on next-gen consoles than on PC. A console will always have to sacrifice graphics and lower the fps in order to run stably, which is why GTA 5 and The Witcher 3 are 30 fps at medium graphics, while PCs are 60 fps at ultra graphics.
So games like Watch Dogs, Lords of the Fallen, The Evil Within, and many others play better on PC, aye? Nope, they play better on the consoles because these games are poorly optimized for PC.
It is when, on console, you can play at 30 FPS without stutter, whereas on PC you may be getting a higher FPS, but with it come random FPS drops from crappy coding that make the game stutter on you. Not to mention that these new-gen games, built with new-gen consoles in mind, hog VRAM like there's no tomorrow, which means they more often than not hit the VRAM ceiling of your GPU, and thus more stuttering. Then you have the console-specific UI that they keep for the PC version instead of designing a UI with each platform in mind....
You're not the universal arbiter of what constitutes gaming.
Are you oblivious to the thing called a graphics options menu on PC?
Next-gen consoles are running Watch Dogs at 30 fps with lower graphics, while on PC you'll get 60-70 fps with all the graphics maxed out on a GTX 970.
Tell me, how are they running better on the consoles? Is 30 fps better than 60 fps? Are lower graphics better than ultra graphics? You don't make sense.
Might take a look at that.
Read my post above, Sentient_Toaster.
Also, I am far from oblivious, but I can admit that some games actually can play better on consoles, because it's easier to optimize a game for one specific piece of hardware than for the myriad of hardware setups that is PC.
I'm using an Asus ROG Swift PG278Q 27-inch for my monitor, which is G-Sync, and I always make sure to disable V-Sync so it doesn't hinder the G-Sync. Impressed with it so far :)
CPUs these days are relatively further ahead than GPUs; consequently, the best way to keep your PC up to date is to change the graphics card about 2-3 times as often as the CPU.
Of course this depends on how good your CPU is to start with, but as a general rule it's fine.