Personally, I enjoy the older PC games (Giants: Citizen Kabuto, Sacrifice, Total Annihilation). There have been maybe one or two games I can actually say I really enjoyed on PC in the last decade: DA: Origins, Elder Scrolls: Morrowind, Deus Ex.
Also, Watch Dogs is about more than them just lowering the graphics; they blatantly lied to us.
Because you've openly said you're primarily a console player, there's no point in me even wasting my time with you, but I will anyway.
If my PC can run the best games out there then I would be very disappointed if the actual games were not up to standard. Did I spend £1200+ on my PC so it can perform like a peasant box? No.
I fully expect 60 FPS, an FOV slider, amazing graphics, etc. If a game doesn't have those, then what was the point of me investing in a PC?
And you can't defend Watch Dogs in any way whatsoever. Ubisoft should be sued for it; false advertising is a criminal offence.
That's pretty much the reason. Unlike a lot of PC gamers, who are very much used to 60 FPS if not more, I don't actually mind all that much if my FPS drops to around 30 frames. This is because back then I was (primarily) a console gamer.
As you get more used to PC gaming and as you continuously upgrade your PC like I've been doing, what was once acceptable back then becomes inexcusable.
As for the rest, we PC gamers hate locks at 30 FPS because many of us (for a fairly large amount of games, myself included) have rigs that can provide 60 FPS constantly. Because you're mainly a console gamer (you stated so yourself), your eyes may not be adjusted to the massive increase in smoothness from 30 FPS to 60 FPS.
If anyone ever tells you "oh the eye can only see 24 fps so fps doesn't matter hurr durr derp derp durr", don't listen. The eye doesn't see in frames, it sees in motion. Higher framerate = smoother motion. 24 fps is just roughly the minimum your eyes will perceive as motion rather than a series of still images.
When we say we want our CPUs and GPUs to be used 100% while we play our games, that doesn't always mean it has to be at the highest graphics settings. We just want the game to be optimized enough to take full advantage of the rigs we spend upwards of $600 on, no matter what visual settings we choose, so that we get the best performance possible. It sucks when a game is only using 30% of your GPU, because you know that if it were using 100% you'd have a significantly higher framerate.
About Watch_Dogs, we aren't mad at the graphics themselves (indeed, it looks quite good). We're mad (though for me it's more mild irritation) that Ubisoft said the PC version on max settings would look just like the E3 2012 demo, and it didn't. We're also angry about the game's optimization and how poorly it runs even on GTX Titans in SLI (watch TotalBiscuit's "WTF Is... Watch Dogs?" for more info).
I'm also confused as to how the E3 2012 demo graphics settings are IN THE GAME'S CODE and can be re-enabled by messing with config files, work just fine, and, for some people, even increase framerate. If the game crashed more often or had issues with those settings enabled, I could understand why they removed them. But it doesn't; they work fine.
In some cases, games probably do look worse because of the consoles. The developers spend a large amount of time making a game and getting it to work on consoles, and then they don't have enough time or money left to make a proper PC version with better visuals and actual optimization. That's why I say make the PC version first, then turn it down to low or medium graphics settings and put that on consoles.
Overall, I'd say we PC gamers aren't necessarily on a "high horse" (though some of us do have too much of a superiority complex); rather, we spend a lot of time and money on our video games and we want the most out of them. Many of us play at a smooth 60 FPS (and some of us with a lot of money, 120 FPS) the majority of the time, and we expect EVERY game to run at that. We want games to look amazing because we know our rigs can handle it.
We want the best of the best, not because we're nitpicky and fussy, but because we know it's possible.
Edit: FPS may have been lower than 10