So is the whole thing about bigger and better numbers a con job? Well, in many cases, yes. But as with my example above, there are always going to be some fringe exceptions to the rule.
It should also be noted that this is not something exclusive to the computer industry. You see a lot of it in other industries, like the auto industry. The big thing with trucks, for example, is often to talk about how much power they have and how much they can haul. Many of them do this purely for marketing, when the reality is that all that power is WAY excessive for most use cases.
This is how a company upsells you: they present something which is better on paper and count on people not asking how much is actually necessary.
The most likely thing you'll notice is the game feeling a bit more responsive, but that also depends heavily on the kind of games you play. Fast-paced action titles obviously benefit more from it than turn-based strategy.
For resolution it depends entirely on screen size and viewing distance. If you do a little googling, you can find quite a few charts showing when a higher-resolution monitor is actually advantageous. But even then, perceived visual quality can differ from person to person and from game to game. A game on low graphics settings might still look like ♥♥♥♥ in 4K. ;)
(If you have an outlet store nearby, see if they have a 240 Hz monitor. There is one by Eizo, for example. Ask for a demonstration on it. If you still can't see a difference compared to 40 Hz, there might be something seriously wrong with your eyes. :P While the panel actually runs at 120 Hz, it inserts a black frame between every picture, and it's amazing how smooth it looks. Our brain mostly works by processing differences/movement, and that black frame really plays into it.)
We have a saying here: if you don't know what you're talking about, just shut up. This is far from the first post of yours I've seen that either contains patently false information or completely misses the point.
Basically you create a series of textured triangles which make up the objects. GPUs can process X number of triangles per second. This number is restricted by:
- Memory bandwidth from the CPU to the GPU (usage of this bandwidth is reduced by loading the textures into GPU memory in advance; see the sketch after this list)
- Processing of the lighting and shadows in the scene
- Post-Processing steps that are enabled (bump mapping, anti-aliasing, etc)
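To illustrate the texture-preloading point from that list, here's a minimal sketch (my own example using plain OpenGL 1.x calls, not anything quoted from this thread): the expensive CPU-to-GPU upload happens once at load time, and each frame only references the texture that's already resident in GPU memory.

```cpp
#include <GL/gl.h>
#include <vector>

// Upload a texture to GPU memory once, e.g. during level loading.
GLuint uploadTextureOnce(int w, int h, const std::vector<unsigned char>& rgbaPixels)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // The expensive CPU -> GPU copy happens here, once, not every frame.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels.data());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}

void drawFrame(GLuint tex)
{
    // Per frame the game only *references* the texture already sitting in
    // GPU memory; no pixel data has to cross the CPU->GPU bus again.
    glBindTexture(GL_TEXTURE_2D, tex);
    // ... issue draw calls for the triangles that use this texture ...
}
```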
There is usually an important number here that GPU manufacturers tout, particularly when it's higher than a competitor's card.
The network architecture of a game has no impact on the GPU side of the equation (it can still push the same number of triangles regardless). There are, however, other limits on the framerate you actually get, which come down to when the game decides to tell the GPU that it has all the data necessary to render the current scene. A poorly designed game could tie that to the netcode (so it tries to update on every packet from the server), but that would be silly.
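For what that decoupling looks like in practice, here's a minimal sketch (my own generic C++; the stub functions are placeholders for the game's real systems): the loop polls the network without blocking and renders a frame on every iteration, whether or not new packets arrived.

```cpp
#include <chrono>

// Stand-ins for the game's real systems (my names, not from the post).
void pollNetworkNonBlocking() { /* apply whatever packets have arrived, if any */ }
void simulate(float /*dt*/)   { /* advance local game state by the elapsed time */ }
void renderScene()            { /* hand the finished scene to the GPU to draw */ }

// Minimal sketch of a loop that is NOT tied to the netcode: a frame is
// rendered on every iteration, whether or not new server data showed up.
void gameLoop()
{
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();

    for (;;)
    {
        auto now = clock::now();
        float dt = std::chrono::duration<float>(now - previous).count();
        previous = now;

        pollNetworkNonBlocking(); // never blocks waiting on the server
        simulate(dt);
        renderScene();            // the GPU gets told to draw every frame regardless
    }
}
```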
Any well-programmed game estimates what has probably happened and renders that; then, if the netcode corrects it, it adjusts the rendered scene. For small corrections it interpolates the difference to keep things smooth; for large differences (when you have a lot of lag, for example) it just makes people jump to their "correct" location.
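Roughly, that correction step could look like this (a sketch under my own assumptions; the snap threshold and blend factor are made-up example values, not from the post): small errors get blended out over a few frames, big ones snap.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Called when the server reports where a player actually is.
// 'rendered' is where we predicted (and drew) them; 'authoritative' is the
// server's position.
Vec3 reconcile(Vec3 rendered, Vec3 authoritative)
{
    const float snapThreshold = 2.0f;  // example value: beyond this, just teleport
    const float blendFactor   = 0.1f;  // example value: fraction of error removed per frame

    Vec3 error = authoritative - rendered;
    if (length(error) > snapThreshold)
        return authoritative;               // big lag spike: jump to the "correct" spot
    return rendered + error * blendFactor;  // small error: smooth it out over several frames
}
```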
I'd like to point out that bump mapping is not a post-process effect (though I suppose you could use some kind of deferred shading technique for it and it would be sort of post-processed), and AA is not always post-processed either. The simplest kind of AA is just rendering to a bigger render target than the output window; when the render target is presented to the output window on swap, it is automatically downscaled to fit that window's resolution. That's supersampling (SSAA); MSAA is the cheaper variant that only takes extra samples along triangle edges. FXAA and many other anti-aliasing techniques are post-processed, though. Also, the triangles themselves usually do not carry the actual texture; each vertex usually has UV coordinates, position, normal, a texture channel, and maybe something else like color. Or, if you want to run well on AMD hardware, you keep the model data in a constant array and each vertex has nothing but an index value.
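For the curious, here's roughly what that per-vertex data tends to look like (a generic sketch, not any particular engine's layout; the field names are mine):

```cpp
// Typical per-vertex data: the texture itself lives elsewhere in GPU memory,
// the vertex only carries UV coordinates pointing into it.
struct Vertex
{
    float position[3];  // x, y, z in model space
    float normal[3];    // used for lighting
    float uv[2];        // texture coordinates
    float textureIndex; // which texture/channel to sample from
};

// The stripped-down variant mentioned above: all model data sits in a
// constant buffer on the GPU and each vertex is just an index into it,
// which the vertex shader uses to fetch the real attributes.
struct IndexOnlyVertex
{
    unsigned int modelDataIndex;
};
```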
It is technically possible (but impractical) to make your entire game run completely on the GPU, with the CPU only sending data to it (data that could be obtained from a network). By completely on the GPU I mean not only the drawing but the actual game processing and content running fully on the GPU. All of your other points were well made.
However, at the same time, a healthy young human eye can perceive drops below 48 FPS and even notice changes up to 120 FPS. Why?
Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.
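To put the numbers side by side (my arithmetic, using the 1/25 s figure above):
1/25 s = 40 ms afterimage window
1 frame at 24 FPS ≈ 41.7 ms
1 frame at 48 FPS ≈ 20.8 ms
1 frame at 120 FPS ≈ 8.3 ms
So at 24 FPS a single frame sits on screen for roughly as long as the afterimage persists, while at 48 FPS and above each frame is replaced well inside that window.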
It's the flickering effect that annoys the human eye as one frame flips to the next. Mostly it's ignored by the human brain; cats and dogs, for example, would notice it more. Depending on how smooth the edges of the animation are, the brain will still register the previous few frames along with the one it currently sees, calculating differences and ignoring slight variations. This is why monitors now all come with backlights: it greatly reduces this flickering effect.
You'll find that movies and console games can run as low as 24 FPS and get away with it unnoticed, because of the viewing distance and edge blur. A PC, however, has much sharper output and is viewed at closer range, so the brain can pick out the edge changes a lot more. It entirely depends on what animation you are viewing and what device you're viewing it on. For a standard PC, it's ideal to keep it above 48 FPS at all times, so younger eyes aren't distracted by the changes.
The human mind can actually detect and correct to a certain degree. However, the higher the resolution and the sharper the image edges, the more frames are required to make motion appear smooth without the brain straining to fill in the missing gaps.
FPS also changes and varies, so 30 FPS won't be continuous; rather, it rises and falls (for example between 24 and 48 FPS). It's those swings that are even more distracting at lower FPS levels. When you get up to 120 FPS and beyond, they become much less noticeable.
Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal... until you see better with glasses; then, when you remove the glasses, your vision suddenly appears a lot blurrier. The same applies to monitors. People running at 60 Hz will be happy until they see a 120 Hz/144 Hz monitor to compare against. The brain will then register the 60 Hz as lower quality than it first judged it to be.
My old, cheap 1600x900 LED monitor runs at 60 Hz. I've got no problem when getting between 50-60 FPS playing Left 4 Dead.
Dropping below 50 = headache for me.
I can.
But for FPS it also depends on how close I am to the screen in question. On resolutions beyond 2K I don't know, but I ABSOLUTELY can tell the difference between 1080p and 2K.
You can easily tell the difference and when playing you can feel the difference. I don't buy the "I can't see a difference" part.
Just add more footballs to see the massive difference. If you have a higher refresh rate monitor you can see the 90 and 120 versions.
No you can't. There is no way you can tell the difference between, say, 80 and 90 fps. Not gonna happen.