2. Displays capable of higher refresh rates are more expensive to manufacture than a simple 60Hz panel.
They cost a lot more because the components are a lot more expensive.
^These.
Also, I recommend holding off on 4K-with-HDR monitors, as they won't drop in price until companies find a way to make the components they need cheaper, which won't be for a while.
The response time of a screen is irrelevant to price. The response time of the typical monitor is usually lower than that of the typical TV; that holds for the averages and the extremes, but not always, as an extremely fast TV can beat an average monitor in that regard. There are also two things to consider on the topic of responsiveness: the time for a pixel to change colour, and the time for the screen to tell the pixel to change colour after it receives the signal. The first is response time and the second is input lag.
Big Bricced was kind enough to correct Cathulhu on response times on monitors actually being lower (lower is better) than on TVs. But then Big Bricced went on to the next piece of false information: that refresh rates skyrocket on TVs. Actually, only the fake marketing "2400Hz" refresh rates on TVs explode. Their real refresh rate is always at a maximum of 60Hz in the States and 50Hz in most of the rest of the world.
But none of that actually matters; none of it helps answer your question, as response time, input lag and refresh rate have little to no effect on the price. Especially as your question doesn't even bring up higher-refresh-rate panels. And no, monitors do not have "much more expensive components". Actually they have smaller components that are cheaper to manufacture as parts and a lot cheaper to ship around and store.
The actual biggest reason that monitors are more expensive than TVs is economy of scale. TVs are mass produced by the hundreds of millions, and because there is so much volume to sell, manufacturers don't care about making a lot of profit on each one; they just want to sell as many as possible for a small profit, which in turn increases competition and endlessly drives prices down. Monitors may sell in the tens of millions of units, but those are mostly office monitors where 1080p or 1440p is most common. 4K monitors are RARE; the number of people buying them is negligible, so monitor manufacturers stick their noses up and say "nah, I can't be bothered to spend money making that... oh well, if you pay a 2000% markup then maybe I'll sell a few for the giggles."

There is also the fleecing of the elites, or milking the early adopters: the price is set very high when a new technology is first released, just to get the maximum amount of money from the few who are ready to pay a lot more, before it is cut in half or more 6 months or a year later. That milking period has likely been extended for high-refresh-rate and 4K monitors because demand is higher than they expected, driving their stocks down, so the prices stay high.
If you want to give the middle finger to the monitor manufacturers' ridiculous prices, you can just get a TV instead. Just be careful to get a TV with low input lag and response time, not too big, and without a strobing backlight. Also, better plug that TV into Ethernet, or the Android in it will fry your brains out endlessly trying to connect to Wi-Fi.
Disable all the picture enhancers and you get an input lag as low as any good monitor's. Most TVs in Europe are true 60Hz by nature, and you can OC a screen; I could easily switch mine to 75Hz.
"240Hz" TVs do truly have a refresh rate of 240Hz, but they can't take a signal at 240Hz. First you would need DisplayPort, as HDMI can only go to 120Hz, and TVs don't have it. They also only take signals up to 60Hz, plus OC; most TVs are actually capable of 75Hz by default.
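For a sense of why the cable, not the panel, is the bottleneck here, some rough arithmetic on the raw pixel data rate helps. This is a minimal sketch; the helper name is my own, and it deliberately ignores blanking intervals and link-encoding overhead, so real links need somewhat more than these figures:

```python
def uncompressed_gbps(width, height, hz, bits_per_pixel=24):
    # Raw pixel payload in gigabits per second for an uncompressed
    # video signal, ignoring blanking and encoding overhead.
    return width * height * bits_per_pixel * hz / 1e9

# 1080p at 60Hz is easy for any modern link.
print(round(uncompressed_gbps(1920, 1080, 60), 2))

# 4K at 120Hz needs well over 20 Gbps of raw pixel data alone,
# which is why older HDMI revisions can't carry such a signal.
print(round(uncompressed_gbps(3840, 2160, 120), 2))
```

The gap between roughly 3 Gbps for 1080p60 and roughly 24 Gbps for 4K120 is why the signal the TV accepts, rather than what the panel can physically draw, limits what you can feed it.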
How do they display 240Hz then? The same way you see a movie at 50Hz or 60Hz even though the movie itself was captured at about 24 FPS. The individual frames received by the TV are compared against each other, and averaged pixels are calculated and rendered in between.
To explain it simply:
Imagine a pixel having the colour "00" in Frame 1 and the colour "03" in Frame 2.
The TV then decides that Frame 1 remains Frame 1 and Frame 2 becomes Frame 4.
So Frames 2 and 3 are now missing and get calculated by the TV: it will render the colour "01" for Frame 2 and the colour "02" for Frame 3.
So you get the feeling of a softer colour change and actually see more frames, even though those frames are just an average calculated from the original ones. This process costs a lot of time and raises the input lag greatly. That doesn't matter for a TV, since a movie is already-known data that does not change; you probably don't care if it starts 2-3 seconds later. But you would care if a game, where you see the result of your input, only displayed it 2-3 seconds later.
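The averaging described above is essentially linear interpolation between two real frames. A minimal sketch, with a frame represented as a flat list of pixel values (the function name and representation are my own for illustration, not anything a TV actually exposes):

```python
def interpolate_frames(frame_a, frame_b, inserted=2):
    # Blend each pixel linearly between the two real frames,
    # producing `inserted` fake in-between frames, the way
    # motion-smoothing TVs fabricate extra frames.
    frames = []
    for i in range(1, inserted + 1):
        t = i / (inserted + 1)  # fraction of the way from a to b
        frames.append([round(a + (b - a) * t)
                       for a, b in zip(frame_a, frame_b)])
    return frames

# The post's example: a pixel at "00" in Frame 1 and "03" in Frame 4,
# with Frames 2 and 3 invented in between.
print(interpolate_frames([0], [3], inserted=2))  # [[1], [2]]
```

Note that the TV must hold on to the later real frame before it can compute the in-between ones, which is exactly where the extra input lag comes from.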
And, honestly, you'll need to sell a kidney to even run 4K at 100fps.