If a monitor has the G-Sync module, it draws about 10-15 more watts than a similar non-G-Sync display.
Okay, tantrum out of the way: as others have said, there's more to it than screen real estate. On average, yes, I would expect a higher resolution to draw more, but typically you're not changing JUST the resolution; you're also changing the size and a whole slew of other things.
It's like when LCDs came about and they claimed MASSIVE power savings compared to CRTs. It was one of those things that WAS technically true... but also not. My ancient LCD TV lists something like 155W, and my prior CRT TV probably drew less. The reason is that my prior CRT TV was smaller.
If I changed from my current PC LCD to a newer one, I imagine I'd save wattage even if I went bigger, as I'm using a CCFL backlight right now (LED-backlit LCDs typically draw far less).
So you'll have to simply look up the specifications of the display, but that will likely only give you a single average figure. If you truly want to know, get something like a Kill A Watt meter and measure your own usage with both displays.
1080p is 2K, 1440p is 2.5K, and 2160p is 4K
4K typically refers to a resolution of around 4,000 pixels horizontally (the "K" means "kilo", i.e. thousand). 4K on TVs and monitors is usually 3840 x 2160 (the "real" 4K is actually something else entirely, but we won't even go into that), so 4K itself falls a bit short of 4,000, but it's an approximation.
So 1080p is 1920 x 1080.
1440p is 2560 x 1440.
Between 1920 and 2560, which is closer to "approximately 2,000 and a bit short of it"? If "2K" were to be either of those, it'd be 1080p.
Marketing has just lazily slapped "2K" on 1440p. It somewhat recently became a mainstream rather than a high-end thing, so more people are being confronted with it. 1080p is already known as 1080p, whereas 4K and up (5K, 8K, etc.) is the new way of naming things, so they needed a way to retroactively label the resolution sitting between the two, and "2K" is what they went with for... whatever reason, but it's just wrong. It'd be 2.6K if anything, though I guess that doesn't roll off the tongue as well. And even though 2560 is closer to 3K than 2K, that's still slightly off and probably oversells it a bit much (better to undersell it and push 4K).
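To put numbers on that, here's a quick Python sketch (just the common 16:9 modes mentioned in this thread) showing what the "K" label would be if you took it literally as horizontal pixels divided by 1,000:

# Common 16:9 resolutions and their literal "K" value
# ("K" taken as horizontal pixels / 1000, rounded to one decimal place).
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} -> ~{width / 1000:.1f}K")

# Prints:
# 720p: 1280 x 720 -> ~1.3K
# 1080p: 1920 x 1080 -> ~1.9K
# 1440p: 2560 x 1440 -> ~2.6K
# 2160p: 3840 x 2160 -> ~3.8K

So by the horizontal-pixel logic, 1440p comes out closer to 2.6K (or even 3K) than to 2K.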
Even 'HD'.
Technically, HD is 720p and FHD is 1080p.
Just say the vertical resolution plus "p" if it's 16:9, or the actual resolution if it's not (see the sketch below).
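To make that concrete, here's a minimal Python sketch (assuming an exact 16:9 panel; the helper name is mine) that derives the full resolution from the vertical "p" number:

def resolution_from_p(vertical: int) -> tuple[int, int]:
    # Width follows from an exact 16:9 aspect ratio: width = height * 16 / 9.
    return (vertical * 16 // 9, vertical)

print(resolution_from_p(720))   # (1280, 720)
print(resolution_from_p(1080))  # (1920, 1080)
print(resolution_from_p(1440))  # (2560, 1440)

Note this only works for true 16:9 panels; something like 1366 x 768 or an ultrawide needs the actual resolution spelled out, which is the point of the advice above.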
Although I would say they're good terms; you just have to understand what they mean. And therein lies the problem: you have to be weird about resolution trivia.
And I sort of saw 1080p as the transitory one. Back when LCDs were newer, in the late 2000s and perhaps a bit into the early 2010s, you had to spend up a bit to get 1080p, and it wasn't really justified at the time for most people. Broadcast wasn't commonly 1080p then (much of it was still 4:3), streaming was still kicking off, the consoles of the time didn't really justify it, and DVD was still the norm and didn't justify 720p as it was, let alone anything higher. So it just wasn't justified, and I imagine most people who did get it did so merely by virtue of buying a set large enough that 720p just wasn't used at that size.
720p (or technically 1366x768) was a lot of people's first LCD TV, and unless you routinely changed your TV every 3 to 5 years, chances are many people stuck with it long enough to just skip over 1080p, because 4K quickly started being pushed so hard that it existed even at the low end.
A higher refresh rate will draw more power, but resolution makes a minimal difference, if any.
It's the size of the display (the quantity of backlight LEDs) that matters.