2160p is double the resolution of 1080p in each dimension, so if that's 4K then 1080p is 2K.
A 1080p frame fits into a 4K frame 4 times by area, but resolution is measured per dimension, so it's 2 times the resolution, not 4 times.
1920x1080 = 2,073,600 pixels
3840x2160 = 8,294,400 pixels
That's why many people came to believe that "4K" comes from having 4 times as many pixels, and that "2K" would therefore mean 2 times the pixel count.
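If you want to sanity-check that arithmetic yourself, here's a quick Python sketch (nothing fancy, just the standard 1080p and 2160p dimensions):

```python
# Quick check of the 1080p vs 2160p arithmetic above.
w1, h1 = 1920, 1080   # 1080p
w2, h2 = 3840, 2160   # 2160p ("4K" UHD)

pixels_1080p = w1 * h1          # 2,073,600
pixels_2160p = w2 * h2          # 8,294,400

print(f"1080p: {pixels_1080p:,} pixels")
print(f"2160p: {pixels_2160p:,} pixels")
print(f"Linear ratio (width): {w2 / w1:.0f}x")                      # 2x per dimension
print(f"Pixel-count ratio:    {pixels_2160p / pixels_1080p:.0f}x")  # 4x by area
```

Double the width and height, quadruple the pixels: both statements are true, which is exactly where the confusion comes from.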
"4K" wasn't really supposed to exist as a consumer term either; it was adopted for marketing because "UHD" sounded unappealing.
Logically speaking, the closest resolutions to 2K today would be 1920x1080 and 1920x1200,
but because that would be a poor marketing move, the marketing people ended up settling on something below "2K" for these
and named 2560x1440 as 2K instead, even though it's well above 2K and would more accurately be called 2.5K or 2.6K.
Naming 1920x1080 and 1920x1200 "2K" would be a poor marketing move because "4K" (aka UHD, aka 3840x2160) has 4 times the pixels of 1920x1080,
so people would look at 2K (1920x1080) and 4K (3840x2160), see a massive price difference, and think "oh damn, this 4K is so much more expensive while being just TWO times better! I'm buying 2K." They would basically misunderstand everything, and so 2560x1440 ended up being named 2K by the marketing people.
If you want more info, this guy on Reddit (who apparently works/worked at Linus Tech Tips, and is at least still a mod on the LTT forum) explained it in depth:
start quote:
Long ago, the digital cinema industry used "#K" as a shorthand for horizontal resolution. Usually rounded to the nearest .5, so you would have (for example):
2K for ≈2000 pixel wide formats, like 2048×1080 or 1920×1080
2.5K for ≈2500 pixel wide formats, like 2560×1440 or 2400×1350
4K for ≈4000 pixel wide formats, like 4096×2160 or 3840×2160 etc.
It was just a casual shorthand, with no specific rules. See here: https://cdn.static-bl.com/images/manual/WEAPON-SCARLET-W-Operation-Guide.pdf#page=109
Then TV makers decided to start switching over to that convention when they transitioned to 3840×2160 televisions. This is a 4K format by the well-established cinema shorthand, so they used the term 4K. This makes sense.
Unfortunately, they also made a big deal of the fact that 3840×2160 had 4 times as many pixels as 1080p.
As a result, when people would ask "what's this new '4K' thing", people would inevitably answer "it's 4 times 1080p". Which, while true, is basically just a coincidence, and has nothing to do with why it's called "4K". But people were understandably misled by this.
Then some people decided they wanted to start being cool, and said "Aha, well if 4K is 4 times 1080p, and 2560×1440 is around twice as many pixels as 1080p, I'm gonna start calling it 2K! That will make me seem really cool!". And lots of people would ask them "what's 2K" (and we are talking about the type of people that really like it when people ask them what things mean, because it makes them feel like knowledgeable experts) and they would be more than happy to teach every person who asked how the "#K" notation system supposedly worked ("it means how many times 1080p it is!") and so the idea spread.
It's also worth noting this is around the time when phones started to have 2560×1440 screens for the first time, and everyone was talking about it, so a number of tech news outlets began using the term "2K screen" in the title instead of the conventional terms 1440p or QHD (or simply 2560×1440), likely to generate more clicks ("ooh, what does '2K' mean? I've never heard of it! Is it related to 4K? I'd better read to find out!"). And this helped cement the idea of the "#K = how many times 1080p" convention.
After tech news articles started using it, Newegg updated their website to label all 2560×1440 monitors and 3440×1440 monitors as "2K", even though 3440×1440 doesn't even meet the "how many times 1080p" convention; it makes no sense at all. They are literally just using "2K" as a drop-in replacement for 1440p with no understanding of what it is supposed to mean. But this also helped cement the "2560×1440 is 2K" idea.
Examples:
Newegg just lists resolutions with no names (on the left sidebar, expand the "Recommended Resolution" option)
Newegg starts incorrectly labeling things "2K" (on the left sidebar, in the "resolution" section, expand the "2K" option)
A few months later, ASUS updated their website to start referring to their new 2560×1440 models as 2K, and so did some other manufacturers. Viewsonic even started putting "2K" embedded in the model number of some of their displays.
Examples:
Compare "ASUS doesn't have 2K" with "ASUS changes product pages to have '2K'"
ViewSonic puts "2K" in model number
Of course, things reversed a little bit when the first 7680×4320 screens (and related things like Linus's 8K gaming video) started coming out, with lots of people asking "shouldn't it be 16K, since it's 16 times 1080p?" and people were able to correct them.
Demonstration: some search results for "16K" in the comments of this video: https://i.imgur.com/hCMQzvv.png (Depressing GIF version)
So now people mostly use the correct convention for resolutions higher than 4K.
However, even though people now use the correct terms for >4K resolutions, 2K referring to 2560×1440 instead of ≈1920×1080 is still left over as a vestige of all the confusion, because people are used to it. No matter how many times you try to correct it, people will always just go with whatever convention other people use, because we don't want to go against the flow, or seem pedantic or weird.
For that reason I don't think people will switch from calling it 2K to calling it 2.5K, and that is quite understandable. What does annoy me though are those people on the internet who have that instinctual urge to avoid admitting they were wrong at all costs, and start making backpedally excuses about how they were following the convention the whole time: "no, I was just rounding down to the nearest whole number", even though the original convention is clearly "2.5K". But oh well.
------------------------------------------------
By the way, on the topic of the whole "2K" thing, another thing you'll see a lot is people who say "2K refers to a specific DCI standard with a resolution of 2048×1080." That is incorrect. It is a generic term for resolutions approximately 2000 pixels in width. Yes, DCI uses the term "2K" in their standard as a descriptive term (because it is a 2K resolution), but that's not like an "official name" for the format or anything.
Example: here's a discussion about 2K and 4K resolutions from 2004 (note, that is before the DCI Specification even existed): https://web.archive.org/web/20090216161631/http://cinematography.net:80/edited-pages/Defining_2K_and_4K.htm
So no, DCI did not invent the term "2K" as a name for their standard; these terms are casual shorthand descriptions that had been in use for years before the DCI standard existed. DCI is just using it as a descriptive term. When they establish definitions for "2K" and "4K", they're basically just doing the same thing as every other long document does (such as legal documents), where they make a long description of something, give it a short name, and use that short name to refer to that long description to save space and typing.
So yes, "2K" means 2048×1080 within the scope of the document, just like legal documents when they say "this is a legal contract between The Company with a Creative Name, LLC (Creative) and the person who is operating the account that is posting messages on their forum (User)". It's well understood that this is not claiming to define the word "user", but just means "this long description will henceforth be known by this one word for the rest of this document for convenience". That is what the DCI Specification means by the definitions of a 2K and 4K distribution in section 4.3 and elsewhere. Outside the specification, it is just a generic term, as demonstrated by the archived discussion linked above (or refer to the Red camera manual linked way near the top).
It's basically like if I were to come up with a new standard that said "This standard establishes two standardized formats, a 16:9 format and a 21:9 format. The 16:9 format shall have a resolution of 1600×900, and the 21:9 format shall have a resolution of 2100×900, and they shall have 8 bpc color depth, 60 Hz refresh rate, [...]" And then, now that I've defined them, just refer to them as "the 16:9 format" and "the 21:9 format" throughout the rest of the standard, and it gets misinterpreted as an official name for the format or something, when in reality the "16:9" is meant to describe the format, not name it. (And then 10 years later a bunch of internet people start saying "no, 1920×1080 isn't true 16:9, true 16:9 is an official standard, see the definition here? It says 'the 16:9 format is defined as [...]' right here!".)
So "2K" is a generic term. If you want to refer specifically to the particular 2K format described by the DCI standard, the generally accepted name is simply "DCI 2K". The whole "2048×1080 is the one true definition of 2K" thing is not correct, and the same goes for its counterpart, the whole "4096×2160 is the one true 4K format" idea that you've probably heard before too. There is no such thing as "the one true 4K format", it's just a generic descriptive term, like 16:9.
end quote
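For what it's worth, the cinema shorthand the quote describes is simple enough to put into code. Here's a minimal Python sketch, assuming the "round the horizontal resolution to the nearest 0.5K" reading of the convention (the function name and the exact rounding rule are mine, not anything official):

```python
# A rough sketch of the cinema "#K" shorthand described in the quote:
# take the horizontal resolution, divide by 1000, round to the nearest 0.5.
# The exact rounding rule is just my reading of "usually rounded to the nearest .5".

def k_label(width: int, height: int) -> str:
    k = round(width / 1000 * 2) / 2       # nearest 0.5
    return f"{width}x{height} -> {k:g}K"  # :g drops the trailing .0

for w, h in [(2048, 1080), (1920, 1080), (2560, 1440),
             (2400, 1350), (3840, 2160), (4096, 2160)]:
    print(k_label(w, h))

# 2048x1080 -> 2K
# 1920x1080 -> 2K
# 2560x1440 -> 2.5K
# 2400x1350 -> 2.5K
# 3840x2160 -> 4K
# 4096x2160 -> 4K
```

By that convention 2560x1440 lands at 2.5K, not 2K, which is the whole point of the quote.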
No one quoting technical specs is wrong. However, the world is chock-full of examples where terms and colloquialisms get muddy as they filter down to the consumer level.
When someone asks a question that clearly refers to a commonly used colloquialism, responding by arguing that the widely used layman's term is wrong in favor of technical terms and definitions laymen don't use is a classic blunder.
If we apply this same argument to something we take for granted, say "kilobytes": how many bytes are in a kilobyte anyway? The common, almost universal understanding is 1024. But it's wrong. Kilo is a decimal prefix; it literally means 1,000. But everyone has been misusing it for so long that sometimes people will argue that the correct use is dishonest; see HDD/SSD manufacturers who advertise their disk sizes with the decimal value rather than the binary value. It was less of a problem when you were a few bytes off, or a few KB, or a few MB, but the larger you go, the wider the difference gets. 1 TB vs roughly 931 GB: a difference of nearly 70 GB is a lot.
And we have binary prefixes in the form of kibi-, mebi-, gibi-, and tebibytes, i.e. KiB, MiB, GiB, TiB. But no one is going to use them; no one wants to. Using the same terms for decimal and binary measurements is locked in.
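To put actual numbers on that gap, here's a quick sketch (the "reported" figure is just bytes divided by 2^30, which is what an OS that counts in binary units but still labels them "GB" will show):

```python
# The decimal-vs-binary prefix gap, with actual numbers.
advertised_bytes = 10**12                  # a drive sold as "1 TB" (decimal prefix)

decimal_gb = advertised_bytes / 10**9      # 1000 GB, as advertised
binary_gib = advertised_bytes / 2**30      # ~931.3 GiB, often displayed as "GB"

print(f"Advertised:    {decimal_gb:.0f} GB")
print(f"Reported:      {binary_gib:.0f} GB (really GiB)")
print(f"Apparent loss: {decimal_gb - binary_gib:.0f} GB")

# Advertised:    1000 GB
# Reported:      931 GB (really GiB)
# Apparent loss: 69 GB
```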
So at what point exactly does an "incorrect" but universally used term become de facto correct?
So yeah, some people may not like 2K as a term because it's a bit messy. No one really uses 1K, and 4K is locked in. The battle is lost; we're not erasing 15-20 years of colloquialisms. Ignoring that is at least as foolish as the adoption of sloppy terms. But welcome to Earth, where the rules are made up and the points don't matter.
4320p (8K): 7680x4320
2160p (4K): 3840x2160
1440p (2K): 2560x1440
1080p (FHD): 1920x1080
720p (HD): 1280x720
480p (SD): 854x480
360p (SD): 640x360
240p (SD): 426x240
1080p is to 2160p as 720p is to 1440p (each pair differs by exactly 2x in width and height), so logically 1080p is the real 2K if 2160p is definitively 4K.