It won't have any borders if you set the GPU scaling mode to "stretch" in the NVIDIA Control Panel.
However, it's probably best to let the TV stretch the 3840 resolution instead. That way you can be more confident that game performance isn't impacted.
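For what it's worth, before deciding whether the GPU or the TV should do the scaling, you can confirm what resolution Windows is actually outputting. A minimal Python sketch using the standard user32 calls (Windows only; assumes you care about the primary display):

import ctypes

# Opt out of DPI virtualization so we read physical pixels,
# not values scaled by the desktop zoom level.
ctypes.windll.user32.SetProcessDPIAware()

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # GetSystemMetrics indices for the primary display
width = ctypes.windll.user32.GetSystemMetrics(SM_CXSCREEN)
height = ctypes.windll.user32.GetSystemMetrics(SM_CYSCREEN)

print(f"Desktop output: {width} x {height}")
# 3840 x 2160 here means the GPU is sending the full resolution
# and the TV isn't being asked to stretch anything.

If this reports something lower than 3840 x 2160, the scaling is happening on the PC side before the signal ever reaches the TV.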
While 1080p is more or less 2K, 1440p has about 1.78× the pixels of 1080p, so 2.5K isn't correct for 1440p as it's more like 2.8K.
This whole 2K, 2.5K, 4K BS is just the industry trying to round off numbers to simplify it all for clueless consumers.
Just say what you mean by calling out each screen res and forget this whole 2K, 4K nonsense.
You have a very good point, but I don't think it's the big companies doing this.
It's just like you said: people showing off their PCs. I've also noticed this from time to time.
And I'd never heard of 2.5k, but if people are applying the logic that the horizontal resolution determines the number, then 1440p (2560 horizontal pixels) would actually be the proper one to call 2.5k (closer to 2.6k if you round to the nearest, but still).
Although, yeah, I agree with you on there being numerous overlapping terms and it getting annoying. But for making average Joe consumer understand, it sort of works. They know there's this new thing called 4k, and they know of 1080p (or full HD). Now there's this increasingly common 1440p (for PCs at least). 2k and 2.5k work as decent enough lazy latch-on terms for the average consumer to understand the differences and how they all line up compared to the new 4k naming scheme. I guess 2160p and later 4320p (if this is what 8k is vertically?) weren't as favored. Add in 5k and such and I sort of get it.
Yea, people like me.
I guess he meant QHD over FHD, so the total pixel count, not just height.
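To put numbers on both readings of "K" being argued here (horizontal pixels vs. total pixel count), a quick Python sketch; the resolutions are the standard ones mentioned in this thread:

# Common resolutions from this thread: name -> (width, height)
resolutions = {
    "1080p / FHD": (1920, 1080),
    "1440p / QHD": (2560, 1440),
    "2160p / UHD": (3840, 2160),
    "DCI 4K":      (4096, 2160),
}

fhd_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    k_wide = w / 1000              # "K" read as horizontal pixels in thousands
    ratio = (w * h) / fhd_pixels   # total pixels relative to 1080p
    print(f"{name}: {w}x{h} -> ~{k_wide:.2f}K wide, {ratio:.2f}x the pixels of 1080p")

By the horizontal reading, 1440p lands at ~2.56K, so "2.5K" rounds about as fairly as "4K" does for 3840; by total pixel count it's 1.78× 1080p, which is where the 1.78 figure earlier in the thread comes from.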
Neither 2K nor 4K was meant for consumers; they're cinema standards. The CEA allows putting 4K next to UHD, but saying just 2K or 4K when talking about consumer electronics simply isn't right.
Right, initially. But as with many things in life, what it was meant as originally, and what it is later, aren't always the same. Whether it's 100% accurate or proper is neither here nor there as it's apparently what they've decided to go with for it. I'm not saying I personally wholly support it, but I get why it became a thing, and it sort of works for that purpose regardless.
It still is what it originally was. DCI didn't redefine 2K and 4K, CEA didn't redefine FHD, QHD or UHD. It sure is common to confuse cinema resolutions with consumer resolutions, but it's up to us if we tell people how it all actually works or go in line with all that. I personally don't mind explaining resolutions to those confused by marketing and people online as long as they listen.
"IDK why anyone uses 2K or 2.5K to refer to 1080p and 1440p anyways. It's simply not correct. But yea if anything 2K is more accurate for 1080p, due to the 1920 being close enough to 2K, just like 3840 is close enough to 4K
While 1080p is 2K more/less, 1440p is 1.78X above 1080p, so 2.5K isn't correct for 1440p as its more like 2.8K"
Bold is my emphasis. Do I really have to point out the obvious inconsistency there? You can't say 1920 is 2k and 3840 is 4k and then turn around and say it doesn't match up to call 2560 2.5k. That's all I'm saying. I'm not making an argument here to officially call 2560 x 1440 as 2.5k.
Sure, you do you; I don't mind. Again, you're mistaking me for making an argument I'm not making; I'm simply saying I understand WHY it happens. In short: marketing.
Sony's HVEC & H.2.65 is what we are used to using all these Years for 4K.
But I noticed the problem is also in Movies & TV, too.
Microsoft switched out the HEVC & H.265 for a less costly format called CV-1,
which is not as great, nor as nice looking as the HEVC & H.265 by Sony.
HEVC & H.265 = 4096 Res Native, Smooth, & Sharp (Costly)
MPEG2 & CV-1 = 3086 Res Stretched Slightly, Blurry, & Laggy (Cheaper)
Microsoft doesn't wanna pay Sony the costs anymore, so they are changing this format to
a Cheaper & more affordable cost now.
This might be why your getting your Resolution issues, but it also might not.
------------------------------------------------
I'm aware I'm getting this info from a movie format, but after looking it up myself,
I just think this might be what's causing the issues.
Movies in even 1080p hi-def look horrible now, mostly when paused.
They don't even look hi-def anymore.
I can only imagine what games would look like now. Not very pleasing to view while playing your games.
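If anyone wants to check which codec a file or recording actually uses rather than guessing, ffprobe (it ships with FFmpeg) will report it. A minimal Python wrapper sketch, assuming ffprobe is on your PATH and "video.mp4" stands in for whatever file you're checking:

import subprocess

def video_codec_and_size(path: str) -> str:
    """Ask ffprobe for the first video stream's codec name and dimensions."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",  # first video stream only
            "-show_entries", "stream=codec_name,width,height",
            "-of", "default=noprint_wrappers=1",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print(video_codec_and_size("video.mp4"))
# e.g. codec_name=hevc / width=3840 / height=2160

An H.265 stream reports codec_name=hevc; if the "CV-1" mentioned above is meant to be AV1 (my guess, not confirmed here), it would report codec_name=av1.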
It looks like I am in the right place for this question.
Please look at the image: which one should I choose to play Skyrim on?
https://www.dropbox.com/s/7lv8dk2hlvjd1z5/Screenshot%202020-08-22%2008.53.05.png?dl=0