XxzodazZ Aug 18, 2020 @ 9:57pm
3840 x 2160 pixels or 4096 x 2160
I want to set some games' resolution to 4096 x 2160 (my true native res), but to eliminate black vertical borders in several different games I use a program to override 4096 and make it 3840. However, I then can't select 4096 and have to use only 3840. Sounds like just leaving it alone would work fine, but many games don't scale to 4096 and have borders. To complicate things, some games think your monitor has only a few low-res options in borderless or windowed mode without fullscreen, etc. This is annoying. I am very used to the Nvidia control panel and have tried all the scaling options (scale to full screen, aspect ratio, etc.) to no avail. Any help would be much appreciated.
Last edited by XxzodazZ; Aug 19, 2020 @ 8:09am
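For reference, one way to see exactly which modes the driver exposes (and whether the override program mentioned above really has hidden 4096x2160) is to enumerate them from a script. This is only a minimal sketch, assuming Windows and the third-party pywin32 package; it is not the unnamed override tool from the post, just a way to inspect the mode list.

```python
# Minimal sketch (assumes pywin32): list every display mode Windows reports
# for the primary display, so you can check whether 3840x2160, 4096x2160,
# or both are actually exposed to games.
import win32api

def list_display_modes():
    modes = set()
    i = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
        except Exception:  # pywin32 raises once there are no more modes
            break
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
        i += 1
    return sorted(modes)

if __name__ == "__main__":
    for width, height, hz in list_display_modes():
        print(f"{width} x {height} @ {hz} Hz")
```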
Showing 16-30 of 36 comments
XxzodazZ Aug 19, 2020 @ 7:33pm 
Originally posted by Bad 💀 Motha:
Originally posted by SkunkJ:
What the kicker is, out of the box Windows showed 3840x2160 as my native res with 4096x2160 above it. My TV is the latter resolution, and games running at 4096 (GTA 5, Final Fantasy, Horizon Zero Dawn) all had borders; it messed 'em up. So I had to look forever to find a program to make 4096 invisible to my video card. Now I can't say there's any major problem, but at 3840 it is being stretched ever so slightly. Oh well...

Because TVs generally are not programmed the way monitors are and aren't meant to display certain resolutions. Your TV might do both 4K resolutions just fine; however, it will simply be stretching the picture if you use 3840 on it.
Yep. I think 3840 looks fair. My GPU is stretching it. I just wish the games would all run in 4096 without the borders on the side. If I set desktop res to 3840, the games behave.
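Since setting the desktop to 3840 before launching is the workaround that seems to behave here, the switch can also be scripted so the old mode comes back automatically. A rough sketch only, assuming Windows and the pywin32 package; run_at_resolution and the example game path are made up for illustration.

```python
# Illustrative sketch (assumes pywin32): switch the desktop to a given
# resolution, launch a game, then restore the previous mode on exit.
import subprocess
import win32api
import win32con

def run_at_resolution(width, height, command):
    # Remember the current mode so it can be restored afterward.
    previous = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

    # Request the target resolution, keeping everything else as-is.
    target = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    target.PelsWidth = width
    target.PelsHeight = height
    target.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

    if win32api.ChangeDisplaySettings(target, 0) != win32con.DISP_CHANGE_SUCCESSFUL:
        raise RuntimeError(f"Display rejected {width}x{height}")
    try:
        subprocess.run(command)  # blocks until the game exits
    finally:
        win32api.ChangeDisplaySettings(previous, 0)  # restore the old mode

# Example call (path is hypothetical):
# run_at_resolution(3840, 2160, [r"C:\Games\SomeGame\game.exe"])
```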
XxzodazZ Aug 19, 2020 @ 7:35pm 
What also bothers me is that I guess I should have done more homework. These new 4K TVs are rated by gamers on input lag and given game ratings, kind of like they're meant to handle both PC and other media.
Bad 💀 Motha Aug 19, 2020 @ 7:45pm 
Originally posted by SkunkJ:
Originally posted by Bad 💀 Motha:

Because TVs generally are not programmed the way monitors are and aren't meant to display certain resolutions. Your TV might do both 4K resolutions just fine; however, it will simply be stretching the picture if you use 3840 on it.
Yep. I think 3840 looks fair. My GPU is stretching it. I just wish the games would all run in 4096 without the borders on the side. If I set desktop res to 3840, the games behave.

It won't have any borders if you set the NVIDIA Control Panel GPU scaling to stretch.
However, it's probably best to let the TV stretch the 3840 res instead. That way you can be more sure that game performance isn't impacted.
Snow Aug 19, 2020 @ 9:47pm 
Originally posted by Illusion of Progress:
Today I learned 4k isn't 3840 x 2160 but is slightly wider.
Here's another cool bit of info for you: 2K isn't 1440p, it's 1080p. Most people still can't get this right.
Last edited by Snow; Aug 19, 2020 @ 9:47pm
Bad 💀 Motha Aug 19, 2020 @ 9:55pm 
IDK why anyone uses 2K or 2.5K to refer to 1080p and 1440p anyway. It's simply not correct. But yeah, if anything, 2K is more accurate for 1080p, due to 1920 being close enough to 2K, just like 3840 is close enough to 4K.

While 1080p is 2K more or less, 1440p is 1.78x above 1080p, so 2.5K isn't correct for 1440p, as it's more like 2.8K.

This whole 2K, 2.5K, 4K BS is just the industry trying to round off numbers to simplify it all for clueless consumers.

Just say what you mean by calling out each screen res and forget this whole 2K, 4K nonsense.
Last edited by Bad 💀 Motha; Aug 19, 2020 @ 9:55pm
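For what it's worth, the "K" labels being argued about here come from rounding the horizontal pixel count to the nearest thousand. The following is just that arithmetic, with nothing assumed beyond the standard widths themselves:

```python
# The "K" shorthand comes from the horizontal pixel count in thousands.
widths = {
    "DCI 2K": 2048,
    "1080p / FHD": 1920,
    "1440p / QHD": 2560,
    "UHD": 3840,
    "DCI 4K": 4096,
}
for name, w in widths.items():
    print(f"{name:12s} {w} px wide = {w / 1000:.2f}K")
```

By that width convention 2560 lands at roughly 2.6K, which is where the informal 2.5K label comes from; the 2.8K figure above comes from scaling by total pixel count instead.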
Tristan Aug 19, 2020 @ 10:51pm 
Originally posted by Bad 💀 Motha:
IDK why anyone uses 2K or 2.5K to refer to 1080p and 1440p anyway. It's simply not correct. But yeah, if anything, 2K is more accurate for 1080p, due to 1920 being close enough to 2K, just like 3840 is close enough to 4K.

While 1080p is 2K more or less, 1440p is 1.78x above 1080p, so 2.5K isn't correct for 1440p, as it's more like 2.8K.

This whole 2K, 2.5K, 4K BS is just the industry trying to round off numbers to simplify it all for clueless consumers.

Just say what you mean by calling out each screen res and forget this whole 2K, 4K nonsense.


You have a very good point, but I don't think it's the big companies doing this.

It's just like you said, people showing off their PCs. I also noticed this from time to time.
Illusion of Progress
Originally posted by Snow:
Originally posted by Illusion of Progress:
Today I learned 4k isn't 3840 x 2160 but is slightly wider.
Here's another cool bit of info for you: 2K isn't 1440p, it's 1080p. Most people still can't get this right.
Well, I know that one. I just didn't know there was a prior, cinema-inspired 4k and that PCs used a different 4k resolution. I thought the entire point of 16:9 coming to PC was to unify them (well, production costs too).
Originally posted by Bad 💀 Motha:
While 1080p is 2K more or less, 1440p is 1.78x above 1080p, so 2.5K isn't correct for 1440p, as it's more like 2.8K.
Eh? 1440 is 133% of 1080, no? How are you getting almost two times larger? 1.78x larger would mean the horizontal resolution would be well north of 3,000 pixels (around 3,500-ish, plus or minus).

And I never heard of 2.5k, but if people are applying the logic that the horizontal resolution determines the amount, then actually, 1440p (2560 horizontal resolution) would be proper to call 2.5k (closer to 2.6k if you want to round to the nearest but still).

Although, yeah, I agree with you on there being numerous overlapping terms and it getting annoying. But for making average Joe consumer understand, it sort of works. They know there's this new thing called 4k, and they know of 1080p (or full HD). Now there's this increasingly common 1440p (for PCs at least). 2k and 2.5k work as decent enough lazy latch-on terms for the average consumer to understand the differences and how they all line up compared to the new 4k naming scheme. I guess 2160p and later 4320p (if this is what 8k is vertically?) weren't as favored. Add in 5k and such and I sort of get it.
Last edited by Illusion of Progress; Aug 20, 2020 @ 1:18am
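The two figures being thrown around in this exchange, 133% and 1.78x, are both correct; they just measure different things. A quick check, using plain arithmetic and nothing beyond the standard resolutions:

```python
# Per-axis scale vs. total-pixel scale from 1080p (FHD) to 1440p (QHD).
fhd_w, fhd_h = 1920, 1080
qhd_w, qhd_h = 2560, 1440

linear = qhd_h / fhd_h                    # height (or width) ratio
area = (qhd_w * qhd_h) / (fhd_w * fhd_h)  # total pixel-count ratio

print(f"Per-axis:     {linear:.2f}x")  # 1.33x -> the "133%" figure
print(f"Total pixels: {area:.2f}x")    # 1.78x -> the "1.78x" figure
```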
Tristan Aug 20, 2020 @ 3:58am 
Originally posted by Illusion of Progress:
Originally posted by Bad 💀 Motha:
While 1080p is 2K more or less, 1440p is 1.78x above 1080p, so 2.5K isn't correct for 1440p, as it's more like 2.8K.
But for making average Joe consumer understand, it sort of works. They know there's this new thing called 4k, and they know of 1080p (or full HD). Now there's this increasingly common 1440p (for PCs at least). 2k and 2.5k work as decent enough lazy latch-on terms for the average consumer to understand the differences and how they all line up compared to the new 4k naming scheme. I guess 2160p and later 4320p (if this is what 8k is vertically?) weren't as favored. Add in 5k and such and I sort of get it.


Yeah, people like me :steamhappy:
Snow Aug 20, 2020 @ 5:22pm 
Originally posted by Bad 💀 Motha:
While 1080p is 2K more or less
Some 1080p resolutions are 2K, and some aren't. Depends on the width.
Originally posted by Illusion of Progress:
Eh? 1440 is 133% of 1080, no?
I guess he meant QHD over FHD, so the total pixel count, not just height.
Originally posted by Illusion of Progress:
But for making average Joe consumer understand, it sort of works.
Neither 2K nor 4K was meant for consumers; they're cinema standards. The CEA allows putting 4K next to UHD, but saying just 2K or 4K when talking about consumer electronics simply isn't right.
Illusion of Progress
Originally posted by Snow:
Originally posted by Illusion of Progress:
Eh? 1440 is 133% of 1080, no?
I guess he meant QHD over FHD, so the total pixel count, not just height.
Oh, that would explain where the number comes from, but it makes zero sense to compare it that way. That's literally saying "1080p is 2k because its horizontal resolution is close to 2,000 pixels, but we can't call 1440p 2.5k even though its horizontal resolution is close to 2,500 pixels because... of this completely separate criterion in which they differ by a different amount". Again, that makes zero sense. Of course, if you change the measuring criteria halfway through, the results won't line up.
Originally posted by Snow:
Originally posted by Illusion of Progress:
But for making average Joe consumer understand, it sort of works.
Neither 2K nor 4K was meant for consumers; they're cinema standards. The CEA allows putting 4K next to UHD, but saying just 2K or 4K when talking about consumer electronics simply isn't right.
Right, initially. But as with many things in life, what it was meant as originally, and what it is later, aren't always the same. Whether it's 100% accurate or proper is neither here nor there as it's apparently what they've decided to go with for it. I'm not saying I personally wholly support it, but I get why it became a thing, and it sort of works for that purpose regardless.
Last edited by Illusion of Progress; Aug 20, 2020 @ 8:33pm
Snow Aug 20, 2020 @ 8:49pm 
Originally posted by Illusion of Progress:
Oh, that would explain where the number comes from, but it makes zero sense to compare it that way. That's literally saying "1080p is 2k because its horizontal resolution is close to 2,000 pixels, but we can't call 1440p 2.5k even though its horizontal resolution is close to 2,500 pixels because... of this completely separate criterion in which they differ by a different amount". Again, that makes zero sense. Of course, if you change the measuring criteria halfway through, the results won't line up.
There's no "measuring criteria" of any sort, only standards. 2K is 2048x1080, which makes it a 1080p resolution. There's no such thing as 2.5K; no one has established such a standard, to my knowledge.
Originally posted by Illusion of Progress:
Neither 2K nor 4K was meant for consumers; they're cinema standards. The CEA allows putting 4K next to UHD, but saying just 2K or 4K when talking about consumer electronics simply isn't right.
Right, initially. But as with many things in life, what it was meant as originally, and what it is later, aren't always the same. Whether it's 100% accurate or proper is neither here nor there as it's apparently what they've decided to go with for it. I'm not saying I personally wholly support it, but I get why it became a thing, and it sort of works for that purpose regardless.
It still is what it originally was. DCI didn't redefine 2K and 4K, and the CEA didn't redefine FHD, QHD, or UHD. It sure is common to confuse cinema resolutions with consumer resolutions, but it's up to us whether we tell people how it all actually works or go along with all that. I personally don't mind explaining resolutions to those confused by marketing and people online, as long as they listen.
Illusion of Progress
Originally posted by Snow:
There's no "measuring criteria" of any sort, only standards. 2K is 2048x1080, which makes it a 1080p resolution. There's no such thing as 2.5K; no one has established such a standard, to my knowledge.
You're missing the point. Whether it exists or not as an official thing is an entirely separate point that was neither here nor there. Okay, 2k is 2048 x 1080 and not 1920 x 1080. Okay, 4k is also 4096 x 2160 and not 3840 x 2160. Nice, I get it. But that's neither here nor there. The other person was saying this...

"IDK why anyone uses 2K or 2.5K to refer to 1080p and 1440p anyways. It's simply not correct. But yea if anything 2K is more accurate for 1080p, due to the 1920 being close enough to 2K, just like 3840 is close enough to 4K

While 1080p is 2K more/less, 1440p is 1.78X above 1080p, so 2.5K isn't correct for 1440p as its more like 2.8K"

Bold is my emphasis. Do I really have to point out the obvious inconsistency there? You can't say 1920 is 2k and 3840 is 4k and then turn around and say it doesn't match up to call 2560 2.5k. That's all I'm saying. I'm not making an argument here to officially call 2560 x 1440 as 2.5k.
Originally posted by Snow:
It still is what it originally was. DCI didn't redefine 2K and 4K, and the CEA didn't redefine FHD, QHD, or UHD. It sure is common to confuse cinema resolutions with consumer resolutions, but it's up to us whether we tell people how it all actually works or go along with all that. I personally don't mind explaining resolutions to those confused by marketing and people online, as long as they listen.
Sure, and you do you; I don't mind. Again, you're taking me to be making an argument I'm not; I'm simply saying I understand WHY it happens. In short, marketing.
Last edited by Illusion of Progress; Aug 21, 2020 @ 6:29pm
76561198264707224 Aug 21, 2020 @ 11:34pm 
This might be what is causing all of this on the market.

Sony's HEVC & H.265 is what we've been used to using all these years for 4K, but I noticed the problem is in Movies & TV too.

Microsoft switched out HEVC & H.265 for a less costly format called CV-1, which is not as great or as nice looking as HEVC & H.265 from Sony.

HEVC & H.265 = 4096 res native, smooth, & sharp (costly)
MPEG2 & CV-1 = 3840 res, stretched slightly, blurry, & laggy (cheaper)

Microsoft doesn't want to pay Sony the costs anymore, so they are switching to a cheaper, more affordable format now.

This might be why you're getting your resolution issues, but it also might not be.

------------------------------------------------

I'm aware I'm getting this info from a movie format, but after looking it up myself, I just think this might be what is causing the issues.

Even movies in 1080p hi-def look horrible now, mostly when paused. They don't even look hi-def anymore.

I can only imagine what games would look like now. Not very pleasing to view while playing your games.
Tristan Aug 21, 2020 @ 11:58pm 
So can any one of you guys help me out?

It looks like I am in the right place for this question.

Please look at the image. Which one should I choose to play Skyrim at?


https://www.dropbox.com/s/7lv8dk2hlvjd1z5/Screenshot%202020-08-22%2008.53.05.png?dl=0

Date Posted: Aug 18, 2020 @ 9:57pm
Posts: 36