The correct term for this technique is dithering. Dithering is a low-cost transparency effect that fakes transparency without any real blending (i.e. without actually layering multiple colors on top of each other).
The solution for MGS5 was simply to up the render detail in the graphics options; it is the post-processing setting that activates true transparency.
https://steamcommunity.com/sharedfiles/filedetails/?id=2015516381
Oh noes my graphics card is malfunctioning
https://steamcommunity.com/sharedfiles/filedetails/?id=2015516420
Post Processing "Extra High"
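For anyone curious what that screen-door trick actually boils down to, here's a minimal CPU-side sketch (plain C, my own illustration, not MGS5's actual shader): each pixel of a "transparent" surface is either kept or discarded based on an ordered-dither threshold, which from a distance reads as partial transparency.

#include <stdio.h>

/* 4x4 Bayer matrix, normalized to thresholds in [0, 1). */
static const float bayer4[4][4] = {
    { 0.0f/16,  8.0f/16,  2.0f/16, 10.0f/16},
    {12.0f/16,  4.0f/16, 14.0f/16,  6.0f/16},
    { 3.0f/16, 11.0f/16,  1.0f/16,  9.0f/16},
    {15.0f/16,  7.0f/16, 13.0f/16,  5.0f/16},
};

/* Keep the pixel at (x, y) only if the surface's alpha beats the local
 * dither threshold; otherwise discard it and let the background show. */
static int dither_keep(int x, int y, float alpha)
{
    return alpha > bayer4[y % 4][x % 4];
}

int main(void)
{
    /* An 8x8 patch of a 50%-transparent surface: '#' = drawn, '.' = hole.
     * The regular grid of holes is essentially the pattern visible in the
     * screenshots above. */
    for (int y = 0; y < 8; y++) {
        for (int x = 0; x < 8; x++)
            putchar(dither_keep(x, y, 0.5f) ? '#' : '.');
        putchar('\n');
    }
    return 0;
}

Raising the post-processing/render detail lets the game do real alpha blending (or at least render the dither at a much finer grain), which going by the second screenshot is why the pattern disappears there.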
2080 Ti fails to achieve 144fps at 1080p in The Witcher 3.
https://www.youtube.com/watch?v=pDlfWPXpsmI
In The Division 2.
https://www.youtube.com/watch?v=Om5YCsGuEEs
In Forza Horizon 4.
https://www.youtube.com/watch?v=hTiEKEvYI0A
In SotTR.
https://www.youtube.com/watch?v=3f0bPvbUQ_U
Tell me about "not breaking a sweat". Or were you referring to CS:GO?
Upping your resolution, or at least the resolution of the transparency effect, does make the screen-door effect on said transparent objects less visible.
From my personal experience there are a couple of things affecting how subtle the effect is, the internal render resolution and the render resolution of the effect itself being the main ones.
Maybe certain drivers on certain graphics cards mess with the effect rendering, but I haven't been able to pinpoint that yet.
Which is why I'm always interested in which resolution people play in so that I can test said games out myself.
Personally my worst offender was way back when I played Fable 3 on an ATI Radeon HD2400. Had to run that game at lowest settings @480p to even break 30fps though.
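To put some rough numbers on the resolution point above: the dither pattern lives in screen (or effect-buffer) pixels, so every step up in resolution shrinks the holes in the screen door. A quick back-of-the-envelope sketch in C, assuming a 24-inch 16:9 panel (the panel size is just an example I picked):

#include <stdio.h>
#include <math.h>

int main(void)
{
    const double diag_in = 24.0;  /* assumed monitor diagonal in inches */
    const double width_in = diag_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const int widths[] = {1280, 1920, 2560, 3840};

    for (int i = 0; i < 4; i++) {
        /* Pixel pitch = physical width of one dither "hole" on this panel. */
        double pitch_mm = width_in * 25.4 / widths[i];
        printf("%4d px wide: pixel pitch %.3f mm\n", widths[i], pitch_mm);
    }
    return 0;
}

Going from 1920 to 3840 pixels across halves the pitch (roughly 0.28 mm down to 0.14 mm here), so each hole covers a quarter of the area and the pattern is far harder to spot.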
Of course the 2080 Ti doesn't achieve 144fps in Witcher 3, they've got Hairworks enabled lol. It is an outlier. That's similar to me saying "GPU X can't achieve 60 FPS in game Y so it's not a 4K-capable card" while running said game with 8xMSAA or something ridiculous like that (Tip: ultra settings pretty much always destroy FPS without adding much noticeable detail).
There will always be games that perform horribly with certain graphics settings. Heck, I can't even consistently reach 60fps in VRChat with a 1080.
All I'm saying is that, generally speaking, the 2080 Ti absolutely destroys games at 1080p and usually runs into a CPU bottleneck before anything else at that resolution.
I'm playing at 3440x1440 with a GTX 1080 and I get 60-90 fps in pretty much any game I play. If I were to drop down to 1080p, I can assure you I'd easily get 100+ fps in every game (barring the obvious exceptions).
But that wasn't the point of my comment. I was just curious why they chose that particular configuration, because most of the time I'd recommend going with at least a 1440p display with a setup like that; they've got the horsepower to easily do so.
Most people don't aim for 144+ fps. The majority of people are absolutely happy playing at 60-70 fps.
Anyway, we're going off-topic here.
Since MSAA renders parts of the image at a higher resolution, it would indeed have been unfair to use it in benchmarking. And yeah, ultra vs. high often makes little to no visual difference, unless it's HBAO+ vs. SSAO - then the difference is huge and totally worth it.
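For the curious, here's a back-of-the-envelope sketch (plain C, my own numbers, assuming RGBA8 colour plus a 32-bit depth buffer) of why cranking MSAA gets expensive so fast: the render targets store one colour and one depth value per pixel per MSAA sample, so memory and resolve bandwidth scale with the sample count.

#include <stdio.h>

int main(void)
{
    const long long w = 1920, h = 1080;
    const int bytes_color = 4;   /* RGBA8 colour */
    const int bytes_depth = 4;   /* 32-bit depth */
    const int samples[] = {1, 2, 4, 8};

    for (int i = 0; i < 4; i++) {
        long long bytes = w * h * (long long)(bytes_color + bytes_depth) * samples[i];
        printf("%dx MSAA at 1080p: %.1f MiB of render targets\n",
               samples[i], bytes / (1024.0 * 1024.0));
    }
    return 0;
}

At 8x that's roughly 127 MiB for a single 1080p colour/depth pair versus about 16 MiB without MSAA, before counting any other buffers, which is a big part of why 8xMSAA wrecks benchmark numbers.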
That thing runs Unity. The only smooth experience I had with Unity was Ori and the Blind Forest, and even then I had to add something like -DX9 to the shortcut to make it stop stuttering. Screw Unity.
And what I'm saying is they could've bought a 2080 Ti for high-refresh-rate gaming, which can be just as taxing as high-resolution gaming. Generally, 1080p at 144fps takes around the same amount of GPU power as 2160p at 60fps, or even more, unless it's something really old.
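Quick numbers behind that comparison (my own arithmetic, counting raw pixels only and ignoring per-frame CPU/geometry/driver costs):

#include <stdio.h>

int main(void)
{
    struct { const char *name; long long w, h; int fps; } modes[] = {
        {"1920x1080 @ 144", 1920, 1080, 144},
        {"3840x2160 @  60", 3840, 2160,  60},
    };

    for (int i = 0; i < 2; i++) {
        double mpix_per_s = modes[i].w * modes[i].h * (double)modes[i].fps / 1e6;
        double budget_ms  = 1000.0 / modes[i].fps;   /* time allowed per frame */
        printf("%s: %6.1f Mpix/s, %5.2f ms per frame\n",
               modes[i].name, mpix_per_s, budget_ms);
    }
    return 0;
}

On raw pixel throughput 4K60 is actually the bigger number (~498 vs ~299 Mpix/s), but 144fps only leaves about 6.9 ms per frame instead of 16.7 ms, so everything that costs a fixed amount per frame weighs far more heavily, which is why high-refresh gaming can end up just as demanding in practice.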
I'm actually quite happy playing at 30-60 fps, but I'd always drop a few settings to get higher performance. Depending on the game, it can just as well be resolution: if the game has tons of nice effects and smart AA, I'd rather play at a lower resolution than lose all the shadows/reflections/whatever. Control is one of those games where I've sacrificed resolution instead of effects to get everything smooth.
But, like, why not? I quite enjoyed our conversation here, and there is actually no topic anymore - after all, it's a 2015 thread and all the right answers have been given already. Might as well just talk stuff here.