I mean... modern Far Cry, Crysis, Metro, Battlefield, Bethesda, and Call of Duty titles, all the popular F2P, e-sports, and BR games, the modern Civ and Anno titles, the Borderlands games, The Outer Worlds, Monster Hunter: World, GTA, Killing Floor 2, The Forest, R6S...
I can't think of a game I have to drop any lower than mostly high with a couple of select mediums, and in most I get even better than that.
Seriously, what title do you play where you've had this issue?...
I mean, I get what you are saying to an extent, but to have anything in the 1080/Vega 64/2070/5700 XT or greater range and still struggle to run 4K at reasonable settings seems odd to me...
Even in such a case, just pull the old NV trick and use some form of low-load AA combined with downscaling, à la fake DLSS. In side-by-side testing with still-image capture, something like traditional 2x AA at ~75% res scale, or TAA with a decent 70-80% res scale, will produce similar quality and similar FPS to first-generation DLSS...
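To put rough numbers on that trick, here is a minimal sketch of how much pixel work each scale setting actually renders. It assumes the in-game resolution-scale slider applies per axis (which matches the "200% scale = 8K rendering" observation further down this thread); the function names are just for illustration.

```python
# Back-of-the-envelope render-scale math.
# Assumption: the resolution-scale slider applies per axis.

def render_resolution(out_w, out_h, scale_pct):
    """Internal render resolution for a given output resolution
    and per-axis resolution-scale percentage."""
    return round(out_w * scale_pct / 100), round(out_h * scale_pct / 100)

def pixel_work(out_w, out_h, scale_pct):
    """Fraction of native pixel-shading work actually rendered."""
    w, h = render_resolution(out_w, out_h, scale_pct)
    return (w * h) / (out_w * out_h)

if __name__ == "__main__":
    for pct in (50, 70, 75, 80, 100, 200):
        w, h = render_resolution(3840, 2160, pct)
        print(f"{pct:>3}% scale -> {w}x{h} "
              f"({pixel_work(3840, 2160, pct):.0%} of native 4K)")
```

At 75% scale you only shade about 56% of the native 4K pixels, which is where the DLSS-like FPS headroom comes from.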
And before you ask about my computer, it's a Ryzen 5 2600 overclocked @ 4.2 GHz all-core with 16GB DDR4 @ 3533 MHz @ 16-16-16, and Windows 10 64-bit. I might have a better time of it if I had a 9900K or Ryzen 3900X system. That may be a factor, but I don't know if it would make that big of a difference; 4K is usually graphics-card limited, not so much system limited.
We have the same CPU :D
Fair enough, out of those the only one I have is Borderlands 3, which I haven't played much of (had my Borderlands fill in 1/2 and the Pre-Sequel). That said, I will give it to you that BL3 seems to eat pixels, from the testing I ran tonight. Still, you should definitely be able to pull reasonable settings at full native 4K, or at 4K output with scaling.
It will take me a bit to render them all down from the 100 Mb/s raw footage and then upload them, because they are 4K, but I ran some testing for you and have 6 bench runs.
The primary three (the ones I would actually use) are: settings balanced for native 4K, settings balanced for 4K output at 75% res scale, and maxed "Badass" settings at 4K output with 50% res scale. Of those, maxed @ 50% is the worst looking IMHO. These are all shooting for around 60 FPS; on my GPU they are a bit under that, but on a 5700 XT/2070 Super/1080 Ti or better they should be a more or less solid 60.
Along with those I have 3 additional runs: one is full native 4K @ the "High" preset the game defaulted to for my system, one is full native 4K @ the "Ultra" preset, and the last, for lolz, is a "True Max" run at 4K with Badass settings and 200% res scaling (8K rendering). That last one is a slideshow, and it's worth noting that it would require a GPU with 16GB of VRAM, as it was consistently at around 13-14GB of VRAM usage (I had to use AMD's HBCC memory segment feature).
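For a sense of why that 8K run chews through VRAM, here is a rough, illustrative calculation; the bytes-per-pixel figures below are assumptions for a generic deferred renderer, not measured values from Borderlands 3.

```python
# Rough framebuffer math for the 200% ("8K internal") run.
# Per-pixel byte costs are assumptions, purely illustrative.
w, h = 7680, 4320                    # 4K output at 200% per-axis scale
pixels = w * h                       # ~33.2 MP, 4x the pixels of native 4K

hdr_color = pixels * 8 / 2**20       # one RGBA16F color target, MiB
gbuffer   = pixels * 4 * 4 / 2**20   # assume four 32-bit G-buffer targets
depth     = pixels * 4 / 2**20       # 32-bit depth

print(f"{pixels / 1e6:.1f} MP internal")
print(f"~{hdr_color:.0f} MiB per HDR color target")
print(f"~{hdr_color + gbuffer + depth:.0f} MiB for one assumed target set")
```

One assumed target set alone is close to a gigabyte at 8K, and once you multiply across history buffers and post-processing chains, then add textures on top, the observed 13-14GB becomes plausible.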
But if you have a 5700 XT, or anything within ~10% of its performance, there are a few options to get 4K at good frame rates and good quality.
As stated above in my initial post, I do NOT play modern trash AAA games. I play well-built indie games, or games that are multiple years old and have actually been fixed to a working state. I have zero respect for modern game developers, as they are lazy bums who throw out broken product after broken product and then insult US for giving their games low scores... yes, this is a run-on sentence, but I do not care :D
I was pulling mixtures of Medium and High (with *no* lows) at full native 4K in Borderlands 3 tonight with 52 FPS averages, and I have the vids to upload as proof. That's on a slower Vega 64, meaning a 10-20% faster 5700 XT would pull a pretty solid 60. And that's one of the games you gave. I also don't have to run any lows in the Battlefield games, or in most any other modern title, period.
On the whole, with anything from late 2019 through 2020, I generally run a mixture of medium and high, with games tending either toward more mediums with some highs or more highs with some mediums.
For anything from early 2019 or older, it's full native 4K with high, near-max, or max settings. Mediums generally aren't used at all.
Again, this is with a Vega 64, which is 10-20% slower than a 5700 XT...
Now, as to your question of why anyone would willingly spend the price (~$400 USD) for decent-quality 4K... Well, I can tell you hands down that my 4K quality is easily 2-3x that of the "4K" any console can put out. I'd easily call that worth the 400 bucks I paid...
While I will leave it to the community to decide tomorrow night when I upload the BL3 vids, the bigger question is why anyone would want to jump up to the next step or higher (for 4K, not high refresh). There is little real-world visual difference between settings-optimized 4K and true maxed 4K, but there is a real difference in FPS... If I can pull high-quality, good-looking 4K for 400 bucks, and if 100% maxed 4K @ 30 FPS looks barely different, why should I shell out an extra $800 USD to upgrade for it?... That is what would seem sad to me: paying $800+ for a placebo quality increase...
In that case, man, either card will run wonders for you and be overkill for 4K in most of those types of titles ;) You will be happy either way!
But unless you have a real need for ray tracing in those titles, you might opt for a 5700 XT to save cash over the RTX lineup, for the same general rendering performance ;)
You can add Deus Ex: Mankind Divided and Assassin's Creed Origins. I've tried both of them in 4K on my 5700 XT. Mankind Divided ran at a mix of Low & Medium, while AC Origins needed the Low preset (w/ maxed textures). (I was aiming for 60 FPS in both games.)
Yeah, I changed out the 4K monitor for a 144Hz 1440p curved (27 inches); still got the 5700 XT, should last a while. I'd rather have a smooth-as-butter framerate than resolution, tbh.
Not sure about Deus Ex, but AC Origins is not a very well-optimized game on PC (or so I hear).
I'm still at 1080p with my 5700 XT. Still don't care to move up in resolution. (24in 240Hz and 27in 144Hz (yuck))
And yes, AC:O is a horrid hog on resources, and it is definitely one of the *few* titles that is not playable at decent settings at full native 4K. But then again, that holds true for just about any GPU with that game. It's not even a title with super forward-thinking tech that's just too much right now; it is simply a hog. Reminds me of GTA IV or Saints Row 2 in that regard :/
Basically the Arkham Knight of its generation.
I mean, my Ryzen 2600 is probably not ideal for 144Hz, but I really do not play anything demanding. As long as I can hold above 60 FPS in most games, it will still be smooth. The most graphically intensive game I'll play this year is probably Cyberpunk 2077, and by that point I'll have gotten a 3700X (unless the new Ryzen chips are close).
I am fine lowering settings to keep frames up. Usually, as long as textures, AF, and draw distance are maxed, the other stuff can be shut off for all I care lol.