So, yeah, a 2080 Ti can handle just about everything at 4K; however, I still prefer higher FPS, as I find it makes more of an impact when I'm sitting 8-10 ft away.
See… Here's the thing though… For me, AA is the first thing to come back, thanks to the Nvidia effect…
What's that, I hear you wondering…
Well, a while back NV pushed a fancy new tech, one *many* people use, especially at 4K or when using Ray Tracing…
DLSS…
Why does this matter? Because anyone using DLSS is no longer running at anywhere even *near* full 4K quality. Not. Even. Close.
In side-by-side quality testing, DLSS is *worse* than 78% rez…
So basically, anyone running good quality AA at around 70-80% rez is getting the same or better quality than DLSS. Except this works with almost any new title, where DLSS doesn't…
Think I'm full of it?…
https://www.techspot.com/article/1794-nvidia-rtx-dlss-battlefield/
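To put some numbers on those rez-scale percentages, here is my own back-of-envelope math (not from the article above): a resolution scale applies per axis, so the rendered pixel count drops with the *square* of the scale.

```python
# Pixel-count math behind resolution scaling: a scale like "80%" is applied
# to each axis, so the actual rendered pixel count falls off quadratically.

NATIVE_4K = (3840, 2160)

def scaled_pixels(scale, base=NATIVE_4K):
    """Pixels rendered at a given per-axis resolution scale."""
    w, h = base
    return int(w * scale) * int(h * scale)

for scale in (1.00, 0.80, 0.78, 0.70):
    px = scaled_pixels(scale)
    print(f"{scale:.0%} of 4K -> {px:,} pixels "
          f"({px / scaled_pixels(1.0):.0%} of native)")
```

So "80% rez" is really only about 64% of the native 4K pixel load, which is where the performance headroom comes from.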
And this comes into play in titles where perhaps you *have* to have that extra bit of performance.
Let's look at Far Cry 5.
For me, the in-game benchmark generally scores 5-10 fps lower than my real-world in-game performance. Because of that, I personally ran the game at full native 4K ultra and got a mostly steady 55-60, with some dips into the 50s and occasionally into the 40s (but that was rare).
This is reflected in the following benchmark at all ultra, full native 4K, with no AA present. I score an average of 48 in the bench, which translated to roughly the stated 55-ish average in game (great for me in such a title).
https://steamcommunity.com/sharedfiles/filedetails/?id=2095716613
https://youtu.be/yxmxiEeZUvw
But now… Let's apply that NV effect… Remember, 78% rez was already better than 4K with DLSS… So let's do a bit better and run 80% (round 10% increments and all), toss on some TAA, and leave all other settings at ultra with 4K output…
https://steamcommunity.com/sharedfiles/filedetails/?id=2095716631
https://youtu.be/J341rrIqBE8
Now we have a true solid 60+ FPS gaming experience; again, real world will be a bit better (at least it was for me in this title)…
For a final look I put the two together side by side. At least to my eyes, and to those of my girlfriend, who didn't know what she was being shown (she was just asked which looked better), the TAA @ 80% looks better than native sans TAA…
80% is on the left, full 4K on the right; both sides were taken from the raw 100 Mbps recording.
https://youtu.be/fwLGkUgOV-A
Before the RTX line, PC gamers wouldn't deal with rez scaling; nowadays NV has literally made it a default feature, and a crutch they force the use of a lot of the time with RTX enabled.
If the PCMR community is fine with DLSS, then there is no reasonable argument against the proven superior method of 80% rez + AA. Period.
Anyone who wants to argue that that is not reasonable 4K also has to be willing to accept that any use of DLSS causes even worse reductions in graphical fidelity. Though I will give NV credit that DLSS 2.0 is a bit better.
But as to 4K on a 5700 XT… Well, in my view the Vega can push high-settings 4K. Maybe not max, but definitely *far* from minimums, as people seem to think. Given that the 5700 XT is 10-20% faster, and that I am able to pull over 60 FPS at 80%, the 5700 XT at full native 4K ultra should be right about at 60 FPS (at least for FC5 specifically).
Point is that anything 1080/2070/Vega 64/5700 or higher is more than capable of reasonable high-settings 4K in most games. Not maxed. I never claimed anything like max, but high settings, yes!
As for why people get focused on "max / ultra," I think it's because people who spend big $$$ on a gpu expect to be able to turn the settings up.
^^Exactly this^^
Like I said to the GF when I was putting together the post: to get 4K you have to compromise; it doesn't matter if you have a 2080 Ti or anything less. The bigger question then becomes whether you would rather compromise down to 95% on a $1200 card or down to 85% on a $400 card...
I think the $400-500 cards are more than capable of reasonable high-quality 4K.
I'm already at max or near max in most games I play.
I already get over 60FPS in most titles I play on a 60hz screen.
And 1440p is a lower rez than even 70% or 80% scaled, AA-filtered output at 4K.
The above FC5 example is a perfect one. Why would I ever opt for the higher-FPS 1440p option when I play on a 4K/60 screen and 2160p @ 80% ultra is still far better than 1440p ultra?
I have yet to see a game that has required me to drop all the way down to 1440p to get a solid 60 without pushing settings too low. And I have no interest in gaming beyond 60 FPS, as nothing I have goes beyond 60 stock (aside from an old CRT), and though I could OC my 1440p Samsung desktop panel to about 70 Hz, I don't see the point for 10 extra frames. So my 4K/60 screen is my screen of choice :)
But you should be getting at least +10% over me at any setting/rez combo...
https://gpu.userbenchmark.com/Compare/AMD-RX-Vega-64-vs-AMD-RX-5700-XT/3933vs4045
Though it would seem there are at least a few exceptions where Vega can still take the lead. The two cards are not exactly equal; the XT is 3D-heavy while Vega leans compute.
Either way, even if all you did was match, that is still plenty for what I consider reasonable 4K for said game. You can pick full native ultra at 45-55 fps (which I think is fine for a PvE co-op or SP experience), or if you have to have that 60, opt for TAA + 80% rez for nearly the same visuals plus 15-20 fps...
Still though, I would look into your 5700 XT a bit more; see if it needs its power levels tweaked, or perhaps a better fan curve. I don't run my Vega stock (no one who owns one should). A good undervolt and custom fan curve can literally boost my FPS by 15% or more...
At all stock, my core under heavy 3D load runs about 1450-1500 MHz (mainly due to voltage and heat). Once undervolted and tuned, with a good fan profile, I will consistently hold boost speed or better, about 1600-1650 MHz. I also run my HBM slightly up from stock; again, a normal thing for any Vega owner.
Though honestly, I would be doing similar tweaking to get the last bit of juice out of an NV card under a 4K load too, so it's nothing special; Vegas just need it more than others to reach their true potential. Perhaps the XT is similar?
What it feels like to me is that the 5700 XT is bandwidth-limited, even though technically the Vega 64's bandwidth advantage is not TOO large (484 GB/s vs 448 GB/s). And FC5 also uses this thing called Rapid Packed Math to improve performance on Radeon. It's an FP16 thing IIRC, and the Vega 64, being more compute-focused, has roughly 25-30% more FP16 throughput than the RX 5700 XT.
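A rough back-of-envelope on that FP16 gap, using the reference shader counts and *assumed* reference boost clocks (sustained clocks differ card to card, so treat the ratio as approximate):

```python
# Rapid Packed Math packs 2 FP16 ops into each FP32 lane, so peak FP16
# throughput is shaders x clock x 2 (FMA) x 2 (packed). Clocks below are
# assumed reference boost values, not measured sustained speeds.

def fp16_tflops(shaders, boost_mhz):
    return shaders * boost_mhz * 1e6 * 2 * 2 / 1e12

vega64 = fp16_tflops(4096, 1546)   # Radeon RX Vega 64
xt5700 = fp16_tflops(2560, 1905)   # Radeon RX 5700 XT

print(f"Vega 64 : {vega64:.1f} TFLOPS FP16")
print(f"5700 XT : {xt5700:.1f} TFLOPS FP16")
print(f"Vega advantage: {vega64 / xt5700 - 1:.0%}")
```

With those assumptions the Vega lands around 25 vs 19.5 TFLOPS FP16, i.e. a ~30% edge on paper; use the sustained clocks you actually hold and the ratio shifts a bit either way.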
But my Vega is a blower-style stock Sapphire, and in Jan 2021 it will be at the end of its 2-year warranty, at which point I plan to open it up, put better TIM on it, and mount either a Kraken AIO or a Morpheus II if I can still get one. Haven't decided between the two.
My 2080 Ti runs 95% of my games at 4K@60fps... 4K at reasonable settings does not far exceed consoles... the Xbox One X runs native 4K with "reasonable" settings and will give you 30-60 fps depending on what is going on on screen. Don't buy into the PCMR crap, because it's bull.
I've taken my rig over to my dad's place to show him what real 4K on PC looks like compared to 4K on consoles. He uses a PS4 Pro for 4K HDR. I use my Vega for the same.
His PS4 Pro, for example, has to run The Outer Worlds at what looks comparable to 50% rez scale at all lows on PC, with obvious frame dips. My Vega can pull either a mix of mediums and highs at full 4K with a solid 50-60 fps, or mostly highs with a couple of ultras at 80% with the same 50-60. Either one looks *miles* ahead of the 4K console.
Yes, the Xbone will look marginally better than the PS4, but not enough to bridge the gap to PC-grade 4K at any level.
But current gen just isn't up to real 4K.