Is the RX 5700 XT good for 4K gaming?
I'm currently building a PC for my cousin, and he needs to be set for 1440p gaming for the next 5 years until a GPU upgrade is necessary again. We won't really be getting into 4K gaming, but just out of curiosity, how well should it perform in titles at 4K ultra? Obviously I could look at a YT video and get a general answer, but opinions on here would be cool too.
Showing 16-30 of 44 comments
I play at 4K max (minus AA, as it isn't needed or noticeable at 4K) about half the time on my couch rig with a 9900K and a 2080 Ti. Basically, if it's slower paced I'll go 4K; if it's faster paced and I want more than 60fps, I'll drop to 1080p (the TV only does 120Hz at 1080p, not 1440p, sadly). And as I'm sitting so far back, it's not THAT noticeable while playing, even at 65 inches, whereas 1080p on my 27-inch 1440p ROG Swift is very noticeable.

So, yeah, a 2080ti can handle just about everything at 4k, however I still prefer higher fps as I find it makes more of an impact when I'm sitting 8-10ft away.
Originally posted by Spec_Ops_Ape:
Originally posted by xSOSxHawkens:
I'm noticing a trend here.

Anyone that *has* to have everything maxed out or nothing doesn't like 4K.

Those that *do* like 4K tend to compromise on settings.

Fun fact: it's always been that way, and probably will be for some time yet. Even *if* the next-gen cards can push current titles at actually maxed settings at 4K/60, it will be less than a year before new games make them unable to do so anymore...

And by the standards any of you seem to apply, at that point it's no longer a 4K-capable card...

Here's a pro tip:

I don't know a *single* person who actually uses 4K for their daily gaming, regardless of setup, that runs games 100% maxed. It's both unneeded and a major waste of performance at such a rez in many games. It's *always* been this way at 4K, even when the R9 290X was the 4K king in 2013. The same rules apply today, even for a 2080 Ti. You compromise.
Gamed at 4k for 2 years and I loved it.
Anti-aliasing was always the first to go at 4K, then shadows. When I started having to drop textures from high to medium was when I finally decided to move to 1440p. Modern titles like Far Cry 5 / New Dawn, The Division 2, Shadow of the Tomb Raider I had to drop to 1440p for the better gaming experience.

See… Here's the thing though… For me, AA is the first thing to come back, thanks to the Nvidia effect…

What's that, I hear you wondering…

Well, a while back NV pushed a fancy new tech, one *many* people use, especially at 4K or when using ray tracing…

DLSS…

Why does this matter? Because anyone using DLSS is no longer running at anywhere even *near* full 4K quality. Not. Even. Close.

In side-by-side quality testing, DLSS is *worse* than 78% rez…

So basically, anyone running good-quality AA at around 70-80% rez is now getting the same or better quality than DLSS. Except this works in almost any new title, where DLSS doesn't…

Think I'm full of it?…

Originally posted by TechSpot:
It gets worse though, and we’ll switch to a different scene for this one. Here is a comparison between DLSS and our 78 percent resolution scale, roughly 1685p, which we found to perform exactly the same as DLSS. It’s a complete non-contest. The 1685p image destroys the DLSS image in terms of sharpness, texture detail, clarity, basically everything.

https://www.techspot.com/article/1794-nvidia-rtx-dlss-battlefield/
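For what it's worth, the 78% figure works out like this (my own back-of-envelope check in Python, not from the article itself):

```python
# Resolution-scale arithmetic: a linear scale applies to both axes,
# so pixel count drops with the *square* of the scale factor.

def scaled_height(native_height, scale):
    """Vertical resolution after a linear render-scale factor."""
    return round(native_height * scale)

def pixel_count(width, height):
    return width * height

# 78% linear scale of 2160p lands at roughly the 1685p TechSpot mentions.
h = scaled_height(2160, 0.78)                     # 1685
native = pixel_count(3840, 2160)                  # 8,294,400 px
scaled = pixel_count(round(3840 * 0.78), h)       # ~5.05 million px
print(h, scaled / native)                         # 1685, ~0.61 of native
```

So "78% rez" is really only about 61% of the native pixel load, which is where most of the performance comes back from.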

And this comes into play in titles where perhaps you *have* to have that extra bit of performance.

Let's look at Far Cry 5.

For me, the in-game bench generally runs a bit lower than my real-world performance, by an average of 5-10fps (higher in game than in the bench). Because of this, I personally ran the game at 4K full native ultra and got a mostly steady 55-60, with some dips into the 50s and occasionally into the 40s (but that was rare).

This is reflected in the following benchmark, at all ultra, full native 4K, with no AA present. I score an average of 48 in the bench, which translated to about the stated 55-ish average in game (great for me in such a title).

https://steamcommunity.com/sharedfiles/filedetails/?id=2095716613

https://youtu.be/yxmxiEeZUvw

But now… let's apply that NV effect… Remember, 78% rez was already better than 4K with DLSS… So let's do a bit better and run 80% (round 10% increments and all), toss on some TAA, and leave all other settings at ultra with 4K output…

https://steamcommunity.com/sharedfiles/filedetails/?id=2095716631

https://youtu.be/J341rrIqBE8

Now we have a truly solid 60FPS+ gaming experience. Again, real-world will be a bit better (at least it was for me in this title)…

For a final look, I put the two together side by side. At least to my eyes, and to those of my girlfriend, who didn't know what she was being shown (she was just asked which looked better), the TAA @ 80% looks better than native sans TAA…

80% is on the left, full 4K on the right; both sides were taken from the raw 100Mbps recording.

https://youtu.be/fwLGkUgOV-A


Before the RTX line, PC gamers wouldn't touch rez scaling; nowadays NV has literally made it a default feature, and a crutch whose use they force a lot of the time with RTX enabled.

If the PCMR community is fine with DLSS, then there is no reasonable argument against the proven superior method of 80% rez + AA. Period.

Anyone who wants to argue that this is not reasonable 4K also has to accept that any use of DLSS causes even worse reductions in graphical fidelity. Though I will give NV credit that DLSS 2.0 is a bit better.

But as to 4K on a 5700 XT… Well, in my view the Vega can push high-settings 4K. Maybe not max, but def *far* from minimums as people seem to think. Since the 5700 XT is 10-20% faster, and I'm able to pull over 60 FPS at 80%, the 5700 XT at full native 4K ultra should be right about at 60 FPS (at least for FC5 specifically).

Point is that anything 1080/2070/Vega 64/5700 or higher is more than capable of reasonable high-settings 4K in most games. Not maxed, and I never claimed such a thing as max, but high settings, yes!
Last edited by xSOSxHawkens; May 13, 2020, 22:12
Originally posted by xSOSxHawkens:
But as to 4K on a 5700 XT… Well, in my view the Vega can push high-settings 4K. Maybe not max, but def *far* from minimums as people seem to think. Since the 5700 XT is 10-20% faster, and I'm able to pull over 60 FPS at 80%, the 5700 XT at full native 4K ultra should be right about at 60 FPS (at least for FC5 specifically).

Point is that anything 1080/2070/Vega 64/5700 or higher is more than capable of reasonable high-settings 4K in most games. Not maxed, and I never claimed such a thing as max, but high settings, yes!
I'll re-download Far Cry 5 and we'll see how it runs at 4K. Check back in a day or two for that (I have ♥♥♥♥♥♥ internet, but I've been thinking of replaying it anyway before Cyberpunk 2077 hits).

As for why people get focused on "max / ultra," I think it's because people who spend big $$$ on a gpu expect to be able to turn the settings up.
Originally posted by AbedsBrother:

As for why people get focused on "max / ultra," I think it's because people who spend big $$$ on a gpu expect to be able to turn the settings up.

^^Exactly this^^

Like I said to the GF when I was putting together the post: to get 4K you have to compromise, doesn't matter if you have a 2080 Ti or anything less. The bigger question then becomes whether you'd rather compromise down to 95% on a $1200 card or down to 85% on a $400 card...

I think the $400-500 cards are more than capable of reasonable high-quality 4K.
Last edited by xSOSxHawkens; May 13, 2020, 22:39
or drop to 1440p for higher settings or higher fps (if the cpu can do it)
Originally posted by _I_:
or drop to 1440p for higher settings or higher fps (if the cpu can do it)
Why would I do that when:

I'm already at max or near max in most games I play.
I already get over 60 FPS in most titles I play, on a 60Hz screen.
And 1440p is a lower rez, even lower than 70 or 80% scaled, AA-filtered output at 4K.


The FC5 example above is a perfect one. Why would I ever opt for the higher-FPS 1440p option when I play on a 4K/60 screen, and 2160p @ 80% ultra is still far better than 1440p ultra?

I have yet to see a game that has required me to drop all the way down to 1440p to get a solid 60 without pushing settings too low. And I have no interest in gaming beyond 60FPS, as nothing I have goes beyond 60 stock (aside from an old CRT), and though I could OC my 1440p Samsung desktop panel to about 70Hz, I don't see the point for 10 extra frames. So my 4K/60 screen is my screen of choice :)
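The pixel math backs that last point up. A quick sketch (my own numbers, assuming a simple linear render scale):

```python
# 4K at 80% render scale vs. native 1440p: which pushes more pixels?

def pixels(width, height, scale=1.0):
    """Pixel count after applying a linear render-scale factor."""
    return int(width * scale) * int(height * scale)

p_4k80 = pixels(3840, 2160, 0.80)   # 3072 x 1728 = 5,308,416 px
p_1440 = pixels(2560, 1440)         # native 1440p = 3,686,400 px
print(p_4k80 / p_1440)              # ~1.44x more pixels at 4K/80%
```

So even scaled down to 80%, the 4K output is still rendering roughly 44% more pixels than native 1440p, which is why it resolves more detail.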
Originally posted by xSOSxHawkens:

But as to 4K on a 5700 XT… Well, in my view the Vega can push high-settings 4K. Maybe not max, but def *far* from minimums as people seem to think. Since the 5700 XT is 10-20% faster, and I'm able to pull over 60 FPS at 80%, the 5700 XT at full native 4K ultra should be right about at 60 FPS (at least for FC5 specifically).

Point is that anything 1080/2070/Vega 64/5700 or higher is more than capable of reasonable high-settings 4K in most games. Not maxed, and I never claimed such a thing as max, but high settings, yes!
Just ran the tests, Hawkens (videos to follow). I tested Far Cry 5's benchmark in 4k Low, 4k normal, 4k Ultra, and 4k Ultra @80% resolution scale. My 5700XT performs almost identically to your Vega 64 in Far Cry 5 - it's about 2fps faster on average. I might get a bit more performance if I overclock, or had a faster cpu (though FC5 is decently multi-threaded), but not 10-20% more.
Originally posted by AbedsBrother:
Originally posted by xSOSxHawkens:

But as to 4K on a 5700 XT… Well, in my view the Vega can push high-settings 4K. Maybe not max, but def *far* from minimums as people seem to think. Since the 5700 XT is 10-20% faster, and I'm able to pull over 60 FPS at 80%, the 5700 XT at full native 4K ultra should be right about at 60 FPS (at least for FC5 specifically).

Point is that anything 1080/2070/Vega 64/5700 or higher is more than capable of reasonable high-settings 4K in most games. Not maxed, and I never claimed such a thing as max, but high settings, yes!
Just ran the tests, Hawkens (videos to follow). I tested Far Cry 5's benchmark in 4k Low, 4k normal, 4k Ultra, and 4k Ultra @80% resolution scale. My 5700XT performs almost identically to your Vega 64 in Far Cry 5 - it's about 2fps faster on average. I might get a bit more performance if I overclock, or had a faster cpu (though FC5 is decently multi-threaded), but not 10-20% more.
Meh, it might be a combination of me having a good Vega (it consistently scores in the 99th percentile for air-cooled Vega 64s) and you having a card running bare-bones stock?...

But you should be getting at least +10% over me at any setting/rez combo...

https://gpu.userbenchmark.com/Compare/AMD-RX-Vega-64-vs-AMD-RX-5700-XT/3933vs4045

Though, it would seem there are at least a few exceptions where Vega can still take the lead. The two cards are not exactly equal; the XT is 3D-heavy while Vega is compute-heavy.

Either way, even if all you did was match it, that's still plenty for what I consider reasonable 4K in said game. You can pick full native ultra at 45-55fps (which I think is fine for a PvE co-op or SP experience), or if you have to have that 60, opt for TAA + 80% rez for nearly the same visuals plus 15-20 fps...

Still though, I would look into your 5700 XT a bit more, see if it needs its power levels tweaked, or perhaps a better fan curve. I don't run my Vega stock (no one who owns one should). A good undervolt and custom fan curve can literally boost my FPS by 15% or more...

At all stock, my core under heavy 3D load runs about 1450-1500 MHz (mainly due to voltage and heat). Once undervolted and tuned, with a good fan profile, I will consistently hold boost speed or better, about 1600-1650 MHz. I also run my HBM up slightly from stock; again, a normal thing for any Vega owner.

Though honestly, I would be doing similar tweaking to get the last bit of juice out of an NV card under a 4K load too, so it's nothing special; Vegas just need it more than others to reach their true potential. Perhaps the XT is similar?
Originally posted by xSOSxHawkens:
Still though, I would look into your 5700 XT a bit more, see if it needs its power levels tweaked, or perhaps a better fan curve. I don't run my Vega stock (no one who owns one should). A good undervolt and custom fan curve can literally boost my FPS by 15% or more...

At all stock, my core under heavy 3D load runs about 1450-1500 MHz (mainly due to voltage and heat). Once undervolted and tuned, with a good fan profile, I will consistently hold boost speed or better, about 1600-1650 MHz. I also run my HBM up slightly from stock; again, a normal thing for any Vega owner.

Though honestly, I would be doing similar tweaking to get the last bit of juice out of an NV card under a 4K load too, so it's nothing special; Vegas just need it more than others to reach their true potential. Perhaps the XT is similar?
Oh, it's running tweaked & under-volted. In-game it's running at ~1835Mhz (boost frequency is limited to 1905, which is the stock rating from Powercolor). I have a detailed overlay that will show temps, frequencies & voltages. It's not over-heating. I even did some custom tweaks that have helped with other Ubisoft games - clearing the Standby list, disabling Freesync and limiting tessellation to 16x (though FC5 isn't big on tessellation per se). This smoothed out the benchmark a bit, but didn't really boost the frame-rate.

What it feels like to me is that the 5700XT is bandwidth-limited, even though technically the Vega 64's bandwidth advantage is not TOO large (448 GB/s vs 484 GB/s). And FC5 also uses this thing called Rapid Packed Math to improve performance on Radeon. It's an FP16 thing iirc, and the Vega 64, being more compute-focused, has about +25% more FP16 performance than the RX 5700XT.
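That FP16 gap can be roughly sanity-checked from paper specs (my own estimate using reference boost clocks, which real cards won't hold exactly, so treat it as a ballpark):

```python
# Peak packed-FP16 throughput estimate from shader count and boost clock.
# Rapid Packed Math doubles FP16 rate over FP32 on both architectures.

def fp16_tflops(shaders, boost_ghz):
    # 2 FLOPs per FMA, x2 again for packed FP16
    return shaders * boost_ghz * 2 * 2 / 1000

vega64 = fp16_tflops(4096, 1.546)   # ~25.3 TFLOPS FP16
xt5700 = fp16_tflops(2560, 1.905)   # ~19.5 TFLOPS FP16
print(vega64 / xt5700)              # Vega 64 ahead by roughly 25-30% on paper
```

Which lines up with the ~25% FP16 advantage mentioned above, before accounting for clock and utilization differences.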
Originally posted by AbedsBrother:
Originally posted by xSOSxHawkens:
Still though, I would look into your 5700 XT a bit more, see if it needs its power levels tweaked, or perhaps a better fan curve. I don't run my Vega stock (no one who owns one should). A good undervolt and custom fan curve can literally boost my FPS by 15% or more...

At all stock, my core under heavy 3D load runs about 1450-1500 MHz (mainly due to voltage and heat). Once undervolted and tuned, with a good fan profile, I will consistently hold boost speed or better, about 1600-1650 MHz. I also run my HBM up slightly from stock; again, a normal thing for any Vega owner.

Though honestly, I would be doing similar tweaking to get the last bit of juice out of an NV card under a 4K load too, so it's nothing special; Vegas just need it more than others to reach their true potential. Perhaps the XT is similar?
Oh, it's running tweaked & under-volted. In-game it's running at ~1835Mhz (boost frequency is limited to 1905, which is the stock rating from Powercolor). I have a detailed overlay that will show temps, frequencies & voltages. It's not over-heating. I even did some custom tweaks that have helped with other Ubisoft games - clearing the Standby list, disabling Freesync and limiting tessellation to 16x (though FC5 isn't big on tessellation per se). This smoothed out the benchmark a bit, but didn't really boost the frame-rate.

What it feels like to me is that the 5700XT is bandwidth-limited, even though technically the Vega 64's bandwidth advantage is not TOO large (448 GB/s vs 484 GB/s). And FC5 also uses this thing called Rapid Packed Math to improve performance on Radeon. It's an FP16 thing iirc, and the Vega 64, being more compute-focused, has about +25% more FP16 performance than the RX 5700XT.
Ahh, yeah, if FC5 uses compute effectively and needs bandwidth, I can see the difference being smaller than would otherwise be expected. I do beat the XT in both of those, especially with the HBM overclock; I'm actually sitting at 499.2 GB/s and can push it over 500 by upping the OC from 975 MHz to 1015, but I see little benefit in most games, and the HBM is the biggest heat producer, so keeping it cooler gives the core noticeably more headroom. I've found 975 to 985 to be where I start to see diminishing returns from memory boosting and start to lose core speed due to thermals.
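That 499.2 GB/s figure is just the HBM2 bus math (a quick sketch, assuming Vega 64's stock 2048-bit bus):

```python
# HBM2 bandwidth: memory clock x 2 (double data rate) x bus width in bytes.

def hbm2_bandwidth_gbs(mem_clock_mhz, bus_width_bits=2048):
    # MHz x 2 transfers/clock x bytes/transfer, then MB/s -> GB/s
    return mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000

print(hbm2_bandwidth_gbs(945))   # ~483.8 GB/s, near the stock 484
print(hbm2_bandwidth_gbs(975))   # 499.2 GB/s, matching the OC figure above
```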

But my Vega is a blower-style stock Sapphire, and in Jan 2021 it will be at the end of its 2-year warranty, at which point I plan to open it up, put better TIM on it, and fit either a Kraken AiO or a Morpheus II if I can still get one. Haven't decided between the two.
Originally posted by xSOSxHawkens:
I guess I am the odd one out here...

Yes, the 5700 XT is fine for 4K gaming. Not 4K ultra; no GPU is, not even the 2080 Ti. But for 4K at reasonable settings that far exceed consoles *and* are true native 4K, yes, it's a fine card and will work great. Figure mostly high settings, except on the newest games, in which case plan to run a mix of mediums and highs.

Speaking from experience as someone that runs AMD 4K daily. I use the Vega 64, which is roughly comparable to, but slightly behind, the 5700 XT.

I play 4K/60 in GTA V, MCC (Reach, CE, 2), Borderlands, BFV, BF1, Apex, The Forest, Killing Floor 2, Far Cry 5, Monster Hunter World, The Outer Worlds, Alien Isolation and a few others. All within the past 2 months. I've been using it for 4K daily since Jan 2019.

I would say that any comparable card, which the 5700 XT is, should offer comparable performance.

Again though, at 4K, be reasonable with settings: look into what *needs* to be maxed to look good, and what can run a touch or two lower to gain back the FPS. At that rez you might find not every single thing needs to be 100% maxed out. 80% maxed settings at 4K rez is often far better than 100% ultra at 1080p.

Most argue that 1440p ultra is the best compromise, but having both 4K and 1440p screens, I can easily see the difference between the two now that I'm used to the detail 4K gives. If you play games where seeing at a distance is a good thing, you will love 4K.

My 2080 Ti runs 95% of my games at 4K@60fps... 4K at reasonable settings does not far exceed consoles... the Xbox One X runs native 4K with "reasonable" settings and will give you 30-60 fps depending on what's going on on screen. Don't buy into the PCMR crap, because it's bull.
I ran 4K from the day I got my PC 3 years ago, when I had a GTX 1080 SC that ran 4K at 30-60fps depending on the game. Xbox/Windows games I could run at 4K@60 with max settings; games like Tomb Raider, Ghost Recon, Hitman and other such titles I had to lower settings a bit, and they would get at least 30fps, usually between the 30s and low 40s. To me 4K looks insanely better than 1080p or 1440p. I can't even play my PS4 games because 1080p looked like crap. I had to buy a PS4 Pro, which runs 4K, but checkerboard 4K, so it's not true native 4K like the XB1X. But next-gen consoles will run native 4K@60fps... For the price of my new GPU I could literally buy both the PS5 and the next Xbox when they release and have money left over. Console gaming is a much better option for gamers on a budget that want the most out of their gaming system.
Last edited by ThatzYoAzz; May 14, 2020, 21:32
Originally posted by ThatzYoAzz:
Originally posted by xSOSxHawkens:
I guess I am the odd one out here...

Yes, the 5700 XT is fine for 4K gaming. Not 4K ultra; no GPU is, not even the 2080 Ti. But for 4K at reasonable settings that far exceed consoles *and* are true native 4K, yes, it's a fine card and will work great. Figure mostly high settings, except on the newest games, in which case plan to run a mix of mediums and highs.

Speaking from experience as someone that runs AMD 4K daily. I use the Vega 64, which is roughly comparable to, but slightly behind, the 5700 XT.

I play 4K/60 in GTA V, MCC (Reach, CE, 2), Borderlands, BFV, BF1, Apex, The Forest, Killing Floor 2, Far Cry 5, Monster Hunter World, The Outer Worlds, Alien Isolation and a few others. All within the past 2 months. I've been using it for 4K daily since Jan 2019.

I would say that any comparable card, which the 5700 XT is, should offer comparable performance.

Again though, at 4K, be reasonable with settings: look into what *needs* to be maxed to look good, and what can run a touch or two lower to gain back the FPS. At that rez you might find not every single thing needs to be 100% maxed out. 80% maxed settings at 4K rez is often far better than 100% ultra at 1080p.

Most argue that 1440p ultra is the best compromise, but having both 4K and 1440p screens, I can easily see the difference between the two now that I'm used to the detail 4K gives. If you play games where seeing at a distance is a good thing, you will love 4K.

My 2080 Ti runs 95% of my games at 4K@60fps... 4K at reasonable settings does not far exceed consoles... the Xbox One X runs native 4K with "reasonable" settings and will give you 30-60 fps depending on what's going on on screen. Don't buy into the PCMR crap, because it's bull.
Consoles don't even come close, man. Not even to compromised 4K settings on PC. Also, if you use DLSS on that 2080 Ti, then any game you're using it on isn't running 4K either; it's running comparable to 4K @ 80% with TAA, as shown in side-by-side image-quality testing from stills.

I've taken my rig over to my dad's place to show him what real 4K on PC looks like compared to 4K on consoles. He uses a PS4 Pro for 4K HDR. I use my Vega for the same.

His PS4 Pro, for example, has to run The Outer Worlds at what looks comparable to 50% rez scale at all lows on PC, with obvious frame dips. My Vega can pull either a mix of mediums and highs at full 4K with a solid 50-60fps, or mostly highs with a couple of ultras at 80% with the same 50-60. Either one looks *miles* ahead of the 4K console.

Yes, the Xbone will look marginally better than the PS4, but not enough to bridge the gap to PC-grade 4K at any level.
Originally posted by ThatzYoAzz:
I ran 4K from the day I got my PC 3 years ago and I had a GTX 1080 SC that ran 4K at 30-60fps depending on the game... to me 4K looks insanely better than 1080p or 1440p. I can't even play my PS4 games because 1080p looked like crap. I had to buy a PS4 Pro, which runs 4K, but checkerboard 4K, so it's not true native 4K like the XB1X. But next-gen consoles will run native 4K@60fps... For the price of my new GPU I could literally buy both the PS5 and the next Xbox when they release and have money left over. Console gaming is a much better option for gamers on a budget that want the most out of their gaming system.
Yeah, next gen looks like it will be a bit of a game changer. The UE5 demo on PS5 was amazing.

But current gen just isn't up to real 4K.
Last edited by xSOSxHawkens; May 14, 2020, 21:33
The PS4 Pro basically looks like 1440p, not native 4K... but the Xbox One X is native 4K, and they don't use the lowest settings either... they use medium to high settings... consoles more than come close, despite what you may think. The price you pay for a high-end PC to achieve marginally better graphics with slightly better settings isn't even a comparison; consoles win by a long shot... not to mention console games look better and better over the life of their console generation... not so much for those PCs you put so much stock in.
Last edited by ThatzYoAzz; May 14, 2020, 21:36

Posted: May 12, 2020, 20:39
Comments: 44