This topic has been closed
Skkooomer Lord 4 Jul 2024 @ 1:46am
Why we can't trust Nvidia users anymore. DLSS.
"4070S is a 4K card"


Lists all games using DLSS...

So none of them are 4K.

In the comments, "no but I consider it 4K because of final image."


Cognitive dissonance and software reliance for performance, not GPU horsepower.

The new delulu.

https://www.reddit.com/r/nvidia/comments/1duz20n/blown_away_by_how_capable_the_4070s_is_even_at_4k/
Showing 16-30 of 107 comments
Skkooomer Lord 4 Jul 2024 @ 6:48am
Originally posted by C1REX:
Originally posted by CRBLunaFly:
There is?

Most people don't even need DLSS,
What?
I wish my 7900xtx had DLSS.
Then why did you buy one?

You can grab a RTX 4060 and DLSS performance yourself to half the performance of an XTX with fake resolution.


Also....

https://forums.overclockers.co.uk/threads/what-do-gamers-actually-think-about-ray-tracing.18984266/
C1REX 4 Jul 2024 @ 6:56am
Originally posted by CRBLunaFly:
Originally posted by C1REX:
What?
I wish my 7900xtx had DLSS.
Then why did you buy one?

You can grab a RTX 4060 and DLSS performance yourself to half the performance of an XTX with fake resolution.


Also....

https://forums.overclockers.co.uk/threads/what-do-gamers-actually-think-about-ray-tracing.18984266/

I don’t like RT but I’m a big fan of playing on a big 4K screen.

Even 4090 benefits from DLSS while 7900xtx loses here big time.

What are you going to do with AMD gpu when it can’t handle native 4K (or any other resolutions)?
Skkooomer Lord 4 Jul 2024 @ 6:58am
Originally posted by C1REX:
Originally posted by CRBLunaFly:
Then why did you buy one?

You can grab a RTX 4060 and DLSS performance yourself to half the performance of an XTX with fake resolution.


Also....

https://forums.overclockers.co.uk/threads/what-do-gamers-actually-think-about-ray-tracing.18984266/

I don’t like RT but I’m a big fan of playing on a big 4K screen.

Even 4090 benefits from DLSS while 7900xtx loses here big time.

What are you going to do with AMD gpu when it can’t handle native 4K (or any other resolutions)?
Not play on 4K because I don't play on 4K.

1440P sweet smooth runnings and longevity.
C1REX 4 Jul 2024 @ 7:02am
Originally posted by CRBLunaFly:
Originally posted by C1REX:

I don’t like RT but I’m a big fan of playing on a big 4K screen.

Even 4090 benefits from DLSS while 7900xtx loses here big time.

What are you going to do with AMD gpu when it can’t handle native 4K (or any other resolutions)?
Not play on 4K because I don't play on 4K.

1440P sweet smooth runnings and longevity.
With DLSS I could play at 4k with smooth 1440p performance.

After trying 4K there is no way I will go back to 1440p for single player AAA games.
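[Editor's note: "play at 4K with smooth 1440p performance" lines up with how DLSS Quality mode works, since it renders internally at two-thirds of the output resolution per axis, which for a 4K output is exactly 1440p. A minimal sketch of the commonly published per-axis scale factors (the Balanced ratio is approximate):]

```python
# Internal render resolution per DLSS preset; the per-axis scale
# factors below are the commonly published DLSS 2/3 ratios.
DLSS_SCALE = {
    "Quality": 2 / 3,            # 4K output -> 2560x1440 internal
    "Balanced": 0.58,            # approximate published ratio
    "Performance": 0.5,          # 4K output -> 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 4K output -> 1280x720 internal
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU actually renders before DLSS upscales it."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: renders {w}x{h}, outputs 3840x2160")
```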
Skkooomer Lord 4 Jul 2024 @ 7:08am
Originally posted by C1REX:
Originally posted by CRBLunaFly:
Not play on 4K because I don't play on 4K.

1440P sweet smooth runnings and longevity.
With DLSS I could play at 4k with smooth 1440p performance.

After trying 4K there is no way I will go back to 1440p for single player AAA games.
Well you ruined it for yourself.

FOMO.

I do own a 4K monitor; it's never used.

It's a cheap Samsung TN model, 27", the same size as my two 1440p high-refresh displays; it's sharper, but by a minuscule amount.



I can't see a 7900 XT, which can rasterize nearly as fast as the RTX 4080 / XTX, being really pushed any time soon at 1440p with native visuals.


Hogwarts Legacy, for example, in a CPU-bound scenario like Hogsmeade will easily run 105-135 FPS around that town with a 5800X3D at 1440p; elsewhere it ends up being GPU limited.

HWL is considered demanding not for having the best visuals; it's not a well-optimized game, but it's doable...


The fact that a 7900 XT can compete with an RTX 4090 in that title, because gamers have no control over what devs do with a game, sadly shows how easily performance can be taken away from even the best GPUs on the market.

If you enable RT, you NEED DLSS and FG to gain frames on a 4000-series card; the CPU hit is ridiculous.

Consumers are pawns that assume their interests are at the forefront...
Last edited by Skkooomer Lord; 4 Jul 2024 @ 7:10am
A&A 4 Jul 2024 @ 7:26am
So much hate for what? That companies use marketing that only aims to make the end user buy the given product?

You gave an example with Nvidia, but you haven't discovered hot water here; it is not just Nvidia.
󠀡󠀡 4 Jul 2024 @ 7:32am
It's a choice at the end of the day.
C1REX 4 Jul 2024 @ 7:33am
Originally posted by CRBLunaFly:
Originally posted by C1REX:
With DLSS I could play at 4k with smooth 1440p performance.

After trying 4K there is no way I will go back to 1440p for single player AAA games.
Well you ruined it for yourself.

FOMO.
I didn’t switch to PC gaming to play Elden Ring at a lower resolution than on consoles.
Luckily my PC can handle 4K60 no problem and I can add frame gen from Lossless Scaling app for buttery smooth 4K120.

One good argument for Radeon is lower CPU overhead. That can help in CPU-limited scenarios, which includes CPU-related shader compilation and traversal stutters.

However, if AMD doesn't get their poo together and fix their embarrassing FSR, then I may switch back to Nvidia.

The sentence from your first post is a bit of a stretch, but thanks to DLSS the 4070S can potentially give a better 4K experience than the 7900XT.
Last edited by C1REX; 4 Jul 2024 @ 7:55am
Skkooomer Lord 4 Jul 2024 @ 7:36am
Originally posted by C1REX:
Originally posted by CRBLunaFly:
Well you ruined it for yourself.

FOMO.
I didn’t switch to PC gaming to play Elden Ring at a lower resolution than on consoles.
Luckily my PC can handle 4K60 no problem and I can add frame gen from Lossless Scaling app for buttery smooth 4K120.

One good argument for Radeon is lower CPU overhead. That can help with CPU-limited scenarios, which includes CPU-related shader compilation and traversal stutters.

However, if AMD doesn't get their poo together and fix their embarrassing FSR, then I may switch back to Nvidia.

The sentence from your first post is a bit of a stretch, but thanks to DLSS the 4070S can potentially give a better 4K experience than the 7900XT.
No, but you will beg for tech that consoles have used since the Xbox 360 / PS3 era: upscaling and pretend resolutions.
skOsH♥ 4 Jul 2024 @ 7:41am
Originally posted by C1REX:
Originally posted by CRBLunaFly:
That's why we don't use it...

Also why nobody uses RT, same thing.
Because AMD has the worst one on the market.

Intel has a better one.
NVidia has a better one.
UE5 has a better one.
Sony had an amazing checkerboard reconstruction one for the PS4 Pro and is planning to make something new for the PS5.
Even the Lossless Scaling app often offers better results than FSR 3, even though the app doesn't have game integration.

FSR is so bad that in games like Forspoken, where you have an option for native-resolution FSR AA, it literally makes the image much worse, with flickering at native resolution. Not even upscaling; it just makes the image trash.

It’s insane how bad it is.

Game companies need to make their games optimized. Consoles have far inferior components.

The 7900XTX has a super high rasterization score, meaning it can generate frames faster. If a game is using UE5, it doesn't need to do ray tracing... but it can.

I don't need to use any FSR. I run native 2k. It's glorious. Doing the math, I would still be able to run RDR2 at over 100fps if I ran it in 4k, which wouldn't make sense due to my 2k native screen. RDR2 is a highly optimized game and it looks phenomenal. Oh, and I have RT enabled in that game.
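[Editor's note: the "doing the math" step can be made explicit. As a rough sketch only: in a fully GPU-bound game, frame rate scales roughly inversely with rendered pixel count, and 4K has 2.25x the pixels of 1440p. The 230 fps figure below is hypothetical, just to show the arithmetic:]

```python
# Back-of-the-envelope estimate: assumes a fully GPU-bound game where
# frame rate scales inversely with rendered pixel count. Real games
# deviate from this, so treat it as a rough sketch only.
def estimate_fps(fps_src, src, dst):
    src_pixels = src[0] * src[1]  # pixels at the measured resolution
    dst_pixels = dst[0] * dst[1]  # pixels at the target resolution
    return fps_src * src_pixels / dst_pixels

# Hypothetical: 230 fps at 1440p scales to roughly 102 fps at 4K,
# since 4K renders 2.25x the pixels of 1440p.
fps_4k = estimate_fps(230, (2560, 1440), (3840, 2160))
print(round(fps_4k, 1))  # -> 102.2
```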

People just assume that AMD cards suck, for no reason. I doubt you use an AMD card. UE5 looks amazing and runs more efficiently than anything before it because it renders large amounts of polygons without actually getting bogged down by the "number of polygons". You can just render the shape to have the contours of those polygons without rendering multiple objects.
Kei 4 Jul 2024 @ 10:00pm
cold take but the 4090 is a 1440p card for high fps at most
Tonepoet 4 Jul 2024 @ 11:29pm
Just because some Nvidia users aren't getting good performance out of Hogwarts Legacy doesn't mean they all aren't, and even if they all weren't, that doesn't necessarily mean the fault lies with Nvidia. It could be a bug in that particular game. We'd want at least a three-game sample size (preferably more) to make generalized assumptions about a card's performance and mitigate against such things.

Also, even if the fault lay with Nvidia, it could be fixable through a driver update or firmware rather than being a fault with the hardware. How long have these problems been going on, anyway?

Anyway, Techspot did an extensive suite of tests comparing G.P.U. performance in that particular game too.[www.techspot.com]

In 4K res. Ultra preset with T.A.A. High, the 4080 falls behind the 7900 XTX as the third best card to use with the game (with the 4090 taking top spot); perf is 72 F.P.S., with 59 F.P.S. in the 1% lows.

When you add Ultra Ray Tracing on top of that, the 7900 XTX sinks down to being the fifth best card to use with the game, after the 3090. The 4080 is the second best card in these conditions, and you get 45 F.P.S. with 36 in the 1% lows. The 7900 XTX gets 32 F.P.S. with 23 in the 1% lows. Only the 4090 gets close to 60 F.P.S. (59 to be exact).

In any case, it looks like Nvidia's going to let up the gas on their V.R.A.M. stinginess going forward. The 5090 is rumored to have 32 gigs of V.R.A.M., so we might hope for 4 to 8 gig increases at least across the high and mid-range. Maybe even a 2-4 gig bump in the lower ranges if we're lucky.

Though really, it's no use for me to think about it so much. In my price bracket, I'm probably going to buy an Intel Arc Battlemage card as soon as those are released...
Last edited by Tonepoet; 4 Jul 2024 @ 11:32pm
Skkooomer Lord 5 Jul 2024 @ 12:49am
The game doesn't even work on Nvidia for most users right now lol.

Got nothing to do with perf.
Tonepoet 5 Jul 2024 @ 1:33am
Well, okay, yeah, looks like Nvidia users do have problems with that particular game. The first was with a driver update issue, which seems to have been fixed since, so that's a non-issue. The other issues started around the update in June. Seems like everybody reporting issues is reporting with Nvidia hardware, and given the hardware survey overwhelmingly favors Nvidia G.P.Us., it's likely fair to assume the others might be having such issues too.

I'm not really sure if Nvidia should be blamed for game devs. breaking their game and not being bothered enough by it to roll back the update to an earlier working version within the month or at least offer beta downloads of the earlier versions if they somehow can't completely pull the update though, particularly since it was working well enough beforehand.
Last edited by Tonepoet; 5 Jul 2024 @ 1:41am
Dutchgamer1982 5 Jul 2024 @ 8:04am
To properly run something you need to be able to max out settings and still get 100+ fps.
NO upscaling crap.

Currently only one card is capable of that and is a true 4K card: the 4090.

All other cards on the market are 1440p cards, or perhaps 1440p widescreen if you're into that.

If you are talking "playable but not advisable",
then anything at 60+ fps is "playable"
and the list of 4K "capable" cards is extended:

RX7900XTX = 95 fps
4080Super = 90 fps
4080 = 88 fps
RX7900XT = 82 fps
4070tiSuper = 78 fps
3090ti = 75 fps
4070ti = 72 fps
3080ti = 70 fps
RX6950XT = 69 fps
3080 12GB = 68 fps
4070Super = 66 fps
RX7800XT = 63 fps
3080 = 62 fps
RX6900XT = 61 fps

So yeah, if we look at it that way... Nvidia is correct: a 4070 Super CAN run 4K with ultra settings without upscaling, at 66 fps, so BARELY playable...

A non-Super 4070 is below that 60 fps mark, so it is not playable at 4K.
Last edited by Dutchgamer1982; 5 Jul 2024 @ 8:15am
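[Editor's note: taking the figures above at face value (and treating the 82 fps entry as the RX 7900 XT), the post's two cutoffs can be checked mechanically. This sketch just filters the list by the 100+ fps "properly runs" and 60+ fps "barely playable" thresholds:]

```python
# 4K ultra, no upscaling; fps figures as listed in the post above.
fps_4k_ultra = {
    "RX 7900 XTX": 95, "RTX 4080 Super": 90, "RTX 4080": 88,
    "RX 7900 XT": 82,  # listed ambiguously as a second XTX entry
    "RTX 4070 Ti Super": 78, "RTX 3090 Ti": 75, "RTX 4070 Ti": 72,
    "RTX 3080 Ti": 70, "RX 6950 XT": 69, "RTX 3080 12GB": 68,
    "RTX 4070 Super": 66, "RX 7800 XT": 63, "RTX 3080": 62,
    "RX 6900 XT": 61,
}

proper = [c for c, fps in fps_4k_ultra.items() if fps >= 100]
playable = [c for c, fps in fps_4k_ultra.items() if fps >= 60]

# None of the listed cards clears the 100 fps bar (the 4090 isn't in
# the list), while every one of them clears the 60 fps bar.
print(f"100+ fps: {len(proper)} cards, 60+ fps: {len(playable)} cards")
```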

Date Posted: 4 Jul 2024 @ 1:46am
Posts: 107