RTX 40 vs 50 series
So in traditional frame rendering there is no big difference between the 40 and 50 series, but in frame generation there is a big difference. Since the 40 series will get DLSS 4 with improved frame generation (multi-frame generation is only for the 50 series), this means I will get more frames when using DLSS, so isn't it kinda useless to upgrade from the 40 series to the 50 series, since there is no big difference in traditional rendering? Tell me your opinion guys
Last edited by Helicopter200; Jan 10 @ 5:30am
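As a rough sketch of the gap the OP is describing (illustrative Python; the 40 FPS base rate is a made-up number, and it assumes both cards render at roughly the same native rate):

    # Displayed FPS from the same base render rate
    base_fps = 40                # hypothetical native render rate on either card
    fg_2x = base_fps * 2         # 40 series frame generation: 1 generated per rendered frame
    mfg_4x = base_fps * 4        # 50 series multi-frame generation: up to 3 per rendered frame
    print(fg_2x, mfg_4x)         # -> 80 160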
D. Flame Jan 10 @ 12:52pm
Originally posted by Monk:
I see you are simply too poor to have ever actually tried it, so you won't be the target audience anyway.

Beyond that, that is a VERY extreme example; no other card would even get half of those frames natively.

The complexity and engineering really are lost on people who say 'fake frames'.
I upgraded from a 30 series to a 40 series.

The 50 series is still fake frames.
Monk Jan 10 @ 12:55pm
Originally posted by D. Flame:
Originally posted by Monk:
I see you are simply too poor to have ever actually tried it, so you won't be the target audience anyway.

Beyond that, that is a VERY extreme example; no other card would even get half of those frames natively.

The complexity and engineering really are lost on people who say 'fake frames'.
I upgraded from a 30 series to a 40 series.

The 50 series is still fake frames.

No more fake than anything else; it's simply a new way of doing things.

But hey, good luck with your slide shows, or with simply not using half the new tech developed; I'll be enjoying fluid gameplay with all the eye candy turned up.
Heretic Jan 10 @ 1:03pm
Originally posted by D. Flame:
Nvidia is just a scam now. Instead of increasing real performance, they are just adding new chips to lock down their snake oil to the new hardware.

Then they are putting out hardware that runs at 27 FPS and fakes 3 frames from every 1 real frame.
The input lag must be terrible.
D. Flame Jan 10 @ 1:04pm
Originally posted by Monk:
Originally posted by D. Flame:
I upgraded from a 30 series to a 40 series.

The 50 series is still fake frames.

No more fake than anything else; it's simply a new way of doing things.

But hey, good luck with your slide shows, or with simply not using half the new tech developed; I'll be enjoying fluid gameplay with all the eye candy turned up.
It's literally fake. For every 1 frame rendered, it fakes 3 additional frames.
Monk Jan 10 @ 1:08pm
Originally posted by Heretic:
Originally posted by D. Flame:
Nvidia is just a scam now. Instead of increasing real performance, they are just adding new chips to lock down their snake oil to the new hardware.

Then they are putting out hardware that runs at 27 FPS and fakes 3 frames from every 1 real frame.
The input lag must be terrible.

It's not, though. This isn't Lossless Scaling's rubbish program but very expensive dedicated hardware. It really isn't noticeable unless you are a very high-end competitive player, but then, any competitive game doesn't need it....
r.linder Jan 10 @ 1:22pm
The 5090's 27 FPS is at 4K with settings absolutely maxed in Cyberpunk 2077 without DLSS, which doesn't seem like much, but the difference over the 4090 was 7 frames (the 4090 got 20 FPS), which actually makes the 5090 35% faster than the 4090 at the same settings in the same game.

There is still a typical generational increase; it's just that most of the gains they're advertising are through multi-frame generation, which is currently only possible on the 50 series due to architectural improvements.

Frames obtained through DLSS and frame generation aren't fake just because AI is involved; it's AI that's integrated into the GPU. It's still the GPU. They're improving more on the AI front, but that's because it's newer and capable of more.
Last edited by r.linder; Jan 10 @ 1:55pm
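A quick check of that 35% figure (Python sketch; the 20 FPS baseline for the 4090 is inferred from the stated 7-frame gap):

    # Generational uplift from the quoted Cyberpunk 2077 numbers
    fps_4090 = 27 - 7            # 20 FPS, per the 7-frame difference
    fps_5090 = 27
    uplift = (fps_5090 - fps_4090) / fps_4090
    print(f"{uplift:.0%}")       # -> 35%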
Originally posted by D. Flame:
It's literally fake. For every 1 frame rendered, it fakes 3 additional frames.

Can you describe a real frame to me so I know the difference between a frame generated on a GPU vs a frame generated on a GPU?

Originally posted by Monk:
I see you are simply too poor to have ever actually tried it, so you won't be the target audience anyway.

Beyond that, that is a VERY extreme example; no other card would even get half of those frames natively.

The complexity and engineering really are lost on people who say 'fake frames'.

Well, Nvidia has only invented every change in the way graphics are rendered and is the company that has single-handedly brought every new leap in tech to the gaming industry.
I think these guys sat at home in their bedrooms know best, personally, and if you give them a chance they will lay down a way better 10-year plan than Nvidia can come up with; it's why these guys earn the big bucks.
Helicopter200 Jan 10 @ 2:20pm
Originally posted by Monk:
I see you are simply too poor to have ever actually tried it, so you won't be the target audience anyway.

Beyond that, that is a VERY extreme example; no other card would even get half of those frames natively.

The complexity and engineering really are lost on people who say 'fake frames'.
Poor? Dude, I have a 4080 with an i7-13700K and I'm poor? :lunar2019crylaughingpig:
D. Flame Jan 10 @ 2:30pm
Originally posted by D. Flame:
It's literally fake. For every 1 frame rendered, it fakes 3 additional frames.

Can you describe a real frame to me so I know the difference between a frame generated on a GPU vs a frame generated on a GPU?

Originally posted by Monk:
I see you are simply too poor to have ever actually tried it, so you won't be the target audience anyway.

Beyond that, that is a VERY extreme example; no other card would even get half of those frames natively.

The complexity and engineering really are lost on people who say 'fake frames'.

Well, Nvidia has only invented every change in the way graphics are rendered and is the company that has single-handedly brought every new leap in tech to the gaming industry.
I think these guys sat at home in their bedrooms know best, personally, and if you give them a chance they will lay down a way better 10-year plan than Nvidia can come up with; it's why these guys earn the big bucks.
Sure, if your game is actually rendering at 20 FPS, you will have the latency of 20 FPS while playing. Then it will generate 3 additional fake frames, so your game will LOOK like it has 80 FPS, but your latency will still be that of a 20 FPS game. Meanwhile, a game rendering at 60 FPS without fake frames will actually have better latency than your fake 80.
Last edited by D. Flame; Jan 10 @ 2:31pm
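A minimal sketch of that latency argument (assuming, for illustration, that input latency tracks the interval between real rendered frames, ignoring Reflex-style mitigations and frame-gen overhead):

    # Frame pacing vs displayed FPS with 4x multi-frame generation
    base_fps = 20
    displayed_fps = base_fps * 4             # 1 rendered + 3 generated = 80 shown
    render_interval_ms = 1000 / base_fps     # ~50 ms between real frames
    native_60_ms = 1000 / 60                 # ~16.7 ms at a real 60 FPS
    print(displayed_fps, render_interval_ms, round(native_60_ms, 1))  # -> 80 50.0 16.7

On those assumptions, the 80 FPS output still paces input at roughly 50 ms per real frame, which is the gap being pointed at here.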
r.linder Jan 10 @ 2:41pm
Originally posted by D. Flame:

Can you describe a real frame to me so I know the difference between a frame generated on a GPU vs a frame generated on a GPU?



Well, Nvidia has only invented every change in the way graphics are rendered and is the company that has single-handedly brought every new leap in tech to the gaming industry.
I think these guys sat at home in their bedrooms know best, personally, and if you give them a chance they will lay down a way better 10-year plan than Nvidia can come up with; it's why these guys earn the big bucks.
Sure, if your game is actually rendering at 20 FPS, you will have the latency of 20 FPS while playing. Then it will generate 3 additional fake frames, so your game will LOOK like it has 80 FPS, but your latency will still be that of a 20 FPS game. Meanwhile, a game rendering at 60 FPS without fake frames will actually have better latency than your fake 80.
Just wait until they keep improving DLSS to the point where it doesn't add any latency. Frame generation was new last generation; it's going to take multiple revisions to iron out all of the problems.

This is true for pretty much any hardware change. AMD had latency issues to resolve after Zen 2 split the die, with the I/O die adding latency and the Ryzen 9 SKUs having latency between chiplets, and Intel is going through the same thing with Arrow Lake's I/O die and its P-cores and E-cores. Eventually the technology is improved to the point that the latency isn't noticeable.
Last edited by r.linder; Jan 10 @ 2:43pm
Originally posted by D. Flame:
Sure, if your game is actually rendering at 20 FPS, you will have the latency of 20 FPS while playing. Then it will generate 3 additional fake frames, so your game will LOOK like it has 80 FPS, but your latency will still be that of a 20 FPS game. Meanwhile, a game rendering at 60 FPS without fake frames will actually have better latency than your fake 80.

That's not describing a real frame; you're on about FPS and its tie to control input, the linear process of which controls latency. Now, if only Nvidia were creating a method to help with this change...
Last edited by ¤☣wing☢zeяo☣¤™; Jan 10 @ 2:43pm
D. Flame Jan 10 @ 2:48pm
Originally posted by D. Flame:
Sure, if your game is actually rendering at 20 FPS, you will have the latency of 20 FPS while playing. Then it will generate 3 additional fake frames, so your game will LOOK like it has 80 FPS, but your latency will still be that of a 20 FPS game. Meanwhile, a game rendering at 60 FPS without fake frames will actually have better latency than your fake 80.

That's not describing a real frame; you're on about FPS and its tie to control input, the linear process of which controls latency. Now, if only Nvidia were creating a method to help with this change...
A real frame is a frame rendered by the GPU from the game, and that is what the latency will be based on. The fake frames are the 3 following frames that are just "made up" and pasted between that real frame and the next real frame.
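As a toy illustration of that interleaving (labels only; real frame generation synthesizes images between rendered ones, and the names here are made up):

    # Toy timeline for 4x multi-frame generation: 3 generated frames per rendered frame
    rendered = ["R0", "R1", "R2"]                       # frames the game actually renders
    timeline = []
    for i, frame in enumerate(rendered[:-1]):
        timeline.append(frame)                          # the real frame
        timeline += [f"G{i}.{k}" for k in range(1, 4)]  # 3 "made up" in-betweens
    timeline.append(rendered[-1])
    print(timeline)  # ['R0', 'G0.1', 'G0.2', 'G0.3', 'R1', 'G1.1', 'G1.2', 'G1.3', 'R2']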
Monk Jan 10 @ 2:53pm
Originally posted by D. Flame:

Can you describe a real frame to me so I know the difference between a frame generated on a GPU vs a frame generated on a GPU?



Well, Nvidia has only invented every change in the way graphics are rendered and is the company that has single-handedly brought every new leap in tech to the gaming industry.
I think these guys sat at home in their bedrooms know best, personally, and if you give them a chance they will lay down a way better 10-year plan than Nvidia can come up with; it's why these guys earn the big bucks.
Sure, if your game is actually rendering at 20 FPS, you will have the latency of 20 FPS while playing. Then it will generate 3 additional fake frames, so your game will LOOK like it has 80 FPS, but your latency will still be that of a 20 FPS game. Meanwhile, a game rendering at 60 FPS without fake frames will actually have better latency than your fake 80.

And how do you think you'll be getting those 80 FPS exactly? I mean, nothing exists that can run CP2077 truly maxed out at 4K native anywhere near that, so your argument is moot.

Edit.

Also, any added latency is very small, and it does not play like it's at 30 FPS; there is a VERY distinct difference in how 30 or 60 FPS feels vs, say, DLSS/frame gen at 120+.

Maybe try the tech out?
Last edited by Monk; Jan 10 @ 2:55pm
Originally posted by D. Flame:

That's not describing a real frame; you're on about FPS and its tie to control input, the linear process of which controls latency. Now, if only Nvidia were creating a method to help with this change...
A real frame is a frame rendered by the GPU from the game, and that is what the latency will be based on. The fake frames are the 3 following frames that are just "made up" and pasted between that real frame and the next real frame.

And that's why they are revamping input, which will only get better. Now, I'll give you that for now twitch shooters still rock native, but for story games that push FX you won't tell.

Also, they have to move, as they are phasing out rasterization; it's pretty much a dead end now. Two things will be phased out in gaming over the long term, the second being x86; both are very messy and unnecessarily bulky now with legacy crap.

Consumers cry "now, now, now" while innovators see a future; maybe we should have stuck to 2D sprites and called it a day.
D. Flame Jan 10 @ 3:15pm
Originally posted by D. Flame:
A real frame is a frame rendered by the GPU from the game, and that is what the latency will be based on. The fake frames are the 3 following frames that are just "made up" and pasted between that real frame and the next real frame.

And that's why they are revamping input, which will only get better,
Yes, I am aware that they are trying to make a band-aid to counteract the issues caused by their fake frames, but that doesn't change anything.
