I would imagine future titles will have true native FSR4 support, which looks very convincing in the footage I have seen.
But if this driver override works well enough, then it's not a big deal.
SteamDB was updated not too long ago. Branches is now under Depots. Everything is still there, just on the left sidebar, and it all still works like it did; they just moved some things around. There is only one branch now, as they removed the older, outdated ones.
If it's still not working for you, you may have to clear your cache/cookies for the site.
So far, the DLSS 4 driver overrides seem to be working quite well, according to the users reporting on it in that other thread. So what AMD is doing now looks really good for them. Lots of progress made in a short amount of time.
Hey, good stuff! You know, I was thinking about it, and I see the RX 9070 XT is just as power-hungry as my current RTX 3080. I really wanted a GPU that was more power-efficient.
So when it's in stock, I think I'm just going to get an RTX 5080, though I'm still on the fence about it. That will be part of a whole new build anyway.
Just curious, what is your CPU? Let us know how it works out for you.
https://www.youtube.com/watch?v=nzomNQaPFSk
https://www.youtube.com/watch?v=H38a0vjQbJg
The RX 9070 XT is meant to squeeze the last bit of performance out of the chip; it is pushed well past its optimum efficiency point. That is why it is as power-hungry as an RX 7900 XT, but somewhat more powerful.
The RX 9070 is very efficient. It essentially nullifies Nvidia's efficiency advantage at the same GPU power tier, and it probably sits spot on in the power/efficiency "sweet spot" for the chip.
You might want to see this review :).
https://www.youtube.com/watch?v=LhsvrhedA9E
Only about 10% less powerful than the RX 9070 XT, but with 80-100 W less power draw :). That also tells me you will be able to undervolt or frequency-limit the RX 9070 XT in a way that costs only 2-5% performance (if even that) with massively better efficiency. This must be so, since the RX 9070 XT and RX 9070 are the very same chip :).
You need to tinker with such chips a little. Actually, both GPUs would be a nice enough upgrade from an RTX 3080, but in your place I would buy the 9070 XT and tune it to my liking. A rough perf-per-watt sketch is below.
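Just to make that concrete, here is a back-of-the-envelope perf-per-watt comparison. The ~10% performance gap comes from the review above; the ~300 W and ~220 W board powers are round numbers I am assuming for illustration, not measured values:

```python
# Rough perf-per-watt comparison of RX 9070 XT vs RX 9070.
# All numbers are illustrative assumptions, not measured values.

xt_perf, xt_power = 1.00, 300          # RX 9070 XT: baseline performance, ~300 W assumed
non_xt_perf, non_xt_power = 0.90, 220  # RX 9070: ~10% slower, ~220 W assumed

xt_eff = xt_perf / xt_power                  # relative performance per watt
non_xt_eff = non_xt_perf / non_xt_power

gain_pct = (non_xt_eff / xt_eff - 1) * 100
print(f"RX 9070 comes out ~{gain_pct:.0f}% more efficient per watt with these numbers")
```

With those assumptions, the plain RX 9070 comes out roughly 20-25% more efficient per watt, which is also why a mild undervolt or frequency cap on the XT should recover most of that gap.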
Overall, the RX 9000 series seems to be a very large leap forward for AMD Radeon, both on the architecture and the software (upscaling) side. It is very nearly at parity with Nvidia now (against the RTX 5070 Ti), at a much better price. The only real advantage left for Nvidia is heavy ray-tracing titles; while the 9000 series is much better than the RX 7000 series there, it is clearly not equal. But those games you can count on one hand, and any GPU below an RTX 4090 struggles in such titles (Black Myth: Wukong, Cyberpunk) with all the RT bells and whistles anyway.
So, really a nice step forward. The next GPU generation should be really interesting.
Hey, thanks for sharing. Yeah, I saw those videos as well, but again, it's not so much about the upscaling side of things for me. It's about the power draw, and the 50 series is very efficient there, with better performance at lower wattage.
We'll see how NVIDIA responds next gen, that's for sure, as the upgrade from last gen is really not there, and so far the reviews aren't that good, especially with the manufacturing defects. But I'd be going from a 3080, as you know, so the upgrade would make sense for me.
I'm sure NVIDIA will have the kinks solved by then, and by the time I upgrade there hopefully won't be any issues. I will think about it some more. I'm in no rush.
the upscaling looks good on generic content. i reckon it should work well.
most definitely out of context, though, assuming framegen is included (see my comment below).
(neat to have more interpolated frames, but it comes with a compute cost. andy... how's the input latency? the experience?)
you may wanna play parts of it like a third-person shooter, and frame delay is not really bueno for that. frame delay hurts cinematic performance too if you target the wrong frame rate, like a 30 to 24 conversion. ig got it down to a solid tho. it looks good at 48. i know i may have to drop the target fps for part 2. lots of scenery. midrange gamer not giving in to the AI hype. :)
thanks for the points, animal. not worth a lot, but i see it. :)
And when people want to play at 4K but only have an RTX 4070 Ti Super or equivalent, upscaling sure helps from time to time to get more than 60-70 FPS. Quality-mode DLSS 4 or FSR4 upscaling to 4K definitely looks better than native 1440p, so... I also do not use upscaling often except for UE5, but in some scenarios it is a very good experience and can enable "sort of" high-resolution 4K gaming, close to the native experience, at a low price tag.
In this respect, upscaling is completely different from frame generation: upscaling truly enhances performance, while frame gen does not.
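To make that difference concrete, here is a toy comparison; every number is an assumption just for illustration, not a measurement from any particular game or GPU:

```python
# Toy comparison of how upscaling vs frame generation affect responsiveness.
# Every number here is an assumption for illustration only.

base_fps = 30                            # assumed "native" render rate
base_frame_ms = 1000 / base_fps          # ~33 ms per rendered frame

# Upscaling: the GPU really renders faster (assume it reaches 50 fps).
upscaled_fps = 50
upscaled_frame_ms = 1000 / upscaled_fps  # ~20 ms: smoother AND more responsive

# Frame generation: on-screen fps doubles, but real frames still take ~33 ms,
# and the interpolator has to hold one real frame back before it can blend,
# so input latency gets worse, not better.
framegen_display_fps = base_fps * 2
framegen_latency_ms = base_frame_ms * 2  # roughly one extra rendered frame of delay

print(f"native:   {base_fps} fps, ~{base_frame_ms:.0f} ms/frame")
print(f"upscaled: {upscaled_fps} fps, ~{upscaled_frame_ms:.0f} ms/frame")
print(f"framegen: {framegen_display_fps} fps shown, input latency ~{framegen_latency_ms:.0f} ms")
```

The point being: upscaling lowers the real frame time, so both smoothness and responsiveness improve, while framegen only inserts frames between the real ones, so responsiveness stays at (or worse than) the base frame rate.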
is frame gen included in this? i wrote my comment under the assumption it is. and yes, frame gen doesn't help at all unless you already have a playable framerate and wanna push your fps for some gamer reason.
FSR4 is therefore a genuine step forward for AMD Radeon, since it gives you image quality on par with DLSS 3 or even slightly better. Upscalers will have less image quality than native, but can have better AA (eliminating TAA blur), and as I said, upscaling to 4K with FSR4 or DLSS 3/4 looks better than 1440p native. So if I need such tech to get high FPS (more than 60 FPS in shooters, for example), I would definitely use it. Framegen..., no way.