So it's ~2.4 mil. -> 1440p vs. ~2.2 mil. -> 4K. Maybe I shouldn't have said "exactly" but whatever.
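A rough sketch of where figures like that can come from, assuming the comparison is a 3840 x 1600 ultrawide against standard 2560 x 1440 and 3840 x 2160 (those exact resolutions are an assumption here, not something spelled out in the thread):

```python
# Pixel-count differences, assuming (not stated explicitly) a 3840 x 1600
# ultrawide compared against standard 2560 x 1440 and 3840 x 2160.
def pixels(w, h):
    return w * h

ultrawide = pixels(3840, 1600)   # 6,144,000
qhd       = pixels(2560, 1440)   # 3,686,400
uhd_4k    = pixels(3840, 2160)   # 8,294,400

print(f"extra pixels vs. 1440p: {(ultrawide - qhd) / 1e6:.2f} mil.")    # ~2.46 mil.
print(f"pixels short of 4K:     {(uhd_4k - ultrawide) / 1e6:.2f} mil.")  # ~2.15 mil.
```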
Regarding the extra elements to render, I'm not sure what you're getting at; it depends entirely on how ultrawide is implemented, and in general the performance impact is almost linear to the pixel count. I think I saw only one or two outliers when I researched this topic before buying my monitor, but nothing noteworthy, and to this day I've never come across any meaningful source pointing to a massive performance increase due to a wider field to draw.
Besides this, you absolutely missed the point of my post.
I think I understood your point; from what I gathered, you tried a product, found out it worked better than you thought in many regards, but ultimately still decided to go with something more?
I was mostly replying to say that it won't always be this way.
Though, more often than not, it probably will, yes. Especially when you factor in that most people will have more modest GPUs, which are more likely to be pixel fill rate limited. I was probably thinking more broadly (the "extra demand" in ultrawide cases also comes in the form of more CPU load, whereas you weren't talking about that and were just talking about the GPU side of things).
Since I'm not afraid to admit when I'm "wrong", I'll show you the example I was going to use just to show the "extra rendering" a wider aspect ratio will do. The white lines represent the edge of a 2560 x 1600 image on top of the same scene at 3840 x 1600.
https://hosting.photobucket.com/images/t464/Garnet_Alexandria/0.jpg
But I also have performance numbers from both resolutions (along with 4K) that I got for curiosity's sake, which in this case actually support what you say and not what I say, and I'm not afraid to admit that (sample size of one, though). Here they are.
2560 x 1600
https://hosting.photobucket.com/images/t464/Garnet_Alexandria/2_u9LqPXi2JQdxvgZQp58CKM.jpg
3840 x 1600
https://hosting.photobucket.com/images/t464/Garnet_Alexandria/4_ijV9JWNkqm4rNLhXvfHZLH.jpg
3840 x 2400
https://hosting.photobucket.com/images/t464/Garnet_Alexandria/6_bgW78KcWqaVRH1GBR6TN7F.jpg
Just as you say, pretty linear, so probably largely fill rate limited here. So in my one example I was wrong and I admit that. I just wanted to say that SOMETIMES there will be more to it than just the raw pixel count, which is still true.
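Just to spell out the pixel math behind "pretty linear" (the FPS numbers themselves are in the screenshots above; this is only the raw pixel counts for the three resolutions tested):

```python
# Pixel counts for the three resolutions tested above; in a largely
# fill-rate-limited case, frame time should scale roughly with these.
resolutions = [(2560, 1600), (3840, 1600), (3840, 2400)]
counts = [w * h for w, h in resolutions]

for (w, h), c in zip(resolutions, counts):
    print(f"{w} x {h}: {c / 1e6:.2f} MP")    # 4.10, 6.14, 9.22 MP

# Each step up is the same 1.5x jump in pixels.
for prev, cur in zip(counts, counts[1:]):
    print(f"step: {cur / prev:.2f}x pixels")  # 1.50x, 1.50x
```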
In either case, I'm glad you found something you were happy with. I'm looking at upgrading from my aged GTX 1060 very soon too, but an RTX 4070 Ti is just a bit too expensive for me, and 12 GB VRAM would be less than I would be comfortable with long term. If the RTX 4070 had 16 GB, I'd probably look at it myself. And there are rumors of higher-VRAM refreshes coming later, but I'm not sure I want to keep waiting that long.
Out of curiosity, was the RTX 4080 an easy choice? I know an RTX 4090 is quite a bit more expensive than the RTX 4070 Ti you were originally looking at, but I still think the RTX 4090 would have made the RTX 4080 a tough choice to go with.
I went from a 970 to a 2070 super and it was heaven. So you're in for a treat whatever card it will be.
When I saw I needed about 25% more performance to play Cyberpunk at ~7K, it was decided. The first time I saw 7K in Cyberpunk with my 2080 Ti, I knew this was what I wanted. So yeah... it was a rather easy choice to go for the 4080 :D
7K Cyberpunk was the first time in a game I thought "this is almost real life."
Also, right now is a rather unique situation in which the 4080's performance increase is higher than its price increase.
According to TechPowerUp and UserBenchmark, probably about a 50-60% raster performance increase.
For ray tracing and DLSS I would guess closer to a 100%+ performance increase.
That's actually kind of why I was asking. The performance increase of the RTX 4090 over the RTX 4080 seems to be higher than the price increase as well.
I "sort of" see the point to the RTX 4070 Ti even if its value is bad in my eyes, due to the gap in price, but the RTX 4080 just seems too "close" to the RTX 4090. So I was just curious if you had given it a look at all, or if you were happy enough with the RTX 4080 (sounds like the RTX 4070 Ti was mostly enough for you anyway so I get it it from a "why spend even more than I have to" perspective).
DLSS Ultra Performance = 1/3 scale internal rendering.
4k = 3840 x 2160
8k/7k = 7680 x 4320
3840/3 = 1280
7680 / 3 = 2560...
As in 1280 x 720 and 2560 x 1440.
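As a sketch of that arithmetic (the 1/3 factor for Ultra Performance is the one stated above; the function name is just for illustration):

```python
# Internal render resolution for a DLSS preset, with the scale factor
# applied per axis. Ultra Performance = 1/3, as stated above.
def internal_resolution(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

ULTRA_PERFORMANCE = 1 / 3

print(internal_resolution(3840, 2160, ULTRA_PERFORMANCE))  # (1280, 720)
print(internal_resolution(7680, 4320, ULTRA_PERFORMANCE))  # (2560, 1440)
```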
So what you are actually saying in that long OP is that the 4070 Ti is a top-end 720-1080p card, and a decent but not great one at 1440p (too much data, though, and it brought the performance down), and that if one is willing to accept quality (but fake) resolution and frames, along with compressed textures, the card can push 4K+ outputs, but not a native internal render.
Sadly, this is not how DLSS works, performance- and VRAM-wise.
This would make things sooo much easier :D
The price/performance increase from the 4080 to the 4090 is still linear, which is actually insane.
But I really don't want to keep moving the goalposts. You can always go for more and find reasons why it makes sense. "Price/performance" is still just one of several metrics.
Also, the 4070 Ti still works pretty well with my system. The 4080 will too, but it'll be borderline.
The 4090 would definitely need a new system.
If I kept the 4070 Ti, I'd probably still be happy until whatever model of the 50 series catches my interest. With the 4080, though, it's almost certain that I will skip the 50 series.