The general rule of thumb is to drop 10 points for each GPU generation. Since generations are annual releases, a card is wiped out of the "current gen" performance metrics after about 9 years, and roughly 5 years after that it typically reaches integrated-graphics territory.
RTX 4090 (2023) -> 5080 (2024) -> 6070 (2025) -> 7050 (2026) -> 8040 (2027) -> RTX 9030 (2028) -> RTX 10th Gen x20 class (2029) -> RTX 11x10 class (2030)
By the end of the decade we will probably need a new naming system for NVIDIA GPUs; I doubt "RTX 10020" and "RTX 11010" would be marketable. Maybe go Intel-ish with n3-10020 and n3-11010?
Similarly, the 1080 Ti ages:
1080 Ti (2020) -> 2080 (2021) -> 3070 (2022) -> 4060 (2023) -> 5050 (2024) -> 6040 (2025).
That somewhat answers the OP's question: the 1080 Ti enters 40-class performance around 2025, which would be about a 4 on a scale of 1-10, dropping it below "midrange". That is roughly on par with the expected end of the PlayStation 5 lifecycle.
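Read strictly, the rule of thumb above can be sketched as a tiny model. This is purely illustrative (function and names are made up here, and real product positioning, like the Ti suffix, can shift a card by roughly half a tier):

```python
# Minimal sketch of the "drop 10 class points per annual generation" rule.
# Model numbers are treated as series prefix (40 in "4090") + class (90).
def equivalent_model(series: int, perf_class: int, generations: int) -> str:
    """Roughly where a card's performance lands after N annual generations."""
    new_series = series + 10 * generations       # e.g. 40 -> 50 -> 60 ...
    new_class = perf_class - 10 * generations    # e.g. 90 -> 80 -> 70 ...
    if new_class <= 0:
        # A 90-class card zeroes out after 9 generations, per the rule above.
        return "integrated-graphics territory"
    return f"RTX {new_series}{new_class}"

print(equivalent_model(40, 90, 1))  # RTX 5080
print(equivalent_model(40, 90, 9))  # integrated-graphics territory
```

The 1080 Ti sequence in the thread runs slightly ahead of this strict formula (2080 rather than 2070 after one generation), which is the Ti half-tier bump at work.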
0 is Intel integrated graphics, which is up to three console generations behind, with top-end integrated graphics being a "1" and only two console generations behind.
1-3 is desktop-class business use, typically two console generations behind.
4-6 is mid-range gaming, with anything less than a 5 being scoffed at. Price to performance is generally most cost-effective here. Expected to be competitive with current-gen consoles.
7-9 is high-end enthusiast class, expected to exceed the abilities of current-gen consoles.
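The tier buckets above can be written as a simple lookup. This is a hypothetical encoding, assuming the 1-3 / 4-6 / 7-9 buckets as stated (the overlap between "1 = top-end integrated" and the 1-3 business tier is left unresolved, as it is in the thread):

```python
# Illustrative mapping of the 0-10 performance score to the tiers above.
def tier_label(score: int) -> str:
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score == 0:
        return "integrated graphics"         # up to three console gens behind
    if score <= 3:
        return "desktop-class business use"  # typically two console gens behind
    if score <= 6:
        return "mid-range gaming"            # competitive with current consoles
    return "high-end enthusiast"             # exceeds current-gen consoles
```

So the 1080 Ti landing at a "4" around 2025 would still read as mid-range gaming, just at the scoffed-at end of it.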
---
What is most interesting are the discussions about Moore's Law not holding up and compute power plateauing, along with the growing popularity of Mini PC designs and the ARM architecture.
The 1080ti is fully expected to be rendered obsolete by progress, but will that hold true for all subsequent cards? Suddenly there is doubt as to whether an RTX 4090 will ever be rendered entirely "obsolete", at least on the x86 platform.
An RTX card shouldn't lose performance running RTX! How stupid is that?!! Yet people buy them.
I'm not surprised though. Nvidia marketing is impressive. The marketing team deserves more praise than the engineering.
That's the kind of answer I've been looking for, and you laid it all out nicely.
I thought my 3060 Ti was the equivalent of a 2080 Super... does that jibe with your analysis?
Aside from the fact that FSR looks like sh!t
Granted I've only ever seen them on RDR2, No Man's Sky, and Death Stranding, all at 1080p.
But I've tried the other methods and they all look muddy or glitchy.
I used DSR back when I had a 1660 Ti, and it seemed to sharpen up the graphics a lot, but I don't need to with a 3060 Ti. I thought it looked better, but who knows.
If you need FPS help at 1080p, you need better PC hardware, hands-down.
I don't need more than 60 frames anyway.
Can you help me understand how it is night and day going from the 1080 Ti to the 4080 at 1080p? I have been on the border of upgrading my 1080 Ti, but I play at 1440p and run new games very smoothly. I am trying to figure out if I am missing out.
That said, if you haven't naturally arrived at the conclusion you need to upgrade, then what you have now probably isn't in need of replacing regardless of how much faster newer stuff is. It sounds like you're about to reach that point with the phrase "I have been on the border of upgrading" though.
If you're not in a rush to upgrade, the upcoming SUPER launches might be worth waiting on if you're considering the RTX 4080, because the RTX 4070 Ti and RTX 4080 are pretty mediocre values (though with the RTX 4090 going to the moon, the RTX 4080 looks less bad) and nVidia might slightly improve on that with the SUPER models at those spots in the lineup. Hard to say though since this is modern nVidia we're talking about.
is off the charts as well. Those who say different either don't have it or have too low a card to use it. I dropped my 1080 Ti for a 3080 and it was mind-blowing, while waiting on my 3090. I gave the 3090 to the wife and jumped on a 4090. The 3000 and 4000 series cards are leaps and bounds above the 10 series cards; to think any different, you're just lying to yourself. That doesn't mean there's anything wrong with the 10 series cards, but realistically they are now low-tier cards.
It's the RTX cards that have hardware shortcuts to do it faster.
Fire up the Quake II RTX demo on Intel HD graphics if you doubt me; it just won't run at a playable FPS.
You can crank CP 2077 up to max even with RT on (and I don't mean the standard RT mode but the experimental, much more advanced one) and you'll get around 60 FPS without DLSS on a 4080 at 1080p, which I couldn't even dream about when I had my 1080 Ti, just to give you an example of how powerful it is. Also, what do you expect me to say except that I'm set for the next few years with this card? But in your case I'd wait a bit, because the 50XX series will probably launch next year and prices for this generation may drop.