No.
As long as you don't count Ray Tracing as "Max Settings".
The popular games in the pool for "benchmarking" still include titles that the 3060 Ti can run at Ultra.
We are just now getting into the Post-8GB era where the 3060 is starting to be reduced to High or Medium.
Also, 1080p is no longer considered a high resolution. Textures are VRAM-limited; resolution is GPU-processing-limited. The 3060 is generally quite capable of 1080p. 1440p at Low settings and 80 FPS seems reasonable for the 3060, especially if DLSS is involved.
Onboard graphics has gotten better. That's why the x40 and below have either disappeared or get refreshed very rarely.
If you're buying a basic display adapter for an older PC, even the lower end ones available now are probably good enough. If not, newer onboard is better.
So for dedicated cards, yes, the x50 is basically entry level now. And honestly, even the x60 (starting with the RTX 30 series) feels more upper entry level as opposed to mid range to me. The RTX 2060 to RTX 3060 uplift was relatively small (the RTX 2060 was above the GTX 1070 and the RTX 3060 was roughly equal to the GTX 1080, so the uplift was there, but barely). The RTX 3060 Ti was the "real" RTX 3060, but it brought the price up to $400 and lacked VRAM too. The upcoming RTX 4060 Ti is also rumored to have 8 GB VRAM and potentially cost $450. I'd say that makes it dead on arrival (honestly, even if it's $400), yet the RTX 3060/3070 sold in high numbers despite being pretty flawed spots in their lineup, so the x60 might sell well simply because it's nVidia and has the lowest price. Then again, nothing in the 40 series besides the RTX 4090 is selling well, so here's to hoping consumers wise up and don't eat the RTX 4060 up just because it has nVidia and x60 in the name.
And the RTX 4050 is rumored to have 6 GB on a 96-bit bus. Just rumors for now, granted. To be honest, I'm surprised they're still bothering with the x50; as I said, the x60 feels entry level now. In CUDA core count (but also performance), the x60 used to be around half of the top SKU (which used to be the x80), and the x50 used to be around half of the x60, or a quarter of the top. How about now? The rumored RTX 4060 Ti is going to be around a quarter of the RTX 4090. So I'll let you work out what the RTX 4060 Ti REALLY is, despite what it's going to be called (hint: it's an RTX 4050 being over-named to justify higher pricing).
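To put rough numbers on that ratio argument, here's a quick back-of-the-envelope sketch in Python. The core counts are approximate published specs (the RTX 4060 Ti figure was still a rumor when this thread was written), and the comparison is my own illustration of the point, not anything official:

```python
# Approximate CUDA core counts (the RTX 4060 Ti number was only a rumor at the time).
cores = {
    "GTX 1080": 2560,      # top SKU of its generation
    "GTX 1060": 1280,      # roughly half the top
    "GTX 1050": 640,       # roughly a quarter of the top
    "RTX 4090": 16384,     # current top SKU
    "RTX 4060 Ti": 4352,   # rumored
}

def ratio(card: str, flagship: str) -> float:
    """Fraction of the flagship's CUDA cores that a given card has."""
    return cores[card] / cores[flagship]

print(f"GTX 1060 vs GTX 1080:    {ratio('GTX 1060', 'GTX 1080'):.2f}")    # ~0.50
print(f"GTX 1050 vs GTX 1080:    {ratio('GTX 1050', 'GTX 1080'):.2f}")    # ~0.25
print(f"RTX 4060 Ti vs RTX 4090: {ratio('RTX 4060 Ti', 'RTX 4090'):.2f}") # ~0.27
```

By the old "quarter of the top SKU" yardstick, the 4060 Ti sits right about where an x50 used to.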
There's a reason the RTX 4070 (non-Ti) feels "barely mid range". It's what the RTX 4060 (not Ti) should be. It's the old "entry level to mid range" basically, and yet it costs $600. No wonder it's not selling.
In other words, the lineups have been changing in recent years. The old times where the x40 and below existed, and the x60 was actually firmly middle, are long, long, LONG gone.
Substantially. The RTX 3080 Ti, 3090, and 3090 Ti aren't all that far apart. The x90 number used to be for dual-GPU graphics cards, but was reintroduced here as a new SKU. It wasn't really a gaming GPU (even though gamers ate it up too), so it wasn't much more powerful: the RTX 3080 launched first as the flagship, and nVidia only had so much more they could do beyond it.
The RTX 4090, on the other hand, launched first as the flagship, and was a massive uplift, but the RTX 4080 was severely cut down from it. As far as I'm concerned, everything in the RTX 40 series lineup below the RTX 4090 itself is over-named by a whole SKU, in order to help ease the justification for it being overpriced. That is, the RTX 4080 is really an RTX 4070 or 4070 Ti, and the real RTX 4080 (which will no doubt come as the RTX 4080 Ti) is still missing. And it's like this because the new generation had to supplement the old one and could NOT afford to replace it. Why? Too much stock left over from the cryptocurrency times.
The reason nVidia had to make a new generation at all is that they were contractually obligated to do so before demand fell.
AMD is dealing with it a different way: they simply AREN'T rolling out the rest of the new generation yet (which is why it's only the 7900 series so far). I guess they didn't have the same oversupply problem nVidia did, since nVidia holds WAY more market share and thus moves way more volume.
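To make that "severely cut down" claim concrete, here's a rough comparison of how far the second card sits below the flagship in each of the last two generations (core counts are approximate published specs; the percentages are my own arithmetic, not from the post above):

```python
# Share of the flagship's CUDA core count that the next card down gets,
# RTX 30 series vs RTX 40 series (approximate published figures).
gens = {
    "RTX 3080 vs RTX 3090": (8704, 10496),
    "RTX 4080 vs RTX 4090": (9728, 16384),
}

for label, (card, flagship) in gens.items():
    print(f"{label}: {card / flagship:.0%} of the flagship")
# RTX 3080 vs RTX 3090: ~83% of the flagship
# RTX 4080 vs RTX 4090: ~59% of the flagship
```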
In other words, this is thus far a "lost generation".
But yes, the RTX 4090 is a massive performance uplift over the RTX 3090 Ti. Even the RTX 4080 is. You can easily see this in any review.
"Feels".
Not making the x40 and below does not make the x50 "Entry Level".
It merely means that onboard graphics has improved to the point where it is the entry level. Just having a computer is entry level.
Just above onboard is the Steam Deck.
Or is the Steam Deck not considered a "Gaming" device?
x50 is entry level, might play newer demanding games at lowest settings
x60 is mainstream
x70 is good gaming level
x80 is enthusiast
x90 is dev or rendering work
steam deck, meh, it can play games on its own at its res of 1280x800, it's not even full hd, just around 'hd' (1280x720)
for a handheld it's pretty strong, but not compared to a desktop with a dedicated gpu
imho, the best use for the steam deck is to play on the go, while streaming from a much more capable desktop with a fast internet connection
These are all subjective, arbitrary terms. Where a card stands changes from person to person. Namely, and as my very first post in this thread states, it depends on whether you're using the definition of looking at a single lineup, or at the wider gaming market. What is entry level, mid range, or high end will differ DRASTICALLY depending on which of the two sides you lean towards.
You're not doing it any differently by using the number alone, either. That is a more flawed method as far as I'm concerned because it ignores how the chips TRULY stack up in an era where they are being shifted relative to one another. You cannot just look at the number anymore. The x60 is, objectively, what the x50 used to be. There's a gap between the x70 and x80 larger than there has ever been. Everything is cut down. Ergo, what used to be mid range may now be slipping towards entry level (the x60 in this example).
This might sound absurd if you look at the pricing of these cards, and that's because it is! nVidia is taking entry level stuff, giving it mid range naming, and then charging mid-to-high end pricing. I've been saying this for a long time now. Everything (referring to the RTX 40 series, namely further down the stack from the top) is over-named by up to one tier or so to justify it being overpriced by up to two tiers or so. It sounds absurd because it is! But objectively, this is how the chips compare. Don't tell me it's just "feelings" when nVidia is the one that renamed them. The names are arbitrary. The tech specs and performance are objective.
So you missed my main point. The former low end has been evaporating, and the remaining lineup has been getting cut down at the same tier. What this means is that the low end STILL EXISTS, just with a higher number than it used to have. So, yes, the x50 and, as far as I'm concerned, the x60 are the entry level now.
This isn't to say they are poor performance. Obviously even an RTX 3060 is a well performing card for most things. Most of the market, myself included right now, has lower (sometimes much lower) performance.
So it's "HD" and then some, due to being a bit taller vertically. That's because it's 16:10 as opposed to 16:9.
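If you want the actual numbers behind that, here's a quick sketch (the resolutions are just the standard published figures; the percentage is my own arithmetic):

```python
# Steam Deck panel vs. standard 720p "HD"
deck = (1280, 800)   # 16:10
hd = (1280, 720)     # 16:9

deck_pixels = deck[0] * deck[1]   # 1,024,000
hd_pixels = hd[0] * hd[1]         # 921,600

print(f"Deck aspect ratio: {deck[0] / deck[1]:.2f}")               # 1.60 -> 16:10
print(f"HD aspect ratio:   {hd[0] / hd[1]:.2f}")                   # 1.78 -> 16:9
print(f"Extra pixels vs 720p: {deck_pixels / hd_pixels - 1:.0%}")  # ~11% more
```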
No love for the Steam Deck?
Streaming sucks. I tried it with the Steam Link, and I tried it with the Steam Deck. I have always gotten better quality running natively on the device.
streaming to a friend's house a 40 min drive away, <20ms ping and <40ms display lag, every game is completely playable
and in house, it's <20ms, just over 1 frame @60hz behind the host, you can't even tell it's not local
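For reference, that "1 frame at 60hz" figure works out like this (a tiny sketch; the latency numbers are the ones quoted above, not my own measurements):

```python
# How many 60 Hz frames a given amount of streaming lag corresponds to.
frame_time_ms = 1000 / 60          # ~16.7 ms per frame at 60 Hz

for lag_ms in (20, 40):            # the in-house and remote figures quoted above
    print(f"{lag_ms} ms ~ {lag_ms / frame_time_ms:.1f} frames at 60 Hz")
# 20 ms ~ 1.2 frames, 40 ms ~ 2.4 frames
```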
steam deck is fine for what it is, but will it play new games by itself in 5+ years, not even close
and upgrading it is difficult
My understanding of "Entry Level" is that it is synonymous with "Minimum Requirements". Once a device no longer meets the minimum requirements, it is no longer entry level.
3060ti came out in December 2020. I bought mine in Oct 2021 so no idea where you came up with Feb 2022? Maybe check your facts before calling others out lol
https://www.techpowerup.com/gpu-specs/geforce-rtx-3060-ti-gddr6x.c3935
https://wccftech.com/nvidia-geforce-rtx-3060-8-gb-rtx-3060-ti-gddr6x-graphics-cards-october-launch-rumor/
I think the RTX 3060 Ti 8GB GDDR6 was launched in Dec 2020, while the RTX 3060 Ti 8GB GDDR6X was launched last year; that's why there's a divergence in release dates. I suppose when we talk about the RTX 3060 Ti now, we're referring to the later model with the GDDR6X VRAM buffer.
I think the Steam Deck has an AMD graphics solution, so I'm not sure where it falls, but it is its own device, so no, I wasn't really even thinking of it.
I'd love to get one as a complementary device for playing less demanding games when I want to be away from my desk, but my main gaming device needs upgrades more right now.