While I'm super impressed with performance, I think I'm just as impressed with thermals and noise levels. My (EVGA) GTX 1060 ran quiet, but warm. Safe, but warm. The default fan curve/BIOS was tame, as it prioritized noise levels, which I prefer. I also run my case fans at lower voltage/speed for the same reason, so part of me figured a graphics card with well over twice the power consumption was going to mean similar temperatures, higher noise, or both. Yet it runs just as quiet (maybe quieter?) and much cooler at the same time.
Like you, I don't tend to upgrade often, since I don't chase performance for the sake of it. I share what you said in a later post: I prefer to get closer to triple-digit percentage uplifts in performance when I upgrade, as less than that doesn't register as much for me (for reference, my past upgrades from the 8800 GT to the GTX 560 Ti, and then from that to the GTX 1060, didn't feel too substantial at the time).
The GTX 1060 lasted me just shy of seven years though, so I can't say anything bad about it. I might have upgraded sooner, but every generation after Pascal gave people some reason to wait it out. The RTX 20 series was only one generation later, and saw price hikes. The RTX 30 series was the earliest most would consider upgrading, but cryptocurrency mining sent pricing to the moon, and by the time that passed, I decided to hold out for the upcoming generation on the horizon. I'm glad I did, much as I may have wanted to upgrade sooner. The RTX 30 series was still a poor value even after price drops, and had serious VRAM deficiencies. Unfortunately, price/performance with this generation was a low uplift too, but it had gotten to the point that the overall uplift was well warranted.
With a GTX 1080, you might be able to sit longer. I think that's the one graphics chip in the Pascal generation still hanging on. If I had one, I might still be waiting and hoping the next generation is a decent enough uplift (I'm slightly skeptical, though, outside of nVidia finally offering decent levels of VRAM now that its pricing is on the floor). You'll know when the time is right. Someone else said it above and I absolutely agree: trust your own feeling on this. Buy when you find a worthwhile uplift at a price you agree with.
Current pricing in GPU market is not great... I'm happy with the card, but not with the price.
All things considered, given the position of the card, inflation, etc., and a $269 USD MSRP, plus whatever aftermarket tweaks, cooling, and other jazz (a "free" AAA game), $330 doesn't seem like a terrible price. Arguably, with Starfield priced at $70 USD, you came out about ten dollars ahead versus finding a $269 no-frills version and buying Starfield separately.
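A quick sanity check on that math (all figures are from the posts above; the $269 no-frills card is hypothetical):

```python
# Comparing the $330 card-plus-game bundle against a hypothetical
# $269 no-frills card with Starfield ($70 list price) bought separately.
bundle_price = 330
no_frills_card = 269
starfield = 70

separate_total = no_frills_card + starfield  # 339
savings = separate_total - bundle_price      # 9

print(f"Buying separately: ${separate_total}, the bundle saves ${savings}")
# → Buying separately: $339, the bundle saves $9
```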
I dunno, in some cases I think people get so used to grousing about GPU prices that every deal seems terrible. In 2016, a GeForce GTX 1060 6GB MSRP'd for $249, and there were probably $300 versions, easy. And I kinda feel like the RX 7600 is in an equivalent spot in the lineup. So again, the price doesn't seem unreasonable to me.
I guess what do you think the price ought to be? And what makes you think that?
MSRP is, by the way, somewhat nonexistent outside the US.
Regardless, the RX 7600 is competitive with the RTX 4060. And I don't think you can point at the product line to argue price, when ultimately performance is usually a better indicator. Competitive products will often have competitive prices. That's not a new thing.
I mean, the RTX 4060 is the lowest desktop SKU too at the moment. There's no desktop 4050 or 4030 either. I just think entry-level cards aren't being produced like they used to be. If there is going to be a 4050 and an equivalent RX 7500, that would also kinda shoot holes in your arguments. Entry-level hardware tended to be released last/later anyway.
Speaking of performance: its increase over the 6000 series is pretty small to justify a $300+ price tag.
The entry level hardware tended to be the last to be released anyway - last, but not least. It's the most massive segment of the market in terms of quantities sold.
I think it's a mix of gains being harder to get, at least economically, but I also think nVidia and AMD are giving the least possible amount of uplift they think the market will tolerate. The previous generation coincided with a demand surge (both lockdowns and cryptocurrency mining), which left them in the tough spot of needing to bring out a new generation without cannibalizing themselves in the process. So I guess I'm hoping this generation being a smaller uplift is partly because of that.
To a point, the x60/x600 tiers from nVidia/AMD respectively, at least the non-Ti and non-XT versions of them, are feeling more like entry-level gaming chips rather than "the" midrange spot they used to be. This is both because there are more tiers atop them (the introduction of the x90/x900 tiers), and because the production of tiers below them is falling out from under them (improving integrated graphics, and products below the x60 not being as worthwhile to make anymore).
Just speaking as someone who used to buy at the x60 tier from nVidia in particular and found they always hit the mark, the RTX 3060/4060 were just not appealing for one reason or another. The former gives the VRAM you want but less of the price/performance, and the latter is the inverse. Of course, maybe Pascal (what I had before) being slightly generous with VRAM is skewing my perception, but it does seem nVidia goes through this cycle of being stingy on VRAM, and the last generation or two seems to be that time again (Kepler was the time before it).
This doesn't mean no new products will ever come out; just that if they do, they would be based on existing chips rather than new ones.
Yea, like how the jump from 8nm Ampere to 5nm Ada Lovelace reduced power consumption 🙄
Did the 3090 Ti run at 3000 MHz like a 4090 can? No. So you're comparing apples and oranges.
It can be the most efficient manufacturing process in the universe and you can still make a card draw 1000 watts if you increase the clocks / voltage high enough.
As Karumati said, it's not relevant. What matters is the performance you get for that same power draw.
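The point about clocks and voltage can be sketched with the classic CMOS dynamic-power approximation, P ≈ C·V²·f. The capacitance constant and the two operating points below are made-up illustrative numbers, not real GPU figures:

```python
# Dynamic power scales roughly as C * V^2 * f, so even on a very
# efficient node, raising clocks (and the voltage needed to hold them
# stable) can push power draw arbitrarily high.

def dynamic_power(capacitance, voltage, frequency_hz):
    """Classic CMOS dynamic-power approximation: P = C * V^2 * f (watts)."""
    return capacitance * voltage**2 * frequency_hz

# Hypothetical "stock" operating point: 2.0 GHz at 1.0 V
stock = dynamic_power(capacitance=1e-7, voltage=1.0, frequency_hz=2.0e9)

# Same chip overclocked: +50% clock, needing ~20% more voltage
oc = dynamic_power(capacitance=1e-7, voltage=1.2, frequency_hz=3.0e9)

print(f"stock: {stock:.0f} W, overclocked: {oc:.0f} W ({oc / stock:.2f}x)")
# → stock: 200 W, overclocked: 432 W (2.16x)
```

A 50% clock bump more than doubles power here because the voltage term is squared, which is why comparing two cards at wildly different clocks says little about the efficiency of the underlying process.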
A better comparison would be the TDP of two undervolted GPUs, but what'd I know about Steam's logic.