Pretty much every GPU flagship has been a significant amount ahead of the rest of the GPUs in its generation, so when the next generation comes, it holds up better than the rest of its generation.
Turing, however, was a marginal improvement at best over Pascal, and people were PISSED about that. So how did you miss that?
The 2070 SUPER is just behind the 1080 Ti in many titles, and the 2080 is around equal. Compare that to the 980 Ti, which was rivaled by the non-Ti GTX 1070.
And the 1080 Ti is starting to show its age a bit. If you can't see that, you're either in denial or blind.
So you are agreeing that the GTX 1080 Ti is indeed future-proof (because it's a flagship).
That's exactly what I said (exactly) in comment #14. Quote:
"If you buy any top-end or even semi-top hardware in any given year, it will last comfortably for at least the next 5+ years."
So I see no problem with anything. Future-proofing DOES exist.
Eventually--maybe reluctantly--it gets old. I accept that; but my strategy has been to buy the best or second best (depending on the value to me at the time, or what money I have during that upgrade cycle) and then hold onto that card (or cards) for 3, 5, or even 7 years. And even then, when the day comes to replace it or them, use the old flagship in another system if it's possible, because wind can still fill its sails.
It's just that one won't be able to sail as fast or as nimbly as one could on a new flagship.
My 1080 Tis can play Borderlands 3 at 85fps after significant tweaking with the NVIDIA Profile Inspector program. If I did not do that, the game ran at 25fps at best at 4K with everything turned up to ultra. That's horrid, unplayable for a 3D shooter. But I had to manually go futz with it, and ended up with a really good result. The thing is... the days are over when that sort of performance was to be expected without having to try.
To tie this into the original post about future-proof motherboards: there's a future you can proof against, but it won't last forever. A flagship motherboard will outlast a flagship GPU by two to three times in years--and I can't imagine that holding true for a typical consumer board if the person wants to play AAA titles through the lifespan of that motherboard. Webmail, yes. Total War: Warhammer II at 4K on ultra while streaming, no.
The 1080 Ti by itself was the card that was just so close to being able to do 4K well, or some other resolution at 144Hz well, on modern titles--and do both consistently well enough to overlook the problems. There are now cards that actually get there. That doesn't diminish the 1080 Ti, but there's no denying that there is better hardware out now.
5 years.
That's 2, maybe 3 NVIDIA generations. The 2080 Ti will be low end by the end of that stretch. With Radeon pushing hard and making progress, NVIDIA has to push back to stop Radeon from stealing their thunder. The result: COMPETITION.
What does competition do? It increases the rate at which hardware technology improves over time, because competitors are all fighting to make the product that everyone wants to buy. It was the lack of competition and NVIDIA's goal of shoving RTX down our throats that brought us the rather disappointing RTX 20 series. Had Radeon been a strong competitor back then, that wouldn't have happened, and the 30 series would've been even faster than it is right now. That changed with the 30 series because NVIDIA knew Radeon was getting back on the right track with RDNA, so they decided to try to get one step ahead of Radeon and crush them before they could get somewhere.
It didn't really work, because AMD's offerings are strikingly close with the only major downside being the fact that their drivers are still garbage. If they manage to fix that issue and pull ahead enough, that's more than enough of a threat to NVIDIA to push GPU technology to the extreme because NVIDIA definitely has the resources and wealth to go beyond what we've seen, and AMD is only going to reciprocate at this point.
Futureproof would mean there is no possible way anything is ever going to improve on it.
Building to last means building a computer that is (likely) going to last you a reasonably long time.
For example, if I buy a watch that's waterproof, should it only last 3-6 years before it magically becomes not waterproof?
And if I buy bulletproof armour, should it only last 3-6 years before it does nothing against bullets?
The whole "futureproof" claim is just dumb. It doesn't have a place in the computer hardware world, since you cannot make a futureproof piece of hardware (CPU, GPU, RAM, motherboard, storage, etc.).
There will always be something better, and then something better than that, so building something and calling it futureproof makes no sense.
The 1080 Ti is a great card: great for its time, decent (midrange performance) now.
Let me ask a counter-question: if it's futureproof, is it still going to be able to play the latest games at 1440p/144fps in 3+ years?
We may max out on the performance side of things in the future. Next stop is cutting down on the physical footprint.
I accept that my machine is already obsolete. It's only five months old. It can and will take a GPU upgrade when one is available. It was never built to be "future-proof." How could it be, when new and better stuff is coming out all the time? You can drive yourself nuts that way.
The 3070 slightly beats the 2080 Ti, which beats the 1080 Ti by easily 30%. The only way you're keeping up at 1440p with the new cards is with SLI, assuming you're playing games that support it well.
Wanting to call something higher end doesn't change what it actually is. My 3900X and 2080 Ti are statistically mid-tier at this point in time, because Ryzen 5000 is 20% faster and the 2080 Ti is bested by the 3080 and 3090 by a fair margin. (Unless you power-mod the 2080 Ti, where it could potentially catch up to the 3080, but then it draws over 380W and requires a custom loop or LN2 to actually stay within safe temperatures.)
Whether the GPU is "futureproof" or needs to be upgraded doesn't really matter to me because GPUs are relatively easy to upgrade compared to a motherboard/CPU.
I will edit my original post to reflect this.
You may or may not be able to run future games on current hardware. It's unpredictable.
I don't know why this is such a difficult concept for some people. There's not much point in spending more for "futureproofing" when it's not going to last much longer, unless you're content with consistently having to lower your graphics settings. And if your CPU just can't handle a game, you're stuck with that until you upgrade it; if it's already the best you can get on that socket or chipset... good luck.
Pre-Zen 3 Ryzen is difficult to justify, especially because its core performance is far enough behind that there's very little performance benefit from a 3080 or 3090 over the 3070. Those processors bottleneck anything above a 3070 heavily, so if you want a better GPU than a 3070, you're going to have to upgrade to Ryzen 5000 or Intel to not get bottlenecked to hell. It's severe enough that you may as well not upgrade your GPU at all, because performance will not reflect the money spent.
Nobody can ever seem to talk about it because you always have the same two sides: the people with the rationality to realise how stupid the idea is, and the people that just want to justify their purchases instead of accepting the harsh truth.
And other people are telling me there is such a thing. I respect your point of view, and I have read through these comments. And might read through them again. But I'm also going to consider the point of view of people who say that you can buy parts strategically to make a system last you longer, most particularly when it comes to motherboards.
I'm simply saying there still seems to be debate over this and I'm going to consider every point of view.
Thank you for your input.