You agreeing doesn't mean you get to go around threads shooting down other people's opinions and answers, calling the info provided fake, or talking as if you're the god of tech knowledge and only you can provide an answer, no one else.
Also, game developers' system requirements tell only one side of the story. When they specify "recommended specifications", it can also mean the game running on that spec with nothing else in the background. A prime example is Call of Duty Warzone: despite a recommended CPU of an i5-2500K, my workplace's i5-7400 (which is about 5% better than the 2500K, but with DDR4 RAM) still struggled to maintain a smooth experience, and that was with only the game running and little to no background programs. If I run Discord or even open Chrome, it suddenly starts stuttering, and the same happens in Valorant.
Or better still, despite me having a very high-end PC (9900K, 32GB RAM, RTX 2080 Ti), pinning a game like Fortnite to the maximum possible FPS with DLSS caused the game to stutter, even though my CPU and GPU load weren't at 100%. It goes to show Epic Games did not optimize the game well to run at those specs and that FPS.
The devil is in the details, just like how RAM usage, despite Warzone recommending 16GB, is actually around 24GB of committed memory, with about 10GB of it being standby or virtual memory on the pagefile.
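If anyone wants to check this on their own machine, here is a rough sketch on my part (Windows only, nothing official from the game or Task Manager) that reads the commit charge through the standard Win32 GlobalMemoryStatusEx call, so you can compare physical RAM in use against total committed memory while a game is running:

    import ctypes
    from ctypes import wintypes

    # Mirror of the Win32 MEMORYSTATUSEX structure used by GlobalMemoryStatusEx.
    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", wintypes.DWORD),
            ("dwMemoryLoad", wintypes.DWORD),
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    def memory_snapshot():
        status = MEMORYSTATUSEX()
        status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
        gib = 1024 ** 3
        physical_used = (status.ullTotalPhys - status.ullAvailPhys) / gib
        # Commit charge = commit limit minus what is still available to commit,
        # i.e. memory backed by RAM plus the pagefile.
        committed = (status.ullTotalPageFile - status.ullAvailPageFile) / gib
        print(f"Physical RAM in use: {physical_used:.1f} GiB")
        print(f"Committed memory:    {committed:.1f} GiB")

    if __name__ == "__main__":
        memory_snapshot()

Run it once with just the game open and again with Discord or Chrome on top, and you can watch the committed number climb well past what's actually in physical RAM.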
If you really think about it, the term is actually meaningless. It also depends on how you define this term yourself.
This is what I hate about marketing: it can brainwash you into believing your hardware isn't "good" enough anymore after months or one year. But for that year, wow, your stuff sure was "future-proof."
But again, it's not going to last forever, so it can't be futureproof.
I don't know why you think it does.
Indeed he does.
I did not ignore his answer.
Comment #105
And it appears that you once again applied your "dictionary word meaning" tactic to write something against your opponents, exactly as I mentioned in Comment #68.
Things were way different then than they are now. I absolutely can't see the future, so I won't try to predict that 8 cores will be needed in X years or that new consoles will make them mandatory, as that's not my intent (nor is it necessarily what I think); I only want to say that "consoles didn't do it then, so it's proof it won't happen now" isn't necessarily applicable either. That overlooks some key differences I mentioned in my prior post.
By the way, those FX CPUs weren't even fully fledged 8-core CPUs anyway. What is it with the FX name bringing bad luck so often? Remember the GeForce FX (5) series?
I had an FX 5600; great GPU for the time.
"to design software, a computer, etc. so that it can still be used in the future, even when technology changes."
https://dictionary.cambridge.org/dictionary/english/future-proof
https://linustechtips.com/topic/345641-future-proofing-doesnt-exist/
https://www.weforum.org/agenda/2019/01/how-to-plan-for-technology-future-koenig-elsayed-ali/
https://www.extremetech.com/gaming/275890-why-you-cant-future-proof-your-gaming-pc
I have been here for quite a while, as far as I can remember. Many members have claimed that some older GPU is future-proof, which I do not believe it is.
Also, Cambridge misinterpreted it.
It is the definition for the British word, but I will post it anyway:
"(Of a product) unlikely to become obsolete" - not true for PCs; my GTX 680 has become obsolete, and so is the GTX 10XX series.
Ampere brought big gains to RT performance, and AMD's own RT with RDNA2 has upped the popularity of RT. It's inevitable that RT will replace rasterisation completely at some point, and when that happens, ALL video cards that can't keep up will be rendered obsolete for modern gaming.
So you can't have a piece of hardware that will last you until hardware stops progressing (unlikely to ever happen).
I don't get what's so hard to understand about this, lmao.
Many, myself included, have older hardware that runs just fine, thanks. I definitely got my money's worth out of my almost five-year-old 1080 FE and feel no shame about running it until I can replace it at my convenience.
How many people felt shame and envy and discarded perfectly good hardware because of this "future-proof" jazz? I think the best is yet to come game-wise, as ray tracing is still more of a concept and less of a reality right now.
Intel and its "future-proof" motherboards every two years, don't get me started...
Edit to add: OP--temps, did you ever think your post would ignite such a lively debate and conversation? Pretty amazing--I gleaned some things from the posts here, for sure.
A strong CPU will support any future GPU without a bottleneck, allowing the owner to spend money down the road on just a single component if the newest games become more demanding: the GPU.
If you save on the CPU you will regret it; a CPU upgrade will also involve a new motherboard and probably a new generation of RAM.
I myself would always go for a high-end i7 processor, since it will allow me to replace only the GPU in 3-4 years; the system will last at least 8 years at a high level.