It's kind of amazing how old hardware is still capable of maintaining 30-60 FPS in new AAA games.
The 1080 is still a beast, but I dunno about the 4790.
Sounds like a bottleneck.
You might not be able to crank up every visual setting and still have very smooth FPS, but yeah, you can tweak the games to smooth out your FPS averages more to your liking.
Now if the game is highly CPU-intensive, then it might not be so good, like the Total War: Warhammer games or Mount & Blade 2, for example. But in all the Tomb Raider games, GTAV, RDR2, Kingdom Come, and the Metro games... I haven't really had any problems with bad performance.
I think between Sandy Bridge and the Intel 10th generation (I think it was), CPUs didn't even really get twice as fast in instructions per second (actual throughput, not just clock, meaning even WITH the newer CPUs being clocked faster, they weren't quite firmly over twice as fast overall, which also meant the older ones, overclocked, lessened the gap a little bit). Now, twice as fast is not nothing, and in some things it might actually have been (much) more than twice, or even (much) less. That's also obviously ignoring core count, but core count isn't a "default" increase for CPUs the way IPC or clock speed is... and to "only" get that much faster over almost a decade is rather slow compared to before those times. But this might be the new normal, given we're approaching the limits of things.
But ultimately, if today's CPUs are good for high-refresh-rate gaming, then of course something at least half as fast has a good chance of being good for 60 FPS (let alone 30+ FPS) a good amount of the time. A rough back-of-the-envelope of that is sketched below.
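Here's a quick sketch of that reasoning as a model of single-thread performance as IPC times clock. The specific IPC and clock numbers are made-up illustrative assumptions, not measured benchmarks:

# Rough single-thread performance model: perf ~ IPC x clock.
# The numbers below are illustrative assumptions, not benchmarks.
old_ipc, old_clock = 1.0, 3.5   # hypothetical Sandy Bridge-era chip (IPC normalized to 1)
new_ipc, new_clock = 1.5, 4.6   # hypothetical 10th-gen chip (relative IPC)

speedup = (new_ipc * new_clock) / (old_ipc * old_clock)
print(f"single-thread speedup: {speedup:.2f}x")  # ~1.97x with these numbers

# If a CPU that fast sustains ~120 FPS in a CPU-bound game,
# one half as fast lands around 60 FPS, all else being equal.
print(f"half-speed FPS estimate: {120 / 2:.0f}")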
GPUs have been advancing faster, but now it seems it may be becoming their turn. Generations used to be a year apart; a while back that changed to two years, like CPUs, and the rate they're getting faster is also slowing as they have to rely on simply raising TDP to keep the pace going (like CPUs have had to at times). Worse, the rate they're getting faster at the same price point is advancing even slower. The $259 GTX 1060 STILL doesn't have something twice as fast at its price point six years later. I know inflation has been a thing, but even considering that, it's not there yet. The $329 RTX 3060 is closest, and it's not twice as fast, and last I looked it was going for closer to RTX 3060 Ti pricing anyway (which is unfortunate, as the non-Ti was already the worse value).
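For the inflation point, a quick sanity check (the ~20% cumulative figure for 2016 to 2022 US inflation is my rough assumption, not an official number):

# Inflation-adjust the GTX 1060's $259 launch price.
# ~20% cumulative 2016->2022 inflation is a rough assumption here.
launch_price = 259
inflation = 1.20
adjusted = launch_price * inflation
print(f"${adjusted:.0f}")  # ~$311, still under the RTX 3060's $329 MSRP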
So people still using aged hardware, and it being capable (especially when you consider the majority doesn't play at higher resolutions, and sometimes not at high refresh rates either, greatly lowering the barrier to entry, so to speak), shouldn't be much of a surprise.
That, plus even now not that many games can make good use of 8+ threads, making all those 20- and 24-thread CPUs overkill for pure gaming.
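To put a number on that thread-scaling point, here's Amdahl's law with an assumed 70% parallelizable fraction. Purely illustrative; real games vary a lot:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallelizable fraction of the work (0.7 is an assumption, not a measurement)
p = 0.7
for n in (4, 8, 16, 24):
    speedup = 1 / ((1 - p) + p / n)
    print(f"{n:2d} threads: {speedup:.2f}x")
# ~2.11x at 4, ~2.58x at 8, ~2.91x at 16, ~3.04x at 24:
# past ~8 threads the gains flatten out for a mostly-serial game loop.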
2500K was a good leap over past stuff. Unless maybe you were able to afford a nice LGA-1366 config.
Then after that, not a big jump until around the 4790K.
Then not a big jump until the 8700K, or perhaps the 9700K.
Then a bigger jump with 12th Gen i7/i9 stuff.
But there are claims that the 13th Gen i5 is even enough to put every previous Intel and AMD consumer (non-workstation) CPU to shame. So we'll see.
Your CPU is not what will make a very big difference in games... your GPU is what makes the biggest difference... so you're better off getting a smaller CPU and putting your cash into the GPU.
I already bought the CPU for $50. To be honest, it seems fine for 1080p 60 FPS paired with a GTX 1080.