For now I don't game much anymore... and mainly on a 1080p monitor, which I could perhaps upsample to 1440p. Got links to those numbers?
( https://gpu.userbenchmark.com/Compare/AMD-RX-Vega-56-vs-AMD-RX-5700-XT/3938vs4045 , that's vs. the 5700 XT, but the Anniversary is just a better-binned 5700 XT, iirc.)
POUusnavy says some strange things sometimes, but sometimes he's helpful, and I don't think he means badly either.
My point is, he's not the most reliable or accurate source of information.
Still, I wouldn't just throw his opinion out; just check around or ask for proof (like you did).
Anyway, back to topic.
Power draw/efficiency is kinda a non-issue; you're talking like a 10-15 dollar a year difference. You'd save more money changing your lightbulbs to energy-efficient ones.
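If you want to sanity-check that figure, here's a back-of-the-envelope sketch in Python. The wattage delta, hours per day, and electricity price are all assumptions for illustration, not measurements:

```python
# Back-of-the-envelope: yearly electricity cost difference from GPU power draw.
# All numbers below are assumed for illustration, not measured.

watt_delta = 100       # assumed extra draw of the less efficient card, in watts
hours_per_day = 3      # assumed daily gaming time
price_per_kwh = 0.12   # assumed electricity price, USD per kWh

kwh_per_year = watt_delta / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year, ~${cost_per_year:.2f}/year")
# -> ~110 kWh/year, ~$13.14/year: right in that 10-15 dollar ballpark
```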
Temps with the stock cooler are garbage, we're talking 'space heater' levels, but most aftermarket coolers will do fine for it, so long as it's not a blower or a super-skinny heatsink.
Honestly, the Vega 56/1070 is still fine for 1080p 60 fps at max/high settings, so it's not a problem if you do decide to keep the V56.
You've gotta decide if 100 dollars is worth a ~30% gain in performance.
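For what it's worth, here's a quick cost-per-frame sketch; the 60 fps baseline is just an assumption to make the arithmetic concrete:

```python
# Rough value check on the upgrade (assumed 60 fps baseline, $100 cost).
baseline_fps = 60
gain = 0.30
upgrade_cost = 100.0

extra_fps = baseline_fps * gain
print(f"~{extra_fps:.0f} extra fps for ${upgrade_cost:.0f} "
      f"= ${upgrade_cost / extra_fps:.2f} per fps gained")
# -> ~18 extra fps, about $5.56 per frame gained; up to you if that's worth it
```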
Could you link your source(s)?
I'm curious to know where you/they got ~50% from.
The only time it would 'matter' is if there was a CPU bottleneck, but the GPU still wouldn't care. The CPU 'draws' frames and the GPU 'colours them in', so a CPU bottleneck just means fewer instructions get sent to the GPU. The GPU only sees that as 'oh well, I got sent less today', not 'whew, he's got loads of work to do today.'
You could run a 2080 Ti with an i3-9100F; it would just depend on what games you play.
If they're single- to quad-threaded games, then you're not going to experience a CPU bottleneck.
So that's games like CS:GO, older CoD games, various other e-sports titles, and funnily enough, the latest Far Cry game (New Dawn), which only uses 4 threads, so you could use a (quad-core) i3 for it just fine.
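If it helps, here's a toy Python model of the 'CPU draws, GPU colours in' idea: the framerate is simply capped by whichever side prepares frames slower, and the per-game numbers below are made up for illustration:

```python
# Toy model of the frame pipeline: the CPU prepares ('draws') each frame's
# work, the GPU renders ('colours in') whatever it receives. The slower side
# sets the framerate; the GPU never 'feels' a CPU bottleneck, it just gets
# handed less work. All per-game numbers are invented for illustration.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Framerate is capped by whichever side prepares frames slower."""
    return min(cpu_fps, gpu_fps)

# Hypothetical 2080 Ti-class GPU paired with a quad-core i3.
games = {
    "CS:GO (lightly threaded)":     {"cpu": 300, "gpu": 400},
    "Far Cry New Dawn (4 threads)": {"cpu": 120, "gpu": 160},
    "heavily threaded AAA title":   {"cpu": 60,  "gpu": 160},  # CPU-bound
}

for name, fps in games.items():
    limiter = "CPU" if fps["cpu"] < fps["gpu"] else "GPU"
    print(f"{name}: {effective_fps(fps['cpu'], fps['gpu'])} fps ({limiter} limited)")
```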
And '6 core' isn't a good statement to make on its own. There were older 6c/12t i7s (back in the first generations of the i7), and there are FX CPUs, which are '6 core' or '8 core' on paper but realistically more like 3- and 4-core CPUs with hyperthreading, not to mention their low IPC and clock speeds (and the stupid amounts of power they draw, lmao). There are also older Xeons, which aren't that great today either.
So specify which '6 core' you mean when you say it; otherwise someone could just search '6 core processor' on Google, get an FX popping up, and go "Oh, it's 6 core, so it has to work, RIGHT?"
But, you still didn't explain how or where you got ~50% from.
It will not cause stutter. You might just lose a few FPS because more time is spent sending data and less computing.
If you didn't have enough RAM, then it would, because you'd be constantly hitting the pagefile.
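Here's a rough model of why hitting the pagefile hurts so much; the latency numbers are order-of-magnitude assumptions, not measurements:

```python
# Toy model of why running out of RAM causes hitching: any access that
# misses RAM and has to be paged in pays disk-level latency instead.
# Latencies below are rough orders of magnitude, assumed for illustration.

ram_ns = 100            # ~100 ns per RAM access
ssd_page_ns = 100_000   # ~0.1 ms to service a page fault from an SSD

def avg_access_ns(page_fault_rate: float) -> float:
    """Average memory access time for a given fraction of pagefile hits."""
    return (1 - page_fault_rate) * ram_ns + page_fault_rate * ssd_page_ns

for rate in (0.0, 0.001, 0.01):
    print(f"page-fault rate {rate:.1%}: ~{avg_access_ns(rate):,.0f} ns per access")
# Even a 1% fault rate makes the average access ~10x slower -> visible stutter.
```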
RAM speed will not 'make or break' a PC; it's not gonna make a low-end PC equal to a high-end one, and it won't drag a high-end one down to a low-end one either.
It's going to make a slight difference in framerate, that's it.
Please, tell me, how is slow RAM going to make my "GPU stutter"?