AMD Vega 64 Crossfire has beaten 1080 Ti SLI.
4K Ultra, 80 fps.
Why should he stay away from AMD?
I don't watch these stupid videos from idiots who don't have a clue; I test it myself. Why would I need some paid junkie on YT to tell me?
Gonna get 1080 Ti SLI for my 1080p 60 Hz monitor and keep it for 10 years for highly demanding games. Never going under 30 fps.
Now you really are just making crap up.
Sure, an OC'ed CPU can help, but not in everything. Many games will run about the same regardless of what the CPU is OC'ed to. Obviously, if the CPU were lacking, that could hurt game performance or benchmarks. But an OC doesn't just magically give Vega better performance so it can beat a 1080 Ti, so you can stop right there. It would increase performance all around, not help one card and not the other.
A 7700K isn't really good enough for running 2x Vega 64 or 2x GTX 1080 Ti anyway.
Everything I stated can be fact checked. No misinformation here, except for your comments.
With 100 Hz and 100 fps, you will constantly be fluctuating between sync and no sync, or be forced to use traditional vsync when it tips over 100. You do understand that FreeSync and G-Sync do NOT limit frames? Having a higher max refresh rate allows these technologies to run consistently, rather than switching off or falling back to more traditional vsync.
Not wasteful, you just don't understand the technology.
Absolutely agree.
So what exactly did I miss? According to you, there's no good 60 Hz 1080p display. Your own words right there. Are you even reading what you're typing? You're not making logical sense in your posts here.
That I did not know, and that's quite terrible from both a power and heat standpoint. No GPU should ever be running above any monitor's maximum refresh rate for any reason; it's just wasteful on power and heat for nothing. Glad I learned. I'll avoid all adaptive sync monitors forever now, since I'm trying my best to keep all power in the house as low as we can.
You didn't know, because you do not understand the technology, just like I said. Why are you trying to give advice on something you don't know?
You still fail to see the point. If you have 90-100 fps, like your previous example, a 144hz FreeSync or Gsync monitor will allow that frame rate to be consistently synced. Latency so low that it is negligible. That is the point.
And if you have a situation where the frames go over the refresh rate of the monitor, then you can use another form of vsync or a frame limiter. Gsync can be used in conjunction with any other vsync. Once max refresh rate is reached, the other form of vsync that you have set will engage. It is disengaged while Gsync is in operation.
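(To make the frame-limiter option above concrete, here is a minimal sketch in Python. The 141 fps cap on an assumed 144 Hz panel, and the game_loop/render_frame names, are illustrative assumptions, not anyone's actual setup.)

# Minimal frame-limiter sketch, assuming a 144 Hz adaptive-sync panel.
# Capping a few fps below max refresh keeps the frame rate inside the
# sync range, so FreeSync/G-Sync stays engaged instead of handing off
# to the fallback vsync once the cap is hit.
import time

TARGET_FPS = 141                 # just under the assumed 144 Hz max refresh
FRAME_TIME = 1.0 / TARGET_FPS

def game_loop(render_frame):     # render_frame is a hypothetical callback
    deadline = time.perf_counter()
    while True:
        render_frame()           # simulate + draw one frame
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)            # idle until the next frame slot
        else:
            deadline = time.perf_counter()   # fell behind; resync the clock

Capping slightly below the max refresh rate is the usual way to keep adaptive sync engaged full-time rather than tipping over into the fallback vsync.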
BTW, your statement that frame rate should never go over refresh rate is simply subjective. There are plenty of advantages to frame rate going over refresh rate, latency being a huge one, both in frame time and input latency.
No wasteful power or heat; you're using your GPU to its fullest potential. Lowering GPU usage doesn't really help with power bills, especially considering your examples. 100 fps at 100 Hz will have the same power usage as 100 fps with 144 Hz adaptive sync.
My advice: learn this stuff before trying to give advice on it.
And I used to play on an R9 290X for a number of years, which sucked down roughly 450-475 watts when gaming and was always at 100% utilization struggling to handle most games at 1080p/75 Hz at max settings. After switching to the 1080 Ti, our monthly power bill dropped by roughly $15/mo just from that. It definitely does affect things.
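(For anyone wanting to sanity-check numbers like that, here is a rough sketch; the 200 W delta, 4 hours/day, and $0.13/kWh tariff are assumed example inputs, not the poster's actual figures.)

# Rough monthly cost of a given extra power draw. All inputs are
# assumptions; plug in your own wattage delta, hours, and tariff.
def monthly_cost_usd(watts_delta, hours_per_day, usd_per_kwh):
    kwh_per_month = watts_delta / 1000 * hours_per_day * 30
    return kwh_per_month * usd_per_kwh

print(f"${monthly_cost_usd(200, 4, 0.13):.2f}/mo")  # -> $3.12/mo

The figure scales linearly with hours and tariff, so how much a wattage drop actually saves depends entirely on how long you game and what you pay per kWh.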
You left out a key part of the statement.
Same performance, same relative power usage.
And we're not talking about a 1080 Ti and power consumption; we're talking about the Vega 64 and its relative performance in conjunction with FreeSync. If you wanted to argue power usage, then why not go the more logical path and debate Vega 64 vs GTX 1080?
Now that you've cherry-picked my previous comment, care to respond to everything else I typed?
And now you are back to arguing about sustaining 144 again...
And that's what we're discussing here. Y'all want to run 1080p or 1440p 144 Hz screens with FreeSync on Vega 64. That's what the topic's about so far, or at least that's where it's gone.
But you don't have to sustain 144 with FreeSync... we've been through this already.
It isn't nearly that bad in normal use, and if a few hundred watts tank your power bill so much that you cannot afford it, perhaps high-end gaming shouldn't be a priority?
Don't know how high your bills are, but here in the UK, even with our horrendous price hikes over the last few years, my bills haven't gone up much beyond those hikes despite having way more power-demanding kit. I really don't notice that much of a difference. Really, if 5-15 bucks a month is enough to worry you, a $500-800 GPU shouldn't even be looked at lol.