Anyone recommending either chip for gaming is nothing but a fanboy wasting people's money with bad advice on this forum.
Edit:
Also, does anyone who can afford these chips really give a damn about the extra power draw, given how minor it will be on their electricity bill?
Hell, I just ran Folding@home for a month: 24/7 use that pulled 900 watts from the wall at peak loads, and it only increased my electricity bill by about £15. The extra power draw of a 10900K over an AMD chip is going to be negligible on your bill, so if you can afford the CPU, the power isn't a big deal.
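If anyone wants to sanity-check a bill impact like that, here's a minimal sketch of the maths. The average draw and tariff below are illustrative assumptions, not figures from this post, and note that it's the average draw that matters, not the 900 W peak:

```python
# Rough monthly electricity cost for a machine running 24/7.
# ASSUMPTIONS: avg_draw_w and price_per_kwh are illustrative
# placeholders; plug in your own meter readings and tariff.

hours = 24 * 30        # one month of round-the-clock folding
avg_draw_w = 300       # assumed average wall draw (peaks can be far higher)
price_per_kwh = 0.15   # assumed tariff, GBP per kWh

energy_kwh = avg_draw_w * hours / 1000
print(f"{energy_kwh:.0f} kWh -> about £{energy_kwh * price_per_kwh:.2f}")
```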
Of course I do... I also take pride in my low-energy house.
Yeah, BS, but congrats on buying efficient stuff for your home, I guess (doesn't everyone when they buy new stuff these days? It's hard to even find inefficient white goods).
My point is that people throw around the power draw of various components like it has a big impact on anything, and it doesn't, hence the folding experiment as an example. That system isn't used much these days (it has a 7900X and SLI 1080 Tis, and lately gets less than 20 hours of use a month), yet running it 24/7 at near full load for a month only increased the bill by £15, at 900 watts! So a CPU/GPU using an extra 50-100 W during use is negligible in reality.
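To put that "extra 50-100 W" in numbers, here's a quick sketch of the marginal cost for a more typical gaming schedule; the hours and tariff are assumptions on my part, not figures from the posts above:

```python
# Marginal monthly cost of an extra 100 W of CPU/GPU draw.
# ASSUMPTIONS: 4 gaming hours a day and £0.15/kWh are illustrative.

extra_w = 100
hours = 4 * 30                      # assumed gaming hours per month
extra_kwh = extra_w * hours / 1000  # 12 kWh
print(f"extra cost: £{extra_kwh * 0.15:.2f} per month")  # £1.80
```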
I do have a rack running my EdgeMAX ER8-Pro, my EdgeSwitch Max, and the APs, but they're all low power. UBNT gear uses MIPS, which by design is low power.
Fact is, what goes in must come out, and in the case of computers that generally means electricity in, heat out...
For some reason, this exact fact, which people threw at AMD during the FX era and which NV throws at AMD now, is seemingly forgotten nowadays when looking at AMD vs Intel...
Take my Vega 64. I love it, but I will openly admit that this GPU has hands down won the award for most power-hungry modern GPU. Under load it pulls 330 W... It's then predictable that it has to *vent* that 330 W somewhere, either into the case or out the back on blowers like mine.
Likewise for the CPU...
As it now stands, an Intel i9 has to dump ~200 W under stock "boost" configs, and as much as 250-300 W+ under heavy boost or overclock. That *has* to be cooled somehow.
Intel has done a great job sticking a band-aid on 14 nm by lowering core height and increasing IHS thickness and mass, and that has translated into lower core temps due to better thermal transfer from the core to the cooler... But that doesn't make the heat disappear, it just gets it off the chip and into your room faster...
If we take this chip and combine it with my Vega, you have a machine that could potentially draw 300 W+ on both the GPU and the CPU under load alone...
Compare that to an R9/GTX 1080 build that would offer the same relative FPS but better productivity at just ~340 W for the CPU+GPU...
Both will give you the same performance, but one will heat your room noticeably quicker and hotter (rough numbers in the sketch below)...
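A rough sketch of those numbers as heat output; the wattages combine the figures quoted above, and the space-heater comparison is just for scale:

```python
# A PC turns essentially all of its electrical draw into room heat.
# Wattages combine the ~250 W i9 and 330 W Vega 64 quoted above.

W_TO_BTU_HR = 3.412  # standard watts -> BTU/hr conversion

builds = {"i9 + Vega 64": 250 + 330, "R9 + GTX 1080": 340}
for name, watts in builds.items():
    print(f"{name}: {watts} W = {watts * W_TO_BTU_HR:.0f} BTU/hr")
# For scale, a typical small space heater draws 500-1500 W.
```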
My Vega was not a power/performance pick; it was a price/performance pick at 400 bucks.
Likewise, AMD's FX 8/9 chips were not power/performance picks, but they often were price/performance picks...
If Intel is going to push heaters into people's houses, the least they could do is price themselves to match. Instead, they are intentionally pricing themselves up in an attempt to push a brand image they frankly don't deserve (right now) and are currently acting contrary to.
I would wholeheartedly be suggesting the Intel lineup all across the board *if* each part undercut AMD's price at each specific tier by 50 USD... That would make Intel massively competitive... But being $30-50 more than AMD... That makes them a joke...
Also, remember FX wasn't cheaper at launch; it cost the same or more at the time, depending on the model.
Again, my point was that power/heat isn't a big issue for the average user, and none of the top-tier chips from either side are needed or make much sense.
The same ruling goes for my systems: I didn't need as much power as they have and could definitely have cut back on the CPUs with very little impact on real-world performance. That's why I rarely suggest top-tier kit to people, as it doesn't make sense for most; halo products very rarely justify their existence beyond just sitting at the top.