More performance + less power consumption: when do you upgrade? Downgrade!
This should be an easy CPU-upgrade math equation: "more for less"?



When do YOU guys decide to make the 'switch'?



Is it the ever-tempting "Oooh! Look! This shiny new thing ..." urge you cannot resist? (Money is not the issue.)

Is it the more economical "I finally HAVE to upgrade" last straw that finally tips you over, after years of waiting?

Or is it simply "I can have more performance"? (not that you really need it)



I ask because I just recently - without any pressure to do it, more out of 'boredom' - bought a mid-spec "Intel Core i3 10100" Comet Lake (4 cores / 8 threads, 3.6GHz base, 4.3GHz Turbo Boost) to replace my still perfectly working Zen+ AMD Ryzen 7 2700X (8 cores / 16 threads, up to 4.3GHz).

The ONE reason was "30-40% less power consumption" with the same (or even BETTER single-core) performance.

That was it.

Now the same PC + monitor runs at an average of 60 watts instead of 100 watts.

I could have done that 2 years ago, but it never occurred to me.

That triggered my question: what 'triggers' YOU to upgrade? Or, in my case, 'downgrade/upgrade'.

:TerribleFace:
Last edited by Adam Beckett; 18 Oct 2021 at 11:48
When my device starts going sluggish OR I'm about three generations behind, whichever comes first.
I tend to base my builds around my GPU. If I upgrade my graphics card, and it's bottlenecked by my CPU, then it's time to upgrade my CPU (And motherboard, and RAM, and might as well get a new case while I'm at it, and ooooHHH is that RGB?)
I rarely upgrade, because I can still swap parts on my 6-year-old motherboard. It looks like a "Frankenstein" scene when you look at it, but it works.
I had to upgrade my GPU last year for VR and my monitor setup.
Overall, I upgrade when games and programs start to be more demanding, usually every 4 years or so.

:saint:
So you spent several hundred pounds / dollars / euros etc. to save about 2 bucks a month on your electric bill, and you have far worse performance the second something wants more than 4 (entry level, not mid) cores... Are you sure you want to be publicly admitting to this?

Anyway, I upgrade when there is a tangible benefit and I've got the funds for it. The cost of power to run my PCs under normal use has never once been a factor, as it is so minor even when my systems eat 700-1000W all in.
Originally posted by Monk:

Well, it's just 100 USD for the CPU.

And yes, I do think 40 watts makes a difference. It's 150 USD less per year for me, so it pays for itself after about 9 months.
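
For anyone who wants to check the numbers themselves, here is a minimal sketch of the math; the price per kWh and hours of use are assumed placeholders, not figures from this thread, so plug in your own:

```python
# Rough sanity check of the "40 W saved = ~150 USD/year" claim.
# price_per_kwh and hours_per_day are assumptions for illustration only.

watts_saved = 40          # measured difference at the wall
price_per_kwh = 0.40      # assumed electricity price, USD/kWh
hours_per_day = 24        # assumed: the machine is on around the clock

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
savings_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year -> about {savings_per_year:.0f} USD/year")
# ~350 kWh/year -> ~140 USD/year with these assumptions; the figure only
# lands near 150 USD/year with close-to-24/7 use at a high price per kWh.
```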
Originally posted by Adam Beckett:
When do YOU guys decide to make the 'switch'?

Is it the ever-tempting "Oooh! Look! This shiny new thing ..." urge you cannot resist? (Money is not the issue.)

Is it the more economical "I finally HAVE to upgrade" last straw that finally tips you over, after years of waiting?

Or is it simply "I can have more performance"? (not that you really need it)
Typically, somewhere between the second and third, but more so the second these days (especially with the GPU market).

I used a Core i5 2500K platform for nearly a decade. I didn't change the motherboard, CPU, RAM, or CPU cooling at all in that time. I did make other changes (case, PSU, GPU, and storage), but for nearly ten years I got by on that. What necessitated change was...

1. 16 GB RAM wasn't enough anymore.

2. Windows 7 wasn't supported anymore.

3. It was increasingly time for a faster CPU.

It wasn't until the first became true that I upgraded, but it was the combination of all of them at once that made it necessary. If I later wanted to put something newer and faster than the GTX 1060 in there, the CPU would have been a concern too, so that was a sort of fourth reason (though given the GPU market I don't even know when I'll do that, maybe 2023; right now my GPU situation is at your third reason but not yet the second, though I think I'll run into that one within the next year or two).

I'd say the best way to come out ahead is to get something as close as possible to not being overkill while still being enough (probably near impossible to predict, though), and then use it as long as possible.
Originally posted by Adam Beckett:
This should be an easy CPU-upgrade math equation: "more for less"?

I ask because I just recently - without any pressure to do it, more out of 'boredom' - bought a mid-spec "Intel Core i3 10100" Comet Lake (4 cores / 8 threads, 3.6GHz base, 4.3GHz Turbo Boost) to replace my still perfectly working Zen+ AMD Ryzen 7 2700X (8 cores / 16 threads, up to 4.3GHz).

The ONE reason was "30-40% less power consumption" with the same (or even BETTER single-core) performance.

That was it.

Now the same PC + monitor runs at an average of 60 watts instead of 100 watts.

I could have done that 2 years ago, but it never occurred to me.
I think you're actually skipping out on a lot of math, or overlooking some things.

What would the stuff you just got have cost two years ago? You're presuming it would have been the same then as it is today. Namely, the 10th generation received some big price cuts (although I'm not sure how that affected the Core i3 10100).

What was the cost of the new stuff? Yes, it's newer, faster, and uses less power, which sounds like an automatic win (this is more common than you think, by the way). That doesn't always make it "worth it" to replace something, though. That merely makes it better than the old part side by side in a vacuum, but costs exist.

What is the difference in your power bill? It's easy to say "the CPU or system now uses 30% less power at this one point in time", but did you actually check how much each costs to run over time, and how long it will take for that difference to make up for the cost of the new hardware to begin with? It's usually a rather long time. You could consider the money you spent on the new stuff as "credit" you took out; until the power savings catch up, you haven't repaid it and thus haven't broken even. And even then, for that to apply, BOTH of the following have to be true: you would have kept the old stuff until the break-even point, and you will use the new stuff longer than that. If either of those isn't true, the entire "I made it cheaper but better" comparison becomes nearly impossible to prove.
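
To make that break-even idea concrete, here's a minimal sketch; every number in it is an assumed placeholder, not something taken from this thread:

```python
# Break-even sketch for a "new part uses less power" upgrade.
# All inputs are hypothetical placeholders; substitute your own.

upgrade_cost = 180.0      # assumed: CPU + motherboard, USD
watts_saved = 40          # assumed difference at the wall
hours_per_day = 8         # assumed daily use
price_per_kwh = 0.40      # assumed electricity price, USD/kWh

savings_per_year = watts_saved / 1000 * hours_per_day * 365 * price_per_kwh
years_to_break_even = upgrade_cost / savings_per_year

print(f"~{savings_per_year:.0f} USD saved per year, "
      f"break-even after ~{years_to_break_even:.1f} years")
# With these numbers: ~47 USD/year, break-even after ~3.9 years.
# You only come out ahead if you would have kept the old hardware at
# least that long AND you keep the new hardware even longer.
```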

From memory, Zen+ is (roughly) Skylake-level performance per core, so yes, the newer one is faster per core, but you were also running a CPU with twice the cores and threads. You need to factor that in. Is there a situation where your workflow would be done sooner with the higher core count chip?

People fixate on "average" or "idle" power draw and falsely extrapolate that the same difference applies throughout, which isn't true to begin with, but it mainly forgets to factor in one other important thing: time. What about tasks where the faster CPU completing sooner means it isn't drawing above-idle power for as long as a slower one? This applies less to something like games, but it's often overlooked. It might still use more energy overall; what I'm saying is you can't look at one situation and hyper-fixate on one percentage like it's the constant, overall difference. It could be skewed either way.
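
A tiny illustration of the power-versus-time point (the wattages and run times below are made up for the example, not measurements from either CPU):

```python
# Energy is power x time, so a hungrier CPU that finishes sooner can
# still use less energy for a fixed job. Numbers are hypothetical.

def task_energy_wh(package_watts: float, seconds: float) -> float:
    """Energy used to complete one task, in watt-hours."""
    return package_watts * seconds / 3600

eight_core = task_energy_wh(package_watts=105, seconds=60)   # done in 1 minute
four_core = task_energy_wh(package_watts=65, seconds=120)    # takes 2 minutes

print(f"8-core: {eight_core:.2f} Wh, 4-core: {four_core:.2f} Wh")
# 8-core: 1.75 Wh vs 4-core: 2.17 Wh -> the higher-wattage chip used less
# energy here because it finished sooner; at idle or in lightly threaded
# loads the comparison flips the other way.
```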

If you're happy with it, that's ultimately justification enough, so I'm not trying to argue you out of the change you made. I'm just saying you can't plug "it's faster but also uses xx% less power" into the equation and presume you're instantly ahead as a result.
Last edited by Illusion of Progress; 18 Oct 2021 at 12:33
You just had to tell Windows to use half of the cores! That would have been cheaper.
Last edited by smallcat; 18 Oct 2021 at 12:38
I usually do it when I'm bored. I'm already sick of AMD and I'm waiting on my Intel platform. I ordered two CPUs:
an i5 11400F and an i3 10320. Currently deciding which CPU I want to keep. My motherboard is here (ASRock B560 Pro4). I'll try out both of them and see.

I'm so sick and tired of AMD's relationship with Microsoft. Bye bye, AMD. Never again, sorry.
Last edited by GOD RAYS ON ULTRA™; 18 Oct 2021 at 19:30
I was going to buy AMD, but the shop didn't have the exact one, and I ended up with an Intel.
Last edited by smallcat; 18 Oct 2021 at 12:52
Originally posted by littlecat20160:
You just had to tell Windows to use half of the cores! That would have been cheaper.

I did. It did. I had the Ryzen running on 4 cores (via UEFI/BIOS) and even throttled it down to 3GHz (not many games I play really need more). :TerribleFace:

Originally posted by Illusion of Progress:

Good points. To answer your last question: yes. I have a power meter showing me the total power consumption of my PC + monitor (and nothing else).

It measures the actual total wattage drawn at the wall.

My math is based on the 'same' PC now running at a total of 60 watts (CPU+MB+RAM+GPU) while before - with the Ryzen 7 - it ran at 80-100 watts ... at idle: using it the same way, with the same operating system energy-saving modes, with the same apps running.

Nothing says this Ryzen always uses MORE power than the Intel CPU in every energy-saving mode or in any given app or game. But in my case, it does.

There's also always the ability to throttle the CPU and/or reduce the number of cores you allow it to use.

In a non-gaming environment, when you're also not running CPU-heavy tasks (like video encoding, etc.), more than 4 cores are overkill. It makes sense to park those or switch them off ... and modern CPUs - as you well know - even do that by themselves.

When it DOES come to CPU-heavy applications (certain games, software), the 4-core Intel CPU has a TDP of 65 watts while the 8-core Ryzen 7 is rated at 105 watts. In GAMES, the Ryzen does not perform better (with a handful of exceptions). Now, these are maximum rated values, of course. But they matter - especially when CPU-intensive games are running.


You had an old Ryzen. The current generation is good.

Did you measure the consumption with 4 cores? Also, some motherboards draw more power.
Last edited by smallcat; 18 Oct 2021 at 13:03
Originally posted by Adam Beckett:
Originally posted by Monk:

Well, it's just 100 USD for the CPU.

And yes, I do think 40 watts makes a difference. It's 150 USD less per year for me, so it pays for itself after about 9 months.

Maybe if it's under full load 24/7; otherwise, it's not. Plus the mobo, so at best you are saving $12.50 a month, an amount that could more easily be saved by making your own coffee or drinking tap water once a week or less, and you end up with a less usable system.

Fair enough, it's your money.

But your electricity must be crazy expensive if it was previously around 250 to 300 a year JUST for the PC. I mean, mining 24/7 on a 3090 pulling around 300W only costs about 40 bucks a month, but your normal-use PC, pulling a third of the power of one GPU, costs three quarters as much? Yikes!
200W vs 500W really is not going to matter at the end of the month, or even annually, when you look at your power bills. Now sure, maybe if you had it running 8-12 hours a day gaming, live streaming, or video editing... sure. Or if you have a house full of people using gaming PCs. When you're just surfing the internet and watching videos/movies, even an i9 + RTX 3090 based PC is only going to draw a small amount from the wall when not doing anything stressful, even if you force a 24/7 OC on it.

If you really want to conserve power and save on energy bills, look at the rest of your home: windows, doors, frame seals, A/C and heating efficiency, de-cluttering your home, and keeping it clean of dust/hair/pet dander... Also limit how much you use a microwave. A 3-5 person family can easily spend over an hour a day running a microwave, and that is at high/full power at 1000-1500 watts.
Last edited by Bad 💀 Motha; 18 Oct 2021 at 18:55
Originally posted by littlecat20160:
I was going to buy AMD, but the shop didn't have the exact one, and I ended up with an Intel.

How times have changed.
Originally posted by Adam Beckett:

Well, it's just 100 USD for the CPU.

And yes, I do think 40 watts makes a difference. It's 150 USD less per year for me, so it pays for itself after about 9 months.
I want to see your math for this. I bet you assumed 100W for 24 hours a day, which is clearly not true.



Originally posted by Adam Beckett:
When do YOU guys decide to make the 'switch'?

Most likely never. My PC draws 350W idle and 850W gaming from the wall, but it doesn't matter much for the few hours I use it.
Last edited by dOBER; 18 Oct 2021 at 21:58