I had to upgrade my GPU last year for VR and my monitor setup.
Overall, I upgrade when games and programs start to be more demanding, usually every 4 years or so.
Anyway, I upgrade when there is a tangible benefit to doing so and I've got the funds for it. The cost of power to run my PCs under normal use has never once been a factor, as it is so minor even when my systems eat 700-1000 W all in.
Well, it's just 100 USD for the CPU.
And yes, I do think 40 watts makes a difference. For me, it's 150 USD less per year, so it had already paid for itself after 9 months.
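For what it's worth, a minimal sketch of that payback arithmetic. The electricity price and hours-per-day are assumptions (roughly what it takes for a 40 W reduction to come to ~150 USD/year), not figures given in the post; plug in your own.

```python
# Rough break-even sketch for a cheaper-to-run CPU swap.
# price_per_kwh and hours_per_day are assumed values, not the poster's.

def break_even_months(watt_delta: float, price_per_kwh: float,
                      hours_per_day: float, upgrade_cost: float) -> float:
    """Months until the power savings repay the upgrade cost."""
    kwh_saved_per_year = watt_delta / 1000 * hours_per_day * 365
    savings_per_month = kwh_saved_per_year * price_per_kwh / 12
    return upgrade_cost / savings_per_month

# 40 W less draw, machine on 24 h/day, ~0.43 USD/kWh assumed, 100 USD CPU:
print(break_even_months(40, 0.43, 24, 100))   # ~8 months
```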
I used a Core i5 2500K platform for nearly a decade. I didn't change the motherboard, CPU, RAM, or CPU cooler at all in that time. I did make other changes (case, PSU, GPU, and storage), but for over nine, almost ten, years I got by on that. What necessitated change was...
1. 16 GB RAM wasn't enough anymore.
2. Windows 7 wasn't supported anymore.
3. It was increasingly time for a faster CPU.
It wasn't until the first became true that I upgraded, but it was the combination of them all at once that made it necessary. If I wanted to later put in something newer and faster than the GTX 1060 I had, the CPU would have been a concern too, so that was a sort of fourth reason (though given the GPU market I don't even know when I'll do that, maybe 2023; right now my GPU situation is at your third one but not yet the second, though I think I'll run into that one within the next year or two).
I'd say the best way to come out ahead is to get something that is as close as possible to not being overkill but still being enough (this is probably near impossible to predict though), and being able to use it as long as possible.
I think you're actually skipping out on a lot of math, or overlooking some things.
What was the cost of the stuff you just got two years ago? You're presuming it'd have been the same then as it is today. Notably, the 10th generation received some big price cuts (although I'm not sure how they affected the Core i3 10100).
What was the cost of the new stuff? Yes, it's newer, faster, and uses less power, which sounds like an automatic win (this is more common than you think, by the way). That doesn't always make it "worth it" to replace something, though. That merely makes it better side by side in a vacuum, but costs exist.
What is the difference in your power bill? It's easy to say "the CPU or system now uses 30% less power in this one scenario," but did you actually check how much each costs to run over time, and how long it will take for that difference to make up for the cost of getting it in the first place? It's usually a rather long time. You could sort of consider the money you spent on the new stuff as "credit" you took out; until the power savings catch up, you haven't repaid it and thus haven't broken even. And even then, for that to apply, you'd have to consider that BOTH of the following will be true: that you would have kept the old stuff until the break-even point, and that you will use the new stuff longer than that. If either of those isn't true, the entire "I made it cheaper but better" comparison becomes nearly impossible to prove.
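A minimal sketch of that "credit" framing; every number passed in below is a hypothetical placeholder rather than anyone's real figures.

```python
# The upgrade only clearly comes out ahead if you would have kept the old
# part at least until break-even AND you keep the new part longer than that.
# All inputs here are hypothetical placeholders.

def comes_out_ahead(upgrade_cost: float, savings_per_month: float,
                    months_old_would_have_stayed: float,
                    months_new_will_stay: float) -> bool:
    break_even = upgrade_cost / savings_per_month   # months to repay the "credit"
    return (months_old_would_have_stayed >= break_even and
            months_new_will_stay > break_even)

print(comes_out_ahead(100, 12.5, months_old_would_have_stayed=24,
                      months_new_will_stay=36))   # True
print(comes_out_ahead(100, 12.5, months_old_would_have_stayed=6,
                      months_new_will_stay=36))   # False
```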
From memory, Zen+ is around Skylake-ish performance, so yeah, the newer one is faster per core, although you were also running a CPU with twice the cores and threads. You need to factor that in. Is there a situation where your workflow would be done sooner with the higher-core-count chip? People fixate on "average" or "idle" power draw and falsely extrapolate that the same difference will apply throughout, which isn't true to begin with, but above all it forgets to factor in one other important thing: time. What about tasks where the faster CPU completing sooner means it isn't using above-idle power for as long as a slower one would? Though this applies less to something like games, it's often overlooked. It might still use more overall, but what I'm saying is you can't look at one situation and hyper-fixate on one percentage figure like it's the constant, overall difference. It could be skewed either way.
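A minimal sketch of that time-versus-power point; the wattages and durations below are made-up illustrative figures, not measurements of the CPUs being discussed.

```python
# Total energy for a fixed workload over a fixed window: load power while the
# task runs, idle power for the rest. All numbers are illustrative only.

def task_energy_wh(load_w: float, load_hours: float,
                   idle_w: float, window_hours: float) -> float:
    return load_w * load_hours + idle_w * (window_hours - load_hours)

# A chip that pulls more at load but finishes in half the time...
fast = task_energy_wh(load_w=105, load_hours=1.0, idle_w=20, window_hours=8)
# ...versus one that pulls less at load but runs twice as long.
slow = task_energy_wh(load_w=65, load_hours=2.0, idle_w=20, window_hours=8)
print(fast, slow)   # 245.0 vs 250.0 Wh -- the "faster" chip wins this one
```

Bump the faster chip's idle draw to 30 W, though, and its total rises to 315 Wh, so the comparison can indeed be skewed either way.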
If you're happy with it, that's ultimately good enough, so I'm not trying to talk you out of the change you made. I'm just saying that you can't simply plug in "it's faster but also uses xx% less power" and presume you're instantly ahead as a result. But if you're happy with it, that's justification enough.
11400F and i3 10320. Currently deciding which CPU I want to keep. My motherboard is here (ASRock B560 Pro4). I'll try out both of them and see.
I'm so sick and tired of AMD's relations with Microsoft. Bye bye, AMD. Never again, sorry.
I did. It did. I had the Ryzen running on 4 cores (via UEFI/BIOS) and even throttled it down to 3 GHz (not many games I play really need more).
Good points. To answer your last question: yes. I have a power meter showing me the total power consumption of my PC + monitor (those alone).
It measures the actual total wattage drawn.
My math is based on the 'same' PC now running at a total of 60 watts (CPU+MB+RAM+GPU), while before, with the Ryzen 7, it ran at 80-100 watts... at idle, using it the same way, with the same operating-system energy-saving modes and the same apps running.
Nothing says that this Ryzen always uses MORE power than the Intel CPU in every energy-saving mode or with any given app or game. But in my case, it does.
There's also always the ability to throttle the CPU and/or reduce the number of cores you allow it to use.
In a non-gaming environment, when you're also not running CPU-heavy tasks (like video encoding, etc.), more than 4 cores is overkill. It makes sense to park those cores or switch them off... and modern CPUs, as you well know, even do that by themselves.
When it DOES come to CPU-heavy applications (certain games, software), the 4-core Intel CPU has a TDP of 65 watts while the 8-core Ryzen 7 hits 105 watts. In GAMES, the Ryzen does not perform better (with a handful of exceptions). Now, these are max values, of course. But they matter, especially when CPU-intensive games are running.
Did you measure the consumption with 4 cores? Also, some motherboards draw more power.
Maybe if it's under full load 24/7; otherwise, it's not, plus the mobo. So at best you are saving $12.50 a month, an amount that can more easily be saved by making your own coffee or drinking tap water once a week or so, and you end up with a less usable system.
Fair enough, it's your money.
But your electric rate must be crazy if it was previously around 250 to 300 a year JUST for the PC. I mean, mining 24/7 on a 3090 pulling around 300 W only costs about 40 bucks a month, but your normal-use PC, drawing a third of the power of one GPU, costs three quarters as much? Yikes!
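For context, a quick sketch of the electricity rates implied by the two sets of numbers in this thread; the figures are taken from the posts above, and only the arithmetic is added.

```python
# USD per kWh implied by a constant draw and a monthly cost.

def implied_rate(watts: float, cost_per_month: float) -> float:
    kwh_per_month = watts / 1000 * 24 * 30
    return cost_per_month / kwh_per_month

# "mining 24/7 on a 3090 pulling around 300 W only costs about 40 bucks a month"
print(round(implied_rate(300, 40), 3))        # ~0.185 USD/kWh

# the earlier claim: a 40 W reduction saving ~150 USD/year (~12.5 USD/month)
print(round(implied_rate(40, 150 / 12), 3))   # ~0.434 USD/kWh
```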
If you really want to conserve power and save on energy bills, look at the rest of your home: windows, doors, frame seals, A/C and heating efficiency, de-cluttering your home, and keeping it clean of dust/hair/pet dander... Also limit how much you use a microwave. A 3-5 person family can easily spend over an hour a day running a microwave, and that's at high/full power at 1000-1500 watts.
How times have changed.
Most likely never. My PC draws 350 W idle and 850 W gaming at the wall, but it doesn't matter much for the few hours I use it.