Of course at the time people would have preached to wait for the 30 series. And typically I would have, or maybe with a 2080 super wait for the 40 series. I guess I still have that option.
I think the lesson is to buy the hardware you need when you need it. Waiting for the next big thing can be fraught with peril, whether from supply issues or performance/feature disappointments. And there's always going to be newer hardware eventually that will make the thing you're waiting for obsolete anyway, and the currently available hardware will last nearly as long as it.
There are plenty of games out there that still run on older hardware. Have people really played them all, and the newest upcoming thing is all that's left?
I also don't get the hate towards cryptominers. Gamers themselves push hardware to its fullest for the FPS, for the gaming experience. That's arguably less worthwhile than producing imaginary money. It's probably equally worthless when you really think about it. Sure, gamers get "joy" out of gaming, and cryptominers I suppose get "joy" when they find another coin. I do understand cryptominers are doing it on a much larger scale than gamers on an individual basis.
I can't believe gamers are ethically superior to cryptominers either when it comes to power and resource usage. Upgrading cards every couple of years when they're nowhere near broken produces a lot of waste, and buying CPUs or GPUs that run faster and hotter than others just for some more FPS isn't really helping the planet. And if SLI really had managed to produce groundbreaking results, pretty much everybody would have two to three GPUs per PC. It wasn't anyone's ethics that told them not to go the SLI route; it was a personal "is my money spent here worth it" attitude. If it had been worth it, the decision would have been made.
I'll start with this, OP. I'm trying not to sound like a jerk, but I get how someone could take my post as being smug about the situation. I'm not, or at least I don't think I am. I felt like you did want something to get you thinking. I do, however, get triggered by the overall attitude from some people that a series of GPUs not being available is one of the most terrible things to ever happen. I don't think it's that serious.
My strategy for the future is simple: use the 5700XT until it dies, which hopefully won't be for years (barring unforeseen circumstances). I know I could get a good price for it on eBay right now - I still have my old RX 470, so I wouldn't be without a GPU - but I like my 5700XT. Not planning on selling it.
I doubt dedicated GPUs are going to return to the old prices. Those days are gone. Some tech people are saying the shortage will last until ~2023. DDR5 should be out by then, with (hopefully) significant advances in APUs/iGPUs. Add-in GPUs are going to remain an "elite" option; the lower-cost options are going to get better than they are now, but the market's segmentation between add-in and integrated is going to get MUCH larger than it is now.
That's what I anticipate, anyway. That's why I'm hanging on to the 5700XT; even with advances in memory and caching, APUs won't equal its performance in the foreseeable future - though RX 470-level performance in an APU might be possible within 5 years. IF that happens, expect the add-in GPU market to shrink even further.
I'm sure the skeptics in the forum will say this will never happen. Maybe it won't. I also expect Microsoft to abandon traditional Windows and turn it into a paid-for, Linux-based desktop environment (which would include the closed-source DLLs needed to natively run Windows programs). So who knows.
5 months.
A few months ago I started mining, and now I just suggest: if you can afford the markup (annoyingly, it's far worse on the low end than the high end), buy from a scalper if you don't get lucky after a few days of proper hunting. Then, if you can't beat 'em, join 'em: start mining during downtime and you'll earn back the extra cost in a couple of months.
I know it's not popular, but if you can do it, it just makes sense right now.
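As a rough sanity check on that payback claim, here is a minimal sketch of the arithmetic. The markup, hourly mining revenue, power draw and electricity price below are purely illustrative assumptions, not figures from this thread.

# Back-of-the-envelope payback estimate for buying above MSRP and then
# mining during downtime. Every number below is an illustrative assumption.

markup_over_msrp = 300.00      # extra paid over MSRP, in USD (assumed)
revenue_per_hour = 0.25        # mining revenue in USD per hour (assumed)
gpu_power_kw = 0.22            # GPU draw while mining, in kW (assumed)
electricity_usd_per_kwh = 0.13 # electricity price (assumed)
mining_hours_per_day = 16      # "downtime" hours per day spent mining (assumed)

profit_per_hour = revenue_per_hour - gpu_power_kw * electricity_usd_per_kwh
payback_days = markup_over_msrp / (profit_per_hour * mining_hours_per_day)
print(f"Days to recoup the markup: {payback_days:.0f}")  # ~85 days with these numbers

With those assumed numbers the markup is earned back in roughly three months, which is in the same ballpark as the "couple of months" claim; whether that holds obviously depends on coin prices and your electricity rate.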
https://www.pcgamer.com/samsung-ddr5-pcie-5-cxl-memory-expander/
just playing the waiting game
That is, DDR5 may launch at 5200 MHz but with worse timings than 4400-5000 MHz DDR4, for no real gain, and you will pay a premium.
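To put a number on "worse timings": first-word latency scales with CAS latency divided by the transfer rate. The CL values below are illustrative assumptions for an early DDR5 kit versus a tuned DDR4 kit, not measurements of specific products.

# First-word latency in nanoseconds: 2000 * CL / transfer_rate (in MT/s).
# The CL figures below are assumptions for the sake of the comparison.

def first_word_latency_ns(cas_latency: int, transfer_rate_mts: int) -> float:
    return 2000 * cas_latency / transfer_rate_mts

print(first_word_latency_ns(40, 5200))  # assumed early DDR5-5200 CL40 -> ~15.4 ns
print(first_word_latency_ns(18, 4400))  # assumed tuned DDR4-4400 CL18 -> ~8.2 ns

Under those assumptions the new memory is higher bandwidth but noticeably slower to first access, which is the "no real gain at a premium" point.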
Luckily, none of the chips due with the first DDR5 release look like they're worth upgrading to if you're on 9th gen or Zen 2 (3000) or newer.
It's best to buy a CPU/mobo/RAM combo that works well together.
GPU mining is becoming less profitable, which I hope changes the situation. It has also become political so even more unpredictable.
The CPU shortages seem to be improving.
It is a situation you cannot control.
My only strategy is to have the cash available to buy what I want when it goes down to the price I want.
What do you do where you'd benefit from it, or require it?
I'd thought of upgrading my CPU to a 5900X/5950X, but looking at things now, after paying what is now a reasonable price of about 1350 USD for my Nitro+ 6900 XT very early in January, I'm happy to just keep my systems as they are.
I'd thought of selling my Vega64 Red Devil for some nice cash, and use my spare HD7970 as its replacement, but decided not to as I love my Vega64 RD too much. I have three systems, each with reasonably powerful cards for what I need 'em to do, so I'll wait out this mess we're in and see what the next one-two years hold.
I have a GTX 1060 6 GB that I got in... 2016 (?), though it feels like only three years ago or so, and it's still sort of fine. Lately I've been approaching that "I would like to upgrade" point more and more, but it's not yet "I need to" (counterpoint: there are a FEW games I MAY have gotten if I had a better GPU, so you might argue it's sort of there, but I have plenty of unplayed games, so they'd probably just join the pile). But that's not happening in this market. I'll use it as long as it works, and if it stops, I'm probably done with PC gaming beyond what a spare GT 1030 can do, at least for the short foreseeable future, until I get desperate and save up to spend more than I did five years ago for less performance than this card offers (the fact that this is possible means GPUs have seen a net decrease in price to performance). Ideally I'd have something like a 1440p/120 Hz+ panel, in which case an upgrade would be more necessary, so having "only" 1200p/60 Hz helps here.
People who use GPUs as hobbyists can't, as a whole, compete with those buying them up at many times MSRP to generate money (whether via cryptocurrency or content creation) unless they also do the same or just suck up the higher initial cost, and not everyone can (or wants to) pay the higher barrier of entry and then do that anyway.
My purchase strategy for the last 1-2 decades was "buy the most common (quality) hardware" of the previous year.
The idea was to buy what is popular, since THAT hardware would have the most sophisticated and the most optimized drivers and any (Windows or PC game) 'problem' would be patched or fixed immediately.
I also stopped overclocking and chasing the most frames per second. While it was interesting for a while, the MORE interesting path was to ask "how low can I go?" = undervolting.
Basically, it meant optimizing voltage and clock speeds to run as cool and as low-power as possible while still hitting the monitor's resolution and refresh rate (1920x1080@60Hz = 60 FPS locked - or now 2560x1440@144Hz = 144 FPS locked).
I learned to hate wasting power: why run a game on a GPU drawing 350 watts when the same game runs and produces the same number of frames on a GPU drawing 100 watts?
The graphic card marketing hype train of "the new card runs 4x faster ..." becomes silly when you ask yourself "how fast does it REALLY NEED to run?".
Am I enjoying <put in game name of your choice> MORE when it runs @ 300 FPS instead of 288 FPS?
"Performance per Watt" is a term that stuck with me for the last year or two.
This is why I was never interested in NVIDIA's clever "real-time ray tracing" marketing push: selling the same 1080-class hardware as before for a much higher price by adding "Tensor cores" that have nothing to do - unless you watch Minecraft RT all day long - and calling it the RTX 2080.
But, funny enough, the latest GPU/CPU hardware is not only faster (and CAN eat more power), but also has the smallest dies (and lower temperatures) and the most sophisticated power management options. It achieves the same (or better) performance while still being a no-compromise gaming PC that can switch power on and off when needed.
By the end of last year, wanting an RTX 3080 or AMD equivalent had more to do with being able to run the new card at an absolute minimum for daily use (which games REALLY need the full power of these cards today? 3-4 titles? How much time on a gaming PC is spent watching YouTube/Twitch or browsing the web?) while still having the necessary power to play a demanding game - like Microsoft Flight Simulator - in the next 3-5 years.
Speaking of Microsoft Flight Simulator ...
... THAT is the ONE game that has been the focus of my GPU upgrade wishlist since last year. I wanted to buy a new RTX 30xx card plus MS Flight Sim.
I ended up buying neither.
My current 'solution' looks like this: I am going to buy a gaming console, for the first time, to get me through this PC hardware drought, and that should last me 2-3 years.
Luckily, these gaming consoles are new, and all new (PC/cross-platform) titles will have to be able to run on them. Almost all new AAA titles will be available on them (with only a handful of notable exceptions, like Total War: Warhammer III, etc.).
Meanwhile, I guess, many of us have to pray that our current PC hardware will last? At least CPU, motherboard and RAM prices are not sky-rocketing the same way.
But the "will the GPU market ever return to 'normal'?" question is still of great concern to me. NVIDIA's switch to up-selling PC gamers on features nobody was asking for or really needed, so they can raise the price structure (with AMD following them, sheepishly, to keep up = "We also have ray tracing!"), does not bode well.
If your sweet-spot for the last 10+ years was a mid-range gaming GPU for <300USD, this market might be gone forever?
Same here. The ASUS ROG STRIX GTX 1060 6GB OC is my 'most recent' graphics card (I have a few others, even older).
I feel 'stuck' now, since my upgrade strategy for last year (buy the new RTX 3080) was crushed by scalpers and crypto-miners.
Boy, I wish I had bought a 1080 Ti back when. And I should not have given away my AMD 580 to a friend for free (he sold it on eBay instead of using it).
Thankfully, the 1060 is fine enough for most games at 1080p and even good enough for 1440p with many titles. More demanding action-adventure and/or RTS titles I play capped at 30 FPS, with a super smooth 33.3 ms frametime and temperatures below 80F/35C.
Good that the 1060 is STILL the most common graphics card, looking at the Steam Hardware Survey!