Diablo® IV

KrzysiuM Oct 5, 2023 @ 7:32am
3080 ti bricking?
I've heard that 3080 Ti graphics cards died while gaming during the beta. Is that still the case now that the beta is over?
Showing 16-21 of 21 comments
ChickenBalls Oct 6, 2023 @ 10:41am 
just buy a 4090
problem solved
MonkehMaster Oct 6, 2023 @ 10:52am 
Originally posted by ChickenBalls:
just buy a 4090
problem solved

no thanks, todd howard.
xnamxoutlawx Oct 8, 2023 @ 8:42pm 
Originally posted by Zukabazuka:
Originally posted by xnamxoutlawx:
Lol, games don't brick any GPU; if your GPU gets "bricked" by a game, the GPU is faulty.
Buddy, if it was hardware it would break on any game, but since it only happens in one game, that points towards it being a software problem. After all, you can make software do a lot of crazy stuff.
And once again, fools will be fools. Have you actually read up on the issue that caused the so-called bricking? Yeah, read up, then come back and try to say the same thing, LOL. Again, I never said software couldn't brick a GPU; I said a game could never brick your GPU. Can a faulty BIOS from your GPU manufacturer cause software to brick your GPU? Yes. Can a video game by itself, without any other factors, cause your GPU to be bricked? No. Do some research and edify yourselves, please, gentlemen.
Last edited by xnamxoutlawx; Oct 10, 2023 @ 5:01pm
Kahara_Bokuden Oct 21, 2023 @ 8:50am 
Back to the question at hand, is there any information on whether or not it is safe to play Diablo 4 on a Gigabyte 3080/3080 Ti graphics card? I haven't really found any useful information about this on the web. Thank you.
zipper76 Oct 21, 2023 @ 9:04am 
Originally posted by Kahara_Bokuden:
Back to the question at hand, is there any information on whether or not it is safe to play Diablo 4 on a Gigabyte 3080/3080 Ti graphics card? I haven't really found any useful information about this on the web. Thank you.

I've been playing on my EVGA 3080 Ti since forever now. The only issue I experienced was around launch: I would get hitches after playing for a while unless I lowered my texture quality. It was a known issue that had something to do with VRAM. Either way, it's been fixed, and right now I play just fine.

I do cap my framerate at 90, though not out of necessity. My eyes just don't notice any difference beyond that, so I keep all my games capped there. Less stress on the components never hurts.
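For anyone wondering what a 90 FPS cap actually does at the application level, here's a rough, purely illustrative C++ sketch of a sleep-based frame limiter (this is not Diablo IV's or any driver's actual code; renderFrame() is a hypothetical placeholder):

// Illustrative only: a sleep-based frame limiter, conceptually similar to
// in-game caps or tools like RTSS. Not taken from any real game's source.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr int kFpsCap = 90;                              // the 90 FPS cap discussed above
    constexpr auto kFrameBudget =
        std::chrono::nanoseconds(1'000'000'000 / kFpsCap);   // ~11.1 ms per frame

    auto next_frame = clock::now();
    for (int frame = 0; frame < 1000; ++frame) {             // stand-in for the render loop
        // renderFrame();  // hypothetical placeholder for simulation + draw work

        next_frame += kFrameBudget;
        std::this_thread::sleep_until(next_frame);           // idle out the rest of the budget
    }
}

The point is simply that the GPU spends part of every frame idle instead of drawing flat out, which is where the lower power draw and heat come from.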
Mister Magician Oct 21, 2023 @ 9:30am 
Originally posted by White Spirit:
Originally posted by KrzysiuM:


Well, if you have a high refresh rate monitor, it's not so stupid to have lots of FPS.

No, it is stupid. I'm talking about some games jacking FPS up into the thousands, which in some cases damaged particular cards. It also increases the card's power draw for no reason.

Everyone should limit FPS to something reasonable, typically a few FPS below the screen's refresh rate. There's some discussion of it here:
https://www.reddit.com/r/buildapc/comments/qj0jpz/to_cap_or_not_to_cap_fps/
The worst are Unity games like Pathfinder: Wrath of the Righteous, because Unity has no way to cap the framerate natively. That game was overheating my laptop and crashing my desktop just on character creation, yet for some reason it worked perfectly on the Steam Deck; all the malfunctioning stopped once I capped the framerate, which the Steam Deck does automatically.
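To put "a few FPS below the screen refresh rate" into concrete frame-time terms (the exact offset varies and the linked thread discusses it; the 3 FPS figure below is just an assumed example), here's a small illustrative C++ calculation:

// Illustrative arithmetic only: frame-time budget for a cap a few FPS under refresh.
#include <cstdio>

int main() {
    const double refresh_rates_hz[] = {60.0, 144.0, 240.0};
    const double kFpsBelowRefresh = 3.0;  // assumed "few FPS below" offset

    for (double hz : refresh_rates_hz) {
        double cap = hz - kFpsBelowRefresh;
        std::printf("%.0f Hz panel: cap at %.0f FPS -> %.2f ms per frame (refresh period %.2f ms)\n",
                    hz, cap, 1000.0 / cap, 1000.0 / hz);
    }
}

At 144 Hz, for example, a 141 FPS cap means about 7.09 ms per frame versus a 6.94 ms refresh period, so the game stays just under the monitor's rate instead of rendering uncapped.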

Originally posted by SLSAMG ✪:
This thread in a nutshell over the last few replies: copious amounts of overly emotional fools coming out of the woodwork and getting butt-hurt over the fact that their GPUs could be a ticking time bomb due to a very-low-probability design flaw... then proceeding to blame game developers, rather than engineers, as a coping mechanism.

New World was killing GPUs because frame rates exceeded three digits, putting a load well beyond what the cards were capable of handling. It was an outlier. D4's problem is related to certain GPUs using lower-quality memory modules.
Blizzard is known throughout the industry for being bad at coding, and their inefficient code ended up "Murphy's law"-ing some cut-rate graphics cards that the manufacturer thought they could get away with releasing. In other words, the cards SHOULD and WOULD have worked on anything recent because they were "just good enough," but Blizzard's code really is so bad that it can find any hardware flaw instantly, purely by accident.

Another example would be Baldur's Gate 3 (shock of all shocks, a Unity game) bricking people's entire PCs and frying graphics cards because Larian cannot code in DirectX or Vulkan, going so far as to tell users to "restart the client and switch between the two when you run into a bug on the other one."

Just recently there was a major game running like complete trash, so much so that AMD released an entire patch just to target Anti-Lag frame generation for that game. The developer then detected AMD's frame generation as an "exploit" and started permanently banning anyone with an AMD card. Sounds like something Blizzard would do, so was it D4?
Last edited by Mister Magician; Oct 21, 2023 @ 9:45am

Date Posted: Oct 5, 2023 @ 7:32am
Posts: 21