If you can't wait that long.
https://www.youtube.com/watch?v=wvO05ktxF5o
There's your wake up call.
Now that I'm home, let's have a better wake up call.
D3 Menu
https://i.imgur.com/qyfWfPJ.png
D4 Menu
https://i.imgur.com/8zIlWYs.png
D3 In Game
https://i.imgur.com/H2zmlt1.png
D4 In Game
https://i.imgur.com/pVqXRBr.png
If you were honest, you'd have written it vastly differently when comparing them, since D4 would be a 10/10 in comparison to D3 and D2. But let's go further, shall we?
Diablo 2. 3/10.
Diablo 3. 5/10.
Diablo 2 Resurrected. 7/10.
Diablo 4. 9/10.
Diablo 2 Resurrected looks better than D3 in many regards.
However, you didn't bring images from DII LoD, which would have demolished your point within seconds.
What you brought up only emphasizes how right I got it regarding DII/DIII/D4.
That's WHY you didn't dare bring any videos from Diablo II LoD here.
You can't evade how POOR DII LoD is with respect to DIII RoS in terms of graphics quality.
What you did above (bringing only DIII and D4, with nothing from DII LoD) is laughable perception management.
Everyone knows D4 looks better than DIII, as I wrote above. The real problem APPEARS when you compare DIII to DII LoD.
See? I am aware of your strategy and OP is dead right in his/her take.
Lighting is easy to render for an average GPU, but the textures and the models require more VRAM, which only new GPUs have.
It would be really awesome if the new GPUs actually offered something better than 5 or 6% more performance.
The 3090 Ti is still somehow better than the 4080, and that explains a lot.
It's not only the game companies; the people who make GPUs have also started to get greedy and incompetent.
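The texture/VRAM point is easy to sanity-check with napkin math. Below is a minimal sketch (plain Python, with illustrative numbers that are assumptions rather than figures from any specific game) estimating how much memory one 4K material's textures occupy; it's asset resolution and count, not lighting, that drives VRAM use up.

```python
# Rough back-of-the-envelope VRAM estimate for a single material's textures.
# All numbers are illustrative assumptions, not measurements from any game.

def texture_vram_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Size of one texture in VRAM, optionally with a full mip chain (~+33%)."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A typical PBR material uses several maps: albedo, normal, roughness, etc.
maps_per_material = 4
uncompressed = texture_vram_bytes(4096, 4096, 4)    # RGBA8: 4 bytes per pixel
bc7_compressed = texture_vram_bytes(4096, 4096, 1)  # BC7: ~1 byte per pixel

print(f"One 4K RGBA8 texture with mips: {uncompressed / 2**20:.0f} MiB")
print(f"Same texture BC7-compressed:    {bc7_compressed / 2**20:.0f} MiB")
print(f"Material of {maps_per_material} compressed 4K maps: "
      f"{maps_per_material * bc7_compressed / 2**20:.0f} MiB")
```

Even compressed, a handful of 4K materials adds up to hundreds of MiB on screen at once, which is why texture and model detail scales with VRAM while lighting mostly scales with shader throughput.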
D3 is, in comparison to D4, literally what D2 is to D3. The tech advances are that vast. You're confusing art style with actual technology; way to prove my point about how ignorant you are on that front.
That, and you can clearly see the higher-poly and more detailed models and textures.
To me, the UI I see in your screenshots is at least a hundred times more important than "poly count" or graphics of any kind. And when I look at the UI of D4, it both puzzles me and makes me want to throw up. At least the D3 menu UI looks decent and to the point.
It was showing that the environments and models are vastly superior to the previous generation in textures, technology, and mesh detail.
And I fail to see how the UI is confusing, since they have nearly the same layout. I refuse to believe you're that blind. I even have the same keybinds.
It. Don't. Matter.
And if you can't see the missing functionality between your screenshots from D3 to D4, then you're the one who's blind. Adding two or three extra clicks/steps to absolutely everything in "modern" UIs has been a blight and a cancer in recent years.
The era of game developers (software developers as well as content developers) who actually know what they're doing at an engine level, and who are well-versed in the tricks and sleight of hand needed to present an environment rendered at believable fidelity with an affordable performance trade-off, has come to a close.
Everything is stock, general-purpose, non-specialized, middleware-flavored engines these days: your Unreals, your Unities, your Godots, etc.
Modern game software developers are to a very large degree just assemblers of said middleware, operating on a minimal knowledge base, and, compared to the budgets flowing to the content creation department and especially the marketing department, operating on what can, relatively speaking, only be accurately described as a shoestring budget.
If an indie game has a healthy 40:60 to 50:50 split between actual development work and polishing on one side, and game design and content development on the other, then a triple-A title has... probably around a 20:80 split. Most of that 80% isn't sunk into good, optimized asset development either. It's burned on having to model things to unrealistically high fidelity, which takes time in and of itself, and on the sheer scope of content required for a triple-A game nowadays. (It is, legitimately, simply massive.)
It gets even worse and more lopsided if you're working with scriptable game engines (which all the major ones are) that allow your content developers to write actual high-level code for the 'business logic' of the game.
E.g. Bethesda is infamous for having its content developers write the script code for a lot of things in the Elder Scrolls and Fallout games. Some of the Papyrus scripting written for Skyrim is legitimately so bad it would make a first-year community-college CS student blush in embarrassment. Liberal use of global variables and goto-style control flow is just the tip of the iceberg there.
And keep in mind: that's still only about 40% of the total budget you often see cited for a title.
Because the other 60% is going to marketing and logistics. Eh-- mostly marketing nowadays.
That's actually not a joke, nor an exaggeration. Look up the figures people have plotted for expenditures over the years at, e.g., EA. Basically, the bulk of the budget that used to go to physical logistics got wholesale relocated to marketing with the advent of the digital distribution age. The development and research budgets haven't moved up; actually, they were on a slightly downward curve in EA's case.
So what you get is a bunch of woefully underpaid people straight out of college who don't know what the hell they're getting into, getting shoved into a boiler room for a year or two and put under enormous pressure to deliver.
Guess what? It gives you exactly the level of quality you'd rationally expect.
Which the industry is happy with.
Because consumers will lap it up no matter what. Persons are smart; people are dumb. Given enough marketing is tossed at it, it will sell - regardless. There's always a sucker to be found, as they say.
And a bit of well-spent marketing budget is thus, sadly, worth more (i.e., at the bottom line it is cheaper) than good software design.
This isn't new in any regard either.
It has been ongoing, and progressively getting worse, for at least the last decade. You just haven't really been able to notice it before when gaming on the PC platform, because PCs used to massively, massively out-power consoles. To a disgusting degree.
The last generation, with the PS4 and Xbox One, represented a catch-up that closed the gap somewhat, especially thanks to the consoles being able to use unified RAM and thus having a comparatively, disgustingly huge pool of VRAM to store textures in, compared to the average PC graphics card at the time. The actual hardware itself was still fairly weak and bottlenecked the whole thing, which meant the PC platform still had quite some slack left to it, albeit less comfortably than in the generations before.
But with the current gen, the PS5 and Xbox Series X, the gap has come close enough to being closed that the shortcomings in optimization for the PC platform are beginning to show. The vaunted 60+ FPS rates have already been sacrificed, and we're at the point where PCs are going to be using the console 'gold standard' of 30 FPS, including all the console band-aids such as dynamic resolution, checkerboard rendering, or screen-space interpolation (aka what we call DLSS).
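For readers unfamiliar with the term, 'dynamic resolution' is usually just a feedback loop on frame time. The sketch below (Python, with a hypothetical controller gain and frame-time budget chosen purely for illustration) shows the general idea: drop the internal render scale when the GPU misses its budget, raise it again when there's headroom, and upscale the result to the output resolution.

```python
# Minimal sketch of dynamic resolution scaling as commonly described:
# nudge the internal render scale each frame so GPU time stays under budget.
# The budget, limits, and controller gain are illustrative assumptions.

TARGET_FRAME_MS = 33.3   # the 30 FPS 'console budget' mentioned above
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale, gpu_frame_ms, gain=0.05):
    """Proportional controller: shrink resolution when over budget, grow when under."""
    error = (TARGET_FRAME_MS - gpu_frame_ms) / TARGET_FRAME_MS
    scale += gain * error
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy scene pushes frame time past budget, so the scale drifts down
scale = 1.0
for gpu_ms in [33, 36, 40, 42, 38, 34, 31]:
    scale = update_render_scale(scale, gpu_ms)
    print(f"gpu={gpu_ms:>2} ms -> render scale {scale:.2f} "
          f"(~{int(3840 * scale)}x{int(2160 * scale)} internal, upscaled to 4K)")
```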
This kind of erosion isn't limited to just graphics fidelity and performance either.
Have you ever stopped to consider that anti-cheat is also pretty much a band-aid solution to a problem of bad architecture that didn't use to exist?
In older games, servers were in charge of gatekeeping and maintaining an authoritative game state. Local clients could extrapolate and interpolate to smooth things out over janky connections, but ultimately, what the server said goes - or ehm... 'went.'
Which means the amount of actual cheating you could do was very limited: limited basically to above-normal observation of the data the server sent you. You could mess with local state, but it wouldn't get you much. It wouldn't even get you 'wall hacks' - i.e. seeing other players through walls - in many engines, because the server would simply refuse to send you updated enemy player coordinates if it believed you shouldn't have line-of-sight to them.
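To make that concrete, here is a minimal sketch of server-side visibility filtering along those lines. The grid world, player names, and crude line-of-sight test are all hypothetical simplifications for illustration; the point is only that the authoritative server decides what each client is allowed to know, so tampering with the local client can't reveal what was never sent.

```python
# Minimal sketch of server-side interest management / visibility filtering:
# the authoritative server only replicates an enemy's position to a client
# that could plausibly see them. Map, names, and LOS test are toy assumptions.

WALLS = {(2, y) for y in range(0, 5)}  # a single wall segment on a tiny grid

def has_line_of_sight(a, b):
    """Crude LOS: step along the segment between a and b, fail on any wall cell."""
    (ax, ay), (bx, by) = a, b
    steps = max(abs(bx - ax), abs(by - ay), 1)
    for i in range(1, steps):
        x = round(ax + (bx - ax) * i / steps)
        y = round(ay + (by - ay) * i / steps)
        if (x, y) in WALLS:
            return False
    return True

def snapshot_for(client_pos, others):
    """Build the state update a given client is allowed to receive."""
    return {name: pos for name, pos in others.items()
            if has_line_of_sight(client_pos, pos)}

players = {"alice": (0, 2), "bob": (4, 2), "carol": (0, 4)}
# Alice can't see Bob through the wall at x=2, so a hacked client gains nothing:
print(snapshot_for(players["alice"],
                   {k: v for k, v in players.items() if k != "alice"}))
# -> {'carol': (0, 4)}
```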
When competitive games became predominantly the terrain of walled-garden, closed-platform consoles, there was no longer a reason to 'not trust the client', so all the complexity of making a game design, a network communication model, and an engine centered on distrusting the local client was tossed out. Because it's just cheaper that way.
And then anti-cheat was heralded as the solution for the not-walled-off PC platform.
As mostly an afterthought.
Money > time. Throw more money at a problem and it will solve itself. Just ask Valve.