Devs don't agree with you:
Unreal Engine 5, even with ray tracing, shadows, and everything maxed out, still doesn't require AVX2.
Same with Marvel's Spider-Man Remastered, btw (it's the only PS port with ray-tracing options currently, as far as I remember).
Nothing requires AVX2; it's a CPU instruction set that improves floating-point performance. If you remove it, performance gets worse.
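To give a concrete picture of the kind of floating-point work involved, here's a minimal, hypothetical C++ sketch (the names are made up, not from any actual game): one fused multiply-add intrinsic processes 8 floats per instruction. FMA shipped alongside AVX2 on Haswell-era CPUs, so you'd build with something like g++ -O2 -mavx2 -mfma.

[code]
#include <immintrin.h>  // AVX/FMA intrinsics

// out[i] = a[i] * b[i] + c[i], 8 floats per iteration.
// Assumes n is a multiple of 8 to keep the sketch short.
void fmadd_arrays(const float* a, const float* b, const float* c,
                  float* out, int n) {
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vc = _mm256_loadu_ps(c + i);
        // A single fused multiply-add does 8 multiplies and 8 adds at once.
        _mm256_storeu_ps(out + i, _mm256_fmadd_ps(va, vb, vc));
    }
}
[/code]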
Why should the people who spend money keeping their PCs updated be held back by people on decade-old systems?
That's how you kill an industry.
From the point of view of an ordinary user, this means working with video, photos, and sound, as well as with software that uses voice, face, and gesture recognition algorithms (as you can see from the list of software where AVX and AVX2 are used).
Only Horizon Zero Dawn and Death Stranding use AVX (not AVX2), and they run pretty fine with it.
Nice job quoting just a small fragment of my text. Also, terrible reply; it neither confirms nor denies the AVX removal.
I hope they don't remove it, so you can stay mad about not playing this game. I also hope more upcoming games use AVX2; it was about fking time some dev finally started using this after 9 years of nobody touching it since its release.
Can you say this in the thread I created to support AVX2? Poetry, my friend.
A lot of software actually uses AVX2 (optionally or not; see the detection sketch after this list); it's just not games:
- Blender
- Bloombase
- Botan
- x264, x265 and VTM video encoders
- Various CPU-based cryptocurrency miners
- libsodium
- glibc
- Linux kernel
- EmEditor
- Microsoft Teams
- Pale Moon
- Tesseract OCR
to name a few.
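For the "optionally" case, the usual pattern is to check for AVX2 at runtime and fall back to a plain code path when it's absent. A minimal sketch, assuming GCC or Clang (the function name is made up; MSVC would need a __cpuidex-based check instead):

[code]
#include <cstdio>

// True when the running CPU supports AVX2.
// __builtin_cpu_supports is a GCC/Clang builtin.
bool has_avx2() {
#if defined(__GNUC__) || defined(__clang__)
    return __builtin_cpu_supports("avx2") != 0;
#else
    return false;  // conservative fallback for this sketch
#endif
}

int main() {
    // Real software would select an AVX2 or scalar code path here.
    std::printf("AVX2 %s\n", has_avx2() ? "supported" : "not supported");
}
[/code]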
I know that; I said it in my first reply to this thread. Hell, even some Skyrim mods use it, like Fast HDT-SMP. Some emulators love AVX2 too, like yuzu (and the PS2 emulator, for some reason), and an RGB keyboard software called SignalRGB uses it. And don't forget Maya's integrated remesher.
I even have 700+ hours in Blender.
The thing is, this tech has not really seen use in gaming. I don't even care about this game; it only caught my attention because it's probably the first non-emulated PC game to use AVX2 in the 9 years that thing has existed.
Maybe because it's not actually all that suitable for gaming applications?
ROOOOFFFFLLLLL You just reinforced the point of """BAD PORT""". Regardless of whose fault it was, it was a bad port! O lordy.
BTW, if Iron Galaxy was so hot, they would have known that they couldn't properly build the port without the aforementioned technical support . . . .
Also, supporting multiple configurations is a hassle. It's not free. Every extra config means extra testing load, which in practice can easily mean "just test less per config." Depending on the workflow, it might also increase build times, which is also a real productivity killer.
There's an AVX2-specific wrinkle here: early AVX2 CPUs had some power-consumption-related trickiness running AVX2 (heavy 256-bit work forced the clocks down), which led to a situation where using it sparingly could perform worse than not using it at all. That perhaps explains why it wasn't a very attractive tech to start sprinkling in, and why it took so long to gain traction. It's also the kind of thing people didn't always test when overclocking, which could lead to stability issues if your program happens to be one of the first to use it; and that's a support hassle.
From my perspective: I build things to be used not just now, but for at least a few years into the future. It's pretty normal to me to make fairly stringent feature requirement cutoffs when those have any significant technical benefit, because by the time the software is widely used, those will bite slightly less, and over the lifespan of the software it's a shame to make everyone suffer worse performance just to support a tiny fraction of hardware. But the exact cutoff naturally depends on all kinds of things, including engineering resources, how useful the new shiny tech is in this use-case, how widely used the older unsupported platforms are, whether the older platforms are likely to be upgraded, if there's a workaround, etc etc etc.
In general: if you want quality, fast software, be careful demanding support for niche hardware configs that require these kinds of build system changes. Sometimes support for old stuff is easy: fine! Sometimes it's not; and as an outsider (even if you've got ample technical background, as I do) it's generally hard or impossible to judge stuff like this; it tends to depend on real nuts-and-bolts details of the exact system you're building.
As a dev, I'm actually more excited about AVX-512 than I ever was about AVX2. That may sound like just yet another wider SIMD extension, but it's actually a lot more flexible than AVX2, which means it's more broadly applicable too, and it can perform a lot better in somewhat more complex scenarios. Now if only it were actually broadly supported by CPUs...
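To show one bit of that flexibility (a hypothetical sketch, nothing from this thread): AVX-512 has per-lane mask registers, so a conditional update that would take an extra compare-and-blend dance in AVX2 is a single masked operation. This assumes a CPU with AVX-512F, a flag like -mavx512f, and arrays of at least 16 floats.

[code]
#include <immintrin.h>

// out[i] += x[i], but only in lanes where x[i] > 0 (16 floats at a time).
void masked_add(float* out, const float* x) {
    __m512 vx   = _mm512_loadu_ps(x);
    __m512 vout = _mm512_loadu_ps(out);
    // The comparison yields a 16-bit mask register, one bit per lane.
    __mmask16 positive = _mm512_cmp_ps_mask(vx, _mm512_setzero_ps(), _CMP_GT_OQ);
    // Lanes whose mask bit is 0 keep their old value; no separate blend needed.
    _mm512_storeu_ps(out, _mm512_mask_add_ps(vout, positive, vout, vx));
}
[/code]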
(Sorry, couldn't resist a Billy Madison quote)
On a serious note though AVX2 is not obscure, nor has it been replaced by AVX-512. The latter has specific use cases, mostly in HPC, but no game dev is going to compile their project for AVX-512 given that 99.9999% of retail SKUs don't support it.
In almost all cases, and again certainly for a game dev, the instruction sets used are also not hand-picked explicitly. It's decided when the project is compiled: the optimiser, guided by the build settings, decides what low-level instructions to utilise for specific problems.
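To make that concrete, here's a hypothetical sketch (made-up function, not from UE4): the source contains no SIMD at all, yet built with g++ -O3 -mavx2 (or MSVC's /arch:AVX2) the optimiser is free to turn the loop into 256-bit AVX2 instructions on its own.

[code]
// Plain scalar code: no intrinsics, no instruction-set assumptions.
// Whether this becomes AVX2 is decided entirely by the compiler,
// based on the target flags it was given.
void scale_and_add(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = 2.0f * a[i] + b[i];
}
[/code]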
As an example, in UE4 it would simply be a case of setting the relevant build rule (bUseAVX) appropriately prior to compiling[docs.unrealengine.com].
I have a 10+ year-old CPU and I can play most modern AAA games just fine after spending a bit of time tweaking the settings. Even games that originally required AVX2 and were later updated to not require it usually run fine.
$599 is a non-trivial amount of money for most people. Do you really think this hobby should be limited solely to the middle class and up?