Why do people whine about this? It's not even that big of a deal. I'm on a 7900 XTX and get well over 100 FPS (up to 130ish) maxed out, and that's on Linux, which usually struggles even more with RT. So a 2080 can run it without an issue. Also, all AMD GPUs next year will get improved RT, same with Nvidia, and Intel's recent cards have improved RT. Get with the program if your card can't handle it; RT seems like it's gonna be built into a lot of future games.
https://www.youtube.com/watch?v=UEJU5gMRBz8
It's only going to be the future if devs want to sell to a niche, wealthy fanbase and ignore the majority of buyers. I'm pretty sure the 1060 is still the most popular graphics card on Steam. It's what I have, and for literally everything else it's still good enough to run things at highest or near-highest settings and hold a solid 60 FPS at 1080p. Newer graphics cards might be better for 1440p or 4K (no idea if anything runs well on ultra at 4K, tbh), but I am not going to replace a nearly new $1000 gaming laptop just because one game is stupid enough to think ray tracing is the future.
Even in games that make it optional, it was never a good idea. It looks uglier than normal graphics, and at the cost of a severe performance hit. Even my brothers, who have ray-tracing-capable cards, rarely use it because of the performance drop. It's not worth it.
^^^ This. Exactly.
Most people who actively buy new games actually have hardware capable of running this game.
The hardware in question is 6 years old, and the average upgrade timeframe is 4 years. If you don't want to play recently released AAA games, that's fine; if you do, you should stick to the 4-year upgrade timeframe.
4 years? LOL, you think it's reasonable to be forced to upgrade every 4 years? You sound like one of those Apple devotees who throw out their perfectly good year-old phone just because a slightly shinier one came out.
The whole point of buying an expensive rig is that...get this...you don't have to upgrade it. It should last a long time, and normally it does. If MachineGames didn't arbitrarily gatekeep, I'm sure my hardware would run it fine, because it runs everything else just fine.
Why should the passage of time matter, anyhow? This isn't like the '90s or early '00s, when graphics and processors made huge leaps in fidelity and power (respectively) every couple of years. Graphics haven't really improved much in the last 8-10 years because they already looked fantastic then, so there really isn't much room left to go. Nor have recent processors delivered huge performance gains.
The extra power of recent GPUs over a 10-series is marginal outside the RTX gimmick, which most people never cared for anyway. Maybe they're good if you have a high-refresh-rate monitor or a UHD resolution, but since my monitor is Full HD and 60 Hz anyway, it makes no difference to me.
HAHAHA, nope, that's never how this has worked.
Nope, it's not arbitrary at all. And sure, you can make your PC last longer, 6 years instead of 4, but you can't go much higher than that if you want to keep playing new releases.
yeah no... this is a very uninformed take...
HAHAHAHAHA, yeah no... I used to have a 1080 Ti, and even that GPU was showing its age 3 years ago.
Because it was not just fidelity that changed around the turn of the millennium; the hardware and software APIs also changed a lot. DirectX 8 and 9 broke many things, but they also enabled better graphics thanks to new features.
The reason the past 10 years have looked quite good is that there was a huge upgrade in graphics hardware and software 10 years ago, but if you bought your GPU back then, maybe you didn't notice.
Another huge upgrade came in 2018-2019, but it was not mandatory; it's only starting to become mandatory now, 6 years later. That's because new games that started development 4 years ago have been built on the new hardware.
That's not what it needs raytracing for. What you are describing are the optional extras needed for path tracing in this game. Raytracing can be as little as basic bounce lighting for global illumination; this game doesn't require you to do full raytracing.
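To make that distinction concrete, here's a toy numeric sketch in Python (my own illustration, nothing from id Tech or the game itself): with basic bounce lighting you trace one indirect bounce and stop, while full path tracing chases the whole series of bounces. The albedo and light values are invented for the example.

```python
# Toy illustration (not the game's renderer): the gap between "basic
# bounce lighting" GI and full path tracing, in an idealized closed room
# where every surface has the same albedo. Each bounce reflects ALBEDO
# of the light it receives, so the totals form a geometric series.

ALBEDO = 0.5   # fraction of incoming light a surface reflects (made up)
DIRECT = 1.0   # radiance arriving straight from the light source

def gi_one_bounce(direct: float, albedo: float) -> float:
    """RT GI in the cheap sense: direct light plus a single traced
    indirect bounce, then stop."""
    return direct + direct * albedo

def path_traced(direct: float, albedo: float, bounces: int) -> float:
    """Full path tracing: keep following bounces; converges toward
    direct / (1 - albedo) as the bounce count grows."""
    total, carried = 0.0, direct
    for _ in range(bounces + 1):  # bounce 0 is the direct light itself
        total += carried
        carried *= albedo
    return total

print(f"one-bounce GI:   {gi_one_bounce(DIRECT, ALBEDO):.3f}")   # 1.500
print(f"path traced (8): {path_traced(DIRECT, ALBEDO, 8):.3f}")  # 1.996
print(f"analytic limit:  {DIRECT / (1 - ALBEDO):.3f}")           # 2.000
```

The required RT in this game is closer to the cheap first number; the optional "FullRT" path tracing mentioned later in the thread chases the full series, which is why it costs so much more.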
Only general-purpose engines like Unreal Engine reduce hardware raytracing to a checkbox; in every other engine, the developer can decide where raytracing will be used.
Also, people have been hating on the Unreal Engine look, but it didn't even use hardware raytracing by default until this month. Until now, everybody got software raytracing instead, because people have been refusing to upgrade to new hardware. But even they now see that hardware raytracing has enough adoption.
This game is made with the id Tech engine, which only recently got raytracing for reflections (in a Doom Eternal patch), so MachineGames probably had full control over what they raytrace and what they don't.
The 4090 is 3 times faster than the 1080 Ti.
Wouldn't that mean that the 1080 Ti could be even more performant at 1080p than the 4090 at 4K, given that 4K is 4x 1080p?
Recent hardware needs to support much higher resolutions and much higher framerates; that's usually where the extra power goes. At least that's my assumption, given that my dusty 1080 can still play pretty much everything recent at 1080p (and the Steam Deck can run most recent games at 720/800p).
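A quick back-of-envelope check of that 3x-versus-4x question (my own arithmetic, under the crude assumption that frame cost scales linearly with pixel count, which real games only roughly follow):

```python
# Back-of-envelope: "3x faster GPU at 4x the pixels" from the posts above.
# Crude model: assumes frame cost scales linearly with pixel count
# (ray tracing, geometry, and CPU work don't all scale that way).

RES = {"1080p": 1920 * 1080, "4K": 3840 * 2160}
SPEEDUP = 3.0  # "the 4090 is 3 times faster than the 1080 Ti", as quoted

pixel_ratio = RES["4K"] / RES["1080p"]   # exactly 4.0
per_pixel_budget = SPEEDUP / pixel_ratio

print(f"4K has {pixel_ratio:.0f}x the pixels of 1080p")
print(f"4090 at 4K: {per_pixel_budget:.2f}x the per-pixel budget "
      f"of a 1080 Ti at 1080p")
# -> 0.75x: if the quoted 3x figure holds, the 4090 at native 4K has
# *less* headroom per pixel, which is part of why upscalers like
# DLSS/FSR render below native 4K and reconstruct the image.
```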
That said, if they decided to save development costs because they didn't want to spend the extra effort of supporting baked lighting, that's their right. They'll certainly lose out on quite a few sales, but they probably decided that the loss in sales is negligible compared to the dev cost of implementing fallbacks. Maybe someone will make a mod somewhere down the line that removes this requirement; if not, well, there are other games to play. Making an Indiana Jones game first-person was already a weird choice; not supporting older hardware is just another one. They seem to know they have a large enough customer base to afford these shenanigans. Good for them.
I am pissed off with this nonsense requirement. The graphics are not that impressive compared to other games like Assassin's Creed Valhalla either. It is all nonsense hype.
All in all, about 60% of Steam users can play this game. And that number is only going to rise.
Less than 17% are still using GTX cards...
I'm using "FullRT" path tracing and to me Indy looks much better...