The Series S can run the game but the 1660 Ti can't?... Good lord, the consoles just cover for the devs' inability to optimize anything.
Actually, the GTX 1660 does support raytracing, but only in DirectX 12. Vulkan doesn't expose a raytracing API for the GTX 1660.
Even if there WERE a Vulkan API for this, the cards would still be WAY too slow to run this game.
The Xbox Series S on the other hand uses AMD's RDNA 2 architecture and supports hardware accelerated ray-tracing...
Doubt it uses shader cores (is that even a thing anymore?), more likely CUDA cores (compute cores), which Nvidia has used for raytracing with OptiX since 2009.
The amount of raytracing Indiana Jones uses for global illumination is so low that the GTX 1660 could probably handle it, if the engine used DirectX, or if Vulkan exposed VK_KHR_ray_query for that GPU.
The GTX 16 series doesn't do software emulation, it does API translation, just like Proton translates DirectX to Vulkan. In the same way, shader work nowadays isn't emulated, it's scheduled onto CUDA cores. Yes, CUDA cores aren't as performant for raytracing as RT cores, but it works if the workload is low.
The Steam Deck has only 8 RT cores, and the Xbox Series S has around the same (it's about as powerful as a GTX 1660 Ti or an RTX 2060 Mobile). Both can run Indiana Jones. The GTX 1660 Super and the Steam Deck can also run Quake 2 RTX at low resolution and 30fps, and Quake 2 RTX isn't just doing global illumination, it does full path tracing.
Yes, the tech is old now (8 years), so slowly we're starting to see the first games depend upon it instead of just having it optional.
I mean consider that the STEAM DECK has hardware-support for it!
https://www.nvidia.com/en-us/geforce/news/gfecnt/geforce-gtx-dxr-ray-tracing-available-now/
In that article from nVidia you'll also find lots of performance numbers. The frame example from Metro Exodus is quite interesting. That game also uses ray-traced global illumination. These rather low-end GPUs would have to render the entire game, which is SIGNIFICANTLY more complex than Quake 2 RTX, while "wasting" a large part of their limited power on emulating ray tracing.
It might be possible to run Indiana Jones that way, if there was support for this in Vulkan, but I doubt it would be much fun...
GTX cards might not be dead yet, but they're dying fast... If you want to play this game, don't wait or hope that someone wastes time, energy and money on making it run on old hardware. I think you can play it right away with GeForce Now. Or you can wait for your next GPU upgrade. Whichever current or new card you'll buy then will support hardware ray-tracing and can run the game. And you'll probably get the game cheaper then, as well. :)
Again, it's not emulated, it's just translated to different hardware.
"raytraced global illumination" doesn't mean "have to render the entire", raytraced global illumination is just basic diffuse lighting, rest of the render is still rasterization. Path tracing in Quake 2 is way more complex than some global illumination because path tracing involves multiple light bounces and color bleed.
The GTX 16 series is not dead. It uses the same architecture as the RTX 20 series, which means it has the same API features; it can even do AI upscaling without RT cores in GeForce NOW. The GTX 1080 can too; the rest of the GTX 10 series can't, because it's disabled on them for performance reasons.
Yeah, it's the castrated low-end version of the RTX 20 series: no RT cores, no Tensor cores, no DLSS, and only 6 GB of VRAM. As I said, it might not be entirely dead yet, but it's dying fast.
I had an overclocked RTX 2070, which is at least 50% faster than any GTX 16, much more so with ray tracing, and even that is pretty much dead if you want the latest games with high settings AND high FPS.
I currently have an overclocked RTX 4070 Ti Super which will last for a bit. I guess my next card will be an RTX 50 "Super"/Refresh or an RTX 60. Depends on future games and possibly on my next monitor. ;)
The xx60 and xx70 variants are totally fine; these are entry-level GPUs meant for lower target resolutions. If you find them slow, your graphics settings are probably too high. The GTX 10 series might be dead because it uses an old architecture that's missing some important features, but the GTX 1080 is so over-powered that it's an exception in that series.
Unreal Engine 5 has "software" global illumination and even some smaller indie games like Tiny Glade have it. In reality, they just rely on compute cores, which is still hardware, just not dedicated for RT. But GTX 16 series doesn't have to do any tricks for raytracing, it has DirectX raytracing API support.
The GTX 1080 Ti was a really good card in its time. It's often almost twice as fast as the GTX 1660 Ti when it comes to DirectX ray-tracing (DXR), but it's STILL slower than an RTX 2060.
No GTX card supports DirectX 12 Ultimate. That's because none of these cards have ANY hardware ray-tracing capabilities. On the GTX 10 and 16 series, it's - according to nVidia - all just done with shaders, and thus very basic and slow compared to the RT cores on RTX 20 series cards and upward models.
Metro Exodus also uses ray-traced global illumination (RTGI), somewhat similar to Indiana Jones. According to the nVidia article* that describes ray-tracing support on GTX 10 and GTX 16 cards, a GTX 1660 Ti gets less than 20 fps at 1080p. The RTX 2060 gets more than 40 fps, almost 50 with DLSS; the GTX 10 and GTX 16 cards don't support DLSS. The difference grows with higher resolutions and better DXR quality.
The only GTX card where Metro is playable with RTGI is the GTX 1080 Ti which reaches around 35 fps at 1080p. Still not much fun though.
I think Indiana Jones would probably run equally badly or worse on GTX cards if this shader-based ray-tracing were available through the Vulkan API. If it were, the 1080 Ti would probably be the only GTX card that could run it, at low but barely playable frame rates. Not really worth the effort for ONE outdated former high-end GPU...
I also think the only reason for the GTX 16 series' existence was to use Turing chips with defective RT or Tensor cores for entry level cards instead of throwing them away...
* Here's the link again, in case you didn't read the nVidia article:
https://www.nvidia.com/en-us/geforce/news/gfecnt/geforce-gtx-dxr-ray-tracing-available-now/
It also says this:
Indiana Jones runs at 60fps on RTX GPUs and 30fps on the Steam Deck; the GTX 1080 and GTX 1660 wouldn't have a problem running it at 30fps. The game is basically a walking simulator with some low-framerate punching animations, it doesn't need to be played at 60fps.
Yeah, Unreal Engine does all that and more in "software". So there's no point comparing everything with Metro Exodus. Also, there's a reason Low and Medium settings and lower resolutions exist; entry-level GPUs aren't meant to run 1440p or 4K on High and Ultra presets.
FP32 Pascal shader cores are CUDA cores (compute cores). The article is written for noobs, for whom it doesn't matter how it's "emulated".
This isn't true. Yes there is a lot of walking (but it's definitely not a walking simulator, lol) and brawling, but the brawling tends to be against multiple enemies and you need to move around a lot, and also there are several fast paced action set pieces throughout the game.