It's time to upgrade from your 8-year-old card if you want to play the full breadth of games. More and more games will be RT-only from now on to save on dev time and costs.
The new Intel Arc cards seem to be a good, affordable entry-level option.
Also, what is elitist about requiring a SIX-YEAR-OLD GPU?
Because it's unnecessary for literally every other game out there, so why would I upgrade? Only people who do are enthusiasts who just have to have 4K or 1440p or high refresh rates.
You say "six years" as if the passage of time should make any difference. The 20-series has basically no real graphical advantage over the 10-series except that it could in theory run ray-tracing (albeit so poorly and with such high performance costs, and usually thrown in games as an afterthought anyway) that it was entirely pointless.
I usually ignore "min requirements" because even when they list a CPU or GPU above my specs, my machine usually runs just fine anyway. They claim "minimum" but it's really more of a recommendation 99% of the time. Nobody else has ever been so audacious as to actually refuse to start at all unless you have a gimmicky RTX card.
In my defense, they also apparently didn't even announce it would be RTX-only until 2 days before release. Seems like a bit of an oversight.
And yes, the passage of time DOES make a difference, especially when it comes to software and hardware. Don't be ridiculous.
You are part of the reason games are being held back with this "it's not necessary" attitude. Dude, you are in an ever-evolving hobby, a hobby that is quite luxurious and requires keeping up with the times.
RTGI is visible by default
Even if I had to run on medium-low @ 30 FPS I really wouldn't mind. It's not a shooter, right? I only care about a good frame rate in FPS games. In the adventure genre, 30 is good enough.
It's, I would say, a 90% adventure game in the spirit of old LucasArts, with some forced action through story events or combat by the player's choice. There is pretty much always a stealth/disguise option; that said, I liked the action in this one.
Played at 1440p ultra settings with a 60 fps cap just fine.
30 fps might look bad because of the first-person view. It doesn't matter much for third-person games with a proper motion blur implementation.
Indiana Jones is also running on the id Tech 7 engine. They keep doing it.
As I've said many times before, it's really not comparable. Graphics and computer processing power improved very quickly in the 90s and 00s. Just 3-5 years makes a massive difference. Today is different. There is literally no difference in graphics between today and 5 years ago. There is a nominal difference between today and 10 years ago. Graphics already peaked a long time ago and it's increasingly diminishing returns.
I don't even think RTX is an upgrade. In every scenario I've seen it applied, it literally looks worse. Compare even Indy RTX vs PTX, and the PTX has worse shadows and lighting and fewer reflections. It looks goofy how shadows jump around constantly with any slight movement. Ray-tracing is not designed to look good in games. It's designed to let developers be lazier by using one global lighting solution instead of actually designing lighting and shadows for the map (baked in). It's the same lazy attitude of "who cares about optimizing file size or load times, because hard drives are less expensive now?"
If you want to talk about irony, Quake 3 was famous for Carmack discovering revolutionary new ways of optimizing the game to run faster.
Misinformation.
I do find it odd how (and why) people manage to write whole paragraphs of just wrong stuff (and I mean all of it). Whatever. Go get some education and/or experience in the industry.
What ways are those? And why is it ironic?
If you can't afford to upgrade your GPU more often than once every half a decade, perhaps PC gaming isn't for you and it's time to invest in a console instead?