John Carmack had Doom Eternal running on a 4-year-old GTX 1050 at 1080p/60.
Kojima had Death Stranding running on a 5-year-old PS4 (GTX 750 equivalent) at 1080p/60.
STALKER 2 needs a 1-year-old 4070 to hit 1080p/60, and the 3+ year old GPUs need fake frames and fake resolutions to come close to that same 1080p/60 mark.
Whatever you put up with, you end up with.
Death Stranding is empty, so don't compare.
They obviously want to push their newest lines, not show how good their old stuff is, so you go buy the new cards instead.
Everything? "Your 3-year-old card is nearing the end of its life cycle, please purchase a new one." We as consumers miss out on higher-quality products, and you're defending that?
1080p/60 on a GTX 1050 in DE is only possible on the lowest settings with DRS (which frequently drops the resolution in half in heavy scenes), meaning it's actually 540p/60 most of the time (rough pixel math below). And DS is 1080p/30 on PS4, not 60.
Meanwhile, the STALKER 2 benchmark is stated to be at MAX settings, which could easily mean those same numbers (1080p/30) are achievable on old hardware at the lowest presets.
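For reference, a minimal sketch of what a 50% dynamic resolution scale means in raw pixels, assuming (as most DRS implementations do) that the scale is applied per axis; the 50% figure is the one cited from on-screen metrics in this thread, not an official spec:

# Effective render resolution under a per-axis DRS scale factor.
def effective_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Return the internal render resolution for a given per-axis scale."""
    return int(width * scale), int(height * scale)

native = (1920, 1080)
scaled = effective_resolution(*native, scale=0.5)

native_pixels = native[0] * native[1]
scaled_pixels = scaled[0] * scaled[1]

print(f"Native: {native[0]}x{native[1]} = {native_pixels:,} px")
print(f"At 50%: {scaled[0]}x{scaled[1]} = {scaled_pixels:,} px "
      f"({scaled_pixels / native_pixels:.0%} of native)")
# Native: 1920x1080 = 2,073,600 px
# At 50%: 960x540 = 518,400 px (25% of native)

So a 50% scale means rendering at 960x540 (540p), only a quarter of the native pixel count.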
id Tech 6 and 7 reuse code written by Carmack.
Decima was improved upon by Kojima Productions, led by Kojima, whether he wrote the code himself or directed his team to.
The point is this - all those games were applauded for their graphics while being able to run on a toaster. Can you say the same thing for STALKER 2?
Here's DE on a 1050 running at 1080p/high/60fps, no dynamic res:
https://www.youtube.com/watch?v=3yDK4675yIw
Oh, excuse me, for DS then: 30 instead of 60 on a 6-year-old GPU, while being one of the most visually impressive games of its time, is a feat in itself. Today a 1-year-old GPU can barely run the game at 1080p/60. That's sad.
There's little difference between low and max settings in modern games anymore, unlike 10-20 years ago when devs made different textures for each setting. There are no MSAA settings either; it's all a blurry mess with TAA, because we traded a real anti-aliasing solution for slightly better lighting, so we're stuck with the motion clarity of 720p in a lot of games.
Not only is this a 1050 Ti (which is roughly 25% faster than the base 1050), it is clearly using dynamic res: the on-screen performance metrics are almost permanently locked at 50% resolution during combat, with FPS often dropping below 60 and even into the high thirties. So it isn't even a stable 540p/60 on a faster GPU, let alone 1080p/60 on a 1050.
This is as much a consequence of having very plain, simplistic environments as it is an actual technical feat.
It isn't even out yet, so what if we actually can, once we see benchmarks of the lower presets?
1050 or 1050 Ti, they're both budget cards with maybe a $40 difference. Stop apologizing for lazy developers and poor code.
My guy, doesn't it concern you that there are barely any gameplay vids or news a week from release? And the devs changed the minimum requirements a month before release? That doesn't scream stability issues?