The nearest comparable card is a 3080; there's essentially no difference between my 6900 and a 3080. The biggest difference is that the 3080 uses 3rd-gen RT and mine uses 2nd-gen RT.
Despite this, I am unable to run RT at a satisfactory FPS, and that is intolerable. In case you're wondering, the RTX stands for "ray tracing extreme", and if a card specifically designed for RT can't run RT, then what do you do?
Other games run A-OK with RT enabled; indeed, one of them, Metro:Enhanced, demands more of the hardware than Witcher 3 does, even after the update.
I run Metro:Enhanced at higher settings than I run Witcher 3, with RT enabled AND satisfactory FPS... To quote He-Man: "I HAVE THE POWER".
BTW, I consider anything below 120 FPS to be pretty slow, except that some games, e.g. the Fallout games, don't really give you a choice, but for that sort of game 60 FPS is plenty.
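For context on those cutoffs: FPS is just the reciprocal of frame time, so each target translates to a per-frame rendering budget. A minimal sketch of that arithmetic (illustrative only, not from the thread; the function name is hypothetical):

    # Illustrative only: what the thread's FPS targets mean as per-frame budgets.
    def frame_time_ms(fps: float) -> float:
        """Milliseconds available to render one frame at a given FPS."""
        return 1000.0 / fps

    for fps in (120, 90, 60, 20):
        print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
    # 120 FPS leaves 8.3 ms per frame, 60 FPS leaves 16.7 ms,
    # and a dip to 20 FPS means 50 ms spent on a single frame.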
Even without RT enabled, the DX12 version drops to ridiculously low FPS even on high-end machines; it's not just in people's heads.
Why is that so hard to understand? Maybe if we both say it, it'll sink in. Except I'm getting 140 FPS in the countryside and low 90s in towns on DX12.
What I believe he's saying is that when someone in the Steam forum complains about DX12 Ultra+ performance AND is also flexing about their rig... it's much more statistically probable that they are rocking a 2nd-gen i5 or i7 and a 1650... not the latest-gen i7 and an RTX 3xxx/4xxx series card, as they may claim or insinuate.
I'm rocking an i9, ACTUALLY. So now wot?
So, CDProjekt lied.
People have a right to be upset.
Oh absolutely, they're right to be. My point is that it doesn't really matter how powerful your machine is, lol.
My machine is mid/high-end with a 3800X and a 3080 12GB, and I'll see dips into the 20s in Novigrad without RT enabled on DX12 (4K). With the current DX11 version I get a solid 60+ FPS with everything maxed/Ultra+ (4K).
People with 4080s, 4090s, etc. still struggle to maintain a solid FPS. I'm not particularly bothered by it, though; it still looks great in DX11 mode, so I'll still finish this playthrough.
Those are some expensive frames. I miss Crysis.
From a thread before the 'patch' release... Do they have a right to be "upset", or are they just hardware-ignorant?
I certainly feel lied to. I have an i7-10700K and an EVGA FTW3 RTX 3080. If I use ray tracing, the game is simply unplayable. In the officially released requirements for ray tracing, my hardware is right there.
So yeah, they lied.
What if I told you I KNOW you are lying? I am playing on an MSI Raider GE76 12UHS laptop: 32 GB RAM, a 3080 Ti with 16 GB GDDR6, and an i7-12700H. I am playing next-gen on max settings over HDMI to a TV at 1080p, getting 53-60 FPS depending on location. I can link 3DMark results and video on request.
Unplayable? You ARE lying.
This is the million-dollar question. Why? Misinformation and disinformation are two different things. Someone could be saying they never lied, but just didn't tell the WHOLE truth. (The game is stacked with outdated mods.)
Omission of data can tell a completely different story when paired with the right narrative and a willing audience.
I completely understand your message; that's why the OP. Think about the data I linked and the mindset you just described, knowing that the data points to a certain range of GPUs being used by the large majority on Steam globally, while people claim that GPU/CPU-heavy AAA games are "unplayable" for them.
It's very telling of today's PC players' lack of understanding of their own hardware relative to their expectations of software performance, and of how they blame studios for it, masked in aggrieved diatribes.
Eye-opening, really.