So don't assume that I'm trying to flex, because I'm not.
I have a 3080 and an i9 9900K and I'm averaging in the 80s, maxed out at 4K with DLSS Balanced. It should be noted that changing the graphical options or using the console in-game seems to lead to performance degradation, so experiment and restart before playing.
Lolwut? The C2 remaster looks like a bad retexture; everything looks dull and lifeless, and the lighting is even worse.
I still play the originals once a year and I've had W10 + same rig for 6 years, no issues to speak of.
And AMD sucks anyway, what else were you expecting?
Crysis 1 Remastered runs superbly maxed out now, after all the patches.
A 4090 does 4K60 maxed out with RT and no DLSS and it's GPU bound.
No it doesn't. Probably drops more on later levels as well.
Otherwise, he is GPU bound.
It lasted a matter of seconds (well, it was longer because I stopped to fiddle with settings to see if I could figure out what was going on). It seemed to be CPU related, as turning DLSS on/off (Quality) had no impact on FPS, and dropping to Balanced gained me 2 fps, lol. The CPU didn't seem stressed, but I would guess the game is not well threaded and that the cores it was using were maxed out (I don't monitor individual cores, but overall power and usage had plenty to spare).
Anyway, a few seconds out of the entire game is hardly worth crying over. It ran great, even on the meme setting. I had it capped at 95 fps because using HDR with 4:4:4 limits my display to 98 Hz, and that needed less than half the GPU's TDP (typically 170-210 W).
It's just sad that C2R removed HDR. Strange, since it's there on console and in C1R on PC. But this one is also running well. I haven't played much, but I noticed a 70 W drop just from reducing RT from 'Very High' to 'High' with everything else still on 'Very High'; I might stick with that since I can't see the difference.
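For what it's worth, the ~98 Hz figure with HDR and 4:4:4 lines up with plain link-bandwidth math. A minimal sketch, assuming a 4K panel on DisplayPort 1.4 (HBR3) without DSC, since neither the resolution nor the connection is stated in the post:

# Rough estimate of the max refresh rate a DP 1.4 link can drive at 4K with
# 10-bit (HDR) 4:4:4 color and no DSC. The figures below are assumptions for
# illustration, not details taken from the post.
link_gbps = 8.1 * 4                       # HBR3: four lanes at 8.1 Gbit/s each
effective_gbps = link_gbps * 8 / 10       # 8b/10b encoding overhead -> ~25.92 Gbit/s
bits_per_pixel = 3 * 10                   # RGB / 4:4:4 at 10 bits per channel
h_total, v_total = 3840 + 80, 2160 + 62   # approximate CVT-RBv2 blanking
max_pixel_clock = effective_gbps * 1e9 / bits_per_pixel
print(round(max_pixel_clock / (h_total * v_total)))   # ~99 Hz

Dropping to 8-bit color or 4:2:2 chroma frees enough bandwidth for the panel's full refresh rate, which is why the HDR + 4:4:4 combination ends up capped just under 100 Hz.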
That said, most of the performance comes down to three settings: objects, shadows, and vegetation. Lower those and you'll gain a lot of performance. I was running objects on High and shadows and vegetation on Medium for my last run, and at 1440p on a 5600X with a 3070 Ti I was getting 100+ FPS even in the heavier sections of the game, and didn't get any stutter at the hill above the town in Recovery.
Those settings are quite comparable to the original game at max settings in terms of draw distance, yet the original drops like crazy in many places, so the remaster is definitely an improvement. On higher-end CPUs I suspect you'll manage High for all three settings, or maybe Very High if you're targeting just above 60 FPS, and it'll look much better than the original in terms of draw distance.
I'm guessing they're talking about GPUs, but AMD is actually pretty dominant in the CPU market right now. AMD's best competition in CPUs is another AMD chip. Watch Linus Tech Tips.
Also, while AMD GPUs lack feature parity with Nvidia's offerings (DLSS, ray tracing), they are actually a good value and offer better performance for the money if you limit yourself to rasterized graphics without ray tracing and DLSS. There's not necessarily anything wrong with that, because in most games the average person probably can't even tell the difference between rasterized and ray-traced graphics. Linus Tech Tips tested this and found that even many of their tech-savvy staff couldn't tell the difference, and Linus agreed that AMD GPUs will remain a good value relative to Nvidia for some time for those who want decent performance for less money.
Linus himself acknowledged this and pointed out the good value offered by AMD, even if they aren't the literal best. So this "AMD sucks" mentality is uninformed garbage.
AMD also works much better for Linux gamers.
And I say this as a guy running an RTX 3070.
5700 XT, R5 3600, W11 22H2, drivers 22.11.1.
I'm playing the game at 900p, TAA, with FSR 1.0 injected through the driver, the game maxed out, and RT set to High. I'm getting a locked 40 fps (it can go to 50-60 in spots; very rarely it drops to 30-35) on a 120 Hz monitor. Perfect frametimes, no stuttering or judder. If I drop to 720p, I can do 60 fps.
Completely GPU bound. CPU mostly sleeps at 15% load.
Game looks and runs phenomenally well. I actually played and finished Crysis 1 Remastered at a similar combo of 900p40 with RT, but I had to lower Object / Shadows / Vegetation a notch there. RT was also High (+ RT Overdrive).
GPUs with hardware acceleration for RT (though I can't tell if that's Nvidia exclusive or if it's also accelerated on RDNA2) should run better than mine. I'd probably be demolished in performance, and with better visuals, by a 2060 running at 720p internal (1080p + DLSS Quality).
I must've played Crysis 2 easily 10 times or more.
Played it at launch on PC on DX9. Played it with the DX11 patch and upgrades. Played with MaldoHD v4 maxed out. Played it in 2020 with my current 5700 XT with MaldoHD at 3200x1800 at 80-120 fps.
With the Remaster, I'm all here for the graphics. I could find a combo of higher resolution and 60-120 fps in the Remaster, but I want the maxed-out RT experience. So I'm fine with 40 fps on a 120 Hz monitor and FSR 1.0 Ultra Quality for basically 1080p-level resolution.
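For context on the upscaling numbers: FSR 1.0's quality modes are defined by fixed per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x). A quick sketch of how a 900p internal resolution relates to those modes; the poster's exact monitor resolution is an assumption here:

# FSR 1.0 per-axis scale factors as published by AMD.
FSR1_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def upscaled_resolution(in_w, in_h, mode):
    """Output resolution that a given internal resolution maps to under a mode."""
    s = FSR1_SCALE[mode]
    return round(in_w * s), round(in_h * s)

# 1600x900 under Ultra Quality comes out to roughly 2080x1170, i.e. a touch
# above native 1080p, which matches the "basically 1080p" description above.
print(upscaled_resolution(1600, 900, "Ultra Quality"))   # (2080, 1170)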