Doom: The Dark Ages doing it wouldn't be so bad. The issue is that almost every new UE5 title acts like a "Crysis" that demands a new rig when people just want to play games. If the occasional highly demanding title broke new ground or set new trends, that'd be fine. In a way, that's why people didn't complain as much about Alan Wake 2 (the real complaint was it being exclusive to the Epic Games Launcher). But now it's "upgrade your GPU every year" or be left behind in terms of even being able to play a game.
It really makes me start to hate gaming in general. The consoles aren't even close to 40xx-series levels of performance, yet PC gamers have to pray to the GPU gods just to have a playable game?! If consoles set the baseline standard, then that means too many studios are content to churn out console-port slop and let high-end GPUs "brute force" decent performance. No thank you. I'll just go back to completing my backlog or buy non-RTX games.
I've had this feeling of hating PC gaming for quite a few years myself, due to the GPU. It's the most expensive component to buy, yet it's always the first one to go obsolete.
It's exactly this reason why for years, I've set the standard where if I can set everything to high (save maybe shadows and shadow textures due to ray-tracing) and the game can still run at 55-60 fps, that's fine with me.
This is why ever since I got the 1080 SC back in 2017, I didn't upgrade the GPU, as none of the games really pushed it to its limit until the Dead Space remake.
But with Doom: The Dark Ages, it's another story. The NVMe drive is one thing I'm still debating (since Stalker 2 said you must have an SSD to play the game, yet I've seen people play it just fine on an HDD), but there's no way around the GPU.
That's why you do both. G-Sync never claimed to fully eliminate tearing; it was more meant to remove the latency penalty you'd get from turning on V-Sync. G-Sync WITH V-Sync, however, removes tearing AND smooths things out while keeping latency down. Low Latency mode usually helps to further reduce whatever latency you get when you activate both G-Sync and V-Sync.
I see. Maybe if I ever return to the original Dead Space, I'll check it out.
I find even a SATA SSD would suffice. I mean, I have my OS on an NVMe M.2 SSD, but most of my games, new and old, are installed on my 2 TB Crucial MX500 SATA SSD. It's fast enough that games load into saves in under a minute most of the time. Even RDR2 now loads up within 30 seconds.
This is why I believe you can get around the SSD issue even if you don't have an NVMe drive. But I find it unlikely that Bethesda would suddenly announce a choice between turning ray-tracing on or off in The Dark Ages. After all, they didn't do that with the recent Indiana Jones game...
At the same time, though, there just aren't that many games out there where you MUST have an RTX card, though I'm curious whether The Dark Ages is going to mark the turning point where the rest of the industry starts saying their games will all have FORCED ray-tracing.
I mean, I'd say try getting a 30xx-series card, but those now sell for almost as much as the 40xx series. Still, they might actually be in stock, unlike a 40xx or 50xx card. Personally, last time I was in a Walmart, I saw what I think was a 4060 or 4060 Ti locked up for somewhere between $400 and $500. If you need JUST a 1080p card with RTX, that might be your only choice, even if the 8 GB of VRAM on the regular 4060 would be a bottleneck. The 128-bit bus probably wouldn't help either. But you won't even find a 4070 at a reasonable price, much less in stock.
Yeah, I know. But the 4060 (well, the non-Ti version) you can still get at a reasonable price. And the whole point of ray-tracing is lighting and reflections.
So the way I see it, if you turn everything up to max but then turn the lighting and reflections back down to medium, shouldn't that make The Dark Ages run just fine even on a 4060 8 GB?
I just cap it like this (60, 82, 90, 120, or 163). With G-Sync, I haven't seen it get more choppy or stuttery just because the cap doesn't divide evenly into the refresh rate. Remember that the actual FPS usually runs 2-3 frames higher than the cap, which is why Ultra Low Latency (ULL) caps you at 2-3 frames below your refresh rate.
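The cap arithmetic here is simple enough to sketch. The helper name and the 2-3 frame margin are just illustrations of the rule of thumb described above, not anything official from NVIDIA:

```python
# Rough sketch of the frame-cap rule of thumb above: cap a couple of
# frames below the panel's refresh rate, so the real frame rate (which
# tends to run 2-3 frames above the cap) stays inside the G-Sync range.
# Function name and default margin are illustrative assumptions.

def ull_style_cap(refresh_hz: int, margin: int = 2) -> int:
    """Frame cap a few frames below the refresh rate."""
    return refresh_hz - margin

# A 165 Hz panel capped 2 frames below lands at 163, matching the
# 163 value in the list above.
print(ull_style_cap(165))  # 163
print(ull_style_cap(144))  # 142
```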
Depends how much VRAM it uses. Games like The Last of Us Part I were crashing people's PCs because they would use up to 10-12 GB of VRAM even at 1080p. Simply put, these new games use 4K textures even if you don't game at 4K. So the file size adds up, as does the VRAM required to fit them.
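Some back-of-the-envelope math shows why those 4K textures add up. The numbers below are illustrative (uncompressed RGBA8, and a roughly BC7-level compression rate of 1 byte per pixel), not figures from any specific game:

```python
# Why 4K textures eat VRAM: one uncompressed 4096x4096 RGBA8 texture is
# 64 MiB before mipmaps. Even block-compressed (BC7 stores ~1 byte per
# pixel), a few hundred such textures approach an 8 GB card's capacity.

def texture_mib(width: int, height: int, bytes_per_pixel: float) -> float:
    """Size of one texture level in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

uncompressed = texture_mib(4096, 4096, 4)  # RGBA8 -> 64.0 MiB
compressed = texture_mib(4096, 4096, 1)    # BC7-style -> 16.0 MiB
print(uncompressed, compressed)

# 500 compressed 4K textures alone would already be ~7.8 GiB:
print(500 * compressed / 1024)
```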
There are no "lower resolution" textures unless you count actually lowering the render resolution to 720p or 480p, and I doubt ANYONE would tolerate playing at those resolutions. And that's not counting the downscaling DLSS already does when it upscales from a lower internal resolution to 1080p or 4K. That's why so many of these games look blurry if you aren't gaming at 1440p or 4K.