He benchmarked cards using ONE game; that is not how benchmarks are done.
1. Benchmark cards against a range of titles. You want these benchmarks to be in the same session/video/article to minimize differences in driver revisions and Windows updates between them (a rough sketch of how such multi-title numbers could be rolled up follows after this list).
2. Benchmark different resolutions and DLSS/FSR, which is where you will notice the CPU-combo issues. A lot of gamers don't care whether the game will run at Ultra on a 3080, 4080, or 4090; they care whether they will need to run at Low or Medium on an 8 GB VRAM GPU like the RX 6600.
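To make point 1 concrete, here is a minimal Python sketch of how per-title results might be summarized into one comparable figure per card and resolution. The GPU names, resolutions, and FPS numbers below are purely hypothetical placeholders, not real benchmark data:

```python
# Hypothetical sketch: summarizing multi-title GPU benchmark data.
# All numbers are made up for illustration; real runs should come from the
# same test session so driver revisions and Windows updates stay constant.
from statistics import geometric_mean

# results[gpu][resolution] -> list of average FPS, one entry per tested title
results = {
    "RX 6600 (8 GB)": {"1080p": [74, 61, 88, 53], "1440p": [52, 41, 60, 35]},
    "RTX 4080":       {"1080p": [165, 142, 190, 130], "1440p": [140, 118, 160, 105]},
}

for gpu, by_res in results.items():
    for res, fps_per_title in by_res.items():
        # Geometric mean is a common way to average across titles so that
        # one outlier game does not dominate the summary figure.
        print(f"{gpu} @ {res}: {geometric_mean(fps_per_title):.1f} FPS (geo mean)")
```

The point is simply that a single game gives you one data point, while a per-resolution average across several titles is what actually tells you whether a card is worth buying.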
I'm not buying TLOU on PC because I want better graphics; I'm buying TLOU on PC because I don't own a PlayStation, other than the PlayStation Classic I bought for $30. I'm a PC gamer, not a console gamer.
It's called damage control. ND is trying to scrub any relation with Iron Galaxy, given how well known they are for messing up previous ports.
You remember Uncharted? Well, look at the Steam page for who did that port: Iron Galaxy. The same issues, like mouse movement and stutter, that are still present in Uncharted on Steam are also in TLoU. Coincidence? Not at all.
It was Iron Galaxy. They are also in the game's credits as the PC team.
Time stamp: 18:40.
https://www.youtube.com/watch?v=N4GAGYVd4H0
There are also plenty of sites that have not redacted any of their statements to claim that Iron Galaxy didn't work on TLoU.
Here is just one example.
https://www.dsogaming.com/news/iron-galaxy-will-be-responsible-for-the-last-of-us-part-i-remake-pc/
CPU benchmarks for this game may also appear soon. My guess is that an 8-core 3700X, similar to the PS5's CPU, will be the bare minimum.
"Best value PC to run PS5 games" videos may soon appear as well.
Win 10 (Version 1909 or Newer)
Ryzen 5 1500X or Intel Core i7-4770K
16 GB RAM
RX 470 (4 GB), AMD Radeon RX 6500 XT (4 GB), NVIDIA GeForce GTX 970 (4 GB), NVIDIA GeForce GTX 1050 Ti (4 GB)
That doesn't seem to be the case according to either statement.
Standard gaming PC design is to have the GPU be the bottleneck.
Your CPU is the backbone; you want it fast enough to support GPU upgrades in the future, and then you want enough RAM.
TechPowerUp is much more valuable than this garbage video from Hardware Unboxed.
Well, depends on Friday's patch I suppose.
For the current state of the game, that is something. Hopefully Hardware Unboxed can publish something of quality and not the amateur hour that is their current video.
There are far bigger open-world games with better graphics that use less VRAM.
GAME IS BROKEN, BOY, that's it.
Then again, the video on The Last of Us by Digital Foundry must be one of the worst videos I have ever watched from them. They rambled and giggled for over an hour without doing much else but lamenting the washed-out textures on the RTX 2070 Super. But the problem with the textures was only caused because the dude was adamant about playing at 1440p with his 2070 Super. Why on earth didn't he go down to 1080p when he ALREADY KNEW VRAM was the problem?!
I have a 2060 Super, play at 1080p, have my textures on High, and have not encountered a single texture as badly washed out as in the DF video. No memory-related crashes either.
DF also said at the beginning of the video that, because of the messiness of it all, they saw no sense in working out optimized settings when many patches are yet to come out for the game. What an excuse! As if a week of patching would suddenly optimize the game!
Meanwhile, I managed to do an optimized settings guide the first day I got the game...
All in all, a very lazy and very stupid video by Digital Foundry.
So it is simply not true that "both statements are probably correct". It is true, though, that both videos are based on faulty and fragmentary testing, and hence both make claims that can be neither true nor false in any objective sense, since the tests themselves were not objective.
The dude is known to be a cheap woke activist, so no surprise there.