If you go Ultra for textures, you shouldn't be surprised. Ultra textures (or any other level) are the same at any resolution, so using them at 1080p is 1) a waste of VRAM and 2) a waste in general, because you'll see little difference from "High" (or even lower) at 1080p.
Same for many other options. Just stop thinking resolution is the only value that impacts performance.
I really think the PS5 equivalent is "High", not "Ultra" (and lower than that for a stable 60 fps).
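A back-of-the-envelope sketch of why texture cost doesn't scale with render resolution (the texture sizes and compression figures below are generic assumptions, not measurements from TLoU): a texture's VRAM footprint depends on the texture resolution and format, not on the resolution you render at.

```python
# Rough sketch: texture VRAM cost is set by texture resolution and format,
# not by render resolution. Numbers are generic assumptions, not
# measurements from The Last of Us.

def texture_size_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    """Estimate VRAM for one texture (BC7 ~ 1 byte/texel), including mipmaps."""
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

ultra = texture_size_mb(4096, 4096)   # ~21.3 MB per 4K texture
high = texture_size_mb(2048, 2048)    # ~5.3 MB per 2K texture

for render_res in ("1920x1080", "3840x2160"):
    # The per-texture cost is identical at both render resolutions.
    print(f"{render_res}: ultra texture ~{ultra:.1f} MB, high texture ~{high:.1f} MB")
```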
Basically what you're all doing at this point is making excuses and rationalizing, because you know nVidia hardware is objectively and factually not that good. This is a really sore point for anybody who paid scalper prices on nVidia products. I'd be pretty mad and coping hard too if I'd paid $1000 USD for a crappy 3070. If nVidia actually made such truly wonderful, unassailably good hardware, people would only talk about the performance. You can tell a thing sucks because people end up looking like first-year college students trying to b.s. their way through an exam, bringing up unrelated things like drivers, some gimmick, or the looks of it, because the fundamentals are subpar and shoddy. This includes the inferior performance of the 3070 Ti when the 6800 smokes it for $20 less, but that's not even the real problem. The big problem is that it has literally half the VRAM, which makes it unsuitable for anything but 1080p long term, whereas the 6800 is still a good-performing 1440p card. It's such a huge problem because RT takes massive amounts of memory, so the crying all over TLoU about performance issues is a pretty good example of why you need enough VRAM. TLoU takes up to 15 GB of VRAM at the higher resolutions and ultra quality, in other words nearly double the 3070's VRAM. That renders the whole RT advantage quite moot: the card may technically have faster RT, but overall it's slower and a stuttery mess because it hasn't got adequate VRAM to do the RT.
No one is holding nVidia accountable, and if you're all just sucking on nVidia's nuts it's really no wonder they keep doing stupid ♥♥♥♥ like charging $900 for a high-midrange card, or charging high-end prices for a midrange card and then not putting adequate VRAM on it. Absolute epitome of the type of person who wants to "own nothing and be happy." Then either be happy with what you have and accept its inferior performance for more money, or quit trying to slam every developer out there over your shoddy VRAM.
As someone who went from a 6800xt to a 4090, I disagree. I have experienced ♥♥♥♥♥♥ drivers from both camps. But I honestly miss the feature set I had with AMD drivers.
The overlay was a lot more customizable and being able to have per game overclock profiles was super nice. I was fond of AMD link as well.
And yes, I know I can do all the same stuff on my 4090 with third-party software, but having it all in the driver was nice and I miss it lol. I also hate that GeForce Experience requires an account.
When I play TLoU I don't get close to running out of VRAM. This game is poorly optimized and/or filled with bugs, and as ND admitted, a nasty memory leak is present. I run this game on the lowest settings and it still crashes. Sorry, but I'm not buying the argument that it's the hardware.
Edit: in Hogwarts I am getting up to 130 FPS.
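For anyone who wants to check a claim like this on their own system, here is a minimal sketch (assuming an Nvidia card with the nvidia-smi tool available on the PATH) that polls VRAM usage while the game runs:

```python
# Minimal sketch: poll VRAM usage via nvidia-smi while a game is running.
# Assumes an Nvidia GPU and that nvidia-smi is available on the PATH.
import subprocess
import time

def vram_used_total_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU, e.g. "6144, 8192"; take the first GPU.
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_total_mib()
        print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
        time.sleep(5)
```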
You can download the Nvidia driver without GeForce Experience. You don't need GeForce Experience, or any account, just to download and use the Nvidia driver.
GeForce Experience is there for cloud gaming first, which, of course, requires an account. And yes, it also has the option to automatically download and install new drivers.
Yes yes yes I know, but that doesn't change the fact that it annoys me lol.
You would use "Radeon Chill" to limit the frame rate, and you could force vsync in the same menu where you enable Enhanced Sync. But to be fair, I don't exactly know when AMD added those features; it could have been after you jumped ship.
But even still that's my point. Two completely different people with completely different experiences. Makes it hard to say that one is objectively better than the other.
Complete user error on my part but because I was so used to AMD Radeon settings, when I switched to Nvidia it legit took me like an hour to find where fast sync was to enable it.
LOL, that's like trying to compare an 8 GB DDR2 stick with an 8 GB DDR5 one. You are either ignorant or trying to cope over your overpriced GPU.
It's literally not a GPU issue at all; even at 720p on the lowest settings it still stutters. It's a CPU issue. I run an R5 3600 overclocked, and it performs better than the recommended 3600X at stock, but whenever The Last of Us is booted it pegs the CPU to 100% usage no matter what.
This game is extremely CPU intensive, so you are likely correct in your assumption
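A quick way to sanity-check a CPU bottleneck like this, as a hedged sketch (it assumes Python with the third-party psutil package installed, and the 95% threshold is arbitrary), is to log per-core usage while the game runs:

```python
# Minimal sketch: log per-core CPU usage to see whether the game is
# pegging the CPU. Assumes the third-party 'psutil' package is installed.
import psutil

SAMPLE_SECONDS = 1.0

while True:
    per_core = psutil.cpu_percent(interval=SAMPLE_SECONDS, percpu=True)
    avg = sum(per_core) / len(per_core)
    busiest = max(per_core)
    verdict = "likely CPU-bound" if busiest > 95 else "headroom left"
    print(f"avg {avg:5.1f}%  busiest core {busiest:5.1f}%  ({verdict})")
```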
But is it playable?
It looks awful with anything below High textures. TLOU2 on PS4 would have been torn apart had it looked like TLOU on PC at medium settings. Either you have low standards, a bad memory, or both.
You guys will remember these threads someday when you try to run yet another AAA title and eventually come to realize that 8 GB was not enough, and you can't patch VRAM. I almost never see anyone actually turn the settings down.
How am I even back here? I was trying to find something about the Rockstar launcher login.