How did I imply the 8K performance was native? That wouldn't make any sense. If the 8K performance were native, that would mean it could do an even higher resolution with DLSS.
If that were the case, 4K would be a cakewalk. Why would you think I was claiming a 4090 could do 8K native? That's just crazy. If it could, this thread wouldn't even exist.
False statement.
I have an RTX 4090 and have yet to have a game drop below 56 FPS at 4K resolution, using very high to ultra quality settings. Either use an optimized game or the correct game settings. Also, get the latest Nvidia drivers.
With DLSS 3.0, which gets new games added monthly, that roughly doubles to 120 FPS or so. Ideal for my monitor's 120 Hz refresh rate. That includes using ray tracing and HDR quality.
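As a back-of-the-envelope sketch (my own illustration, not something from this thread; the 60 FPS native figure and the overhead factor are assumptions), this is all DLSS 3 frame generation is doing to the frame rate: one generated frame per rendered frame, so roughly double the native rate.

```python
# Rough sketch: effect of DLSS 3 frame generation on displayed FPS.
# The 0.9 overhead factor and the 60 FPS native figure are assumptions.

def frame_gen_fps(rendered_fps: float, overhead: float = 0.9) -> float:
    """One generated frame per rendered frame -> roughly double output."""
    return 2 * rendered_fps * overhead

native_4k_fps = 60  # assumed native 4K render rate
print(f"~{frame_gen_fps(native_4k_fps):.0f} FPS displayed")  # ~108 FPS, close to a 120 Hz cap
```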
Cyberpunk is quite a demanding game, but this video shows up close the difference real-time ray tracing makes...
https://www.youtube.com/watch?v=Xf2QCdScU6o
https://www.youtube.com/watch?v=cJ1ubgaCl6I
Most people don't understand or respect the amount of processing power it takes to do that, or have even seen it. So if you are happy at 1440p resolution with the preloaded shaders, stick to that and you won't need as crazy a beast of a graphics card to drive it. If you want to do 4K + HDR + ray tracing, then what do you expect? You're still comparing today's game engines to ones from the early 2000s that had none of that quality or real-time processing.
https://www.youtube.com/watch?v=oCsgTrGLDiI
That is what the RTX cards are trying to drive towards in real time. Hence the RTX marketing abbreviation, meaning "Ray Tracing Texel eXtreme". If you don't want to use ray tracing, then why would you buy a graphics card labelled as such? It also helps with virtual reality 3D object environments and with calculating 3D surround soundscapes that bounce off those objects.
We want to be able to do that for a more immersive experience in gaming, and even for creating CGI in movies, etc. Shadows have a huge impact here, especially in adventure or horror games. Consider going through a dark cave, lit by a flaming torch and sun rays coming through. A soft shadow of a monster cast upon the wall as it roams past. All with ray tracing and HDR quality, with rich whites and pure black darkness. Previously that would have just been a dark, greyed-out blur. It's the right step towards next-gen PC gaming.
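For anyone curious how that cave shadow is actually decided with ray tracing, here's a minimal sketch of the idea (my own illustration, not from this thread; the sphere "monster", the torch position, and all the numbers are made up): trace a ray from the point on the wall toward the light, and if any geometry blocks it, the point is in shadow.

```python
# Minimal shadow-ray sketch: a point is in shadow if something sits
# between it and the light. All geometry/positions here are made up.
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t > 0, unit direction) hits the sphere."""
    oc = sub(origin, center)
    b = 2 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and (-b - math.sqrt(disc)) / 2 > 1e-4

def in_shadow(point, light_pos, blockers):
    to_light = norm(sub(light_pos, point))
    return any(hits_sphere(point, to_light, c, r) for c, r in blockers)

# A "monster" sphere between the wall point and the torch casts a shadow.
print(in_shadow((0, 0, 0), (10, 0, 0), [((5, 0, 0), 1.0)]))  # True
```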
However, you can turn off DLSS 3.0 and ray tracing if desired:
https://www.youtube.com/watch?v=ELQ1hG7mOv0
The RTX 4090 still gets a min of 70 FPS / avg 95 FPS / high of 126 FPS at ultra settings in Tomb Raider and other games...
A 3090 Ti, while having to enable DLSS Quality, barely keeps up with a 4090 running native 4K. DLSS Quality drops its internal render resolution to 2560x1440.
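For reference, that 2560x1440 figure is just the usual DLSS Quality render scale applied to a 4K output. A small sketch of the arithmetic (my own illustration, not from this thread; the per-axis scale factors are approximations of the common DLSS presets):

```python
# DLSS internal render resolution from output resolution and mode.
# Scale factors are approximate/assumed values for the usual presets.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```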
Yeah, I think the entire thread is a joke, since they posted garbage, won't buy anything anyway, and would rather complain without knowing any facts.
So if your 4090 gets you 4K/120 FPS/highest settings/ray tracing/HDR with DLSS, what do you think it will give you at 8K?
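For rough context on what that question implies (my own back-of-the-envelope numbers, not from the thread): 8K has four times the pixels of 4K, so all else being equal the rendering workload roughly quadruples.

```python
# Pixel counts per resolution, to show why 8K is so much heavier than 4K.
RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.1f} MP, {count / pixels['4K']:.2f}x of 4K")
# 8K is 4x the pixels of 4K, which is itself 2.25x the pixels of 1440p.
```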
I haven't tested any 8K monitors, nor do I wish to.
The point is that 1440p resolution is already maxed out, and those who wish to can sit happy on it, just like how 1080p resolution was maxed out in the past. We are touching now upon 4K, and not only that, but on new features that are generated in real time.
Basically put: it's like someone complaining about the original cost of a new colour TV when they had just arrived, then comparing that colour TV with the old black and white and saying it's not as good. Remember when we used to play 8-bit games, and they're so cheap now, blah, blah, blah. It's an illogical argument rooted in living in the past.
Convenient that you don't want to talk about 8K performance and Nvidia's 8K marketing when you have a 4090. I got it. I was under the wrong impression.
I understand this point, but that isn't the point of this thread. I'll repeat it again though: your point is valid. I say the same thing at times and genuinely believe it. But I'll also point out that just because a gamer should appreciate and be happy with a full 1440p experience doesn't give Nvidia the right to mislead people into thinking a 4090 is an 8K card.
I already pointed out the ambiguity of Nvidia's claims (even though Komarimaru helped my case by further specifying Nvidia's details on 8K performance). The way I see it, they have "marketed" the 4090 as an 8K card, even if it's not 8K/144Hz/RTX/ultra settings. My whole point is that, by that claim, a 4K experience would be easily achievable.
I mean, it says right there in big bold 3D text "8K HDR GAMING", and yet I'm not seeing 4K; I do see "3840x2160 Resolution" in tiny letters, and not even an FPS figure is given in the performance section. All they did was put some bars side by side comparing it to a 3090 Ti.
I also feel this helps my case. If 1440p is maxed out and we are "touching now upon 4K", then why is the 8K hype bigger on the page than the 4K?
In OP's case, he probably is an example of the people you're talking about. But just because he's the one who started the thread and you don't like his take on Nvidia and 4K performance doesn't mean another claim should be swept under the rug. My whole thing started with someone asking where Nvidia marketed the card as a 4K card, and so far I've been giving my answer to that.
Also, I'm not angry about the cards not reaching 4090. I doubt you'll believe this though.
Unreal Engine 5.1 released in Nov 2022, a month after the RTX 4090 released.
Fortnite's engine update to 5.1 was less than a month ago...
Again, this was not marketed towards UE 5.1, and it wasn't marketed to promise that you will get 120 FPS in all games at 4K from its release and into the future. What are you smoking?
Well, the wording you used with "new gpu" should be changed to "more modern gpu" if I get what you're saying. Someone could get a new gpu that isn't more modern, technically.
I was actually thinking the exact same thing you put down, though. Nvidia would know everything about Unreal Engine 5.1. Release dates only matter for the public, I would think.
Game engine developers are not obligated to bow to GPU manufacturers; they only do what benefits them, or what they want for themselves. GPU sales have no bearing on them, so why would it matter what GPU makers do, really, besides new tech features? That's what I'm saying. Some games may not be optimized well, and some games are optimized very well, and for all you know Fortnite might not have been fully optimized for UE 5.1. But again, you shouldn't expect 120 FPS in all future games and updates; that's not how the real world works. If devs had to limit themselves, then UE 5.1 basically wouldn't exist, or would be delayed until it met the demands of the latest GPU, hence why they don't bow to GPU manufacturers.
Yeah, they may know about it, but that doesn't mean they will have hands on future game engines, let alone do things with game engines the way you expect them to, as NOT ALL GAMES are made the same; there are more factors you keep ignoring.
Typo on year.
Yeah it's a typo, get over it.