Why can't you? Testing the hardware at standard settings, without any "special features" turned on, is exactly how tests of other machines and physical products work.
The RTX 4060 Ti is the best budget card, because it has DLSS 3.0
Said no one ever
I want a realistic tropical jungle but without the heat, the pests, and the venomous or otherwise dangerous wildlife, so I can't just "go outside"; I run Crysis Remastered instead.
Also, it's cheaper to spend money once every several years on a graphics card and visit the virtual jungle any day you want than to regularly pay for tickets and lodging in tropical locations.
In animation, a smear frame is a frame used to simulate motion blur. A character might turn quickly, so an in-between frame of the turn is actually drawn as a blur of that character.
https://www.youtube.com/watch?v=vMvJiKu-93g
I consider Nvidia's AI "fake frames" a similar but much better version of that. The human brain doesn't even notice them unless you're actively looking for them.
Anyway, DLSS 3.0 was a requirement to double the FPS without a performance penalty at 4K resolution and up. Without it, I wouldn't be able to push 60 FPS up to 120 FPS at 4K with ultra-high settings. The software side improves over time with the latest Nvidia drivers and the latest games, to the point where a generated frame can even look better than the original. In that sense, it becomes a free hardware upgrade over time.
And believe me, it has been improving greatly over time!
https://www.youtube.com/watch?v=uVCDXD7150U
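As a rough illustration of the "fake frame" idea, here is a minimal sketch, assuming nothing more than NumPy and a naive 50/50 blend. It is not Nvidia's actual optical-flow-plus-neural-network implementation; it only shows why inserting a generated frame between every pair of rendered frames roughly doubles the presented frame rate.

```python
import numpy as np

def generate_intermediate_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Naive 'fake frame': blend two rendered frames halfway.

    Real frame generation (DLSS 3) uses hardware optical flow and a neural
    network; this simple average only illustrates the concept.
    """
    blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2
    return blended.astype(prev_frame.dtype)

def present_with_frame_generation(rendered_frames):
    """Interleave one generated frame after each rendered frame: N rendered -> ~2N presented."""
    presented = []
    for prev_frame, next_frame in zip(rendered_frames, rendered_frames[1:]):
        presented.append(prev_frame)
        presented.append(generate_intermediate_frame(prev_frame, next_frame))
    presented.append(rendered_frames[-1])
    return presented

# Example: 60 rendered frames become 119 presented frames (roughly double).
frames = [np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8) for _ in range(60)]
print(len(present_with_frame_generation(frames)))  # 119
```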
😂 I feel like he hasn’t watched the video or read what was said. The world has changed around him and he refuses to look at the evidence of it happening.
Good grief. We're way beyond that now, for the reasons I stated and you quoted but failed to comprehend.
Last time I counted, there were over 750 games supported so far, around 240 of them with DLSS 3.0+, and the count grows with every single Nvidia game driver release. The earlier ones keep improving over time too, since the models run 24/7 on Nvidia's AI upscaling server farm of supercomputers.
Both Deep Learning Super Sampling (DLSS) and ray tracing are being advanced greatly on the Nvidia RTX series, not only through hardware but through software trained on those AI servers and pumped back into driver updates, as long as you bother to update them and tend to play the latest games.
You can even rent an Nvidia enterprise server yourself to do deep learning training. Game developers can add to it themselves if desired and tweak the quality/performance balance.
Branching off from that is the Nvidia GeForce NOW cloud gaming service, where you can also pay to stream games from a high-end server with RTX 3080 graphics cards to your potato PC, provided you have the bandwidth (45 Mbps) for up to 4K @ 120 FPS (max).
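For context on that 45 Mbps figure, a back-of-the-envelope check (my own numbers, not Nvidia's published encoder settings) shows why the stream depends on heavy hardware video encoding rather than sending raw frames:

```python
# Why 4K @ 120 FPS streaming needs heavy compression.
# The 45 Mbps figure comes from the post above; the rest is simple arithmetic.
width, height, fps, bits_per_pixel = 3840, 2160, 120, 24

raw_mbps = width * height * bits_per_pixel * fps / 1e6
print(f"Uncompressed video: {raw_mbps:,.0f} Mbps")              # ~23,888 Mbps

stream_mbps = 45
print(f"Compression needed: ~{raw_mbps / stream_mbps:.0f}x")    # ~531x, hence hardware video encoding
```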
Technically they aren't "high-end servers with an RTX 3080". They are virtualized environments running with "3080-class" power, which is actually now "4080-class" power. They actually run on partitioned GPUs within Nvidia's "SuperPods" (i.e. clusters of DGX A100 units).
But your point regarding GeForce NOW and being able to play current-gen games on a potato is accurate.
I personally disliked the concept at first with the new RTX 4090, until I actually tried it through work and bought one at wholesale price. It makes so much sense now for the leap to 4K HDR with ray tracing quality. It's pretty much a requirement to get there unless you want to wait years longer, given that 4K is four times the pixels of 1080p to drive, and real-time ray tracing would cut my frame rate down to about 21 FPS without DLSS 3 to bring it back up to 60-120 FPS.
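The resolution part of that is plain arithmetic; the FPS figures are the numbers quoted above, with the 60 FPS upscaled step treated as an assumption for illustration:

```python
# "4K is 1080p times four" -- pixel count check.
pixels_1080p = 1920 * 1080             # 2,073,600
pixels_4k = 3840 * 2160                # 8,294,400
print(pixels_4k / pixels_1080p)        # 4.0 -> four times the pixels to shade per frame

# Rough FPS chain using the figures from the post (upscaled step is assumed):
native_fps = 21                        # native 4K + ray tracing, per the post
upscaled_fps = 60                      # assumed result of DLSS upscaling
presented_fps = upscaled_fps * 2       # frame generation inserts one frame per rendered frame
print(presented_fps)                   # 120
```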
For people still happy at 1080p or 1440p resolution, it wouldn't make much of a difference or much sense; yet for future generations, I can understand the path they're taking. The only other way to go about it is to SLI multiple graphics cards together, which is a power drain and a high cost.
ahh my favorite fairytale: another alternate universe where 240+ games support dlss3.0
https://youtu.be/rGBuMr4fh8w?si=7Hf1uqj9lseqM4QP
but yea.. dlss3.0++ is the bestest eva
Wow, you should work for the government with spin like that.
The card is extremely overpriced and low on VRAM. That has nothing to do with DLSS 3.5 and the game-changing benefits that it brings.
It's as if they're trying to say that, ultimately, rasterization performance is what matters, not some proprietary features that (very, very) few games have. There are only so many fake frames that can be made from 10 FPS gaming. 🤯
But then again, I'm talking to someone who thinks the gaming world has fully embraced frame generation. You should browse your own library and see how many games actually support it.
Please, watch this video. Start at 11:37
https://www.youtube.com/watch?v=T_vbDx-sBg8&t=676s
You’ll see why their thinking is outdated.
There are so many presumptions in that video. They talk about a feature that a game might or might not get, or an update (for a future DLSS version) that a game might or might not get (or that the card itself might not support, e.g. DLSS 3.0 on RTX 2xxx and 3xxx). At that point, the reviews are just selling Nvidia promises.