I don't follow the mental gymnastics to get there, but if it lowers the latency back to normal then it does change something: the latency.
Why are new tech or methods a band-aid? Man, if you know what Nvidia should do moving forward better than they do, you should really get a job there; you'd earn millions.
If you had been around in the old days when street lights were invented, would you have been one of the folks protesting, siding with the church and saying it's the devil's work?
Ahh, but if they had your lack of vision we would all still be playing Atari's PONG, because every time a new method came out people would cry no.
What are you on about? I think that's the first time I've made such statements, and if I have, it was as a joke. If anything, I always point out that you don't need the latest and greatest; however, given that said topic requires the latest tech to experience it, I got a little daft with the jokey language.
Plus, I AM poor; I only have my fancy toys because I budget, save, and sacrifice other things to get stuff that, honestly, would be out of my price range if I didn't.
I start saving for the next upgrade the day after I build a new system.
Heck, if most people showed some restraint and just brought their own lunch to work and cut out smoking and expensive coffee, they would easily have enough spare cash to buy high end every generation.
Yeah, rubbish analogy that I'm not sure you fully grasp.
Then good luck playing anything over the next few years.
You lot really sound like the people who cried about physics in games initially.
Physics enhances games. It allows for interesting things that make gameplay better.
All frame generation does is enable developers to continue their pattern of sh*ttier and sh*ttier optimization. Especially if they play ball with Nvidia.
"Oh it's okay if our game only gets 30fps on the card with the highest rasterisation, they can just generate fake frames and run it at 120fps!"
This is bad for the gaming industry and should be criticized.
Ultimately I doubt it'll do any good, though. Gamers are less than 15% of Nvidia's revenue, so there's no real reason for them to care about gamers making a fuss, never mind a fraction of them making a fuss.
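To put rough numbers on the "30fps rendered, 120fps presented" point: here's a minimal back-of-the-envelope sketch in Python. The figures (4x generation, one held-back real frame for interpolation, ~1 ms of overhead) are assumptions for illustration, not measured DLSS numbers; the point is only that multiplying presented frames doesn't shorten the path from your input to a newly rendered frame.

```python
def frame_gen_estimate(base_fps, gen_factor, hold_frames=1, overhead_ms=1.0):
    """Rough estimate of presented fps vs. input latency with interpolation-style frame generation.

    Illustrative assumptions: generated frames are interpolated between two real
    frames, so one real frame is held back (hold_frames), plus a small fixed
    generation overhead. Real frame-gen pipelines differ in the details.
    """
    base_frame_ms = 1000.0 / base_fps
    presented_fps = base_fps * gen_factor
    # Input latency is still governed by how fast *real* frames are produced.
    approx_latency_ms = base_frame_ms * (1 + hold_frames) + overhead_ms
    return presented_fps, approx_latency_ms

fps, latency = frame_gen_estimate(base_fps=30, gen_factor=4)
print(f"presented: {fps:.0f} fps, approx input latency: {latency:.0f} ms")
# -> presented: 120 fps, approx input latency: ~68 ms
#    (versus the ~33 ms frame time the game actually renders at)
```

So under these assumptions the counter on screen reads 120, while responsiveness still behaves like a 30fps game, which is the crux of the complaint above.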
AMD has their own version of frame gen too,
and even smart / medium-dumb TVs can do it too (see the sketch below).
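For what it's worth, the TV-style "motion smoothing" being compared here is, at its crudest, just blending neighbouring frames. A minimal sketch (plain NumPy, no motion vectors, purely illustrative and not how DLSS/FSR actually work) shows why that approach looks soft and always needs the next real frame before it can show the in-between one:

```python
import numpy as np

def blend_frames(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive interpolated frame: a weighted blend of two real frames.

    This is the degenerate case of motion interpolation (no motion estimation).
    TV smoothing and GPU frame generation layer motion vectors / optical flow on
    top, but both still have to wait for the *next* real frame, which is where
    the extra display latency comes from.
    """
    a = prev_frame.astype(np.float32)
    b = next_frame.astype(np.float32)
    return ((1.0 - t) * a + t * b).astype(prev_frame.dtype)
```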
This whole spiel that's always specifically about big bad NVIDIA is ridiculous. What NVIDIA is doing is actually beneficial for gamers in the long run, because machine learning is another way to get massive increases in performance without having to completely change GPU technology. We're eventually going to run into the full limitations of silicon, and there are few materials that can adequately replace it. The best candidate is graphene, but graphene circuits are considerably faster than silicon ones, and a graphene semiconductor can get so hot that, unless it's strictly limited, it's easily capable of going molten and melting through a steel table.
These alternatives are much more expensive to produce because the materials themselves are more expensive (graphene production is slow, since you're basically shaving off layers of carbon one atomic layer at a time), and it would take a significant amount of work to design a graphene GPU that wouldn't immediately destroy itself. The first tests of graphene semiconductors run without the safeguards we have in current CPUs and GPUs went molten in a very short period of time, melting through the steel table they were sitting on and into the concrete below.
So the first NVIDIA flagship GPUs to use graphene instead of silicon would cost several times more to produce than the 5090 does, if not several times the 5090's retail price. It's an inevitable change, though, because eventually silicon is going to be too much of a limiter on the potential of computing, period.
Semiconductor manufacturers like AMD, NVIDIA, and Intel are well aware of this inevitability, and they're trying to drag it out as much as possible because the technology to make graphene-based semiconductors viable isn't all there yet, and neither is the cooling. If you think CPUs and GPUs heat up quickly now, graphene circuits switch significantly faster, so a graphene processor could shoot up to its throttle temperature almost instantaneously. It's much easier to get higher performance through machine learning techniques than it is to keep trying to stretch silicon as thin as possible. And it doesn't matter how we get performance increases; what matters is that we're getting them. The real issue with DLSS/FSR/XeSS is support, but eventually those things are going to be supported by everything and enabled by default anyway.
Here's a neat video on Graphene from 10 years ago: https://www.youtube.com/watch?v=Mcg9_ML2mXY
Face it, Nvidia sucks even if their cards and drivers are good enough.