How does the RTX 4050 manage to be so powerful?
I recently bought a Lenovo Legion 5 laptop with an RTX 4050 and I am amazed at how brutally powerful the machine is. What confuses me, though, is that even though it's advertised as a "productivity/budget/entry-level GPU" and is 2 years old now, it can run pretty much every single game you throw at it, at least at 1080p and 60 FPS (low or medium graphics settings). I don't base this only on my own experience: even the latest benchmarks on YouTube show the RTX 4050 comfortably running every single game, including the most demanding ones, at 1080p, low settings, 60 FPS.

Before the purchase I was scared off by many reviews (dated early 2023!) stating the RTX 4050 is too weak, it won't run new releases, 6 GB of VRAM is too little, 8 GB is the absolute minimum, don't buy it, and so on... Yet it's 2025, and this 2-year-old budget laptop GPU is still running everything you throw at it. I wonder if this GPU will actually serve well for another 2-3 years, heck, maybe even until 2030. I don't know if I'm missing something or am just plain crazy, but on one hand I see benchmarks running everything, and on the other hand legitimate, well-known reviewers saying in their 2023 reviews that such a GPU is weak and won't run new games. In 3DMark Time Spy I get a score of 9500, which is insane. It really puzzles me how such a budget card can be so brutally powerful. Also, the TechPowerUp site seems to severely underestimate the power of this GPU...

EDIT: Some examples...

This joke of an article makes claims that don't hold up:
https://www.howtogeek.com/is-an-8gb-gpu-enough-in-2024-it-depends-what-youre-playing/

Disproved in this benchmark:
https://www.youtube.com/watch?v=S2RJKSXjW3o
Last edited by Str3lok; 7 Apr at 10:44
Showing 1-15 of 21 comments
It has about 20% fewer shaders than the RTX 4060, which uses the same chip.
And true, 6 GB of VRAM is on the edge.
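As a rough sanity check of that shader-count claim, here is a small sketch; the core counts (2560 for the laptop RTX 4050, 3072 for the laptop RTX 4060) are taken from public spec listings and should be treated as assumptions, not authoritative figures:

```python
# Rough shader-count comparison between the laptop RTX 4050 and RTX 4060.
# Both parts are cut-down versions of the same AD107 die; the core counts
# below are from public spec listings (treat them as assumptions here).
rtx_4050_cores = 2560
rtx_4060_cores = 3072

deficit = (rtx_4060_cores - rtx_4050_cores) / rtx_4060_cores
print(f"RTX 4050 has {deficit:.0%} fewer shaders than the RTX 4060")
```

With these numbers the gap comes out to roughly 17%, in the same ballpark as the ~20% mentioned above.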
4050 vs the Radeon 890M iGPU:
https://www.youtube.com/watch?v=BOMTeJfT7B4&t=203s

1080p is starting to become integrated-graphics territory, as iGPUs are getting more and more powerful.

Video cards have always worked this way generation over generation: one generation's 80-tier card roughly equals the next generation's 70-tier card, so once a generation delivers good 1080p performance, that performance eventually trickles down to the very low end.

That said, looking at VRAM: in a VRAM-limited game, that 890M is going to beat you if the system allocates 8-16 GB of RAM to the iGPU.
Last edited by Midnight Aurais; 7 Apr at 10:55
Cool, you're happy your low-end laptop plays what you want... The rest of us are looking to upgrade things like 3080s that blow your card out of the water, because 60 frames will not do...
Depends what you play and how you want to play it. Generally, there's a reason even the ancient desktop 1080 Ti can still game. It won't break any benchmark records anymore, and you may have to dial down a few settings, but it can game. Base consoles aren't equipped with massively better hardware, and they still dictate what gets made.

The bad thing about your laptop is that you won't be able to easily swap out your graphics hardware when it no longer suffices -- 6 GB is about as low as you can go these days. And that article isn't completely wrong. You can play AAA titles as well, but at the highest (texture) settings there are games that use more than 8 GB. Setting the texture (cache) to the highest setting in Indiana Jones can make the game flat out crash... then again, it's "but" a cache.


Originally posted by smokerob79:
Cool, you're happy your low-end laptop plays what you want... The rest of us are looking to upgrade things like 3080s that blow your card out of the water, because 60 frames will not do...


That's going to buy a certain very happy someone a lot of leather jackets. Also, 60, 100, 144, 240 FPS are all rookie numbers. Gotta pump them up. Gotta get more leather jackets. https://www.tweaktown.com/news/102723/nvidia-says-how-it-will-get-gamers-to-play-their-games-at-1000fps/index.html
Last edited by fourfourtwo79; 8 Apr at 2:18
Aren't there even iGPUs which are stronger than a 4050?
If I had the option between the 4050 and no GPU at all, I'd go for the latter.
the rtx 4050 is a bit weaker than a gtx 1080
i doubt an igpu could do that

unless it's the mobile 4050, then it's closer to a 1070, but that also depends on the laptop
Originally posted by _I_:
the rtx 4050 is a bit weaker than a gtx 1080
i doubt an igpu could do that

unless it's the mobile 4050, then it's closer to a 1070, but that also depends on the laptop
It is a laptop chip; a desktop RTX 4050 simply doesn't exist. And there is a mobile APU / integrated graphics processor that is faster than the desktop GTX 1080. See here: https://www.videocardbenchmark.net/compare/5322vs3502/Radeon-8050S-vs-GeForce-GTX-1080

The Radeon 8050S is the APU included with the AMD Ryzen AI Max PRO 385 & 390 mobile processors.
Last edited by Ontrix_Kitsune; 8 Apr at 16:03
they are all potential laptop or desktop chips, since they are all the same die with bad parts disabled during binning
the lowest bins go to laptop gpus

but yeah, oops, no desktop 4050, only a 3050

nvidia really needs to put the M back in the mobile model names, since they clearly do not stand up to the similar desktop gpus
Originally posted by _I_:
they are all potential laptop or desktop chips, since they are all the same die with bad parts disabled during binning
the lowest bins go to laptop gpus

but yeah, oops, no desktop 4050, only a 3050

nvidia really needs to put the M back in the mobile model names, since they clearly do not stand up to the similar desktop gpus
I agree with you there. The biggest problem right now is that the mobile RTX 5090 is just called "RTX 5090", exactly the same as the desktop RTX 5090, yet the mobile one has half the cores and half the performance. It's very, very deceptive to customers who don't know this.
guys, it's all equal... as soon as you turn on 20x frame gen in Lossless Scaling XD
Originally posted by Midnight Aurais:
guys, it's all equal... as soon as you turn on 20x frame gen in Lossless Scaling XD
Except it's not. If one video card has half or less the performance of another, it will still have half the FPS when frame gen is enabled.
Originally posted by Ontrix_Kitsune:
Originally posted by Midnight Aurais:
guys, it's all equal... as soon as you turn on 20x frame gen in Lossless Scaling XD
Except it's not. If one video card has half or less the performance of another, it will still have half the FPS when frame gen is enabled.

you don't get the point. even if he gets 30 fps, that's still 600 fps with 20x frame gen, and there aren't many 600 Hz monitors out there; 20x frame gen is more of a meme

the 4050 is a way better experience than what i had for 10 years. games between 2005-2008 ran at around 15-22 fps on an intel 82945G express family chipset. weirdly enough, those were also my best gaming years, as i always got excited when a game even remotely ran on that chipset. 3DAnalyzer was awesome too, as it allowed me to emulate hardware T&L and some shaders intel removed from the chipset; without it, the list of games i could play was very short
Originally posted by Midnight Aurais:
Originally posted by Ontrix_Kitsune:
Except it's not. If one video card has half or less the performance of another, it will still have half the FPS when frame gen is enabled.

you don't get the point. even if he gets 30 fps, that's still 600 fps with 20x frame gen, and there aren't many 600 Hz monitors out there; 20x frame gen is more of a meme

the 4050 is a way better experience than what i had for 10 years. games between 2005-2008 ran at around 15-22 fps on an intel 82945G express family chipset. weirdly enough, those were also my best gaming years, as i always got excited when a game even remotely ran on that chipset. 3DAnalyzer was awesome too, as it allowed me to emulate hardware T&L and some shaders intel removed from the chipset; without it, the list of games i could play was very short
That's also not how frame gen works. As I tried to explain, frame gen is not magic: it is tied to the video card that is using it. A slower video card will still have lower FPS and be slower than a faster one, even with frame gen enabled.

Or to put it more concretely: an RTX 4050 laptop video card with frame gen will never reach 600 FPS in any title. It will always be under 300 FPS, even with frame gen.

If you want 600+ FPS with frame gen, you would have to use a desktop RTX 5090 for that.
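The scaling argument can be sketched with a few hypothetical numbers (the base frame rates below are illustrative assumptions, not benchmark results):

```python
# Frame generation multiplies whatever base frame rate the card achieves;
# it does not erase the gap between a slow card and a fast one.
# The base FPS values below are illustrative assumptions, not benchmarks.

def output_fps(base_fps: float, multiplier: int) -> float:
    """Total presented FPS with frame generation enabled."""
    return base_fps * multiplier

slow_card = output_fps(30, 20)   # 30 FPS base with 20x frame gen -> 600
fast_card = output_fps(60, 20)   # 60 FPS base with 20x frame gen -> 1200

# The slower card still presents exactly half the frames of the faster one.
print(slow_card, fast_card, slow_card / fast_card)
```

Whatever the multiplier, the ratio between the two cards stays the same, which is the whole point being made here.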
Last edited by Ontrix_Kitsune; 8 Apr at 17:07
Originally posted by Ontrix_Kitsune:
Originally posted by Midnight Aurais:

you don't get the point. even if he gets 30 fps, that's still 600 fps with 20x frame gen, and there aren't many 600 Hz monitors out there; 20x frame gen is more of a meme

the 4050 is a way better experience than what i had for 10 years. games between 2005-2008 ran at around 15-22 fps on an intel 82945G express family chipset. weirdly enough, those were also my best gaming years, as i always got excited when a game even remotely ran on that chipset. 3DAnalyzer was awesome too, as it allowed me to emulate hardware T&L and some shaders intel removed from the chipset; without it, the list of games i could play was very short
That's also not how frame gen works. As I tried to explain, frame gen is not magic: it is tied to the video card that is using it. A slower video card will still have lower FPS and be slower than a faster one, even with frame gen enabled.

Or to put it more concretely: an RTX 4050 laptop video card with frame gen will never reach 600 FPS in any title. It will always be under 300 FPS, even with frame gen.

If you want 600+ FPS with frame gen, you would have to use a desktop RTX 5090 for that.

https://store.steampowered.com/app/993090/Lossless_Scaling/

i'll leave it at that. you really need to get the hint that it was sarcasm; instead you delved way down the rabbit hole. i know how frame gen works, and that is part of 20x being a meme