The Witcher 3: Wild Hunt

Be ready to get the RTX 5000 series.
Unreal Engine 5 has already proven that the 3000/4000 series is trash in terms of actual performance.
All the games recently released on UE5 run like crap, and the only answer for this issue is "frame generation", because in the gamedev world it's the new solution for optimizing games. Pretty lazy, tbh, with the classic middle finger to players.
Showing 1-15 of 23 comments
Navhkrin Dec 13, 2024 @ 11:41am 
If you are thinking about this for Witcher 4, don't. Wait until the 6000 series. Given CDPR's announce-to-release track record, it should come out in the 6000-7000 series range.

That being said, I am a solo game dev making a game with UE5. On the latest version (5.5), performance is perfectly fine on a 4090 (120+ FPS) without frame generation. To be more specific, my game runs at 120 FPS native at 1440p, ~170 with DLSS Quality.

That said, Witcher 4 is going to use a lot more graphical features than I am.
bshock Dec 13, 2024 @ 1:51pm 
If you want to run it at 4K native, max settings, 100 FPS, maybe. I'm not upgrading until at least the 6000 series. DLSS/FG will be sufficient for quite some time on a 4080/4090.
Gamefever Dec 13, 2024 @ 1:56pm 
It is not Unreal Engine; it's the fact that the industry generally does not optimize.
It does not matter which game engine: it takes time to properly optimize game assets for the intended visuals, and it requires a level of skill that newer game devs likely do not have.

Let's not act like we don't all know this by now; we have plenty of games that are really good examples of just how much better a game can perform after enough optimization. Speaking of which, we don't have to look far for one: Cyberpunk 2077. Witcher 1, 2, and 3 all had the same issue of needing more optimization. If you had been playing CDPR games for the last two decades, you would know this.
Navhkrin Dec 13, 2024 @ 2:00pm 
Originally posted by Gamefever:
It is not Unreal Engine; it's the fact that the industry generally does not optimize.
It does not matter which game engine: it takes time to properly optimize game assets for the intended visuals, and it requires a level of skill that newer game devs likely do not have.

Let's not act like we don't all know this by now; we have plenty of games that are really good examples of just how much better a game can perform after enough optimization. Speaking of which, we don't have to look far for one: Cyberpunk 2077. Witcher 1, 2, and 3 all had the same issue of needing more optimization. If you had been playing CDPR games for the last two decades, you would know this.

That is not entirely true, though. Up until UE5.4, Unreal actually had very bad performance; they improved it significantly with 5.4.

With UE5, they also switched physics engines (from PhysX to Chaos), which is around 3-4x slower than the PhysX used in UE4.

So it is not all black and white as you guys are thinking. Epic takes part of the blame, and they deserve it. They still haven't properly addressed stuttering.
Gamefever Dec 13, 2024 @ 2:08pm 
Originally posted by Navhkrin:
Originally posted by Gamefever:
It is not Unreal Engine; it's the fact that the industry generally does not optimize.
It does not matter which game engine: it takes time to properly optimize game assets for the intended visuals, and it requires a level of skill that newer game devs likely do not have.

Let's not act like we don't all know this by now; we have plenty of games that are really good examples of just how much better a game can perform after enough optimization. Speaking of which, we don't have to look far for one: Cyberpunk 2077. Witcher 1, 2, and 3 all had the same issue of needing more optimization. If you had been playing CDPR games for the last two decades, you would know this.

That is not entirely true, though. Up until UE5.4, Unreal actually had very bad performance; they improved it significantly with 5.4.

With UE5, they also switched physics engines (from PhysX to Chaos), which is around 3-4x slower than the PhysX used in UE4.

So it is not all black and white as you guys are thinking. Epic takes part of the blame, and they deserve it. They still haven't properly addressed stuttering.

That's going to be disappointing for the guys over in CDPR accounting, for certain.
2D神 Dec 13, 2024 @ 2:12pm 
Just like with 2077, NVIDIA is going to go hard on marketing alongside this game. Expect them to feature FG gen 2 with PT on the 5090.
nukeheal Dec 13, 2024 @ 3:17pm 
No point getting the 50 series if you own a 40. There are no massive games in 2025-2026 on PC worth wasting money on. The 60 series is when W4 will be out.
Growlanser Dec 13, 2024 @ 3:29pm 
That was definitely a 50 series card used for the trailer.
M2Stech Dec 13, 2024 @ 4:35pm 
Be ready to get disappointed then. Unless you opt for the uber-tier 5090, the other cards will have a very minimal performance lift compared to the 40 series, based on what industry insiders have leaked so far. Witcher 4 won't be out until 2026 in the best-case scenario, so better to wait for the 60 series, or see if Intel or AMD finally man up and release a proper card.
Starwight/ttv Dec 13, 2024 @ 4:50pm 
wtf are you on about, I'm using a 4060 and run most unreal 5 games just fine lol
Growlanser Dec 13, 2024 @ 6:23pm 
Originally posted by Starwight/ttv:
wtf are you on about, I'm using a 4060 and run most unreal 5 games just fine lol

I have a 4060 Ti 16GB. People say it can't do 4K. Meanwhile, I'm playing everything in 4K, sometimes without DLSS.
Starwight/ttv Dec 13, 2024 @ 6:24pm 
Originally posted by Growlanser:
Originally posted by Starwight/ttv:
wtf are you on about, I'm using a 4060 and run most unreal 5 games just fine lol

I have a 4060 Ti 16GB. People say it can't do 4K. Meanwhile, I'm playing everything in 4K, sometimes without DLSS.

I have never tried 4K; sadly, my monitor doesn't support it x_x
rIdzu. x64 👊😎 Dec 13, 2024 @ 10:23pm 
Originally posted by Starwight/ttv:
wtf are you on about, I'm using a 4060 and run most unreal 5 games just fine lol

*With the FG feature, or because you're playing at 1080p.
My friend has a 4070 and I've seen the actual performance.

No point getting the 50 series if you own a 40. There are no massive games in 2025-2026 on PC worth wasting money on. The 60 series is when W4 will be out.

If you look at the past, every 2nd GPU generation is a reheated cutlet.
The 3rd one literally murders the last two.

GTX 780 > slightly stronger GTX 980
GTX 1080 > slightly stronger and overpriced RTX 2080
RTX 3080 > slightly stronger and overpriced RTX 4080, but with exclusive frame generation
Last edited by rIdzu. x64 👊😎; Dec 13, 2024 @ 10:29pm
Rommel Dec 13, 2024 @ 10:34pm 
You won't see this game for 5 years at least... kinda crazy to be worrying about the specs right now. Man, some gamers ain't happy unless they've got something to complain about!
First they get upset because it's a female protagonist... now this. Be happy they decided to make another Witcher!
Zero Dec 13, 2024 @ 10:54pm 
Originally posted by Navhkrin:
That being said, I am a solo game dev making a game with UE5. On the latest version (5.5), performance is perfectly fine on a 4090 (120+ FPS) without frame generation. To be more specific, my game runs at 120 FPS native at 1440p, ~170 with DLSS Quality.
You should never, EVER, develop a game to only function properly on top-of-the-line hardware. That's idiotic.

Date Posted: Dec 13, 2024 @ 11:38am
Posts: 23