The Witcher 3: Wild Hunt

sauron2012 Apr 13, 2023 @ 6:45pm
RTX 4070 or 4070Ti or 3080 10GB
Hello there,
I have a question about upgrading for next-gen gaming (after half a year... sigh) and I can't decide which card to go with. When the next-gen update came out, I had high hopes for my RTX 3060 Ti after its quite good Cyberpunk 2077 performance (Ultra, DLSS Quality + full RT at 1440p, 40-50 fps), so I was sure it would also handle The Witcher 3 with ray tracing without any problems. I was wrong. The card can barely hold a stable 25-30 fps at Ultra+, DLSS on Performance (lol), RT Ultra and 1440p, which really disappointed me... I also considered upgrading to an RTX 3080 10GB (the 12GB isn't available...), but a lot of future technologies are no longer supported on the 3000 series; the only things that keep that card in consideration are its very low price, sitting between the 4070 and 4070 Ti, and slightly better performance than the 4070 without DLSS. However, my power supply is a Corsair SF750 750W Platinum, so I'm not considering an RTX 3080 Ti or above because of the power consumption and the high price in my country (more than a 4070 Ti for a 3080 Ti...).

So far only two cards I can afford and that fit my case stand out: the RTX 4070 Ti and the RTX 4070. The problem is that both of them seem fine, and I don't want to overpay for a GPU if either can hold a stable 50-60 fps in The Witcher 3 Next Gen. In my country the 4070 Ti costs about 150% of the price of a new 4070, so I started to wonder which one would be better. The cost doesn't matter, to be honest, but like I said, I don't want to pay more than I need to. Hence my question: which GPU do you consider the better choice, so I don't regret it in the future?

Btw, my CPU is an i5-11400F with 32GB of RAM. At 1080p it would be a bottleneck, sure, but I've heard a lot of good things about the 3080, 4070 Ti and 4070 at 1440p with this CPU. I've also seen tests, but I'd like to hear your opinions; maybe someone here has used all of these cards in this game and can tell me which way to go.
Thank you anyway.
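For context on what those DLSS modes mean at 1440p: DLSS renders the game internally at a lower resolution and upscales to the output. A quick sketch of the internal resolutions, using the commonly documented per-axis scale factors (the Balanced factor of ~0.58 is approximate):

```python
# Internal render resolution for common DLSS upscaling modes.
# Per-axis scale factors as commonly documented; Balanced (~0.58) is approximate.
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_res(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for a given output and DLSS mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1440p output with DLSS Performance renders internally at 720p,
# which is why it looks soft compared to DLSS Quality (~960p internal).
print(render_res(2560, 1440, "Performance"))  # (1280, 720)
print(render_res(2560, 1440, "Quality"))      # (1707, 960)
```

So "1440p with DLSS Performance" is really a 720p internal render, which helps explain why the RT workload still overwhelms a 3060 Ti: ray tracing cost scales with more than just pixel count.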
Showing 1-15 of 16 comments
Zenocraft Apr 13, 2023 @ 8:05pm 
Go with a 3080 Ti and an i5-13600KF or i5-13600K. If you pair a 4000-series card or a 3080 with your i5-11400F you will fry that CPU. Assuming this is not a troll post: if you want to go for a 4070 at 1440p, you need to upgrade that CPU, or else something serious will happen and you'll need to rebuild your PC from scratch.
sauron2012 Apr 13, 2023 @ 8:14pm 
Originally posted by Zenocraft:
Go with a 3080 Ti and an i5-13600KF or i5-13600K. If you pair a 4000-series card or a 3080 with your i5-11400F you will fry that CPU. Assuming this is not a troll post: if you want to go for a 4070 at 1440p, you need to upgrade that CPU, or else something serious will happen and you'll need to rebuild your PC from scratch.

You've got to be kidding... I used an i5-3570K paired with a GTX 1080 for over five years and it did not fry the CPU. It's actually your post that should be considered a troll... An i5-11400F has no bottleneck problems even with an RTX 3080 Ti once you go 1440p or above. Also, I didn't ask for a CPU recommendation, but for a GPU one.
sauron2012 Apr 13, 2023 @ 8:16pm 
Originally posted by sauron2012:
Originally posted by Zenocraft:
Go with a 3080 Ti and an i5-13600KF or i5-13600K. If you pair a 4000-series card or a 3080 with your i5-11400F you will fry that CPU. Assuming this is not a troll post: if you want to go for a 4070 at 1440p, you need to upgrade that CPU, or else something serious will happen and you'll need to rebuild your PC from scratch.

You've got to be kidding... I used an i5-3570K paired with a GTX 1080 for over five years and it did not fry the CPU. It's actually your post that should be considered a troll... An i5-11400F has no bottleneck problems even with an RTX 3080 Ti once you go 1440p or above. Also, I didn't ask for a CPU recommendation, but for a GPU one.

That said, I've seen many tests with this CPU, and it only started to bottleneck with an RTX 3090, by 10-15%. Also, my CPU has its power limits unlocked and boosts to 4.4 GHz.
fidelio Apr 13, 2023 @ 8:31pm 
Check out the recent 4070 reviews from both Linus Tech Tips and Gamers Nexus on YouTube; both draw many comparisons to the 3080 and have lots of charts.

In the end, which is the better buy will likely come down to which one you can get for the best price, and how expensive electricity is in your market.

Playing The Witcher 3 on a 3080 12GB with RT on, DLSS Performance, and mostly Ultra+ settings at 4K, the result is a beautiful game at around 50-60 fps, and with a good G-Sync display it's still buttery smooth.
Zenocraft Apr 13, 2023 @ 8:40pm 
Originally posted by sauron2012:
Originally posted by sauron2012:

You've got to be kidding... I used an i5-3570K paired with a GTX 1080 for over five years and it did not fry the CPU. It's actually your post that should be considered a troll... An i5-11400F has no bottleneck problems even with an RTX 3080 Ti once you go 1440p or above. Also, I didn't ask for a CPU recommendation, but for a GPU one.

That said, I've seen many tests with this CPU, and it only started to bottleneck with an RTX 3090, by 10-15%. Also, my CPU has its power limits unlocked and boosts to 4.4 GHz.
Hey, if that works, go for it. The 4070 Ti has higher floating-point throughput and memory clock speed. I suggested a CPU upgrade because it may not be wise to push that much work onto an older CPU die. But it's always better to go for the 4000 series and adjust the CPU overclock to match.
sauron2012 Apr 13, 2023 @ 8:48pm 
Originally posted by Zenocraft:
Originally posted by sauron2012:

That said, I've seen many tests with this CPU, and it only started to bottleneck with an RTX 3090, by 10-15%. Also, my CPU has its power limits unlocked and boosts to 4.4 GHz.
Hey, if that works, go for it. The 4070 Ti has higher floating-point throughput and memory clock speed. I suggested a CPU upgrade because it may not be wise to push that much work onto an older CPU die. But it's always better to go for the 4000 series and adjust the CPU overclock to match.

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
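On the bus-width point: the width alone doesn't tell the whole story, since theoretical peak bandwidth is bus width times per-pin data rate (and Ada's much larger L2 cache partly offsets its narrower bus in practice). A rough back-of-the-envelope sketch, using the publicly listed bus widths and GDDR6X data rates for these cards:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s:
    (bus width in bytes) * per-pin data rate in Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 3080 10GB: 320-bit bus, 19 Gbps GDDR6X
print(bandwidth_gb_s(320, 19))  # 760.0 GB/s
# RTX 4070 / 4070 Ti: 192-bit bus, 21 Gbps GDDR6X
print(bandwidth_gb_s(192, 21))  # 504.0 GB/s
```

So on paper the 3080 10GB has roughly 50% more raw bandwidth than either 40-series card; the Ada cards make up the difference with cache rather than bus width.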
Zenocraft Apr 13, 2023 @ 9:14pm 
Originally posted by sauron2012:
Originally posted by Zenocraft:
Hey, if that works, go for it. The 4070 Ti has higher floating-point throughput and memory clock speed. I suggested a CPU upgrade because it may not be wise to push that much work onto an older CPU die. But it's always better to go for the 4000 series and adjust the CPU overclock to match.

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
When buying a GPU, look at future updates: even when a graphics card underperforms at launch, it gets better with driver updates. The 4070 is a better option than the 3080; mine works fine even when I have games and editing software open simultaneously. Requirements may vary, but I've found the 4070 to be the more future-proof choice over the 3080. Do you have PCIe 4.0 on your motherboard?
sauron2012 Apr 13, 2023 @ 10:36pm 
Originally posted by Zenocraft:
Originally posted by sauron2012:

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
When buying a GPU, look at future updates: even when a graphics card underperforms at launch, it gets better with driver updates. The 4070 is a better option than the 3080; mine works fine even when I have games and editing software open simultaneously. Requirements may vary, but I've found the 4070 to be the more future-proof choice over the 3080. Do you have PCIe 4.0 on your motherboard?

My motherboard is an Asus B560-I Gaming WiFi mITX, so yes, it has PCIe 4.0 x16. It says so on the box too.
I mean... some people just never learn. OP, didn't you just more or less flatly admit you felt like you wasted money on the 3060 Ti, despite the fun you had? I don't get people like this: you spent all this money and found out, surprise, nVidia actually makes slow graphics cards. Whatever you do, don't bother wasting your money on a 3080; that's just stupid, especially coming from a 3060 Ti. You'd just end up wasting your money and needing to replace it next year anyway.

I think you should consider a 6800 XT instead. Anything in the 4000 series is too expensive, and I don't trust them to release a non-♥♥♥♥-value graphics card in the RTX lineup. I guess if literally all you care about is ray tracing, a used 3080 Ti could work, maybe even a 3080 12GB. Realistically, though, the VRAM limitation on a 3080 is going to cause the same notorious Ampere problems where 3070 Ti owners found out the hard way that they were getting worse ray-tracing performance on nVidia than on the slightly cheaper RX 6800, just because the VRAM was crippling the card so badly. Literally, 12GB is the absolute minimum for VRAM now, preferably 16GB. This is doubly true for ray tracing: you can maybe get away with less in raster, but RT is a VRAM hog, and you're basically in for low performance, if not stuttering and outright crashes, on nVidia.

Originally posted by Zenocraft:
Originally posted by sauron2012:

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
When buying a GPU, look at future updates: even when a graphics card underperforms at launch, it gets better with driver updates. The 4070 is a better option than the 3080; mine works fine even when I have games and editing software open simultaneously. Requirements may vary, but I've found the 4070 to be the more future-proof choice over the 3080. Do you have PCIe 4.0 on your motherboard?
Those cards are the definition of not future-proofing. Had these people gone with a 6800 XT, they'd have been more future-proofed than with a 3070, 3060 Ti, or 3080 10GB. And on the note of getting better through driver updates, that's kind of an AMD meme. It's true for both brands, but nVidia is notorious for aging badly, and the ray tracing of modern RDNA 3 ends up being better than Ampere's anyway. I'd actually wait for the 7700 XT and 7800 XT to come out. Even if you're dead set on rewarding nVidia's scumbaggery, at the very least we'll possibly see a price cut at that point, like how the 4070 Ti has already had $100 knocked off. It's still a shameless ripoff, but prices have been going down after launch.

For me personally, part of me wants the 7900 XT Taichi, but I know it's a waste of money and I don't need a new graphics card. So if a 6800 XT Taichi is cheap enough on eBay I may spring for that; otherwise I'm probably sitting this out until the 7900 XT hits the used market. 12GB of VRAM is just not enough, and it's insulting that they'd raise prices that much and keep the non-4090-tier cards at basically the absolute limit, crippling them at launch rather than waiting for the VRAM to cripple them two years later. The fact that the 6800 now beats the 3070 Ti in ray tracing stopped being funny and is now just sad. I wonder if the same thing will happen with the 3080 10GB vs. the 6800 XT 16GB, since the 6800 XT is already faster than the 3080 at nearly everything but ray tracing anyway.

Not that it matters. My backlog is so long I'm playing Witcher 1.32, and that runs on a GTX 660 Ti-class card with 1.5GB of VRAM at 1080p medium. I have as much fun in bed as on my main rig at this point, and I only cared about upgrades because I wanted affordable 4K gaming, and nVidia has just ♥♥♥♥♥♥ gamers too much for that to be reasonable at the moment.
Originally posted by sauron2012:
Originally posted by Zenocraft:
Hey, if that works, go for it. The 4070 Ti has higher floating-point throughput and memory clock speed. I suggested a CPU upgrade because it may not be wise to push that much work onto an older CPU die. But it's always better to go for the 4000 series and adjust the CPU overclock to match.

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
>literally costs $100 more
>literally costs as much as the 3070 Ti that was price-gouged at launch
>the 3070 Ti already cost $100 more than the 2070 Super, and the 2070 S cost $500 when the GTX 770 cost $400 and the GTX 980 cost $550
>the card literally has an xx60-class bus
It's the exact opposite of a good deal. nVidia has had horrible value for years now, and it's only getting worse. That's basically a 4060, and the reason they keep charging more while delivering less, generation after generation, is that some gullible fool is willing to pay it anyway. What I don't get is that OP is already on the edge of realizing he got got with his 3060 Ti and wants to replace it, but it's like leaving one abusive partner for another abuser: you're still being abused. It's just horrifically bad value. In fact, this is THE worst-value, most expensive generation of GPUs I think I've ever seen in my life, and that's stretching back to 64MB graphics accelerators and 3dfx stuff.


I have not seen one single appealing GPU all year. The only halfway-reasonable ones are literally used mining cards at this point, and none of them are nVidia. All the nVidia cards still cost so much that it's basically a ripoff even on the used market, where some utterly crippled 3070 is going for basically its MSRP of two years ago. There's no reason for this. The mining craze has ended, there is no more Ethereum GPU mining, most people who were going to upgrade did so after the crypto crash, and market supply has finally recovered. The only reason prices are like this is plain stupidity on the part of gamers. ALL those cards should cost $100-300 less than they're priced, used and new alike. And adding injury to insult, you take one look at them and know anything below a 3080 Ti is going to age like ♥♥♥♥ and is basically the complete opposite of future-proof, as it's already showing its age in so many titles. So not even RT is a selling point for 30-series cards anymore. Bizarrely, Radeon GPUs have fallen to semi-attractive prices, at least in the States. I don't know how different it is in Europe, Asia, or South America, but where I am, nVidia prices are insultingly stupid for what you get, and AMD is only just now returning to normal. If a 6800 XT Taichi hit eBay for $350 I'd definitely consider it, depending on how the rest of the stack lines up. Otherwise this whole year has been a hard pass. I guess the 5700 XT is really good value though, $200 at the moment for basically 1080 Ti performance. Or just get a 1080 Ti for budget builds.
Metro Apr 14, 2023 @ 1:26am 
Buy a GPU with at least 16GB of VRAM; GPUs with 10 or 12GB are a waste of money, especially now that games will use more and more VRAM.
sauron2012 Apr 14, 2023 @ 1:48am 
Originally posted by Red Star, Blood Moon:
Originally posted by sauron2012:

It also has a narrower memory bus: 192-bit, compared to the 320-bit bus the RTX 3080 has. That's why I'm considering both of these cards, since they don't lean as hard on VRAM bandwidth, and that's why I asked the question. Maybe I'll go with the 4070, since it's really cheap and performs like a 3080.
>literally costs $100 more
>literally costs as much as the 3070 Ti that was price-gouged at launch
>the 3070 Ti already cost $100 more than the 2070 Super, and the 2070 S cost $500 when the GTX 770 cost $400 and the GTX 980 cost $550
>the card literally has an xx60-class bus
It's the exact opposite of a good deal. nVidia has had horrible value for years now, and it's only getting worse. That's basically a 4060, and the reason they keep charging more while delivering less, generation after generation, is that some gullible fool is willing to pay it anyway. What I don't get is that OP is already on the edge of realizing he got got with his 3060 Ti and wants to replace it, but it's like leaving one abusive partner for another abuser: you're still being abused. It's just horrifically bad value. In fact, this is THE worst-value, most expensive generation of GPUs I think I've ever seen in my life, and that's stretching back to 64MB graphics accelerators and 3dfx stuff.


I have not seen one single appealing GPU all year. The only halfway-reasonable ones are literally used mining cards at this point, and none of them are nVidia. All the nVidia cards still cost so much that it's basically a ripoff even on the used market, where some utterly crippled 3070 is going for basically its MSRP of two years ago. There's no reason for this. The mining craze has ended, there is no more Ethereum GPU mining, most people who were going to upgrade did so after the crypto crash, and market supply has finally recovered. The only reason prices are like this is plain stupidity on the part of gamers. ALL those cards should cost $100-300 less than they're priced, used and new alike. And adding injury to insult, you take one look at them and know anything below a 3080 Ti is going to age like ♥♥♥♥ and is basically the complete opposite of future-proof, as it's already showing its age in so many titles. So not even RT is a selling point for 30-series cards anymore. Bizarrely, Radeon GPUs have fallen to semi-attractive prices, at least in the States. I don't know how different it is in Europe, Asia, or South America, but where I am, nVidia prices are insultingly stupid for what you get, and AMD is only just now returning to normal. If a 6800 XT Taichi hit eBay for $350 I'd definitely consider it, depending on how the rest of the stack lines up. Otherwise this whole year has been a hard pass. I guess the 5700 XT is really good value though, $200 at the moment for basically 1080 Ti performance. Or just get a 1080 Ti for budget builds.

I use Nvidia Ansel and Nvidia Studio because I make artwork, and I don't plan to switch to AMD. Kappa
sauron2012 Apr 14, 2023 @ 1:51am 
Originally posted by Metroplexmaster:
Buy a GPU with at least 16GB of VRAM; GPUs with 10 or 12GB are a waste of money, especially now that games will use more and more VRAM.

Investing in an RTX 4080 is not within my morals or wallet range. It would require a bigger case, more space, a better CPU, and a better PSU: literally a full upgrade, which I'm not considering.

Like previous people said, I will probably go with the 4070 or the Ti version. Maybe I'll spend the extra money for the better version, but I think the 4070 would also not be a bad choice.
sauron2012 Apr 14, 2023 @ 1:56am 
Originally posted by Red Star, Blood Moon:
I mean... some people just never learn. OP, didn't you just more or less flatly admit you felt like you wasted money on the 3060 Ti, despite the fun you had? I don't get people like this: you spent all this money and found out, surprise, nVidia actually makes slow graphics cards. Whatever you do, don't bother wasting your money on a 3080; that's just stupid, especially coming from a 3060 Ti. You'd just end up wasting your money and needing to replace it next year anyway.

I think you should consider a 6800 XT instead. Anything in the 4000 series is too expensive, and I don't trust them to release a non-♥♥♥♥-value graphics card in the RTX lineup. I guess if literally all you care about is ray tracing, a used 3080 Ti could work, maybe even a 3080 12GB. Realistically, though, the VRAM limitation on a 3080 is going to cause the same notorious Ampere problems where 3070 Ti owners found out the hard way that they were getting worse ray-tracing performance on nVidia than on the slightly cheaper RX 6800, just because the VRAM was crippling the card so badly. Literally, 12GB is the absolute minimum for VRAM now, preferably 16GB. This is doubly true for ray tracing: you can maybe get away with less in raster, but RT is a VRAM hog, and you're basically in for low performance, if not stuttering and outright crashes, on nVidia.

Originally posted by Zenocraft:
When buying a GPU, look at future updates: even when a graphics card underperforms at launch, it gets better with driver updates. The 4070 is a better option than the 3080; mine works fine even when I have games and editing software open simultaneously. Requirements may vary, but I've found the 4070 to be the more future-proof choice over the 3080. Do you have PCIe 4.0 on your motherboard?
Those cards are the definition of not future-proofing. Had these people gone with a 6800 XT, they'd have been more future-proofed than with a 3070, 3060 Ti, or 3080 10GB. And on the note of getting better through driver updates, that's kind of an AMD meme. It's true for both brands, but nVidia is notorious for aging badly, and the ray tracing of modern RDNA 3 ends up being better than Ampere's anyway. I'd actually wait for the 7700 XT and 7800 XT to come out. Even if you're dead set on rewarding nVidia's scumbaggery, at the very least we'll possibly see a price cut at that point, like how the 4070 Ti has already had $100 knocked off. It's still a shameless ripoff, but prices have been going down after launch.

For me personally, part of me wants the 7900 XT Taichi, but I know it's a waste of money and I don't need a new graphics card. So if a 6800 XT Taichi is cheap enough on eBay I may spring for that; otherwise I'm probably sitting this out until the 7900 XT hits the used market. 12GB of VRAM is just not enough, and it's insulting that they'd raise prices that much and keep the non-4090-tier cards at basically the absolute limit, crippling them at launch rather than waiting for the VRAM to cripple them two years later. The fact that the 6800 now beats the 3070 Ti in ray tracing stopped being funny and is now just sad. I wonder if the same thing will happen with the 3080 10GB vs. the 6800 XT 16GB, since the 6800 XT is already faster than the 3080 at nearly everything but ray tracing anyway.

Not that it matters. My backlog is so long I'm playing Witcher 1.32, and that runs on a GTX 660 Ti-class card with 1.5GB of VRAM at 1080p medium. I have as much fun in bed as on my main rig at this point, and I only cared about upgrades because I wanted affordable 4K gaming, and nVidia has just ♥♥♥♥♥♥ gamers too much for that to be reasonable at the moment.

I never said I wasted money. It's just that this game, which I waited for and which comes from my country, got more demanding on the GPU side with better RT cores. The 3060 Ti is a great card, and without RT it gets 70-80 FPS in Witcher 3 on DX12, which is great. The thing is, I really wanted to replay this game with RT, and this is the one time the 3060 Ti disappointed me. Other new games run perfectly, except RE4 Remake, and that's more Capcom's failure to compress its textures... a 13GB VRAM requirement for these graphics? Come on...
Last edited by sauron2012; Apr 14, 2023 @ 1:57am
Originally posted by sauron2012:
Originally posted by Red Star, Blood Moon:
I mean... some people just never learn. OP, didn't you just more or less flatly admit you felt like you wasted money on the 3060 Ti, despite the fun you had? I don't get people like this: you spent all this money and found out, surprise, nVidia actually makes slow graphics cards. Whatever you do, don't bother wasting your money on a 3080; that's just stupid, especially coming from a 3060 Ti. You'd just end up wasting your money and needing to replace it next year anyway.

I think you should consider a 6800 XT instead. Anything in the 4000 series is too expensive, and I don't trust them to release a non-♥♥♥♥-value graphics card in the RTX lineup. I guess if literally all you care about is ray tracing, a used 3080 Ti could work, maybe even a 3080 12GB. Realistically, though, the VRAM limitation on a 3080 is going to cause the same notorious Ampere problems where 3070 Ti owners found out the hard way that they were getting worse ray-tracing performance on nVidia than on the slightly cheaper RX 6800, just because the VRAM was crippling the card so badly. Literally, 12GB is the absolute minimum for VRAM now, preferably 16GB. This is doubly true for ray tracing: you can maybe get away with less in raster, but RT is a VRAM hog, and you're basically in for low performance, if not stuttering and outright crashes, on nVidia.


Those cards are the definition of not future-proofing. Had these people gone with a 6800 XT, they'd have been more future-proofed than with a 3070, 3060 Ti, or 3080 10GB. And on the note of getting better through driver updates, that's kind of an AMD meme. It's true for both brands, but nVidia is notorious for aging badly, and the ray tracing of modern RDNA 3 ends up being better than Ampere's anyway. I'd actually wait for the 7700 XT and 7800 XT to come out. Even if you're dead set on rewarding nVidia's scumbaggery, at the very least we'll possibly see a price cut at that point, like how the 4070 Ti has already had $100 knocked off. It's still a shameless ripoff, but prices have been going down after launch.

For me personally, part of me wants the 7900 XT Taichi, but I know it's a waste of money and I don't need a new graphics card. So if a 6800 XT Taichi is cheap enough on eBay I may spring for that; otherwise I'm probably sitting this out until the 7900 XT hits the used market. 12GB of VRAM is just not enough, and it's insulting that they'd raise prices that much and keep the non-4090-tier cards at basically the absolute limit, crippling them at launch rather than waiting for the VRAM to cripple them two years later. The fact that the 6800 now beats the 3070 Ti in ray tracing stopped being funny and is now just sad. I wonder if the same thing will happen with the 3080 10GB vs. the 6800 XT 16GB, since the 6800 XT is already faster than the 3080 at nearly everything but ray tracing anyway.

Not that it matters. My backlog is so long I'm playing Witcher 1.32, and that runs on a GTX 660 Ti-class card with 1.5GB of VRAM at 1080p medium. I have as much fun in bed as on my main rig at this point, and I only cared about upgrades because I wanted affordable 4K gaming, and nVidia has just ♥♥♥♥♥♥ gamers too much for that to be reasonable at the moment.

I never said I wasted money. It's just that this game, which I waited for and which comes from my country, got more demanding on the GPU side with better RT cores. The 3060 Ti is a great card, and without RT it gets 70-80 FPS in Witcher 3 on DX12, which is great. The thing is, I really wanted to replay this game with RT, and this is the one time the 3060 Ti disappointed me. Other new games run perfectly, except RE4 Remake, and that's more Capcom's failure to compress its textures... a 13GB VRAM requirement for these graphics? Come on...
Well, like the other guy said, 16GB is preferable; 12GB is literally the bare minimum for high resolution with RT enabled at this point. And it isn't just those games. More and more I'm seeing random games use huge amounts of VRAM, like how some people got upset and trashed TLoU and Hogwarts Legacy reviews because they didn't realize the problems Ampere had, especially in the 3060 Ti/3070/3070 Ti range and also the 3080 10GB. All those cards had problems, and apparently driver problems too.

If you're doing lots of digital art, then I'd imagine more VRAM is a necessity, though I'm not sure what you mean by art. If it's something you make money off of, rendering and such, then that makes sense. Still, just be aware that AMD's recent problem is slower ray tracing, while nVidia's problem seems to be designing cards that go obsolete fast. So it wouldn't surprise me if the 4070 starts to age too, but the 3080 definitely doesn't have enough VRAM or power at this point. For me personally, it's all just so ridiculously overpriced that I can't consider them anymore; since I don't use my card to make money, it's all about performance and higher resolution for me. But the prices are nuts. I was looking at the 7900 XTX Taichi last night, and for something that comes off as an 80-class card, it's like $1100+ where I am. Probably gonna just wait two more years at this point.

Date Posted: Apr 13, 2023 @ 6:45pm
Posts: 16