Yeah, that is why I was saying just stick with what you've got for another few years and start socking the $$ away into a "new build budget". 11th Gen + a 3070 Ti is a perfectly capable rig as of now and at least for the next 3-4 years. Display tech is something that doesn't move super fast and usually takes longer than expected. MicroLED is already in the specialized market, so I'm hoping that in another 3-4 years it will get down to a decent consumer level with somewhat reasonable, if somewhat higher, price options.
As for the GPU to support it, that is another thing that I think will be in a much better position in 3-4 years for both performance and pricing (i.e. much better price:performance). NVIDIA moving to an MCM design, assuming they are successful with that, as well as getting preferred rates on the latest process node from TSMC through the partnership they noted at GDC, would certainly give them a much better position with regard to performance:cost. It should substantially improve their yield per wafer with smaller chiplets on a smaller node. I'd expect NVIDIA to both increase their margin per GPU and lower end-user pricing for graphics cards in that scenario. Some people are expecting the move to come with the RTX 50 generation, and while I think that is a possibility, I think it will be the generation after.
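Just to put rough numbers on the yield point, here's a back-of-the-envelope sketch assuming a simple Poisson defect model with made-up die sizes and defect density (it ignores packaging cost, edge loss, and binning, so treat it as illustrative only):

import math

def die_yield(area_mm2, defects_per_mm2):
    # Poisson yield model: fraction of dies that come out defect-free
    return math.exp(-area_mm2 * defects_per_mm2)

def good_gpus_per_wafer(die_area_mm2, dies_per_gpu, defect_density, wafer_area_mm2=70686.0):
    # ~300 mm wafer; candidate dies = wafer area / die area (edge loss ignored)
    candidates = wafer_area_mm2 / die_area_mm2
    good_dies = candidates * die_yield(die_area_mm2, defect_density)
    return good_dies / dies_per_gpu

D = 0.001  # assumed defects per mm^2, purely illustrative
print(f"monolithic 600 mm^2 GPUs per wafer: {good_gpus_per_wafer(600, 1, D):.0f}")
print(f"4x 150 mm^2 chiplet GPUs per wafer: {good_gpus_per_wafer(150, 4, D):.0f}")

With those made-up numbers you get roughly 65 monolithic GPUs per wafer versus roughly 100 chiplet-based GPUs, which is the whole appeal of MCM from a cost standpoint.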
This is the same reason I haven't upgraded to a 4090 or a 7800X3D or 13900K. I'm putting $100/mo into a new build budget and I'm planning on doing a new build in about 3 years, once I see what Intel delivers after moving to RibbonFET and tile-based CPU packaging, and once NVIDIA moves to MCM. I'm also curious to see how Intel does with Battlemage and Celestial during that time, and whether AMD can get in gear with features more comparable to NVIDIA's and substantially improve their ray tracing capabilities.
I'm planning an upcoming GPU upgrade and was actually just doing some experimenting and here's what I'm dealing with. This is just Minecraft with shaders.
https://imgur.com/a/ePaETV1
I use a GTX 1060 6 GB, and Pascal was one of nVidia's VRAM-blessed generations. I'm also in need of (much) more GPU processing power, so it's not just a VRAM issue, but if I had 3 GB or 4 GB of VRAM instead, there's no way it could have lasted this long. I am pushing the settings a bit here, so in normal play it's mostly fine (but still absolutely not fine in some situations). But in case I want to push this render distance and raise the resolution (something I'm not even trying here, so this truly isn't "worst case"), I wanted an idea of what I might need. And my takeaway, between this and seeing what recent games can use, is that less than 12 GB is going to compromise me going forward, and honestly 16 GB is probably what I want. This game, or at least how I'm running it, can be something else.
Now this means absolutely nothing to you unless you play it and have it set exactly as I do, so what I'm dealing with means nothing if you're not dealing with it. But given I'm seeing recent games come out and push VRAM (and RAM and core/thread use), I'd keep it in mind.
Your 11700K is great.
You have 32 GB RAM. You're good there.
GPU processing is great too. VRAM would have me cautious, though.
If I had that exact system, I'd be skipping a CPU upgrade and doing an overhaul years from now, and instead have funds on standby for a possibly needed GPU upgrade. Which sounds like the plan you recently decided to consider. And if you don't need a GPU upgrade? Then you're just that much better off. Keep using the RTX 3070 Ti. When it's not in situations where it's lacking VRAM, it's a great GPU. And you make the most of it by using it as long as it's still sufficient. Upgrade it only when it becomes an issue for your own uses.
It's a shame too, because while shorting on the VRAM has never really held Nvidia's GPUs back much, in recent generations anyway, it looks like it will finally come back to bite them with GPUs like the 3070, 3070 Ti, and even the 3080 10GB having plenty of horsepower but experiencing performance issues when VRAM usage spikes too high. Still great GPUs, but definitely held back by the lack of VRAM into 2023 and beyond.
As far as upgrading goes, though, I think buying a new GPU is tied to the purchase of a new display. I can make do at 4K/60, but beyond that, I think more GPU power will be needed. So, when I finally decide which display I will purchase, I'll make a decision on the GPU. Whichever comes first, the other is to follow. Even if I go back down to 3440 x 1440, I'll still want a GPU that can push the frames for what the screen can handle. Probably 165/175Hz for the 1440, and 120/144Hz for the 4K. Lol, while I doubt many games will keep a consistent 120 at 4K, 60FPS is still my minimum; if it stays at that or reasonably above, I am fine.
Thanks for the feedback. Oh, 64GB now, btw. Didn't need that either, but the same 32GB kit was on sale back in December, and I said why not. I like the look of 4 DIMMs, and the kit I had originally bought was 16GB single-rank DIMMs; ideally you want dual rank, but putting 4 single-rank DIMMs in behaves much like dual rank. The first kit I bought was manufactured by Samsung and the second one by Hynix, but I haven't had any issue with them at all. XMP is enabled and everything is stable. G.Skill TridentZ 3600MHz CL18.
If you had a GPU with more VRAM and it was more unlikely you'd need more in the coming few years, I'd say sure, entertain the CPU upgrade, even if it's not needed. But I'd personally wait on the CPU until Zen 5 and its 3D counterparts, and Intel's next platform, have launched. And honestly you likely shouldn't have to upgrade even then (but it might be more justifiable as a "want" by then). But yeah, the next generation or even the one after is more what I'd do. Sit on it, or maybe a GPU upgrade (and only if needed) before then.
Besides, like me, if you've invested into 64 GB of DDR4, there's that much more reason to make use of your current platform.
I have an R9 5900X + 32GB + RX 7900 XTX rig. My monitor is an old Samsung 1st-gen super ultrawide 32:9 3840x1440 144Hz display, and I can hit >100fps in many games (rasterized games, that is). Even some RT titles are quite playable, like Metro Exodus PC Enhanced (the only game with RT that I care anything about). Playing DSR at max settings with RTAO enabled, I still hit well clear of 100fps.
So far, other than some minor driver bugs at its initial release, ever since I started using my card around mid-February, it's been pretty good. BTW, I'm not advocating that OP buys this card, just that drivers are improving; this was more to rebut Daswulf's claim about AMD cards. It is improving (not just me; members of the Guru3D and TPU RX 7900 series threads say the same).
But honestly, get the best of the best. No need for a CPU/platform upgrade; the CPU's perfectly fine for gaming. Just focus your budget on the best card available now, the RTX 4090 24GB. Like the RX 7900 XTX, 24GB of VRAM would ensure that it stays relevant longer.
In this regard, I feel that nVidia is pulling the same crap it did with the RTX 3070/3070 Ti/3060 Ti with the RTX 4070 Ti. By strapping the latter with 12GB of VRAM, which IS sufficient for now at perhaps up to 1440P, but with a game or two already hitting 12GB of VRAM usage, it's just a matter of time before the shortfall in VRAM rears its ugly head (much like the 8GB mid-range previous-gen nVidia cards). So get the best there is right now...
The 4090 would be nice, and should perform admirably for years to come, but considering its price and power demands, which would include me having to spend another couple hundred on a PSU, I think the 4080 or the 7900XTX are realistically the only sensible upgrade paths from a 3070 Ti.
I can make an EVGA Supernova 750 GT work with a 4080 or 7900XTX, but not with a 4090. Between the two, the 4080 or the 7900XTX, I lean more toward the 4080 because of the improved RT performance and DLSS, and better power efficiency, but I don't think I can stomach $1200+ for this thing. The 7900XTX is more sensible at $1000, and has 24GB of VRAM.
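For what it's worth, here's a quick back-of-the-envelope headroom check on that 750 W unit (nominal board power figures plus an assumed ~150 W for the rest of the system; real transient spikes can be considerably higher, so treat this as a rough sanity check, not a definitive compatibility call):

# Rough PSU headroom check; board powers are nominal specs, system draw is an assumption.
psu_watts = 750
system_base = 150  # assumed CPU + board + drives + fans under gaming load
gpus = {"RTX 4080": 320, "RX 7900 XTX": 355, "RTX 4090": 450}

for name, board_power in gpus.items():
    load = system_base + board_power
    headroom = psu_watts - load
    print(f"{name}: ~{load} W total, {headroom} W headroom ({headroom / psu_watts:.0%} of the PSU)")

On paper the 4090 still "fits", but about 20% headroom on a 750 W unit with a card known for big transients is exactly the kind of margin I wouldn't trust, which is why the 4080/7900XTX are the realistic ceiling here.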
But while the 7900XTX's RT performance is fine, and you have FSR, DLSS is really nice to have, and DLSS is very beneficial in games that use RT, so not having DLSS for RT is a bummer. Most games that support DLSS now also have FSR, but there is no denying the cool features DLSS 3 brings to the table; whether you like them or not, they are there to help performance, and do a great job IMO. FSR is getting there, but it's behind DLSS when it comes to new innovations and techniques. They'll most likely come to FSR, but later.
If the 4080 can go down in price, I would consider it. But who knows, if the 4080 drops, maybe the 7900XTX would soon follow. But price cuts look like something that might be wishful thinking in the GPU marketplace.
Unfortunately, not counting x90 models, that cuts out EVERYTHING on nVidia's side not called the RTX 3060, RTX 3080 12 GB, RTX 4070 Ti, and RTX 4080. And I feel ALL of them are overpriced for what they are (and the first of those is awfully low on performance IMO for what I'm looking for in a GPU and how long I want it to last).
For me it's quite easy to just pick an RX 6800 XT as I'm coming from a GTX 1060 6 GB. But I don't blame you if that option is less appealing to you as you wouldn't be moving up TOO much in rasterized performance, and possibly even losing some ray tracing performance, so basically you're spending $500+ on a side grade to fix nVidia's VRAM greed mistake. Boo.
That's partly why I suggested sticking with the RTX 3070 Ti as long as you can. Yeah the VRAM is a concern, but... it's not a problem until it is. So until you run out, try and make the most of it and hope the market has better options for you at that time.
If not, yeah... I think the RTX 4080 or RX 7900 series are the only real options (besides the RTX 4090 of course but I think that's not on the table for you).
I'd wait for the new hardware drop and see where it's going. You're jumping the gun on 4K; it may be a reality with the 5000 series GPUs. As of now, even with the 4090, it's just OK, given the poor optimization of AAA releases and new games demanding even more power (unless, of course, gameplay is secondary to visuals). Your rig is very good; you're just exceeding its capabilities for a solid all-around gaming experience with the 4K monitor.
So in your case, stick with it; it's a great CPU.
Just look at the RX 6800 16GB and the RTX 3070 Ti: both are similarly priced (last I checked), with the former being a bit cheaper. Yet the RX 6800 outperforms the RTX 3070 Ti in rasterized gaming, and has more VRAM to boot. It can easily handle newer VRAM-intensive titles, something the RTX 3070 Ti can't. Like you, I feel that the 12GB of VRAM on the RTX 4070 Ti falls a tad short of what the card deserves.
I'm waiting for AMD to release FSR3 frame generation to see how good or bad it is; it would hopefully help with RT in games that support FSR3. As it is, I'm happy with my RX 6900 XT simply because it's a powerful enough card with a good amount of VRAM, and 24GB on the RX 7900 XTX is just icing on the cake. AMD is going to release the RX 7800 XT sometime in the future, and I'd hazard a guess that unlike the RTX 4070 Ti 12GB, it'll probably have 16GB of VRAM.
I was forced into 4K when I had to sell my Acer Predator 3440 x 1440 UW a couple years ago. I ended up hooking my PC up to my TV, and was actually amazed by the picture quality, so the experience has been great. It doesn't have the bells and whistles, like the 120Hz refresh rate and Gsync, that the UW had, but it is a decent TV, with low latency (for a TV) and a game mode.
The picture quality and brightness are great, and it has HDR10, but obviously I have locked FPS to 60 all the time. The 3070 Ti is not a beast 4K card, but I have been able to make it work and have settings maxed or close to max in pretty much every game. So, to me it is smooth as butter. But obviously it would be nice to be at a high refresh rate again. Even if it isn't 120FPS plus in all games, Gsync allows games to look smooth at a range of FPS over 60.
So, it has still been a great and good-looking experience at 4K on a big screen. And the setup I have allows me to sit a good 6 feet away from the screen, so PPI hasn't been an issue. That's why, already being there, I figured I might as well stay at 4K and improve the display. But even if I do stay at 4K, the new display will need those bells and whistles that I was talking about.
But I have seriously thought about going back to 3440 x 1440 UW. There are some great displays out there. I would have to change the setup again, and honestly, I would still be looking for a better GPU, to better make use of those high frames.
Though, I am in no rush for a display now. I'm still having a great experience playing at 4K and 60FPS. Even if it isn't 120FPS or more, it is a locked 60, so to me, still very smooth. I have made the 3070 Ti work for me. Whether that is with DLSS or tweaking a few settings here and there, it has given me a solid 60. And I would like to make my hardware last as long as possible. Either way, upgrading the display is my next priority. Then the GPU that will push that display.
Also, if you don't have a high-end GPU (like a 7900 XT or above), the point of the 7800X3D is kinda moot, because the output will be the same as with other, relatively cheaper CPUs, which is even more true at higher resolutions.
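A quick toy model of why that is (made-up per-frame times, purely illustrative): the frame rate is roughly limited by whichever of the CPU or GPU takes longer per frame, and at higher resolutions the GPU side grows while the CPU side stays about the same.

# Toy model: effective FPS is limited by the slower of the CPU and GPU per frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, cheaper_cpu = 4.0, 6.0        # assumed per-frame CPU times in ms
gpu_times = {"1080p": 5.0, "4K": 14.0}  # assumed per-frame GPU times in ms

for res, gpu_ms in gpu_times.items():
    print(f"{res}: fast CPU {fps(fast_cpu, gpu_ms):.0f} fps, cheaper CPU {fps(cheaper_cpu, gpu_ms):.0f} fps")

With those assumed numbers, at 1080p the faster CPU shows up in the results, but at 4K both land on the same ~71 fps because the GPU is the bottleneck either way.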
Your 3070 Ti 8GB card's value is going to plummet. Sell now and upgrade to at least a 16GB card, anything but an 8 or 10GB card. Trust me on this one. Good luck.
I am fine with 60 FPS, but not less. So, while I regularly played games at above 60 FPS, many times it was all over the place. Obviously GSync helps with varying FPS at a high refresh rate, but now that I am at a constant 60 FPS, it does seem smooth. So it isn't 120FPS, but that is why my next GPU will be able to more consistently hit those numbers above 60FPS, whether that is at 4K or 3440 x 1440.
I mean, not every game is super demanding at 4K. Only a couple of my games are super demanding; with most of my games, even the 3070 Ti can do 4K maxed at well above 60FPS. So I don't think 4K is as much of a hurdle as you make it out to be. A 4080 or a 7900XTX gets decent enough numbers above 60FPS in most games at 4K for me. And if not all of them get well above 60FPS, 60FPS is still fine for me.
Besides, one thing I will say about a big 4K screen: it looks a lot better and sharper than the smaller, higher refresh rate monitors that I had in the past. If I stick to 4K, my next display will look even better. But I am still strongly considering a UW 3440 x 1440. Even then, like I said, I will want a GPU upgrade.
Exactly. Upgrading from a 3070 Ti at the moment is a hard proposition. When upgrading, it's nice to see a doubling in performance, which you would get with a 7900XTX or a 4080, and especially a 4090, but below that, is it really worth spending that much for the upgrade you'd be getting?
That's why it is never really wise to upgrade within a generation, even from, say, a 3060 to a 3080, unless you really have the cash to burn; it is always best to upgrade after a couple of generations or more, the longer the better, if you want to see the most meaningful performance uplift. I personally try to always go at least a couple of generations.
But I can see how hard it is wanting to upgrade from a 1060, when you are not really impressed with the offerings, and what would impress you is way overpriced. But at least you know when you upgrade, you will have a HUGE boost in performance.
And honestly, I would like to wait till the 50 series or the 8000 series. A better GPU is a want, not a need. Like I said, upgrading for me is based on getting a new display. Right now, I am in the market for a display. But I am in no rush. I can stick with what I have for longer, save more money, and then we will see what the new tech brings, or if a current monitor that I like gets a good sale that is too hard to pass up, then I will see.
But until I have that need for a better GPU with a new display, no need to rush. If I bought a display today, I would probably be looking at getting a 4080 or 7900XTX. But if I wait long enough, perhaps those options will improve.