Another thing to consider is how powerful your PSU is. You will need at least a decent 500 W PSU for a 6600 and a 550 W PSU for a 3060.
The 3060 will give you a few more frames per second than a 6600 at 1080p, but right now a 3060 at a decent price might be harder to get than a 6600 at a decent price.
But if you want to play Cyberpunk 2077 at 1080p with high settings and at least 60 fps, I wouldn't really recommend anything less than an RX 6600.
Here is a review of the RX 6600 from TechSpot. It has benchmarks for Cyberpunk and other games, and also compares frame rates against other GPUs.
https://www.techspot.com/review/2343-amd-radeon-rx-6600/
It is also worth mentioning that Cyberpunk is a pretty demanding game. Here I am talking about High settings at 1080p. But even at 1080p, if you want to max out settings (Ultra) and maintain 60 fps, then I would recommend something better than a midrange card.
24 GB of RAM? That is definitely not a matched kit. It still surprises me that a gaming PC would be set up that way. You want 8 GB, 16 GB, or 32 GB, not 24.
24 means you just stuffed an extra DIMM in there with no thought or reason, purely for the sake of more RAM while disregarding the configuration completely. Not ideal for a gaming PC.
I would definitely fix that ram before I would even think about a GPU.
Nah, man. Listen, I have three 8 GB RAM sticks.
That's why I wrote 24.
Been a while since I came across it; either way, it's best to remove the unmatched stick.
OP, what's your actual budget?
With your specs, your CPU will hold you back in a lot of areas. With the current GPU shortages / price hikes, it would probably be better to make do with what you have and save for a complete new build once prices / availability improve.
I'm not too serious about gaming and all that. I just want something that will handle most new games without me having to go back and forth with the graphics settings.
I guess I'll just stick to an RTX 3060 Ti.
I totally agree with the above: remove the oddball DIMM and just wait.
Welcome to the new norm of buying a graphics card. The search is the hardest part.
3060 is around $557 where I live.
The short version is, with three 8 GB modules on a platform limited to dual channel (so, no triple-channel capability), 16 GB worth will run in dual channel and 8 GB worth will run in single channel. So theoretically the "extra" RAM (that is, the third 8 GB added on top of the 16 GB) isn't hurting, because if you go beyond 16 GB, single-channel speed is still far better than lacking RAM entirely. In practice, however, I'm not sure if the system is "smart" enough to fill the 16 GB of dual-channel space before touching the single-channel space (or if it works in a way that alleviates that concern). I want to say that, with how systems manage address space, the first 16 GB would be mapped in dual-channel mode, but that's "way above my pay grade" (understanding), so to speak. Maybe someone else knows this part.
Additionally, many things aren't hurt as much by single-channel speed as some may think. Yes, it's definitely a difference, and I'd see no reason NOT to use dual channel if it is supported, but the practical difference isn't anywhere near as large as the theoretical change (twice the effective bandwidth) suggests. The real-world difference is typically a fraction of that (often as low as single digits, and one you won't feel in a blind test).
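To put rough numbers on that theoretical ceiling, here's a back-of-envelope sketch. It assumes DDR4-3200 (the thread never states the actual speed, so that's an illustrative pick); the formula itself is just transfer rate times the 64-bit (8-byte) channel width:

```python
# Back-of-envelope DDR4 bandwidth math (assumes DDR4-3200; illustrative only).
# Peak bandwidth per channel = transfer rate (MT/s) * bus width (8 bytes).

def channel_bandwidth_gbs(transfers_mts: int, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth of one 64-bit memory channel in GB/s."""
    return transfers_mts * bus_bytes / 1000

single = channel_bandwidth_gbs(3200)  # one channel on its own
dual = 2 * single                     # two interleaved channels

print(f"single channel: {single:.1f} GB/s")  # 25.6 GB/s
print(f"dual channel:   {dual:.1f} GB/s")    # 51.2 GB/s
```

Games rarely see anywhere near that 2x in practice, which is the point above: the measured frame-rate gap is usually a small fraction of the doubled theoretical bandwidth.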
***Keep in mind I refuse to play below 60fps so if it won't stay at above 60 at all times, I'm not playing it***
4K: DLSS has to be set to Ultra Performance, and you can run it on High @ 4K @ 60 fps. No ray tracing at all.
1440p: DLSS Balanced, and it can be set to Ultra with no ray tracing, or set to High with reflections ray traced @ 60.
1080p: DLSS not required; it can be run at Ultra with all ray tracing features on, BUT lighting has to be just on, not the other two higher settings.
The best display pairing I've found for this card is 1440p. I used to have it hooked up to a 50" 4K Samsung, but that was way too much screen for this card. Now it's on a 32" 1440p Asus monitor and it's a much better pairing.
I'd avoid AMD until FSR is in more games, though I did test it in Resident Evil 8 and it works quite nicely (tested with an AMD 5700 XT and an NVIDIA 3060).