The 3060 Ti, 4060 Ti and 4070 run CP2077 at 1080p or 1440p 16:9 very well now that the game and drivers are more optimized. Maybe not on the Ultra preset, but still; go watch some GPU benchmark comparison videos showing games like CP2077, RDR2 and other demanding titles nearly cranked up and running very smoothly on these GPUs.
A 9900K with a 5.0 - 5.2 GHz all-core OC had no issues handling 2x 1080 Ti in SLI. It would be fine for up to around a 4070 Ti or so.
Not really; my 9900K at 5.3 GHz was still holding back the 3090 by about 10%, and a 4070 Ti is much faster than SLI 1080 Tis, especially these days.
As for the OP, what's your budget and are you going to overclock?
The best-value fast option is a 12700K; slap it on a Z690 or Z790 board. I use an MSI Edge WiFi DDR4 myself with my 13900K / 4090.
With 14th gen dropping, I'd expect some good deals on 12th and 13th gen second-hand.
But you only have slow/cheap RAM, so it might be time for an upgrade; DDR5 could be worth looking at, again depending on your budget.
How old is this cooler? They don't tend to last long enough to keep reusing across builds (such as changing motherboard + CPU years later).
https://static.cdprojektred.com/cms.cdprojektred.com/e4fde54e7fcfca001f98a02d2594d9435806d700.jpg?gasgm
The base game with the 2.0 update runs just fine on a 9700K + RTX 4070 Ti at 1440p/144Hz (21:9).
The DLC runs almost the same as the main game. Don't let that guy tell you otherwise. I bought the DLC right after release and the difference is barely noticeable. Same for official benchmark results on CBase, LTT or Overclock.net. Maybe a little more GPU-intensive,
but nothing that can't be patched by the devs in a couple of weeks. Remember the main game's release. ^^
I guess 007 is still spreading his nonsense on this sub, even after being called out and stalking my profile with insults and comments until I had to block him.
edit:
Ignoring him would be the best thing you could do, but yeah, I'm with @Monk on that one. A 9900K will most definitely hold back a 6800XT/6900XT/3080 Ti/3090 (and higher) a tad.
Not at 4K, but I'd guess everything between 1080p 16:9 and 1440p 21:9 would be bottlenecked by a 9900K. Not by much, but you'll notice.
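To make the "holding back by about 10%" claim concrete, a CPU bottleneck just means the frame rate the CPU can sustain sits below what the GPU alone could deliver; the slower component caps the result. A minimal sketch (the fps figures below are made-up illustration, not benchmark data from this thread):

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    # The slower of the two components caps the delivered frame rate.
    return min(cpu_fps, gpu_fps)

def bottleneck_pct(cpu_fps: float, gpu_fps: float) -> float:
    # Share of the GPU's potential frame rate left on the table by the CPU.
    return (1 - effective_fps(cpu_fps, gpu_fps) / gpu_fps) * 100

# Hypothetical numbers: a CPU that can feed 126 fps paired with a GPU
# capable of 140 fps at the same settings.
print(effective_fps(126, 140))              # 126
print(round(bottleneck_pct(126, 140), 1))   # 10.0
```

This is also why the bottleneck shrinks at 4K: the GPU-limited figure drops while the CPU-limited figure stays roughly constant, so the `min()` shifts to the GPU side.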
https://support.cdprojektred.com/en/cyberpunk/pc/sp-technical/issue/1556/cyberpunk-2077-system-requirements
PS: it is not stalking or insulting; I just stated a fact.
Dude is a menace to everyone's sanity.
You think you need a 4080 to play CP2077? I highly doubt you're at the level where you can tell the input delay between FG on and FG off. As for seeing whether DLSS is on or off, look for the better-looking image; it's likely DLSS on now!
It is true; even a deaf person can tell the difference between DLSS and non-DLSS.
It is right there. Google is not hard. If you have DLSS turned off, the framerate will drop. It proves me correct that the updated Cyberpunk 2077 does need an RTX 4080 for higher RT settings with no DLSS.
https://support.cdprojektred.com/en/cyberpunk/pc/sp-technical/issue/1556/cyberpunk-2077-system-requirements
You don't need an Nvidia RTX 4080 to play Cyberpunk, your listed system should do great.
I played Cyberpunk on a far inferior system, and it does not like CrossFire (2014 AMD rig). If you go AMD, go with the best you can afford right now. A motherboard/CPU swap is not a bad idea, but remember to Sysprep your OS before the final shutdown; choose shutdown, not reboot.
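For reference, the Sysprep step mentioned above is typically run from an elevated Command Prompt on the old hardware; this is the standard generalize-and-shutdown invocation (verify against Microsoft's Sysprep documentation for your Windows version before relying on it):

```shell
:: Run elevated on Windows before the final shutdown of the old build.
:: /generalize strips hardware-specific driver/device state,
:: /oobe re-runs Windows setup on the next boot,
:: /shutdown powers off instead of rebooting (swap the board before next boot).
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```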
You might get better performance with DLSS off. No worries, it won't reduce texture quality; it's just a "replacement method of rendering". A gimmick, just like Nvidia's Weapon Debris.