People here are looking for a solution, and the devs do respond.
Current GPUs like the 3090 and, most likely, the future 3080 Ti HAVE the SLI feature.
NVIDIA still offers SLI solutions, so that means SLI is alive. If the next generation loses the SLI feature, then I'd say SLI is dead.
Some gamers invested a huge amount of money to get the best experience in Cyberpunk. 4K resolution at 30-40 FPS on a single 3090 with Ultra + RTX Ultra + DLSS Quality? I wouldn't be happy.
No, I don't have a 3090; I'm using two 2080s, and the game's performance is kind of bad.
Ya, I've run about 5 multi gpu setups over the last 10-12 years and had really great experiences with all of them. I've just recently sold my 1080 Ti SLI setup and bought a used 2080 Ti for about half its original MSRP - you're correct on what NVidia is doing with the overpriced cards now - that's why I stopped buying them new. I'll likely continue running one or two generations behind moving forward and just buying via 2nd or 3rd hand market as I simply will not normalize the pricing of these newer cards by purchasing them new. Over $1k for a single video card? lol...no. And I have tons of $, but that's just ridiculous.
With that said, here's some knowledge for some folks:
SLI is DX9 through DX11 only. It's implemented through Nvidia's drivers together with the game devs. Yes, sometimes you can skirt around this by using SLI profiles from other games.
mGPU, explicit or implicit, is what DX12 uses for multiple graphics cards. It's implemented by the game devs alone, within their game code, with no driver-side requirement (i.e. no SLI profiles).
Cyberpunk 2077 is DX12 only. Therefore, SLI doesn't even apply - it's mGPU only.
So, if you're going to bad-mouth multi-card setups, do everyone a favor and at least bad-mouth the correct version, and bad-mouth CDPR for not putting in the little extra effort that even smaller game developers have.
^ Well-said. I liked how well-implemented mGPU support was with the Tomb Raider games as an example. 4k / 60fps without a hitch on a pair of 1080 Ti cards.
The TR team at Crystal Dynamics shows what can be done with mGPU support. I second your opinion. Did it require a bit of extra effort to implement? Yes, but their DX12 mGPU solution landed before they even added full RT support, they kept supporting mGPU on DX12, and the RT features worked just fine. It looks gorgeous and the framerate is buttery smooth running two 1080 Tis in my old rig and two 2080 Tis in my current one. The squeaky wheel gets the oil, so keep these mGPU threads alive, folks.
CDPR had great DX11 SLI support for The Witcher 3, and it is a sight to behold maxed out on a 4K HDR display six years later. I can see Nvidia not wanting them to do mGPU day 1 to keep 30-series demand high (as if scalpers/bots buying their inventory in seconds isn't enough), but a 3090 not hitting 60 FPS currently is all the more reason for mGPU. As they further patch and optimize this game, I really hope this is on their post-launch project timeline for 2077. DLSS does return some performance, but it isn't exactly Wolfenstein: Youngblood in terms of an equal-or-better solution than native res; it's just the only option currently if you want playable framerates and the RT eye candy.
The Witcher 3 sold 50+ million units not only by being one of the best action-adventure RPGs ever made but by the constant TLC and updates over the years. Let's hope THAT CDPR lives on and we can all toss a coin to a de'V'. Hell, I even bought the fugly but fully playable Switch version (the visuals grow on you after the initial shock, and you realize that despite the low res and aliasing, a lot of the visual features, i.e. fully dynamic trees, weather, etc., are in there) because these maniacs did Switch/PC cross-save!
TL;DR: CDPR has a great track record with post-launch support. Let's let them know mGPU is at the top of our wish list.
This is a thread requesting it; we already have six pages of people saying this sh!t. If we own multiple GPUs, we know the official support status, and also the many custom Inspector .nip profiles for SLI/mGPU. We don't need this "SLI is dead" s#$t every other post. Thanks.
Yep, my thoughts exactly particularly with what you said about the 3090 - $1500+ GPU that can't reliably run this game at 4k / 60fps...utterly pathetic. It's a very clear statement of why mGPU should be pretty darn mandatory on any graphically demanding AAA title. I refuse to play into NVidia's little game where they are trying to sell these video cards for what, just a few years ago, was the price of TWO high end video cards, lol, and we don't even get the performance equivalent. I'll likely be buying my video cards 2nd hand or even 3rd hand moving forward.
Incorrect. SLI/Crossfire is dying; DX12 and Vulkan support multi-GPU via their APIs. This means coders no longer have to rely on support from GPU drivers and can choose to support multi-GPU in software.
For a good example, just look at Ashes of the Singularity: Escalation. DX12 multi-GPU support works wonderfully in that game and actually uses BOTH adapters' VRAM instead of just the primary adapter's. So if you have two 8 GB GPUs installed, you have a total of 16 GB addressable instead of just 8 GB.
SLI/Crossfire = Old way
DX12/Vulkan Multi GPU = New way
This will be the story with all DX12 titles, as mGPU there takes plenty of additional work. Likewise Vulkan (though slightly less work).
And current top GPUs can't even touch the amount of ray tracing this game tries to throw at them, RTX or not. (DLSS and the like are fake resolution.)