Yes, apparently Nvidia decided to go with x8 for its 40 and 50 series midrange cards. However, the 5060 Ti can use PCIe 5.0 x8, which is as fast as the PCIe 4.0 x16 your 3060 used.
I guess the question is what motherboard you have: does it support PCIe 5.0?
Aside from all that, it's not very often that a midrange card is going to use more bandwidth than x8 provides, regardless of PCIe 4.0 or 5.0. x16 is only better if you actually need it, and most of the time you don't. And usually, by the time hardware outgrows a PCIe spec, we've jumped ahead a few versions and contemporary motherboards offer more bandwidth than the GPUs can use.
And you can see from the benchmarks that neither the 40 series nor the 50 series cards are crippled compared to 30 series cards and their x16 interface.
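To put rough numbers on the x8 vs x16 point, here's a quick back-of-the-envelope sketch in Python (the only assumptions are the per-lane transfer rates and 128b/130b encoding from the PCIe spec; real-world throughput is slightly lower due to protocol overhead):

```python
# Approximate one-direction PCIe link bandwidth per generation and lane count.
GT_PER_LANE = {"3.0": 8, "4.0": 16, "5.0": 32}   # gigatransfers/s per lane

def link_gbs(gen: str, lanes: int) -> float:
    """Usable bandwidth in GB/s, assuming 128b/130b encoding, 8 bits per byte."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

# Lane widths as marketed for the cards discussed in this thread:
for name, gen, lanes in [("RTX 3060 (PCIe 4.0 x16)", "4.0", 16),
                         ("RTX 4060 (PCIe 4.0 x8)", "4.0", 8),
                         ("RTX 5060 Ti (PCIe 5.0 x8)", "5.0", 8)]:
    print(f"{name}: ~{link_gbs(gen, lanes):.1f} GB/s")
# 4.0 x16 and 5.0 x8 both come out to ~31.5 GB/s; 4.0 x8 is ~15.8 GB/s.
```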
I have PCIe 5.0 on an ASRock B650 Steel Legend, so it's all about the power consumption?
However, the flip side is that users don't like things feeling like less than the previous thing. Numbers getting smaller raises concerns. In this case it's nothing to worry about. The reality is that the 5060 Ti has the same bandwidth access the 3060 has, and twice the bandwidth the 4060 has, and it doesn't need more than that.
OK, now I know, thanks. Could I also ask how to use 10-bit color? Mine is on 8-bit and it's not possible to change it from 8 bits in the Nvidia Control Panel.
Well, the way I'm reading it, it would depend on your monitor and whether it supports HDR features, which would allow for 10-bit color.
You'll have to look up the specs for your monitor; if HDR10 support isn't prominently listed, it's not something that would have been slipped in secretly. Realistically, unless your monitor is fairly high-end and relatively recent, I wouldn't bet money you have that HDR support through dumb luck. But... someone is lucky every day, so check those specs.
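If you'd rather ask the hardware than a spec sheet, here's a rough sketch that reads the EDID Windows caches in the registry and reports the bit depth the panel declares there. Assumptions worth flagging: it's Windows-only and only reads the EDID 1.4 base-block field; plenty of monitors advertise 10-bit via extension blocks (CTA-861/HDMI vendor data) instead, so an "8 bpc" result here doesn't rule 10-bit out.

```python
import winreg

# EDID 1.4, byte 20, bits 6-4: declared bits per primary color (digital inputs).
DEPTHS = {0b001: 6, 0b010: 8, 0b011: 10, 0b100: 12, 0b101: 14, 0b110: 16}

def declared_bit_depth(edid: bytes):
    """Bits per primary color from the base EDID block, or None if not stated."""
    if len(edid) < 21 or edid[18] != 1 or edid[19] < 4:
        return None                      # field is only defined in EDID 1.4
    vid = edid[20]                       # video input definition byte
    if not vid & 0x80:
        return None                      # analog input: no bit-depth field
    return DEPTHS.get((vid >> 4) & 0x07)

def monitor_edids():
    """Yield (model key, EDID bytes) for every monitor Windows has cached."""
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    instance = winreg.EnumKey(mkey, j)
                    try:
                        with winreg.OpenKey(mkey, instance + r"\Device Parameters") as p:
                            edid, _ = winreg.QueryValueEx(p, "EDID")
                    except OSError:
                        continue         # entry without a cached EDID
                    yield model, bytes(edid)

if __name__ == "__main__":
    for model, edid in monitor_edids():
        depth = declared_bit_depth(edid)
        print(f"{model}: {depth or 'not declared'} bpc in base EDID")
```

Note this also lists monitors that were connected in the past, since Windows keeps their EDIDs around.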
I already know that it supports up to 10-bit color.
On the PC side, it's usually a good idea to install CPU-Z and run the Validation tool, then provide the link to the validation, which will have most of the hardware info for your system. For your monitor, the make & model should be enough to look up its specs.
Samsung Odyssey G5 LS27AG502PPXEN
validation: https://valid.x86.fr/gqaur0
It gives more FPS than my 3060 12GB in the games I play on the same settings. Not a staggering amount, but it feels better: I can see that it loads textures instantly rather than taking a second like the 3060 did. The game runs smoothly with lower latency, but I do get frame drops of 20 FPS when I engage in combat while playing. I think that has to do with the drivers rather than the hardware, since the GPU is new.
As of right now, a new video card using PCIe 5.0 @ x8 should have zero performance loss in any game.
And NVMe 5.0 x2 is way more than enough bandwidth.
The newer CPUs have more than enough lanes for everything.
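For what it's worth, the same per-lane arithmetic as the GPU sketch above shows why, assuming the spec transfer rates:

```python
def link_gbs(gt_per_lane: float, lanes: int) -> float:
    # Usable GB/s for a PCIe link, assuming 128b/130b encoding.
    return gt_per_lane * lanes * (128 / 130) / 8

print(f"NVMe PCIe 5.0 x2: ~{link_gbs(32, 2):.1f} GB/s")   # ~7.9 GB/s
print(f"NVMe PCIe 4.0 x4: ~{link_gbs(16, 4):.1f} GB/s")   # same ~7.9 GB/s
```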
I think you should be reminded that this thread is discussing VIDEO CARDS, not SSDs.
5.0 x16 will have slightly better lowest-of-the-lows (minimum frame rates).