This topic has been locked
PCIe x8 and x16 performance difference
I can't figure out why the RTX 5060 Ti 16 GB is running in x8 mode. What is the difference between x16 and x8 lanes? Does it mean it will affect the performance of the card? My RTX 3060 12 GB runs at x16, but this new GPU runs at x8?
nullable May 22 @ 9:47am 
https://www.techpowerup.com/review/nvidia-geforce-rtx-5060-ti-pci-express-x8-scaling/

Yes, apparently Nvidia decided to go with x8 for its 40 and 50 series midrange cards. However, the 5060 Ti can use PCIe 5.0 x8, which is as fast as the PCIe 4.0 x16 that your 3060 used.

I guess the question is what motherboard you have, and whether it supports PCIe 5.0?

Aside from all that, it's not very often that a midrange card is going to use more bandwidth than x8 provides, regardless of PCIe 4.0 or 5.0. x16 is only better if you actually need it, and most of the time you don't. And usually, by the time hardware outgrows a PCIe spec, we've jumped ahead a few versions and contemporary motherboards have more bandwidth than the GPUs can use.

And you can see from the benchmarks that neither the 40 series nor the 50 series cards are crippled compared to 30 series cards and their x16 interface.
Last edited by nullable; Jun 5 @ 6:36am
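For reference, the back-of-the-envelope bandwidth math behind that claim, as a minimal sketch in Python. The per-lane figures are the usual post-encoding (128b/130b) numbers per direction; real-world throughput is a bit lower due to protocol overhead.

```python
# Approximate usable bandwidth per lane, one direction, in GB/s (after 128b/130b encoding).
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Rough one-direction link bandwidth in GB/s for a given PCIe generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

print(f'4.0 x16: {link_bandwidth("4.0", 16):.1f} GB/s')  # ~31.5 GB/s -- RTX 3060 on a PCIe 4.0 board
print(f'5.0 x8 : {link_bandwidth("5.0", 8):.1f} GB/s')   # ~31.5 GB/s -- RTX 5060 Ti on a PCIe 5.0 board
print(f'4.0 x8 : {link_bandwidth("4.0", 8):.1f} GB/s')   # ~15.8 GB/s -- RTX 5060 Ti in a PCIe 4.0 slot
```

The last case is the one the linked TechPowerUp scaling review is really about: drop an x8 card into an older board and the link halves, which is where the narrower interface can start to show up in benchmarks.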
Originally posted by nullable:
https://www.techpowerup.com/review/nvidia-geforce-rtx-5060-ti-pci-express-x8-scaling/

Yes, apparently Nvidia decided to go with x8 for its 40 and 50 series midrange cards. However, the 5060 Ti can use PCIe 5.0 x8, which is as fast as the PCIe 4.0 x16 that your 3060 used.

I guess the question is what motherboard you have, and whether it supports PCIe 5.0?

Aside from all that, it's not very often that a midrange card is going to use more bandwidth than x8 provides, regardless of PCIe 4.0 or 5.0. x16 is only better if you actually need it, and most of the time you don't. And usually, by the time hardware outgrows a PCIe spec, we've jumped ahead a few versions and contemporary motherboards have more bandwidth than the GPUs can use.

And you can see from the benchmarks that neither the 40 series nor the 50 series cards are crippled compared to 30 series cards and their x16 interface.

I have PCIe 5.0 on an ASRock B650 Steel Legend, so is it all about the power consumption?
nullable May 22 @ 10:09am 
I mean, I didn't say anything about power consumption. Bandwidth != power. The 5060 Ti just doesn't need all sixteen PCIe 5.0 lanes, so it probably saves a dollar to use a smaller interface. There may be additional thoughts about only designing the hardware to use what it needs, as opposed to just slapping the fastest interface on it regardless of need. Doing the latter could be interpreted as sketchy marketing: "It has a PCIe 5.0 x16, so it must need it because it's so powerful." Something like that.

However, the flip side is that users don't like things feeling less than the previous thing. Numbers getting smaller raises concerns. In this case it's not anything to worry about. And the reality is the 5060 Ti has the same bandwidth the 3060 has, and twice the bandwidth the 4060 has, and it doesn't need more than that.
Last edited by nullable; May 22 @ 10:14am
VALORHEART May 22 @ 10:13am 
Originally posted by nullable:
I mean, I didn't say anything about power consumption. Bandwidth != power. The 5060 Ti just doesn't need all sixteen PCIe 5.0 lanes, so it probably saves a dollar to use a smaller interface. There may be additional thoughts about only designing the hardware to use what it needs, as opposed to just slapping the fastest interface on it regardless of need. Doing the latter could be interpreted as sketchy marketing.

However, the flip side is that users don't like things feeling less than the previous thing. Numbers getting smaller raises concerns. In this case it's not anything to worry about.

OK, now I know, thanks. Could I also ask how to use 10-bit color? Mine is on 8-bit and it's not possible to change it from 8 bits in the NVIDIA Control Panel.
nullable May 22 @ 10:19am 
https://www.tomshardware.com/news/what-is-10-bit-color,36912.html

Well, it would be dependent on your monitor and whether it supports HDR features, which would allow for 10-bit color, the way I'm reading it.

You'll have to look up the specs for your monitor, and if HDR10 support isn't prominently listed, it's not something that would be slipped in in secret. And realistically, unless your monitor is fairly high-end and relatively recent, I wouldn't bet money you have that HDR support through dumb luck. But... someone is lucky every day, so check those specs.
VALORHEART May 22 @ 10:25am 
Originally posted by nullable:
https://www.tomshardware.com/news/what-is-10-bit-color,36912.html

Well, it would be dependent on your monitor and whether it supports HDR features, which would allow for 10-bit color, the way I'm reading it.

You'll have to look up the specs for your monitor, and if HDR10 support isn't prominently listed, it's not something that would be slipped in in secret. And realistically, unless your monitor is fairly high-end and relatively recent, I wouldn't bet money you have that HDR support through dumb luck. But... someone is lucky every day, so check those specs.

I already know that it supports up to 10 bit color
PopinFRESH May 22 @ 10:55am 
Originally posted by VALORHEART:
Originally posted by nullable:
https://www.tomshardware.com/news/what-is-10-bit-color,36912.html

Well, it would be dependent on your monitor and whether it supports HDR features, which would allow for 10-bit color, the way I'm reading it.

You'll have to look up the specs for your monitor, and if HDR10 support isn't prominently listed, it's not something that would be slipped in in secret. And realistically, unless your monitor is fairly high-end and relatively recent, I wouldn't bet money you have that HDR support through dumb luck. But... someone is lucky every day, so check those specs.

I already know that it supports up to 10 bit color
What is the monitor model number? You should probably provide the specific hardware you are talking about when asking questions about it, so people can know what the specifications are for the hardware you need help with.

On the PC side, it's usually a good idea to install CPU-Z, run the Validation tool, and then provide the link to the validation, which will have most of the hardware info for your system. For your monitor, the make and model should be enough to look up its specs.
VALORHEART May 22 @ 11:09am 
Originally posted by PopinFRESH:
Originally posted by VALORHEART:

I already know that it supports up to 10 bit color
What is the monitor model number? You should probably provide the specific hardware you are talking about when asking questions about it, so people can know what the specifications are for the hardware you need help with.

On the PC side, it's usually a good idea to install CPU-Z, run the Validation tool, and then provide the link to the validation, which will have most of the hardware info for your system. For your monitor, the make and model should be enough to look up its specs.

Samsung Odyssey G5 LS27AG502PPXEN


validation: https://valid.x86.fr/gqaur0
gwwak May 22 @ 11:29am 
x8 is usually not an issue if your motherboard supports the latest PCIe generation. People tend to run into problems with older boards on 2.0 or 3.0. Though with boards that old, perhaps the CPU will also be a limiting factor.
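If you want to confirm what link your own board actually negotiated, the NVIDIA driver reports it through nvidia-smi. A minimal sketch, assuming nvidia-smi is on your PATH and using the PCIe query fields recent driver releases expose (older drivers may label or omit them):

```python
import subprocess

# Ask the NVIDIA driver for the negotiated PCIe generation and lane width,
# plus the maximums the card supports.
fields = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
# Hypothetical output: "5, 5, 8, 8" -> running gen 5 at x8, which is the card's maximum.
print(result.stdout.strip())
```

Note that GPUs downshift the link at idle to save power, so check the reading under load if the current generation looks lower than you expect.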
VALORHEART May 22 @ 11:53am 
Originally posted by gwwak:
x8 is usually not an issue if your motherboard supports the latest PCIe generation. People tend to run into problems with older boards on 2.0 or 3.0. Though with boards that old, perhaps the CPU will also be a limiting factor.

It gives more FPS than my 3060 12 GB in the games I play at the same settings. Not a staggering amount, but it feels better, and I can see that it loads the textures instantly rather than taking a second like the 3060. The game runs smoothly with lower latency, but I do get frame drops of 20 FPS when I engage in combat; I think that has to do with the drivers rather than the hardware, since the GPU is new.
It's not about power consumption... it's about gen 5 supporting the bandwidth needed at only 8 lanes... this gives you more PCIe lanes for M.2 drives if done right...
I don't see anyone here explaining it, so I think you should know something, OP: a video card running at PCI-Express 5.0 x8 is getting the exact same performance as a previous-generation card that ran at PCI-Express 4.0 x16. There is no video card yet today that can gain even 1% of performance moving from a PCI-Express 4.0 x16 system to a PCI-Express 5.0 x16 system, not even the RTX 5000 series.

As of right now, a new video card using PCIe 5.0 x8 should have zero performance loss in any game.
_I_ May 22 @ 4:54pm 
5.0 x16 will help with the 1% lows, that's about it, since it can get the data it needs a tiny bit sooner

and NVMe 5.0 x2 is way more than enough bandwidth

the newer CPUs have more than enough lanes for everything
Originally posted by _I_:
5.0 x16 will help with the 1% lows, that's about it, since it can get the data it needs a tiny bit sooner
When comparing today's new video cards, no, it will not. There is literally zero difference between PCI-Express 4.0 x16 and 5.0 x8 for today's new video cards (like the RTX 5000 series). This has been demonstrated and proven. The new video cards get literally exactly the same FPS (or within margin of error, +/- 1-3 FPS) on either option.

Originally posted by _I_:
and NVMe 5.0 x2 is way more than enough bandwidth
I think you should be reminded that this thread is discussing VIDEO CARDS, not SSDs.
Last edited by Ontrix_Kitsune; May 22 @ 5:38pm
_I_ May 22 @ 6:28pm 
4.0 x16 = 5.0 x8
5.0 x16 will have slightly better lowest of the lows

Date Posted: May 22 @ 9:29am
Posts: 114