Thank you for being a valued part of the CNET community. As of December 1, 2020, the forums are in read-only format. In early 2021, CNET Forums will no longer be available. We are grateful for the participation and advice you have provided to one another over the years.

Thanks,

CNET Support

Question

20 PCIe lanes + Multiple GPUs?

Aug 15, 2018 10:49PM PDT

I've read that CPUs have PCIe lanes and that those are used by the graphics card.

1.- So if a processor has 20 PCIe lanes, does that mean I can only run one graphics card at x16 (or two at x8)?

2.- Does that mean that if I want two cards at x16 I need a processor with 40 PCIe lanes?

3.- If that's the case, why does having multiple graphics cards on a 20 PCIe lane processor still make a difference in GPU rendering?

(About the third question, here's a post showcasing it)

https://www.cgdirector.com/best-har...-redshift-vray/

4.- If running multiple graphics cards on a 20 PCIe lane processor doesn't hurt GPU rendering, then when do the extra lanes actually matter? (negatively)

Thank you

Discussion is locked

Clarification Request
Dead link.
Aug 16, 2018 9:33AM PDT

The fact is that PCIe lanes are rarely saturated, which is why PCs don't need as many lanes as you posit. Lanes can be shared. Benchmarking that load may have you searching for tools I've never looked for, since I've rarely run into a possible PCIe bottleneck.
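To put rough numbers on why saturation is rare, here is a back-of-the-envelope sketch (not from the thread) using the commonly cited PCIe 3.0 figure of roughly 985 MB/s of usable bandwidth per lane, which follows from the 8 GT/s signaling rate and 128b/130b encoding:

```python
# Approximate usable PCIe 3.0 bandwidth per lane, in MB/s.
# 8 GT/s with 128b/130b encoding works out to roughly 985 MB/s of payload.
PCIE3_MB_PER_LANE = 985

def link_bandwidth_gbs(lanes, mb_per_lane=PCIE3_MB_PER_LANE):
    """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
    return lanes * mb_per_lane / 1000

print(f"x16: {link_bandwidth_gbs(16):.2f} GB/s")  # ~15.76 GB/s
print(f"x8:  {link_bandwidth_gbs(8):.2f} GB/s")   # ~7.88 GB/s
```

Even the x8 figure is far more than most workloads push over the bus continuously, which is why splitting 16 lanes across two cards rarely shows up in benchmarks.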

More common is that a gamer builds a PC but opts for a single stick of RAM. Most of today's PCs support dual-channel RAM, and running it single-channel does create a bottleneck for both the CPU and the GPU.

Link
Aug 16, 2018 10:16AM PDT
I won't say that.
Aug 16, 2018 12:14PM PDT

There are apps that don't tax the PCIe lanes, such as Bitcoin mining. There are games that really push a lot of data over the PCIe bus, so dual x16 wins in that case, but it's rare, since game companies would get a lot of complaints if they did that. The RAM on the GPU works around this by holding the textures in GPU memory (VRAM).
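To see why holding textures in VRAM makes the link width mostly moot, here is a hypothetical estimate of a one-time texture upload over each link width. The 985 MB/s-per-lane PCIe 3.0 figure is an approximation, and the 4 GB texture set is made up purely for illustration:

```python
PCIE3_MB_PER_LANE = 985  # approx. usable PCIe 3.0 bandwidth per lane, MB/s

def upload_seconds(size_gb, lanes):
    """Seconds to push `size_gb` of data over a PCIe 3.0 link, ignoring protocol overhead."""
    bandwidth_gbs = lanes * PCIE3_MB_PER_LANE / 1000
    return size_gb / bandwidth_gbs

textures_gb = 4  # hypothetical texture set size
print(f"x16: {upload_seconds(textures_gb, 16):.2f} s")
print(f"x8:  {upload_seconds(textures_gb, 8):.2f} s")
```

The x8 upload takes exactly twice as long, but both are fractions of a second on a one-time transfer, so once the data is resident in VRAM the narrower link barely matters.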

Again we are back to benchmarks, testing, and measurement tools. There are boards with a lot more lanes, and you fit the higher-end CPUs with more lanes to eke out the most.

All that aside, there's nothing bad about a single GPU like the usual 1080 Ti with an i7, dual-channel 16GB DDR4, and SSDs.

What are you attempting that needs more?

If it needs more, then it's usually time to set up a "compute farm."

Thanks
Aug 16, 2018 2:10PM PDT

That's some nice info. :)

And the main reason for my question is that I'm looking for parts to build my PC, mostly for 3D rendering, so while I'll begin with a single GPU I would like to add a second GPU in the future to shorten render times.

Then I found out about PCIe lanes, which had me wondering whether I should go for a 40-lane CPU so both my GPUs run in x16 mode, or whether dual x8 would do.

I guess I'll go with dual x8, since I didn't find any significant difference in gaming or 3D rendering.

I've been around the 3D render discussions since
Aug 16, 2018 2:22PM PDT

I think ACAD R12 for DOS did a 3D render back in 1993, so we've come a long way.

Be sure your CAD software taps the GPU(s) for rendering, but I've seen fairly large drawings render in under one second, and that was on lesser machines than the baseline I noted (i7, etc.).

https://www.youtube.com/watch?v=_vs1SqkLrI0 stopped at the 1060 GPU but went to 32GB of main memory and, frankly, spent too much on the RGB lighting.

My monster CAD choice would still be the short-version i7 I noted, but for monster rendering, move to the i9, 32GB RAM, a 1080 Ti, and all SSDs.

Yeah :D
Aug 16, 2018 2:56PM PDT

That's a nice setup! I was actually deciding between the upcoming i9-9900K and the Ryzen 2700X, but since GPU render engines don't benefit much from CPU speed, and on top of that I don't plan to OC anything, I came to the conclusion that Ryzen is the one for me. :D

I'm going for the 2700X, 32GB RAM, a 250 or 500 GB NVMe PCIe SSD, and either a 1080 Ti or one of the upcoming GeForce cards (Gamescom!!!)