Creative Communities of the World Forums

The peer to peer support community for media production professionals.


  • Network speed

     Rob Ainscough updated 2 months ago 2 Members · 6 Posts
  • Andy Kiernan

    October 3, 2021 at 1:18 pm

    Another question for the pros…

    Currently, I have 2 x PC’s both with 4 x GPUs inside them, both PCs are connected via 1Gb switch to each other and a storage server…

    Would anyone know, if there might be a better configuration? I’m speaking specifically about speeds when C4D octane uses the 2nd PC as a render node.

    I tend not to use the 2nd node when rendering light scenes as the transfer time (from master to slave) can take longer than the render, which is fine, I understand that bottleneck I guess!

    But I’m wondering, for larger scenes, if there’s anything to be done to help speed the transfer up a bit?

    Unfortunately, there are no free PCi lanes to add 10Gb ethernet. (everyone else in the studio has a Mac and is on a 10Gb network 😢)

    I think I know the answer to this, but maybe someone else has had similar thoughts and actually knows what they’re doing?!

    Thanks

    AK

  • Rob Ainscough

    October 3, 2021 at 6:28 pm

    Depends on how fast the “2nd PC as a render node” can render.

    For reference:

    1 Gbps = 125 MB/s

    10 Gbps = 1250 MB/s

    100 Gbps = 12500 MB/s

    M.2 NVMe (980 Pro local) = 7000 MB/s
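As a rough illustration of what those numbers mean in practice, here is a small sketch that estimates how long a scene payload takes to move over each link. The 4 GB payload size is a hypothetical example, not a figure from this thread:

```python
# Throughputs from the reference table above, in MB/s.
LINKS_MBS = {
    "1 Gbps": 125,
    "10 Gbps": 1250,
    "100 Gbps": 12500,
    "M.2 NVMe (980 Pro)": 7000,
}

def transfer_seconds(payload_mb: float, throughput_mbs: float) -> float:
    """Seconds to move payload_mb megabytes at throughput_mbs MB/s."""
    return payload_mb / throughput_mbs

# Hypothetical 4 GB scene + textures sent from master to render node.
payload = 4096
for name, rate in LINKS_MBS.items():
    print(f"{name:>20}: {transfer_seconds(payload, rate):6.1f} s")
```

At 1 Gbps that hypothetical 4 GB scene takes over half a minute to reach the node, versus a few seconds at 10 Gbps, which is why light scenes can finish rendering locally before the transfer would even complete.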

    Latency will be much higher in any network environment vs. local M.2, but latency is not relevant to the task of rendering output to file.

    The bottleneck will primarily be the CPU cores, hence the dependency on how fast your render node is at the CPU level … in my experience, “render to file” (or stream) on a very fast PC generates about 300 MB/s of I/O load.

    Without knowing much about your “2nd PC”, I would still suggest that 10 Gbps should be sufficient I/O performance.

    You’ll need 10 Gbps LAN cards, a 10 Gbps switch, and Cat 6 or better cable; note that Cat 6 runs top out at roughly 55m (180ft) for 10GBASE-T, while Cat 6a supports the full 100m.

    Cheers, Rob.

  • Andy Kiernan

    October 4, 2021 at 7:45 am

    Thanks, Rob, much of this is what I thought.

    Both PCs have an Intel(R) Core(TM) i9-9900X CPU @ 3.50 GHz.

    Yes, unfortunately I’m unable to put 10 Gbps cards in as there’s no room (too many graphics cards!). A real shame, since we have a 10 gig switch and cabling already installed (for the editors’ Mac systems).

    Thanks for your help mate.

  • Rob Ainscough

    October 5, 2021 at 4:23 am

    Do you have an existing USB-C or Thunderbolt port? If yes, you can connect an external 10Gbps network adapter.

    Cheers, Rob.

  • Andy Kiernan

    October 5, 2021 at 7:31 am

    Hi Rob,

    Sorry, I should have said: no, only one of the units has a USB-C port, and neither has Thunderbolt.

    I think my only hope in this regard is to either swap the graphics cards for 3090s (if they really do exist) and free up some space for a new network card, or maybe get a new case/motherboard.

    Thank you for all your input, I hope one day to get a handle on all this network stuff!

    Cheers

    AK

  • Rob Ainscough

    October 5, 2021 at 5:11 pm

    Yes, getting a 3090 at “retail” price is very difficult; it took me 11 months to get ONE EVGA 3090, and I signed up on their waiting list the day they became available. I refuse to pay scalper prices, and even some of the pre-built system markups.

    I’m currently running one Titan RTX 24GB on an AMD 5950X, and it’s MUCH faster than my backup render PC (7900X with a Titan X (Pascal)).

    If you haven’t already, be sure to get the latest CUDA support from nVidia here: https://developer.nvidia.com/cuda-downloads

    Cheers, Rob.
