Forum Replies Created

Page 12 of 12
  • Sonic 67

    October 28, 2014 at 3:56 am in reply to: Sony Vegas stuttering issue (both preview and rendered)

Without your system specs it's hard to tell… First of all, you are rendering by CPU only; that would be one reason.
    My settings using GPU:
    https://forums.creativecow.net/thread/24/982578#983039

  • Sonic 67

    October 27, 2014 at 11:37 pm in reply to: Best GPUs for Sony Vegas Pro 12.

Newer cards (nVidia or ATI) won’t work well in Vegas.
See what MainConcept lists on their site (well, the DivX site, since DivX bought them):

    CUDA Encoder
NVIDIA graphics card with CUDA support (Professional – Tesla, Quadro 4000-series, FX, CX, NVS, QuadroPlex; Consumer – GeForce 8, 9, 100, 200, 400-series GPUs – with a minimum of 256 MB of local graphics memory, or 512 MB for 1920x1080p encoding). CUDA compute capability is supported only up to 1.3 (excludes certain GeForce 8800 models – GTS, Ultra). Compute capability 1.0 works in general for encoding, but has known issues. Boards with Kepler architecture are not supported.

    OpenCL
    Please note that the MainConcept OpenCL H.264/AVC Encoder only works with ATI/AMD graphics hardware. For NVIDIA GPU based acceleration, click here for the MainConcept CUDA H.264/AVC Encoder.
    Supported ATI Graphics Boards:
AMD Radeon™ HD Graphics
    6900 Series (6970, 6950)*, 6800 Series (6870, 6850)*
ATI Radeon™ HD Graphics
    5900 Series (5970)**
    5800 Series (5870, 5850, 5830)*
    5700 Series (5770, 5750), 5600 Series (5670), 5500 Series (5570)
ATI FirePro™ Graphics
    V8800*, V7800
    * – preferred as tested
    ** – supported in single-GPU mode

As for the age of the MainConcept encoders, look in this folder:
C:\Program Files\Sony\Vegas Pro 13.0\FileIO Plug-Ins\mcmp4plug2 (or the other folders starting with mc) and check the digital signatures.
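If there are many mc* folders to go through, a small script can speed up the survey. A minimal sketch (file timestamps are only a rough proxy for the Authenticode signing date – for the real one, use the file's Properties > Digital Signatures tab as described above; the folder path is an assumption based on a default install):

```python
import os
import datetime

# Sketch: list the MainConcept plug-in DLLs with their file timestamps.
# Timestamps only approximate the signing date; check the Digital
# Signatures tab in Explorer for the authoritative "20 Dec 2010" stamp.
PLUGIN_DIR = r"C:\Program Files\Sony\Vegas Pro 13.0\FileIO Plug-Ins"

def list_mc_plugins(root):
    """Yield (path, modified-date string) for DLLs in folders starting with 'mc'."""
    for folder in sorted(os.listdir(root)):
        subdir = os.path.join(root, folder)
        if not folder.lower().startswith("mc") or not os.path.isdir(subdir):
            continue
        for name in sorted(os.listdir(subdir)):
            if name.lower().endswith(".dll"):
                path = os.path.join(subdir, name)
                mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
                yield path, mtime.strftime("%d %b %Y")

if os.path.isdir(PLUGIN_DIR):
    for path, stamp in list_mc_plugins(PLUGIN_DIR):
        print(stamp, path)
```

The `isdir` guard keeps the script harmless on machines where Vegas is installed elsewhere; just point `PLUGIN_DIR` at your own install path.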

  • Sonic 67

    October 27, 2014 at 12:55 am in reply to: Can’t import h264 MP4 , is there any way to?

    Because of this line:
    “Bit depth: 10 bits”

But since it looks like you are trying to rip/edit a commercial TV series (Steins Gate S01E02 – Paranoia of Time Leaps), I think this thread is against the forum’s TOS.

  • Sonic 67

    October 26, 2014 at 10:38 pm in reply to: Best GPUs for Sony Vegas Pro 12.

MainConcept CUDA encoders (used by Sony Vegas) are optimized for nVidia's older Fermi-generation video chips.
The same goes for ATI (with OpenCL), because MainConcept development stopped in Dec 2010.

Malcom, are you using the right settings?
For the Quadro 4000, “Max Rendering Threads” should be 8 (Fermi has 32 cores per group); the rest should be as in my pictures.
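The arithmetic behind that thread count, for anyone with a different Fermi card, is just the card's CUDA core count divided by the 32 cores per group quoted above (a sketch using that rule of thumb; core counts for other models come from nVidia's published specs):

```python
# Sketch: derive "Max Rendering Threads" as CUDA cores / cores per Fermi group.
FERMI_CORES_PER_GROUP = 32

def max_rendering_threads(cuda_cores):
    """Thread count per the rule of thumb in the post above."""
    return cuda_cores // FERMI_CORES_PER_GROUP

print(max_rendering_threads(256))  # Quadro 4000 (256 CUDA cores) -> 8
```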

  • These are the settings?

  • Sonic 67

    October 25, 2014 at 1:55 am in reply to: Best setting to use for Full HD youtube filmed videos
  • Sonic 67

    October 23, 2014 at 4:50 pm in reply to: Sony Vegas Pro 13 CUDA rendering GTX680

    What options do you have selected below, at the encode stage:

  • Sonic 67

    October 22, 2014 at 8:48 pm in reply to: Graphic Card Benchmark for Sony Vegas Pro 13 or 12 ?

You are right, I meant OpenCL. In my tests I see some minimal GPU utilization if I select OpenCL, as opposed to CPU only, but that might be due to playback of the resulting video, not due to the encoding itself.
The MainConcept encoders page states that OpenCL works only for ATI.

  • Sonic 67

    October 22, 2014 at 5:56 pm in reply to: GPU acceleration Sony Vegas 13

That’s so sad. I still use a Fermi card especially for this reason.
MainConcept’s CUDA and OpenCL encoder files included in the latest update of Sony Vegas 13 are signed “20 Dec 2010”…

  • Sonic 67

    October 22, 2014 at 3:20 pm in reply to: Graphic Card Benchmark for Sony Vegas Pro 13 or 12 ?

    “As a result, the Main Concept CODECs only support GPUs that were around in 2010 and perhaps up to mid-2011.”

I have to add to this that for nVidia, that means Fermi-based cards.
MainConcept CUDA encoding works very well on those cards, but MainConcept OpenCL encoding doesn’t work well on nVidia (several times less GPU utilization).
Also, newer-generation nVidia cards have FP64 floating-point capability crippled by design: 1/24 of FP32 throughput in Kepler and 1/32 in Maxwell.
Fermi had 1/8 for gaming cards, up to 1/2 in the professional line.
I think that makes a difference in the encoding process.
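To put those ratios in perspective, here is a small sketch comparing effective FP64 throughput across generations for a card with the same FP32 peak (the 1000 GFLOPS FP32 figure is an arbitrary placeholder for illustration, not a real spec; the ratios are the ones quoted above):

```python
# Sketch: effective FP64 throughput = FP32 peak * generation's FP64:FP32 ratio.
# The FP32 peak below is a hypothetical placeholder for comparison only.
from fractions import Fraction

FP64_RATIO = {
    "Fermi (gaming)":       Fraction(1, 8),
    "Fermi (professional)": Fraction(1, 2),
    "Kepler (gaming)":      Fraction(1, 24),
    "Maxwell (gaming)":     Fraction(1, 32),
}

FP32_PEAK_GFLOPS = 1000  # hypothetical peak, same for every row

for gen, ratio in FP64_RATIO.items():
    print(f"{gen:22s} {float(FP32_PEAK_GFLOPS * ratio):7.1f} GFLOPS FP64")
```

Even at the same FP32 peak, the Fermi gaming card ends up with several times the double-precision throughput of a Kepler or Maxwell gaming card, which is the gap described above.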

