Creative Communities of the World Forums


  • Encoding Time – HDV vs AVCHD

    Posted by Steve Hardeman on September 28, 2010 at 12:46 am

    I’ve been recording to tape with a Canon HV30, transferring to my HDD, editing, then encoding. The HV30 obviously records in HDV (.m2t format). I have been able to encode a 30-minute video in about 35 minutes at 720 x 480 x 1500 kbps WMV, and encode a 320 x 240 x 500 kbps WMV version in 10 minutes.

    I’ll spare you my computer specs, as I don’t think they’re relevant to this question just yet.

    I recently bought two Sony camcorders that record in AVCHD. Rather than transferring the content over FireWire, the convenience of just copying and pasting the files onto my HDD was nice. And it appears the file size is about half that of .m2t, which again is nice.

    Anyway, I edited my video and everything went well. The files have an .m2ts extension. Again, it’s a 30-minute video, but encoding the 720 x 480 x 1500 kbps WMV took 3.5 hours. I chalked that up to just the way it’s going to be with the new file type. But then I encoded the 320 x 240 x 500 kbps WMV version and it also took 3.5 hours.

    Is this just the way it’s going to be or am I doing something glaringly wrong?

    To add insult to injury, when I take the larger WMV file and, using third-party software, try to convert it to a 400 kbps MP4 file, the final file actually comes out about 20% LARGER than the 1500 kbps WMV. This is more of a side question, if you happen to know the answer. 🙂

    You can probably tell I’m fairly ignorant about this stuff so I apologize in advance if the solution is obvious and I just didn’t see it or if I’m using the wrong terminology. But any help you can offer would be appreciated.

    Thanks.

    Steve

  • 2 Replies
  • John Rofrano

    September 28, 2010 at 1:24 pm

    It sounds like the majority of the time might be spent decoding the AVCHD (which would be the same on both renders) rather than encoding the WMV. Try using the Sony AVC codec and see if the times are different.
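
    If you want a rough way to check that outside of Vegas, here’s a sketch of the idea (it assumes you have ffmpeg installed, and “clip.m2ts” is just a placeholder for one of your AVCHD files). The first pass only decodes the AVCHD and throws the frames away; the second decodes and also encodes. If the decode-only pass already accounts for most of the total time, the AVC decode is the bottleneck, not the WMV encode:

```python
# Rough timing sketch (assumes ffmpeg is on the PATH; "clip.m2ts" is a placeholder).
import subprocess
import time

SRC = "clip.m2ts"  # one of the AVCHD files from the camcorder

def timed(cmd):
    """Run a command and return how long it took, in seconds."""
    start = time.time()
    subprocess.run(cmd, check=True)
    return time.time() - start

# Pass 1: decode only, discard the frames.
decode_only = timed(["ffmpeg", "-y", "-i", SRC, "-f", "null", "-"])

# Pass 2: decode and re-encode to a WMV-style output at 1500 kbps.
decode_encode = timed(["ffmpeg", "-y", "-i", SRC,
                       "-c:v", "wmv2", "-b:v", "1500k",
                       "-c:a", "wmav2", "out.wmv"])

print(f"decode only:     {decode_only:.1f} s")
print(f"decode + encode: {decode_encode:.1f} s")
```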

    ~jr

    http://www.johnrofrano.com
    http://www.vasst.com

  • Dave Haynie

    September 29, 2010 at 4:35 am

    Here’s the thing… your computer specs do matter. Any time there’s a question of rendering times, it’s good to know what you’re using to render. I’ve used Vegas since version 1 (i.e., before video was even supported), on many different PCs. I’ve been making HD videos for about eight years, and Blu-ray discs for 2.5 years. If I know your hardware, I’ll know immediately whether you’re experiencing “expected behavior” or a “weird-ass mystery”.

    We normally don’t think of decoding time as relevant, but it sure sounds like decoding time is an issue here, unless you have your .m2ts files on some very slow media (e.g., not copied to your SATA or perhaps PATA hard disk drive).

    We tend to think of video as running in realtime or better. If I can play a video from the camcorder at full 1080/60i (or whatever) and use 8% of my CPU, it stands to reason that I could play it 10x faster and still have 20% of CPU time left over for other fun.

    But AVC in HD is a pretty mean CODEC. Depending on the decoder, I have seen multithreaded software decoders take up to 50% of the CPU on my desktop, an Intel Core 2 Quad at 2.83 GHz… that’s 50% of all four cores. Playback is dramatically lighter with GPU acceleration… down to under 8% CPU.

    So I don’t really know how efficient Vegas is at decode, but it’s not using GPU acceleration. If you guesstimate it’s taking 50% of the CPU to decode, that means you can’t possibly encode a 1/2 hour video in less than 15 minutes; you’d need that time for the decode alone, plus the encoder time. Now, again, using that number, if Vegas isn’t pipelining the decode across all four CPUs, it would actually take a whole hour of decode time for that 1/2 hour video on just one CPU. IF that’s the case (not saying it is), you might also starve the other cores, even when rendering to a parallelized CODEC like MPEG-2 or AVC.
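
    Written out as a quick back-of-the-envelope calculation (the 50% figure is only a guess, as above), the numbers look like this:

```python
# Rough render-time budget; every number here is a guess, not a measurement.
video_minutes = 30
cores = 4
decode_load = 0.50  # guess: AVC decode eats 50% of the whole quad-core at 1x playback

# Total decode work in "core-minutes": 50% of 4 cores, for 30 minutes of video.
decode_core_minutes = decode_load * cores * video_minutes   # 60 core-minutes

best_case  = decode_core_minutes / cores  # decode spread over all 4 cores
worst_case = decode_core_minutes / 1      # decode pinned to a single core

print(f"decode alone, spread across {cores} cores: {best_case:.0f} min minimum")
print(f"decode alone, stuck on one core: {worst_case:.0f} min")
# Either way, that decode time comes off the top before any WMV encoding gets done.
```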

    Now imagine going to an older single-core machine… you might well spend hours on the decode and encode. Naturally, any CPU time spent decoding the input video is time not spent encoding the output video. MPEG-2 is a much simpler CODEC… standard-def MPEG-2 decoded just dandy even on 150-200 MHz machines. HDV takes about 4x longer to decode, but it’s pretty clear that HDV decode time is not a significant part of rendering time. And of course, any standard-def stuff you’re rendering is very easy to decode, particularly DV.
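
    For what it’s worth, that “about 4x” falls out of simple pixel counting (a big simplification, since it ignores the differences between the CODECs themselves, but it gives the right order of magnitude):

```python
# Decode work scales very roughly with pixels per frame.
sd_pixels  = 720 * 480     # standard-def MPEG-2 / DV frame
hdv_pixels = 1440 * 1080   # HDV frame
print(f"HDV has {hdv_pixels / sd_pixels:.1f}x the pixels of standard-def")  # ~4.5x
```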

    Another thing is the WMV rendering… everyone uses Microsoft’s built-in WMV pipeline, which I suspect is not as easily managed as the CODECs built into Vegas, like AVC. I wonder, for example, if there’s any multithreaded rendering going on there. If you have an SMP machine, it’s easy to tell: just fire up the system monitor and look at your CPU use. An efficient render will have all N cores lit up at 100%. If you see any less, there’s something wrong with your setup: single-threaded rendering, HDD thrashing, low memory, etc.
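
    If you’d rather log it than watch the system monitor, here’s a minimal sketch that samples per-core CPU use while a render is running (it assumes the third-party psutil package and isn’t anything Vegas-specific):

```python
# Sample per-core CPU use once a second while a render is running.
# Requires the third-party "psutil" package (pip install psutil).
import psutil

SAMPLES = 60  # about a minute of monitoring

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}%" for p in per_core))
    # All cores near 100% = an efficient render.
    # One core pegged while the rest sit idle = a single-threaded stage
    # (or HDD thrashing / low memory keeping the CPUs starved).
```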

    -Dave
