Need ideas for speeding up SVP 12 renders on brand new Dell T7500
Posted by Stephen Crye on January 12, 2014 at 11:36 pm
Hi All;
Well, naturally the new-computer smell wore off pretty quickly, and now I’m wondering how to make it faster for Vegas ;-p .
One slam-dunk will be to change out the primary C: drive for a new SSD. I’ve been doing a bit of research on performance and reliability/durability, and am thinking of a 500GB Corsair Neutron GTX. This article and the associated ones are very interesting:
https://techreport.com/review/25681/the-ssd-endurance-experiment-testing-data-retention-at-300tb
Next on the list is a GPU that will do better than my somewhat dated Nvidia Quadro 2000. In this post I put up some screen shots of the GPU utilization:
https://forums.creativecow.net/readpost/24/973030
In browsing this forum and others, the consensus seems to be that the new high-end GTX cards are better than a “bigger” Quadro, and cost less. Sony itself seems to endorse the GTX 570 here:
https://www.sonycreativesoftware.com/vegaspro/gpuacceleration
But, there are posts that show that sometimes the newer GTX cards either don’t improve things or actually are slower!
I can go down to the local Tiger Direct store and buy this or a better Nvidia GTX series card:
https://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4501123
This page shows benchmarks for rendering with Adobe, and has thousands of results – the Quadro does not rank as high as the GTX. https://ppbm5.com/DB-PPBM5-2.php
My question for the group: any experience with the GTX series vs. the Quadro?
I want to stick with Nvidia.
Thanks!
Steve
Win7 Pro X64 on Dell T7500, MultiTB SATA, 8GB RAM, nVidia Quadro 2000, Vegas 12, 11, 10, 9 DVDA 6.0 & 5.2(build 135) Sony HDR-CX550V, Panasonic GH3 with LUMIX G X VARIO 12-35mm / F2.8 ASPH, LUMIX G X VARIO 35-100mm / F2.8
Thayalan Paramasawam replied 12 years, 3 months ago · 4 Members · 25 Replies
25 Replies
Dave Haynie
January 13, 2014 at 7:48 am
The GPU percentage in your graphs looks very low. Do you have OpenCL enabled in the Vegas options page? I typically see about 40-50% utilization of my AMD Radeon HD6970, and I saw up to 80% utilization on an nVidia GTX560… but it ran slower, so I went with the AMD. These were actually with Vegas 11; I have not done recent detailed benchmarks. But GPU still makes things faster, even on my 6-core i7-3930K. You didn’t post your Xeon configuration, but you ought to be in the same ballpark. Unless your machine is so fast Vegas just doesn’t use the GPU… not possible, as far as I know.
But it’s true an older card may not be as useful. As for the pro vs. consumer cards, they always use the same GPU cores. Sometimes the chips are identical; more recently the pro parts have been internally the same but support more RAM. They’re so similar that the type of chip, as seen by software, is set on nVidia boards by a couple of resistors… it’s not baked into the chip, even at the packaging level.
That, of course, determines the driver software that will bind to the card. The pro cards have more stable drivers in general. They do more testing of OpenGL, and you don’t see a new driver release for every new game that comes along. The consumer cards do support OpenGL and do eventually get fixes from the pro-side code base, but it’s been well proven that nVidia, at least, intentionally cripples OpenGL performance in software on consumer drivers.
Does that matter? Some plug-ins do use OpenGL; I don’t know whether you’ll run into the crippled operations or not. They will matter if you’re doing mechanical CAD or some other OpenGL heavy lifting. There are some articles that detail this, on Ars Technica and probably some other geek sites. The crippling is always proven the same way: the author does the same operation using different code (OpenCL, multiple passes of simpler ops) and gets close to, if not exactly, the performance of the pro drivers. But I digress.
I know there’s been some controversy about performance of the GTX6xx versus GTX5xx. No idea if that is still an issue with the latest GTX7xx series, but you’d kind of hope nVidia has improved things since 2009. Obviously Vegas benchmarks would be best, then other video packages. Games and CAD use the GPU differently.
I looked around last summer, when I upgraded my PC, and did not find anything that would boost GPU performance over my HD6970 at any sort of reasonable ROI. Vegas will only use one GPU, so dual cards or dual-processor cards may help on OpenGL things, but not OpenCL, which handles all of the rendering and compositing other than the 3D stuff. And yeah, some plug-ins still use nVidia’s CUDA rather than the OpenCL standard, so if you use such, you definitely want to stick with nVidia.
-Dave
Stephen Crye
January 13, 2014 at 6:15 pm
Hi Dave!
I have to chuckle, because when I posted my question, I thought to myself, “I bet Haynie will be the first one to respond, with a bunch of great details…”
I am at work so this will be brief.
I know little about OpenCL; I must read about that and try it – thanks! This site seems to indicate that it is supported: https://compubench.com/device-environment.jsp?config=18634814 , but the nVidia product page does not mention it: https://www.nvidia.com/object/product-quadro-2000-us.html
The T7500 has a single quad-core Xeon 5500-series; the clock speed is pretty low – I think just over 2 GHz. The data bus is 64-bit, and the chipset is an Intel 5520 (I think), with a PCI-Express 2.0 x16 slot. The RAM is supposed to be RDIMM ECC DDR3 1333MHz (I’m wondering about that because CPUID CPU-Z for some reason can’t see the SPD data on the 4 sticks and doesn’t give details !?!), and the SATA is 2.0 (makes me sad it is not SATA 3.0…)
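As a sanity check on that slot: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b line coding, so a x16 slot tops out around 8 GB/s per direction, far more than any SATA drive or this GPU needs. A quick back-of-the-envelope sketch in Python (the function name is just for illustration):

```python
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> only 80% of raw bits carry data.
def pcie2_bandwidth_gbps(lanes):
    """Usable bandwidth in GB/s, per direction, for a PCIe 2.0 link."""
    raw_gtps = 5.0        # giga-transfers per second, per lane
    efficiency = 8 / 10   # 8b/10b line coding overhead
    bits_per_byte = 8
    return lanes * raw_gtps * efficiency / bits_per_byte

print(pcie2_bandwidth_gbps(16))  # x16 slot -> 8.0 GB/s per direction
print(pcie2_bandwidth_gbps(1))   # single lane -> 0.5 GB/s
```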
I might have a chance to test an nVidia Quadro K4000 in a few weeks. On paper it looks impressive, with ~130 GB/s of memory bandwidth vs. the ~40 GB/s of my Quadro 2000.
What you mentioned about the “pro” vs “consumer” nVidia cards is very interesting. Can you confirm that the Quadro is “Pro” ?
The driver I just installed is 331.x, but the newest one on the Dell site is in the 200s. Probably just Dell being conservative, but I will double-check to make sure that I have the best driver – the nVidia download page has several choices: ODE Graphics Driver, Performance Driver, Partner Certified Driver, AutoCAD Performance Driver, etc. Any hints?
For example, this is the page for the “performance driver” https://www.nvidia.com/download/driverResults.aspx/71899/en-us
vs this one for the “Optimal Drivers for Enterprise ” https://www.nvidia.com/download/driverResults.aspx/71895/en-us
Thanks again, Dave!
Steve
Stephen Crye
January 14, 2014 at 2:15 am
Hi;
OK, home now, did some checking.
First, even though the nVidia driver is certified as OpenCL capable, and GPU-Z reports that fact, I can’t find anywhere in SVP12 to set it to use OpenCL. Here are some screen shots:
What is also puzzling is that SVP12, while rendering Sony AVC, does not seem to be CPU-, GPU-, RAM-, or I/O-bound. (I did not screen-grab the I/O graph, but it is very low.)
Frustrating to have what appears to be unused capacity in the system.
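One way to make “not bound by anything” concrete is to sample the utilization counters during a render and check whether any resource stays pegged. A hypothetical sketch; the 85% threshold and the sample numbers are made up:

```python
def find_bottleneck(samples, threshold=85.0):
    """samples: dict mapping a resource name -> list of utilization percentages
    taken during the render. Returns the resources whose average utilization
    meets the threshold, or an empty list if nothing is saturated."""
    bottlenecks = []
    for name, values in samples.items():
        avg = sum(values) / len(values)
        if avg >= threshold:
            bottlenecks.append(name)
    return bottlenecks

# Roughly the situation described above: nothing is saturated.
print(find_bottleneck({"cpu": [30, 35, 28], "gpu": [12, 15, 10], "io": [5, 4, 6]}))  # -> []
```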

Still using the 331 drivers – might test the most recent 332 later tonight.
Any comments much appreciated!
Steve
John Rofrano
January 14, 2014 at 1:06 pm
[Stephen Crye] “But, there are posts that show that sometimes the newer GTX cards either don’t improve things or actually are slower!”
If I understand the problem correctly, it’s that NVIDIA changed their API for the 6xx/7xx Kepler-series cards from the 5xx Fermi series. If you use the old Fermi API (as Vegas Pro does) on a Kepler card, you will only gain access to 1/4 of what the card is capable of. This is why 6xx & 7xx series cards are actually slower with Vegas than 5xx series cards.
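For anyone unsure which generation a given card is, the usual rule of thumb is the CUDA compute capability major version: 2.x is Fermi, 3.x is Kepler. A tiny helper illustrating the mapping (a sketch, not an exhaustive table):

```python
def nvidia_arch(compute_capability):
    """Map a CUDA compute capability string like '3.0' to an architecture name."""
    major = int(compute_capability.split(".")[0])
    return {1: "Tesla", 2: "Fermi", 3: "Kepler"}.get(major, "unknown")

print(nvidia_arch("2.1"))  # Quadro 2000 / GTX 5xx -> Fermi
print(nvidia_arch("3.0"))  # Quadro K4000 / GTX 6xx-7xx -> Kepler
```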
[Stephen Crye] “What is also puzzling is that SVP12 while rendering Sony AVC does not seem to be CPU, GPU, RAM or I/O bound. (I did not screen grab the IO graph, but it is very low.)”
I am seeing the exact same thing and it’s very disappointing. I cannot explain it. I’m doing a render and my 6 core 12 thread CPU is at 11% and my Quadro 4000 GPU is at 15% and there is practically no I/O and I’m sitting there wondering what on earth is Vegas doing because it’s certainly not making use of all the hardware sitting in front of me. If anyone figures it out I’d love to know why.
~jr
http://www.johnrofrano.com
http://www.vasst.com
Stephen Crye
January 14, 2014 at 6:06 pm
Thanks for jumping in here, John!
Sorry to hear you are in the same painful boat with regard to SVP not using system resources.
You are one of the few people I know who have a Quadro, which raises several questions:
* I’m thinking of getting a Quadro K4000 – thoughts? What flavor of 4000 do you have?
* I assume you are running a newer driver that is OpenCL-capable. Does SVP on your system use CUDA or OpenCL? How do you select it (see my previous posts for screenshots of my confusion)?
* Is there a similar Fermi/Kepler thing happening from Quadro 2000 >>> K4000?
Thanks,
Steve
Dave Haynie
January 15, 2014 at 2:55 pm
[Stephen Crye] “First, even though the nVidia driver is certified as OpenCL capable, and GPU-Z reports that fact, I can’t find anywhere in SVP12 to set it to use OpenCL. Here are some screen shots:”
Oh, sorry… when it says “GPU acceleration of video processing”, that’s OpenCL. Vegas only uses OpenCL, and nVidia basically merged OpenCL into their CUDA drivers. So they say “OpenCL compatible” or something, but don’t say much else about OpenCL.
The old pre-Vegas-11 acceleration, set only in the rendering panels, was CUDA-only.
You are getting 79.8% CPU use, and your GPU doesn’t have much work to do, at least from the look of things. Try rendering that project with/without GPU (the above “GPU acceleration…” setting switched on and off) and see what this does to your CPU use. And which one renders faster.
Here’s the back-story. There’s some overhead for using the GPU. The OpenGL system actually runs a compiler — Vegas sends a higher level description of the problem to your nVidia OpenGL system, and it compiles that down to a program that can run on the hundreds of nVidia processors, which of course are different than the hundreds of processors you find on an AMD card, or even a bit different than those you’ll find on a different generation of nVidia chip.
That compiled version then gets sent to the GPU. This is a “loosely coupled” multiprocessing job — the code is sent, the GPU wakes up, does the work, then has to signal your main CPU that something’s finished, etc. In short, there’s some extra work to do, and some waiting by the CPU that might otherwise turn into useful work by that CPU. You don’t care, as long as the GPU does the job faster — the bottom line for me is rendering time, not how much of each processor’s time is being used. Of course, I’d feel better if both CPU and GPU went to 100%, but that would take a very, very deep pipeline in Vegas… it would have to be processing many things at a time, to give the CPU work to do while waiting for GPU tasks to finish, etc. I expect these things to improve generationally, but 100% of all resources is a hard one to achieve.
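The overhead argument can be put in numbers: offloading only wins when the GPU’s speedup on the kernel outweighs the fixed compile/transfer/sync cost. A toy model, with all timings hypothetical:

```python
def offload_wins(cpu_time, gpu_kernel_time, overhead):
    """True if sending the work to the GPU is faster overall.
    overhead = fixed compile + transfer + synchronization cost (seconds)."""
    return gpu_kernel_time + overhead < cpu_time

# A big effects pass: the GPU wins despite the setup cost.
print(offload_wins(cpu_time=2.0, gpu_kernel_time=0.4, overhead=0.3))    # True
# A tiny operation: the overhead swamps the kernel, so the CPU is faster.
print(offload_wins(cpu_time=0.05, gpu_kernel_time=0.01, overhead=0.3))  # False
```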
The rendering project Sony put out for Vegas 11 (the car-ad thing) is a really good test for GPU acceleration, because it’s able to use the GPU for in-project compositing as well as video rendering (codec-dependent, of course). I still see a significant boost from my HD6970 on these, even with the much faster 6-core i7 these days (when Vegas 11 first came out, I had a 6-core AMD… much slower).
-Dave
Dave Haynie
January 15, 2014 at 3:39 pm
[John Rofrano] “[Stephen Crye] ‘But, there are posts that show that sometimes the newer GTX cards either don’t improve things or actually are slower!’ If I understand the problem correctly, it’s that NVIDIA changed their API for the 6xx/7xx Kepler series cards from the 5xx Fermi series. If you use the old Fermi API (as Vegas Pro does) on a Kepler card you will only gain access to 1/4 of what the card is capable of. This is why 6xx & 7xx series cards are actually slower with Vegas than 5xx series cards.”
That’s kind of a surprise… not so much the API changes, that happens. But OpenCL is supposed to avoid that, being higher level than CUDA (you can also run OpenCL on native x86… or on FPGAs). Sony’s only using OpenCL for the in-program rendering, no more CUDA specifics. Are they claiming that nVidia hasn’t really supported OpenCL using the full Kepler architecture?
So I Googled this… yeah, it’s apparently that, or at least in part… CUDA 4.0 seems to drop OpenCL performance by 50% compared to CUDA 3.x. And some folks claim that OpenCL has not been optimized for Kepler, at least not yet. I found a review on Tom’s Hardware of the GTX660/670 — consumer-market Kepler-based devices — that covers a bunch of OpenCL benchmarks. It’s interesting:
https://www.tomshardware.com/reviews/geforce-gtx-660-geforce-gtx-650-benchmark,3297-19.html
Also, watch your card, because nVidia’s not big on naming changes. The Quadro 4000 is a Fermi-based GPU card, and should see none of the problems that may or may not exist specific to the new architecture (though possibly, if CUDA itself has broken OpenCL performance). The Quadro K4000 is the card with the Kepler-based chip.
If you look at gaming or sometimes OpenGL benchmarks, the nVidia is usually leading the AMD. But on the majority (though not all) of the OpenCL benchmarks, AMD wins… sometimes by crazy numbers (they include the Quadro 5000 and 6000, but not the 4000). One benchmark even has my HD6970 come out on top, even compared to the new AMD 79xx series. Go figure… that was $300 new, and today… cheap. And look at the “image surfacing” benchmark — the only explanation is that nVidia’s OpenCL there is horribly broken; an HD7970 should not run 29x faster than a Quadro 6000 on ANYTHING… the Quadro 6000 is just too expensive to allow that to happen.
With that said, they have some video-oriented OpenCL benchmarks which greatly favor the Kepler chips. That may well be a place Sony needs to do some work. Though OpenCL seems to change more slowly than CUDA: OpenCL 1.1 was released in mid-2010, 1.2 in late 2011, and 2.0 last summer. There’s no indication of the level these benchmarks are using, but given the review date, it was definitely prior to the 2.0 version.
As for your specific observation, John… the only time I’ve seen such a slowdown on my system was when I had a single-threaded plug-in in a critical path. That’s really easy to see on a CPU monitor… only one CPU working. And of course, if it goes away when you switch off GPU rendering, you pretty much prove that it’s a bug in the GPU drivers of some kind.
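That “only one CPU working” symptom is also easy to check programmatically: one core pegged while the rest sit idle. A hedged sketch of the heuristic (the per-core percentages would come from whatever monitor you use, and the thresholds are arbitrary):

```python
def looks_single_threaded(per_core_usage, hot=90.0, idle=20.0):
    """Heuristic: exactly one core near 100% while all the others are idle
    suggests a single-threaded stage (e.g. a plug-in) in the critical path."""
    hot_cores = [u for u in per_core_usage if u >= hot]
    idle_cores = [u for u in per_core_usage if u <= idle]
    return len(hot_cores) == 1 and len(idle_cores) == len(per_core_usage) - 1

print(looks_single_threaded([98, 5, 7, 3, 6, 4]))       # True: classic bottleneck
print(looks_single_threaded([60, 55, 62, 58, 61, 59]))  # False: well parallelized
```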
It sure seems, looking at various benchmarks, that nVidia isn’t giving priority to OpenCL, which may not be a shock given their devotion to CUDA. But worse still, they seem to be behind on OpenCL in the professional drivers. That’s not what I would expect, since there really aren’t many plain old consumer applications using OpenCL. They ought to be giving OpenCL — and CUDA — the same attention they give OpenGL in those drivers, and let that filter down to the consumer cards. That doesn’t seem to be the case here.
-Dave
Stephen Crye
January 15, 2014 at 10:56 pm
Thanks Dave!
I really appreciate the time you and others take to read my posts and reply in such detail.
I hope to do the Vegas rendering-project test tonight; I have the files, I just need to set them up. I will also do some GPU-off tests.
When you wrote: ” The OpenGL system actually runs a compiler — Vegas sends a higher level description of the problem to your nVidia OpenGL system.”, did you mean OpenCL?
Thanks for the assurance that Vegas is using OpenCL even though that is not explicitly stated anywhere.
Stand by for more results, and thanks again for keeping alive a thread that is getting old.
I hope John replies with his thoughts on the Quadro 2000 vs. K4000 question. Although in a few weeks I will have a chance to test one, it would mean ripping out the 2000, replacing it with the K4000, installing drivers, and all that. When I have to return the card I will have to undo all of that, which makes me nervous.
My Corsair Neutron GTX 480GB SSD drive will be here on Friday – hopefully the clone of the existing C: drive (a WD Blue 500GB) will go well. After that I plan to use the WD in an external bay for periodic GHOST disk-to-disk clone backups of the C: drive.
Steve
Stephen Crye
January 16, 2014 at 5:42 am
Well, I completed the tests using the Sony Vegas “Mercedes” benchmark project.
I’m starting a new thread for those results and the problems I discovered. All I will say about that here is I am very disturbed!
John, still hoping you will comment on the Quadro 2000 vs. K4000 question I posted before.
Thanks,
Steve
John Rofrano
January 16, 2014 at 3:19 pm
[Stephen Crye] “* I’m thinking of getting a Quadro K4000 – thoughts? What flavor of 4000 do you have?”
I have a Quadro 4000 Fermi card. It has been very stable. It’s not the fastest for the money, but I haven’t had any of the stability problems, or many of the GPU rendering problems, that others have reported. I did have a few GPU rendering problems where I had to disable GPU, but in the latest Vegas Pro 12.0 build 770 I have it back on again. In other words, I didn’t spend good money on a GeForce GTX 760 only to have to disable it because GPU rendering is messed up. I use Boris Continuum Complete and Boris RED a lot, which use OpenGL, so I’m not as concerned about the OpenCL problems. The Quadro 4000 works well with Boris RED and BCC.
The real question is, “would I buy one again?” I guess I would reluctantly say yes. I say “reluctantly” because IMHO the price is outrageous just for video editing, but this is my business so I get to write it off as a business expense. 😉 It’s not like you immediately say, “I would buy this again in a heartbeat!” I would expect an $800 video card to be stable and fast; the Quadro series is only stable. You want fast, get a GeForce. You shouldn’t have to pay that much just to get stable drivers, and sacrifice performance to boot. Luckily, I’ll never have to make that decision again, because my next computer will be a Mac Pro, and those have AMD cards which, as David pointed out, are blazing fast with OpenCL.
[Stephen Crye] “* I assume you are running a newer driver that is OpenCL capable. Dos SVP on your system use CUDA or OpenCL? How do you select it (see me previous posts for screen shots of my confusion)”
There is no way to select it, so I don’t know what Vegas is using under the covers. I just turn GPU acceleration on and Vegas does its thing.
[Stephen Crye] “* Is there a similar fermi/kepler thing happening from Quadro 2000 >>> 4000?”
Yeah, the K2000 and K4000 are the Kepler versions; the 2000 and 4000 are the older Fermi versions. Kepler didn’t exist when I bought my card. I’m not sure I would buy a Kepler card to use with Vegas if what David said is true, that NVIDIA has poor support for OpenCL and Vegas relies on OpenCL. It seems the AMD cards might be a better choice for Vegas users, although the last time I bought an AMD card, I swore I would never buy another one because the drivers were so poor. I don’t know if they’ve gotten any better. Also, AMD and Sony are business partners, so you can bet Sony is going to have good AMD support in their products.
~jr
http://www.johnrofrano.com
http://www.vasst.com