Building a $900-$1000 Intel+Nvidia PC. Should I go with an I5 or I7 for Sony Vegas+Gaming??
Heinrich Himmel replied 11 years, 5 months ago · 7 Members · 15 Replies
-
Sonic 67
November 20, 2014 at 12:01 am
MainConcept does not support newer nVidia cards based on the Kepler (and Maxwell) generations.
See here: https://www.mainconcept.com/products/sdks/gpu-acceleration-sdk/cuda-h264avc.html#product_page-5
“Boards with Kepler architecture are not supported.”
Your GT 720 is based on Kepler. Look on eBay for a Fermi-generation card. I am now using a GTX 480 modded into a Quadro 6000, and I previously used a Quadro 600 and a Quadro 2000 with success. I think anything from a GTS 450 up to a GTX 480 should work too.
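If you want to confirm which generation a card is before buying, here is a minimal sketch using the CUDA runtime API (my assumption: the CUDA toolkit is installed to compile it). Fermi reports compute capability 2.x, Kepler 3.x, Maxwell 5.x:

/* Print each CUDA device's name and compute capability.
 * Compile with: nvcc gpucheck.c -o gpucheck
 * Fermi = 2.x, Kepler = 3.x, Maxwell = 5.x (the GT 720 reports 3.5). */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; i++) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s, compute capability %d.%d\n",
               i, prop.name, prop.major, prop.minor);
    }
    return 0;
}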
-
Dave Haynie
November 20, 2014 at 5:44 am
[John Rofrano] “I have also read that the AMD can’t come close to the Intel Core i7’s for video processing. I would not recommend an AMD processor to anyone doing video work.”
I agree. I came to my current i7-3930K from the AMD 1090T. Sure, that’s an older system than the current AMD “Piledriver” architecture… but not necessarily in a bad way. Video is pretty dependent on floating-point performance: the Phenom II 1090T six-core processor has six floating-point units, versus four on the AMD FX-8350. I’ve heard claims the FX-8350 is a good match for some of the i5s, particularly if you overclock it (not recommended, but it can be done reliably if you spend lots of time and perhaps money on cooling), but probably not for gaming. Last I checked, most games these days were able to keep four cores pretty busy, but probably not eight. Maybe that’s changed recently, but that was largely believed to be the case when I built my son’s gaming PC a few years ago. It’s also the reason the X-Box 360 had three cores (six threads). Of course, with both the X-Box One and PS4 being based on 8-core AMD Jaguar chips, maybe this is shifting.
The other big advantage of the i7-39xx/49xx (LGA2011) is that it’s a much more capable chip with respect to I/O. All those cores can easily thrash the typical dual 64-bit DRAM bus, so this series has four 64-bit DRAM buses. It also has a chip-wide total of 40 PCI Express lanes, versus 16 on the Intel Socket 1150/1155 parts (usually accompanied by an extra 6 or 8 implemented on the I/O hub chip), or AMD, which supports up to 22 or 38 PCI Express lanes, but all implemented in the I/O hub and funneled through the 16 HyperTransport lanes of the AM3/AM3+ socket.
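For a back-of-the-envelope sense of what quad-channel memory actually buys you, here is a minimal sketch (my assumption: DDR3-1600 with a 64-bit bus per channel; adjust for your kit):

/* Rough peak DRAM bandwidth: channels x 8 bytes/transfer x transfers/s.
 * DDR3-1600 does 1600 million transfers/s on a 64-bit (8-byte) bus. */
#include <stdio.h>

int main(void)
{
    double per_channel = 1600e6 * 8.0 / 1e9;  /* 12.8 GB/s per channel */
    printf("Dual channel (Socket 1150/1155): %.1f GB/s\n", 2 * per_channel);
    printf("Quad channel (LGA2011):          %.1f GB/s\n", 4 * per_channel);
    return 0;
}

That 51.2 GB/s versus 25.6 GB/s is a peak figure, not sustained throughput, but it shows why six busy cores starve more slowly on LGA2011.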
I will say that for video acceleration in Vegas, I have gotten better bang for the buck with AMD than nVidia. I believe that’s in large part because AMD is dedicated to OpenCL support, whereas nVidia seems to treat it more as a hobby or necessary evil. Also, it’s pretty clear that nVidia artificially slows select OpenGL and OpenCL operations in order to differentiate the consumer, professional, and computing versions of their GPU boards, all of which use essentially the same chip cores. It wouldn’t be a shock to find AMD doing this too, but so far I haven’t seen a documented case of it, whereas it’s been pretty well proven for nVidia. I don’t know if this is specifically an issue in everyday video rendering (aside from the MainConcept AVC issues, which are NOT, I repeat, NOT, the fault of either GPU vendor, or directly the fault of Sony, though Sony’s certainly to blame for not offering a viable alternative), but it’s very real.
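If you want to see what OpenCL support your card and drivers actually expose, here is a minimal sketch in C (my assumption: an OpenCL SDK from your GPU vendor is installed; error handling trimmed for brevity):

/* List OpenCL platforms, the OpenCL version each reports, and their GPUs.
 * Compile with: gcc list_cl.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);

    for (cl_uint p = 0; p < nplat; p++) {
        char pname[256], pver[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof pname, pname, NULL);
        clGetPlatformInfo(platforms[p], CL_PLATFORM_VERSION, sizeof pver, pver, NULL);
        printf("Platform: %s (%s)\n", pname, pver);

        cl_device_id devs[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev);
        for (cl_uint d = 0; d < ndev; d++) {
            char dname[256];
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof dname, dname, NULL);
            printf("  GPU: %s\n", dname);
        }
    }
    return 0;
}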
-Dave
-
Heinrich Himmel
November 20, 2014 at 11:46 am
Roger, that GT 720 is not meant for heavy GPU tasks; it is a basic GPU. The Fermi cards are great, but at this point they are considered antiques and will carry a higher price (because they are nearly impossible to find new).
For overall performance, I would recommend anything in the AMD R9 series, the reason being that the extra vRAM is sometimes helpful. Render-wise, I believe any AMD R7 will pretty much max out render performance. Look out for deals: last week the Vapor-X R9 290X was only $229! That is the best version of the 290X you can find. For AMD cards, get Sapphire or Gigabyte; those are probably the best (although there are others that are good as well), and stay away from reference-design coolers.
-
Sonic 67
November 20, 2014 at 7:13 pm
Fermi cards are plentiful on eBay, and cheap. There is nothing wrong with a used Fermi card. I think the newer ATI/AMD cards don’t work with MainConcept either…
https://www.mainconcept.com/products/sdks/gpu-acceleration-sdk/opencltm-h264avc.html#product_page-5
AMD Radeon™ HD Graphics
6900 Series (6970, 6950)*, 6800 Series (6870, 6850)*
ATI Radeon™ HD Graphics
5900 Series (5970)**
5800 Series (5870, 5850, 5830)*
5700 Series (5770, 5750), 5600 Series (5670), 5500 Series (5570)
ATI FirePro™ Graphics
V8800*, V7800
-
Heinrich Himmel
November 21, 2014 at 2:51 pm
You are correct that they are not supported, but something is happening. My R9 280x averages 25% utilization during renders, with peaks of 48%, using MainConcept Blu-ray (note that I typically use Sony AVC for Blu-ray). However, CPU usage is 100%! I think renders will speed up, although not to the extent they would with the HD 6xxx series.
For comparison, using Sony AVC, renders take 60% of the time MainConcept needs, and GPU utilization averages 35% with peaks of 42%. CPU utilization averages 30% with peaks of 35%. I have always used Sony AVC because it is faster than MainConcept, even with my Fermi GPU.
Recently, Newegg was selling the HD 6970 new on eBay for $130. That is a good deal, and it is the fastest MainConcept-approved GPU, but timeline acceleration works better on the newer AMD cards (not a big deal unless you pile on the effects).
As for Fermi cards, buying used is pretty much the only way to get one at a reasonable price.