Forum Replies Created
-
Sonic 67
March 5, 2015 at 3:13 am in reply to: Nvidia vs AMD GPU’s in a Vegas laptop… what do we know…?
I have a GTX 960 in my PC. Before that I had a Radeon HD7970 GHz Edition (now rebadged as the R9 280X), a GTX480, a Quadro 2000, and a GTS450.
Sadly, only the GTX480, Quadro 2000, and GTS450 were used fully by Vegas.
The GTX960 has integrated engines for hardware-accelerated video decoding (up to 4K) and encoding (again up to 4K).
Sadly Vegas has no clue about this – it is the only GPU from the above that doesn’t get recognized by Vegas.
For example, CyberLink’s PowerDirector does.
OpenCL? Sure, it works. But to me the transitions are only a minimal time gain – my six-core (12-thread) CPU can go through those very quickly anyway.
The only saving grace might be Intel QuickSync, which seems to be supported by Vegas inside the Sony encoder. But my CPU doesn’t have that option (no integrated graphics), so I cannot report whether it works or not.
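For what it’s worth, you can probe outside of Vegas whether those encode engines are visible to software at all. A minimal Python sketch, assuming ffmpeg is installed and on your PATH (ffmpeg is just a convenient probe here, it is not part of Vegas, and the encoder names assume a build recent enough to include them):

import subprocess

# List the encoders this ffmpeg build knows about and look for the
# hardware-backed ones: h264_nvenc (nVidia NVENC), h264_qsv (QuickSync).
out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True,
).stdout

for name in ("h264_nvenc", "h264_qsv"):
    found = name in out
    print(f"{name}: {'listed' if found else 'not in this build'}")

Note that being listed only means the build supports the encoder; a short test encode is the real proof that the driver and card cooperate.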
-
Sonic 67
March 4, 2015 at 11:45 pm in reply to: Nvidia vs AMD GPU’s in a Vegas laptop… what do we know…?
None of the newer video cards is fully supported by Vegas. They are stuck in 2010 with their software and they keep raking in the $$$. The best cards that can fully use Vegas’ capabilities are the Fermi generation (on nVidia) or the HD69xx series (on AMD).
Sure, John will jump in and say “AMD all the way”, but that is just not fully true. It’s just his preference, because at some point Apple decided to go AMD for their mainstream computers (though some have nVidia upgrade options).
In my opinion, stop wasting money on laptops for video editing. Get a decent desktop – it will provide more bang for the buck, it can be upgraded later, and it will be a longer-term investment than a laptop.
-
Sonic 67
February 25, 2015 at 2:57 am in reply to: What deinterlace method for progressive from the beggining to the end.
You got it wrong. The 60i will not be called deinterlaced; it will be interlaced.
A de-flicker filter will apply a softening to the image in order to eliminate flicker.
https://en.wikipedia.org/wiki/Three-two_pull_down
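That article boils down to a simple cadence; here is a rough Python sketch of how four 24p film frames are spread over ten 60i fields (illustration only, field dominance ignored):

# 3:2 pulldown: four film frames (A, B, C, D) become ten interlaced
# fields (= five 60i frames) in a 2-3-2-3 cadence.
frames = ["A", "B", "C", "D"]
cadence = [2, 3, 2, 3]          # fields taken from each film frame

fields = []
for frame, count in zip(frames, cadence):
    fields.extend([frame] * count)

# Pair the fields into 60i frames, two fields each.
interlaced = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print(interlaced)
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]

The two mixed frames (B/C and C/D) are why pulled-down material still needs interlace-aware handling.
-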
They don’t scale so well because of thermal limits – the frequency is reduced when more cores are in use.
https://images.anandtech.com/doci/8730/14C%20Frequency%20Response_575px.png
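Back-of-envelope of what that chart implies for throughput; the clock figures below are invented for illustration, not read off the chart:

# Hypothetical turbo table: sustained clock (GHz) vs. active cores.
turbo_ghz = {1: 3.5, 4: 3.2, 8: 2.9, 14: 2.6}

base = turbo_ghz[1]
for cores, ghz in turbo_ghz.items():
    actual = cores * ghz
    ideal = cores * base
    print(f"{cores:2d} cores: {actual:5.1f} core-GHz "
          f"({actual / ideal:.0%} of perfect scaling)")

So a 14-core part can deliver well under 14x a single core’s speed, even before memory bottlenecks enter the picture.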
-
Sonic 67
February 18, 2015 at 11:29 pm in reply to: upgrade for integrated Intel Graphics 4000 for under $200?
[Randy Brown] “Wow really? I wouldn’t mind using the best… especially for THAT price!”
Really. I had a 7970 GHz Edition in my PC (actually the same card as the R9 280X – ATI just rebranded it).
It is NOT recognized by the encoders that Sony Vegas offers (last updated in Dec 2010), so no hardware-accelerated encoding with those fancy cards. The highest ATI model that is recognized is the one I mentioned in my other reply (the HD6970).
On the nVidia side – any Fermi generation card and nothing newer.
Vegas is behind the curve here – I also acquired CyberLink PowerDirector 13 (for 1/6 of the Vegas price) and guess what? It recognizes all the new video cards (nVidia, ATI, Intel) and can use them to do hardware-accelerated encoding even at 4K resolutions – if the card supports it!
-
Sonic 67
February 18, 2015 at 7:13 pm in reply to: upgrade for integrated Intel Graphics 4000 for under $200?
[John Rofrano] “…replacement. The AMD Radeon R9 280 is a great card for around $200 and the 280x is just a bit over $200.”
Actually they are both overpriced and don’t work fully in Vegas. A Radeon HD6970 is the best card that you can use in Vegas today. On eBay it’s $100-125.
-
1. Complain to Sony that new video cards are not supported in the hardware encoding. Ask for some money back due to false advertising.
2. Sell the x2 video card – Sony doesn’t use dual GPUs anyway.
3. Purchase from eBay an older video card that is supported by MainConcept’s outdated encoder (a quick way to check what card you currently have is sketched after the links below). The list of supported cards can be found here:
ATI
https://www.mainconcept.com/products/sdks/gpu-acceleration-sdk/opencltm-h264avc.html#product_page-5
NVIDIA
https://www.mainconcept.com/products/sdks/gpu-acceleration-sdk/cuda-h264avc.html#product_page-5
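If you want to double-check what card Windows reports before shopping, here is a quick Python sketch; it assumes an nVidia card with the driver’s nvidia-smi tool installed, and the supported set is abbreviated (see the MainConcept links above for the full list):

import subprocess

# A few Fermi-era cards from the CUDA list above (abbreviated).
SUPPORTED = {"GeForce GTX 480", "GeForce GTS 450", "Quadro 2000"}

name = subprocess.run(
    ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
    capture_output=True, text=True,
).stdout.strip()

print(name, "-> supported" if name in SUPPORTED else "-> not on the list")
-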
VBR saves space versus CBR when the material has portions with fast action and many small details alternating with portions of slow movement and less detail. Noise in the image is seen as “detail” and negatively affects the compression.
Now, of course, if you use CBR and VBR with the same rate, the files will average out to the same size.
The idea is that VBR can be set at a lower average rate, since the resulting maximum rate will still be sufficient for scenes with heavy movement or details.
In your case a VBR of 1500-2500 might result in the same video quality as CBR at 3500.
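The space math is simple enough to sanity-check; a Python sketch using the numbers above (the 10-minute clip length is made up for illustration):

def size_mb(kbps, seconds):
    """Approximate size of the video stream alone."""
    return kbps * seconds / 8 / 1000   # kilobits -> megabytes

seconds = 10 * 60   # hypothetical clip length
for label, kbps in [("CBR 3500", 3500), ("VBR avg 2500", 2500),
                    ("VBR avg 1500", 1500)]:
    print(f"{label}: ~{size_mb(kbps, seconds):.0f} MB")

# CBR 3500: ~262 MB, VBR avg 2500: ~188 MB, VBR avg 1500: ~112 MB
-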
Is the original file encoded 2-pass VBR (and therefore the value would be an average bitrate)?
Is the output file 1-pass CBR (and bitrate would be constant, maximum)?
Then it all makes sense…
Encode your new file with VBR too.
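If you ever do this outside of Vegas, a 2-pass VBR encode looks roughly like this; a Python sketch assuming ffmpeg on Windows, with placeholder file names:

import subprocess

# Two-pass VBR: pass 1 analyzes the footage, pass 2 spends the bit
# budget where the motion and detail are. 2000k is the AVERAGE rate.
common = ["ffmpeg", "-y", "-i", "input.mp4",
          "-c:v", "libx264", "-b:v", "2000k"]

# Pass 1: write the stats file only, throw away the video output.
subprocess.run(common + ["-pass", "1", "-an", "-f", "null", "NUL"], check=True)
# Pass 2: the real encode, guided by the pass 1 stats.
subprocess.run(common + ["-pass", "2", "output.mp4"], check=True)
-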
Regardless of what Sony says, you need to apply some common sense to your purchase.
“8 core” is a useless spec on its own because the CPUs are not the same. For example, your 4-core Intel i7-2600K is almost equal to an AMD 8-core CPU:
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-2600K+%40+3.40GHz