By “video card” do you mean an I/O subsystem, like the AJA or the Blackmagic card that will control a VCR?
There are obvious reasons for those — anyone who is ingesting video from tape will need such a card.
But Zvi specifically talks about disabling the driver for his computer’s video card, the one designed to drive his computer monitors. And he’s noticing that a “generic” display driver works better.
Obviously you need a video card to drive your computer monitors. But the more expensive video cards are designed around creating and shading polygons in 3D space. As such, they’re pretty powerful but you don’t usually use this ability for video.
Where these cards do help you get your job done is with effects that take advantage of their power. If you use the simple 3D effect in Premiere (and I use a really old version), it will use your graphics card to render that material. Other 3D effects you can apply to video will also benefit from a high-end graphics card. But normal video playback does not benefit from these cards.
The problem Zvi is having here is with the driver for that adapter. And it is, apparently, so poorly-written that it is actually slowing down performance.
In a case like that, your first move ought to be to go out and get a driver update, if there is one. You should also complain (loudly) to the card manufacturer and/or computer manufacturer if they installed that card in a stock system. Zvi’s description of how he solved his problem is an outstanding way of complaining. A graphics card (or GPU) ought not function better using a “generic” driver than it does with the driver designed for it.
But we also ought to consider what “generic” means.
Does “generic” mean a generic OpenGL graphics adapter? Because if it does, you are still getting a serious benefit from the card’s hardware for any 3D work you are doing: the card itself is helping you even as its driver software is hindering you.
Nvidia uses a proprietary model called CUDA to control its cards, and ATI relies on DirectX (developed by Microsoft). But OpenGL tends to work with just about everyone’s graphics cards, and if “generic” means OpenGL, you’re harnessing technology that lots of Linux propellerheads have created and refined over a number of years.
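One way to find out what “generic” actually means is to ask the driver itself what it reports as the OpenGL renderer. A minimal sketch, assuming a Linux/X11 machine with the standard `glxinfo` utility installed (the function name and the graceful fallback are my own; on other platforms or without `glxinfo` it simply returns None):

```python
import shutil
import subprocess

def opengl_renderer():
    """Return the OpenGL renderer string reported by the active driver,
    or None if glxinfo isn't available (Linux/X11 only)."""
    if shutil.which("glxinfo") is None:
        return None
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    # glxinfo prints one line per GL property; we want the renderer line.
    for line in out.splitlines():
        if line.startswith("OpenGL renderer string:"):
            return line.split(":", 1)[1].strip()
    return None

print(opengl_renderer())
```

If this prints the name of your actual GPU, the hardware is being used through OpenGL; if it reports a software rasterizer, the “generic” driver is doing everything on the CPU.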
I would like Zvi to try a 3D render using the proprietary driver for his GPU and then try the same render using “generic.” If “generic” is faster, I’ll bet it’s using OpenGL calls, and that points to some serious stupidity on the part of the programmers supporting his graphics card.
What if there were no hypothetical questions?