Graphics card won’t render 10-bit: workaround
Hi,
Apologies if you all know this already, but I’ve been getting the ‘graphics card can’t render effect at current fps and bit depth’ error with certain effects in FCP (my card is the NVIDIA GeForce 7300 GT) when I’m working in a 1080i50, 10-bit sequence. I started looking into buying a new graphics card, but that turned out to be a bit of a minefield (I even signed the petition asking Apple to support 32-bit drivers for the latest ATI card). I’ve just found a workaround, and since I’ve read of a few users having similar problems, I thought I’d post it in case it works for them too.
If I change the sequence settings to ‘Always Render in RGB’, any effect will render and preview without the error popping up (mind you, it takes a flipping long time!). With ‘Render 10-bit material in high-precision YUV’ or, it would appear, any other option selected, the graphics card and FCP will fail to render some effects.
Hope this helps someone! 🙂
Also, I don’t own an HD deck of any kind (I hire one as and when), so if anyone can confirm that this setting doesn’t produce unwanted side effects on playout (or, for that matter, in any other situation), that’d be loverly, cheers.
Todd.