Creative Communities of the World Forums

The peer-to-peer support community for media production professionals.


  • After Effects Texture Memory doubt

    Posted by Adriano Castaldini on October 21, 2015 at 10:13 pm

    Hi everybody,
    I’m trying to optimize After Effects CC2015 performance on a Yosemite-compatible PC (some might playfully call it a Hackintosh), and I’ve just enabled the CUDA driver for my Titan X 12GB. Preferences > Previews > GPU Information shows:
    Fast Draft: Available
    Texture Memory: 4915,00 MB
    Ray-tracing: GPU
    OpenGL
    Vendor: NVIDIA Corporation
    Device: NVIDIA GeForce GTX TITAN X OpenGL Engine
    Version: 2.1 NVIDIA-10.5.2 346.02.03f01
    Total Memory: 12,00 GB
    Shader Model: –
    CUDA
    Driver Version: 7.5
    Devices: 1 (GeForce GTX TITAN X)
    Current Usable Memory: 9,47 GB (at application launch)
    Maximum Usable Memory: 12,00 GB

    Here are my questions:
    1. What does Fast Draft do?
    2. Is it a good idea to push up the Texture Memory from 4915 MB (the default) to 9216 MB (the maximum value I can set)?

    Thanks a lot for your help

    Michael Szalapski replied 10 years, 6 months ago 3 Members · 3 Replies
  • 3 Replies
  • Walter Soyka

    October 21, 2015 at 10:41 pm

    Ae does not generally render on the GPU. The exceptions are a few third-party effects (like Video Copilot Element 3D, Mettle Freeform, and Red Giant Universe), the native Cartoon effect, and the ray-tracing renderer, which is no longer under active development [link].

    See more on Ae’s use of the GPU here:
    https://helpx.adobe.com/after-effects/using/rendering-opengl.html

    [Adriano Castaldini] “1. What does Fast Draft do?”

    Fast Draft is an OpenGL-accelerated preview mode. It’s pretty useful for layout in comps that use the ray-tracing renderer, because it renders fast. However, its use ends there, because its quality is draft-only. You cannot render Fast Draft.

    [Adriano Castaldini] “2. Is it a good idea to push up the Texture Memory from 4915 MB (the default) to 9216 MB (the maximum value I can set)?”

    Probably not.

    Walter Soyka
    Designer & Mad Scientist at Keen Live [link]
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    @keenlive   |   RenderBreak [blog]   |   Profile [LinkedIn]

  • Adriano Castaldini

    October 21, 2015 at 11:30 pm

    Thanks a lot!

    Just another doubt:

    The difference between enabling CUDA in Ae CC2014 and CC2015 is the location of the raytracer_supported_cards.txt file (where you add the name of your card so that Ae recognizes it): in Ae CC2014 it was stored in the app bundle’s Contents folder, while in CC2015 it’s in the Contents/Resources folder.

    Now I need to find (and update) the cuda_supported_cards.txt file, which in Ae CC2014 was in the Contents folder; in CC2015 I can’t find it anywhere.
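    The mechanics of the whitelist edit described above can be sketched in shell. The bundle path in the comment is an assumption based on the CC2014-style layout mentioned in this thread, not something confirmed here; the snippet operates on a temporary stand-in file, so it is safe to run and modifies nothing until you point it at the real file.

    ```shell
    # Sketch only: appends a card name to a supported-cards list, once.
    # The real file is assumed to live inside the app bundle, e.g.:
    #   .../Adobe After Effects CC 2015.app/Contents/Resources/raytracer_supported_cards.txt
    # We use a temporary stand-in so nothing real is touched.
    cards=$(mktemp)
    printf 'GeForce GTX 680\nGeForce GTX 780\n' > "$cards"   # example existing entries

    card='GeForce GTX TITAN X'
    # Append the card name only if it is not already listed (exact, whole-line match).
    grep -qxF "$card" "$cards" || printf '%s\n' "$card" >> "$cards"

    grep -xF "$card" "$cards"   # prints the entry, confirming it was added
    rm -f "$cards"
    ```

    The `grep -qxF || append` guard makes the edit idempotent, so re-running it never duplicates the entry. (As noted later in the thread, editing these files does not help on this particular card, so treat this purely as an illustration of the mechanism.)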

    Thanks again.

  • Michael Szalapski

    October 22, 2015 at 1:33 pm

    You don’t need to modify the .txt file in recent versions of AE; there is a checkbox that allows you to enable GPU acceleration on “unsupported” cards. The thing is, the OptiX library that the ray-traced renderer relies on cannot work with your card, because NVIDIA changed that library for the newer GPUs. No matter what you do with .txt files or anything else, it simply cannot work.
    The next update to AE [link] will update the OptiX library for the ray-traced renderer so that it works with the new cards, but until then, there is nothing you can do to make it work.

    Again, as Walter said, the ray-traced renderer is a dead end. I would recommend against wasting any time on it. Instead, learn how to use the more powerful new option [link]!

    – The Great Szalam
    (The ‘Great’ stands for ‘Not So Great, in fact, Extremely Humble’)

    No trees were harmed in the creation of this message, but several thousand electrons were mildly inconvenienced.
