Fade based on distance from camera but Z-ONLY?
Hi all,
I have a project in which the camera moves forward through a landscape of cutout 3D layers, and I'm trying to fake depth of field and depth cue/fog effects by putting blurs, tints etc. on each layer, with expressions that make them fade out as the camera gets near. I would have thought this was a pretty common thing to do, and I've found several threads here, but the usual solution, some variation on the expression below, doesn't actually work correctly in my opinion. (This is copied from my comp; I just added sliders for the start and end values):

startFade = thisComp.layer("depth controller").effect("start")("Slider");
endFade = thisComp.layer("depth controller").effect("end")("Slider");
C = thisComp.activeCamera;
d = length(toWorld(anchorPoint),C.toWorld([0,0,0]));
linear(d, endFade, startFade, 0, 100)

This kind of works, and maybe well enough for some projects, but the problem is that (as far as I can tell, not fully understanding the expression's workings) it's measuring the distance from the camera to the layer's anchor point in 3D space. This means that if a layer is off to one side of the camera, the effect fades in more slowly than if it's more in line with the camera's z axis (which I think kind of makes sense for fog, maybe, but not at all for depth of field blur). I need it to instead measure the z distance only, so that a layer's x/y position relative to the camera does not affect it, i.e. like real depth of field, with a focal plane parallel to the camera. Is this possible?
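Edit: for anyone searching later, here's the kind of z-only variant I mean. This is just a sketch, not tested across every rig, and it assumes the active camera supports fromWorld() the same way ordinary layers do (transforming a world-space point into that layer's own coordinate space):

```javascript
// Sketch: transform the layer's anchor point into the camera's
// coordinate space and keep only the z component, i.e. the distance
// along the camera's view axis, ignoring any x/y offset.
startFade = thisComp.layer("depth controller").effect("start")("Slider");
endFade = thisComp.layer("depth controller").effect("end")("Slider");
C = thisComp.activeCamera;
d = C.fromWorld(toWorld(anchorPoint))[2]; // camera-space z only
linear(d, endFade, startFade, 0, 100)
```

The idea is that fromWorld() gives the anchor point's position relative to the camera, so index [2] is the depth measured perpendicular to the camera, a focal plane parallel to it, rather than a sphere of constant radius around it.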
Hope that makes sense, any help much appreciated, thanks!!