Andrew Mckee
Forum Replies Created
-
Marquee actually works quite well too once you get to grips with it. You can import the full-res image and do pretty much any type of animation you would need. The only annoyance is that it assumes the pixel aspect ratio matches the project, so if you're working in non-square pixels you get distortion that you need to compensate for. I quite like the way Pan & Zoom operates, but without rotation it's pretty much useless.
Andy
-
Platform, version, origin of footage?
Andy
-
A few things to consider here.
1) Changing your project to 32bpc (bits per channel) does not mean you are rendering a 32bpc image. Pretty much every way of rendering out of AE gives you an 8bpc file. Whilst processing in 16 and 32 is pretty common now, the file sizes would be so huge that our output is generally 8 or sometimes 10bpc depending on format.
2) The confusing thing here is that you should be rendering a 32bit file, just not a 32bit per channel file. A 32bit file has 8 bits for each of the 4 channels; Red, Green, Blue and Alpha. In AE this comes up as selecting RGB+A in the render dialogue. The 8bpc image you render will still process in 32bpc if that is what you selected for the project and produce the effect you get in that mode. The 4th channel (the Alpha) will tell Avid which parts of the image to make transparent and which to make opaque.
3) Avid expects straight (or unmatted) alpha channels. Many programmes premultiply the RGB channels by the A channel by default, so make sure you set this to straight in AE on render so that Avid will interpret your alpha correctly. Blurred edges and glows in particular will behave strangely otherwise. The other thing to consider is that you will have to invert the alpha channel on import to Avid in order for it to be interpreted correctly. This will most likely be selected by default, but just double-check.
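To make the straight-vs-premultiplied point concrete, here's a minimal sketch (Python, illustration only, not anything AE or Avid actually runs, and the function name is mine): recovering straight RGB from a premultiplied pixel is just dividing each colour channel back out by the alpha.

```python
def unpremultiply(r, g, b, a):
    """Convert a premultiplied pixel back to straight (unmatted) alpha.

    Channel values are floats in 0.0-1.0. In a premultiplied image the
    RGB values have already been scaled by alpha, so dividing by alpha
    recovers the original colour. Fully transparent pixels stay black.
    """
    if a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    return (r / a, g / a, b / a, a)

# A 50%-opaque mid-grey edge pixel, as stored premultiplied:
print(unpremultiply(0.25, 0.25, 0.25, 0.5))  # -> (0.5, 0.5, 0.5, 0.5)
```

You can see why soft edges and glows go wrong if the flag is set incorrectly: at a == 0.5 the stored colour is half as bright as the true colour, so a wrong interpretation darkens (or over-brightens) exactly the semi-transparent fringe.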
Andy
-
Why use ProRes transcodes for an offline edit in Avid? If you use Metafuze to create MXF media for Avid, then your EDL and AAF exports will work like a dream. If you look at avidscreencast.com there are a few videos that talk about this workflow, including the correct procedure for getting your EDL to Avid afterwards. Or you could even just AMA to the r3d files and then transcode what you use before generating your EDL for Color.
Andy
-
Thanks for your comments Janusz. Let me just make sure I've got my head around everything correctly. Please correct me if I say anything wrong. I set my light meter so it knows the sensitivity of my medium and the shutter speed. I then decide on my stop (say f/2.8, for instance). I then use a keylight that gives me a reading of f/2.8 on an incident meter at the point where my subject will be. Now, something that was middle grey would expose bang in the middle of my dynamic range? As I'm a colourist I tend to think in IRE rather than stops, so this would produce an IRE of 50 on a digital image or a scanned version of the film? However, skin is brighter than middle grey and so I'll get more exposure. Something in the 60-70 range depending on skin tone?
The alternative is to use a spot meter which measures the reflection off my subject. So in order to get that 60-70 exposure on the skin I should be expecting readings a couple of stops over my aperture?
Obviously then there is everything else in the image to consider, I’m just using a person as an example.
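For anyone following the stop arithmetic, a quick sketch (Python, illustration only; the function name is mine): f-numbers go up by a factor of the square root of 2 per stop, so the number of full stops between a spot-meter reading and the stop you're shooting at falls out of that square law.

```python
from math import log2

def stops_between(reading_fstop, set_fstop):
    """Full stops between a spot-meter reading and the shooting stop.

    One stop multiplies the f-number by sqrt(2), so the stop difference
    is 2 * log2 of the ratio. A positive result means the surface
    reflects more light than middle grey would at the keylight level.
    """
    return 2 * log2(reading_fstop / set_fstop)

# Shooting at f/2.8, spot meter reads f/5.6 off the subject:
print(stops_between(5.6, 2.8))  # -> 2.0 (two stops over middle grey)
```

(Note the nominal f-number series, f/1.4, f/2, f/2.8, f/4..., is rounded, so real-world answers land near whole stops rather than exactly on them.)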
Andy
-
Nice post Rick. I’m 24 (not far from the age you mentioned) and put all my creative effort into not painting by the numbers. If someone has done it before you, then what is the point? My tactic is to soak up as much information about what is being done now, so that when I do something different I know why.
I had similar experience to you a few nights ago during a power cut. I was using the light from my phone to get a glass of water from the kitchen. Walking back from the kitchen with my phone pressed against the glass, I was fascinated by the patterns it through on the wall. A little experimenting made me realise that the water was not only creating a rippling pattern but also focusing the soft light from my phone like a fresnel lens, creating a much harder source.
Andy
-
Thanks for your comments guys. I really like the look of using a fixed stop for dramatic scenes. As an editor it makes sense to me: when you start cutting into tighter shots you are trying to make your audience more involved in the scene, so the tighter you are, the shallower the DOF. Sometimes I may vary the stop for a two-shot if I need to get both subjects in, but I'm still always trying to light for the stop I want, rather than letting the lighting dictate it.

This brings up another question: if I want to stop down to increase the depth of field but still want the same level of exposure, do I have to relight the scene? Using a DSLR I have tended to just raise the ISO, but I've never really enjoyed the change in grain that results. What do you do with a less flexible medium? Is there any solution other than increasing your lighting while trying to match what you had before? And what if one of your light sources is natural and unchangeable?
Andy
-
I see it as just a gimmick too, but I think whoever does the first HDR feature will get some attention, so long as it fits the genre. I could see a 300-style comic book film working.
Andy
-
Why in the heck would you do it in camera? You're not getting anything out of it. Just tape the LCD screen so that you frame correctly, then crop it in post. There is no way to do it in camera, and those lenses (whilst very nice) will not change the ratio.
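The crop itself is just arithmetic; a minimal sketch (Python, illustration only, assuming square pixels) for working out the cropped height and the matte bars to tape off on the LCD:

```python
def letterbox(width, height, target_ratio):
    """Height to crop to, and the size of each matte bar, for a wider ratio.

    Assumes square pixels; the crop is centred, with equal bars top
    and bottom (integer division may leave the split off by one line).
    """
    crop_height = round(width / target_ratio)
    bar = (height - crop_height) // 2
    return crop_height, bar

# Cropping a 1920x1080 (16:9) frame to 2.39:1 scope:
print(letterbox(1920, 1080, 2.39))  # -> (803, 138)
```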
Andy
-
I thought nesting had changed in MC5 so that you could bring up a list of your effects in the same style as After Effects and reorder them without stepping into nests? Haven't seen it for myself.
Andy