Forum Replies Created
-
Yep, once you go to 32-bit float you have the possibility of values less than 0. If you are crushing blacks then they are dropping below 0.
This isn’t a bug, it is the way float values work. You just need to be aware of it and clamp values at 0 where needed.
Curves and gamma do even weirder stuff sometimes, as they usually pivot around 1 and 0, so you can think you are making everything brighter when you are actually darkening your superbrights (at least this happens in Nuke).
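The clamp-at-0 fix mentioned above can be sketched in a few lines (the pixel values here are made up purely for illustration):

```python
# Hypothetical 32-bit float pixel values after crushing the blacks in a grade.
pixels = [0.8, 0.05, -0.02, -0.15]

# Clamp at 0 so the negatives don't poison later maths
# (e.g. a gamma/power function on a negative base returns NaN or a complex number).
clamped = [max(p, 0.0) for p in pixels]
print(clamped)  # [0.8, 0.05, 0.0, 0.0]
```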
—
-
I haven’t heard of DPX Plus so you might be right. All I know is that Nuke lets you set up a DPX render with an alpha channel, but it then fails at render time.
—
-
I don’t think anyone in our project will be using an Avid but it is good to know that there can be issues.
Nice to know that it’s faster as MXF too.
Thanks!
—
-
Conrad Olson
July 31, 2013 at 11:13 pm in reply to: Should I worry about linear workflow (as a photographer)?
I know this is an old thread but I thought I should reply.
As a visual effects compositor who works with a linear workflow every day in Nuke, I would like to respectfully disagree with Jonathan.
The reason that we work with linear light is because that is exactly how the real world works, and all of the maths that we do to adjust our images is designed to work on linear images. That said, if your software is dealing with all the different colour spaces correctly, it should take care of this for you.
Film however does not work in a linear fashion.
The main reason that we don’t work with linear light all of the time is dynamic range. Most devices cannot record or display the full range of brightness values that our eyes can see, so they apply some kind of curve to the values to remap them into something that visually makes sense.
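For anyone curious what such a curve looks like in practice, here is a sketch of the standard sRGB decode, which maps a display-referred sRGB value (0–1) back to linear light before you do image maths on it:

```python
# Standard sRGB electro-optical transfer function (decode), per IEC 61966-2-1.
def srgb_to_linear(c: float) -> float:
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# sRGB "middle grey" 0.5 is much darker than 0.5 in linear light:
print(round(srgb_to_linear(0.5), 3))  # 0.214
```

This is why averaging, blurring, or blending values straight off a JPEG gives subtly wrong results: the maths assumes linear light but the stored values are not.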
Personally I don’t know whether RAW images from a stills camera are linear or not; the RAW plug-ins would deal with that. JPEGs are almost always nonlinear and are in sRGB.
I don’t know how the colour science works in Photoshop or Lightroom, I let it deal with it all for me, so I can’t answer your original question. But I thought I should chime in.
This PDF gives a decent explanation as to how Nuke works in a linear way. It might be interesting for non Nuke users too:
https://vfxio.com/PDFs/Nuke_Color_Management_Wright.pdf
—
-
Thanks for the advice. I think I will go for the MXF.
You are right about doing the VFX on the higher res files. That was always our plan, but there are a lot of VFX to do so I might have to do some temp ones in AE, just to see how things are working.
—
-
I would guess that they are using an optical flow effect, like Twixtor or ReelSmart Motion Blur, to generate motion vectors for the footage. Then, instead of retiming the footage, they use these vectors to drive other distortion and colour effects.
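The idea of using motion vectors to drive a distortion rather than a retime can be sketched as a per-pixel warp. This toy example uses a made-up 1-D "image" and hand-written vectors just to show the sampling logic:

```python
# Toy sketch: warp samples along precomputed motion vectors.
image = [10, 20, 30, 40, 50]
vectors = [0, 1, 1, 0, -1]  # hypothetical per-pixel motion, in pixels

# Pull each output sample from where its vector says it came from,
# clamping reads to the image bounds at the edges.
warped = [
    image[min(max(i - v, 0), len(image) - 1)]
    for i, v in enumerate(vectors)
]
print(warped)  # [10, 10, 20, 40, 50]
```

A real optical-flow effect does the same thing in 2-D with sub-pixel filtering, but the principle of "sample along the vector" is the same.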
—
-
To get really realistic movement you can always shoot something with a handheld camera, camera track that footage, and then apply the tracking data to the camera in your scene.
—
-
I haven’t used an Avid for a long time, but I know it used to convert anything that you imported into the format that the project was set to. This can be slow, and if you get the settings wrong you have to re-import your media; you can’t just change the interpretation like you can in After Effects.
If the resolution is wrong then Avid will re-scale the media to fit. I’m sure After Effects will do a better job at this so you may as well render the correct format out of AE.
As for having more resolution in a square pixel image, this is true, but if you are working in an anamorphic project then you are never going to deliver those extra pixels. You will have to lose them and convert to non-square pixels at some point. If it doesn’t happen in AE then it is going to happen in the Avid or at the export/master to tape stage. You may as well do it at the point that gives the best result, or makes the rest of the workflow easier.
I know that the way Avid deals with media has changed in the past few years, so I don’t know if this is still the case.
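The "you lose the extra pixels at the squeeze" point above is just arithmetic. A hypothetical example, with made-up numbers (a 2.39:1 square-pixel render squeezed into a raster with a pixel aspect ratio of 2.0):

```python
# Assumed square-pixel render and an assumed anamorphic pixel aspect ratio (PAR).
square_w, square_h = 2048, 858  # hypothetical 2.39:1 square-pixel frame
par = 2.0                        # hypothetical 2x anamorphic squeeze

# The anamorphic raster only stores width / PAR samples across;
# the extra horizontal resolution is discarded at this conversion.
anamorphic_w = round(square_w / par)
print(anamorphic_w, square_h)  # 1024 858
```

Wherever that division happens (AE, Avid, or mastering), the stored width shrinks by the same factor, so it pays to do it where the filtering is best.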
—
-
Conrad Olson
February 1, 2013 at 8:26 pm in reply to: Having trouble with converted GoPro footage from Cineform studio
The ProTune colour correction settings are not baked into the video file; they are applied on the fly when the file is decoded by whichever application is reading it. I’ve never tried to use a ProTune Cineform file in AE, but they work fine in Premiere Pro CS6.
Perhaps you need to open the Cineform file in something else and then re-export it to another codec, like ProRes. This should bake in the colour correction that you applied in Cineform Studio.
—