Understanding Moire in FCPX?
Robert Olding
August 29, 2017 at 5:18 pm
The problem isn’t moire, it’s compression. The cameras you mentioned record video using Sony’s AVCHD codec on the a6000 and Sony’s XAVC S codec on the A7S II. Both codecs compress your captured footage to give you the smallest possible file that still looks great during playback. But once you start to grade the footage, it begins to fall apart and you’ll see artifacts such as those in the edited footage you’ve shown us.
In the example of the output settings you’ve shown us, you’ve chosen the H.264 codec, which compresses the footage even further. What you have are already highly compressed XAVC S files being compressed again on output. This combination will make the final edited movie look terrible.
Converting the original captured footage from either of these cameras to ProRes or some other codec before editing won’t solve the issue. Check to make sure you’re using the latest version of Final Cut Pro X and see this Apple Support document about importing files encoded with Sony codecs into Final Cut Pro X.
I would recommend not converting the footage to ProRes before you edit. I would also recommend that you edit at the original resolution: if the footage was shot in 4K, edit in 4K. The idea is to keep as much of the original data as possible before you output. Once you’re ready to output, I would recommend that you output a master file at the original resolution and select the ProRes 422 HQ codec.
Run that master file through Apple’s Compressor to knock it down to the size you’d like to actually deliver to your client but leave the codec at ProRes 422 HQ. If you want to show the movie on YouTube, use the “Publish to YouTube” preset that is available in Apple’s Compressor.
To alleviate the issues associated with highly compressed capture files, it would be best to capture your footage in a codec that is more forgiving. Atomos and other companies make external recorders that you can attach to the A7S II via the HDMI output on the camera. This will allow you to capture the footage in various ProRes or Avid DNxHD codecs allowing for a much higher quality of captured footage that will hold up during editing and grading.
Just an FYI on moire … the quality of the in-camera processing during capture has gotten so good that moire isn’t seen that often anymore. Once in a while, I’ll see it on clothing with a printed pattern of vertical or horizontal stripes, such as this example.
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Loren Risker
August 29, 2017 at 10:23 pm
Two questions:
Does your project have the same frame rate and resolution as your source footage?
Are you viewing your footage at a viewer size of 100%, both when you look at your good-looking raw footage and at your export? Scaling will add moire and other artifacts during playback only.
————-
OutOfFocus.TV – Music Videos 24/7
Kasey Gay
August 30, 2017 at 1:30 am
Hey guys,
Thanks so much for the in-depth feedback and responses. I feel like you guys stand to save me months of trial and error.
Robert, not to steal any more of your time, but I just want to make sure I’m understanding everything.
1. My knowledge up until this point said that transcoding to ProRes was actually BETTER for color grading. I take it the people giving me this advice weren’t accounting for the fact that I was shooting on Sony.
2. If buying an external recorder isn’t in my price range yet, would I possibly get better results by avoiding shooting S-Log and flat profiles, and instead sticking with the *in camera* colors?
3A. Loren, both you and Robert have encouraged me to make sure I’m editing in the same resolution and frame rate as I shot in.
The Sony a6000 only shoots 1080, but the Phantom shoots 4K, and I’m using them both. Is there a best practice to follow in this situation? When using a slider, I like to shoot 60fps on the Sony, but I can’t get that out of the drone. Is there any way to reconcile the two, or am I asking for trouble if I don’t get both cameras running identical settings?
3B. When we say “edit in the same frame rate/resolution,” we mean only the timeline, right? For example, if I shot in 60fps with the intention of slowing down playback, I’ve operated under the impression that I could have my timeline be 24fps and then drop that 60fps clip in and have it slowed down. (Not arguing. Begging to be corrected.)
Thanks so much for your time guys. Your advice is invaluable.
-Kasey
Loren Risker
August 30, 2017 at 6:04 am
[Kasey Gay] “3A. Loren, both you and Robert have encouraged me to make sure I’m editing in the same resolution and frame rate as I shot in.”
It’s not a hard rule to match the resolution and the frame rate.
But here are a couple things to look out for.
Scaling can add artifacts such as moire and softening.
Say you have some 4K footage, and the rest of your footage is 1080p. If you put it in a 1080 project, the 4K will need to be cropped or scaled down. If you have a 4K project, all of the 1080 footage will need to be scaled up.
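A minimal sketch of that trade-off (plain Python, nothing FCPX exposes): dropping 4K into a 1080 project scales down, while 1080 in a 4K project must be scaled up, inventing pixels that were never captured.

```python
# Illustrative sketch: the scale factor a clip needs to fill a project
# frame. A factor above 1.0 is an upscale, which softens the image.

def fit_scale(src_w, src_h, proj_w, proj_h):
    """Uniform scale factor to fit a clip into a project frame."""
    return min(proj_w / src_w, proj_h / src_h)

for src in [(3840, 2160), (1920, 1080)]:
    for proj in [(3840, 2160), (1920, 1080)]:
        s = fit_scale(*src, *proj)
        kind = "upscale (softening)" if s > 1 else "downscale or 1:1"
        print(src, "->", proj, f"scale {s:g}:", kind)
```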
As for frame rate, you can make it anything you want and mix frame rates. However, if you play the frames in such a way that the computer has to create or drop frames, it will add some visual artifacting.
For instance, if you have a 30fps project and a 60fps clip, playing the 60 frames at your project rate gives you slow motion (50% speed). If you chose something slower, like 15% speed, the computer would have to make up or repeat frames, and you’d see a strong drop in quality.
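The 50% vs. 15% example can be sketched in a few lines of Python (a hypothetical helper, not an FCPX API): when the number of source frames consumed per project frame works out to a whole number, every output frame is a real captured frame; otherwise the NLE has to fabricate or repeat frames.

```python
# Sketch of the mixed-frame-rate math: does a retime force the NLE to
# invent or repeat frames, or can it map every output frame to a real
# captured frame?

def needs_fabricated_frames(shot_fps, project_fps, speed):
    """True if this retime forces the NLE to invent/repeat frames.

    speed is the playback speed as a fraction (0.5 == 50%).
    """
    # Source frames consumed per project frame at this speed.
    step = shot_fps * speed / project_fps
    # A whole-number step means each output frame is a real source frame.
    return not float(step).is_integer()

# 60fps clip in a 30fps project at 50% speed: step = 1, all real frames.
print(needs_fabricated_frames(60, 30, 0.50))  # False
# Same clip at 15% speed: step = 0.3, frames must be made up/repeated.
print(needs_fabricated_frames(60, 30, 0.15))  # True
```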
————-
OutOfFocus.TV – Music Videos 24/7
Robert Olding
August 30, 2017 at 3:33 pm
[Kasey Gay] “1. My knowledge up until this point said that transcoding to ProRes was actually BETTER for color grading. I take it the people giving me this advice weren’t accounting for the fact that I was shooting on Sony.”
Hi Kasey … ProRes IS better for EDITING in Final Cut Pro X. Video files are typically very large, and the ProRes codec was designed to work exceptionally well in Final Cut Pro X and other NLEs such as DaVinci Resolve and Adobe Premiere. When editing, you’re not changing the quality of the footage in any meaningful way; you’re just arranging the footage to tell a story.
When you color grade, you are actually changing the footage from one thing into another. This is a destructive process. It may appear that you’re adding data (a constructive process), but in reality you’re taking data away. Since your original footage is already compromised by heavy compression, you’re actually making it worse by modifying the color. The higher the quality of the original footage (meaning the less compressed it is), the better it will hold up while you actually destroy it.
The high compression that exists in your original footage won’t be made better by converting it to a higher quality codec such as ProRes 422. It will just be easier to edit. Any footage you convert will include the limitations of the original and possibly introduce new artifacts because the software has to interpolate the original into something new.
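To make the “destructive” point concrete, here’s a small illustrative Python sketch (not FCPX code, and a deliberate simplification of what compression does): squeeze a smooth gradient into a flat, low-contrast 8-bit capture, then stretch it back out in the “grade.” Roughly half the distinct tonal levels survive, which is what shows up on screen as banding and posterization.

```python
# Why heavily compressed / low-range 8-bit footage falls apart under
# grading: simulate a "flat" capture, then stretch it back to full
# contrast and count how many distinct tonal levels remain.

def quantize(v):
    """Round to the nearest 8-bit code value (0-255)."""
    return max(0, min(255, round(v)))

# A smooth 0-255 gradient, as the scene really is.
scene = list(range(256))

# Flat capture: squeeze the scene into codes 64-192 (half the range).
captured = [quantize(64 + v * 0.5) for v in scene]

# Grade: stretch it back to full range.
graded = [quantize((c - 64) * 2.0) for c in captured]

print(len(set(scene)))   # 256 distinct levels in the original scene
print(len(set(graded)))  # roughly half of them after capture + grade
```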
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Robert Olding
August 30, 2017 at 3:49 pm
[Kasey Gay] “2. If buying an external recorder isn’t in my price range yet, would I possibly get better results by avoiding shooting S-Log and flat profiles, and instead sticking with the *in camera* colors?”
Yes, you would. The only reason you’d want to shoot with any of the “flat” profiles is if the scene has more contrast than you can control by any other means.
Once you’re able to capture footage with a higher quality codec, color grading in post will be more forgiving. This will allow you to use a “flat” profile even in scenes that may not require it.
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Robert Olding
August 30, 2017 at 4:26 pm
[Kasey Gay] “3A. Loren, both you and Robert have encouraged me to make sure I’m editing in the same resolution and frame rate as I shot in.
The Sony a6000 only shoots 1080, but the Phantom shoots 4K, and I’m using them both. Is there a best practice to follow in this situation? When using a slider, I like to shoot 60fps on the Sony, but I can’t get that out of the drone. Is there any way to reconcile the two, or am I asking for trouble if I don’t get both cameras running identical settings?”
As Loren stated, you’re either going to have to scale the 1080 footage up to 4K or scale the 4K footage down to 1080. As a rule of thumb, footage that has been scaled down will look better than footage that has been scaled up.
The FPS matters much less, if at all. If your project timeline is at 24 FPS and you add footage to the timeline that was shot at 24 FPS, it will appear to play at normal speed. Any footage shot at a higher FPS than 24 will appear to play in slow motion. Any footage shot at a lower FPS than 24 will appear to play in fast motion.
Footage will be interpolated by the NLE when you ask it to slow the footage below the rate it was shot at. So, if your project timeline is at 24 FPS and you add footage that was shot at 24 FPS, then ask the NLE to slow the motion, the NLE will have to interpolate the original footage and add frames that didn’t exist before. This could introduce artifacts.
The reverse works the other way. If your project timeline is at 24 FPS and you add footage that was shot at 24 FPS, then ask the NLE to speed up the motion, the NLE will delete frames from the original footage. This will NOT introduce artifacts, since frames are only being removed and nothing new is being created.
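The speed-up case can be sketched in a few lines of Python (an illustration, not how any particular NLE is implemented): a constant retime faster than 100% can be served entirely by discarding source frames, so nothing has to be invented.

```python
# Sketch of a constant speed-up: which captured frames survive.
# At 200% speed the NLE can simply keep every second source frame.

def frames_kept(total_frames, speed):
    """Indices of source frames that survive a constant retime.

    speed > 1 drops frames; no new frames are created.
    """
    kept = []
    pos = 0.0
    while round(pos) < total_frames:
        kept.append(round(pos))
        pos += speed
    return kept

print(frames_kept(10, 2.0))  # [0, 2, 4, 6, 8] -- every other frame
```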
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Robert Olding
August 30, 2017 at 4:34 pm
Kasey … consider purchasing Neat Video. It’s a miracle worker on footage that doesn’t look so great, whether from capture or from the destruction caused in post.
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Robert Olding
August 31, 2017 at 8:00 pm
[Kasey Gay] “1. My knowledge up until this point said that transcoding to ProRes was actually BETTER for color grading. I take it the people giving me this advice weren’t accounting for the fact that I was shooting on Sony.”
Kasey … I forgot to mention that you can transcode your footage into ProRes proxy files. Edit and grade using the proxies, then conform your original footage to the final proxy cut. This will make it easier to edit and grade than using the original Sony files. It may or may not help with the artifacts that are introduced once you output.
Robert Olding
http://www.8streetstudio.com
Minneapolis, MN
Michael Sanders
August 31, 2017 at 9:27 pm
Moire results when high-frequency detail is close to the resolution of the camera or display. And, as others have pointed out, compression can cause problems as well.
There is a scene in an episode of The Crown (one with fine tweed) that exhibited moire when I watched it via an Apple TV. I checked with a friend who worked on it, and he said it was clear on his monitor during post – but then he was watching on a top-end Sony 4K monitor from the RAW files ☺
https://www.xdcam-user.com/tech-notes/aliasing-and-moire-what-is-it-and-what-can-be-done-about-it/
Michael Sanders
London Based DP/Editor