Forum Replies Created

  • I know this is probably not the type of advice you want, but I would look at examples of actual night photography (still or moving) that approximate the look that is in your mind. Example:

    I did a day-for-night shot that had cars going through it. At first, I was just tracking some lens flares to the car headlight positions, and it was not working; it looked totally phony, more like a cartoon. Then I actually looked at footage of cars driving at night and copied that. It looked amazing, and the funniest thing was that it was much simpler; i.e., no flare.

    Look at what the light does to the surrounding physical elements like trees, fences, buildings, etc. Light spills, and the farther it is from the source, the weaker it is. This is where things can get a little complicated. Some objects might be covering up other objects, which means you’ll need to use masks so the light only affects objects in a logical way. (Think about a tree in the foreground and a light in the middle on a person: if you artificially add light to the person, you’ll need a mask to keep that light off the tree, which would be in silhouette.)
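    Just to put a rough number on how fast that spill dies off: for a small, point-ish source, intensity drops roughly with the square of the distance. This little Python sketch is my own illustration (not from the original post), just for eyeballing how much dimmer a faked light should read on things farther from the source:

        # Rough inverse-square falloff for a small (point-ish) light source.
        def relative_intensity(distance, reference_distance=1.0):
            """Intensity relative to what the light delivers at reference_distance."""
            return (reference_distance / distance) ** 2

        # A subject 2 m from a streetlight vs. a fence 6 m away:
        print(relative_intensity(2.0, reference_distance=2.0))  # 1.0  (fully lit subject)
        print(relative_intensity(6.0, reference_distance=2.0))  # ~0.11 (fence gets about 1/9 the light)

    So something three times as far from the source gets only about a ninth of the light, which is why real night footage goes dark so quickly away from the lamp.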

  • Winston A. Cely

    September 23, 2021 at 12:53 pm in reply to: Is Motion broken?

    …. When is Motion 6 coming out …. 😔

  • Winston A. Cely

    September 22, 2021 at 6:56 pm in reply to: Is Motion broken?

    I’m having issues making a lower third as well. I can add markers, and they initially show up green, but as soon as I edit them to make them a poster frame, or an optional build-in or optional build-out, they disappear.

    I used the Motion Template for titles. I’m on a brand new iMac M1.

  • Winston A. Cely

    March 29, 2021 at 3:00 pm in reply to: Audio Files from external recorders

    Yeah, my students are recording on Sony a6300s, so either XAVC S for 4K or HD. Thanks for your help on this! You’ve clarified something I should have known already.

  • Winston A. Cely

    March 29, 2021 at 12:40 pm in reply to: Audio Files from external recorders

    Gotcha. I’ll make this our policy moving forward.

    Is there a reason this has to be done? When importing video from our SD Cards, we don’t have to copy the video to our hard drive first and then import it.

  • Winston A. Cely

    March 22, 2021 at 6:25 pm in reply to: Audio Files from external recorders

    Yeah, I did.

  • I’m not sure if this is what you’re looking for or not. Upon re-reading your posts, I kinda think it might not be, but I’m gonna throw it down here just in case:

    So I think you need an image sequence that you play back as a single movie.

    First, gather all the images you need. Make sure that they are in a video-friendly format (like PNG), scaled to roughly the same size, and that they have an alpha channel.

    Next, make sure that they all follow the same naming pattern. This is per Apple’s site for image sequences for Motion:

    Important: Any imported image sequence must contain three or more digits of padding—for example, “imagename.0001.tif.”

    So you could do “emoji.0001.png”, “emoji.0002.png”, etc. (There’s a small renaming sketch at the end of this post.)

    Then, make your Motion project and import your image sequence. Again, from Apple’s site:

    1. In Motion, do any of the following:

      • Choose File > Import (or press Command-I).

      • In the toolbar, click the Import button.

      • Control-click an empty area of the Layers list or canvas (in the black area outside the project), then choose Import from the shortcut menu.

    2. In the dialog that appears, navigate to the image sequence, then select an image in the sequence.

    3. Select the Image Sequence checkbox.

      Note: If the Image Sequence checkbox does not appear, click Options in the lower portion of the dialog.

    4. Click Import.

      Motion uses each image in the sequence as a frame in a movie clip.

    Now that you have the image sequence imported, you can use it as the source for your emitter. The key here is that you want the emitter to choose a random frame from the source image sequence.

    This should enable you to use one emitter that emits different images.
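    As a side note, padding the filenames by hand gets old fast if you have a lot of images. This is just my own little Python sketch (not part of Apple’s instructions), and the folder paths and the “emoji” base name are placeholders; it copies a folder of PNGs into the “name.0001.png” pattern Motion expects:

        import shutil
        from pathlib import Path

        SOURCE = Path("~/Pictures/emoji_raw").expanduser()     # wherever the originals live
        DEST = Path("~/Pictures/emoji_sequence").expanduser()  # folder you'll import from
        DEST.mkdir(parents=True, exist_ok=True)

        # Sort so the sequence order is predictable, then copy with 4-digit padding.
        for i, src in enumerate(sorted(SOURCE.glob("*.png")), start=1):
            shutil.copy2(src, DEST / f"emoji.{i:04d}.png")  # -> emoji.0001.png, emoji.0002.png, ...

    If the images aren’t PNGs with an alpha channel yet, convert them first (Preview, Photoshop, whatever you’ve got) before running it.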

    Question 1: These new features don’t move the needle for me, mostly because I just bought an iPhone 11 Pro Max. I think they’re great, and in three or so years, when I have killed my phone, I’ll really look forward to the second or third generation of these features, as they’ll be more robust by then. If I had an iPhone that was at least a couple of years old, I would jump at getting a 12 Pro if I had the cash.

    Question 2: I would absolutely think about using it as a B camera, as I would with my current 11 Pro Max. As an A camera, it would depend on the narrative. Documentary, or “simple” fictional stories that don’t require a lot of “post-magic” or highly specific and minute control of camera functions, no. But that’s all sorta obvious.

    Question 3: I would absolutely add it to my collection of cameras. I’m no spring chicken (just passed 40 recently), but I do teach high school students, and this is the future. Most of my students have had a smartphone of some type glued to their hands since middle school. By the time I get these students, they want ease of use more than anything. They’re perfectly willing to give up creative control to just get things done, so I see smartphones being used as primary cameras soon enough. The biggest obstacle to this right now is the perception of the client: they expect to see some massive camera with tons of attachments and equipment as “professional,” and a smartphone as unprofessional.

  • Winston A. Cely

    October 15, 2020 at 3:07 pm in reply to: How is everyone doing?

    I teach film production/broadcasting in South Carolina to 10th, 11th, and 12th graders. Needless to say, many of our teachers, myself included, have felt like we’re being led to slaughter, with no hazard pay and the cancellation of the pay raise that was approved last year.

    Teaching has been strange, to say the least. Students are separated by plexiglass barriers, grouped by school, and there’s very little work that gets them out of their seats. Thankfully, we haven’t had many quarantined students from our school, but our feeder schools (I teach at a tech school, and we support 4 different high schools) have been having more and more students get quarantined each month.

    I’ve started the year with theory and critical analysis to keep everyone as separate as possible, but imagine yourself in their shoes… bored to death because you’re not getting to actually film anything or tell your stories. We are now, very slowly and carefully, transitioning to making our own projects. We’re starting by testing out making videos with smartphones and Chromebooks (using free software, because the feeder schools won’t pay for editing software to install on the Chromebooks), so that if we do have to go fully virtual, we’ve already addressed and fixed, or come up with alternatives to, the problems we encounter. Then we’ll start using our cameras and lighting equipment in small groups. Time will tell if this creates problems.

    On a personal level, I am terrified that I’ll lose my job. I teach a career that has been devastated by our current situation, on top of the fact that South Carolina is a fairly hostile state when it comes to production. The only hospitable place in the state is Charleston, which is more than 3 hours from where I teach.

    It’s good to see that productions are picking up, but I live in a state so uninterested in production that I fear the business here may be on its last legs.

  • Winston A. Cely

    August 28, 2020 at 3:02 pm in reply to: iOS question

    Thanks again!

    Winston A. Cely
    ACTC Media Broadcasting Video Instructor
    Apple Certified Editor FCPX 3

    “If you can talk brilliantly enough about a subject, you can create the consoling illusion it has been mastered.” – Stanley Kubrick

