Shawn Larkin
Forum Replies Created
-
Thanks all.
Kevin, your reply is what I needed to know. Is there any way to create your own custom Title > Presets?
That was the power of the Motion > FCP relationship: the amazing simplicity of publishing templates into FCP with fully customizable text boxes and controls, so the editor doesn’t have to do much, or even open another app.
In the new FCP X / Motion 5 it’s even better because you can set all kinds of rigging and publishing parameters. But I really want to go the Adobe direction instead. This is one of those major features for our shop because the editors are NOT motion graphics savvy. They need the drag-and-drop simplicity of a template that a good designer has already worked on.
Also, I was not talking about dragging a Motion project onto the timeline. I mean specifically publishing from Motion to FCP so the templates just show up in the Effects > Templates > Subcategory Bin.
If you have to import an AE project and then choose which Comp to put down on a video track, do you have to go into AE to edit the text in that Comp? There is no control of the parameters inside PP, right?
Please explain more.
Thanks in advance again.
-
Thanks for the reply Scott. I’m glad someone had something to say to me regarding that post.
For the record:
I use Avid, Premiere Pro, AE, FCP, Motion and any other tool I need in a given situation or environment. I’m not really a “fanboy” of what Apple makes. I just like well designed solutions that make things easier.
Basically, I look at things from a “does it work or not” perspective. Apple is in a unique position to survey the NLE landscape from 10 years of experience and feedback and to create what they think answers a lot of problems with editing.
I “got” this from watching the demo. The way you navigate the timeline and media and, well, edit seemed like a paradigm shift to me, and it felt like these smart architects were really trying to re-think editing from the basics on forward. This, of course, is my opinion. But it felt very “new” to me. And I’ve been making my living at this for over 10 years — started on Media 100, then Avid, then FCP, then PP, blah, blah, blah…
Hence, no one knows if this is going to work or not. Not yet.
And yet there is a very closed mind to this before actually testing it for an extended period of time.
So it goes…
-
Great thread so far. I read Walter’s article on his blog first and now here again. My feelings reading this and other Pro responses have to do with the open vs. closed mindset of “Mature” Pro Editors.
Apple was clear that this was a sneak peek. Larry Jordan did a fine job reiterating this in his coverage so far. Surely, Apple will HAVE TO address all the backwards compatibility and interoperability concerns Pro Editors have with FCP X.
When you step back and look at the big picture here, you can see the thinking involved with this product at the demo stage: they are trying to simplify and clean up how to edit. There is a bit of a paradigm shift with all the reliance on metadata, and the single viewer, and the “trackless” magnetic timeline, which poses massive concerns for those that know how to “get around” and are comfortable with what they already know.
But no one — NO ONE ON THIS FORUM ANYWAY — has experience learning this version and using it and getting the best out of it yet. All the hoopla about “will it support X or Y because we have invested in hardware or need a solution for whatever…” is immediate, but misses the point of the demo: to show Pro Editors a new / better way to edit with a shift in habits. This will take some getting used to. And surely all the concern is rightfully part of all the money and time spent investing in previous gear and habits.
But change is not always easy. It’s almost like listening to those Avid Editors who never wanted to try FCP. Or try telling a die-hard AE guy that his composite is easier with a node-based solution, like Nuke.
So what if the software looks pretty and clean and “iMovie-ish?” At the end of the day, if Editor A can use FCP X more fluidly with a shorter learning curve and get more power out of it than Editor B can from his older editing software, then who cares what it looks like? And if you are upset about how cheap it is and that it narrows the gap between Pro and Consumer, well, you’re right. The argument still stands that it’s the artist using the tool who creates, and that access does not equate to good work.
Everything seems very speculative to me and without using this tool no one knows if it works or doesn’t.
Even if Apple forces everyone using FCP X to abandon current gear — WHICH I DOUBT THEY WILL DO — to use this system, it might be for the better in the long run. Backwards support sure seems to hinder other platforms and systems. I mean, look at Microsoft Windows for example 🙂
Ultimately, I’m sure FCP7 and all of FCS3 will be used transitionally as people learn how to use FCP X. And all this much ado about nothing will be an afterthought.
Or not.
-
Thanks Scott.
I have only found that certain SSDs had problems running ProApps in the past. Any new SSD formatted HFS+ should be fine under OS X 10.6.7.
I think what I am getting at is more of a paradigm-shift idea: you might not need to set your media scratch disk to a non-boot drive. I mean, in the past, there was good reason for this. And if you capture files — small or big — you still need more space than the average SSD can provide.
However, if blazing fast SSDs were the norm and they had lots of storage space on them, would we really want to have an external drive for scratch disks, cache files, and render locations? Probably. But in some cases, probably not. I think mine is one of those cases. And, well, I guess I’ll have to test the performance once I install the drive.
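For what it’s worth, the performance test I have in mind is nothing fancy. Here is a minimal sketch: time a big sequential write to the SSD and compare the result against what render and cache files actually need. The mount point and sizes are placeholders for my setup, and this is a gut check, not a proper benchmark tool.

```python
# Quick sequential-write gut check for the new SSD (not a real benchmark tool).
# The path and sizes are placeholders; adjust for your own volume.
import os
import time

TEST_FILE = "/Volumes/BootSSD/ssd_write_test.bin"   # hypothetical mount point
CHUNK = 64 * 1024 * 1024    # 64 MB per write
TOTAL = 8 * CHUNK           # 512 MB total, enough to get past small caches

data = os.urandom(CHUNK)
start = time.time()
with open(TEST_FILE, "wb") as f:
    written = 0
    while written < TOTAL:
        f.write(data)
        written += CHUNK
    f.flush()
    os.fsync(f.fileno())    # make sure the data actually hit the disk
elapsed = time.time() - start
os.remove(TEST_FILE)

mb_per_sec = (TOTAL / (1024 * 1024)) / elapsed
print(f"Sequential write: ~{mb_per_sec:.0f} MB/s")
# For scale: ProRes 422 HQ at 1080p30 is roughly 28 MB/s, so even a modest
# SSD should have plenty of headroom for render and cache files.
```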
Thanks for the feedback all.
-
Thanks for the input.
Actually, I will end up rendering to an external HD. I send all jobs to Compressor which defaults to rendering to an external HD. So really the only files saved to my boot drive will be the cached render files and project backup files each FCS app generates. And I will clean these up regularly.
My question was really more of a…
Does anyone see any foreseeable problems with allowing your boot drive to be your media drive (by keeping all your FCS apps pointed to it instead of an external drive) IF it’s a super fast SSD and the majority of the media you edit is located on an external network drive?
My guess is there is no real concern. But maybe someone knows something I don’t, so I thought I would ask.
Thanks again in advance.
-
The point is not to use Arri Raw, which translates to enormous, cumbersome files recorded on set and big pipeline issues to deal with in post. The gained latitude and color accuracy do not outweigh the cost, time, and inconvenience. And I’m speaking from experience: I have seen the ultimate filmout demo at FotoKem and know about many tests related to ProRes4444 via Alexa.
For me what is “revolutionary” about this camera is workflow and bang for buck with the codec.
So yes, SSDs make sense if they can be small enough to be housed in a mini enclosure that clips onto the camera (Red style) AND record ProRes4444 “full raster” at all available frame rates.
Thanks for your help.
-
So the real “bottleneck” regarding full image raster and over-cranking is the SxS media write speed, not anything else with the camera hardware?
I ask in the hopes that faster SxS cards will fix both these limitations down the line.
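To put rough numbers on why I suspect the cards are the limit, here is a quick back-of-the-envelope sketch (Python, just for the arithmetic). The 330 Mb/s figure is Apple’s published target for ProRes 4444 at 1080p around 30 fps; I’m assuming the rate scales roughly linearly with frame rate, and the card write speed below is only a placeholder to swap for the real SxS spec.

```python
# Back-of-the-envelope only: does a given sustained card write speed keep up
# with ProRes 4444 as the frame rate climbs? All numbers are approximate.

PRORES_4444_1080P30_MBPS = 330.0   # Apple's published target bitrate at ~30 fps
CARD_WRITE_MBPS = 800.0            # placeholder; substitute the real SxS spec

for fps in (24, 30, 48, 60, 120):
    needed = PRORES_4444_1080P30_MBPS * (fps / 30.0)   # assume roughly linear scaling
    verdict = "fits" if needed <= CARD_WRITE_MBPS else "exceeds the card"
    print(f"{fps:>3} fps: ~{needed:.0f} Mb/s ({verdict})")
```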
I don’t really want to record raw and use an additional device when ProRes4444 is enough.
-
FYI: The vast majority of theaters in the world still project 35mm. And you get these choices with a non-4×3 film frame:
1) Hard Matte in 1.85 or 1.66 using Spherical Lenses
2) Full Projection Area using Anamorphic Lenses
With the newer crop of digital 2K and 4K projection, I do not know if Mattes with Spherical Lenses OR Anamorphic Lenses are being used OR if they just project the exact pixel aspect ratio of the show and adjust the Spherical Lens focal distance and screen curtain accordingly.
Do you know the standards here? This seems important to me.
Thanks for the Hawk Lens referral; they look quite promising. Are they the only company doing a 1.3 Anamorphic Lens now?
-
Thanks. Just started to read some tech specs on the viewfinder. All very clear now.
My Anamorphic Squeeze question is in relation to 35mm projection, which can only project 2.35/1 with Anamorphic Lenses. There is no hard matte to crop the projection area of a film print to anything wider than 1.85/1.
So you would have to squeeze the cropped Alexa image into a film negative during film out in order to project it “widescreen anamorphic.”
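To spell out the arithmetic I’m assuming (ratios rounded, and the exact aperture numbers are up to the film-out house, so treat this as a sketch rather than a spec):

```python
# Rounded arithmetic for a scope film-out; exact apertures vary by lab.

SCOPE = 2.39             # target projected aspect ratio ("widescreen anamorphic")
PROJECTOR_SQUEEZE = 2.0  # standard 2x anamorphic projection lens

on_negative = SCOPE / PROJECTOR_SQUEEZE
print(f"Squeezed image on the negative: ~{on_negative:.2f}:1")      # ~1.20:1

# A 16:9 Alexa frame (1.78:1) cropped to 2.39:1 keeps its full width,
# so the crop only costs vertical resolution:
crop_loss = 1 - (16 / 9) / SCOPE
print(f"Vertical crop from 16:9: ~{crop_loss:.0%} of the height")   # ~26%
```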
I’m sure this looks good, but I wanted to hear any thoughts or considerations from someone who has done this before.
Thanks in advance.
-
Gary,
I think you misunderstood my post.
You HAVE TO DeBayer with RED. You do not with Alexa since the camera is doing this on the fly to ProRes. That = Better to me.
The ProRes 422 HQ and 4444 Codecs may be “overkill” for offline, but their file sizes are small enough to use in this fashion.
AND if you want to get hardcore, you make sure you have recorded RAW in sync with ProRes so you can online to that for film work. However, IMHO you don’t need this much latitude if you work in ProRes 4444.
Regarding Avid, well, they now support QT and ProRes. But I doubt MC5 will be ubiquitous enough, and I am sure old Avid Editors will not be able to jump on this bandwagon, especially in broadcast, which has overpaid for Avid boxes that have a very simple Codec Workflow.
To me Alexa just makes more sense in post. But that’s me.