So I’m working on a personal video series that spans a decade, and the amount of content is exhaustive, to say the least.
I’m looking at probably 60–100K video files total related to this project, and I’m trying to devise a workflow without losing my mind…
Currently I split everything into a few different libraries: 20K video files in one library, 15K in another, 10K in a third, plus 15–20K iPhone video files in another. (And that’s not including the other 50K, which are spread across libraries of probably 100–1,000 files each.)
Ideally I would have everything in one library (which theoretically seems possible, but I’ve had issues with macOS and external OWC RAIDs getting hung up reading large libraries, especially ones with so many projects inside them).
The second-best option would be to have specific libraries for the content, and each season of the series could be its own library that pulls footage from the “content” libraries. The issue is keeping metadata synced: keywords, favorites, used media, etc.
So I’d be curious to know if anyone else has experience or advice working with MASSIVE FCPX libraries.
Yes, I created The Second Life of Jamie P, a feature doc with more than 9TB of footage. FCP handled it okay, sort of, but too late I realized I should have used proxy files, which increase editing performance and take up considerably less storage space than optimized files.
Thanks Roger, but I actually have no problem with original media (I’ve got an 8-bay RAID 0 doing 1500MB/s), and FCPX on the M1 Max handles the media with no problems. In fact, I had a TON of issues when trying to work with this library using proxy files. It seemed like FCPX would freeze trying to read/associate all of the proxies, even though working with original media was fine.
My question is mostly focused on library-to-library workflow, to make sure keywords, favorites and projects stay synced and files aren’t duplicated when transferred.
Or if anyone has a single library with 50K clips in it that works without issues.
Libraries can be large in terms of data size, number of clips, number of edits, or some combination. Apparently your main concern is number of clips.
There is no documented limit, but practical experience indicates that as you exceed 10,000 clips per event you may run into problems. The FCP library is internally composed of several SQLite databases: one for each event and one for each project. Each database contains several SQL tables with indexes on certain columns.
Each clip can require multiple rows in some of these tables to store its attributes, effects and edits. However, even if that were 20 or 50 rows per clip, I don’t know why there would be even a soft limit around 10,000 clips, since SQLite can supposedly handle millions of rows.
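To put that scale in perspective, here’s a quick sketch using plain SQLite. The schema is purely hypothetical (not FCP’s actual tables, which aren’t documented), and the 30-rows-per-clip figure is an assumption, but it shows that even a 10,000-clip event with multiple rows per clip is tiny by SQLite standards:

```python
# Hypothetical per-event clip database -- NOT FCP's real schema,
# just an illustration of how "multiple rows per clip" scales.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE clip (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE attribute (
        clip_id INTEGER REFERENCES clip(id),
        key TEXT, value TEXT
    );
    CREATE INDEX idx_attr_clip ON attribute(clip_id);
""")

ROWS_PER_CLIP = 30  # assumed average attribute/effect rows per clip
for clip_id in range(10_000):
    conn.execute("INSERT INTO clip VALUES (?, ?)", (clip_id, f"clip_{clip_id}"))
    conn.executemany(
        "INSERT INTO attribute VALUES (?, ?, ?)",
        ((clip_id, f"key_{i}", "v") for i in range(ROWS_PER_CLIP)),
    )

total = conn.execute("SELECT COUNT(*) FROM attribute").fetchone()[0]
print(total)  # 300000 -- well within what SQLite handles comfortably
```

So whatever soft limit people hit around 10,000 clips, it’s unlikely to be raw SQLite row capacity.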
However, the underlying data model is apparently an object-oriented “graph” model built on Core Data, which in turn uses SQLite as its persistent store. It appears FCP does not emit SQL statements directly; instead Core Data translates object-graph operations into SQL. Maybe there are limitations at that layer.
I’ve edited a large documentary composed of 8,500 4k clips, proxies for all of them, spread across about eight events, totaling about 20 terabytes of external data. It worked fairly well on a 10-core iMac Pro.
I recently had several crashes in FCP 10.6.1 on an event containing 17,000 fairly small HEVC clips. In the Event Browser, if I paged down, periodically setting markers, then tried to jump forward and backward between markers with the shortcuts CTRL+; and CTRL+, (IOW CTRL+semicolon and CTRL+comma), it would crash in a hashing function. The stack trace implies a thread was trying to obtain a shared read lock, probably on a database page.
In general, FCP library data integrity and performance seem very good. Still, I’d be a bit cautious about exceeding 10,000 clips per event.
To search across all events it may do a relational join or else store interim results in temp tables. I’m not sure if staying below 10,000 clips per event would allow you to have 100,000 clips spread across 10 events — you’d have to test that. The tests should include performance on library-wide queries, library-wide smart collections, setting markers on clips in the event browser and jumping forward and backward between those.
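One way to picture a library-wide query over per-event databases is SQLite’s ATTACH mechanism, which lets a single connection query several database files at once. This is only a sketch of how such a search *could* be expressed, not a claim about what FCP actually does:

```python
# Sketch: one keyword search across several per-event SQLite files
# using ATTACH + UNION ALL. Schema and mechanism are hypothetical.
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
event_paths = []
for n in range(3):
    path = os.path.join(tmp, f"event_{n}.db")
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE clip (name TEXT, keyword TEXT)")
    db.execute("INSERT INTO clip VALUES (?, ?)", (f"clip_{n}", "interview"))
    db.commit()
    db.close()
    event_paths.append(path)

# Attach each event database to one connection, then UNION the results.
conn = sqlite3.connect(":memory:")
for n, path in enumerate(event_paths):
    conn.execute(f"ATTACH DATABASE ? AS ev{n}", (path,))
query = " UNION ALL ".join(
    f"SELECT name FROM ev{n}.clip WHERE keyword = ?" for n in range(3)
)
hits = [row[0] for row in conn.execute(query, ["interview"] * 3)]
print(hits)  # ['clip_0', 'clip_1', 'clip_2']
```

Worth noting: SQLite’s default compile-time limit is 10 attached databases per connection (raisable to 125), which is an interesting coincidence if you’re contemplating 100,000 clips spread across 10 events.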
There is a more common performance issue if you have lots of projects. Each project is a separate SQLite database, and there is system overhead in keeping a database open (I don’t know how much). FCP apparently uses a “deferred open” algorithm for projects, maybe to reduce that overhead. It also seems to enumerate all or many projects in certain phases, which makes opening a library or event slow. I don’t know why it does that, but having lots of projects, project duplicates and snapshots will cause performance issues fairly quickly.
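The “deferred open” idea is a standard lazy-initialization pattern; here’s a minimal sketch of it (my own illustration, not FCP code) showing why a thousand project files cost almost nothing until one is actually touched:

```python
# Sketch of "deferred open": remember only the path per project and
# open the SQLite connection on first real access. Hypothetical
# illustration, not FCP's actual implementation.
import sqlite3

class Project:
    def __init__(self, path):
        self.path = path   # cheap: just a string, no file handle yet
        self._conn = None  # no connection, page cache, or lock yet

    @property
    def conn(self):
        if self._conn is None:  # open lazily, on first use
            self._conn = sqlite3.connect(self.path)
        return self._conn

projects = [Project(":memory:") for _ in range(1_000)]
print(sum(p._conn is not None for p in projects))  # 0 -- nothing opened yet

projects[0].conn.execute("CREATE TABLE edit (id INTEGER)")
opened = sum(p._conn is not None for p in projects)
print(opened)  # 1 -- only the project that was actually edited
```

The trade-off is exactly what the enumerate-all-projects behavior suggests: lazy opening keeps idle projects cheap, but any operation that has to look inside every project forfeits the benefit all at once.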