Apple’s ‘Magnetic Timeline’ Final Cut Pro X patent
This week Apple was awarded a patent related to the Final Cut Pro X timeline. It is available as text and as a 95-page PDF, and was applied for on June 6, 2011. The abstract covers only part of the patent:
A media-editing application of some embodiments allows a user of the application to group media clips displayed in the timeline into a single clip representation. A composite display area of the media-editing application often displays numerous clips at various instances in time and at various levels in the compositing hierarchy. To reduce the number of media clips in the timeline, the media-editing application of some embodiments allows the user to select several media clips and combine them into a one media clip representation. In this manner, the media-editing application reduces the congestion in the timeline. These single clip representations are referred to as “compound clips.” Compound clips can be viewed as containers that can include several media clips of the same type in some embodiments, or that can include several media clips of several different types in other embodiments.
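The container idea in the abstract can be sketched as a simple data model. This is a hypothetical illustration, not Apple's implementation; the names (`Clip`, `CompoundClip`, `compound_selection`) are mine, and it assumes the selected clips sit next to each other in the timeline:

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Clip:
    name: str
    duration: float  # seconds

@dataclass
class CompoundClip:
    """One clip representation that contains several media clips."""
    name: str
    children: List[Union[Clip, "CompoundClip"]] = field(default_factory=list)

    @property
    def duration(self) -> float:
        # Assume the contained clips play back-to-back.
        return sum(c.duration for c in self.children)

def compound_selection(timeline, selected, name):
    """Collapse a contiguous run of selected clips into one compound
    clip, reducing the number of items shown in the timeline."""
    start = timeline.index(selected[0])
    compound = CompoundClip(name, children=list(selected))
    return timeline[:start] + [compound] + timeline[start + len(selected):]
```

Collapsing two of four clips this way leaves the timeline's total duration unchanged while showing fewer items — the "reduced congestion" the abstract describes.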
Although the abstract mainly covers compound clips, most of the ways a non-track-based magnetic timeline works are described in the patent itself. A little ‘smuggling’ by the patent lawyers?
Here’s a list of contents showing which editing software features Apple now has a patent for. Instead of reading the full text, use this list of figures with some interesting quotes from the relevant sections. Remember that the phrase “in some embodiments” doesn’t mean Apple planned to add that feature to Final Cut Pro; these clauses are included to make the patent cover a wider range of possible editing software features.
Fig 1: Main UI
Connected clips in secondary storylines are referred to as “anchored clips in anchored or secondary lanes.”
Instead of, or in conjunction with, having several levels of media clips that anchor off the central compositing lane, some embodiments allow media clips to be placed in these anchor lanes and to be anchored off of other anchored media clips placed in these anchor lanes.
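The anchoring described above can be sketched as a tree: each clip carries an offset relative to the clip it is anchored to, so moving a clip drags everything anchored to it along. This is a minimal model of my own, not the patent's data structures:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TimelineClip:
    name: str
    duration: float
    anchor_offset: float = 0.0  # seconds into the parent it anchors to
    anchored: List["TimelineClip"] = field(default_factory=list)

def absolute_start(clip, parent_start=0.0):
    """A clip's timeline position is its parent's start plus its offset."""
    return parent_start + clip.anchor_offset

def move(clip, delta):
    # Because anchored offsets are relative, moving a clip implicitly
    # moves every clip anchored to it (and clips anchored to those).
    clip.anchor_offset += delta
```

For example, a title anchored 2 seconds into a primary-storyline clip, with a badge anchored 1 second into the title, starts at 2 s and 3 s respectively; moving the primary clip 5 seconds later shifts both to 7 s and 8 s without any per-clip bookkeeping.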
Skimming within clips with the option to see skimmed clips in the viewer:
In some embodiments, the playback (or skimming) is not shown in the timeline clips, but rather in the preview display area
Fig 2: Selecting ranges in clips before reducing audio volume of range
Fig 3-4: Expand Audio/Video clips and Detach Audio
Fig 5: Change appearance of clips in the timeline
Some embodiments may also allow the user to directly manipulate the media clips (e.g., by expanding or contracting the audio or video portions using a cursor) to change the appearance.
Fig 6: Zooming timeline horizontally
Fig 7-9: The playhead and the skimmer
Fig 10-11: Clip skimming
Fig 12: Insert edit
Fig 13: Adding a selected event clip to the end of the timeline
Fig 14: Connecting a clip
Fig 15-18: Replace edits
Gap clips to maintain duration:
When the second media clip is shorter in duration than the first media clip, some embodiments replace the first clip by placing the second clip at the point that the first clip starts. This will cause a gap to close or to be filled with a position clip after the second clip.
Fig 19-20: Gap clips
Known in the patent as ‘position clips’
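The replace behaviour quoted above can be sketched as follows. The clip model and function name are mine; this shows the ‘keep duration’ case, where a shorter replacement leaves a position clip behind so later clips don’t move:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    duration: float

def gap(duration):
    # The patent calls these "position clips": placeholder clips that
    # occupy time so downstream clips don't ripple earlier.
    return Clip("position clip", duration)

def replace_keep_duration(timeline, index, new_clip):
    """Replace timeline[index] with new_clip, placing new_clip at the
    point the old clip started; if new_clip is shorter, fill the
    remainder with a position clip so later clips stay put."""
    old = timeline[index]
    result = timeline[:index] + [new_clip]
    shortfall = old.duration - new_clip.duration
    if shortfall > 0:
        result.append(gap(shortfall))
    return result + timeline[index + 1:]
```

Replacing a 10-second clip with a 6-second one this way inserts a 4-second position clip, so the overall timeline duration is unchanged. The alternative the quote mentions — closing the gap — would simply omit the position clip and let later clips ripple.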
Fig 21: Trimming connected clips
Including the fact that audio-only connected clips can be trimmed down to the sample level instead of being limited to whole frames.
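The frame-versus-sample distinction amounts to snapping a trim point to grids of different sizes. A minimal sketch of my own formulation (the patent describes the behaviour, not a formula):

```python
def quantize_trim(t_seconds, units_per_second):
    """Snap a trim point to the nearest unit boundary.
    Video trims snap to whole frames (units_per_second = frame rate,
    e.g. 24); audio-only connected clips can snap to individual
    samples instead (units_per_second = sample rate, e.g. 48000)."""
    return round(t_seconds * units_per_second) / units_per_second
```

At 24 fps a trim at 1.234 s snaps to 1.25 s (the nearest frame boundary), while at a 48 kHz sample grid it is effectively untouched — which is why sample-accurate audio trims feel continuous compared with frame-quantized video trims.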
Fig 22: Slipping clips
Fig 23: Connection point
Including an option to use the point dragged from in the event clip as the connection point when dragging to the timeline. For example, if the mouse-down is a third of the way along the clip or selection when you start dragging, the connection point is set a third of the way along the clip.
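That proportional mapping is simple to sketch (hypothetical names of my own; the patent describes the behaviour, not this code):

```python
def connection_offset(mouse_down_x, clip_left_x, clip_width_px, clip_duration):
    """Map where the drag started within the browser clip (as a
    fraction of its on-screen width) to the same fraction of the
    clip's duration, giving the connection point in seconds."""
    fraction = (mouse_down_x - clip_left_x) / clip_width_px
    return fraction * clip_duration
```

A mouse-down 100 px into a 300 px clip representation of a 9-second clip would put the connection point 3 seconds into the clip.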
Fig 24: Changing the connection point
Fig 25: Creating a secondary storyline
Some embodiments allow the user to create a container first in an anchor lane and add media clips into the container from the timeline or the clip browser.
Fig 26: Extend edits
Fig 27-31: Editing and moving the playhead using timecode entry
Fig 32: Editing with video only, audio only or both audio and video
Fig 33-35: Two Up view
The media-editing application in some embodiments displays two frames in the viewer for other types of editing as well. Some of different types of editing include a ripple edit, a slide edit, a slip edit, etc.
Fig 36: Making compound clips
Fig 37: Navigating Timeline History
Fig 38: Bookmarking a timeline history view
Fig 39: Timeline history state diagram
Fig 40: Retiming a compound clip
Fig 41-46: Importing clips into a database
Including transcoding and proxy generation
Fig 47-50: How timelines are represented in the database
Fig 51: Application architecture
It might be possible to associate some of the internal frameworks in Final Cut Pro X and iMovie with elements of this diagram. For example, the Rendering Engine could be implemented by ‘Ozone.framework’ – the ‘headless’ copy of Apple Motion 5.X in Final Cut and iMovie. You might be able to guess what ‘TLKit.framework’ does.
There’s an interesting hint about how the application defined in the patent might not only be a traditional application running on a computer:
In some embodiments, the media editing application is a stand-alone application or is integrated into another application, while in other embodiments the application might be implemented within an operating system. Furthermore, in some embodiments, the application is provided as part of a server-based solution. In some such embodiments, the application is provided via a thin client. That is, the application runs on a server while a user interacts with the application via a separate machine remote from the server. In other such embodiments, the application is provided via a thick client. That is, the application is distributed from the server to the client machine and runs on the client machine.
Fig 52: Computing device
All software patents need to include a description of a computing device for the software to run on.