Articles tagged with: Patent

Apple’s patent for applying effects to clips with specific roles

Tuesday, 07 June 2016

The name of patent 9,240,215 may be ‘Editing operations facilitated by metadata,’ but it is about applying effects to roles in Final Cut Pro X:

For example, several clips may be assigned one audio role of "Dialog", "Music", or "SFX". A process then provides one or more user interface controls. These user interface controls are also associated with the tagged clips. That is, the user interface controls are associated so that these controls can be used to display or modify properties of the tagged clips.
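As a thought experiment, here is a minimal Swift sketch of what role-based controls could look like. The type names and the single volume control are my illustration, not code from the patent or from Final Cut Pro:

```swift
// A minimal sketch (not Apple's implementation): clips tagged with an
// audio role, and one control that adjusts every clip sharing that role.
enum AudioRole: String {
    case dialog = "Dialog"
    case music = "Music"
    case sfx = "SFX"
}

struct AudioClip {
    let name: String
    var role: AudioRole
    var volume: Double   // linear gain, 1.0 = unity
}

/// Apply a volume change to every clip tagged with the given role,
/// the way a role-based fader control might.
func setVolume(_ volume: Double, forRole role: AudioRole, in clips: inout [AudioClip]) {
    for index in clips.indices where clips[index].role == role {
        clips[index].volume = volume
    }
}

var timeline = [
    AudioClip(name: "Interview A", role: .dialog, volume: 1.0),
    AudioClip(name: "Music bed", role: .music, volume: 0.8),
]
setVolume(0.5, forRole: .music, in: &timeline)   // duck all music clips at once
```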

PDF version.


Apple’s structure editing patent

Tuesday, 07 June 2016

While editors wait for the next big Final Cut Pro X update, I hope the Apple ProApps team will implement some of the ideas in their ‘structure editing’ patent. Here’s my old writeup on fcp.co of the patent they applied for in 2009:

Most people think that the editor’s job is ‘to cut out the bad bits’ in individual scenes. Many are surprised to discover that editors commonly change and improve storytelling by changing story structure. As many film and TV makers consider that structure is very important when it comes to telling stories, I think it is a good idea for video editing software to recognise story structure.

Structure applies to feature films, TV shows, groups of corporate videos on an intranet, legal video depositions, architects’ video proposals or open-ended weekly web series. The more video applications can have these structures encoded in their projects, the better the tools they’ll be able to provide to a wider range of people all over the world.
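To make that concrete, here is a hypothetical sketch of story structure encoded in a project. None of these types come from Apple or the patent; they only illustrate how restructuring could become a first-class operation:

```swift
// A hypothetical sketch of story structure encoded in a project file.
// All type names here are mine, for illustration only.
struct Scene {
    var title: String
    var clips: [String]   // references to media clips
}

struct Act {
    var name: String      // e.g. "Setup", "Confrontation", "Resolution"
    var scenes: [Scene]
}

struct Project {
    var acts: [Act]

    /// With structure encoded, restructuring the story becomes one
    /// structural operation instead of a long series of manual clip moves.
    mutating func moveScene(from: (act: Int, scene: Int), to: (act: Int, scene: Int)) {
        let scene = acts[from.act].scenes.remove(at: from.scene)
        acts[to.act].scenes.insert(scene, at: to.scene)
    }
}
```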

Apple's 'Magnetic Timeline' Final Cut Pro X patent

Wednesday, 29 October 2014

This week Apple was awarded a patent related to the Final Cut Pro X timeline. It is available as text and as a 95-page PDF. It was applied for on June 6, 2011. The abstract covers only part of the patent:

A media-editing application of some embodiments allows a user of the application to group media clips displayed in the timeline into a single clip representation. A composite display area of the media-editing application often displays numerous clips at various instances in time and at various levels in the compositing hierarchy. To reduce the number of media clips in the timeline, the media-editing application of some embodiments allows the user to select several media clips and combine them into a one media clip representation. In this manner, the media-editing application reduces the congestion in the timeline. These single clip representations are referred to as "compound clips." Compound clips can be viewed as containers that can include several media clips of the same type in some embodiments, or that can include several media clips of several different types in other embodiments.

Although the abstract mainly covers compound clips, most of the ways a non-track-based magnetic timeline works are described in the patent itself. A little 'smuggling' by the patent lawyers?
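As an illustration of the compound clip idea from the abstract, here is a minimal Swift sketch. The names are mine, not the patent's:

```swift
// A minimal sketch of a "compound clip": a clip representation that
// can itself contain other clips. Illustrative names only.
indirect enum TimelineItem {
    case clip(name: String, duration: Double)              // seconds
    case compound(name: String, children: [TimelineItem])

    var duration: Double {
        switch self {
        case .clip(_, let duration):
            return duration
        case .compound(_, let children):
            // Assumes the children play one after another.
            return children.reduce(0) { $0 + $1.duration }
        }
    }
}

// Combine several clips into one representation to reduce timeline congestion.
let interview = TimelineItem.compound(name: "Interview", children: [
    .clip(name: "Question 1", duration: 12),
    .clip(name: "Answer 1", duration: 40),
])
print(interview.duration)   // 52.0
```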

Here's a list of contents to show which editing software features Apple now has a patent for. Instead of reading the full text, use this list of figures with some interesting quotes from the relevant sections. Remember that the phrase "in some embodiments" doesn't mean that Apple planned to add that feature to Final Cut Pro; these clauses are included to make the patent cover a wider range of possible editing software features.

Fig 1: Main UI

8875025-fig1

Connected clips in secondary storylines are referred to as "anchored clips in anchored or secondary lanes."

Instead of, or in conjunction with, having several levels of media clips that anchor off the central compositing lane, some embodiments allow media clips to be placed in these anchor lanes and to be anchored off of other anchored media clips placed in these anchor lanes.
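Here is a small sketch of that anchoring relationship: a clip keeps an offset relative to its parent, which may itself be an anchored clip. Again, the types are illustrative rather than Apple's:

```swift
// A sketch of anchor lanes: each anchored clip stores an offset from its
// parent; a nil parent means it is anchored to the primary storyline.
final class AnchoredClip {
    let name: String
    var offsetInParent: Double        // seconds from the parent's start
    weak var parent: AnchoredClip?

    init(name: String, offsetInParent: Double, parent: AnchoredClip? = nil) {
        self.name = name
        self.offsetInParent = offsetInParent
        self.parent = parent
    }

    /// Absolute position: when a parent moves, everything anchored
    /// to it (directly or indirectly) moves with it.
    var startTime: Double {
        (parent?.startTime ?? 0) + offsetInParent
    }
}

let storyClip = AnchoredClip(name: "A-roll", offsetInParent: 10)
let cutaway = AnchoredClip(name: "B-roll", offsetInParent: 2, parent: storyClip)
let title = AnchoredClip(name: "Lower third", offsetInParent: 0.5, parent: cutaway)
print(title.startTime)   // 12.5: a clip anchored off another anchored clip
```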

Skimming within clips with the option to see skimmed clips in the viewer:

In some embodiments, the playback (or skimming) is not shown in the timeline clips, but rather in the preview display area

Fig 2: Selecting ranges in clips before reducing audio volume of range

Fig 3-4: Expand Audio/Video clips and Detach Audio

Fig 5: Change appearance of clips in the timeline

Some embodiments may also allow the user to directly manipulate the media clips (e.g., by expanding or contracting the audio or video portions using a cursor) to change the appearance.

Fig 6: Zooming timeline horizontally

Fig 7-9: The playhead and the skimmer

Fig 10-11: Clip skimming

Fig 12: Insert edit

Fig 13: Adding a selected event clip to the end of the timeline

Fig 14: Connecting a clip

Fig 15-18: Replace edits

Gap clips to maintain duration:

When the second media clip is shorter in duration than the first media clip, some embodiments replace the first clip by placing the second clip at the point that the first clip starts. This will cause a gap to close or to be filled with a position clip after the second clip.

Fig 19-20: Gap clips

Known here as Position clips
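A minimal sketch of that replace-edit behaviour, with illustrative types, might look like this:

```swift
// When the incoming clip is shorter, a position (gap) clip keeps the
// downstream clips in place. The types here are illustrative only.
enum Item {
    case clip(name: String, duration: Double)
    case gap(duration: Double)   // a "position clip"

    var duration: Double {
        switch self {
        case .clip(_, let d), .gap(let d): return d
        }
    }
}

func replace(at index: Int, with newClip: Item, in timeline: inout [Item]) {
    let removed = timeline[index].duration
    timeline[index] = newClip
    let shortfall = removed - newClip.duration
    if shortfall > 0 {
        // Fill the remainder so later clips keep their start times.
        timeline.insert(.gap(duration: shortfall), at: index + 1)
    }
    // If the new clip is longer, a magnetic timeline would instead
    // ripple the downstream clips later; that case isn't modelled here.
}
```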

Fig 21: Trimming connected clips

Including the fact that if connected clips are audio only, they can be trimmed down to the sample level instead of being limited to whole frames.

Fig 22: Slipping clips

Fig 23: Connection point

Including an option to use the point dragged from in the event clip as the connection point when dragging to the timeline, i.e. if the mouse-down is 1/3rd of the way along the clip or selection when you start dragging, then the connection point is set 1/3rd of the way along the clip.
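The arithmetic is simple. A hypothetical sketch:

```swift
// The fraction of the clip under the cursor at mouse-down becomes the
// connection point. All names here are mine, for illustration.
func connectionOffset(mouseDownX: Double,
                      clipOriginX: Double,
                      clipWidth: Double,
                      clipDuration: Double) -> Double {
    let fraction = (mouseDownX - clipOriginX) / clipWidth   // e.g. 1/3
    return fraction * clipDuration                          // seconds into the clip
}

// Dragging from one third of the way along a 30-second clip:
let offset = connectionOffset(mouseDownX: 100, clipOriginX: 0,
                              clipWidth: 300, clipDuration: 30)
// offset == 10.0: the clip connects 10 seconds in, not at its first frame
```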

Fig 24: Changing the connection point

Fig 25: Creating a secondary storyline

Some embodiments allow the user to create a container first in an anchor lane and add media clips into the container from the timeline or the clip browser.

Fig 26: Extend edits

Fig 27-31: Editing and moving the playhead using timecode entry

Fig 32: Editing with video only, audio only or both audio and video

Fig 33-35: Two Up view

The media-editing application in some embodiments displays two frames in the viewer for other types of editing as well. Some of different types of editing include a ripple edit, a slide edit, a slip edit, etc.

Fig 36: Making compound clips

Fig 37: Navigating Timeline History

Fig 38: Bookmarking a timeline history view

Fig 39: Timeline history state diagram

Fig 40: Retiming a compound clip

Fig 41-46: Importing clips into a database

Including transcoding and proxy generation

Fig 47-50: How timelines are represented in the database

Fig 51: Application architecture

simple-block-diagram

It might be possible to associate some of the internal frameworks in Final Cut Pro X and iMovie with elements of this diagram. For example, the Rendering Engine could be implemented by 'Ozone.framework' - the 'headless' copy of Apple Motion 5.X in Final Cut and iMovie. You might be able to guess what 'TLKit.framework' does.

There's an interesting hint about how the application defined in the patent might not only be a traditional application running on a computer: 

In some embodiments, the media editing application is a stand-alone application or is integrated into another application, while in other embodiments the application might be implemented within an operating system. Furthermore, in some embodiments, the application is provided as part of a server-based solution. In some such embodiments, the application is provided via a thin client. That is, the application runs on a server while a user interacts with the application via a separate machine remote from the server. In other such embodiments, the application is provided via a thick client. That is, the application is distributed from the server to the client machine and runs on the client machine.

Fig 52: Computing device

All software patents need to include a description of a computing device for the software to run on.

New patent shows a little early Apple thinking on Final Cut Pro X

Monday, 13 October 2014

Last week Apple was awarded a patent concerning how to highlight discontiguous groups of clips on a timeline.

Even though it is a new patent, it shows Apple’s thinking back when it was applied for. This patent took years to be awarded, so any feature hints found in the application document have been superseded by what Apple chose to implement in the intervening years.

Patenting concepts that apply to editing software requires a description of a sample editing user interface. Here is an example from the patent:

Final-Cut-X-May-2009

The imaginary sample editing application shows a combination of Final Cut Pro 7, iMovie and Final Cut Pro X: layers from Final Cut Pro 7; iMovie's way of having content in more than one place at a time (clips would appear in 'All Files,' 'Video' and in an interview folder); and the viewer/inspector/timeline layout from Final Cut Pro X.

The parts of this illustration that interest me are the labels above the viewer and the inspector. The viewer has three control areas: 'Display Types,' 'Viewer Tools' and 'Overlays.' The inspector seems to have two tabs at the top - 'Inspector' and 'Transcript' with clip 'Specific Controls' at the bottom of the inspector.

This patent was applied for in May 2009. Interesting that Apple considered including a clip transcript panel in a clip inspector. I also hope that Apple will expand the way overlays work in the Final Cut Pro X viewer.

The patent.


2010 Apple patent: Final Cut Pro X concepts

Saturday, 30 August 2014

A patent filed by Apple in 2010 shows a possible future direction for iMovie that includes ideas that have appeared in Final Cut Pro X. More evidence of the large amount of research and development Apple put into editing user interfaces.

I've written before about Apple's patent concerning wider story structure as well as timeline structure. This week saw Apple being awarded a patent that seems to have arisen out of making editing easier for a wider range of people. A step on the way to general video literacy.

US Patent 8,819,557 is for "Media-editing application with a free-form space for organizing or compositing media clips." It follows on from the way older editing applications gave space in icon-based bins for editors to play with clip order. In Final Cut Pro 7, Adobe Premiere and Media Composer, bins can show clips as icons that can be arranged in any way prior to being added to a timeline.

This Apple patent turns icon-view bins into spaces where editors can combine clips together into timelines, as well as perform many other operations - including assigning keywords, defining selected ranges, skimming, trimming sequences, trimming edits and more.

8819557-editing-pasteboard-a-editing


8819557-editing-pasteboard-c-dynamic-trimming

The patent includes a storyboard that shows an element of dynamic trimming. Here is an excerpt from the text:

During playback, a playhead moves along the media clips in the sequence. Before the playhead reaches the end of the media clip, the user can invoke a command that will cause the media clip to continue playing content from its source file after the current out-point is reached. When the playhead reaches a location in the media clip source at which the user wants to set a new out-point for the media clip, the user can invoke a command that will cause the frame at that location to be set as the new out-point for the media clip.
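Modelled as state, the dynamic trim in the excerpt could look something like this sketch (the type and method names are my assumptions):

```swift
// During playback, one command lets the clip run past its current
// out-point, and a second marks the playhead as the new out-point.
struct PlayingClip {
    var outPoint: Double          // seconds into the source file
    let sourceDuration: Double
    var ignoreOutPoint = false    // set by the "keep playing" command

    mutating func continuePastOutPoint() {
        ignoreOutPoint = true
    }

    mutating func setOutPoint(atPlayhead playhead: Double) {
        outPoint = min(playhead, sourceDuration)
        ignoreOutPoint = false
    }

    func shouldStop(atPlayhead playhead: Double) -> Bool {
        ignoreOutPoint ? playhead >= sourceDuration : playhead >= outPoint
    }
}
```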

As well as the full text of the patent, you can view the whole 119-page patent - including many UI storyboards - in this 13MB PDF. The easiest way of reading the patent is to have the text in one window with the PDF in another.


Apple patent: Media compilation generation

Tuesday, 19 August 2014

Apple have been assigned a patent concerning the creation of video compilations based on individual preferences. 

The present disclosure is directed to an online video parsing application for parsing one or more videos and for performing one or more actions on particular portions of the videos. For example, the online video parsing application may identify portions of interest within particular videos. A "portion" of a video refers to at least a subset of content included within the video, and may be designated by a time interval. The video parsing application may process any type of video, and any type of video portion. The online video parsing application as discussed below may be implemented using any suitable combination of software, hardware, or both.

In some embodiments, the online video parsing application may create a compilation of video content. Compiling will be understood to mean concatenating, or arranging in series, videos or portions of videos thereby forming a new video. Compiling video, portions of video, or combinations thereof, may provide a technique for delivering desirable content to a user. In some approaches, compilations may be generated based on user input, may be manually assembled by a user, or both. For example, a user may specify content of interest by manually selecting portions of online videos. The user may also input keywords, preference information, any other suitable indicators, or any combination thereof to the online video parsing application for searching video content. In some approaches, the online video parsing application may generate compilation videos using, for example, information provided by the user, automated processes, or both.

8812498-compilations-content

The idea depends on tagging parts of online content. This could be done by their creators, third parties, or by software. Individuals could profit from being curators who discover and tag content well.

Users would be able to specify how long they want their compilation to be: from a few minutes to a continuous feed. 
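Putting those two ideas together - tagged portions and a target running time - a compilation generator could be sketched like this. The matching rule is my assumption; the patent leaves it open:

```swift
// Gather tagged portions that match the user's interests, then
// concatenate them up to a requested running time. Illustrative only.
struct Portion {
    let videoURL: String
    let start: Double      // seconds
    let end: Double
    let tags: Set<String>

    var duration: Double { end - start }
}

func compile(portions: [Portion],
             matching interests: Set<String>,
             targetDuration: Double) -> [Portion] {
    var compilation: [Portion] = []
    var total = 0.0
    for portion in portions where !portion.tags.isDisjoint(with: interests) {
        guard total + portion.duration <= targetDuration else { break }
        compilation.append(portion)
        total += portion.duration
    }
    return compilation
}
```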

I've written before about how iTunes Radio could be more than a service that plays music content: it could make a custom radio station based on the full range of content a person might find interesting. It looks like Apple will be able to do this with other forms of media.

PS

The patent refers to 'online video' as 'podcast.' As podcasts can be audio or video, I wrote this post replacing the word 'podcast' with 'video' or 'online video.' One trick when applying for patents is to get protection for ideas in such a way that the competition don't think the idea applies to them.

Those who make the best compilations win!


Apple patent: Metadata generation from nearby devices

Tuesday, 29 July 2014

Today Apple was awarded a patent for a process whereby, when data is created or saved on a device, the device detects nearby devices ('second devices') and offers possible metadata tag options that could be associated with the data:

Identifying the content can include identifying the content that has just been created (e.g., identifying a digital picture right after the picture is taken), or selecting from a list of content that was created when at least one of the second devices was in transmission range with the first device. In the latter case, each content file can be associated with a list of second devices that were present. The user of the first device can have the options of labeling the file with a particular second device or a group of second devices (e.g., multiple labels can be assigned to each file).

The content can have a variety of formats. For example, the content can be a text file (e.g., a text note), a data file (e.g., a spreadsheet), a multimedia file (e.g., a sound recording, a digital image, or a digital movie clip), or in any other format (e.g., a voice mail message, an entry in a data sheet, a record in the database, etc)

For OS X and iOS 8 users, the metadata would appear as tags associated with a file, calendar event, contact or note. For Pro Apps users the metadata would appear as keywords associated with stills, audio and video clips recorded on iOS, OS X and other devices.

Those controlling public devices such as iBeacons could also offer up useful metadata for those creating content in public spaces.
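The tagging flow might be sketched like this; device discovery itself is assumed, and all the names here are mine:

```swift
// When content is created, the device notes which other devices were
// in range and offers their names as candidate tags. Illustrative only.
struct ContentFile {
    let name: String
    var tags: Set<String> = []
}

func suggestedTags(nearbyDeviceNames: [String]) -> [String] {
    // Each nearby device (a friend's iPhone, a venue iBeacon, ...)
    // becomes one candidate label; the user picks which to keep.
    nearbyDeviceNames
}

var photo = ContentFile(name: "IMG_0421.jpg")
let suggestions = suggestedTags(nearbyDeviceNames: ["Anna's iPhone", "Gallery iBeacon"])
photo.tags.formUnion(suggestions.prefix(1))   // user accepts the first suggestion
```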


Apple's video conferencing patent

Tuesday, 29 July 2014

Apple has been awarded a video conferencing patent for connecting multiple cameras in one location:

Multiple cameras are oriented to capture video content of different image areas and generate corresponding original video streams that provide video content of the image areas.

One part seems to overlap with the way Google Hangouts works:

An active one of the image areas may be identified at any time by analyzing the audio content originating from the different image areas and selecting the image area that is associated with the most dominant speech activity.
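A sketch of that switching rule, with a placeholder speech-activity measure (the patent doesn't specify one):

```swift
// Pick the image area whose audio shows the most speech activity.
// The activity measure here is a placeholder of my own devising.
struct ImageArea {
    let cameraID: Int
    let speechActivity: Double   // e.g. recent speech energy, 0...1
}

func activeArea(among areas: [ImageArea]) -> ImageArea? {
    areas.max(by: { $0.speechActivity < $1.speechActivity })
}

let areas = [
    ImageArea(cameraID: 1, speechActivity: 0.12),
    ImageArea(cameraID: 2, speechActivity: 0.71),   // dominant speaker
]
let active = activeArea(among: areas)?.cameraID     // 2
```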

The illustrations are interesting - ranging from a TARDIS-like desk to a vertical video display (showing that the video feeds could be sent to devices people view in portrait mode).

Tardis


Apple's metadata propagation patent

Tuesday, 22 July 2014

Apple has been awarded a patent that says that metadata propagation rules can be included with video files. That means you could pass on a video file with metadata that would be available to an editor but not exported when they generate new content based on the files you sent them:

Some embodiments provide a method for processing metadata associated with digital video in a multi-state video computer readable medium. The method specifies a set of rules for propagating the metadata between different states in the video computer readable medium. It then propagates the metadata between the states based on the specified set of rules.

It also describes an example when the metadata in one set of video clips can be assigned to a related set of clips stored elsewhere. This would apply if an on-set assistant had added metadata to lo-res H.264 clips on an iPad and an editor wanted some of the metadata applied to the media from the professional cameras.

It also says that the metadata could define which parts of the high-quality media should be captured later:

In some embodiments, the method recaptures digital video from a first storage, when at least a portion of the digital video is also stored in a second storage. The method retrieves the digital video from the first storage. It identifies a set of metadata that is stored for the digital video in the second storage, and then determines whether there is an associated set of rules for processing this set of metadata when the digital video is re-captured from the first storage. If so, the method then stores the set of metadata with the retrieved digital video in a third storage.
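Here is one way such rule-driven propagation might be sketched. The transition and rule names are illustrative; the patent only specifies 'a set of rules':

```swift
// Each metadata field carries a rule saying which transitions
// (copy, export, recapture) it is allowed to survive.
enum Transition: Hashable {
    case copyWithinProject
    case export
    case recapture
}

struct MetadataField {
    let key: String
    let value: String
    /// Transitions across which this field is allowed to propagate.
    let propagatesAcross: Set<Transition>
}

func propagate(_ metadata: [MetadataField], across transition: Transition) -> [MetadataField] {
    metadata.filter { $0.propagatesAcross.contains(transition) }
}

let fields = [
    MetadataField(key: "Scene", value: "12A",
                  propagatesAcross: [.copyWithinProject, .export, .recapture]),
    MetadataField(key: "EditorNotes", value: "fix colour",
                  propagatesAcross: [.copyWithinProject]),
]
// Editor notes stay available while editing but are stripped on export:
let exported = propagate(fields, across: .export)   // only "Scene" survives
```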


Apple Awarded 'Second Screen' Patent

Tuesday, 24 June 2014

Today Apple was awarded U.S. patent 8,763,060:

A system and method for providing companion content on a device that downloads content associated with a media presentation playing on a media player and displays the downloaded content at times synchronized to time-offsets (from the start of the program) of the presentation by signals from the media player.

This is popularly known as 'Second Screen' media - information you receive on a small personal device that acts in sync with media playing on a larger, sometimes shared screen.

The Apple method is that a media player (such as an Apple TV) broadcasts information that nearby devices (such as iPhones or iPads) can use to display relevant content. This first figure shows personal devices receiving a content URL and a time offset (from the start of the programme):

Apples-second-screen-patent
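In code, the broadcast could be as simple as this sketch. The message format is my assumption, not the patent's wire format:

```swift
// The player periodically sends a content URL plus the current offset
// from the start of the programme; the hand-held device looks up
// companion material for that moment. Illustrative types only.
struct SyncSignal {
    let companionContentURL: String   // where to fetch companion data
    let timeOffset: Double            // seconds from the start of the programme
}

struct CompanionItem {
    let from: Double                  // active time range within the programme
    let to: Double
    let text: String
}

func items(for signal: SyncSignal, in catalogue: [CompanionItem]) -> [CompanionItem] {
    catalogue.filter { $0.from <= signal.timeOffset && signal.timeOffset < $0.to }
}

let catalogue = [
    CompanionItem(from: 0, to: 180, text: "The current film playing is..."),
    CompanionItem(from: 95, to: 110, text: "Like the shirt ActorName is wearing now?"),
]
let current = items(for: SyncSignal(companionContentURL: "https://example.com/film",
                                    timeOffset: 100),
                    in: catalogue)   // both items are active at 100 s
```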

Some examples given in Apple's patent description of what personal devices might do include:

- "The current film playing is… Starring ActorName…"

- "Click for the ActorName fan club"

- "Read script of film"

- "View film storyboard"

- "Show closed captions" (would work for translated subtitles)

- Show an advert relevant to the programme or to the personal device user (this is a primary idea in the patent description: the method would avoid bad product placement within a film by showing ads on nearby connected devices instead).

- "Like the shirt ActorName is wearing now? Click to buy it now"

- Act as media device remote control

Many people will point out that this isn't the kind of idea that should be patentable, and that there are many examples of prior art. I'm writing about this because this patent provides hints to possible future directions for Apple products and services.

Beyond Second Screens: HomeKit

Not mentioned in the patent description is that this idea would work very well with OS X Yosemite and iOS 8 HomeKit integration. As well as affecting media playback and web browsing on hand-held devices with screens, this system could also control HomeKit-managed devices.

Imagine allowing a horror film to turn off the lights in your home or even play scary sounds in nearby rooms! 
