Google intelligent image scaling could mean 4K quality at HD data rates

Thursday, 19 January 2017

PC Magazine has written about a new Google technology that shows very good results in scaling lower-resolution stills up to higher resolutions:

The new technique is called RAISR, which stands for "Rapid and Accurate Image Super-Resolution." [It] works by taking a low-resolution image and upsampling it, which basically means enhancing the detail using filtering. Anyone who's ever tried to do this manually knows that the end result looks a little blurred. RAISR avoids that thanks to machine learning.

Check out the article for examples of what RAISR can do.

Imagine this idea applied to ‘UHD’ video distribution. Store frames as 1920x1080 but with more colour information. To display 3840x2160 video, the algorithm could interpret the missing pixels and show an image with detail that even those very close to the display would find indistinguishable from real 4K.

Although this would be hard to do in real time (in less than 1/60th or 1/120th of a second, depending on frame rate), the algorithm would need to interpolate less detail for moving video. There are limits to what the human visual system can discern. If the video was slowed down or paused, the algorithm would have more time to produce higher-quality 4K.
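To see why conventional upsampling looks soft, here is a minimal Python sketch - plain bilinear interpolation using NumPy, not anything resembling Google's actual RAISR filters. Every new pixel is a weighted average of its neighbours, which smooths detail away rather than inventing it; RAISR's trick is to learn filters that restore plausible detail instead.

```python
import numpy as np

def upsample_bilinear(img, factor=2):
    """Upscale a 2D image by `factor` using bilinear interpolation.

    This is the conventional filtering the article describes: the extra
    pixels are averages of their neighbours, which is why the result
    looks soft compared with a learned method like RAISR.
    """
    h, w = img.shape
    out_h, out_w = h * factor, w * factor
    # Position of each output pixel in input-pixel coordinates
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # vertical blend weights
    wx = (xs - x0)[None, :]  # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

frame = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
big = upsample_bilinear(frame, factor=2)
print(big.shape)  # (4, 4)
```

Every interpolated value lies between its neighbours - a sharp black/white edge becomes a grey ramp - which is the blur a learned super-resolution method avoids.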

In practice, 4K for broadcast is mainly for bragging rights: fewer than 1% of people sitting at a comfortable viewing distance can see the improvement over HD. It makes more sense to dedicate precious bandwidth to better-quality pixels. This algorithm would then be able to scale the image up well if need be.

Logic Pro X 10.3: Its new iCloud feature would be big for Final Cut Pro X

Wednesday, 18 January 2017

Logic Pro X 10.3 is out. Alongside Touch Bar support and other UI improvements, there's a ProApps feature that should interest Final Cut Pro X users: syncing with GarageBand for iOS via iCloud. According to 9to5Mac:

Tap a button in Logic Pro X and your whole project will now be sent to iCloud as a single reference track. That allows you to open up Garageband on your iPhone or iPad and continue working on the project in a way that doesn’t require you to upload heavyweight session files or bounce and pass around your own audio files in email or elsewhere. Once you add some new tracks in Garageband, the project will be synced back to iCloud ready for you to continue working in Logic Pro when you get back to the studio.

iCloud doesn't quite have the storage for a similar service for Final Cut users. However, a 'Share with iMovie for iOS via iCloud Proxies' feature would be useful in many workflows.

Imagine the following message you could send: ‘Here's the latest edit. Open it in iMovie on your iPhone, iPad or Mac. The timeline will link to the media in the cloud. You can roll the edit points a second or so in each direction, and change clip metadata. If you need to, you can also change effect/transition/title/generator settings and modify markers and keywords. Send me your version when you are ready.’

Recent Alex4D posts on Medium

Wednesday, 30 November 2016

Apple’s ProApps team should get their heads into the Cloud

For post production, Apple might be able to infiltrate media organisations via everyone except the post-production team. At some point when writers, researchers, producers and directors ask why the work they’ve already done in Final Cut needs to be transferred to a ‘professional system like Avid (or Premiere)’ — post people won’t be able to come up with a good enough argument.

New video distribution models mean new video creation tools

Post tools — be they high-end editing applications or free online services — will need to be able to create stories made of clouds of video, audio, graphics, effects, pictures and transitions. Each of the main video and audio post-production tools is at a different point on the path towards being able to do this. They are limited by history, user interface metaphor and ability to deliver.

Apple have great plans for Macs, which don’t include turning them into giant iPhones

I don’t think Apple will make touchscreen Macs — especially ones with large screens. Direct manipulation of a UI much larger than an iPad Pro for hours on end is a choice between aching arms (screen in front of you) or aching neck (screen at an angle comfortable enough for your arms).

Good News: No More Wireless Networking Products from Apple

Why is this good news? The engineers in these teams are now free to do more distinctive things. Obviously Apple thinks that there is not much they can add to these products in coming years. They’re leaving these mature markets to others.

Now video editors can use the macOS Finder like a media database

I was lucky enough to be presenting at the 2016 FCPX Creative Summit a month ago when Benjamin Brodbeck came up to me with a problem. He works at Caterpillar Inc. and has a team of over 30 editors and assistants working on videos being shot and edited all over the world.

They constantly need to make films covering specific regions, models of equipment and categories of engineering project. Categorising footage is very important. A film featuring forest equipment being used in China can’t feature a shot of the wrong piece of equipment, or one taken in the wrong country.

To keep up to date, follow me on Medium.

Avid no longer going after mid-market and individual creatives

Saturday, 15 October 2016

It looks like Avid Technology are changing their strategy. The old strategy was described in a May 2015 video (no need to enter your information, just click 'Submit' to see the video): get larger proportions of what they call the Tier 2 and Tier 3 markets - the $3.1bn business and institutions market and the $1.8bn individual creatives market.

Now they are talking about capturing more of the workflow in Tier 1. CEO Louis Hernandez Jr. appeared on CNBC's Mad Money show with Jim Cramer:

Avid's behind some of the largest media companies … a hugely influential player in 140 countries around the world

Avid editors…

will earn 44% higher than any other editor over their lifetime

[2:03] He refers to Media Composer and Pro Tools as 'Heritage’ products. As there is much more to workflow than editing and mixing, Avid has had to build on those tools to "…participate in the rest of the workflow”.


These transformations are never that easy […] We are surging in cloud subscriptions and large enterprise deployments

He said that they previously announced that their transformation would be over by Q2 2017, but they then discovered they had to restate eight years of financial results. Although being delisted delayed the transformation, the upside was that they were able to make the investments they needed more quickly, so they are still on track to transform the way they planned.


We've said the end of this transformation is 2017, mid-year. You're starting to see with every given quarter real progress. You've seen platform sales up last quarter from quarter to quarter of 47%. You've seen 4 times growth on our cloud-based subscriptions and more to come

He later said that NBC needed to make a transition to digital economically, and was able to do so using Avid's technology.

I’ll leave it to Avid users themselves to determine whether recent surges in ‘cloud-based subscriptions’ were from new users or from those changing the way they already paid for Media Composer and Pro Tools.

My guess: they are hoping for an acquisition. Who would be a good fit? If you believe in Avid’s strategy, a company with products and services that complement the aim of providing solutions for the whole workflow at the high end. This means that Avid doesn't like the idea of selling Media Composer to someone else, even though there's almost no chance that it can be anywhere near as profitable as the audio side.

I wonder whether investors have a way of measuring how successful Avid's new strategy is. I'd be surprised if the results of the previous one were going to look very good.

Recent updates to my Alex4D Facebook page

Friday, 09 September 2016

I've been adding links and articles to my public Final Cut Pro X page on Facebook. Here are links to some of them:

26 July: A 360° film I shot on a Kodak PixPro SP360 4K Rig:

It shows a pilot and principal scientist Professor Alex Rogers taking a submersible from the #BaselineExplorer, the Nekton Mission deep-ocean research ship, down to the Atlantic ocean floor 30 miles off the coast of Bermuda.

10 August: Another 360° film:

The rule with VR video: keep the camera still. We made this with a 360 camera attached to a speeding boat criss-crossing the ocean while the crew recovers a research submersible after a dive into the deep ocean.

18 August: Apple's ‘multi-ranges in a clip’ patent:

It patents a way of selecting and showing multiple ranges within the same clip […] Part of the description covers a way multiple ranges don't yet work in Final Cut.

19 August: Apple patent hints they've been thinking of expanding the use of roles:

Many want a roles-based mixer and effects system in the next update. This patent awarded to Apple earlier this year should give them hope.

20 August: Tutorial on exporting Final Cut timelines for delivery to broadcasters - including a link to the UK standard:

How do you send specific channels in your audio mix to channels in your output file in Final Cut Pro X? Adam Schoales shows you how.

23 August: How Apple's culture determines how they compete

Apple, Adobe and Avid seem to compete in the world of post production, but their cultures are so incompatible that there is no real competition.

26 August: There'll be a Final Cut Pro X special event at IBC in Amsterdam

I'm giving a presentation on shooting, stitching, editing, adding graphics to and distributing VR video using Final Cut Pro X. I will also be in the demo area on Saturday and Sunday to answer your 360° video production questions.

29 August: An article comparing Apple and Google might also apply to Apple in the post production world:

If Apple update Final Cut Pro X to version 10.3 later this year, I hope Final Cut moves from a primarily interdependent architecture to a primarily modular architecture.

29 August: Want to only transcode the media used in a specific project?

Useful if you have a few hours of 360° 4K clips, but you only need to generate proxy versions of clips you've selected in a project for editing on a slower Mac.

30 August: Join me at the Amsterdam Supermeet:

IBC – the biggest European video trade fair – has many stands for those who shoot and post-produce video. Sadly, it also has very many stands that are of no interest to the same people.

The great thing about the Amsterdam SuperMeet is that all you need is concentrated in one evening of presentations, prizes and people.

1 September: Editors make Hackintoshes while they wait for upgraded Macs:

The idea behind a Hackintosh is that you get a Mac configured exactly the way you want, using standard PC parts, for much less than Apple charges.

In practice, the time it takes to build and maintain your cobbled-together computer cancels out the price advantage - but you can get exactly the Mac you want.

6 September: Adobe has beaten Apple when it comes to collaborative editing:

Big Adobe Premiere Pro news: Team Projects - the ability to share timelines between editors.

7 September: The Apple ProApps team have a vacancy:

The Apple Professional Apps Design group is looking to employ a ‘Video Applications Product Designer’

9 September: Additional information about Adobe Premiere's Team Projects feature

Although some would call the implementation clunky, it is only competing with Avid's ancient (but battle-tested) bin-locking alternative.

Check back every few days to see more.

Apple WWDC 2016 Announcements and Post Production

Tuesday, 14 June 2016

Every year at their Worldwide Developer Conference Apple presents some of their plans relevant to software and hardware developers at a keynote presentation. Here are my notes and links from the 2016 keynote.

The main screen and the webcast stream didn’t have the normal 16:9 ratio. It was wider, at the CinemaScope ratio of 2.40:1. Could this be a hint that a future Apple-branded display will have a 21:9 (2.33:1) aspect ratio?

New Name

As iOS will reach version 10 this Autumn and OS X has been around for over 16 years, Apple will now rename their Mac operating system macOS. The next version will be macOS Sierra, version 10.12. This renaming will make Final Cut Pro, Logic and iMovie stand out as being part of an older naming scheme.

There’s a chance that iMovie will become ‘Movies’ for iOS and macOS - following on from how iPhoto became Photos. An alternative is that productions started in iMovie will be edited in macMovie and then be openable in macFCP while the soundtrack is modified in macLogic. More likely is that Final Cut and Logic will simply drop their X suffixes.


Siri

Siri for macOS means that Macs will be able to be controlled by voice as iOS devices can be today. SiriKit for iOS 10 gives a limited set of third party applications the option to be controlled by Siri.

If SiriKit was introduced to macOS the ProApps team would have the option to add much more voice control to their apps. This would be especially useful for finding clips based on keywords and other metadata. As well as asking “Show me clips in the browser with the ‘Interviews’ keyword” or “Show me clips in the timeline with dual mono,” Siri also understands context: “Show me interview clips… show me those with dual mono” will only show interview clips with dual mono - not first one selection of clips followed by all clips with dual mono.
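As a rough illustration of that context behaviour - this is not Apple's API, and all names here are hypothetical - a follow-up request could filter the previous result set rather than re-querying the whole library:

```python
# Hypothetical sketch of a context-aware metadata query, as in
# "Show me interview clips… show me those with dual mono".
clips = [
    {"name": "A001", "keywords": {"Interviews"}, "audio": "dual mono"},
    {"name": "A002", "keywords": {"Interviews"}, "audio": "stereo"},
    {"name": "B001", "keywords": {"B-roll"},     "audio": "dual mono"},
]

def query(predicate, context=None):
    """Filter the whole library, or narrow a previous result ('context')."""
    source = context if context is not None else clips
    return [c for c in source if predicate(c)]

# "Show me clips with the 'Interviews' keyword"
interviews = query(lambda c: "Interviews" in c["keywords"])

# "…show me those with dual mono" filters the *previous* results,
# not the whole library - so B001 never appears.
dual_mono_interviews = query(lambda c: c["audio"] == "dual mono",
                             context=interviews)
print([c["name"] for c in dual_mono_interviews])  # ['A001']
```

However the request is phrased, Siri would reduce it to this kind of standard predicate-plus-context form before handing it to the app.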

Although there are many different ways of asking for the same thing, these are interpreted by Siri and passed to the target app in a standard way. This kind of automation would work well with scripting. Apple has released a new guide on that subject: the Mac Automation Scripting Guide. There are no hints yet that scripting will be added to iMovie/Final Cut.

For now SiriKit for third party iOS apps will only be used for the following tasks:

  • Audio or video calling
  • Messaging
  • Payments
  • Searching photos
  • Workouts
  • Ride booking

WWDC 2016 session on SiriKit.

New Photos features useful for video

Photos for iOS 10 and macOS Sierra will have a couple of new features of interest: more advanced content recognition and the automatic generation of ‘Memories’ videos.

As well as recognising all photos with a specific person, Photos will also recognise other kinds of content. This means that photos can be grouped based on the content detected. Examples include photos with beaches, with horses, shot in fields. This kind of automatic categorisation will be very useful for iMovie/Final Cut users - especially when clips are very long. The content recognition should be able to mark only the time in a long shot when a certain person or object appears.

Using this image recognition technology, Photos will also be able to generate ‘Memories.’ A Memory can look like a web page or publication on a subject. Memories can include videos made up of automatically animated photos. If users want to change the mood of a video, they can choose a new soundtrack and the story will be re-generated to match the music.

Will these video Memories be modifiable in iMovie or Final Cut Pro X? That would be a very quick way to get new people into making movies. The same technology could be used to make automatic videos from selected clips in a video library.

Differential Privacy

Apple have found a way of using information from millions of Apple users to power services without compromising any specific individual’s privacy. ‘Differential Privacy’ is a mathematical method that ensures privacy when sharing data from millions of people.

A mathematical formula defines a specific amount of ‘noise’ to add to each single piece of data. This noise makes the original data associated with a specific person impossible for anyone - including Apple - to decode. The trick is that when hundreds of thousands of pieces of encoded data are combined, statistical measures can detect trends amongst all the results. Apple will have no way of knowing what an individual value was, but will have an accurate representation of the distribution of the original values over a large population.
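One classic mechanism for this is ‘randomized response’ - Apple has not published its exact scheme, so this Python sketch only illustrates the principle: each individual report is deniable, yet the population-level rate can be recovered by inverting the known noise.

```python
import random

def noisy_report(true_value):
    """Report a user's bit with plausible deniability.

    Half the time the user tells the truth; the other half they report
    a fair coin flip. No single report reveals the true value.
    """
    if random.random() < 0.5:
        return true_value          # tell the truth
    return random.random() < 0.5   # report a random coin flip

def estimate_rate(reports):
    """Invert the noise: E[report] = 0.5*p + 0.25, so p = 2*mean - 0.5."""
    mean = sum(reports) / len(reports)
    return 2 * mean - 0.5

random.seed(42)
# 100,000 users, 30% of whom have the sensitive attribute
population = [True] * 30_000 + [False] * 70_000
reports = [noisy_report(v) for v in population]
print(round(estimate_rate(reports), 2))  # close to 0.3, the true rate
```

Looking at any one report, an attacker (or Apple) cannot tell truth from coin flip; only the aggregate statistics recover the 30% rate.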

This is how Apple is able to use the large amount of private information it has access to in order to provide intelligent services. The original mathematical paper: “The Algorithmic Foundations of Differential Privacy.”

Messages and iMessage Apps

Messages in iOS and macOS will get a big upgrade this year. Apple will provide a range of stickers and animations that people can use in conversations. For example, ‘Invisible Ink’ will keep an image blurred until each person in the conversation swipes over the picture. People will also be able to annotate other people’s messages and pictures, and add animation to speech bubbles, emoji and pictures.

As well as Apple-supplied animations and effects, third-parties will be able to make iMessage Apps to do more with messages. 

I hope Apple define a new graphic and animation file format for Messages that could be applied in other applications, such as Photos, Keynote, iMovie and Final Cut Pro. A metadata-driven format would display differently depending on the device showing the graphics. This would be useful when videos are made up of objects: video clips, images and metadata that tell the playback compositing software how to present the story.

If Apple start presenting Messages as a place for ad-hoc group-based collaboration (for play or for work), there should be a place for video.

WWDC session Part 1 and Part 2

Recording and playback of multiple simultaneous video streams

Created for those who want to record on-screen gameplay for later sharing online, ReplayKit for iOS now adds simple live streaming, plus the ability to record the players themselves commentating using a front-facing camera. This means a standard UI for viewers to switch between ‘angles’ in a playback stream whenever they want.

A new file system: APFS

The Apple File System is designed for modern storage devices. The current file system - HFS+ - was designed to work with floppy discs; APFS is designed for Flash/solid-state memory. HFS+ is known to degrade over time - normal day-to-day usage can result in files getting lost. APFS is designed for recoverability: it will be much easier to get at ‘deleted’ data, and it will handle backups much more smoothly.

As with Final Cut Pro X projects, the state of whole drives or parts of drives can be captured in a Snapshot.

A new file system doesn't mean a new Finder. It means that applications that spend most of their time manipulating files - like the Finder - will need to be updated to understand the new ways of organising documents and applications on storage devices.

Apple’s programming guide to the Apple File System. Ars Technica on APFS.

Important: APFS is released as a Developer Preview in OS X 10.12, and is scheduled to ship in 2017.

Better colour

The new Wide Color system framework will add wide-colour-gamut image capture and manipulation to iOS and macOS. Following on from its introduction in recent iMacs and iPad Pros, Apple have settled on the DCI-P3 gamut - the standard colour space used to specify the colours used in US cinema projection. Some think Adobe RGB would have been a better choice.

Sharing private data via CloudKit

CloudKit Web Services Reference:

You use the CloudKit native framework to take your app’s existing data and store it in the cloud so that the user can access it on multiple devices.

Currently any data that is stored in the cloud using Apple’s CloudKit framework is either public or private. This year CloudKit in Apple OSs will add the ability for iCloud users to share data amongst themselves.

This would be very useful for post production applications. For example Final Cut could upload proxy versions of all media (or media used within a specific project) so that collaborators would be able to have a live timeline to work with.

WWDC 2016 session.

QuickTime in Apple OSes

QuickTime as a container for video and audio files has a great future. The AVFoundation framework is the basis of Apple software that records, manipulates and plays QuickTime documents (amongst other file formats).

QuickTime the software framework is deprecated in macOS. This means that applications that use the QuickTime API will still work in macOS Sierra (10.12), but may not work in a future version. There is no way yet to know if Final Cut Pro 7 will work in macOS Sierra, but my guess is that it probably will.

As part of building applications, Xcode, Apple’s development system, checks to see if the code uses old or deprecated OS features. It uses an API Diffs file to check all code. The QuickTime part shows that the API headers have been removed. The AVFoundation part shows a lot has been added.

QuickTime the API has been deprecated for a while. Removing the headers means that applications can no longer compile if their code uses the old API. Applications already compiled on older OSes will still work in macOS Sierra.

Once again, the file format lives on. The part of Apple OSes that manipulates media called QuickTime will eventually be replaced by AVFoundation. This shouldn’t be a problem for Mac users of old applications for now, but remember that one day those applications will stop working in a future version of macOS.

Apple and the future of media

Apple didn’t make any announcements directly relevant to post production. There was no mention of Retina displays, 4K, VR or 360° video.

On the other hand they laid some interesting foundations for collaboration. One day we might look back at this week and see elements vital to a new product or service introduced in coming months and years.

I'm looking forward to seeing what happens next.

Apple App Store Subscriptions and iMovie/Final Cut Pro X

Thursday, 09 June 2016

Apple has announced that developers making apps for sale in the iOS, Apple TV and Mac app stores can now offer subscription pricing for any app:

Starting this fall, apps in all categories on the App Store will be eligible to offer in-app purchases for auto-renewable subscriptions to services or content. Users enjoy the reliability that comes with subscribing to a service that they love, and the experience must provide ongoing value worth the recurring payment for an auto-renewable subscription to make sense.

Although Apple is pointing to subscriptions being used to pay for updated content or a continuing service, Phil Schiller added two more categories in an interview with Lauren Goode on The Verge:

He suggests many enterprise apps could move to subscription, and that professional apps that require “a lot of maintenance of new features and versions” would be a good fit.

As I often look at the tech world through the lens of post production, I wouldn’t be surprised if Apple use Final Cut Pro X as an example of the latter at WWDC next week.

Many developers say that this change will make it much more likely they will create professional apps for the iPhone, iPad and Mac.

In the world of post production however, many Final Cut users thumb their noses at Adobe Premiere users over Adobe’s Creative Cloud compulsory ‘software rental’ subscription service.

Since Final Cut was ‘updated’ to version X (pronounced ‘ten’) in 2011, Apple have not charged upgrade fees for the many versions over the years. Those comparing the price of renting Adobe Premiere often compare it with the cost of owning Final Cut: if you bought it for $300 in July 2011, the cost works out at $5 per month over the last five years. Any change in the way Final Cut Pro X is paid for is a big deal for those fighting NLE platform wars.

Final Cut Pro X subscription?

If Apple did introduce subscription pricing to their professional applications, they have some options:

  • Next version of Final Cut Pro X only available by subscription - with or without a reduction for the first year for existing users.
  • Future versions of Final Cut free for those who bought/buy it for $300, with a $10-per-month subscription option for those who can’t afford the initial outlay.
  • Some features only needed for high-end or industry-specific workflows could be unlocked by subscription.

Final Cut Pro X as an iMovie subscription option 

Ever since X was introduced, traditional NLE users have joked that it wasn’t much more than ‘iMovie Pro.’ For a while now, Apple has actually developed iMovie as a customised version of Final Cut Pro X, with professional features turned off and additional consumer-focussed features added. They are currently the same application, with different UIs activated depending on whether it is running as a consumer or professional application.

This might become relevant if Apple add a subscription payment option for Final Cut. Apple would need to decide what happens when customers no longer want to pay their subscription fees (which can be annual, monthly or even weekly). In the case of Adobe Creative Cloud users, if they stop paying Adobe, they can no longer open CC apps to view their projects or make any changes.

Apple have the option of going different ways with Final Cut if the subscription is stopped:

  • Final Cut reverts to the last full version paid for (for those who bought a subscription for new versions after having paid full price before).
  • Final Cut reverts to a ‘no modifications, just export’ mode.
  • Final Cut reverts to the iMovie feature set. This would allow changes to existing projects that are possible to do in iMovie, and would mean that Final Cut Pro X becomes a subscription option for iMovie users.

Subscriptions and the wider Final Cut ecosystem

If subscriptions were introduced to Final Cut, its ecosystem of plugins, services and applications might need to update to reflect the changes.

Firstly, will any current developers move their products and services to the Mac App Store?

What does it mean for FxFactory? It is ‘The App Store for Pro Users’, selling plugins and apps for Final Cut Pro, Apple Motion, Adobe Premiere and Adobe After Effects, and offering watermarked trial versions of most products. Will developers distributed by FxFactory release future products on the Mac App Store?

Red Giant Universe is a growing pack of plugins for Final Cut Pro X that is available by subscription. Would Red Giant trade some of their subscription money in return for making them available to the huge number of people who use iMovie?

Will Apple provide API hooks that allow Final Cut-adjacent products to check the subscription status of Final Cut so they can provide different features?

Will Apple Motion and Apple Compressor become subscription options for iMovie/Final Cut?

This change may make developers take another look at making professional applications for the Mac, the iPhone and the iPad Pro. Good news for post production at all levels.

FCPX Creative Summit 2016 Provisional Schedule - More from Apple

Wednesday, 08 June 2016

The provisional schedule for October’s FCPX Creative Summit is now available.

Interesting: Instead of last year’s 90 minute presentation given twice to two groups, the schedule shows a 60 minute ‘General Address’ followed by a choice between 90 minute breakout sessions:

2:00 – 3:00pm General Address: The Future (Apple Campus)
3:00 – 4:30pm Apple Session Breakouts (Apple Campus)

What could Apple be talking about in these sessions that would mean attendees have to choose one session over another?

Another point: The Summit was held in late June last year. This year it will be in late October. Given this event is organised to fit in with the plans of the ProApps team, there is a chance there will be more to talk about later this year.

Next week at WWDC 16 there is a chance that Apple will announce or pre-announce a new version of the Mac Pro, just as they did in 2013. Final Cut Pro X is the application that most people understand needs a lot of power. Perhaps Apple will once again use a Final Cut screenshot during the keynote (which will be streamed online on Monday).

Blackmagic Fusion 8.1: A new VR video option for Avid users

Wednesday, 08 June 2016

This week Blackmagic Design announced a new version of Fusion, their high-end compositing application.

The good news for Avid Media Composer users is that Fusion version 8.1 works with Fusion Connect 8.1:

This software adds the Fusion Connect for Avid plug-in that is compatible with Avid edit systems, so you can send any clip or stack of clips from an Avid Media Composer timeline directly into Fusion or Fusion Studio. Now Avid editors can have access to Fusion’s powerful 3D compositing and animation tools

Up until now, many Avid users have been advised to transfer timelines to NUKE as a 360° video solution. Fusion 8 is a free application that is a peer to NUKE. Before being bought by Blackmagic Design, it was an expensive application that has been used in the post production of many high-end TV shows and feature films. To promote their hardware, Blackmagic Design made a Mac version of Fusion and released an almost fully functional free version.

As well as being a high-end node-based 3D compositor, Fusion 8 can also use 360° plugins. For example, there is the $249 Domemaster Fusion Macros from Andrew Hazelden:

The Domemaster Fusion Macros allow artists to create immersive 360° stereo composites. The new immersive toolset is designed to work with Blackmagic Design’s Fusion compositing software. These macros are great for preparing pre-rendered content for use in a fulldome theater, or on a head mounted display like the Oculus Rift, Samsung Gear VR, HTC VIVE, OSVR, or Google Cardboard.

So, if you are a Media Composer editor who is comfortable with learning new compositing applications and is interested in working with 360° video, check out Blackmagic Fusion 8.1 and the Domemaster Fusion Macros.

Xsend Motion - Send Final Cut Pro X timelines to Apple Motion

Tuesday, 07 June 2016

For those who used Final Cut Studio before 2011, a very popular feature request for Final Cut Pro X is the ability to send clips to Apple Motion, which can be used for more advanced motion graphics tasks. Post-production file format translation supremo Wes Plate has made Automatic Duck Xsend Motion.

Iain Anderson’s review at Mac Pro Video:

Despite the excellent integration between FCP X and Motion, this critical piece has always been missing and often been requested. Finally, it's here, and while it's maybe not as feature-complete as if Apple had done it themselves, it's very useful, and still under active development by a veteran in this space. Heavy Motion users should grab it now.

Xsend elsewhere?

This might be greedy, but what else could this technology be used for? Now that the Automatic Duck team have learnt Final Cut Pro X XML and the Motion document format well enough to make this product, where else should X timelines be sent? 

Given the power of Blackmagic Design’s Fusion 8 node-based compositing application, perhaps Xsend Fusion and Msend Fusion would have an appreciative audience!

Apple’s patent for applying effects to clips with specific roles

Tuesday, 07 June 2016

The name of patent 9,240,215 may be ‘Editing operations facilitated by metadata,’ but it is about applying effects to roles in Final Cut Pro X:

For example, several clips may be assigned one audio role of "Dialog", "Music", or "SFX". A process then provides one or more user interface controls. These user interface controls are also associated with the tagged clips. That is, the user interface controls are associated so that these controls can be used to display or modify properties of the tagged clips.

PDF version.


Apple’s structure editing patent

Tuesday, 07 June 2016

While editors wait for the next big Final Cut Pro X update, I hope the Apple ProApps team will implement some of the ideas in their ‘structure editing’ patent. Here’s my old writeup of the patent they applied for in 2009:

Most people think that the editor’s job is ‘to cut out the bad bits’ in individual scenes. Many are surprised to discover that editors commonly change and improve storytelling by changing story structure. As many film and TV makers consider that structure is very important when it comes to telling stories, I think it is a good idea for video editing software to recognise story structure.

Structure applies to feature films, TV shows, groups of corporate videos on an intranet, legal video depositions, architects’ video proposals or open-ended weekly web series. The more video applications can have these structures encoded in their projects, the better the tools they’ll be able to provide to a wider range of people all over the world.

Introduction to VR Video with Final Cut Pro X

Tuesday, 31 May 2016

At the FCP Exchange event at NAB in April, Tim Dashwood and I gave a presentation on working with VR 360° video in Final Cut Pro X and Motion.

Initially I explained the art of spherical video from first principles, comparing it to VR apps. I showed how editors can learn specialised tools that understand 'equirectangular' video, effects and graphic overlays to tell stories that play out all around you.

I also explained how editors can share their work with millions of smartphone users around the world. Tim Dashwood then gave a quick rundown of the science of high-end VR video effects that are available for Final Cut Pro X today.


360° Virtual Reality with FCPX from FCPWORKS on Vimeo

FCP Exchange is a series of free industry seminar days presented by FCPWORKS and

Dashwood 360 VR Toolbox and 360 VR Express.

Sound Design Lessons for VR Video from VR Games

Friday, 27 May 2016

VR Video tools for video editors have progressed quickly in the last year, but there has been less discussion about the audio side of VR video. Although VR audio tools have yet to be integrated into NLEs, audio experts (and video editors who spend much of their time refining their soundtracks) should consider how audio design is different for VR.

Those designing audio for VR games are probably further along in coming up with what makes VR different. 

At a mini conference on game audio earlier this year, developer Gordon McGladdery gave a presentation on audio for VR games.

His game Fantastic Contraption is one of those given away with each HTC Vive (a VR headset that detects where you are in a room for ‘room-scale’ VR), and he has worked on the sound for VR commercials.

He spoke with Matthew Marteinsson on episode 25 of the ‘Beards, Cats and Indie Game Audio’ podcast about VR audio. Here is a summary of some of what was said:

[7:07] Binaural audio is very important - without it, experiencing VR is ‘like watching a 3D movie without the glasses on.’

[7:33] Music score doesn't work in VR games - it ‘muddies everything up’ [The music is] ‘coming from nowhere in the world and just seems to cloud the entire immersion.’

[10:17] Everything matters. The current video game sound design orthodoxy is that some sounds are more important than others; time and budget determine a well-known order of priorities when it comes to sound design. In VR, everything shown in a game that can make sounds must have game audio.

[14:21] Even if your target VR platforms don't have advanced audio, incorporating advanced audio future-proofs your current productions.

[15:05] ‘A lot of what we do here is to design right up until the end.’ It is important to design your sound workflow so that if the design of the game changes near launch (the equivalent of a new edit of a film), ‘we as audio can quickly move with it.’

[16:14] ‘Distance falloff is really finicky’ - pay close attention to sound volume based on position - ‘none of the defaults work.’ Different sound sources have different falloff curves; some objects need to be heard from further away. You need a different curve for every object, based on character need, not realistic sound physics.

[23:33] ‘Dynamic range is back - we're not crushing everything any more’ - adding heavy compression doesn't work - it just makes everything loud. ‘VR audio will be pretty uncompressed.’ Prepare for the fact that different audio soundtracks work for different playing environments. Most VR experiences will be in quiet environments, but some will be in noisy places - which will need compression to punch through.

Listen to the rest of the podcast to hear Gordon’s take on VR use outside the world of games, as he is getting more non-game work due to his VR audio skills.

BBC TV production change could mean more workflow stories

Tuesday, 22 March 2016

Proponents of Final Cut Pro X and Adobe Premiere are frustrated when editors of high-end TV and features say "I don't know anyone who doesn't use Avid."

We value case studies showing what alternative applications can do. In the UK, the BBC is not allowed to publicise its workflows - used on TV shows that are world-famous - because, being publicly funded, it is not allowed to promote one commercial supplier over others. All we can do is gather indirect information, such as tweets by those working with and for the BBC:

The flow of BBC production stories may increase soon: they are moving the majority of their TV production to a commercial division on April 29. This is the way commercial TV in the UK and elsewhere works. In a year ‘BBC Studios’ will also make shows for other broadcasters and networks.

If I was promoting Final Cut Pro X and Adobe Premiere workflows, I'd start preparing case studies now.

Dual lens iPhone 7 = Multi-angle QuickTime files

Thursday, 10 March 2016

The day after the announcement of a new iPhone, the speculation starts for what might appear in the next iPhone. This speculation is based on Apple patents, acquisitions, what parts suppliers can produce and what Android phones have had ‘for years.’

This week’s speculation from MacRumors suggests that the next iPhone will have a dual lens camera. Such a camera would be able to capture a normal shot and a close up shot at the same time:

Amid rumors a dual-lens camera will be introduced in the iPhone 7, Apple recently submitted a patent application published in January which gives us rare insight into what Apple thinks a dual-lens camera interface could look like on future iOS devices.

The patent outlines a dual-camera system that consists of one standard wide-angle lens similar to what's in the iPhone today and a second telephoto lens capable of capturing zoomed-in video and photos.

Apple’s iPhone is a combination of hardware and software, and the interesting part for me is the software. As with slow motion recording, there are software implications. Dual lenses mean:

  • A user interface for capturing two video ‘angles’ at once (with a single shared soundtrack - unless audio from multiple microphones is also captured)
  • Storing more than one video stream in the same QuickTime file
  • A user interface for marking footage at points where playback should switch from showing footage recorded with one lens to the other

Multicam for ‘the rest of us’

Consumer-level multicam has implications for the next versions of iMovie for iOS and OS X and Final Cut Pro for OS X (and iOS?). As well as being able to handle multi-video track QuickTime files, they might provide features to ease users into multi-angle production.

iMovie would need a command and shortcut to add metadata to a multilayer clip to say ‘switch to the other layer here.’ That would be a useful shortcut to add to Final Cut Pro X as well. Currently, editors working with two-angle multicam clips must alternately use ‘Cut and Switch to Viewer Angle 1’ and ‘Cut and Switch to Viewer Angle 2’ depending on which angle is active.

Multi-track QuickTime is back

Once consumers get used to multi-stream video files, they might start expecting that multiple devices at the same location should be able to contribute to a single QuickTime record of an event. As long as all devices have an iCloud account, Apple could provide the synced file for all to share with each other and others.

More and more professional production uses multiple cameras. Final productions will probably include multiple video assets that are shown depending on playback settings. This means multicam user interfaces for production and playback alongside multiple video layers being stored in single movie files.

Also, recording devices will probably encode multiple video angles into movies. Already, Convergent Design’s Apollo switcher/recorder product page says it:

exports separate Apple ProRes files with matching timecode or a single multi-stream QuickTime file that drops directly into the timeline of supporting NLEs such as FCP-X.

How interesting.

Last year’s Apple WWDC had a relevant session on editing movies using AV Foundation for iOS and OS X developers:

There are methods for creating and removing tracks, and you see to create a track, we have to say what type of track we want.

Do we want a video track, do we want an audio track, and so forth…

Features seem to come from Final Cut Pro X:

We can now open these, edit them, and write them back. At the track level, we have a similar setup. As you know, a composition is composed of composition tracks, and at the mutable level, we have AVMutableComposition tracks.

Media doesn't need to be in the same file:

Now, it's possible for the sample data that a track refers to exist in another file altogether, so you can have external sample references. It's even possible for the sample references to refer only to external sample data. Now, when we have this situation, the little movie box and its file type box is called a sample reference movie file.

Coming soon to OS X and iOS applications?

Final Cut Pro X at RTS Swiss National Television and the future of post production consultancy

Tuesday, 08 March 2016

A new case study shows how collaborative storage systems are coming into their own for TV stations using Final Cut Pro X. The team behind the implementation demonstrates how different the Final Cut high-end post production consultancy ecosystem is from the ‘left over from the 20th Century’ establishment. 

Ronny Courtens and the implementation team have written a detailed post at

[National Swiss TV broadcaster RTS] needed a 100 TB effective and fully expandable enterprise NAS system with high redundancy and high-availability for 24 client connections. 12 connections over 10Gig SFP+ to their existing fiber channel network for the editing stations and the audio and ingest stations. And 12 connections over Gigabit Ethernet to their existing Cat6 network for extra ingest, titling and graphics machines, Open Directory and system admin.


all editing and ingest clients must be able to perform high-speed file transfers to the server without affecting the sustained Read bandwidth of any editing station. This is one of the biggest problems most NAS systems will face in this kind of setup. During the tests they did with the previous systems, bandwidth dropped considerably and brought the editing systems to a halt as soon as one of the Studios or Outside Broadcast trucks started streaming live multicam footage onto the server over the high-speed network. Or even when one of the editing stations did a simple export over 10GigE.

‘Mystery is Margin’ no longer

As well as the detailed technical story of the solution, the article includes the business story. On reading their case study about factual TV production in Denmark, RTS engineers went to a freelance workflow consultant and his colleague for help. Ronny Courtens and Anouchka Demeulenaere proposed a new solution from LA-based company LumaForge.

It can’t be very often that a national TV station goes to a pair of freelance workflow consultants with no website or Twitter account. One of the videos embedded in the case study contrasts the old method of post consultancy (‘Mystery about how things work means Margin for us’) vs. the modern (‘Let’s work this out together’).

After trying to get one system working:

Finally they sent us a tech guy who started writing in the Terminal without explaining anything.

Compare that with:

The guys from LumaForge came in and Eric explained the entire system to us. 

Freelancers to the rescue - A new job for the 2010s?

Despite wanting post professionals to take a look at Final Cut Pro X, Apple have shown little interest in fitting into the economics of post production consultancy. They might offer engineering help with proposals and support, but they don’t make it easy for people to make money out of installing and maintaining Final Cut Pro X at the high end. The software is too cheap and easy to buy, and there is very little margin in Apple hardware.

There used to be money in multiple training courses for staff. Recently, a person at another broadcaster responsible for ensuring that hundreds of journalists and camera people keep their skills up to date told me that he toured the many newsrooms full of Final Cut Pro X a few months after initial training. He asked whether they wanted any more training. They all said they didn't need any: they were happy to find all the answers they needed on the internet.

It could be that some post consultancies don’t recommend Final Cut Pro X because they can’t make enough money on those installations. They have expensive offices, salespeople and teams of engineers to support. 

In the case of Metronome in Denmark and RTS in Switzerland it wasn’t one of the big companies that provided the solution. It was Ronny Courtens and Anouchka Demeulenaere. They found a way of delivering a solution and making enough money to justify their time.

Ronny says:

We are not even consultants or integrators. The projects we get come from people whom we have known for years in the industry, or from people we know from the forums and groups. So we don't need a website or Twitter, nor do we need a large team. We just make the contacts, we analyze the issues and then we team up with people we think will be able to help us provide solutions.

Usually we don't charge anything for a first meeting, no matter where it is. We are always interested to discover new companies and workflows.

Things have indeed changed a lot lately.

The FileMaker model

That might be an interesting model to take to Apple. The Pro Apps team can’t get the rest of Apple too excited about helping a few thousand high-end post people make TV shows and feature films more easily. That doesn’t match Apple’s aim of empowering people and “leaving the world better than we found it.” What if the Pro Apps team proposed that they support thousands of freelance post consultants in introducing video to businesses and organisations of all sizes all over the world?

They do something very similar to this with their FileMaker database product and freelance community. Go to their website now and imagine the word FileMaker replaced with Final Cut Pro X. Where you see ‘database developer’ imagine ‘post workflow consultant’ instead. See software that can be bought and rented, where workflow tools work on Macs, servers and iOS devices. Also discover how much Apple promote and involve freelance developers with third party tools.

Making Final Cut Pro X a platform like FileMaker would help Apple truly revolutionise the future of video for businesses and organisations everywhere.

Final Cut Pro X: Rate Conform

Wednesday, 16 December 2015

When working with video clips that have frame rates that are close to being a multiple of the timeline frame rate, but not quite, Final Cut Pro X sometimes speeds them up or slows them down.

When this happens, you will be able to see a section in the inspector showing that its frame rate has been conformed:


This shows that a clip that normally runs at 25 frames a second will be slowed down so that it plays at 23.976 frames per second. Its playback speed on the timeline will be 95.904%. This means one frame of the clip will be displayed for one frame of the timeline.

Here are the rate conforms that Final Cut Pro X does automatically: 

Clip frame rate | Timeline frame rate | Plays back at | % of original duration | Speed
23.976          | 24p                 | 24            | 99.90%                 | 100.10%
23.976          | 25p/i               | 25            | 95.90%                 | 104.27%
23.976          | 50p                 | 25            | 95.90%                 | 104.27%
24              | 23.98p              | 23.976        | 100.10%                | 99.90%
24              | 25p/i               | 25            | 96.00%                 | 104.17%
24              | 50p                 | 25            | 96.00%                 | 104.17%
25              | 23.98p              | 23.976        | 104.27%                | 95.90%
25              | 24p                 | 24            | 104.17%                | 96.00%
29.97           | 30p                 | 30            | 99.90%                 | 100.10%
30              | 29.97p/i            | 29.97         | 100.10%                | 99.90%
30              | 59.94p              | 29.97         | 100.10%                | 99.90%
50              | 23.98p              | 47.952        | 104.27%                | 95.90%
50              | 48p                 | 48            | 104.17%                | 96.00%
59.94           | 60p                 | 60            | 99.90%                 | 100.10%
60              | 29.97p/i            | 59.94         | 100.10%                | 99.90%
60              | 59.94p              | 59.94         | 100.10%                | 99.90%

This is a full list of the combinations where Final Cut Pro automatically changes the speed of a clip. For any other frame rate combinations, Final Cut will drop or repeat frames so that the source clip seems to play at its original speed.

For example if you add a 30p iPhone video to a 25p timeline, Final Cut will skip some of those frames every second so the playback speed remains the same and the duration stays the same: a 3 second 30p clip will take 3 seconds to display in a 25p timeline. If that same 30p clip was added to a 48p timeline, then Final Cut will repeat some frames so the playback speed will remain the same: the 3 second 30p clip will display for 3 seconds on a 48p timeline.
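The numbers above all come down to simple ratios. Here is a short sketch of the arithmetic (my own illustration, not Apple's implementation): when a clip is conformed, its playback speed is the timeline rate divided by the clip rate, and its new duration is the reciprocal of that speed.

```python
# Sketch of the rate-conform arithmetic described above.
# Illustrative only - not Apple's actual implementation.

def conform_speed(clip_fps: float, timeline_fps: float) -> float:
    """Playback speed as a fraction of normal when a clip is conformed:
    each source frame is shown for exactly one timeline frame."""
    return timeline_fps / clip_fps

def duration_percent(clip_fps: float, timeline_fps: float) -> float:
    """New duration as a percentage of the original (reciprocal of speed)."""
    return 100 * clip_fps / timeline_fps

# A 25 fps clip conformed to a 23.976 fps timeline:
print(f"speed:    {conform_speed(25, 23.976):.3%}")      # 95.904%
print(f"duration: {duration_percent(25, 23.976):.2f}%")  # 104.27%

# Non-conformed combinations keep 100% speed instead: a 30 fps clip in a
# 25 fps timeline simply drops 5 of every 30 frames, so a 3-second clip
# still occupies 3 seconds of the timeline.
```

The same two functions reproduce every row of the table: for example, 24 fps into a 25p timeline gives a speed of 25/24 ≈ 104.17% and a duration of 96.00%.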

Letting audiences make structural choices in films

Wednesday, 30 September 2015

Editors determine structure: from individual frames, shots and sequences up to scenes and acts. At the higher levels, they work out what order to tell stories in and how much detail to go into.

When we tell stories, it is common to divide them up into parts – ‘atoms’ – which we think are the smallest indivisible parts of the tale.

In the case of news stories and documentaries, these atoms are made up of video, text, images and sound. For news organisations, the same atom is likely to be used in multiple stories. When making items for broadcast, the same atoms are used time after time as news stories evolve.

As part of their ‘Elastic News’ project, BBC R&D have been testing ideas that allow audiences to determine levels of detail and even the order that news stories are told. One model was a mobile app…

…that uses chapterised videos and text captions as the core experience while allowing users to insert additional video chapters into the main timeline where they want to know more. This creates a custom user journey of the news story.

Visit the post to read more and see a video simulation of the models they tested. 

Overall, our top-level recommendations from this user testing were:

  • continue to use a mixture of content (video, text, audio, etc)
  • provide 3 levels of depth - overview, richer content, links to full story
  • card-based, using text and images work well as a quick overview of the story- video might be more appropriate for deeper content
  • text over videos is confusing - users aren’t sure if it’s relevant to the specific scene where it appears or if it is subtitles or captions

The next iteration of our project will be taking the best features from both prototypes and recommendations from the user testing. The next prototype will also address data structure challenges as we collaborate with BBC News Labs.

Not only ‘Elastic News’ - elastic documentaries, features, TV…

In this case the BBC were testing younger people’s interaction with news items on mobile phones.

Perhaps some of these ideas could be applied to longer stories: documentaries, feature films, TV series. They could also apply to new forms such as websites, games and VR stories.

This requires editors and their tools to be able to work with story atoms as well as whole stories. 

This research seems to be about seeing audiences as individual news ‘users.’ Once we have a model for individual audience members being able to ‘choose their own adventure’ it’ll be time to work on how to make shared experiences possible… Maybe a teacher/pupils model would be a place to start.

IBC 2015

Wednesday, 16 September 2015

Over the last five days I spent my time in Amsterdam attending IBC 2015. I also attended the FCP EXPO.

IBC is a trade show for the TV and film business:

IBC is the premier annual event for professionals engaged in the creation, management and delivery of entertainment and news content worldwide.

Across 14 big halls, two or three had exhibitors relevant to production and post production.

There were many high-end media asset management systems, virtual studios with motion control cameras and large cages where drones were shown flying around.

As last year, there weren’t many signs of Final Cut on the IBC show floor. Apart from the Avid and Adobe stands, few screens were showing any kind of editing application. If you weren’t part of an NLE decision-making team, you’d think there was no choice in NLEs.

Camera manufacturers were starting to admit that they are better camera makers than digital recording device makers. Good news for companies making devices that can convert uncompressed camera source to codecs from Avid and Apple.

Despite Final Cut being hardly mentioned, Apple was everywhere because of ProRes. Whenever a video, sign or stand staffer covered high-end workflow, ProRes was always mentioned - usually first for some reason.

As with the vast majority of trade fairs of any kind around the world, Apple didn't pay for a stand. They chose to attend informally: visiting stands, arranging meetings and supporting events near the main show.

At the US equivalent of IBC, the NAB Show held in Las Vegas, Apple organised their own invite-only suite in a nearby venue. They also gave presentations at an event organised by FCPWORKS, a US-based post production systems integrator.


This September FCPWORKS teamed up with UK-based Soho Editors to put on a Final Cut Pro X-focussed event for IBC attendees. FCP EXPO was a two-day event at a venue a few minutes’ walk from the IBC halls, with sessions including presentations from Apple, Alex Snelling of Soho Editors and Ronny Courtens on Metronome’s reality TV workflow.

I gave a presentation as part of the FxFactory session which included a demo from Tim Dashwood on his exciting new toolkit for editing 360º video on the Final Cut Pro X timeline. As well as being able to play 360º video directly to a connected Oculus Rift VR headset, the 360VR Toolbox also allows editors to make creative choices based on how edits feel - almost impossible until now.

In coming days, some of the presentations will be made available online.

The presentation Apple gave had moved on a great deal even since the one they gave on the Apple Campus as part of the FCPX Creative Summit in June. It included more examples of great work from various projects around the world and demonstrations of features from recent Final Cut and Motion updates. Apple also introduced the team members who were there and welcomed attendee questions throughout the day.

Even though the day started with Apple, there was no drop-off in attendance throughout both days as people stayed for a wide variety of presentations, networking and conversations in an exhibition area featuring pro Final Cut Pro X suppliers such as Intelligent Assistance.

It is good news that Soho Editors put this event on. They are a long-established post production staffing agency and training company. Their support shows they think there’s a benefit in encouraging their freelancers to learn Final Cut Pro X, and that Final Cut training is a valuable service they can offer.

At the moment many TV journalists, researchers and producers are learning Final Cut through in-house training. Agencies like Soho Editors represent editors who already have years of high-end post experience. Once other established editors realise that freelance contemporaries are learning X, they may want to make sure they keep up.


Now that IBC is over, it is time to plan for NAB in Las Vegas in 2016. I've organised my flights already. I hope FCPWORKS and Apple take what they've learnt from Final Cut at IBC and do more in April.

Soho Editors has many clients and freelancers who aren’t sold on Final Cut Pro X yet, so they were a great choice for a Final Cut event partner. I hope FCPWORKS tries to reach more unconverted editors and post people when publicising a ‘NAB adjacent’ event.

As the UI for Final Cut is so much less threatening than the competition’s, I think there is mileage in attempting to get non-editing and post people to attend as well. People with all kinds of jobs in TV, games and feature film production would benefit from learning Final Cut. My take would be: ‘Why should editors be the only ones who benefit from the ease and speed of Final Cut Pro X?’ But I’m no marketing expert…