Apple WWDC 2016 Announcements and Post Production

Tuesday, 14 June 2016

Every year at their Worldwide Developer Conference Apple presents some of their plans relevant to software and hardware developers at a keynote presentation. Here are my notes and links from the 2016 keynote.

The main screen and the webcast stream didn’t have the normal 16:9 ratio. It was wider, at the CinemaScope ratio of 2.40:1. Could this be a hint that a future Apple-branded display will have a 21:9 (2.33:1) aspect ratio?

New Name

As iOS will reach version 10 this autumn and OS X has been around for over 16 years, Apple will now rename their Mac operating system to macOS. The next version will be macOS Sierra, version 10.12. This renaming will make Final Cut Pro, Logic and iMovie stand out as being part of an older naming scheme.

There’s a chance that iMovie will become ‘Movies’ for iOS and macOS - following on from how iPhoto became Photos. An alternative is that productions started in iMovie will be edited in macMovie and then be openable by macFCP while the soundtrack is modified in macLogic. More likely is that Final Cut and Logic will simply drop their X suffixes.

Siri

Siri for macOS means that Macs will be able to be controlled by voice as iOS devices can be today. SiriKit for iOS 10 gives a limited set of third party applications the option to be controlled by Siri.

If SiriKit were introduced to macOS, the ProApps team would have the option to add much more voice control to their apps. This would be especially useful for finding clips based on keywords and other metadata. As well as asking “Show me clips in the browser with the ‘Interviews’ keyword” or “Show me clips in the timeline with dual mono,” Siri also understands context: “Show me interview clips… show me those with dual mono” will only show interview clips with dual mono - rather than one selection of clips followed by all clips with dual mono.

Although there are many different ways of asking for the same thing, those are interpreted by Siri and passed to the target app in a standard way. This kind of automation would work well with scripting. Apple has released a new guide on that subject: The Mac Automation Scripting Guide. So far there are no hints that scripting will be added to iMovie or Final Cut.

For now SiriKit for third party iOS apps will only be used for the following tasks:

  • Audio or video calling
  • Messaging
  • Payments
  • Searching photos
  • Workouts
  • Ride booking

WWDC 2016 session on SiriKit.
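To give a flavour of what SiriKit hands to an app, here is a minimal sketch of a handler for the messaging domain using Apple’s Intents framework. The class name and the logging are made up for illustration; treat the exact Swift method names as an assumption based on Apple’s documentation rather than tested code.

    import Intents

    // Hypothetical handler: Siri interprets the many ways of phrasing a request
    // and passes the app a structured INSendMessageIntent.
    class MessageIntentHandler: NSObject, INSendMessageIntentHandling {

        func handle(intent: INSendMessageIntent,
                    completion: @escaping (INSendMessageIntentResponse) -> Void) {
            // The app only ever sees structured data: recipients and content.
            let text = intent.content ?? ""
            print("Siri asked us to send: \(text)")

            // Report the result back to Siri in a standard way.
            completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
        }
    }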

New Photos features useful for video

Photos for iOS 10 and macOS Sierra will have a couple of new features of interest: more advanced content recognition and the automatic generation of ‘Memories’ videos.

As well as recognising all photos with a specific person, Photos will also recognise other kinds of content. This means that photos can be grouped based on the content detected - photos featuring beaches, horses or fields, for example. This kind of automatic categorisation will be very useful for iMovie/Final Cut users - especially when clips are very long. The content recognition should be able to mark only the time in a long shot when a certain person or object appears.

Using this image recognition technology, Photos will also be able to generate ‘Memories.’ A Memory can look like a web page or publication on a subject. Memories can include videos made up of automatically animated photos. If users want to change the mood of a video, they can choose a new soundtrack and the story will be re-generated to match the music.

Will these video Memories be modifiable in iMovie or Final Cut Pro X? It would be a very quick way to get new people into making movies. The same technology could be used to make automatic videos from selected clips in a video library.

Differential Privacy

Apple have found a way of using information from millions of Apple users to power services without compromising any specific individual’s privacy. ‘Differential Privacy’ is a mathematical method that ensures privacy when sharing data from millions of people.

A mathematical formula defines the amount of ‘noise’ to add to each piece of data. This noise makes the original data associated with a specific person impossible for anyone - including Apple - to decode. The trick is that when hundreds of thousands of pieces of this encoded data are combined, statistical measures can still detect trends across all the results. Apple will have no way of knowing what an individual value was, but will have an accurate representation of the distribution of all the original values over a large population.
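Apple hasn’t published the exact algorithm it uses, but a classic local differential privacy technique called randomised response gives a feel for how per-user noise can still produce accurate population statistics. A simplified sketch with made-up function names - not Apple’s implementation:

    import Foundation

    // Flip a fair coin.
    func coinFlip() -> Bool {
        return arc4random_uniform(2) == 0
    }

    // Randomised response: a user reports their true yes/no answer only half of
    // the time; otherwise they report the result of a coin flip. No single
    // report can be trusted, so no individual's answer is revealed.
    func noisyReport(truth: Bool) -> Bool {
        return coinFlip() ? truth : coinFlip()
    }

    // Over many reports the true proportion can still be recovered, because
    // observedYesRate = 0.5 * trueRate + 0.25, so trueRate = 2 * observedYesRate - 0.5
    func estimateTrueRate(reports: [Bool]) -> Double {
        let observedYesRate = Double(reports.filter { $0 }.count) / Double(reports.count)
        return 2.0 * observedYesRate - 0.5
    }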

This is how Apple is able to use the large amount of private information it has access to in order to provide intelligent services. A foundational mathematical paper: “The Algorithmic Foundations of Differential Privacy.”

Messages and iMessage Apps

Messages in iOS and macOS will get a big upgrade this year. Apple will provide a range of stickers and animations that people can use in conversations. For example, ‘Invisible Ink’ will keep an image blurred until each person in the conversation swipes over the picture. They will also be able to annotate other people’s messages and pictures, and add animation to speech bubbles, emoji and pictures.

As well as Apple-supplied animations and effects, third-parties will be able to make iMessage Apps to do more with messages. 
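As a rough idea of what building on Messages looks like for developers, here is a minimal sketch using the iOS 10 Messages framework. The view controller name, the method and the caption text are made up; MSMessagesAppViewController, MSMessage and MSMessageTemplateLayout are the classes iMessage Apps are built on.

    import Messages

    // A minimal iMessage App extension: insert a message with a custom layout
    // into the current conversation.
    class StickerComposerViewController: MSMessagesAppViewController {

        func insertGreeting() {
            let layout = MSMessageTemplateLayout()
            layout.caption = "Hello from a third-party iMessage App"   // made-up content

            let message = MSMessage()
            message.layout = layout

            activeConversation?.insert(message) { error in
                if let error = error {
                    print("Could not insert message: \(error)")
                }
            }
        }
    }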

I hope Apple define a new graphic and animation file format for Messages that can be applied in other applications, such as Photos, Keynote, iMovie and Final Cut Pro. A metadata-driven format will display differently depending on the device showing the graphics. This will be useful when videos are made up of objects: video clips, images and metadata that tells the playback compositing software how to present the story.

If Apple start presenting Messages as a place for ad-hoc group-based collaboration (for play or for work), there should be a place for video.

WWDC session Part 1 and Part 2

Recording and playback of multiple simultaneous video streams

Created for those who want to record on-screen gameplay for later sharing online, ReplayKit for iOS now adds simple live streaming plus the ability to record the player themselves commentating using a front-facing camera. This means a standard UI for viewers to switch between ‘angles’ in a playback stream whenever they want.
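For developers, this sits on top of ReplayKit’s shared screen recorder. A minimal sketch, assuming iOS 10 and using a made-up function name:

    import ReplayKit

    // Turn on camera and microphone capture, then start recording the screen.
    // ReplayKit supplies the standard UI for previewing and sharing afterwards.
    func startGameplayRecording() {
        let recorder = RPScreenRecorder.shared()
        recorder.isCameraEnabled = true        // front-facing commentary 'angle'
        recorder.isMicrophoneEnabled = true

        recorder.startRecording { error in
            if let error = error {
                print("Recording failed to start: \(error)")
            }
        }
    }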

A new file system: APFS

The Apple File System is designed for modern storage devices. The current file system - HFS+ - has its roots in a design from the era of floppy discs. APFS is designed for Flash/solid state memory. HFS+ is known to degrade over time - normal day-to-day usage will result in files getting lost. APFS is designed for recoverability: it will be much easier to get at ‘deleted’ data, and it will handle backups much more smoothly.

As with Final Cut Pro X projects, the state of whole drives or parts of drives can be captured in a Snapshot.

A new file system doesn't mean a new Finder. It means that applications that spend most of their time manipulating files - like the Finder - will need to be updated to understand the new ways of organising documents and applications on storage devices.

Apple’s programming guide to the Apple File System. Ars Technica on APFS.

Important: APFS is released as a Developer Preview in macOS 10.12, and is scheduled to ship in 2017.

Better colour

The new Wide Color system framework will add wide colour gamut image capture and manipulation to iOS and macOS. Following on from its introduction to recent iMacs and iPad Pros, Apple have settled on the DCI-P3 gamut - the standard colour space used to specify colours in US cinema projection. Some think Adobe RGB would have been a better choice.
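Apple’s variant of DCI-P3 on its devices is usually referred to as Display P3 - the same primaries with a D65 white point. For developers, the wider gamut shows up as extra colour space options. A small sketch (the colour values are arbitrary):

    import UIKit

    // A fully saturated red that sits outside sRGB but inside Display P3.
    let wideRed = UIColor(displayP3Red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)

    // A Core Graphics colour space for drawing into wide gamut bitmaps.
    let p3ColorSpace = CGColorSpace(name: CGColorSpace.displayP3)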

Sharing private data via CloudKit

CloudKit Web Services Reference:

You use the CloudKit native framework to take your app’s existing data and store it in the cloud so that the user can access it on multiple devices.

Currently any data that is stored in the cloud using Apple’s CloudKit framework is either public or private. This year CloudKit in Apple OSs will add the ability for iCloud users to share data amongst themselves.

This would be very useful for post production applications. For example Final Cut could upload proxy versions of all media (or media used within a specific project) so that collaborators would be able to have a live timeline to work with.
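A sketch of what that might look like using CloudKit’s new sharing classes: the record type, field names and function are made up for illustration, while CKShare and CKModifyRecordsOperation are the pieces Apple announced for iOS 10 and macOS Sierra. Real code would also need to put shared records in a custom zone and present Apple’s sharing UI.

    import CloudKit

    // Hypothetical example: save a record describing a proxy media file along
    // with a share that lets collaborators access it from their own accounts.
    func shareProxyMedia(named name: String, fileURL: URL) {
        let proxyRecord = CKRecord(recordType: "ProxyMedia")        // made-up record type
        proxyRecord["name"] = name as NSString
        proxyRecord["media"] = CKAsset(fileURL: fileURL)

        let share = CKShare(rootRecord: proxyRecord)
        share[CKShareTitleKey] = "Project proxies" as NSString      // made-up title

        let operation = CKModifyRecordsOperation(recordsToSave: [proxyRecord, share],
                                                 recordIDsToDelete: nil)
        operation.modifyRecordsCompletionBlock = { _, _, error in
            if let error = error {
                print("Sharing failed: \(error)")
            }
        }
        CKContainer.default().privateCloudDatabase.add(operation)
    }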

WWDC 2016 session.

QuickTime in Apple OSes

QuickTime as a container for video and audio files has a great future. The AVFoundation framework is the basis of Apple software that records, manipulates and plays QuickTime documents (amongst other file formats).

QuickTime the software framework is deprecated in macOS. This means that applications that use the QuickTime API will still work in macOS Sierra (10.12), but may not work in a future version. There is no way yet to know if Final Cut Pro 7 will work in macOS Sierra, but my guess is that it probably will.

As part of building applications, Xcode, Apple’s development system, checks to see if the code uses old or deprecated OS features. It uses an API Diffs file to look at all code. The QuickTime part shows that the API headers have been removed. The AVFoundation part shows a lot has been added.

QuickTime the API has been deprecated for a while. Removing the headers means that applications can no longer compile if their code uses the old API. Applications already compiled on older OSes will still work in macOS Sierra.

Once again, the file format lives on. The parts of Apple OSes that manipulate media using QuickTime will eventually be replaced by AVFoundation. This shouldn’t be a problem for Mac users of old applications for now - but remember that one day those applications will stop working in a future version of macOS.

Apple and the future of media

Apple didn’t make any announcements directly relevant to post production. There was no mention of Retina displays, 4K, VR or 360° video.

On the other hand they laid some interesting foundations for collaboration. One day we might look back at this week and see elements vital to a new product or service introduced in coming months and years.

I'm looking forward to seeing what happens next.

Apple App Store Subscriptions and iMovie/Final Cut Pro X

Thursday, 09 June 2016

Apple has announced that developers making apps for sale in the iOS, Apple TV and Mac app stores can now offer subscription pricing for any app:

Starting this fall, apps in all categories on the App Store will be eligible to offer in-app purchases for auto-renewable subscriptions to services or content. Users enjoy the reliability that comes with subscribing to a service that they love, and the experience must provide ongoing value worth the recurring payment for an auto-renewable subscription to make sense.

Although Apple is pointing to subscriptions being used to pay for updated content or a continuing service, Phil Schiller added two more categories in an interview with Lauren Goode on The Verge:

He suggests many enterprise apps could move to subscription, and that professional apps that require “a lot of maintenance of new features and versions” would be a good fit.

As I often look at the tech world through the lens of post production, I wouldn’t be surprised if Apple use Final Cut Pro X as an example of the latter at WWDC next week.

Many developers say that this change will make it much more likely they will create professional apps for the iPhone, iPad and Mac.

In the world of post production however, many Final Cut users thumb their noses at Adobe Premiere users over Adobe’s compulsory Creative Cloud ‘software rental’ subscription service.

Since Final Cut was ‘updated’ to version X (pronounced ‘ten’) in 2011, Apple have not charged upgrade fees for the many versions released over the years. Those comparing the price for renting Adobe Premiere often compare that with the cost of owning Final Cut. If you bought it for $300 in July 2011, the cost works out to $5 per month over the last five years. Any change in the way Final Cut Pro X is paid for is a big deal for those fighting NLE platform wars.

Final Cut Pro X subscription?

If Apple did introduce a subscription pricing to their professional applications, they have some options:

  • Next version of Final Cut Pro X only available by subscription - with or without a reduction for the first year for existing users.
  • Future versions of Final Cut free for those who bought or buy it for $300, with a $10 per month subscription option for those who can’t afford the initial outlay.
  • Some features only needed for high-end or industry-specific workflows could be unlocked by subscription.

Final Cut Pro X as an iMovie subscription option 

Ever since X was introduced, traditional NLE users have joked that it wasn’t much more than ‘iMovie Pro.’ For a while now Apple has actually developed iMovie as a customised version of Final Cut Pro X with professional features turned off and with additional consumer-focussed features. They are currently the same application with different UIs activated depending on whether it is running as a consumer application or professional application. 

This might become relevant if Apple add a subscription payment option for Final Cut. Apple would need to decide what happens when customers no longer want to pay their subscription fees (which can be annual, monthly or even weekly). In the case of Adobe Creative Cloud users, if they stop paying Adobe, they can no longer open CC apps to view their projects or make any changes.

Apple have the option of going different ways with Final Cut if the subscription is stopped:

  • Final Cut reverts to the last full version paid for (for those who bought a subscription for new versions after having paid full price before).
  • Final Cut reverts to a ‘no modifications, just export’ mode.
  • Final Cut reverts to the iMovie feature set. This would allow changes to existing projects, limited to what is possible in iMovie. This would mean that Final Cut Pro X would be a subscription option for iMovie users.

Subscriptions and the wider Final Cut ecosystem

If subscriptions were introduced to Final Cut, its ecosystem of plugins, services and applications might need to update to reflect the changes.

Firstly, will any current developers move their products and services to the Mac App Store?

What does it mean for FxFactory? It is ‘The App Store for Pro Users’, which sells plugins and apps for Final Cut Pro, Apple Motion, Adobe Premiere and Adobe After Effects. They offer watermarked trial versions of most products. Will developers distributed by FxFactory release future products on the Mac App Store?

Red Giant Universe is a growing pack of plugins for Final Cut Pro X that is available by subscription. Would Red Giant trade some of their subscription money in return for making their plugins available to the huge number of people who use iMovie?

Will Apple provide API hooks that allow Final Cut-adjacent products to check the subscription status of Final Cut so they can provide different features?

Will Apple Motion and Apple Compressor become subscription options for iMovie/Final Cut?

This change may make developers take another look at making professional applications for the Mac, the iPhone and the iPad Pro. Good news for post production at all levels.

FCPX Creative Summit 2016 Provisional Schedule - More from Apple

Wednesday, 08 June 2016

The provisional schedule for October’s FCPX Creative Summit is now available.

Interesting: Instead of last year’s 90 minute presentation given twice to two groups, the schedule shows a 60 minute ‘General Address’ followed by a choice between 90 minute breakout sessions:

2:00 – 3:00pm General Address: The Future (Apple Campus)
3:00 – 4:30pm Apple Session Breakouts (Apple Campus)

What could Apple be talking about in these sessions which would mean attendees would have to choose one session over another?

Another point: The Summit was held in late June last year. This year it will be in late October. Given this event is organised to fit in with the plans of the ProApps team, there is a chance there will be more to talk about later this year.

Next week at WWDC 16 there is a chance that Apple will announce or pre-announce a new version of the Mac Pro, just as they did in 2013. Final Cut Pro X is the application that most people understand needs a lot of power. Perhaps Apple will once again use a Final Cut screenshot during the keynote (which will be streamed online on Monday).

Blackmagic Fusion 8.1: A new VR video option for Avid users

Wednesday, 08 June 2016

This week Blackmagic Design announced a new version of Fusion, their high-end compositing application.

The good news for Avid Media Composer users is that Fusion version 8.1 works with Fusion Connect 8.1:

This software adds the Fusion Connect for Avid plug-in that is compatible with Avid edit systems, so you can send any clip or stack of clips from an Avid Media Composer timeline directly into Fusion or Fusion Studio. Now Avid editors can have access to Fusion’s powerful 3D compositing and animation tools

Up until now many Avid users have been advised to transfer timelines to NUKE as a 360° video solution. Fusion 8 is a free application that is a peer to NUKE. Before being bought by Blackmagic Design, it was an expensive application. Fusion has been used in the post production of many high-end TV shows and feature films. To promote their hardware, Blackmagic Design made a Mac version of Fusion and released an almost fully functional free version.

As well as being a high-end node-based 3D compositor, Fusion 8 can also use 360° plugins. For example, there is the $249 Domemaster Fusion Macros from Andrew Hazelden:

The Domemaster Fusion Macros allow artists to create immersive 360° stereo composites. The new immersive toolset is designed to work with Blackmagic Design’s Fusion compositing software. These macros are great for preparing pre-rendered content for use in a fulldome theater, or on a head mounted display like the Oculus Rift, Samsung Gear VR, HTC VIVE, OSVR, or Google Cardboard.

So, if you are a Media Composer editor who is comfortable with learning new compositing applications and who is interested in working with 360° video, check out Blackmagic Fusion 8.1 and Domemaster Fusion Macros.

Xsend Motion - Send Final Cut Pro X timelines to Apple Motion

Tuesday, 07 June 2016

For those who used Final Cut Pro Studio before 2011, a very popular feature request for Final Cut Pro X is the ability to send clips to Apple Motion. Motion can be used for more advanced motion graphics tasks. Post production file format translation supremo Wes Plate has made Automatic Duck Xsend Motion.

Iain Anderson’s review at Mac Pro Video:

Despite the excellent integration between FCP X and Motion, this critical piece has always been missing and often been requested. Finally, it's here, and while it's maybe not as feature-complete as if Apple had done it themselves, it's very useful, and still under active development by a veteran in this space. Heavy Motion users should grab it now.

Xsend elsewhere?

This might be greedy, but what else could this technology be used for? Now that the Automatic Duck team have learnt Final Cut Pro X XML and the Motion document format well enough to make this product, where else should X timelines be sent? 

Given the power of Blackmagic Design’s Fusion 8 node-based compositing application, perhaps Xsend Fusion and Msend Fusion would have an appreciative audience!

Apple’s patent for applying effects to clips with specific roles

Tuesday, 07 June 2016

The name of patent 9,240,215 may be ‘Editing operations facilitated by metadata,’ but it is about applying effects to roles in Final Cut Pro X:

For example, several clips may be assigned one audio role of "Dialog", "Music", or "SFX". A process then provides one or more user interface controls. These user interface controls are also associated with the tagged clips. That is, the user interface controls are associated so that these controls can be used to display or modify properties of the tagged clips.

PDF version.

 

Apple’s structure editing patent

Tuesday, 07 June 2016

While editors wait for the next big Final Cut Pro X update, I hope the Apple ProApps team will implement some of the ideas in their ‘structure editing’ patent. Here’s my old writeup of the patent they applied for in 2009 on fcp.co:

Most people think that the editor’s job is ‘to cut out the bad bits’ in individual scenes. Many are surprised to discover that editors commonly change and improve storytelling by changing story structure. As many film and TV makers consider that structure is very important when it comes to telling stories, I think it is a good idea for video editing software to recognise story structure.

Structure applies to feature films, TV shows, groups of corporate videos on an intranet, legal video depositions, architects’ video proposals or open-ended weekly web series. The more video applications can have these structures encoded in their projects, the better the tools they’ll be able to provide to a wider range of people all over the world.

Introduction to VR Video with Final Cut Pro X

Tuesday, 31 May 2016

At the FCP Exchange event at NAB in April, Tim Dashwood and I gave a presentation on working with VR 360° video in Final Cut Pro X and Motion.

Initially I explained the art of spherical video from first principles, comparing it to VR apps. I showed how editors can use specialised tools that understand 'equirectangular' video, effects and graphic overlays to tell stories that play out all around you.

I also explained how editors can share their work with millions of smartphone users around the world. Tim Dashwood then gave a quick rundown of the science of high-end VR video effects that are available for Final Cut Pro X today.

 

360° Virtual Reality with FCPX from FCPWORKS on Vimeo

FCP Exchange is a series of free industry seminar days presented by FCPWORKS and fcp.co

Dashwood 360 VR Toolbox and 360 VR Express.

Sound Design Lessons for VR Video from VR Games

Friday, 27 May 2016

VR Video tools for video editors have progressed quickly in the last year, but there has been less discussion about the audio side of VR video. Although VR audio tools have yet to be integrated into NLEs, audio experts (and video editors who spend much of their time refining their soundtracks) should consider how audio design is different for VR.

Those designing audio for VR games are probably further along in working out what makes VR different.

At a mini conference on game audio earlier this year, developer Gordon McGladdery gave a presentation on audio for VR games.

His game Fantastic Contraption is one of those given away with each HTC Vive (a VR headset that detects where you are in a room for ‘room-scale’ VR), and he has worked on the sound for VR commercials.

He spoke with Matthew Marteinsson on episode 25 of the ‘Beards, Cats and Indie Game Audio’ podcast about VR audio. Here is a summary of some of what was said:

[7:07] Binaural audio is very important - without it, experiencing VR is ‘like watching a 3D movie without the glasses on.’

[7:33] Music score doesn't work in VR games - it ‘muddies everything up’ [The music is] ‘coming from nowhere in the world and just seems to cloud the entire immersion.’

[10:17] Everything matters. The current video game sound design orthodoxy is that some sounds are more important than others; time and budget determine a well-known order of priorities when it comes to sound design. In VR, everything shown in a game that can make sounds must have game audio.

[14:21] Even if your target VR platforms don't have advanced audio, incorporating advanced audio future-proofs your current productions.

[15:05] ‘A lot of what we do here is to design right up until the end.’ It is important to design your sound workflow so that if the design of the game changes near launch (the equivalent of a new edit of a film), ‘we as audio can quickly move with it.’

[16:14] ‘Distance falloff is really finicky’ - pay close attention to sound volume based on position - ‘none of the defaults work.’ Different sound sources have different falloff curves; some objects need to be heard from further away. You need different curves for every object, based on character need rather than realistic sound physics.

[23:33] ‘Dynamic range is back - we're not crushing everything any more’ - adding heavy compression doesn't work - it just makes everything loud. ‘VR audio will be pretty uncompressed.’ Prepare for the fact that different audio soundtracks work for different playing environments. Most VR experiences will be in quiet environments, but some will be in noisy places - which will need compression to punch through.

Listen to the rest of the podcast to hear Gordon’s take on VR use outside the world of games, as he is getting more non-game work due to his VR audio skills.

BBC TV production change could mean more workflow stories

Tuesday, 22 March 2016

Proponents of Final Cut Pro X and Adobe Premiere are frustrated when editors of high-end TV and features say "I don't know anyone who doesn't use Avid."

We value case studies showing what alternative applications can do. In the UK, the BBC is not allowed to publicise its workflows - used on TV shows that are world-famous - because being publicly funded it is not allowed to promote one commercial supplier over others. All we can do is gather indirect information, such as tweets by those working with and for the BBC.

The flow of BBC production stories may increase soon: they are moving the majority of their TV production to a commercial division on April 29. This is the way commercial TV in the UK and elsewhere works. In a year ‘BBC Studios’ will also make shows for other broadcasters and networks.

If I were promoting Final Cut Pro X and Adobe Premiere workflows, I'd start preparing case studies now.

Dual lens iPhone 7 = Multi-angle QuickTime files

Thursday, 10 March 2016

The day after the announcement of a new iPhone, the speculation starts for what might appear in the next iPhone. This speculation is based on Apple patents, acquisitions, what parts suppliers can produce and what Android phones have had ‘for years.’

This week’s speculation from MacRumors suggests that the next iPhone will have a dual lens camera. Such a camera would be able to capture a normal shot and a close up shot at the same time:

Amid rumors a dual-lens camera will be introduced in the iPhone 7, Apple recently submitted a patent application published in January which gives us rare insight into what Apple thinks a dual-lens camera interface could look like on future iOS devices.

The patent outlines a dual-camera system that consists of one standard wide-angle lens similar to what's in the iPhone today and a second telephoto lens capable of capturing zoomed-in video and photos.

Apple’s iPhone is a combination of hardware and software; the interesting part for me is the software. As with slow motion recording, there are software implications. Dual lenses mean:

  • A user interface for capturing two video ‘angles’ at once (with a single shared soundtrack - unless audio from multiple microphones is also captured)
  • Storing more than one video stream in the same QuickTime file
  • A user interface for marking footage at points when playback should switch from showing footage recorded with one lens to the other

Multicam for ‘the rest of us’

Consumer-level multicam has implications for the next versions of iMovie for iOS and OS X and Final Cut Pro for OS X (and iOS?). As well as being able to handle multi-video track QuickTime files, they might provide features to ease users into multi-angle production.

iMovie would need a command and shortcut to add metadata to a multilayer clip to say ‘switch to other layer here.’ That would be a useful shortcut to add to Final Cut Pro X as well. Currently, editors working with two-angle multicam clips must alternately use ‘Cut and Switch to Viewer Angle 1’ and ‘Cut and Switch to Viewer Angle 2’ depending on which angle is active.

Multi-track QuickTime is back

Once consumers get used to multi-stream video files, they might start expecting that multiple devices at the same location should be able to contribute to a single QuickTime record of an event. As long as all devices have an iCloud account, Apple could provide the synced file for all to share with each other and others.

More and more professional production uses multiple cameras. Final productions will probably include multiple video assets that are shown depending on playback settings. This means multicam user interfaces for production and playback alongside multiple video layers being stored in single movie files.

Recording devices will probably also encode multiple video angles into movies. Already, Convergent Design’s Apollo switcher/recorder product page says it:

exports separate Apple ProRes files with matching timecode or a single multi-stream QuickTime file that drops directly into the timeline of supporting NLEs such as FCP-X.

How interesting.

Last year’s Apple WWDC had a relevant session on editing movies using AV Foundation for iOS and OS X developers:

There are methods for creating and removing tracks, and you see to create a track, we have to say what type of track we want.

Do we want a video track, do we want an audio track, and so forth…

Features seem to come from Final Cut Pro X:

We can now open these, edit them, and write them back. At the track level, we have a similar setup. As you know, a composition is composed of composition tracks, and at the mutable level, we have AVMutableComposition tracks.
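Those methods belong to AVFoundation’s composition API. Here is a minimal sketch of building a two-angle composition, assuming two source assets of the same duration; the function name is made up and error handling is left to the caller.

    import AVFoundation

    // Build a composition with two video tracks and insert media from two
    // source assets ('angles') so both run in parallel from time zero.
    func makeTwoAngleComposition(angleA: AVAsset, angleB: AVAsset) throws -> AVMutableComposition {
        let composition = AVMutableComposition()

        let trackA = composition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
        let trackB = composition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)

        let range = CMTimeRange(start: kCMTimeZero, duration: angleA.duration)

        if let sourceA = angleA.tracks(withMediaType: AVMediaTypeVideo).first {
            try trackA.insertTimeRange(range, of: sourceA, at: kCMTimeZero)
        }
        if let sourceB = angleB.tracks(withMediaType: AVMediaTypeVideo).first {
            try trackB.insertTimeRange(range, of: sourceB, at: kCMTimeZero)
        }
        return composition
    }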

Media doesn't need to be in the same file:

Now, it's possible for the sample data that a track refers to exist in another file altogether, so you can have external sample references. It's even possible for the sample references to refer only to external sample data. Now, when we have this situation, the little movie box and its file type box is called a sample reference movie file.

Coming soon to OS X and iOS applications?

Final Cut Pro X at RTS Swiss National Television and the future of post production consultancy

Tuesday, 08 March 2016

A new case study shows how collaborative storage systems are coming into their own for TV stations using Final Cut Pro X. The team behind the implementation demonstrates how different the Final Cut high-end post production consultancy ecosystem is from the ‘left over from the 20th Century’ establishment. 

Ronny Courtens and the implementation team have written a detailed post at fcp.co:

[National Swiss TV broadcaster RTS ] needed a 100 TB effective and fully expandable enterprise NAS system with high redundancy and high-availability for 24 client connections. 12 connections over 10Gig SFP+ to their existing fiber channel network for the editing stations and the audio and ingest stations. And 12 connections over Gigabit Ethernet to their existing Cat6 network for extra ingest, titling and graphics machines, Open Directory and system admin.

...

all editing and ingest clients must be able to perform high-speed file transfers to the server without affecting the sustained Read bandwidth of any editing station. This is one of the biggest problems most NAS systems will face in this kind of setup. During the tests they did with the previous systems, bandwidth dropped considerably and brought the editing systems to a halt as soon as one of the Studios or Outside Broadcast trucks started streaming live multicam footage onto the server over the high-speed network. Or even when one of the editing stations did a simple export over 10GigE.

‘Mystery is Margin’ no longer

As well as the detailed technical story of the solution, the article includes the business story. On reading a case study about factual TV production in Denmark, RTS engineers went to a freelance workflow consultant and his colleague for help. Ronny Courtens and Anouchka Demeulenaere proposed a new solution from LA-based company LumaForge.

It can’t be very often that a national TV station goes to a pair of freelance workflow consultants with no website or Twitter account. One of the videos embedded in the case study contrasts the old method of post consultancy (‘Mystery about how things work means Margin for us’) vs. the modern (‘Let’s work this out together’).

After trying to get one system working:

Finally they sent us a tech guy who started writing in the Terminal without explaining anything.

Compare that with:

The guys from LumaForge came in and Eric explained the entire system to us. 

Freelancers to the rescue - A new job for the 2010s?

Despite wanting post professionals to take a look at Final Cut Pro X, Apple have shown little interest in fitting into the economics of post production consultancy. They might offer engineering help with proposals and support, but they don’t make it easy for people to make money out of installing and maintaining Final Cut Pro X at the high end. The software is too cheap and easy to buy, and there is very little margin in Apple hardware.

There used to be money in multiple training courses for staff. Recently a person from another broadcaster - responsible for ensuring that hundreds of journalists and camera people keep their skills up to date - told me that they toured the many newsrooms full of Final Cut Pro X a few months after initial training. He asked them if they wanted any more training. They all said they didn't need any: they were happy to find all the answers they needed by going to the internet.

It could be that some post consultancies don’t recommend Final Cut Pro X because they can’t make enough money on those installations. They have expensive offices, salespeople and teams of engineers to support. 

In the case of Metronome in Denmark and RTS in Switzerland it wasn’t one of the big companies that provided the solution. It was Ronny Courtens and Anouchka Demeulenaere. They found a way of delivering a solution and making enough money to justify their time.

Ronny says:

We are not even consultants or integrators. The projects we get come from people whom we have known for years in the industry, or from people we know from the forums and groups. So we don't need a website or Twitter, nor do we need a large team. We just make the contacts, we analyze the issues and then we team up with people we think will be able to help us provide solutions.

Usually we don't charge anything for a first meeting, no matter where it is. We are always interested to discover new companies and workflows.

Things have indeed changed a lot lately.

The FileMaker model

That might be an interesting model to take to Apple. The Pro Apps team can’t get the rest of Apple too excited about helping a few thousand high-end post people make TV shows and feature films more easily. That doesn’t match Apple’s aim of empowering people and “leaving the world better than we found it.” What if the Pro Apps team proposed that they support thousands of freelance post consultants in introducing video to businesses and organisations of all sizes all over the world?

They do something very similar to this with their FileMaker database product and freelance community. Go to their website now and imagine the word FileMaker replaced with Final Cut Pro X. Where you see ‘database developer’ imagine ‘post workflow consultant’ instead. See software that can be bought and rented, where workflow tools work on Macs, servers and iOS devices. Also discover how much Apple promote and involve freelance developers with third party tools.

Making Final Cut Pro X a platform like FileMaker would help Apple truly revolutionise the future of video for businesses and organisations everywhere.

Final Cut Pro X: Rate Conform

Wednesday, 16 December 2015

When working with video clips that have frame rates that are close to being a multiple of the timeline frame rate, but not quite, Final Cut Pro X sometimes speeds them up or slows them down.

When this happens, you will be able to see a section in the inspector showing that its frame rate has been conformed:

[Image: tip-fcpx-rate-conform]

This shows that a clip that normally runs at 25 frames a second will be slowed down so that it plays at 23.976 frames per second. Its playback speed on the timeline will be 95.904%. This means one frame of the clip will be displayed for one frame of the timeline.

Here are the rate conforms that Final Cut Pro X does automatically: 

Clip frame rate | Timeline frame rate | Conformed playback frame rate | % of original duration | Speed
23.976 | 24p | 24 | 99.90% | 100.10%
23.976 | 25p/i | 25 | 95.90% | 104.27%
23.976 | 50p | 25 | 95.90% | 104.27%
24 | 23.98p | 23.976 | 100.10% | 99.90%
24 | 25p/i | 25 | 96.00% | 104.17%
24 | 50p | 25 | 96.00% | 104.17%
25 | 23.98p | 23.976 | 104.27% | 95.90%
25 | 24p | 24 | 104.17% | 96.00%
29.97 | 30p | 30 | 99.90% | 100.10%
30 | 29.97p/i | 29.97 | 100.10% | 99.90%
30 | 59.94p | 29.97 | 100.10% | 99.90%
50 | 23.98p | 47.952 | 104.27% | 95.90%
50 | 48p | 48 | 104.17% | 96.00%
59.94 | 60p | 60 | 99.90% | 100.10%
60 | 29.97p/i | 59.94 | 100.10% | 99.90%
60 | 59.94p | 59.94 | 100.10% | 99.90%

This is a full listing of the combinations where Final Cut Pro automatically changes the speed of a clip. For any other frame rate combinations Final Cut will drop or repeat frames so that the source clip seems to play at its original speed.

For example if you add a 30p iPhone video to a 25p timeline, Final Cut will skip some of those frames every second so the playback speed remains the same and the duration stays the same: a 3 second 30p clip will take 3 seconds to display in a 25p timeline. If that same 30p clip was added to a 48p timeline, then Final Cut will repeat some frames so the playback speed will remain the same: the 3 second 30p clip will display for 3 seconds on a 48p timeline.
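The arithmetic behind the table is straightforward: because one source frame is shown for one timeline frame, the playback speed is the conformed frame rate divided by the clip’s original frame rate, and the duration change is the inverse. A small sketch:

    // Rate conform arithmetic (illustrative, not Final Cut's internals):
    // speed    = conformed rate / source rate    (as a percentage)
    // duration = source rate / conformed rate    (as a percentage of the original)
    func rateConform(sourceRate: Double, conformedRate: Double) -> (speed: Double, duration: Double) {
        let speed = conformedRate / sourceRate * 100.0
        let duration = sourceRate / conformedRate * 100.0
        return (speed, duration)
    }

    // Example from the table: a 25 fps clip in a 23.98p timeline.
    let conform = rateConform(sourceRate: 25.0, conformedRate: 23.976)
    // conform.speed ≈ 95.90%, conform.duration ≈ 104.27%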

Letting audiences make structural choices in films

Wednesday, 30 September 2015

Editors determine structure: from individual frames, shots and sequences up to scenes and acts. At the higher levels they work out what order to tell stories in and how much detail to go into.

When we tell stories, it is common to divide them up into parts – ‘atoms’ – which we think are the smallest indivisible parts of the tale.

In the case of news stories and documentaries, these atoms are made up of video, text, images and sound. For news organisations, the same atom is likely to be used in multiple stories. When making items for broadcast, the same atoms are used time after time as news stories evolve.

As part of their ‘Elastic News’ project, BBC R&D have been testing ideas that allow audiences to determine levels of detail and even the order that news stories are told. One model was a mobile app…

…that uses chapterised videos and text captions as the core experience while allowing users to insert additional video chapters into the main timeline where they want to know more. This creates a custom user journey of the news story.

Visit the post to read more and see a video simulation of the models they tested. 

Overall, our top-level recommendations from this user testing were:

  • continue to use a mixture of content (video, text, audio, etc)
  • provide 3 levels of depth - overview, richer content, links to full story
  • card-based, using text and images work well as a quick overview of the story - video might be more appropriate for deeper content
  • text over videos is confusing - users aren’t sure if it’s relevant to the specific scene where it appears or if it is subtitles or captions

The next iteration of our project will be taking the best features from both prototypes and recommendations from the user testing. The next prototype will also address data structure challenges as we collaborate with BBC News Labs.

Not only ‘Elastic News’ - elastic documentaries, features, TV…

In this case the BBC were testing younger people’s interaction with news items on mobile phones.

Perhaps some of these ideas could be applied to longer stories: documentaries, feature films, TV series. They could also apply to new forms such as websites, games and VR stories.

This requires editors and their tools to be able to work with story atoms as well as whole stories. 

This research seems to be about seeing audiences as individual news ‘users.’ Once we have a model for individual audience members being able to ‘choose their own adventure’ it’ll be time to work on how to make shared experiences possible… Maybe a teacher/pupils model would be a place to start.

IBC 2015

Wednesday, 16 September 2015

Over the last five days I spent my time in Amsterdam attending IBC 2015. I also attended the FCP EXPO.

IBC is a trade show for the TV and film business:

IBC is the premier annual event for professionals engaged in the creation, management and delivery of entertainment and news content worldwide.

Of the 14 big halls, two or three had exhibitors relevant to production and post production.

There were many high-end media asset management systems, virtual studios with motion control cameras and large cages where drones were shown flying around.

As with last year, there weren’t many signs of Final Cut on the IBC show floor. Apart from the Avid and Adobe stands, few screens were showing any kind of editing application. If you weren’t part of an NLE decision-making team, you’d think there was no choice in NLEs.

Camera manufacturers were starting to admit that they are better camera makers than digital recording device makers. Good news for companies making devices that can convert uncompressed camera source to codecs from Avid and Apple.

Despite Final Cut being hardly mentioned, Apple was everywhere because of ProRes. Whenever a video, sign or stand staffer covered high-end workflow, ProRes was always mentioned - usually first for some reason.

As with the vast majority of trade fairs of any kind around the world, Apple didn't pay for a stand. They chose to attend informally: visiting stands, arranging meetings and supporting events near the main show.

At the US equivalent of IBC, the NAB Show held in Las Vegas, Apple organised their own invite-only suite in a nearby venue. They also gave presentations at an event organised by FCPWORKS, a US-based post production systems integrator.

FCP EXPO

This September FCPWORKS teamed up with UK-based Soho Editors to put on a Final Cut Pro X-focussed event for IBC attendees. FCP EXPO was a two day event at a venue a few minutes walk from the IBC halls with sessions including presentations from Apple, Alex Snelling for Soho Editors and Ronny Courtens on Metronome’s reality TV workflow.

I gave a presentation as part of the FxFactory session which included a demo from Tim Dashwood on his exciting new toolkit for editing 360º video on the Final Cut Pro X timeline. As well as being able to play 360º video directly to a connected Oculus Rift VR headset, the 360VR Toolbox also allows editors to make creative choices based on how edits feel - almost impossible until now.

In coming days, some of the presentations will be made available online.

The presentation Apple gave had moved on a great deal even since the one they gave on the Apple Campus as part of the FCPX Creative Summit in June. It included more examples of great work from various projects around the world and demonstrations of features from recent Final Cut and Motion updates. Apple also introduced who from the team were there and welcomed attendee questions throughout the day.

Even though the day started with Apple, there was no drop-off in attendance throughout both days as people stayed for a wide variety of presentations, networking and conversations in an exhibition area featuring pro Final Cut Pro X suppliers such as Intelligent Assistance.

It is good news that Soho Editors helped put this event on. They are a long-established post production staffing agency and training company. Their support shows they think there’s a benefit to them in encouraging their freelancers to learn Final Cut Pro X, and that Final Cut training is a valuable service they can offer.

At the moment many TV journalists, researchers and producers are learning Final Cut through in-house training. Agencies like Soho Editors represent editors who already have years of high-end post experience. Once other established editors realise that freelance contemporaries are learning X, they may want to make sure they keep up.

NAB

Now that IBC is over, it is time to plan for NAB in Las Vegas in 2016. I've organised my flights already. I hope FCPWORKS and Apple take what they've learnt from Final Cut at IBC and do more in April.

Soho Editors has many clients and freelancers who aren’t sold on Final Cut Pro X yet, so they were a great choice for a Final Cut event partner. I hope FCPWORKS tries to reach more unconverted editors and post people when publicising a ‘NAB adjacent’ event.

As the UI for Final Cut is so much less threatening than the competition, I think there is mileage in attempting to get non-editing and post people to attend as well. People who have all kinds of jobs in TV, games and feature film production would benefit from learning Final Cut. My take would be: ‘Why should editors be the only ones who benefit from the ease and speed of Final Cut Pro X,’ but I’m no marketing expert…

Apple’s September 2015 Event and film makers

Wednesday, 09 September 2015

Apple’s September announcements have interesting elements for video storytellers.

iMovie

The new iPhone 6S models have cameras that can record 4K video. That means iMovie on those devices will be able to edit 4K video:

iMovie is designed to take advantage of the beautiful 4K video you can shoot and edit on your iPhone 6s. In fact, iPhone 6s is so powerful you can smoothly edit two streams of 4K video to create effects like picture-in-picture and split screen.

Desktop-class performance lets you create advanced effects with up to three simultaneous streams of 4K video and export your 4K video at blazing speeds. And accessories like the Smart Keyboard let you use efficient shortcuts to make quick work of your project.

Interesting that they saw the need to handle three streams of 4K. iMovie will also be available as an extension to iOS applications that allow photos and videos to be edited. If iMovie for iOS 2.2 doesn’t edit non-30/60p videos, hopefully editing extensions will be made by other developers.

At the moment the specs for the iPhone 6S only mention a limited range of frame rates:

  • 4K video recording (3840 by 2160) at 30 fps
  • 1080p HD video recording at 30 fps or 60 fps
  • 720p HD video recording at 30 fps

The native resolution of the iPad Pro is 2732 by 2048, leaving plenty of room for editing UI around a full 1920x1080 HD display. All those pixels would also make it a good wired or wireless viewfinder for high-end video cameras.

Apple also have introduced a content-based refresh so the screen is updated as often as is dictated by content. This should mean that if video is running at 23.976fps, then that’s how often the display is updated. Maybe that will work for 120fps content too.

Here is what iMovie for iOS 2.2 looks like on an iPad Pro:

[Image: iMovie-HDsm]

The iPad Pro includes a ‘Smart Connector’ for its Smart Keyboard that allows power and information to go in both directions:

The Smart Connector works hand in hand with the conductive fabric inside the Smart Keyboard to allow for a two‑way exchange of power and data.

That means the iPad will be able to power accessories, and accessories will be able to power the iPad. Data going both ways might allow for some interesting third-party products...

3D Touch

The new iPhones and the iPad Pro have an advanced pressure sensitivity feature that Apple calls ‘3D Touch’.

Fortunately, Apple didn’t just add a new input method and leave its use up to individual developers. In iOS 9 and Apple applications, a light touch is a ‘Peek’ and a heavier touch after that is a ‘Pop:’

Peek and Pop let you preview all kinds of content and even act on it — without having to actually open it. For example, with a light press you can Peek at each email in your inbox. Then when you want to open one, press a little deeper to Pop into it.

I like to think of Peek as ‘Look at the metadata associated with this thing’ and Pop as ‘Act upon this thing with another tool.’

It might be useful to have these shortcuts in Mac apps. Here’s hoping Apple introduce 3D Touch mice and trackpads…
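For developers, Peek and Pop is exposed through UIKit’s view controller previewing API. A minimal sketch - the photo view controller names are made up:

    import UIKit

    class PhotoDetailViewController: UIViewController {}   // hypothetical detail screen

    class PhotoListViewController: UIViewController, UIViewControllerPreviewingDelegate {

        override func viewDidLoad() {
            super.viewDidLoad()
            // Only register on hardware that supports 3D Touch.
            if traitCollection.forceTouchCapability == .available {
                registerForPreviewing(with: self, sourceView: view)
            }
        }

        // Peek: return a lightweight preview for the touched location.
        func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                               viewControllerForLocation location: CGPoint) -> UIViewController? {
            return PhotoDetailViewController()
        }

        // Pop: commit to showing the previewed content full screen.
        func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                               commit viewControllerToCommit: UIViewController) {
            show(viewControllerToCommit, sender: self)
        }
    }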

Vertical Video Live Photos

The default action of the Camera application on the iPad Pro and the iPhone 6s is to capture a few moments around each photograph - a little bit of audio and some movement. When you press on them anywhere in iOS, they’ll show more than just the moment the picture was taken:

A whole new way to look at photography, Live Photos go beyond snapshots to capture moments with motion and sound. Press a Live Photo to make it come alive. Experience the crack of a smile. The crash of a wave. Or the wag of a tail.

Just when some people are getting the message that video should be taken in a landscape orientation, Apple will be promoting the idea of photos that are a little ‘live.’ Oh well.

We’ll soon discover whether Live Photos will appear as video in iMovie, and whether the still image shown by default will be able to be changed - the best moment might not be in the middle of the sequence.

Apple TV – Apple’s Home Computer

The Apple TV brings applications to TV screens. Instead of iOS running on the new Apple TV, there's a new OS: tvOS.

With apps providing TV, movies, music, games and family organisation support, maybe Apple would like the Apple TV to be the new ‘Home Computer.’

As well as tvOS sharing many features of iOS, for producers with large amounts of online content tvOS also allows web-based content to be made available in the Apple TV UI based on XML specifications:

Use Apple’s Television Markup Language (TVML) to create individual pages inside of a client-server app.

Every page in a client-server app is built on a TVML template. TVML templates define what elements can be used and in what order. Each template is designed to display information in a specific way. For example, the loadingTemplate shows a spinner and a quick description of what is happening, while the ratingTemplate shows the rating for a product. You create a new TVML file that contains a single template for each page in a client-server app. Each template page occupies the entire TV screen.

…and Javascript:

The TVJS framework provides you with the means to display client-server apps created with the Apple TV Markup Language (TVML) on the new Apple TV. You use other classes in the framework to stream media and respond to events.
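On the native side, a tvOS app points TVMLKit at that JavaScript and its TVML pages. A bare-bones sketch - the URL is a placeholder and the optional delegate methods are left out:

    import TVMLKit
    import UIKit

    class AppDelegate: UIResponder, UIApplicationDelegate, TVApplicationControllerDelegate {

        var window: UIWindow?
        var appController: TVApplicationController?

        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
            window = UIWindow(frame: UIScreen.main.bounds)

            // Point TVMLKit at the server hosting the TVJS application and its
            // TVML template pages. This URL is a placeholder.
            let context = TVApplicationControllerContext()
            context.javaScriptApplicationURL = URL(string: "https://example.com/app/application.js")!

            appController = TVApplicationController(context: context, window: window, delegate: self)
            return true
        }
    }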

As part of the demo, a live sports app was able to show metadata during a game:

[Image: mlbmetadatasm]

It would be good if Apple or another developer added time-based metadata display and editing to an NLE. 

Imagine a version of Final Cut or iMovie that could interpret Apple’s Television Markup Language and show what a production would look like when streamed on an Apple TV while editing in the timeline…

The odd one out

In recent years Apple has had two autumn events and distributed news about all their platforms between the two. Today’s event talked about devices that run watchOS, iOS and tvOS. The odd one out is the Mac. Either the next Mac update is so big it must have its own event, or there won't be much to report about devices running ‘macOS’ (if that’s what OS X is renamed as) until next year. We’ll see…

Solving the vertical video problem: The New York Times’ first step

Wednesday, 09 September 2015

Justin Bieber’s new song is number one in the UK. The New York Times has made an 8-minute video about how “Where Are Ü Now” was made.

It was conceived from the start as a video that works in more than one aspect ratio. The Nieman Journalism Lab has written a ‘making of’ about this ‘making of’:

Unsurprisingly, the combination of Bieber and The Grey Lady turned some heads. But the Times’ video is interesting for another reason — it was designed from the beginning to be as compelling viewed vertically as horizontally. In a world where young people are watching more video on smartphones than on TV screens, making a video work in both aspect ratios can help it reach a broader audience.

This was an aesthetic as well as technical problem - how to combine filmed footage with motion graphics overlays that look good both on a TV and a vertically-held phone.

It is worth reading, but perhaps post-production people should consider whether timelines should have a fixed aspect ratio. They already don’t have a fixed resolution. 

Not the ‘where’ of a video element - the ‘what’

I suggest that elements for future videos may be exported as layered movies. It will be up to the playback device or software to decide how to show the elements so they work in the aspect ratio needed for each viewing.

This already happens for audio in Final Cut Pro X. Instead of defining the speaker through which audio should be heard, all audio is given a ‘role.’ This metadata can then be used by broadcasters and distributors to determine which audio should be played back - depending on context. 

The audio standard for UK TV production expects programmes to include a stereo mix, a surround mix, a stereo audio description (in which a voiceover during gaps in dialogue describes what happens on screen), a music and effects-only mix, and alternate languages.

Imagine if programmes also had layers marked as ‘Base video,’ ‘Signs and information in English,’ ‘Behind the scenes information,’ ‘Purchasing information,’ and ‘Signs and information in an alternate language.’ In the case of signs and text, this is how Pixar generates localised versions of its movies.

In the case of the New York Times video, the motion graphics elements would be included in a separate layer which would be composited in different positions or even angles depending on the orientation of the playback device.

The answer to the problem of vertical video is to make sure videos look good when viewed at any aspect ratio.

That means editing applications will be able to playback the same content at multiple aspect ratios - much like page layout applications eventually added features which allowed designers to work with multiple aspect ratios for magazines and adverts.

To support multiple aspect ratios video makers will need tools that let them define what a video element is - the playback device can then determine the best place to play it back. Even if that is a second screen…

Adobe: Premiere Touch is for professionals too

Tuesday, 08 September 2015

As well as a bug fix update for Adobe Premiere Pro CC today, Adobe have reported what will be in the next version.

With this next release, Premiere Pro expands on its exceptional support for UltraHD, 4K and beyond workflows with new, native support for HEVC (h.265), DNxHR, and OpenEXR media, for both encode and decode, allowing editors to edit and deliver any format they need to.

When I've made mockups of Final Cut Pro X running on an iPad Pro people have asked why pros would want to edit on an iPad.

Interestingly for Microsoft Surface and iPad Pro fans, Adobe doesn’t consider a touch interface a sign of software for non-professionals:

Premiere Pro will let you build up your edit in new and tactile ways, by providing touch support for Windows hybrid touch devices like the Microsoft Surface Pro, and improved gestural support using Apple Force Touch track pads. Use multi-touch in the Assembly workspace for pinch to zoom to make your media clips big and easy to work with, then easily reorder them for storyboarding, play back and scrub right on the icons with your finger, tap to mark in and out points and drag straight to a sequence. Or, drag to the Program Monitor, where a new overlay will appear to allow you to drop into different zones to perform various standard kinds of edit. And on Apple Force Touch track pads, get haptic feedback when snapping and trimming in the timeline.

[Image: Premiere-Touch-Edit]

 

 

iPad Pro demo?

I thought that iMovie 4K would be a great demo application for the iPad Pro at tomorrow’s Apple event. Perhaps third-party developers would be more inspired by Adobe software running on the new device. That was the right thing to do at the WWDC earlier this year. Maybe Adobe will be on stage tomorrow…

For screenshots and more information go over to the Premiere Pro blog.

Final Cut Pro X dynamic range and colour gamut: Watch out for clipping

Tuesday, 08 September 2015

Now that those promoting UHD are starting to talk about High Dynamic Range and Wide Colour Gamut, it is worth considering what Final Cut Pro X does with brightness and colour information internally.

Here is a still from some footage from EditStock.com - the site that shares rushes from all sorts of productions so people can practise editing and post production - and the scopes showing its range of colour and brightness:

dr-cg-1

This still shows a good range of colour from 0-100 for red, green and blue as well as brightness (luma) from 0 to 100 - despite the source movie being an H.264-encoded QuickTime movie.

If I apply a Color Board colour correction I can desaturate it and make it darker:

dr-cg-2

Or I can make it more saturated and make the colours more intense:

dr-cg-3

The term 'clip' here means that pixels cannot be made any darker than black or brighter than white: values that would fall outside that range are cut off.

The question here is what each effect considers black and white. Does its working range run from -20 to 120, or from 0 to 100?

The result of each of the Color Board corrections shows values below 0 and above 100, so colour corrections work with a range wider than 0-100. So if both these corrections are applied at the same time…

dr-cg-3a

…the result is that Final Cut first darkens and desaturates the clip so some pixels have brightness and colour levels below 0 (as shown above in the second image) and then brightens them back up:

dr-cg-4

If brightness is stored as a value between -20 and 120, imagine two pixels that start off with brightness values of 20 and 15. The first correction - the one that darkens the clip - might change these values to -4 and -6, making them so close to fully black as to be indistinguishable from black by eye, and certainly impossible to tell apart from each other. The second correction, applied on its own, would change them to 55 and 62, making both pixels brighter.

When Final Cut applies both corrections - 20 is first made darker to -4, then -4 is made brighter to 21 - the pixel almost reverts to its original brightness value.
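Here is the same arithmetic as a small Swift sketch. The two corrections are stood in for by simple offsets - placeholders, not Final Cut’s actual maths - but they show why a wide working range keeps the two pixels distinguishable:

```swift
// Placeholder corrections: darken then brighten, working in a wide range.
let darken: (Double) -> Double = { $0 - 24 }    // hypothetical first Color Board correction
let brighten: (Double) -> Double = { $0 + 25 }  // hypothetical second Color Board correction

let original: [Double] = [20, 15]
let afterBoth = original.map { brighten(darken($0)) }
print(afterBoth)  // [21.0, 16.0] - close to the starting values, and still 5 apart
```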

The catch is that some Final Cut effects use a smaller brightness range than others.

That means if I apply a Gaussian blur effect to the output of the first Color Board correction and then apply the second colour correction, there is a problem: the Gaussian blur only works with brightness values between 0 and 100, so any values the first correction pushed below 0 are seen by the blur as 0.

dr-cg-4a

Using the pixel values from before, the Gaussian blur takes the -4 and treats it as 0. It also clips the -6 to 0. After the blur is applied, the brightness values passed on to the next effect might be the same: 0 and 0.

These are then passed on to the second Color Board correction, which brightens both ‘black’ pixels to 35. The difference in brightness between the two pixels has been lost; there’s no way for the second correction to get it back.
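The same sketch with a clamp in the middle - standing in for any effect, like the blur, that only works between 0 and 100 - shows how the difference between the two pixels disappears (the offsets are still placeholders, not Final Cut’s real values):

```swift
// Placeholder corrections plus a clamp standing in for a 0-100-only effect.
let darken: (Double) -> Double = { $0 - 24 }
let brighten: (Double) -> Double = { $0 + 25 }
let clampTo0to100: (Double) -> Double = { min(max($0, 0), 100) }

let clipped = [20.0, 15.0]
    .map(darken)          // 20 -> -4, 15 -> -9: both now below 0
    .map(clampTo0to100)   // both clamped to 0: the difference between them is gone
    .map(brighten)        // both end up at 25: indistinguishable from each other
print(clipped)            // [25.0, 25.0]
```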

Less range in the Gaussian blur effect means all the detail in the temporarily dark parts of the clip it receives is lost:

dr-cg-5

You can see that there are no pixels with a Luma value of less than 35. The various brightness values between dark grey and black in the original clip were all made black by the Gaussian blur effect. When the second correction made those black pixels brighter, all the detail in the darker parts of the frame was lost.

Check the result of effects with the video scopes

This means when you apply effects to clips in Final Cut, it is worth checking the video scopes to see what the effects do to the brightness and colour values of your footage. It is often worth changing the order of effects to make sure you don’t lose dynamic range.

In this case, moving the Gaussian blur effect to before both colour corrections prevents the clipping:

dr-cg-5a

I could have also moved the blur to after the second correction.
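Reordering the same placeholder pipeline shows why this works: when the clamping effect runs first, no value is below 0 at that point, so nothing is thrown away:

```swift
// Same placeholder pipeline, with the 0-100-only effect moved to the front.
let darken: (Double) -> Double = { $0 - 24 }
let brighten: (Double) -> Double = { $0 + 25 }
let clampTo0to100: (Double) -> Double = { min(max($0, 0), 100) }

let reordered = [20.0, 15.0]
    .map(clampTo0to100)   // blur first: 20 and 15 pass through untouched
    .map(darken)          // 20 -> -4, 15 -> -9 (wide range, so nothing is clamped here)
    .map(brighten)        // 21 and 16: the 5-point difference survives
print(reordered)          // [21.0, 16.0]
```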

HDR and WCG precision

High Dynamic Range and Wide Colour Gamut are about being able to encode a wider range of brightnesses and colours. They also require more precision: being able to distinguish the smaller brightness differences between pixels. That’s where ‘bit depth’ comes into HDR and WCG specifications. 8 bits for brightness can store 256 values between 0 and 1. 10 bits can distinguish four times as much detail: 1,024 values between 0 and 1.
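A quick way to see the difference in precision is to compare the smallest brightness step each bit depth can represent, using normalised values between 0.0 and 1.0:

```swift
// 8-bit video stores 256 levels; 10-bit stores 1,024 levels.
let step8bit  = 1.0 / 255.0   // ≈ 0.0039 between adjacent levels
let step10bit = 1.0 / 1023.0  // ≈ 0.00098 - roughly four times finer steps
print(step8bit / step10bit)   // ≈ 4.0
```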

NLEs like Final Cut and Premiere can handle HDR, WCG and high-precision codecs (such as ProRes 4444 XQ) internally; the next stage for post is finding accurate ways of representing this precision in codecs designed for distribution to viewers.

What’s good for post production is good for the rest of Apple

Monday, 07 September 2015

When industry analysts try to come up with things Apple can do with their money, some suggest they buy companies like Disney, Spotify or Tesla.

apple-should-buy

In practice Apple buys companies for their underlying technology. 

Reuters reports that Apple are buying up companies to support their efforts to make iPhones, iPads, Apple TVs, Apple Watches and Macs smarter:

"In the past, Apple has not been at the vanguard of machine learning and cutting edge artificial intelligence work, but that is rapidly changing,” he said. “They are after the best and the brightest, just like everybody else.”

Acquisitions of startups such as podcasting app Swell, social media analytics firm Topsy and personal assistant app Cue have also expanded Apple’s pool of experts in the field.

ProApps users have benefitted from Apple shopping sprees recently. Logic Pro X 10.2 now includes Alchemy, a synthesiser Apple acquired with Camel Audio. Final Cut Pro X can now deliver TV programmes to UK broadcasters because Apple acquired MXF export software from Hamburg Pro Media.

If post production needs can be met by software and patents that would benefit other Apple products and services, so much the better.

Two candidates

While we wait for a professional audio application designed at its core for post production - which could be named ‘Soundtrack Pro X’ - Final Cut users have been raving about some specialised plugins from iZotope. Their Advanced Post Production bundle can fix audio problems previously thought impossible to solve, and can also create standards-compliant audio for use by broadcasters and distributors.

Imagine if Apple bought the services of their people along with their algorithms, patents and products. As well as working well in Final Cut, Logic and iMovie productions, deep knowledge of how to remove reverb, echoes and distracting noise from recordings would be very useful for the parts of iOS and OS X that interpret audio instructions and environments.

It almost seems as if iZotope have re-organised themselves so as to prepare for an offer from Apple. From their August 18th press release:

iZotope, Inc., a leading audio technology company, today announced its strategic decision to divide the current iZotope product line into two distinct families of products, one focused on Music Production and the other on Audio Post Production.

Another useful ability for post production software and for operating systems would be the ability to analyze large amounts of audio. Step up Nexidia.

They have products that can search for specific phrases in hours of media, do quality control on captions, video description channels and languages, and automatically align captions to specific speakers.

Nexidia’s Dialogue Search enables content producers, owners, and consumers to type any combination of words or phrases, and in only seconds, find and preview any media clip where those words or phrases are spoken—independent of any captions, transcript, or metadata.

As well as supporting productions with terabytes of data to search through for creative reasons, Nexidia software could be used by Apple to interpret petabytes of video stored on iCloud - anonymised of course. The more advanced tools Apple has to understand audio, the better.

Which companies, products or services do you think you could persuade Apple to buy?
