Apple WWDC 2016 Announcements and Post Production
Tuesday, 14 June 2016
Every year at their Worldwide Developers Conference, Apple presents some of their plans relevant to software and hardware developers at a keynote presentation. Here are my notes and links from the 2016 keynote.
The main screen and the webcast stream didn’t have the normal 16:9 ratio. It was wider, at the Cinemascope ratio of 2.40:1. Could this be a hint that a future Apple-branded display will have a 21:9 (2.33:1) aspect ratio?
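A quick check of the arithmetic behind that speculation (a trivial sketch, just to compare the ratios):

```python
# Aspect ratios as width-to-height values.
cinemascope = 2.40      # the ratio of the keynote screen and stream
ultrawide = 21 / 9      # a 21:9 display ratio

# 21:9 works out to roughly 2.33:1 - close to, but not exactly, 2.40:1.
print(round(ultrawide, 2))                  # → 2.33
print(abs(cinemascope - ultrawide) < 0.07)  # → True
```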
As iOS will reach version 10 this Autumn and OS X has been around for over 16 years, Apple will now rename their Mac operating system to macOS. The next version will be macOS Sierra, version 10.12. This renaming will make Final Cut Pro, Logic and iMovie stand out as being part of an older naming scheme.
There’s a chance that iMovie will become ‘Movies’ for iOS and macOS - following on from how iPhoto became Photos. An alternative is that productions started in iMovie will be edited in macMovie and then be openable by macFCP while the soundtrack is modified in macLogic. More likely is that Final Cut and Logic will simply drop their X suffixes.
Siri for macOS means that Macs can be controlled by voice, as iOS devices can be today. SiriKit for iOS 10 gives a limited set of third-party applications the option to be controlled by Siri.
If SiriKit were introduced to macOS, the ProApps team would have the option to add much more voice control to their apps. This would be especially useful for finding clips based on keywords and other metadata. As well as answering requests such as “Show me clips in the browser with the ‘Interviews’ keyword” or “Show me clips in the timeline with dual mono,” Siri also understands context: “Show me interview clips… show me those with dual mono” will show only interview clips with dual mono - not one selection of interview clips followed by all clips with dual mono.
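As an illustration of that context-carrying behaviour, a hypothetical sketch in Python (none of these structures come from Apple’s APIs): the second query filters the current selection rather than the whole library.

```python
# A toy clip library - names, keywords and audio configuration are invented.
clips = [
    {"name": "A-cam 1", "keywords": {"Interviews"}, "audio": "dual mono"},
    {"name": "A-cam 2", "keywords": {"Interviews"}, "audio": "stereo"},
    {"name": "B-roll 1", "keywords": {"Exteriors"}, "audio": "dual mono"},
]

# "Show me interview clips..."
selection = [c for c in clips if "Interviews" in c["keywords"]]

# "...show me those with dual mono" - because the context is retained,
# this narrows the previous selection instead of searching all clips.
selection = [c for c in selection if c["audio"] == "dual mono"]

print([c["name"] for c in selection])  # → ['A-cam 1']
```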
Although there are many different ways of asking for the same thing, these are interpreted by Siri and passed to the target app in a standard way. This kind of automation would work well with scripting, and Apple has released a new guide on that subject: The Mac Automation Scripting Guide. There are no hints yet that scripting will be added to iMovie or Final Cut.
For now, SiriKit for third-party iOS apps will only be used for a limited set of tasks, including:
- Audio or video calling
- Searching photos
- Ride booking
WWDC 2016 session on SiriKit.
New Photos features useful for video
Photos for iOS 10 and macOS Sierra will have a couple of new features of interest: more advanced content recognition and the automatic generation of ‘Memories’ videos.
As well as recognising all photos with a specific person, Photos will also recognise other kinds of content. This means that photos can be grouped based on the content detected. Examples include photos featuring beaches or horses, or photos shot in fields. This kind of automatic categorisation will be very useful for iMovie/Final Cut users - especially when clips are very long. The content recognition should be able to mark only the time in a long shot when a certain person or object appears.
Using this image recognition technology, Photos will also be able to generate ‘Memories.’ A Memory can look like a web page or publication on a subject. Memories can include videos made up of automatically animated photos. If users want to change the mood of a video, they can choose a new soundtrack and the story will be re-generated to match the music.
Will these video Memories be modifiable in iMovie or Final Cut Pro X? It would be a very quick way to get new people into making movies. The same technology could be used to make automatic videos from selected clips in a video library.
Apple have found a way of using information from millions of Apple users to power services without compromising any specific individual’s privacy. ‘Differential Privacy’ is a mathematical method that ensures privacy when sharing data from millions of people.
Specific mathematical equations define a precise amount of ‘noise’ to add to a single piece of data. This noise makes the original data associated with a specific person impossible for anyone - including Apple - to decode. The trick is that when hundreds of thousands of pieces of unbreakably encoded data are combined, there are statistical measures that can detect trends amongst all the results. Apple will have no way of knowing what an individual value was, but will have an accurate representation of the distribution of all the original values over a large population.
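One classic differential privacy mechanism - ‘randomised response’ - shows the idea. This is an illustrative sketch, not Apple’s published method: each individual answer is deniable, yet the population-wide rate can still be recovered.

```python
import random

def randomised_response(truth: bool) -> bool:
    """Report a true/false fact, but flip a coin first for deniability."""
    if random.random() < 0.5:
        return truth                 # half the time: answer honestly
    return random.random() < 0.5     # otherwise: answer completely at random

def estimate_true_rate(reports) -> float:
    # Reported "yes" rate = 0.5 * true_rate + 0.25, so invert that formula.
    observed = sum(reports) / len(reports)
    return (observed - 0.25) / 0.5

random.seed(1)
# A simulated population where 30% truly answer "yes".
population = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomised_response(p) for p in population]

print(round(estimate_true_rate(reports), 2))  # close to 0.3
```

No single report reveals what its sender’s true answer was, but the aggregate estimate is accurate to within a fraction of a percent for a population this size.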
This is how Apple is able to use the large amount of private information it has access to in order to provide intelligent services. A foundational mathematical paper on the subject: “The Algorithmic Foundations of Differential Privacy.”
Messages and iMessage Apps
Messages in iOS and macOS will get a big upgrade this year. Apple will provide a range of stickers and animations that people can use in conversations. For example, ‘Invisible Ink’ will make an image blurred until each person in the conversation swipes over the picture. People will also be able to annotate other people’s messages and pictures, and to add animation to speech bubbles, emoji and pictures.
As well as Apple-supplied animations and effects, third-parties will be able to make iMessage Apps to do more with messages.
I hope Apple define a new graphic and animation file format for Messages that can be applied in other applications, such as Photos, Keynote, iMovie and Final Cut Pro. A metadata-driven format will display differently depending on the device showing the graphics. This will be useful when videos are made up of objects: video clips, images and metadata that tells the playback compositing software how to present the story.
If Apple start presenting Messages as a place for ad-hoc group-based collaboration (for play or for work), there should be a place for video.
Recording and playback of multiple simultaneous video streams
Created for those who want to record on-screen gameplay for later sharing online, ReplayKit for iOS now adds simple live streaming, plus the ability to record the players themselves commentating, using a front-facing camera. It also defines a standard UI for viewers to switch between ‘angles’ in a playback stream whenever they want.
A new file system: APFS
The Apple File System (APFS) is designed for modern storage devices. The current file system - HFS+ - is based on a design from the era of floppy discs, while APFS is designed for Flash/solid state memory. HFS+ is known to degrade over time - normal day-to-day usage can result in files getting lost. APFS is designed for recoverability: it will be much easier to get at ‘deleted’ data, and it will handle backups much more smoothly.
As with Final Cut Pro X projects, the state of whole drives or parts of drives can be captured in a Snapshot.
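The copy-on-write idea behind such snapshots can be sketched in a toy Python model (this bears no resemblance to APFS’s actual on-disk structures): a snapshot records references to existing data, and later writes replace those references rather than the data, so the snapshot stays cheap and intact.

```python
class Volume:
    """A toy volume: filenames map to content 'blocks'."""

    def __init__(self):
        self.files = {}
        self.snapshots = []

    def write(self, name, content):
        self.files[name] = content

    def snapshot(self):
        # Copying the dict copies references to the content, not the
        # content itself - which is why snapshots are cheap to take.
        self.snapshots.append(dict(self.files))

vol = Volume()
vol.write("edit.fcpbundle", "version 1")
vol.snapshot()
vol.write("edit.fcpbundle", "version 2")   # the snapshot is unaffected

print(vol.files["edit.fcpbundle"])         # → version 2
print(vol.snapshots[0]["edit.fcpbundle"])  # → version 1
```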
A new file system doesn't mean a new Finder. It means that applications that spend most of their time manipulating files - like the Finder - will need to be updated to understand the new ways of organising documents and applications on storage devices.
Important: APFS is released as a Developer Preview in macOS Sierra (10.12), and is scheduled to ship in 2017.
Wide colour in iOS and macOS
The new Wide Color system framework will add wide colour gamut image capture and manipulation to iOS and macOS. Following on from its introduction in recent iMacs and iPad Pros, Apple have settled on the DCI-P3 gamut - the standard colour space used to specify the colours used in US cinema projection. Some think Adobe RGB would have been a better choice.
Sharing private data via CloudKit
You use the CloudKit native framework to take your app’s existing data and store it in the cloud so that the user can access it on multiple devices.
Currently any data that is stored in the cloud using Apple’s CloudKit framework is either public or private. This year CloudKit in Apple OSs will add the ability for iCloud users to share data amongst themselves.
This would be very useful for post production applications. For example Final Cut could upload proxy versions of all media (or media used within a specific project) so that collaborators would be able to have a live timeline to work with.
QuickTime in Apple OSes
QuickTime as a container for video and audio files has a great future. The AVFoundation framework is the basis of Apple software that records, manipulates and plays QuickTime documents (amongst other file formats).
QuickTime the software framework is deprecated in macOS. This means that applications that use the QuickTime API will still work in macOS Sierra (10.12), but may not work in a future version. There is no way yet to know whether Final Cut Pro 7 will work in macOS Sierra, but my guess is that it probably will.
As part of building applications, Xcode - Apple’s development system - checks to see if the code uses old or deprecated OS features. It uses an API Diffs file to look at all the code. The QuickTime part shows that the API headers have been removed. The AVFoundation part shows that a lot has been added.
QuickTime the API has been deprecated for a while. Removing the headers means that applications can no longer be compiled if their code uses the old API. Applications already compiled on older OSes will still work in macOS Sierra.
Once again, the file format lives on. The part of Apple’s OSes that manipulates media - QuickTime - will eventually be replaced by AVFoundation. This shouldn’t be a problem for Mac users of old applications for now, but remember that one day those applications will not work in a future version of macOS.
Apple and the future of media
Apple didn’t make any announcements directly relevant to post production. There was no mention of Retina displays, 4K, VR or 360° video.
On the other hand they laid some interesting foundations for collaboration. One day we might look back at this week and see elements vital to a new product or service introduced in coming months and years.
I'm looking forward to seeing what happens next.