Apple’s patent for applying effects to clips with specific roles

Tuesday, 07 June 2016

The name of patent 9,240,215 may be ‘Editing operations facilitated by metadata,’ but it is about applying effects to roles in Final Cut Pro X:

For example, several clips may be assigned one audio role of "Dialog", "Music", or "SFX". A process then provides one or more user interface controls. These user interface controls are also associated with the tagged clips. That is, the user interface controls are associated so that these controls can be used to display or modify properties of the tagged clips.

PDF version.

 

Apple’s structure editing patent

Tuesday, 07 June 2016

While editors wait for the next big Final Cut Pro X update, I hope the Apple ProApps team will implement some of the ideas in their ‘structure editing’ patent. Here’s my old writeup of the patent they applied for in 2009 on fcp.co:

Most people think that the editor’s job is ‘to cut out the bad bits’ in individual scenes. Many are surprised to discover that editors commonly improve storytelling by changing story structure. As many film and TV makers consider structure very important when it comes to telling stories, I think it is a good idea for video editing software to recognise story structure.

Structure applies to feature films, TV shows, groups of corporate videos on an intranet, legal video depositions, architects’ video proposals or open-ended weekly web series. The more video applications can have these structures encoded in their projects, the better the tools they’ll be able to provide to a wider range of people all over the world.

Introduction to VR Video with Final Cut Pro X

Tuesday, 31 May 2016

At the FCP Exchange event at NAB in April, Tim Dashwood and I gave a presentation on working with VR 360° video in Final Cut Pro X and Motion.

Initially I explained the art of spherical video from first principles, comparing it with VR apps. I showed how editors can use specialised tools that understand 'equirectangular' video, effects and graphic overlays to tell stories that play out all around the viewer.

I also explained how editors can share their work with millions of smartphone users around the world. Tim Dashwood then gave a quick rundown of the science of high-end VR video effects that are available for Final Cut Pro X today.

 

360° Virtual Reality with FCPX from FCPWORKS on Vimeo

FCP Exchange is a series of free industry seminar days presented by FCPWORKS and fcp.co

Dashwood 360 VR Toolbox and 360 VR Express.

Sound Design Lessons for VR Video from VR Games

Friday, 27 May 2016

VR Video tools for video editors have progressed quickly in the last year, but there has been less discussion about the audio side of VR video. Although VR audio tools have yet to be integrated into NLEs, audio experts (and video editors who spend much of their time refining their soundtracks) should consider how audio design is different for VR.

Those designing audio for VR games are probably further along in coming up with what makes VR different. 

At a mini conference on game audio earlier this year, developer Gordon McGladdery gave a presentation on audio for VR games.

His game Fantastic Contraption is one of those given away with each HTC Vive (a VR headset that detects where you are in a room for ‘room-scale’ VR), and he has worked on the sound for VR commercials.

He spoke with Matthew Marteinsson on episode 25 of the ‘Beards, Cats and Indie Game Audio’ podcast about VR audio. Here is a summary of some of what was said:

[7:07] Binaural audio is very important - without it, experiencing VR is ‘like watching a 3D movie without the glasses on.’

[7:33] Music score doesn't work in VR games - it ‘muddies everything up’ [The music is] ‘coming from nowhere in the world and just seems to cloud the entire immersion.’

[10:17] Everything matters. The current video game sound design orthodoxy is that some sounds are more important than others; time and budget determine a well-known order of priorities when it comes to sound design. In VR, everything shown in a game that could make a sound must have audio.

[14:21] Even if your target VR platforms don't have advanced audio, incorporating advanced audio future-proofs your current productions.

[15:05] ‘A lot of what we do here is to design right up until the end.’ It is important to design your sound workflow so that if the design of the game changes near launch (the equivalent of a new edit of a film), ‘we as audio can quickly move with it.’

[16:14] ‘Distance falloff is really finicky’ - pay close attention to sound volume based on position - ‘none of the defaults work.’ Different sound sources have different falloff curves; some objects need to be heard from further away. You need a different curve for every object, based on what the story needs rather than realistic sound physics (sketched after these notes).

[23:33] ‘Dynamic range is back - we're not crushing everything any more’ - adding heavy compression doesn't work - it just makes everything loud. ‘VR audio will be pretty uncompressed.’ Prepare for the fact that different audio soundtracks work for different playing environments. Most VR experiences will be in quiet environments, but some will be in noisy places - which will need compression to punch through.
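To make the falloff point concrete, here is a minimal sketch in Swift - my own illustration, not anything from the podcast or a particular game engine - of giving each sound source its own falloff curve instead of relying on one default:

```swift
import Foundation

// Each source gets its own curve, tuned to how far away the story needs
// it to be heard - not to realistic acoustics.
struct SoundSource {
    let maxAudibleDistance: Double  // metres; beyond this the source is silent
    let falloffExponent: Double     // 1 = linear fade, >1 = faster initial drop

    func gain(atDistance distance: Double) -> Double {
        guard distance < maxAudibleDistance else { return 0 }
        return pow(1 - distance / maxAudibleDistance, falloffExponent)
    }
}

// A whispering character must stay audible across the room;
// a ticking clock should vanish after a couple of metres.
let whisper = SoundSource(maxAudibleDistance: 10, falloffExponent: 1)
let clock = SoundSource(maxAudibleDistance: 2, falloffExponent: 3)
print(whisper.gain(atDistance: 3))  // ~0.7 - still clearly audible
print(clock.gain(atDistance: 3))    // 0.0 - already out of range
```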

Listen to the rest of the podcast to hear Gordon’s take on VR use outside the world of games, as he is getting more non-game work due to his VR audio skills.

BBC TV production change could mean more workflow stories

Tuesday, 22 March 2016

Proponents of Final Cut Pro X and Adobe Premiere are frustrated when editors of high-end TV and features say "I don't know anyone who doesn't use Avid."

We value case studies showing what alternative applications can do. In the UK, the BBC is not allowed to publicise its workflows - used on TV shows that are world-famous - because, being publicly funded, it is not allowed to promote one commercial supplier over others. All we can do is gather indirect information, such as tweets by those working with and for the BBC.

The flow of BBC production stories may increase soon: they are moving the majority of their TV production to a commercial division on April 29. This is the way commercial TV in the UK and elsewhere works. In a year ‘BBC Studios’ will also make shows for other broadcasters and networks.

If I was promoting Final Cut Pro X and Adobe Premiere workflows, I'd start preparing case studies now.

Dual lens iPhone 7 = Multi-angle QuickTime files

Thursday, 10 March 2016

The day after the announcement of a new iPhone, the speculation starts for what might appear in the next iPhone. This speculation is based on Apple patents, acquisitions, what parts suppliers can produce and what Android phones have had ‘for years.’

This week’s speculation from MacRumors suggests that the next iPhone will have a dual lens camera. Such a camera would be able to capture a normal shot and a close up shot at the same time:

Amid rumors a dual-lens camera will be introduced in the iPhone 7, Apple recently submitted a patent application published in January which gives us rare insight into what Apple thinks a dual-lens camera interface could look like on future iOS devices.

The patent outlines a dual-camera system that consists of one standard wide-angle lens similar to what's in the iPhone today and a second telephoto lens capable of capturing zoomed-in video and photos.

Apple’s iPhone is a combination of hardware and software, and the interesting part for me is the software. As with slow motion recording, there are software implications. Dual lenses mean:

  • A user interface for capturing two video ‘angles’ at once (with a single shared soundtrack - unless audio from multiple microphones is also captured)
  • Storing more than one video stream in the same QuickTime file (sketched below)
  • A user interface for marking points where playback should switch from footage recorded with one lens to footage recorded with the other
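On the second point: AV Foundation can already write QuickTime files with more than one video track. Here is a hedged sketch - standard AVAssetWriter calls, with the file path and settings as placeholder assumptions - of how a dual-lens capture app might set up such a recording:

```swift
import AVFoundation

// A sketch of creating a QuickTime file with two video tracks (one per lens)
// and a shared audio track, using the standard AVAssetWriter API.
func makeDualAngleWriter() throws -> AVAssetWriter {
    let writer = try AVAssetWriter(
        outputURL: URL(fileURLWithPath: "/tmp/dual-angle.mov"),  // placeholder path
        fileType: .mov)

    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ]

    let wideAngle = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    let telephoto = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    let sharedAudio = AVAssetWriterInput(mediaType: .audio, outputSettings: nil) // pass-through

    for input in [wideAngle, telephoto, sharedAudio] {
        input.expectsMediaDataInRealTime = true  // capture scenario
        writer.add(input)
    }
    // Capture code would then call startWriting(), startSession(atSourceTime:),
    // append sample buffers from each camera to its input, and finishWriting.
    return writer
}
```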

Multicam for ‘the rest of us’

Consumer-level multicam has implications for the next versions of iMovie for iOS and OS X and Final Cut Pro for OS X (and iOS?). As well as being able to handle multi-video track QuickTime files, they might provide features to ease users into multi-angle production.

iMovie would need a command and shortcut to add metadata to a multilayer clip to say ‘switch to other layer here.’ That would be a useful shortcut to add to Final Cut Pro X as well. Currently, editors working with two-angle multicam clips must alternately use ‘Cut and Switch to Viewer Angle 1’ and ‘Cut and Switch to Viewer Angle 2’ depending on which angle is active.

Multi-track QuickTime is back

Once consumers get used to multi-stream video files, they might start expecting that multiple devices at the same location should be able to contribute to a single QuickTime record of an event. As long as all devices have an iCloud account, Apple could provide the synced file for all to share with each other and others.

More and more professional production uses multiple cameras. Final productions will probably include multiple video assets that are shown depending on playback settings. This means multicam user interfaces for production and playback alongside multiple video layers being stored in single movie files.

Recording devices will also probably encode multiple video angles into movies. Already on Convergent Design’s Apollo switcher/recorder product page:

exports separate Apple ProRes files with matching timecode or a single multi-stream QuickTime file that drops directly into the timeline of supporting NLEs such as FCP-X.

How interesting.

Last year’s Apple WWDC had a relevant session on editing movies using AV Foundation for iOS and OS X developers:

There are methods for creating and removing tracks, and you see to create a track, we have to say what type of track we want.

Do we want a video track, do we want an audio track, and so forth…

Features seem to come from Final Cut Pro X:

We can now open these, edit them, and write them back. At the track level, we have a similar setup. As you know, a composition is composed of composition tracks, and at the mutable level, we have AVMutableComposition tracks.

Media doesn't need to be in the same file:

Now, it's possible for the sample data that a track refers to exist in another file altogether, so you can have external sample references. It's even possible for the sample references to refer only to external sample data. Now, when we have this situation, the little movie box and its file type box is called a sample reference movie file.
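The track-level API the session describes looks like this in practice - a minimal sketch using real AVMutableComposition calls, with the asset path as a placeholder:

```swift
import AVFoundation

// A sketch of the track-level API described above: an AVMutableComposition
// with typed tracks whose sample data can live in other files.
func makeTwoAngleComposition() throws -> AVMutableComposition {
    let composition = AVMutableComposition()

    // "…to create a track, we have to say what type of track we want."
    let angle1 = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    _ = composition.addMutableTrack(withMediaType: .video,
                                    preferredTrackID: kCMPersistentTrackID_Invalid)

    // The track references sample data in another file - the 'sample
    // reference' idea from the session - rather than copying it.
    let asset = AVURLAsset(url: URL(fileURLWithPath: "/tmp/angle1.mov"))  // placeholder
    if let source = asset.tracks(withMediaType: .video).first {
        try angle1?.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                    of: source, at: .zero)
    }
    return composition
}
```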

Coming soon to OS X and iOS applications?

Final Cut Pro X at RTS Swiss National Television and the future of post production consultancy

Tuesday, 08 March 2016

A new case study shows how collaborative storage systems are coming into their own for TV stations using Final Cut Pro X. The team behind the implementation demonstrates how different the Final Cut high-end post production consultancy ecosystem is from the ‘left over from the 20th Century’ establishment. 

Ronny Courtens and the implementation team have written a detailed post at fcp.co:

[National Swiss TV broadcaster RTS] needed a 100 TB effective and fully expandable enterprise NAS system with high redundancy and high-availability for 24 client connections. 12 connections over 10Gig SFP+ to their existing fiber channel network for the editing stations and the audio and ingest stations. And 12 connections over Gigabit Ethernet to their existing Cat6 network for extra ingest, titling and graphics machines, Open Directory and system admin.

...

all editing and ingest clients must be able to perform high-speed file transfers to the server without affecting the sustained Read bandwidth of any editing station. This is one of the biggest problems most NAS systems will face in this kind of setup. During the tests they did with the previous systems, bandwidth dropped considerably and brought the editing systems to a halt as soon as one of the Studios or Outside Broadcast trucks started streaming live multicam footage onto the server over the high-speed network. Or even when one of the editing stations did a simple export over 10GigE.

‘Mystery is Margin’ no longer

As well as the detailed technical story of the solution, the article includes the business story. On reading their case study about factual TV production in Denmark, RTS engineers went to a freelance workflow consultant and his colleague for help. Ronny Courtens and Anouchka Demeulenaere proposed a new solution from LumaForge, an LA-based company.

It can’t be very often that a national TV station goes to a pair of freelance workflow consultants with no website or Twitter account. One of the videos embedded in the case study contrasts the old method of post consultancy (‘Mystery about how things work means Margin for us’) vs. the modern (‘Let’s work this out together’).

After trying to get one system working:

Finally they sent us a tech guy who started writing in the Terminal without explaining anything.

Compare that with:

The guys from LumaForge came in and Eric explained the entire system to us. 

Freelancers to the rescue - A new job for the 2010s?

Despite wanting post professionals to take a look at Final Cut Pro X, Apple have shown little interest in fitting into the economics of post production consultancy. They might help with proposals and engineering support, but they don’t make it easy for people to make money out of installing and maintaining Final Cut Pro X at the high end. The software is too cheap and easy to buy, and there is very little margin in Apple hardware.

There used to be money in multiple training courses for staff. Recently a person from another broadcaster - responsible for ensuring that hundreds of journalists and camera people keep their skills up to date - told me that they toured the many newsrooms full of Final Cut Pro X a few months after initial training. He asked them if they wanted any more training. They all said they didn't need any: they were happy to find all the answers they needed by going to the internet.

It could be that some post consultancies don’t recommend Final Cut Pro X because they can’t make enough money on those installations. They have expensive offices, salespeople and teams of engineers to support. 

In the case of Metronome in Denmark and RTS in Switzerland it wasn’t one of the big companies that provided the solution. It was Ronny Courtens and Anouchka Demeulenaere. They found a way of delivering a solution and making enough money to justify their time.

Ronny says:

We are not even consultants or integrators. The projects we get come from people whom we have known for years in the industry, or from people we know from the forums and groups. So we don't need a website or Twitter, nor do we need a large team. We just make the contacts, we analyze the issues and then we team up with people we think will be able to help us provide solutions.

Usually we don't charge anything for a first meeting, no matter where it is. We are always interested to discover new companies and workflows.

Things have indeed changed a lot lately.

The FileMaker model

That might be an interesting model to take to Apple. The Pro Apps team can’t get the rest of Apple too excited about helping a few thousand high-end post people make TV shows and feature films more easily. That doesn’t match Apple’s aim of empowering people and “leaving the world better than we found it.” What if the Pro Apps team proposed that they support thousands of freelance post consultants in introducing video to businesses and organisations of all sizes all over the world?

They do something very similar to this with their FileMaker database product and freelance community. Go to their website now and imagine the word FileMaker replaced with Final Cut Pro X. Where you see ‘database developer’ imagine ‘post workflow consultant’ instead. See software that can be bought and rented, where workflow tools work on Macs, servers and iOS devices. Also discover how much Apple promote and involve freelance developers with third party tools.

Making Final Cut Pro X a platform like FileMaker would help Apple truly revolutionise the future of video for businesses and organisations everywhere.

Final Cut Pro X: Rate Conform

Wednesday, 16 December 2015

When working with video clips that have frame rates that are close to being a multiple of the timeline frame rate, but not quite, Final Cut Pro X sometimes speeds them up or slows them down.

When this happens, you will be able to see a section in the inspector showing that its frame rate has been conformed:

tip-fcpx-rate-conform

This shows that a clip that normally runs at 25 frames per second will be slowed down so that it plays at 23.976 frames per second. Its playback speed on the timeline will be 95.904%. This means one frame of the clip will be displayed for one frame of the timeline.

Here are the rate conforms that Final Cut Pro X does automatically: 

Clip frame rate | Timeline frame rate | Automatically plays back at | % of original duration | Speed
23.976 | 24p | 24 | 99.90% | 100.10%
23.976 | 25p/i | 25 | 95.90% | 104.27%
23.976 | 50p | 25 | 95.90% | 104.27%
24 | 23.98p | 23.976 | 100.10% | 99.90%
24 | 25p/i | 25 | 96.00% | 104.17%
24 | 50p | 25 | 96.00% | 104.17%
25 | 23.98p | 23.976 | 104.27% | 95.90%
25 | 24p | 24 | 104.17% | 96.00%
29.97 | 30p | 30 | 99.90% | 100.10%
30 | 29.97p/i | 29.97 | 100.10% | 99.90%
30 | 59.94p | 29.97 | 100.10% | 99.90%
50 | 23.98p | 47.952 | 104.27% | 95.90%
50 | 48p | 48 | 104.17% | 96.00%
59.94 | 60p | 60 | 99.90% | 100.10%
60 | 29.97p/i | 59.94 | 100.10% | 99.90%
60 | 59.94p | 59.94 | 100.10% | 99.90%

This is a full listing of the combinations where Final Cut Pro automatically changes the speed of a clip. For any other frame rate combinations Final Cut will drop or repeat frames so that the source clip seems to play at its original speed.

For example if you add a 30p iPhone video to a 25p timeline, Final Cut will skip some of those frames every second so the playback speed remains the same and the duration stays the same: a 3 second 30p clip will take 3 seconds to display in a 25p timeline. If that same 30p clip was added to a 48p timeline, then Final Cut will repeat some frames so the playback speed will remain the same: the 3 second 30p clip will display for 3 seconds on a 48p timeline.
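The arithmetic behind the table can be sketched in a few lines - my own illustration of the rule, not Apple's code. When a clip is conformed, each source frame is shown for exactly one timeline frame, so speed is the ratio of the playback rate to the clip's native rate, and duration scales by the inverse:

```swift
// One source frame per timeline frame: speed = playbackRate / clipRate,
// and the clip's duration scales by the inverse of the speed.
func rateConform(clipRate: Double, playbackRate: Double)
    -> (speedPercent: Double, durationPercent: Double) {
    let speed = playbackRate / clipRate
    return (speed * 100, 100 / speed)
}

let conformed = rateConform(clipRate: 25, playbackRate: 23.976)
print(conformed.speedPercent)     // 95.904   - the 25 fps clip is slowed down
print(conformed.durationPercent)  // 104.27…  - so it plays for about 4% longer
```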

Letting audiences make structural choices in films

Wednesday, 30 September 2015

Editors determine structure: from individual frames, shots and sequences up to scenes and acts. At the higher levels they work out what order to tell stories in and how much detail to go into.

When we tell stories, it is common to divide them up into parts – ‘atoms’ – which we think are the smallest indivisible parts of the tale.

In the case of news stories and documentaries, these atoms are made up of video, text, images and sound. For news organisations, the same atom is likely to be used in multiple stories. When making items for broadcast, the same atoms are used time after time as news stories evolve.

As part of their ‘Elastic News’ project, BBC R&D have been testing ideas that allow audiences to determine levels of detail and even the order that news stories are told. One model was a mobile app…

…that uses chapterised videos and text captions as the core experience while allowing users to insert additional video chapters into the main timeline where they want to know more. This creates a custom user journey of the news story.

Visit the post to read more and see a video simulation of the models they tested. 

Overall, our top-level recommendations from this user testing were:

  • continue to use a mixture of content (video, text, audio, etc)
  • provide 3 levels of depth - overview, richer content, links to full story
  • card-based, using text and images work well as a quick overview of the story - video might be more appropriate for deeper content
  • text over videos is confusing - users aren’t sure if it’s relevant to the specific scene where it appears or if it is subtitles or captions

The next iteration of our project will be taking the best features from both prototypes and recommendations from the user testing. The next prototype will also address data structure challenges as we collaborate with BBC News Labs.

Not only ‘Elastic News’ - elastic documentaries, features, TV…

In this case the BBC were testing younger people’s interaction with news items on mobile phones.

Perhaps some of these ideas could be applied to longer stories: documentaries, feature films, TV series. They could also apply to new forms such as websites, games and VR stories.

This requires editors and their tools to be able to work with story atoms as well as whole stories. 

This research seems to be about seeing audiences as individual news ‘users.’ Once we have a model for individual audience members being able to ‘choose their own adventure’ it’ll be time to work on how to make shared experiences possible… Maybe a teacher/pupils model would be a place to start.

IBC 2015

Wednesday, 16 September 2015

Over the last five days I spent my time in Amsterdam attending IBC 2015. I also attended the FCP EXPO.

IBC is a trade show for the TV and film business:

IBC is the premier annual event for professionals engaged in the creation, management and delivery of entertainment and news content worldwide.

Across 14 big halls, two or three had exhibitors relevant to production and post production.

There were many high-end media asset management systems, virtual studios with motion control cameras and large cages where drones were shown flying around.

As last year, there weren’t many signs of Final Cut on the IBC show floor. Apart from the Avid and Adobe stands, few screens were showing any kind of editing application. If you weren’t part of an NLE decision-making team, you’d think there was still no choice in NLEs.

Camera manufacturers were starting to admit that they are better camera makers than digital recording device makers. Good news for companies making devices that can convert uncompressed camera source to codecs from Avid and Apple.

Despite Final Cut being hardly mentioned, Apple was everywhere because of ProRes. Whenever a video, sign or stand staffer covered high-end workflow, ProRes was always mentioned - usually first for some reason.

As with the vast majority of trade fairs of any kind around the world, Apple didn't pay for a stand. They chose to attend informally: visiting stands, arranging meetings and supporting events near the main show.

At the US equivalent of IBC, the NAB Show held in Las Vegas, Apple organised their own invite-only suite in a nearby venue. They also gave presentations at an event organised by FCPWORKS, a US-based post production systems integrator.

FCP EXPO

This September FCPWORKS teamed up with UK-based Soho Editors to put on a Final Cut Pro X-focussed event for IBC attendees. FCP EXPO was a two day event at a venue a few minutes walk from the IBC halls with sessions including presentations from Apple, Alex Snelling for Soho Editors and Ronny Courtens on Metronome’s reality TV workflow.

I gave a presentation as part of the FxFactory session which included a demo from Tim Dashwood on his exciting new toolkit for editing 360º video on the Final Cut Pro X timeline. As well as being able to play 360º video directly to a connected Oculus Rift VR headset, the 360VR Toolbox also allows editors to make creative choices based on how edits feel - almost impossible until now.

In coming days, some of the presentations will be made available online.

The presentation Apple gave had moved on a great deal even since the one they gave on the Apple Campus as part of the FCPX Creative Summit in June. It included more examples of great work from projects around the world and demonstrations of features from recent Final Cut and Motion updates. Apple also introduced which members of the team were there and welcomed attendee questions throughout the day.

Even though the day started with Apple, there was no drop-off in attendance across both days as people stayed for a wide variety of presentations, networking and conversations in an exhibition area featuring pro Final Cut Pro X suppliers such as Intelligent Assistance.

It is good news that Soho Editors were putting this event on. They are a long-established post production staffing agency and training company. Their support shows they think there’s a benefit to them encouraging their freelancers to learn Final Cut Pro X and that Final Cut training is a valuable service they can offer.

At the moment many TV journalists, researchers and producers are learning Final Cut through in-house training. Agencies like Soho Editors represent editors who already have years of high-end post experience. Once other established editors realise that freelance contemporaries are learning X, they may want to make sure they keep up.

NAB

Now that IBC is over, it is time to plan for NAB in Las Vegas in 2016. I've organised my flights already. I hope FCPWORKS and Apple take what they've learnt from Final Cut at IBC and do more in April.

Soho Editors has many clients and freelancers who aren’t sold on Final Cut Pro X yet, so they were a great choice for a Final Cut event partner. I hope FCPWORKS tries to reach more unconverted editors and post people when publicising a ‘NAB adjacent’ event.

As the UI for Final Cut is so much less threatening than the competition, I think there is mileage in attempting to get people outside editing and post to attend as well. People who have all kinds of jobs in TV, games and feature film production would benefit from learning Final Cut. My take would be: ‘Why should editors be the only ones who benefit from the ease and speed of Final Cut Pro X?’ - but I’m no marketing expert…

Apple’s September 2015 Event and film makers

Wednesday, 09 September 2015

Apple’s September announcements have interesting elements for video storytellers.

iMovie

The new iPhone 6S models have cameras that can record 4K video. That means iMovie on those devices will be able to edit 4K video:

iMovie is designed to take advantage of the beautiful 4K video you can shoot and edit on your iPhone 6s. In fact, iPhone 6s is so powerful you can smoothly edit two streams of 4K video to create effects like picture-in-picture and split screen.

Desktop-class performance lets you create advanced effects with up to three simultaneous streams of 4K video and export your 4K video at blazing speeds. And accessories like the Smart Keyboard let you use efficient shortcuts to make quick work of your project.

Interesting that they saw the need to handle three streams of 4K. iMovie will also be available as an extension to iOS applications, allowing photos and videos to be edited without leaving those apps. If iMovie for iOS 2.2 doesn’t edit videos other than 30/60p, hopefully other developers will make editing extensions that do.

At the moment the specs for the iPhone 6S only mention a limited range of frame rates:

  • 4K video recording (3840 by 2160) at 30 fps
  • 1080p HD video recording at 30 fps or 60 fps
  • 720p HD video recording at 30 fps

The native resolution of the iPad Pro is 2732 by 2048, leaving plenty of room for editing UI around a full 1920x1080 HD display. All those pixels would also make it a good wired or wireless viewfinder for high-end video cameras.

Apple have also introduced a content-based refresh, so the screen is updated as often as its content dictates. This should mean that if video is running at 23.976fps, then that’s how often the display is updated. Maybe that will work for 120fps content too.

Here is what iMovie for iOS 2.2 looks like on an iPad Pro:

iMovie-HDsm

The iPad Pro includes a ‘Smart Connector’ for its Smart Keyboard that allows power and information to go in both directions:

The Smart Connector works hand in hand with the conductive fabric inside the Smart Keyboard to allow for a two‑way exchange of power and data.

That means the iPad will be able to power accessories, and accessories will be able to power the iPad. Data going both ways might allow for some interesting third-party products...

3D Touch

The new iPhones and the iPad Pro have an advanced pressure sensitivity feature that Apple calls ‘3D Touch’.

Fortunately, Apple didn’t just add a new input method and leave its use up to individual developers. In iOS 9 and Apple applications, a light touch is a ‘Peek’ and a heavier touch after that is a ‘Pop:’

Peek and Pop let you preview all kinds of content and even act on it — without having to actually open it. For example, with a light press you can Peek at each email in your inbox. Then when you want to open one, press a little deeper to Pop into it.

I like to think of Peek as ‘Look at the metadata associated with this thing’ and Pop as ‘Act upon this thing with another tool.’

It might be useful to have these shortcuts in Mac apps. Here’s hoping Apple introduce 3D Touch mice and trackpads…

Vertical Video Live Photos

The default action of the Camera application on the iPad Pro and the iPhone 6s models is to capture a few moments around each photograph: a little bit of audio and some movement. When you press on them anywhere in iOS, they’ll show more than just the moment the picture was taken:

A whole new way to look at photography, Live Photos go beyond snapshots to capture moments with motion and sound. Press a Live Photo to make it come alive. Experience the crack of a smile. The crash of a wave. Or the wag of a tail.

Just when some people are getting the message that video should be taken in a landscape orientation, Apple will be promoting the idea of photos that are a little ‘live.’ Oh well.

We’ll soon discover whether Live Photos will appear as video in iMovie, and whether the still image shown by default can be changed - the best moment might not be in the middle of the sequence.

Apple TV – Apple’s Home Computer

The Apple TV brings applications to TV screens. Instead of iOS running on the new Apple TV, there's a new OS: tvOS.

With apps providing TV, movies, music, games and family organisation support, maybe Apple would like the Apple TV to be the new ‘Home Computer’.

As well as tvOS sharing many features of iOS, for producers with large amounts of online content tvOS also allows web-based content to be made available in the Apple TV UI based on XML specifications:

Use Apple’s Television Markup Language (TVML) to create individual pages inside of a client-server app.

Every page in a client-server app is built on a TVML template. TVML templates define what elements can be used and in what order. Each template is designed to display information in a specific way. For example, the loadingTemplate shows a spinner and a quick description of what is happening, while the ratingTemplate shows the rating for a product. You create a new TVML file that contains a single template for each page in a client-server app. Each template page occupies the entire TV screen.

…and Javascript:

The TVJS framework provides you with the means to display client-server apps created with the Apple TV Markup Language (TVML) on the new Apple TV. You use other classes in the framework to stream media and respond to events.

As part of the demo a live sport app was able to show metadata during a game:

mlbmetadatasm

It would be good if Apple or another developer added time-based metadata display and editing to an NLE. 

Imagine a version of Final Cut or iMovie that could interpret Apple’s Television Markup Language and show what a production would look like when streamed on an Apple TV while editing in the timeline…

The odd one out

In recent years Apple has had two autumn events and distributed news about all their platforms between the two. Today’s event talked about devices that run watchOS, iOS and tvOS. The odd one out is the Mac. Either the next Mac update is so big it must have its own event, or there won't be much to report about devices running ‘macOS’ (if that’s what OS X is renamed as) until next year. We’ll see…

Solving the vertical video problem: The New York Times’ first step

Wednesday, 09 September 2015

Justin Bieber’s new song is number one in the UK. The New York Times has made an 8-minute video about how “Where Are Ü Now” was made.

It was conceived from the start as a video that works in more than one aspect ratio. The journalism research lab NiemanLab have written a ‘making of’ about this ‘making of’:

Unsurprisingly, the combination of Bieber and The Grey Lady turned some heads. But the Times’ video is interesting for another reason — it was designed from the beginning to be as compelling viewed vertically as horizontally. In a world where young people are watching more video on smartphones than on TV screens, making a video work in both aspect ratios can help it reach a broader audience.

This was an aesthetic as well as technical problem - how to combine filmed footage with motion graphics overlays that look good both on a TV and a vertically-held phone.

It is worth reading, but perhaps post-production people should consider whether timelines should have a fixed aspect ratio. They already don’t have a fixed resolution. 

Not the ‘where’ of a video element - the ‘what’

I suggest that elements for future videos may be exported as layered movies. It will be up to the playback device or software to decide how to show the elements so they work in the aspect ratio needed for each viewing.

This already happens for audio in Final Cut Pro X. Instead of defining the speaker through which audio should be heard, all audio is given a ‘role.’ This metadata can then be used by broadcasters and distributors to determine which audio should be played back - depending on context. 

The audio standard for UK TV production expects programmes to include a stereo mix, a surround mix, a stereo audio description mix (in which a voiceover during gaps in dialogue describes what happens on screen), a music-and-effects-only mix and alternate languages.

Imagine if programmes also had layers marked as ‘Base video,’ ‘Signs and information in English,’ ‘Behind the scenes information,’ ‘Purchasing information,’ and ‘Signs and information in an alternate language.’ In the case of signs and text, this is how Pixar generates its movies.

In the case of the New York Times video, the motion graphics elements would be included in a separate layer which would be composited in different positions or even angles depending on the orientation of the playback device.
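As a purely speculative sketch - none of these types exist in any shipping API - the metadata for such layered movies might look something like this:

```swift
import Foundation

// Speculative data model: elements declare what they are ('what', not 'where'),
// and the player decides how to composite them for the current aspect ratio.
enum ElementRole {
    case baseVideo
    case motionGraphics
    case signsAndInformation(language: String)
}

struct VideoElement {
    let role: ElementRole
    let assetURL: URL
}

// The playback device, not the editor, picks the layout.
func placement(for element: VideoElement, isVertical: Bool) -> String {
    switch element.role {
    case .baseVideo:
        return "fill frame, crop to the display's aspect ratio"
    case .motionGraphics:
        return isVertical ? "stack below the video" : "overlay beside the video"
    case .signsAndInformation:
        return "composite at a position tagged in the element"
    }
}
```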

The answer to the problem of vertical video is to make sure videos look good when viewed at any aspect ratio.

That means editing applications will need to play back the same content at multiple aspect ratios - much like page layout applications eventually added features that let designers work with multiple page sizes for magazines and adverts.

To support multiple aspect ratios video makers will need tools that let them define what a video element is - the playback device can then determine the best place to play it back. Even if that is a second screen…

Adobe: Premiere Touch is for professionals too

Tuesday, 08 September 2015

As well as a bug fix update for Adobe Premiere Pro CC today, Adobe have previewed what will be in the next version.

With this next release, Premiere Pro expands on its exceptional support for UltraHD, 4K and beyond workflows with new, native support for HEVC (h.265), DNxHR, and OpenEXR media, for both encode and decode, allowing editors to edit and deliver any format they need to.

When I've made mockups of Final Cut Pro X running on an iPad Pro people have asked why pros would want to edit on an iPad.

Interestingly for Microsoft Surface and iPad Pro fans, Adobe doesn’t consider a touch interface a sign of software for non-professionals:

Premiere Pro will let you build up your edit in new and tactile ways, by providing touch support for Windows hybrid touch devices like the Microsoft Surface Pro, and improved gestural support using Apple Force Touch track pads. Use multi-touch in the Assembly workspace for pinch to zoom to make your media clips big and easy to work with, then easily reorder them for storyboarding, play back and scrub right on the icons with your finger, tap to mark in and out points and drag straight to a sequence. Or, drag to the Program Monitor, where a new overlay will appear to allow you to drop into different zones to perform various standard kinds of edit. And on Apple Force Touch track pads, get haptic feedback when snapping and trimming in the timeline.

Premiere-Touch-Edit

 

 

iPad Pro demo?

I thought that iMovie 4K would be a great demo application for the iPad Pro at tomorrow’s Apple event. Perhaps third-party developers would be more inspired by Adobe software running on the new device - that was the approach taken at WWDC earlier this year. Maybe Adobe will be on stage tomorrow…

For screenshots and more information go over to the Premiere Pro blog.

Final Cut Pro X dynamic range and colour gamut: Watch out for clipping

Tuesday, 08 September 2015

Now that those promoting UHD are starting to talk about High Dynamic Range and Wide Colour Gamut, it is worth considering what Final Cut Pro X does with brightness and colour information internally.

Here is a still from some footage from EditStock.com - the site that shares rushes from all sorts of productions so people can practice editing and post production - and the scopes showing its range of colour and brightness:

dr-cg-1

This still shows a good range of colour from 0-100 for red, green and blue as well as brightness (luma) from 0 to 100 - despite the source movie being an H.264-encoded QuickTime movie.

If I apply a Color Board colour correction I can desaturate it and make it darker:

dr-cg-2

Or I can make it more saturated and make the colours more intense:

dr-cg-3

The term 'clipping' here means that it becomes impossible for some pixels to be any darker than black or brighter than white.

The question here is what each effect considers black and white. Does its working range run from -20 to 120, or from 0 to 100?

The result of each of the Color Board corrections shows values below 0 and above 100, so colour corrections have a wide working range. So if both these corrections are applied at the same time…

dr-cg-3a

…the result is that Final Cut first darkens and desaturates the clip so some pixels have brightness and colour levels below 0 (as shown above in the second image) and then brightens them back up:

dr-cg-4

If brightness is stored as a value between -20 and 120, imagine two pixels that start off with brightness values of 20 and 15. The first correction - the one that darkens the clip - might change these values to -4 and -6, making them so close to fully black as to be indistinguishable by eye, and certainly impossible to distinguish from each other. Alternatively, the second correction alone would change these to 55 and 62, making them both brighter.

When Final Cut applies both corrections - first 20 being made darker to -4 and then -4 being made brighter to 21 - the pixel almost reverts back to its original brightness value.

The catch is that some Final Cut effects use a smaller brightness range than others.

That means that if I apply a Gaussian blur effect to the output of the first Color Board correction and then apply the second colour correction, there is a problem: the Gaussian blur only works with brightness values between 0 and 100, so if the first correction pushes values below 0, the blur sees them all as 0.

dr-cg-4a

Using the pixel values from before, the Gaussian blur takes the -4 and treats it as 0. It also clips the -6 to 0. After the blur is applied, the brightness values passed on to the next effect might be the same: 0 and 0.

These are then passed on to the second Color Board correction, which brightens both ‘black’ pixels to 35. The difference in brightness between the two pixels has been lost; there’s no way for the second correction to get it back.
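The two-pixel example can be reduced to a toy model. This is not Final Cut's actual processing - the corrections below are simplified linear offsets - but it shows how an intermediate stage that clamps to 0-100 destroys differences the corrections alone would preserve:

```swift
// Toy model: two corrections with a wide working range, and a stage that
// clamps to 0...100 the way the blur in this example does.
func darken(_ v: Double) -> Double { v - 24 }        // first Color Board move
func brighten(_ v: Double) -> Double { v + 25 }      // second Color Board move
func clampedStage(_ v: Double) -> Double { min(max(v, 0), 100) }

let pixels = [20.0, 15.0]

// Corrections only: the difference between the pixels survives the round trip.
print(pixels.map { brighten(darken($0)) })                // [21.0, 16.0]

// With a clamping stage in between, both pixels hit 0 and come back identical.
print(pixels.map { brighten(clampedStage(darken($0))) })  // [25.0, 25.0]
```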

Less range in the Gaussian blur effect means all the detail in the temporarily dark parts of the clip it receives is lost:

dr-cg-5

You can see that there are no pixels with a Luma value of less than 35. The various brightness values between dark grey and black in the original clip were all made black by the Gaussian blur effect. When the second correction made those black pixels brighter, all the detail in the darker parts of the frame was lost.

Check the result of effects with the video scopes

This means when you apply effects to clips in Final Cut, it is worth checking the video scopes to see what the effects do to the brightness and colour values of your footage. It is often worth changing the order of effects to make sure you don’t lose dynamic range.

In this case, moving the Gaussian blur effect to before both colour corrections prevents the clipping:

dr-cg-5a

I could have also moved the blur to after the second correction.

HDR and WCG precision

High Dynamic Range and Wide Colour Gamut are about being able to encode a wider range of brightnesses and colours. They also require more precision: being able to distinguish the smaller brightness differences between pixels. That’s where ‘bit depth’ comes into HDR and WCG specifications. 8 bits for brightness can store 256 values between 0 and 1. 10 bits can distinguish four times as much detail: 1,024 values between 0 and 1.
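The precision arithmetic is easy to verify - plain arithmetic, not tied to any particular codec:

```swift
// Levels available for a 0...1 signal at each bit depth, and the smallest
// brightness step that can be distinguished between adjacent levels.
for bits in [8, 10, 12] {
    let levels = 1 << bits
    print("\(bits)-bit: \(levels) levels, step \(1.0 / Double(levels - 1))")
}
// 8-bit:  256 levels,  step ~0.0039
// 10-bit: 1024 levels, step ~0.00098 - four times finer
```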

NLEs like Final Cut and Premiere can handle HDR, WCG and high-precision codecs internally (such as ProRes 4444 XQ); the next stage for post is finding accurate ways of representing this precision in codecs designed for distribution to viewers.

What’s good for post production is good for the rest of Apple

Monday, 07 September 2015

When industry analysts try to come up with things Apple can do with their money, some suggest they buy companies like Disney, Spotify or Tesla.

apple-should-buy

In practice Apple buys companies for their underlying technology. 

Reuters reports that Apple are buying up companies to support their efforts to make iPhones, iPads, Apple TVs, Apple Watches and Macs smarter:

"In the past, Apple has not been at the vanguard of machine learning and cutting edge artificial intelligence work, but that is rapidly changing,” he said. “They are after the best and the brightest, just like everybody else.”

Acquisitions of startups such as podcasting app Swell, social media analytics firm Topsy and personal assistant app Cue have also expanded Apple’s pool of experts in the field.

ProApps users have benefitted from Apple shopping sprees recently. Logic Pro X 10.2 now includes Alchemy, a synthesiser Apple acquired. Final Cut Pro X can now deliver TV programmes to UK broadcasters because Apple acquired MXF export software from Hamburg Pro Media.

If post production needs can be met by software and patents that would benefit other Apple products and services, so much the better.

Two candidates

While we wait for a professional audio application designed at its core for post production - which could be named ‘Soundtrack Pro X’ - Final Cut users have been raving about some specialised plugins from iZotope. Their Advanced Post Production bundle can fix audio problems previously thought impossible to solve, and can also create standards-compliant audio for use by broadcasters and distributors.

Imagine if Apple bought the services of their people and their algorithms, patents and products. As well as working well in Final Cut, Logic and iMovie productions, deep knowledge of how to remove reverb, echoes and distracting noise from recordings would be very useful for parts of iOS and OS X that interpret audio instructions and environments.

It almost seems as if iZotope have reorganised themselves to prepare for an offer from Apple. From their August 18th press release:

iZotope, Inc., a leading audio technology company, today announced its strategic decision to divide the current iZotope product line into two distinct families of products, one focused on Music Production and the other on Audio Post Production.

Another useful ability for post production software and for operating systems would be the ability to analyze large amounts of audio. Step up Nexidia.

They have products that can search for specific phrases in hours of media, do quality control on captions, video description channels and languages, and automatically align captions to specific speakers.

Nexidia’s Dialogue Search enables content producers, owners, and consumers to type any combination of words or phrases, and in only seconds, find and preview any media clip where those words or phrases are spoken—independent of any captions, transcript, or metadata.

As well as supporting productions with terabytes of data to search through for creative reasons, Nexidia software could be used by Apple to interpret petabytes of video stored on iCloud - anonymised of course. The more advanced tools Apple has to understand audio, the better.

Which companies, products or services do you think you could persuade Apple to buy?

Final Cut Pro X 10.2.2 Update: Codecs and workflow

Friday, 04 September 2015

Today Apple updated Final Cut Pro X, Motion and Compressor. Here is what the Final Cut 10.2.2 release note says about what’s changed:

  • Native support for Sony XAVC-L and Panasonic AVC-Intra 4:4:4 up to 4K resolution
  • Import Canon XF-AVC 8-bit video files with Canon plug-in
  • Export interlaced H.264 video
  • Asset management systems can include a library backup file when sharing from Final Cut Pro
  • Fixes render errors that could occur when using reflective materials with 3D text
  • Improves stability when swapping materials on 3D text with published parameters
  • Improves performance when loading text styles
  • Motion Title templates with published text layout parameters now export correctly
  • Fixes an issue that could cause 3D text to appear dark when rendered
  • Addresses issues with timing on certain animated effects

Motion 5.2.2’s note lists a subset of the Final Cut features. Compressor 4.2.1 adds:

  • Fixes a crash that could occur after migrating a user account to another system
  • Restores the ability to use markers for i-frame placement in H.264 exports
  • Improves audio and video sync of closed captions and subtitles

For many Final Cut users, the biggest news in this update is that the text entry field for titles is now resizable:

resize-text-field

Peter Wiggins reported on Twitter that some plugins that had been broken since the 10.2 update now work.

As regards

  • Native support for Sony XAVC-L and Panasonic AVC-Intra 4:4:4 up to 4K resolution

There is a slight wrinkle for those dealing with media from the Sony X70. ddixon writes on the fcp.co forum:

Yes! Works fine for me. The 2.0 firmware for the camera is required, and once applied you must reformat the SDXC card in-camera, then record new clips. The last FCPX update fixed this for HD - today's update fixes it for 4K.

Note that the memory card must also be reformatted in-camera after the firmware update. However, if you have 4K clips that were shot by a 2.0 firmware upgraded camera, those now magically work with 10.2.2. And, before today you could not mix HD and 4K on the same card, but I assume this limitation is no longer the case - although that's an assumption, have not tested it.

It looks like you need to check whether there is any other software to update for media compatibility. For now, the Canon XF-AVC software hasn't been updated.

An IBC-friendly update

As well as better 4K camera compatibility, one feature seems designed to support developers presenting at next week’s IBC show in Amsterdam:

  • Asset management systems can include a library backup file when sharing from Final Cut Pro

Part of this feature is a settings file inside Final Cut that third-party applications can use to make libraries with Final Cut. Some users are more comfortable with library .fcpbundle files than .fcpxml files:

Library bundle

It looks like this isn’t something users can do from within the normal Final Cut interface yet.

Perhaps we’ll learn more after product announcements at IBC soon - maybe during a presentation at the FCPWORKS/Soho Editors FCP EXPO.

OS X El Capitan compatibility?

This update has appeared only weeks before the expected launch of OS X 10.11. There are various possibilities here:

  • This version of Final Cut has been updated for OS X El Capitan compatibility as well
  • There was no time for this version to include El Capitan compatibility
  • Due to buggy new features in the release candidate of OS X El Capitan, it wasn’t possible to make Final Cut Pro X 10.2.2 compatible with what will be an initially feature-incomplete version of OS X

I went into why this is so in the last post here: ‘OS X updates and Final Cut Pro X: A false sense of security?’

These compatibility possibilities may also apply to third-party products and services in the Final Cut Pro X ecosystem, so unlike recent OS transitions, we might have to wait a few weeks before updating to OS X El Capitan.

OS X updates and Final Cut Pro X: A false sense of security?

Thursday, 03 September 2015

One of the interesting aspects of the Apple Worldwide Developer Conference each year is the new additions to OS X (and iOS) that could benefit those who use Final Cut Pro. Here’s another way of looking at OS X El Capitan: what if the new features mean problems for the tools we use every day?

In recent years OS X and Final Cut Pro updates have run smoothly. Ten years ago, the standard advice was to wait for a few bugfix updates of OS X or Final Cut to come out before upgrading.

Although Final Cut Pro X has been developed in parallel with improvements to the OS X video architecture, I’ve had no problems accepting every new update of Final Cut in recent years. New versions of Final Cut often mean libraries need to be updated to a new file format.

It is a good idea to archive copies of your libraries in the old format before allowing a newly updated Final Cut to change them to the new format.

As suggested by Apple, I make an archive copy of the ‘old’ version of Final Cut in the Finder (using the ‘Compress…’ command in the File menu). Once Final Cut has been updated, I can still run the older version on the same Mac to access older libraries.

In practice, it is the OS updates that are likely to cause more problems than application updates.

Core Image is deeper than AV Foundation

This year it might be a good idea to wait a while before updating your Mac to OS X El Capitan.

Metal will be a new underpinning to Core Image. Here’s Apple’s definition of Core Image:

Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images. It operates on image data types from the Core Graphics, Core Video, and Image I/O frameworks, using either a GPU or CPU rendering path. Core Image hides the details of low-level graphics processing by providing an easy-to-use application programming interface (API). You don’t need to know the details of OpenGL or OpenGL ES to leverage the power of the GPU, nor do you need to know anything about Grand Central Dispatch (GCD) to get the benefit of multicore processing. Core Image handles the details for you.
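In code, 'hides the details of low-level graphics processing' looks like this - a minimal Core Image sketch using real API calls, with the file path as a placeholder:

```swift
import CoreImage

// Build a filter chain without touching OpenGL or Metal directly;
// the CIContext picks a GPU or CPU rendering path internally.
let context = CIContext()
let input = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/frame.png"))!  // placeholder

let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(8.0, forKey: kCIInputRadiusKey)

if let output = blur.outputImage {
    // Rendering is deferred until a context actually draws the result.
    let rendered = context.createCGImage(output, from: input.extent)
    print(rendered == nil ? "render failed" : "rendered on GPU or CPU")
}
```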

You can see from this definition that Core Image underpins more of OS X (and iOS) than AV Foundation does. That means updating it for new technology (Metal) takes more resources from Apple, and such a deep change might cause more temporary disruption.

Although we’d all like the benefits of Apple’s Metal technology to improve Final Cut’s speed, the price might be a little incompatibility for a version of OS X or two. There’ll be enough brave Mac fans who won’t be able to resist having the newest OS on their computer. Their feedback to Apple will hasten bug fix updates if they are needed.

Impatient users may complain about third party tools not working in El Capitan: ‘Why didn’t developers take part in the OS X beta and prepare their tools for the new version?’ In some cases it might be that Apple won’t have time to squash OS bugs before release - bugs that third parties have no way of working around.

The good news is that as the underpinnings of OS X and iOS become more similar, investment in improving one OS will immediately benefit the other. In the case of Metal, making graphics and processing much more efficient in the constrained hardware environment of iOS devices will also make OS X Macs run more quickly and use less power.

Read my summary of an episode of the Debug podcast featuring former Apple people talking about the annual OS development cycle.

 

Baselight and ColorFinale: Two approaches to colour for editors

Wednesday, 02 September 2015

Today FilmLight announced that they will soon offer free plugins that let editors view and render, in their NLE, Baselight colour grades made on full-price Baselight systems:

The forthcoming version 4.4m1 release of Baselight Editions will make it free to view – or to render – grades passed between departments, or even to non-Baselight facilities. The new version will be released following IBC2015.

“Our aim is to make it even easier to pass around and refine the creative intent using BLGs,” said Steve Chapman, director at FilmLight. “If you want to see the latest colour grade in, say NUKE or Avid, you can do so by applying a BLG to a shot with the free version of Baselight Editions. It even allows you to render out the grade in your deliverables. If you want to modify the grade from these applications with the power of the Baselight core toolset, then you can buy – for $995 – the Baselight Editions packages.”

At the moment their Baselight Editions software only works in Media Composer and NUKE. Their Final Cut Pro 7 version isn’t in active development.

Steve Chapman’s turn of phrase in the press release is interesting: “If you want to see the latest colour grade in, say NUKE or Avid…” He didn’t say ‘If you want to see the latest colour grade in NUKE or Avid…’ - the ‘say’ implies that the ‘view and render’ versions will be available for other applications.

For now it seems that FilmLight think that Apple’s and Adobe’s markets don’t overlap with theirs. However, people and businesses with expensive Baselight installations would benefit if the grades they create could be viewed and rendered in Premiere and Final Cut. The separation that exists between editorial and high-quality colour is maintained: let people stick to what they are good at.

This would be FilmLight making the most of their ‘sunk cost’ - leveraging their investment in the UI and metaphors of their industry standard Baselight grading family. 

Color Finale: Minimal viable colour

ColorGrading Central has a different approach to colour in Final Cut Pro X: Color Finale. Despite its founder having years of grading experience on high-end productions using expensive systems, they went for the ‘minimum viable product’ route.

When creating a new product or service, there is the option to release a version as soon as possible instead of waiting until everything is perfect. This is the way Randy Ubillos approached Final Cut Pro 1.0 (eventually), iMovie ‘08 and Final Cut Pro X. 

  • Choose a minimal set of features that show who you are aiming the product at
  • Implement those features in such a way that communicates your philosophy: how you want to support your well-defined audience
  • Leave out all but the most vital features
  • Make sure your first version is good enough to capture enough interest and justify continual development
  • Update and improve quickly and often, and be clear about future developments

This method worked very well for Final Cut Pro X and is working very well for Color Finale. Color Finale version 1, like Final Cut Pro X, was good enough to use on high-end work, and it is quickly getting better.

How minimal is Color Finale? It doesn’t yet support Undo and Redo for actions in its palette - all you can do is reset a control to its default. This disadvantage is worth putting up with in return for getting to use Color Finale during the months it would have taken for Color Grading Central and Apple to deliver standard undo and redo commands.

From the FAQ:

What about masks and tracking?

They are a part of our development roadmap. We plan to introduce features like these in a forthcoming 'Pro' version. The benefit of being an early adopter is you will get these new features as a free update.

Already it is possible to make colour grades track features in shots in conjunction with SliceX from CoreMelt - as demonstrated by Sam Mestman in a recent episode of Ripple Training’s MacBreak Studio video training series.

Note that minimal doesn’t mean amateur: experienced colourists are already grading TV shows and feature films using Color Finale.

For more on Color Finale, read Oliver Peters’ detailed review.

Industry standard vs. new approach

Both approaches work well for Final Cut Pro X editors. Those who want to leave the grading work to experts will rely on round-tripping with other systems such as DaVinci Resolve and Baselight. Others will benefit from tools that are better integrated into the application they use all the time. This mirrors the way sound is handled in post production.

The question is whether FilmLight is interested in capturing the editing market - for both professionals and those who edit for other reasons. Can Color Finale become more advanced more quickly than Baselight becomes more suited to a wider audience?

In practice, if the financial models of their audiences are different enough, Color Grading Central and FilmLight aren’t in competition. One doesn’t have to lose in order for the other to win. This reminds me of the different approaches of Apple and Avid when it comes to developing their NLEs.

Apple and Cisco. Enterprise video next?

Monday, 31 August 2015

Following Apple’s and IBM’s partnership announced last year, Apple and Cisco have announced a rather vague partnership when it comes to iOS in the enterprise:

August 31, 2015 — Apple® and Cisco today announced a partnership to create a fast lane for iOS business users by optimizing Cisco networks for iOS devices and apps, integrating iPhone® with Cisco enterprise environments and providing unique collaboration on iPhone and iPad®.

Looks like Cisco will be doing most of the technical work and supplying their connections with big business and government ‘with Apple’s support’:

Apple and Cisco are also working together to make iPhone an even better business collaboration tool in Cisco voice and video environments, with the goal of providing employees with a seamless experience between iPhone and their desk phone.

With Apple's support, Cisco will deliver experiences specially optimized for iOS across mobile, cloud, and premises-based collaboration tools such as Cisco Spark, Cisco Telepresence and Cisco WebEx in order to deliver seamless team collaboration and reinvent the meeting experience.

Another way of looking at this is that Cisco’s sales team are probably being incentivised to sell iOS compatibility with Cisco’s telepresence products. That means iPhones being used in conjunction with wall-sized screens in meeting rooms and with people in other parts of the world who also use iPhones.

Mobile devices and enterprise video

At some point people will become accustomed to recording 4K video on their iOS devices for work reasons. How will they organise all that content?

Up until now, Adobe’s strategy has been to replicate the desktop publishing wave that helped Apple Macs get into corporate marketing departments. Adobe are building up a name for internal and external marketing based on Creative Cloud. Selling 250,000 Creative Cloud seats to 5,000 purchasers in large organisations (an average of 50 seats each) is hard, but it is easier than selling 250,000 memberships to the public - and harder still is getting those individual members to keep paying.

Eventually video will be a peer to text documents, PDFs and presentations: organisational currency. 

Desktop publishing in the late 80s and early 90s was Apple’s corporate lifeline. 80% of DTP’s graphic design features were absorbed into word processing applications by the late 90s. Apple couldn’t survive on what was left - Final Cut Pro 1.0 was designed to sell Macs to people who would realise that they needed to work with video. It has just taken much longer than many expected for video to become mainstream.

Which video editing metaphor will be absorbed into applications dedicated to day-to-day organisation of communication in large organisations? That’ll depend on which one people are more comfortable with, and which one IBM’s and Cisco’s sales teams push the most.

A ‘Movies for iWork’ application anyone?

Logic Pro X 10.2: Update for video editors?

Monday, 31 August 2015

Over at Logic Pro Expert, Edgar Rothermich has written up the many undocumented features in the Logic Pro X 10.2 update:

The release notes that come with the Logic Pro X 10.2 update indicate how big of an update this is. The list has over 250 items of new features, changes and bug fixes: https://support.apple.com/en-us/HT203718.

However, this is not all. There are additional changes in Logic Pro X 10.2 that are not mentioned in the release notes. In this article, I will not only list those changes, but provide in-depth explanations for each of those topics.

Here are aspects of this update of interest to Final Cut Pro X users.

Force-click support

A small thing worth noting is how the Logic team have chosen to implement force click for Macs with input devices that can detect how hard you press. In Logic 10.2, force click acts as ‘click with the pencil tool’ without switching from the current tool - which is usually the pointer tool in Logic.

In the case of Final Cut, we also spend most of our time with the pointer tool, but there is a less obvious ‘second most popular tool’ to assign to the force-click action. If you are in a metadata-oriented mode it would be the range tool; if you are fine cutting it would be the trim tool. As most editors keep a hand on the keyboard, able to switch tools instantly or even temporarily, I’m not sure what I’d want force-click to mean in Final Cut.
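To make the idea concrete, here is a minimal sketch - assumed AppKit code, not anything from Logic or Final Cut - of how a view might treat a force click as a one-off ‘temporary tool’ action while leaving the selected tool alone. TimelineView, Tool and applyTemporaryTool are hypothetical names:

    import AppKit

    class TimelineView: NSView {
        enum Tool { case pointer, pencil, range, trim }
        var currentTool: Tool = .pointer

        // NSResponder calls this as pressure changes on a Force Touch trackpad.
        override func pressureChange(with event: NSEvent) {
            // Stage 2 is the 'deep press' the OS reports as a force click.
            guard event.stage == 2 else { return }
            let point = convert(event.locationInWindow, from: nil)
            // Act as if the pencil tool were clicked here, but leave
            // currentTool untouched so the next ordinary click is unchanged.
            applyTemporaryTool(.pencil, at: point)
        }

        private func applyTemporaryTool(_ tool: Tool, at point: NSPoint) {
            // Hypothetical: perform the given tool's click action at this point.
        }
    }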

No closer to Soundtrack Pro X

Logic isn’t getting many new features that make it more useful for those working on video and feature projects; the Logic team are concentrating on supporting those who work with music and audio-originated productions. If you do use Logic for video post production, though, the new features that aid recording will be useful.

Logic certainly doesn't look like it is getting closer to being able to work in a trackless/role-based way. It is becoming a much better track-based audio application.

Hopefully this means there’s space for a new audio post production application better suited for sound editors, mixers and designers.

Collaboration

As well as gaining the ability to upload music directly to established Apple Music Connect accounts, Logic 10.2 sets an interesting precedent with its integration with a third-party collaboration service called Gobbler.

Once signed into Gobbler, Logic users can

  • Send their whole project to another Gobbler user
  • Send a song to another Gobbler user
  • Automatically back up their projects to their cloud-based Gobbler accounts

Equivalent features for Final Cut Pro X/iMovie users could be very useful! 

Some may say that Final Cut’s integration with some video collaboration systems comes close to providing these services. In the case of Logic, though, the integration is deeper: once you’ve signed in to Gobbler, Apple modifies the Logic UI. Edgar lists the UI changes in his article:

Once you have Gobbler installed on your machine, Logic Pro will notice that and add additional Gobbler-related menu items and settings.

The Procedure for backing up your Project to Gobbler

  • You can enable the Gobbler backup mechanism with a checkbox directly in the Project Save Dialog when you first save your Project.
  • That checkbox is also available in the Project Settings ➤ Assets window where you can disable/enable it at any time.
  • In addition, the File ➤ Gobbler ➤ submenu contains commands to start, pause, and remove the backup procedure, plus access (load) any previous backups that are listed in that submenu.

Read about Logic Pro X 10.2 and more at Logic Pro Expert.