Letting audiences make structural choices in films

Wednesday, 30 September 2015

Editors determine structure, from individual frames, shots and sequences up to scenes and acts. At the higher levels they work out what order to tell stories in and how much detail to go into.

When we tell stories, it is common to divide them up into parts – ‘atoms’ – which we think are the smallest indivisible parts of the tale.

In the case of news stories and documentaries, these atoms are made up of video, text, images and sound. For news organisations, the same atom is likely to be used in multiple stories. When making items for broadcast, the same atoms are used time after time as news stories evolve.

As part of their ‘Elastic News’ project, BBC R&D have been testing ideas that allow audiences to determine levels of detail and even the order that news stories are told. One model was a mobile app…

…that uses chapterised videos and text captions as the core experience while allowing users to insert additional video chapters into the main timeline where they want to know more. This creates a custom user journey of the news story.

Visit the post to read more and see a video simulation of the models they tested. 

Overall, our top-level recommendations from this user testing were:

  • continue to use a mixture of content (video, text, audio, etc)
  • provide 3 levels of depth - overview, richer content, links to full story
  • card-based, using text and images work well as a quick overview of the story - video might be more appropriate for deeper content
  • text over videos is confusing - users aren’t sure if it’s relevant to the specific scene where it appears or if it is subtitles or captions

The next iteration of our project will be taking the best features from both prototypes and recommendations from the user testing. The next prototype will also address data structure challenges as we collaborate with BBC News Labs.

Not only ‘Elastic News’ - elastic documentaries, features, TV…

In this case the BBC were testing younger people’s interaction with news items on mobile phones.

Perhaps some of these ideas could be applied to longer stories: documentaries, feature films, TV series. They could also apply to new forms such as websites, games and VR stories.

This requires editors and their tools to be able to work with story atoms as well as whole stories. 

This research seems to be about seeing audiences as individual news ‘users.’ Once we have a model for individual audience members being able to ‘choose their own adventure’ it’ll be time to work on how to make shared experiences possible… Maybe a teacher/pupils model would be a place to start.

IBC 2015

Wednesday, 16 September 2015

I spent the last five days in Amsterdam attending IBC 2015. I also attended the FCP EXPO.

IBC is a trade show for the TV and film business:

IBC is the premier annual event for professionals engaged in the creation, management and delivery of entertainment and news content worldwide.

Across 14 big halls, two or three had exhibitors relevant to production and post production.

There were many high-end media asset management systems, virtual studios with motion control cameras and large cages where drones were shown flying around.

As last year, there weren’t many signs of Final Cut on the IBC show floor. Apart from the Avid and Adobe stands, few screens were showing any kind of editing application. If you weren’t part of an NLE decision-making team, you’d think there was no choice in NLEs.

Camera manufacturers were starting to admit that they are better camera makers than digital recording device makers. Good news for companies making devices that can convert uncompressed camera source to codecs from Avid and Apple.

Despite Final Cut being hardly mentioned, Apple was everywhere because of ProRes. Whenever a video, sign or stand staffer covered high-end workflow, ProRes was always mentioned - usually first, for some reason.

As with the vast majority of trade fairs of any kind around the world, Apple didn't pay for a stand. They chose to attend informally: visiting stands, arranging meetings and supporting events near the main show.

At the US equivalent of IBC, the NAB Show held in Las Vegas, Apple organised their own invite-only suite in a nearby venue. They also gave presentations at an event organised by FCPWORKS, a US-based post production systems integrator.


This September FCPWORKS teamed up with UK-based Soho Editors to put on a Final Cut Pro X-focussed event for IBC attendees. FCP EXPO was a two-day event at a venue a few minutes’ walk from the IBC halls, with sessions including presentations from Apple, Alex Snelling of Soho Editors and Ronny Courtens on Metronome’s reality TV workflow.

I gave a presentation as part of the FxFactory session which included a demo from Tim Dashwood on his exciting new toolkit for editing 360º video on the Final Cut Pro X timeline. As well as being able to play 360º video directly to a connected Oculus Rift VR headset, the 360VR Toolbox also allows editors to make creative choices based on how edits feel - almost impossible until now.

In coming days, some of the presentations will be made available online.

The presentation Apple gave had moved on a great deal even since the one they gave on the Apple Campus as part of the FCPX Creative Summit in June. It included more examples of great work from various projects around the world and demonstrations of features from recent Final Cut and Motion updates. Apple also introduced the team members who were there and welcomed attendee questions throughout the day.

Even though the day started with Apple, there was no drop-off in attendance throughout both days as people stayed for a wide variety of presentations, networking and conversations in an exhibition area featuring pro Final Cut Pro X suppliers such as Intelligent Assistance.

It is good news that Soho Editors put this event on. They are a long-established post production staffing agency and training company. Their support shows they think there’s a benefit to encouraging their freelancers to learn Final Cut Pro X, and that Final Cut training is a valuable service they can offer.

At the moment many TV journalists, researchers and producers are learning Final Cut through in-house training. Agencies like Soho Editors represent editors who already have years of high-end post experience. Once other established editors realise that freelance contemporaries are learning X, they may want to make sure they keep up.


Now that IBC is over, it is time to plan for NAB in Las Vegas in 2016. I've organised my flights already. I hope FCPWORKS and Apple take what they've learnt from Final Cut at IBC and do more in April.

Soho Editors has many clients and freelancers who aren’t sold on Final Cut Pro X yet, so they were a great choice for a Final Cut event partner. I hope FCPWORKS tries to reach more unconverted editors and post people when publicising a ‘NAB adjacent’ event.

As the UI for Final Cut is so much less threatening than the competition, I think there is mileage in attempting to get people from outside editing and post to attend as well. People who have all kinds of jobs in TV, games and feature film production would benefit from learning Final Cut. My take would be: ‘Why should editors be the only ones who benefit from the ease and speed of Final Cut Pro X?’ - but I’m no marketing expert…

Apple’s September 2015 Event and film makers

Wednesday, 09 September 2015

Apple’s September announcements have interesting elements for video storytellers.


The new iPhone 6S models have cameras that can record 4K video. That means iMovie on those devices will be able to edit 4K video:

iMovie is designed to take advantage of the beautiful 4K video you can shoot and edit on your iPhone 6s. In fact, iPhone 6s is so powerful you can smoothly edit two streams of 4K video to create effects like picture-in-picture and split screen.

Desktop-class performance lets you create advanced effects with up to three simultaneous streams of 4K video and export your 4K video at blazing speeds. And accessories like the Smart Keyboard let you use efficient shortcuts to make quick work of your project.

Interesting that they saw the need to handle three streams of 4K. iMovie will also be available as an extension to iOS applications that allow photos and videos to be edited. If iMovie for iOS 2.2 doesn’t edit non-30/60p videos, hopefully editing extensions will be made by other developers.

At the moment the specs for the iPhone 6S only mention a limited range of frame rates:

  • 4K video recording (3840 by 2160) at 30 fps
  • 1080p HD video recording at 30 fps or 60 fps
  • 720p HD video recording at 30 fps

The native resolution of the iPad Pro is 2732 by 2048, leaving plenty of room for editing UI around a full 1920x1080 HD display. All those pixels would also make it a good wired or wireless viewfinder for high-end video cameras.

Apple have also introduced a content-based refresh rate, so the screen is updated as often as the content dictates. This should mean that if video is running at 23.976fps, then that’s how often the display is updated. Maybe that will work for 120fps content too.

Here is what iMovie for iOS 2.2 looks like on an iPad Pro:


The iPad Pro includes a ‘Smart Connector’ for its Smart Keyboard that allows power and information to go in both directions:

The Smart Connector works hand in hand with the conductive fabric inside the Smart Keyboard to allow for a two‑way exchange of power and data.

That means the iPad will be able to power accessories, and accessories will be able to power the iPad. Data going both ways might allow for some interesting third-party products...

3D Touch

The new iPhones and the iPad Pro have an advanced pressure sensitivity feature that Apple calls ‘3D Touch’.

Fortunately, Apple didn’t just add a new input method and leave its use up to individual developers. In iOS 9 and Apple applications, a light touch is a ‘Peek’ and a heavier touch after that is a ‘Pop:’

Peek and Pop let you preview all kinds of content and even act on it — without having to actually open it. For example, with a light press you can Peek at each email in your inbox. Then when you want to open one, press a little deeper to Pop into it.

I like to think of Peek as ‘Look at the metadata associated with this thing’ and Pop as ‘Act upon this thing with another tool.’
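
For developers, adopting Peek and Pop means registering a previewing delegate with UIKit. Here’s a minimal sketch of that API - the mail-style view controller and its messages array are hypothetical, and I’m using current Swift naming rather than the Swift of 2015:

```swift
import UIKit

// Hypothetical preview controller - stands in for whatever you'd Peek at.
class MessagePreviewController: UIViewController {
    init(message: String) {
        super.init(nibName: nil, bundle: nil)
        title = message
    }
    required init?(coder: NSCoder) { fatalError("not used in this sketch") }
}

class InboxViewController: UITableViewController, UIViewControllerPreviewingDelegate {
    var messages: [String] = []  // hypothetical model

    override func viewDidLoad() {
        super.viewDidLoad()
        // Only register on hardware that can sense pressure.
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Peek: a light press returns a preview for the touched row.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        return MessagePreviewController(message: messages[indexPath.row])
    }

    // Pop: a deeper press commits to the previewed controller.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        navigationController?.pushViewController(viewControllerToCommit, animated: true)
    }
}
```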

It might be useful to have these shortcuts in Mac apps. Here’s hoping Apple introduce 3D Touch mice and trackpads…

Vertical Video Live Photos

The default action of the Camera application on the iPad Pro and the iPhone 6S models is to capture a few moments around each photograph - a little bit of audio and some movement. When you press on them anywhere in iOS, they’ll show more than just the moment the picture was taken: 

A whole new way to look at photography, Live Photos go beyond snapshots to capture moments with motion and sound. Press a Live Photo to make it come alive. Experience the crack of a smile. The crash of a wave. Or the wag of a tail.

Just when some people are getting the message that video should be taken in a landscape orientation, Apple will be promoting the idea of photos that are a little ‘live.’ Oh well.

We’ll soon discover whether Live Photos will appear as video in iMovie, and whether the still image shown by default can be changed - the best moment might not be in the middle of the sequence.

Apple TV – Apple’s Home Computer

The Apple TV brings applications to TV screens. Instead of iOS running on the new Apple TV, there's a new OS: tvOS.

With apps providing TV, movies, music, games and family organisation support, maybe Apple would like the Apple TV to be the new ‘Home Computer’.

tvOS shares many features with iOS, but for producers with large amounts of online content, tvOS also allows web-based content to be made available in the Apple TV UI, defined using XML specifications:

Use Apple’s Television Markup Language (TVML) to create individual pages inside of a client-server app.

Every page in a client-server app is built on a TVML template. TVML templates define what elements can be used and in what order. Each template is designed to display information in a specific way. For example, the loadingTemplate shows a spinner and a quick description of what is happening, while the ratingTemplate shows the rating for a product. You create a new TVML file that contains a single template for each page in a client-server app. Each template page occupies the entire TV screen.

…and JavaScript:

The TVJS framework provides you with the means to display client-server apps created with the Apple TV Markup Language (TVML) on the new Apple TV. You use other classes in the framework to stream media and respond to events.
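
On the native tvOS side, a client-server app like this needs very little Swift - TVMLKit fetches the JavaScript entry point, and that TVJS code pushes TVML pages fetched from the server. A minimal sketch, with a placeholder server URL:

```swift
import UIKit
import TVMLKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate, TVApplicationControllerDelegate {
    var window: UIWindow?
    var appController: TVApplicationController?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        window = UIWindow(frame: UIScreen.main.bounds)

        // Point TVMLKit at the server-hosted TVJS entry point; that script
        // then loads TVML template pages (placeholder URL).
        let context = TVApplicationControllerContext()
        context.javaScriptApplicationURL = URL(string: "https://example.com/tv/application.js")!

        appController = TVApplicationController(context: context, window: window, delegate: self)
        return true
    }
}
```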

As part of the demo, a live sports app was able to show metadata during a game:


It would be good if Apple or another developer added time-based metadata display and editing to an NLE. 

Imagine a version of Final Cut or iMovie that could interpret Apple’s Television Markup Language and show what a production would look like when streamed on an Apple TV while editing in the timeline…

The odd one out

In recent years Apple has had two autumn events and distributed news about all their platforms between the two. Today’s event talked about devices that run watchOS, iOS and tvOS. The odd one out is the Mac. Either the next Mac update is so big it must have its own event, or there won't be much to report about devices running ‘macOS’ (if that’s what OS X is renamed as) until next year. We’ll see…

Solving the vertical video problem: The New York Times’ first step

Wednesday, 09 September 2015

Justin Bieber’s new song is number one in the UK. The New York Times has made an 8-minute video about how “Where Are Ü Now” was made.

It was conceived from the start as a video that works in more than one aspect ratio. The journalism researchers at NiemanLab have written a ‘making of’ about this ‘making of’:

Unsurprisingly, the combination of Bieber and The Grey Lady turned some heads. But the Times’ video is interesting for another reason — it was designed from the beginning to be as compelling viewed vertically as horizontally. In a world where young people are watching more video on smartphones than on TV screens, making a video work in both aspect ratios can help it reach a broader audience.

This was an aesthetic as well as technical problem - how to combine filmed footage with motion graphics overlays that look good both on a TV and a vertically-held phone.

It is worth reading, but perhaps post-production people should consider whether timelines should have a fixed aspect ratio. They already don’t have a fixed resolution. 

Not the ‘where’ of a video element - the ‘what’

I suggest that elements for future videos may be exported as layered movies. It will be up to the playback device or software to decide how to show the elements so they work in the aspect ratio needed for each viewing.

This already happens for audio in Final Cut Pro X. Instead of defining the speaker through which audio should be heard, all audio is given a ‘role.’ This metadata can then be used by broadcasters and distributors to determine which audio should be played back - depending on context. 

The audio standard for UK TV production expects programmes to include a stereo mix, a surround mix, a stereo audio description mix (in which a voiceover during gaps in dialogue describes what happens on screen), a music-and-effects-only mix, and alternate languages.

Imagine if programmes also had layers marked as ‘Base video,’ ‘Signs and information in English,’ ‘Behind the scenes information,’ ‘Purchasing information,’ and ‘Signs and information in an alternate language.’ In the case of signs and text, this is how Pixar generates international versions of its movies.
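
To sketch the idea in code: each element carries a role, and the player selects what to composite for its context. This is only an illustration - the role names are mine, not Final Cut’s actual role identifiers:

```swift
import Foundation

// A media element is defined by what it is (its role), not where it goes.
enum Role: String {
    case baseVideo, englishSigns, alternateLanguageSigns
    case stereoMix, surroundMix, audioDescription, musicAndEffects
}

struct MediaElement {
    let role: Role
    let url: URL
}

// The playback device asks for the roles that suit its context - a
// vertically-held phone might want the base video plus re-positionable
// graphics; a UK broadcast chain might want surround plus audio description.
func elements(for wanted: Set<Role>, from all: [MediaElement]) -> [MediaElement] {
    return all.filter { wanted.contains($0.role) }
}
```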

In the case of the New York Times video, the motion graphics elements would be included in a separate layer which would be composited in different positions or even angles depending on the orientation of the playback device.

The answer to the problem of vertical video is to make sure videos look good when viewed at any aspect ratio.

That means editing applications will need to be able to play back the same content at multiple aspect ratios - much like page layout applications eventually added features that allowed designers to work with multiple aspect ratios for magazines and adverts.

To support multiple aspect ratios video makers will need tools that let them define what a video element is - the playback device can then determine the best place to play it back. Even if that is a second screen…

Adobe: Premiere Touch is for professionals too

Tuesday, 08 September 2015

As well as releasing a bug fix update for Adobe Premiere Pro CC today, Adobe have previewed what will be in the next version.

With this next release, Premiere Pro expands on its exceptional support for UltraHD, 4K and beyond workflows with new, native support for HEVC (h.265), DNxHR, and OpenEXR media, for both encode and decode, allowing editors to edit and deliver any format they need to.

When I've made mockups of Final Cut Pro X running on an iPad Pro people have asked why pros would want to edit on an iPad.

Interestingly for Microsoft Surface and iPad Pro fans, Adobe doesn’t consider a touch interface a sign of software for non-professionals:

Premiere Pro will let you build up your edit in new and tactile ways, by providing touch support for Windows hybrid touch devices like the Microsoft Surface Pro, and improved gestural support using Apple Force Touch track pads. Use multi-touch in the Assembly workspace for pinch to zoom to make your media clips big and easy to work with, then easily reorder them for storyboarding, play back and scrub right on the icons with your finger, tap to mark in and out points and drag straight to a sequence. Or, drag to the Program Monitor, where a new overlay will appear to allow you to drop into different zones to perform various standard kinds of edit. And on Apple Force Touch track pads, get haptic feedback when snapping and trimming in the timeline.




iPad Pro demo?

I thought that iMovie 4K would be a great demo application for the iPad Pro at tomorrow’s Apple event. Perhaps third-party developers would be more inspired by Adobe software running on the new device - demonstrating third-party software was the right thing to do at WWDC earlier this year. Maybe Adobe will be on stage tomorrow…

For screenshots and more information go over to the Premiere Pro blog.

Final Cut Pro X dynamic range and colour gamut: Watch out for clipping

Tuesday, 08 September 2015

Now that those promoting UHD are starting to talk about High Dynamic Range and Wide Colour Gamut, it is worth considering what Final Cut Pro X does with brightness and colour information internally.

Here is a still from some footage from EditStock.com - the site that shares rushes from all sorts of productions so people can practice editing and post production - and the scopes showing its range of colour and brightness:


This still shows a good range of colour from 0-100 for red, green and blue as well as brightness (luma) from 0 to 100 - despite the source movie being an H.264-encoded QuickTime movie.

If I apply a Color Board colour correction I can desaturate it and make it darker:


Or I can make it more saturated and make the colours more intense:


The term ‘clip’ here means that pixel values are limited: it is impossible for pixels to be any darker than black or brighter than white.

The question here is what each effect considers black and white. Does its working range run from -20 to 120, or from 0 to 100?

The result of each of the Color Board corrections shows values below 0 and above 100, so colour corrections have a wide working range. So if both these corrections are applied at the same time…


…the result is that Final Cut first darkens and desaturates the clip so some pixels have brightness and colour levels below 0 (as shown above in the second image) and then brightens them back up:


If brightness is stored as a value between -20 and 120, imagine two pixels that start off with brightness values of 20 and 15. The first correction - the one that darkens the clip - might change these values to -4 and -6, making them so close to fully black as to be indistinguishable by eye, and certainly indistinguishable from each other. Applied on its own, the second correction would change these to 55 and 62, making them both brighter.

When Final Cut applies both corrections - first 20 being made darker to -4 and then -4 being made brighter to 21 - the pixel almost reverts back to its original brightness value.

The catch is that some Final Cut effects use a smaller brightness range than others.

That means if I apply a Gaussian blur effect to the output of the first Color Board correction and then apply the second colour correction, there is a problem: the Gaussian blur only works with brightness values between 0 and 100, so if a Color Board correction pushes values below 0, the blur sees anything below 0 as 0.


Using the pixel values from before, the Gaussian blur takes the -4 and treats it as 0. It also clips the -6 to 0. After the blur is applied, the brightness values passed on to the next effect might be the same: 0 and 0.

These are then passed on to the second Color Board correction, which brightens both ‘black’ pixels to 35. The difference in brightness between the two pixels has been lost; there’s no way for the second correction to get it back.

Less range in the Gaussian blur effect means all the detail in the temporarily dark parts of the clip it receives is lost:


You can see that there are no pixels with a Luma value of less than 35. The various brightness values between dark grey and black in the original clip were all made black by the Gaussian blur effect. When the second correction made those black pixels brighter, all the detail in the darker parts of the frame was lost.
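
To make the arithmetic concrete, here is the pipeline simulated in a few lines of Swift. The numbers are illustrative stand-ins for Final Cut’s real correction maths; the point is what the 0-100 clamp in the middle does:

```swift
// Two wide-range colour corrections and a blur that clamps to 0…100.
func darken(_ luma: Double) -> Double { return luma - 24 }             // first Color Board correction (illustrative)
func brighten(_ luma: Double) -> Double { return luma + 25 }           // second Color Board correction (illustrative)
func clampingBlur(_ luma: Double) -> Double { return min(max(luma, 0), 100) }

let pixels = [20.0, 15.0]

// Corrections only: values below 0 survive in between,
// so the two pixels stay distinguishable.
let preserved = pixels.map { brighten(darken($0)) }                    // [21.0, 16.0]

// With the clamping blur in between: both pixels are crushed to 0,
// then brightened to the same value - the detail is gone for good.
let crushed = pixels.map { brighten(clampingBlur(darken($0))) }        // [25.0, 25.0]

print(preserved, crushed)
```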

Check the result of effects with the video scopes

This means when you apply effects to clips in Final Cut, it is worth checking the video scopes to see what the effects do to the brightness and colour values of your footage. It is often worth changing the order of effects to make sure you don’t lose dynamic range.

In this case, moving the Gaussian blur effect to before both colour corrections prevents the clipping:


I could have also moved the blur to after the second correction.

HDR and WCG precision

High Dynamic Range and Wide Colour Gamut are about being able to encode a wider range of brightnesses and colours. They also require more precision: being able to distinguish the smaller brightness differences between pixels. That’s where ‘bit depth’ comes into HDR and WCG specifications. 8 bits for brightness can store 256 values between 0 and 1. 10 bits can distinguish four times as much detail: 1,024 values between 0 and 1.
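
As a quick sketch of what that extra precision means, here’s the same normalised brightness value quantised at both bit depths (assuming simple uniform quantisation):

```swift
// Quantise a normalised brightness value (0…1) to a given bit depth.
func quantise(_ value: Double, bits: Int) -> Double {
    let steps = Double((1 << bits) - 1)   // 255 intervals for 8-bit, 1,023 for 10-bit
    return (value * steps).rounded() / steps
}

let v = 0.1234
print(quantise(v, bits: 8))    // 0.12156… - nearest of 256 representable values
print(quantise(v, bits: 10))   // 0.12316… - nearest of 1,024 values: 4× finer steps
```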

NLEs like Final Cut and Premiere can handle HDR, WCG and high-precision codecs (such as ProRes 4444 XQ) internally; the next stage for post is finding accurate ways of representing this precision in codecs designed for distribution to viewers.

What’s good for post production is good for the rest of Apple

Monday, 07 September 2015

When industry analysts try and come up with things Apple can do with their money, some suggest they buy companies like Disney, Spotify or Tesla.


In practice Apple buys companies for their underlying technology. 

Reuters reports that Apple are buying up companies to support their efforts to make iPhones, iPads, Apple TVs, Apple Watches and Macs smarter:

"In the past, Apple has not been at the vanguard of machine learning and cutting edge artificial intelligence work, but that is rapidly changing,” he said. “They are after the best and the brightest, just like everybody else.”

Acquisitions of startups such as podcasting app Swell, social media analytics firm Topsy and personal assistant app Cue have also expanded Apple’s pool of experts in the field.

ProApps users have benefitted from Apple shopping sprees recently. Logic Pro X 10.2 now includes Alchemy, a synthesizer acquired with Camel Audio. Final Cut Pro X can now deliver TV programmes to UK broadcasters because Apple acquired MXF export software from Hamburg Pro Media.

If post production needs can be met by software and patents that would benefit other Apple products and services, so much the better.

Two candidates

While we wait for a professional audio application designed at its core for post production - which could be named ‘Soundtrack Pro X’ - Final Cut users have been raving about some specialised plugins from iZotope. Their Advanced Post Production bundle can fix audio problems previously thought impossible to solve, and can also create standards-compliant audio for use by broadcasters and distributors.

Imagine if Apple bought the services of their people and their algorithms, patents and products. As well as working well in Final Cut, Logic and iMovie productions, deep knowledge of how to remove reverb, echoes and distracting noise from recordings would be very useful for the parts of iOS and OS X that interpret audio instructions and environments.

It almost seems as if iZotope have re-organised themselves to prepare for an offer from Apple. From their August 18th press release:

iZotope, Inc., a leading audio technology company, today announced its strategic decision to divide the current iZotope product line into two distinct families of products, one focused on Music Production and the other on Audio Post Production.

Another useful ability for post production software and for operating systems would be the ability to analyse large amounts of audio. Step up Nexidia.

They have products that can search hours of media for specific phrases, do quality control on captions, video description channels and languages, and a tool that can automatically align captions with specific speakers.

Nexidia’s Dialogue Search enables content producers, owners, and consumers to type any combination of words or phrases, and in only seconds, find and preview any media clip where those words or phrases are spoken—independent of any captions, transcript, or metadata.

As well as supporting productions with terabytes of data to search through for creative reasons, Nexidia software could be used by Apple to interpret petabytes of video stored on iCloud - anonymised of course. The more advanced tools Apple has to understand audio, the better.

Which companies, products or services do you think you could persuade Apple to buy?

Final Cut Pro X 10.2.2 Update: Codecs and workflow

Friday, 04 September 2015

Today Apple updated Final Cut Pro X, Motion and Compressor. Here is what the Final Cut 10.2.2 release note says about what’s changed:

  • Native support for Sony XAVC-L and Panasonic AVC-Intra 4:4:4 up to 4K resolution
  • Import Canon XF-AVC 8-bit video files with Canon plug-in
  • Export interlaced H.264 video
  • Asset management systems can include a library backup file when sharing from Final Cut Pro
  • Fixes render errors that could occur when using reflective materials with 3D text
  • Improves stability when swapping materials on 3D text with published parameters
  • Improves performance when loading text styles
  • Motion Title templates with published text layout parameters now export correctly
  • Fixes an issue that could cause 3D text to appear dark when rendered
  • Addresses issues with timing on certain animated effects

Motion 5.2.2’s note lists a subset of the Final Cut features. Compressor 4.2.1 adds:

  • Fixes a crash that could occur after migrating a user account to another system
  • Restores the ability to use markers for i-frame placement in H.264 exports
  • Improves audio and video sync of closed captions and subtitles

For many Final Cut users, the biggest news with this update is that the text entry field for titles is now resizable:


Peter Wiggins reported on Twitter that some plugins that have been broken since the 10.2 update now work:

As regards

  • Native support for Sony XAVC-L and Panasonic AVC-Intra 4:4:4 up to 4K resolution

There is a slight wrinkle for those dealing with media from the Sony X70. ddixon writes on the fcp.co forum:

Yes! Works fine for me. The 2.0 firmware for the camera is required, and once applied you must reformat the SDXC card in-camera, then record new clips. The last FCPX update fixed this for HD - today's update fixes it for 4K.

Note that the memory card must also be reformatted in-camera after the firmware update. However, if you have 4K clips that were shot by a 2.0 firmware upgraded camera, those now magically work with 10.2.2. And, before today you could not mix HD and 4K on the same card, but I assume this limitation is no longer the case - although that's an assumption, have not tested it.

It looks like you need to check whether there is any other software to update for media compatibility. For now the Canon XF-AVC software hasn't been updated.

An IBC-friendly update

As well as better 4K camera compatibility, one feature seems designed to support developers presenting at next week’s IBC show in Amsterdam:

  • Asset management systems can include a library backup file when sharing from Final Cut Pro

Part of this feature is a settings file inside Final Cut that third-party applications can use to make libraries with Final Cut. Some users are more comfortable with library .fcpbundle files than .fcpxml files:

Library bundle

It looks like this isn’t something users are meant to do from within the normal Final Cut interface yet.

Perhaps we’ll learn more after product announcements at IBC soon - maybe during a presentation at the FCPWORKS/Soho Editors FCP EXPO.

OS X El Capitan compatibility?

This update has appeared only weeks before the expected launch of OS X 10.11. There are various possibilities here:

  • This version of Final Cut has been updated for OS X El Capitan compatibility as well
  • There was no time for this version to include OS X El Capitan compatibility
  • Due to buggy new features in the release candidate of OS X El Capitan, it wasn’t possible to make Final Cut Pro X 10.2.2 compatible with what will initially be a feature-incomplete version of OS X

I went into why this is so in the last post here: ‘OS X updates and Final Cut Pro X: A false sense of security?’

These compatibility possibilities may also apply to third-party products and services in the Final Cut Pro X ecosystem, so unlike recent OS transitions, we might have to wait a few weeks before updating to OS X El Capitan.

OS X updates and Final Cut Pro X: A false sense of security?

Thursday, 03 September 2015

One of the interesting aspects of the Apple Worldwide Developer Conference each year is the set of new additions to Mac OS X (and iOS) that could benefit those who use Final Cut Pro. Here’s another way of looking at this year’s OS X El Capitan: what if the new features mean problems for the tools we use every day?

In recent years OS X and Final Cut Pro updates have run smoothly. 10 years ago, the standard advice was to wait for three or four bugfix updates of OS X or Final Cut to come out before upgrading.

Although Final Cut Pro X has been developed in parallel with improvements to the OS X video architecture, I’ve had no problems accepting every new update of Final Cut in recent years. New versions of Final Cut often mean libraries need to be updated to a new file format.

It is a good idea to archive copies of your libraries in the old format before allowing a newly updated Final Cut to change them into the new format.

As suggested by Apple, I make an archive copy of the ‘old’ version of Final Cut in the Finder (using the ‘Compress…’ command from the File menu). Once Final Cut has been updated, I can still run the older version on the same Mac to access the older libraries.

In practice, it is the OS updates that are likely to cause more problems than application updates.

Core Image is deeper than AV Foundation

This year it might be a good idea to wait a while before updating your Mac to OS X El Capitan.

Metal will be a new underpinning to Core Image. Here’s Apple’s definition of Core Image:

Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images. It operates on image data types from the Core Graphics, Core Video, and Image I/O frameworks, using either a GPU or CPU rendering path. Core Image hides the details of low-level graphics processing by providing an easy-to-use application programming interface (API). You don’t need to know the details of OpenGL or OpenGL ES to leverage the power of the GPU, nor do you need to know anything about Grand Central Dispatch (GCD) to get the benefit of multicore processing. Core Image handles the details for you.
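
As an illustration of that ‘easy-to-use’ claim, a Gaussian blur via Core Image takes a handful of lines of Swift, with no OpenGL in sight (a minimal sketch - the file path is a placeholder):

```swift
import CoreImage

let context = CIContext()  // Core Image picks a GPU or CPU rendering path for you

let imageURL = URL(fileURLWithPath: "/path/to/frame.png")  // placeholder path
let input = CIImage(contentsOf: imageURL)!

let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(4.0, forKey: kCIInputRadiusKey)

let output = blur.outputImage!  // ready to render or hand to another filter
```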

From that definition you can see that Core Image is the basis of more of OS X (and iOS) than AV Foundation. That means updating it for new technology (Metal) takes more resources from Apple, and a deep change might cause more temporary disruption.

Although we’d all like the benefits of Apple’s Metal technology to improve Final Cut’s speed, the price might be a little incompatibility for a version of OS X or two. There’ll be enough brave Mac fans who won’t be able to resist having the newest OS on their computer. Their feedback to Apple will hasten bug fix updates if they are needed.

Impatient users may complain about third party tools not working in El Capitan: ‘Why didn’t developers take part in the OS X beta and prepare their tools for the new version?’ In some cases it might be that Apple won’t have time to squash OS bugs before release - bugs that third parties have no way of working around.

The good news is that as the underpinnings of OS X and iOS become more similar, investment in improving one OS will immediately benefit the other. In the case of Metal, making graphics and processing much more efficient in the constrained hardware environment of iOS devices will make OS X Macs run even more quickly and use less power.

Read my summary of an episode of the Debug podcast featuring former Apple people talking about the annual OS development cycle.


Baselight and ColorFinale: Two approaches to colour for editors

Wednesday, 02 September 2015

Today FilmLight announced that they will soon offer free plugins that let editors see and render - in their NLE - Baselight colour grades made on full-price Baselight systems:

The forthcoming version 4.4m1 release of Baselight Editions will make it free to view – or to render – grades passed between departments, or even to non-Baselight facilities. The new version will be released following IBC2015.

“Our aim is to make it even easier to pass around and refine the creative intent using BLGs,” said Steve Chapman, director at FilmLight. “If you want to see the latest colour grade in, say NUKE or Avid, you can do so by applying a BLG to a shot with the free version of Baselight Editions. It even allows you to render out the grade in your deliverables. If you want to modify the grade from these applications with the power of the Baselight core toolset, then you can buy – for $995 – the Baselight Editions packages.”

At the moment their Baselight Editions software only works in Media Composer and NUKE. Their Final Cut Pro 7 version isn’t in active development.

Steve Chapman’s turn of phrase in the press release is interesting: “If you want to see the latest colour grade in, say NUKE or Avid…” He didn’t say ‘If you want to see the latest colour grade in NUKE or Avid…’ - the ‘say’ implies that the ‘view and render’ versions will be available for other applications.

For now it seems that FilmLight don’t think that Apple’s and Adobe’s markets overlap with theirs. However, people and businesses with expensive Baselight installations would benefit if the grades they create could be viewed and rendered in Premiere and Final Cut. The separation that exists between editorial and high-quality colour is maintained: let people stick to what they are good at.

This would be FilmLight making the most of their ‘sunk cost’ - leveraging their investment in the UI and metaphors of their industry standard Baselight grading family. 

Color Finale: Minimal viable colour

Color Grading Central has a different approach to colour in Final Cut Pro X: Color Finale. Despite its founder having years of grading experience on high-end productions using expensive systems, they went for the ‘minimum viable product’ route.

When creating a new product or service, there is the option to release a version as soon as possible instead of waiting until everything is perfect. This is the way Randy Ubillos approached Final Cut Pro 1.0 (eventually), iMovie ‘08 and Final Cut Pro X. 

  • Choose a minimal set of features that show who you are aiming the product at
  • Implement those features in such a way that communicates your philosophy: how you want to support your well-defined audience
  • Leave out all but the most vital features
  • Make sure your first version is good enough to capture enough interest and justify continual development
  • Update and improve quickly and often, and be clear about future developments

This method worked very well for Final Cut Pro X and is working very well for Color Finale. Color Finale version 1, like Final Cut Pro X, was good enough to use on high-end work, and it is quickly getting better.

How minimal is Color Finale? It doesn’t yet support Undo and Redo for actions in its palette - all you can do is reset a control to its default. This disadvantage is worth dealing with in return for getting to use Color Finale during the months it would have taken for Color Grading Central and Apple to deliver standard undo and redo commands.

From the FAQ:

What about masks and tracking?

They are a part of our development roadmap. We plan to introduce features like these in a forthcoming 'Pro' version. The benefit of being an early adopter is you will get these new features as a free update.

Already it is possible to make colour grades track features in shots in conjunction with SliceX from CoreMelt - as demonstrated by Sam Mestman in a recent episode of Ripple Training’s MacBreak Studio video training series.

Note that minimal doesn’t mean amateur: experienced colourists are already grading TV shows and feature films using Color Finale.

For more on Color Finale, read Oliver Peters’ detailed review.

Industry standard vs. new approach

Both approaches work well for Final Cut Pro X editors. Those that want to leave the grading work to experts will rely on round-tripping with other systems such as DaVinci Resolve and Baselight. Others will benefit from tools that are better integrated into the application they use all the time. This is how sound is handled in post.

The question is whether FilmLight is interested in capturing the editing market - for both professionals and those who edit for other reasons. Can Color Finale become more advanced more quickly than Baselight becomes more suited to a wider audience?

In practice if the financial models of their audiences are different enough, Color Grading Central and FilmLight aren’t in competition. One doesn’t have to lose in order for the other to win. Which reminds me of the different approaches of Apple and Avid when it comes to developing their NLEs.

Apple and Cisco. Enterprise video next?

Monday, 31 August 2015

Following Apple’s and IBM’s partnership announced last year, Apple and Cisco have announced a rather vague partnership when it comes to iOS in the enterprise:

August 31, 2015 — Apple® and Cisco today announced a partnership to create a fast lane for iOS business users by optimizing Cisco networks for iOS devices and apps, integrating iPhone® with Cisco enterprise environments and providing unique collaboration on iPhone and iPad®.

Looks like Cisco will be doing most of the technical work and supplying their connections with big business and government ‘with Apple’s support’:

Apple and Cisco are also working together to make iPhone an even better business collaboration tool in Cisco voice and video environments, with the goal of providing employees with a seamless experience between iPhone and their desk phone.

With Apple's support, Cisco will deliver experiences specially optimized for iOS across mobile, cloud, and premises-based collaboration tools such as Cisco Spark, Cisco Telepresence and Cisco WebEx in order to deliver seamless team collaboration and reinvent the meeting experience.

Another way of looking at this is that Cisco’s sales team are probably being incentivised to sell iOS compatibility with Cisco’s telepresence products. That means iPhones being used in conjunction with wall-sized screens in meeting rooms and with people in other parts of the world who also use iPhones.

Mobile devices and enterprise video

At some point people will become accustomed to recording 4K video on their iOS devices for work reasons. How will they organise all that content?

Up until now Adobe’s strategy has been to replicate the Desktop Publishing wave that helped Apple Macs get into corporate marketing departments. Adobe are building up a name for internal and external marketing based on Creative Cloud. Although hard, it is easier selling 250,000 Creative Cloud seats to 5,000 purchasers in large organisations than it is selling 250,000 memberships to the public - and even harder getting the public to keep paying.

Eventually video will be a peer to text documents, PDFs and presentations: organisational currency. 

Desktop publishing in the late 80s and early 90s was Apple’s corporate lifeline. 80% of DTP’s graphic design features were absorbed into word processing applications by the late 90s. Apple couldn’t survive on what was left - Final Cut Pro 1.0 was designed to sell Macs to people who would realise that they needed to work with video. It has just taken much longer than many expected for video to become mainstream.

Which video editing metaphor will be absorbed into applications dedicated to day-to-day organisation of communication in large organisations? That’ll depend on which one people are more comfortable with, and which one IBM’s and Cisco’s sales teams push the most.

A ‘Movies for iWork’ application anyone?

Logic Pro X 10.2: Update for video editors?

Monday, 31 August 2015

Over at Logic Pro Expert, Edgar Rothermich has written up the many undocumented features in the Logic Pro X 10.2 update:

The release notes that come with the Logic Pro X 10.2 update indicate how big of an update this is. The list has over 250 items of new features, changes and bug fixes: https://support.apple.com/en-us/HT203718.

However, this is not all. There are additional changes in Logic Pro X 10.2 that are not mentioned in the release notes. In this article, I will not only list those changes, but provide in-depth explanations for each of those topics.

Here are aspects of this update of interest to Final Cut Pro X users.

Force-click support

A small thing is how the Logic team have chosen to implement force click for Macs that have input devices that can detect how hard you press. In Logic 10.2 force click acts as ‘click with pencil tool’ without switching from the current tool - which is usually the pointer tool in Logic.

In the case of Final Cut, we usually spend most of our time with the pointer tool - there is a less obvious ‘second most popular tool’ to assign to the force-click action. If you are in a metadata-oriented mode it would be the range tool, if you are fine cutting it would be the trim tool. As most editors have a hand on the keyboard able to switch tools instantly or even temporarily, I’m not sure what I’d want force-click to mean in Final Cut.

No closer to Soundtrack Pro X

Logic isn’t getting many new features that make it more useful for those working on video and feature projects. The Logic team are concentrating on supporting those who work with music and audio-originated productions. If you do use Logic for video post production, the new features that aid recording will be useful though.

Logic certainly doesn't look like it is getting closer to being able to work in a trackless/role-based way. It is becoming a much better track-based audio application.

Hopefully this means there’s space for a new audio post production application better suited for sound editors, mixers and designers.


As well as being able to upload music directly to established Apple Music Connect accounts, Logic 10.2 sets an interesting precedent with its integration with a third-party collaboration service called Gobbler.

Once signed into Gobbler, Logic users can:

  • Send their whole project to another Gobbler user
  • Send a song to another Gobbler user
  • Automatically back up their projects to their cloud-based Gobbler accounts

Equivalent features for Final Cut Pro X/iMovie users could be very useful! 

Some may say that Final Cut’s integration with some video collaboration systems is close to providing these services. In the case of Logic, the integration is deeper: if you’ve signed into Gobbler, Apple modifies the Logic UI. Edgar lists the UI changes in his article:

Once you have Gobbler installed on your machine, Logic Pro will notice that and adds additional Gobbler-related menu items and settings in Logic Pro.

The Procedure for backing up your Project to Gobbler

  • You can enable the Gobbler backup mechanism with a checkbox directly in the Project Save Dialog when you first save your Project.
  • That checkbox is also available in the Project Settings ➤ Assets window where you can disable/enable it at any time.
  • In addition, the File ➤ Gobbler ➤ submenu contains commands to start, pause, and remove the Backup procedure, plus access (load) any previous backups that are listed in that submenu

Read about Logic Pro X 10.2 and more at Logic Pro Expert.

Will 4K iPhones and iPads come with iMovie Pro?

Thursday, 27 August 2015

I’m holding out for foldable phone screens. Until they can deliver what I want, Apple need to come up with a main marketing point for each new generation of iPhone every September.

According to Apple tipster Mark Gurman this year’s seems to be a big leap in camera abilities:

In addition to a much-upgraded rear still camera, Apple has decided to make a significant addition to the iPhone’s video recording capabilities: 4K video recording support. The iPhone 6S and iPhone 6S Plus will be the first iPhones capable of recording video in full 4K resolution and among the first phones on the market with such capabilities, though Samsung’s Galaxy S5 launched with 4K video recording support in early 2014.

This kind of hardware update is good news for those interested in iMovie for iOS and OS X developments. 

As iMovie is the default video editor for iOS, it will need to be able to handle 4K video - likely to be 3840x2160. A doubling of linear dimensions means each frame shot will have four times as many pixels. 

Final Cut Pro and Adobe Premiere have been able to handle resolutions larger than this for over 10 years. In 2013 Apple stopped development of the old iMovie for OS X application and based iMovie (2013) on Final Cut Pro X 10.1. That meant that iMovie for OS X could from then on handle 4K video internally (and transfer 4K projects to Final Cut for export). The catch was that iMovie for iOS and iMovie for OS X were no longer so well integrated.

Serenity Caldwell writing in Macworld in October 2013:

While you can still send your projects to iTunes, you can’t open a mobile iMovie project in the new iMovie for Mac—nor will it open in iMovie ’11. Try to do so, and you’ll get the following error message: “iMovie can’t import projects created with iMovie for iOS version 1.4.1 or earlier.”

Serenity did report some good news:

After reaching out about the missing option, Apple has confirmed that the feature will be reintroduced in a future update.

As such, it looks like its removal is merely temporary—a small setback, caused by the large revamping of all iLife and iWork apps—but it’s still odd to me that the feature didn’t make the final cut. Apple has made a strong effort to unify the look and feel of its iOS and Mac apps in both the iLife and iWork suites; to prevent iMovie projects from transferring seems completely antithetical to the company’s mission. Fingers crossed, we'll see an update soon that will rectify this situation

If the iMovies were more tightly integrated, iMovie (2015) for iOS could act as a useful post production collaboration tool that would easily fit into Final Cut workflows.

It would certainly be a great demo for an Apple keynote presentation launching an iPad Pro.

iMovie for iCloud?

It’s been almost two years since the ‘Final Cut’ version of iMovie for OS X was launched. What could Apple do to make the two iMovies get on better? 

Despite many setbacks Apple’s cloud services have been improving in recent years.

How about iMovie/Final Cut Pro X for iCloud?

At the moment Final Cut Pro X has an editing mode that allows less powerful computers to play back media, and for media to be stored on smaller storage devices. ‘Proxy Mode’ converts footage to a much smaller size and a less power- and storage-hungry format.

Proxy mode is very useful for collaboration. Final Cut can convert large video files stored on workgroup servers and store proxy versions of the media on a MacBook’s internal storage.

What if Proxy Mode had an iCloud option? Recent versions of iMovie for iOS offer iMovie Theater, a way of sharing finished films via iCloud. Modifying this feature a little would allow proxy versions of footage shot on an iOS device to be made available to iMovie users on OS X, or even in the browser.

The databases that iMovie and Final Cut use to manage footage compress very well, so sharing them via iCloud shouldn’t be a problem. For example, a recent Final Cut library used to edit multiple versions of multiple short films held hours of AVCHD footage that took up 50GB of storage space. The editing database required to store the edits and metadata compressed down to 4.2MB.

The trick would be a way for iMovie for OS X and iMovie for iOS to be able to deal with Final Cut timelines.

Cloud-based collaboration needs new encoding formats. You cannot keep hours of high-quality 4K footage on current iPhones. Modern editing also likes to use animated motion graphics with transparency that can be overlaid on video in different ways. Although pros have been talking about H.265 for a while now, not all the video formats that are needed for efficient cloud-based editing have been agreed upon. For example, put me down for a proxy flavour of ProRes that includes an alpha channel: ProRes 4224 (Proxy).

Higher-end editors think the battle of video editing software is between Adobe, Apple, Avid and Blackmagic Design. In the long run it is likely to be between Apple on devices and Google in the cloud. iMovie/Final Cut Pro X for iCloud would be a step in that direction.

Back to 1.0: Interview with Adobe Premiere, Final Cut Pro and iMovie developer Randy Ubillos

Wednesday, 26 August 2015

If you spent time trying out interesting new Macintosh software in the early 90s, you might recognise the name Randy Ubillos. In those days software companies used to credit application developers in the splash screen they displayed while the software started up. Bill Atkinson was credited as MacPaint’s creator in 1984, Thomas and John Knoll were amongst those credited with Adobe Photoshop in 1990. I saw Randy’s memorable name for the first time in the splash screen for Premiere 1.0 when I opened it in 1992:


[image by Riccardo Mori from his System Folder blog]

In April 2015 Randy Ubillos retired from Apple after many years developing video and photo applications such as Final Cut Pro and Aperture. In the 90s he wrote Adobe Premiere 1.0. He started his Mac career working at SuperMac Technologies, a Mac peripheral maker.

Earlier this summer I interviewed Randy on stage at the Bay Area SuperMeetUp in San Jose, California. The evening was part of a series of events organised by video user groups in the US and in Europe.

The next Supermeet event takes place in September in Amsterdam and features special guest Walter Murch, who’s worked as sound or picture editor on films with George Lucas, Francis Coppola, Sam Mendes and Brad Bird.

1991 – Adobe Premiere

Alex Gollner: Let’s start at the very beginning… The program that became Premiere…

Randy Ubillos: …originally known as ReelTime.

…was actually demo software. Tell us about that

I was working for SuperMac and they were working on something called DigitalFilm - one of the very first digital video recording cards. It did quarter frame standard definition - they were pushing the limits of the JPEG chips that were available at the time and we needed some software to try it out. In about 10 weeks I put together a demo and we’d bring in people and show them editing on a computer and it was going over pretty well. The marketing department had just gotten out of software at SuperMac and they weren't sure what to do with it and so as it got close to shipping the card in late 1991 my software was sold to Adobe and they released it as Premiere 1.0. 

So they sold the software and didn’t involve you?

Adobe were specifically prohibited from offering me a job, because SuperMac didn’t want to lose me, so I could ask Adobe but I had to specifically do that. I went to lunch with Tim Myers and Eric Zocher from Adobe and we sat down at lunch, there was small talk and eventually they said “So…” and I said “OK. I would like to inquire about a job at Adobe.” There were two of them so they could later corroborate the story and said “That works,” and we started talking about it. 

So Adobe Premiere version 1: Were you the only programmer on it? You developed it all yourself? 

Yes. It took about 10 months.

What did you base your plan for the software on?

I had got into video at high school. At my school in Miami, Florida they had a television in every room. Morning announcements were done over the TVs. I’d installed some of the TVs, we had a TV studio and after the last period, we’d record TV commercials for the Drama club or sports clubs and in the morning we’d put together the morning announcements and in the first period we’d play that back. I got to learn about editing on 3/4" tape decks. Editing was ‘Find your edit points, zero the counters, back ’em up by five counts, and hit ‘Play’ and roll forward’ - a very manual process. I learned the concepts of editing there. Going through that on the app is where the concepts of tracks came up.

So up until then you weren’t following what Avid and other digital film products were doing?

I’d never seen that stuff. In May of that year we were at Digital World down in Hollywood, and I got to have dinner with some of the people from Avid. They came into SuperMac’s room and saw this software running on a Mac and playing back digital video. There were definitely some people at Avid who could see that this stuff was going to start migrating down.

So over the next two or three years you created another three versions of Premiere and got to version 4 in a very short period of time.

There were a couple of other guys that joined the team, so by the time we got to Premiere 4, there were three or four of us.

So at that point who was Premiere aimed at?

We weren’t really sure at the time. It was a new thing. Tim Myers [Premiere product manager] and I spent a bunch of time going down to Hollywood; talking to movie studios… we talked to the guys from The Simpsons, we were talking to James Cameron. We were looking at Premiere as a pre-visualisation tool. You could cut something together initially, but the quality level was nowhere near film. The Simpsons would do their animatics on it.

Because the DigitalFilm card was around $5,000 - a very expensive card - SuperMac came out with the VideoSpigot, a very inexpensive card. It sold for around $500. It pulled in 1/16th-size Standard Definition video, but it was the first time you could plug video from a tape or a camcorder into your computer, record it and play it back. Although the marketing people weren’t sure what people were going to do with it, it was cheap enough for people to buy it to play with for a while.

Although it was being used by professionals, you weren’t expecting people to do professional things with it yet?

Around the time of version 2 or version 3, we hired Martin Carey to do all the EDL work. We realised that although we weren’t recording at the quality that could go back out to film, we could do an offline edit, create an edit list and do an online with it, or even go to a film cut list [that a negative cutter could use to go back to the source camera footage to make the film master].

We could control a tape deck with RS232, pull stuff in and keep track of all the timecode. We could pull in 3, 4, 5 hours of footage, make an edit – choosing just the pieces needed – then go back and redigitise those at a higher quality. We were getting closer and closer to doing actual online stuff. Radius, VideoVision and RasterOps had some cards. They competed with each other on who had the best features. Someone got to 60 frames per second, someone else got to full screen.
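
[For readers who haven’t seen one: the ‘edit list’ mentioned here would typically be a CMX 3600-style EDL, a plain text file of events that an online system can re-perform from the original tapes at full quality. A single cut, with illustrative values only, looks like this:

TITLE: OFFLINE CUT
FCM: NON-DROP FRAME

001  TAPE01  V  C  01:00:10:00 01:00:14:00 00:00:00:00 00:00:04:00

That is: event number, source reel, track (V for video), transition (C for cut), then source in/out and record in/out timecodes.]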

1995 - Final Cut Pro

We had been running very quickly, doing versions of Premiere in under a year. I took a month off to decompress and I got a phone call from one of the board members of Macromedia [Adobe’s main competitor in 1995]. They had a big diagram of what they wanted to do. They had a paint program, a vector program and they wanted a video program. They wanted me to come and start a video product. So I went there and we hired a bunch of people. It was going to take 18 months and we were going to have this great video product.

It was going to be in the same vein as Premiere, but restarted. Computers had gotten faster. We were going to do a cross-platform application: Windows and Mac. It didn’t take 18 months. Two years in… three years in… Version 1.0s are incredibly difficult to do because you have no metric on how fast you are moving on things. Once you get to 1.0 you know how long that took, so you can gauge better how long things take. They are also difficult to do because you set your sights so high.

We had done a whole design of a 3D scene layout editor so you could visualise a scene and pick where you were going to put your cameras. We shouldn’t have spent time on that stuff while working on a 1.0. That was kind of crazy. There was a script editing feature, there was all this stuff. We were having fun speccing it up, but it didn’t get us coding any faster.

It was hard to find the right set of stuff. Everything seemed important enough to put into version 1.0, but the reality is you’ve got to find that right set of stuff, get it all to work together, and save more for 2.0 and more for 3.0.

But Macromedia was bought by Adobe, so what happened to this video software?

By 1998 it was becoming clear to us that Macromedia wasn’t going to be releasing our video editor. We knew we were going to be ending up somewhere, but we didn’t know where. At NAB [National Association of Broadcasters show, the main US trade fair for TV technology], Steve Jobs did the keynote and his speech pissed everyone off: “All you guys in the broadcast industry, computers are gonna come along and do all this stuff better” – he ruffled a ton of feathers.

Talking with him backstage was the first point where I realised that Apple might buy our team. It was 1998 and I had friends saying ‘Don’t go to Apple, they’ll be out of business in a year.’ What Steve saw was taking the whole line of computers that Apple was producing and squishing it down to a very small number of machines that all the energy should be focussed into, but they needed applications to bring people to the platform.

He knew that the PowerMac G3 – the first one with a DV connector on it – meant that you didn’t need big expensive digitising hardware: you were going to be able to plug directly in and get really good quality digital video for the time. You were going to be able to manipulate it, store a reasonable amount of it and you were going to be able to edit it.

At what point did you realise that being bought by Apple might be good news?

I don’t know if I thought about it that much. We’d been going for about three years on the project, we went over to them and we had more work to do. Apple wanted to re-look at what it looked like. It turned out to be a good thing. We got to a point right before NAB in ’99 where we just said “We are going to show this thing at NAB come hell or high water,” and we got the nicest present from Avid, because that was the NAB where they announced they were leaving the Mac. So there were all these Mac people at NAB saying “I hear Apple have got this new thing” and we got all these customers on a silver platter.

So Final Cut Pro 1.0 comes out and knocks most people’s socks off, and brings in so many more people. Weirdly enough, Steve asks a whole different technical team to make an editing application for ‘real people’ – ‘the rest of us.’ Did you want to be involved?

That happened when we were very compartmentalised. I knew there was something going on – it was a very small team, three or four people, working on that. I thought they were doing cool stuff, but it was something I wasn’t focussed on.

What was the target for Final Cut Pro?

The centre of the target, which I still think is very similar today, is software for the aspirational part of the market. People who want to do something good, most of them not making their living doing video - they would like to some day. They are interested in video and they spend a lot of their spare time doing it. That’s the centre of the target.

It goes out into a broad spectrum. You always find at the higher end, including Hollywood, they are always looking for ways to cut costs. They’re always trying to find new stuff that’s lower cost.

2006 - iMovie

At what point did you start paying attention to what iMovie was - in terms of what it meant to you?

I worked on Final Cut up to version 5 I think, when we got software-based real-time video effects. At the time my husband and I were taking digital pictures and putting them up on our website and I wanted a better way to work with pictures. I wanted to do more than stick up a grid of pictures – I wanted to put a travel journal together, to be able to tell stories along with pictures, so the idea for Aperture came out of that. I started a team – there were six or seven of us who developed that and got it out through version 1.0.

So I’d been away from video for some time and it was nice to come back to it with fresh eyes. We were going on a dive trip and I’d just gotten an HD video camera that we used underwater – we were in a cage with great white sharks. We had hours of footage that was going to be ‘blue, blue, shark! Blue, blue, blue, shark!’ which was going to be a nightmare to edit. I started to think of better ways to do that. I had the idea that with filmstrips you could just wave your cursor over them, and that’s where iMovie clip skimming came from. Also being able to click and drag to select – like you would with text – to pull something together.

iMovie’s codename was RoughCut – it was conceived originally as a front end to Final Cut, for creating a rough edit to bring into Final Cut. I worked with a graphic designer to make it look good. When I did a demo of it to Steve [Jobs], in about three minutes he said “That’s the next iMovie.” So I asked when it was supposed to ship, and he said “Eight months.”

The iMovie team was six months into the next cycle. They had been looking for something different: a way to make things easier and simpler.

We knew from day one that eight months was not enough time to do a whole new application. We knew we weren’t going to have all the features people wanted in iMovie ’08, so we were not surprised at all by the reaction when it went out. The set of features we said we were going to deliver, we delivered. iMovie had been around for quite a while – it was living on some pretty old code. Leaving that behind meant we could move very quickly [as iMovie ’08 was all new code].

Did you start thinking of your audience in a different way?

A lot of it was looking at it from my own perspective. I was doing more video myself. Camera and storage costs had come way down, getting to the point where you could do everything on a laptop. It was becoming very personal, so I spent time working on versions of iMovie for the Mac and for the phone with some great teams.

2011 - Final Cut Pro X

As Apple seemed to ‘get away with’ that painful relaunch of iMovie, did that help them make the decision about doing the same for Final Cut Pro?

One of the things I liked about working at Apple was that Apple didn’t have a problem with starting over again – if that was the right thing to do. You don’t want to talk about ‘sunk cost.’ The effort you’ve put in in the past has gone. From now on, what is the best way to go forward? It doesn’t matter if we spent six months working on some feature. That doesn’t matter. Is it the right feature? If so, great, continue forward with it. If you don’t do that with a product, somebody else who doesn’t have the history – the legacy you’re trying to hold on to – will jump in and take things out from under you.

The Final Cut Pro team was trying to figure out what they wanted to do next. X was a big shift. I had a big part in convincing people it was the right thing to do. I will say that I had a different idea of the way the launch might have gone… [audience laughter]

My idea was that Final Cut 7 should stay exactly as it was for about a year, and every time you bought a copy of X you got a copy of 7. They didn’t want to hear it. I knew 16 months before the launch that I was going to have a bunch of arrows in my back. I was going to be blamed for this big transition. It’s the Apple way of doing things: ‘Feet first, jump in!’

The very last conversation I had with Steve Jobs was right after the launch of Final Cut Pro X. I was getting ready to get on a plane to go to London to record the second set of movie trailers – we’d hired the London Symphony Orchestra [to perform the music that was going to be bundled with the next version of iMovie] – and Steve caught me at home: “What the heck is going on with this Final Cut X thing?” I said “We knew this was coming, we knew that people were going to freak out when we changed everything out from under them. We could have done this better. We should have. Final Cut 7 should be back on the market. We should have an FAQ that lists what this is all about.” He said “Yeah, let’s get out and fund this thing, let’s make sure we get on top of this thing, move quickly with releases…” and he finished by asking: “Do you believe in this?” I said “Yes.” He said “Then I do too.”

So that was from the top – you had the authority to make the big changes.

I wish it could have gone differently. I absolutely believed, and still do believe, it was the right thing to do: Final Cut X is a better editor than Final Cut 7 was. It’s more popular, and it’s bringing more people into editing than ever before. People who have never used an editor before find Final Cut X much easier to learn than Final Cut 7.

Talking of bringing new people to editing, what does iMovie for iOS mean to you?

It’s always been phenomenal – the fact that people can have an HD editing studio in their pocket, ready to go for editing. People take pictures all the time and publish them. They tend not to do it as much with video. One of the reasons for that is that historically people have felt that making a video is this giant involved process. People have this idea that it has to be more complicated than it is, but I enjoy showing people how to make personal movies…


…Randy then went on to give a presentation on how to use Apple tools to make personal films.

Thanks to Randy for the interview. I’m looking forward to his next 1.0 - despite the disruption it is bound to cause.


Thanks to Benjamin Brodbeck for the audio recording and to the Supermeet team for the opportunity to interview Randy. See you in Amsterdam in September!

Logic Pro X 10.2 update includes direct share to Apple Music Connect

Tuesday, 25 August 2015

The headline feature of today’s 10.2 update for Logic Pro X is Alchemy - ‘the ultimate sample manipulation synthesizer.’

The more interesting one for other post professionals is the ability to export audio productions directly to an Apple Music Connect account.

Imagine if future versions of Final Cut Pro X or iMovie had an equivalent feature: ‘Share directly to Apple Video Connect.’

The Alchemy additions to Apple’s audio applications come in the wake of its acquisition of Camel Audio earlier this year. Ars Technica has more on what this adds:

For those not familiar, Alchemy is a sample-based synthesizer, similar to Logic's existing EXS24. This differs slightly from a traditional analogue synthesizer, in that instead of generating a raw square wave or a saw wave and using filters and other tools to manipulate it, a sample-based synth uses a recorded sound or instrument as the basis for manipulation. The new version of Alchemy comes bundled with around 14GB of audio samples, which are used in over 3000 presets, and 300 Logic patches. 
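
To make that distinction concrete, here is a minimal sketch in Swift. The function names are mine for illustration and have nothing to do with Alchemy’s internals: a traditional oscillator computes its waveform from a formula, while a sample-based synth starts from a recording and manipulates it – for example by resampling to change pitch.

import Foundation

let sampleRate = 44_100.0

// Traditional synthesis: generate a raw sawtooth wave from a formula.
func sawtooth(frequency: Double, seconds: Double) -> [Double] {
    let count = Int(seconds * sampleRate)
    return (0..<count).map { i in
        let phase = Double(i) * frequency / sampleRate
        return 2.0 * (phase - (phase + 0.5).rounded(.down))  // ramp from -1 to 1
    }
}

// Sample-based synthesis: start from a recorded sound and resample it
// to play back at a different pitch (ratio 2.0 = up an octave).
func repitch(_ recording: [Double], ratio: Double) -> [Double] {
    let count = Int(Double(recording.count) / ratio)
    return (0..<count).map { i in
        recording[min(Int(Double(i) * ratio), recording.count - 1)]
    }
}

Filters, envelopes and effects then apply equally to either source; the difference is only where the raw signal comes from.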

Post production application convergence: Want an ‘all-in-one’ app?

Tuesday, 25 August 2015

Bart Walczak thinks there’s a race towards a unified video editing and colour correction application:

With the introduction of Resolve 12, suddenly the race towards a unified NLE/Grading tool become very interesting. It’s hard to argue, that colour correction and grading became an integral part of post-production workflow.

He’s written a very detailed post about the state of colour in Adobe, Avid, Apple and Blackmagic Design software. His summary:

In the race for NLE/Grading application combo, the main competition at the moment seems to take part between Adobe Premiere Pro and Blackmagic Resolve with Avid Symphony lagging behind, and FCP X coming around but looking in a different direction. With the release of version 12, it seems that BMD really delivered. Even though Resolve has yet to prove itself as a successful NLE, it is quickly getting there, and if you consider the price tag, there is really not much to argue with.

I don’t think that it is inevitable that colour grading will be done in editing applications.

Colour grading could end up like audio, titling or encoding: three parts of the post production process that have been integrated to varying degrees into NLEs.

Grading could be like audio: the point of collaboration is that you leave some things to experts using tools that work for them. I don’t think a majority of editors working on high-end jobs want all the features of Logic or ProTools in their video application. If you are going to do it yourself – even when doing placeholder work that you will hand over to brief the expert – you need tools that match the sensibilities of the application you use.

Grading could be like titling: not every editor wants to be a motion graphics designer. When it comes to titling, even motion graphics designers are intimidated by many advanced aspects of typography. That’s why we don’t see all of After Effects in Premiere or all of Motion in Final Cut. Most editors would rather choose from a good selection of design templates, add their text and go back to editing.

Grading could end up like encoding: although we have separate applications for creating custom encodes, most of the utility of those apps is now built into NLEs.

Bill Roberts, Sr. Director of Product Management for Video at Adobe, doesn’t plan to add all the colour grading power of SpeedGrade to Premiere. From an interview he gave to Julien Chichignoud at a SMPTE conference in July:

Our philosophy is not to input the full complexity of a ‘craft’ tool in the NLE user interface. We are looking at a full workflow, the connection between creative disciplines and then designing the optimized workflow from that perspective.


The Adobe team believes giving optimized controls in the editing experience and deep controls in a dedicated interface is the natural approach.


Optimal picture size for viewing HD and 4K at different distances at home

Tuesday, 25 August 2015

When you’re deciding which TV to buy, manufacturers talk about the right screen size for the distance at which you watch the picture. This is based on the image taking up around 30º of your field of view – this allows for less distraction from outside the TV picture area and allows people with 20/20 vision to get the benefit of HD resolution: to see pixel-level detail.

To get the benefit of the larger number of pixels in 4K TVs and displays, the suggestion is that the image should take up around 60º of the viewer’s field of view. This is based on 20/20 vision being defined as being able to distinguish detail at 1/60th of a degree of your field of vision.

Here is a table showing the optimal diagonal screen size for HD and 4K images at different distances [Using this field of view equation]:


In the UK I don’t think that typical homes have got bigger in recent years. Most people set up their TV and seating to fit the room layout. A 2004 BBC survey reported that the average TV viewing distance in the UK was 2.7m (8.9ft).

Following the 30º recommendation, the best HDTV size for the average viewing distance would be 70" diagonally. For 4K, the image should measure around 136" diagonally.
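
The geometry behind these numbers is simple enough to sketch in code. Here is a minimal Swift version, assuming a 16:9 picture and the standard formula width = 2 × distance × tan(FOV ⁄ 2); it lands in the same ballpark as the figures above, though not exactly on them, as those came from the linked calculator.

import Foundation

// Diagonal screen size (in metres) for a given viewing distance (metres)
// and desired horizontal field of view (degrees).
func optimalDiagonal(distance: Double, fieldOfView: Double) -> Double {
    let width = 2 * distance * tan(fieldOfView * .pi / 360)  // tan(FOV / 2)
    return width * sqrt(337.0) / 16  // √(16² + 9²) / 16 converts width to a 16:9 diagonal
}

let inchesPerMetre = 39.37
print(optimalDiagonal(distance: 2.7, fieldOfView: 30) * inchesPerMetre)  // ≈ 65, for HD at 30º
print(optimalDiagonal(distance: 2.7, fieldOfView: 60) * inchesPerMetre)  // ≈ 141, for 4K at 60º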

For now if you want a 4K experience at home, I suggest you investigate projection systems.


Apple collaborating with YouTubers: To promote what?

Monday, 24 August 2015

MacRumors reports that Apple is in talks with popular YouTube channels to involve them in a big promotional campaign: 

Although their styles don't mesh together, Apple could be creating a new ad campaign, possibly for the upcoming "iPhone 6s" and "iPhone 6s Plus," that centers on creatives from YouTube using the company's new products as the main tools for shooting, editing, and publishing content for their channel.

I don’t think MacRumors understands what YouTubers could do for the Apple brand. 

YouTubers are much more likely to be talking about what Apple products and services do for their lives and for their friends and family. They have audiences that trust them, so it is a matter of whether they want to deliver those audiences to Apple.

If YouTubers do talk about how Apple products aid YouTube production, they’ll be promoting how good the iPhone and iPad cameras are, but it is unlikely they’ll revert to using iMovie on iOS. Most YouTubers are too invested in editing and publishing their content using iMovie and Final Cut Pro X running on the Mac. 

Apple is promoting the idea that a picture taken by any iPhone or iPad user might be used in an international advertising campaign. Doing the same for video is much harder, because video is harder to make work in different contexts than single images are. A step on the way will be getting iMovie used much more as a matter of course after shooting video on an iPhone. That means building up the combined iMovie/Final Cut Pro ecosystem, which could be good news for millions of potential film makers.

4DX Cinema: Turning movies into rides

Monday, 24 August 2015

Judith Allen runs down the pros and cons of adding ‘in-seat’ theme park effects to the cinemagoing experience:

I find myself now living near the UK's first 4DX cinema in Milton Keynes. I went to see Ant-Man to try it out.

The effects were actually turned on during the Mission:Impossible Rogue Nation trailer beforehand, which worked quite nicely to acclimatise everyone to it and get the 'surprised' shrieking over and done with before the feature. However, the intensity of a trailer became quite overwhelming, with the motion effects struggling to keep up as we went from one action shot to another... and the effect of the pads in the seat actually beating you up was quite unexpected

I wonder if the distancing effect of stereoscopic 3D will soon be relegated/promoted to such ‘ride experiences’ only.

How well 4DX works is instructive for those working on 360º movies. I wonder if 360º will one day escape the ‘game’ or ‘ride’ classification.

Helping people consider alternatives

Sunday, 23 August 2015

FiveThirtyEight Science on why it is hard to convince people of things once they’ve already decided.

You have to be sure before you destroy what you already know and substitute it with something new


Simply exposing people to more information doesn’t help.


If you want someone to accept information that contradicts what they already know, you have to find a story they can buy into. That requires bridging the narrative they’ve already constructed to a new one that is both true and allows them to remain the kind of person they believe themselves to be.

A concept to bear in mind before you get into debates on NLEs and workflows!
