Apple creative apps architect Randy Ubillos speaking in LA and San Jose

Wednesday, 13 May 2015

The Los Angeles Creative Pro User Group has announced that ex-Apple employee Randy Ubillos will be speaking at public events in May and June. 

Until April 23rd Randy Ubillos was a very important member of Apple's application software team:

His influence on Mac software started years before he joined Apple: he developed the first versions of the Adobe Premiere video editing software. After joining Apple he worked on Final Cut Pro, iMovie and iPhoto amongst others.

On May 27, 2015 he will be appearing at the May LACPUG meet in Los Angeles. On June 26, he will be appearing at the Bay Area SuperMeetUp - a similar event in San Jose.

It isn't common for ex-Apple employees to talk publicly about areas of expertise they covered while working at Apple - especially so soon after leaving the company. I guess this is either very bad news or very good news. The negative explanation is that Randy resigned because his vision for the future of Photos, iMovie, Final Cut Pro X and other applications he was involved with was too different from Apple's plans. His resignation was interpreted by some as a sign that Apple are about to give up on their professional applications - including Final Cut Pro X, Motion, Compressor and Logic Pro X. The bad news would be that Randy feels embittered enough to go public with problems at Apple almost immediately.

The 'good news' interpretation is that Randy appearing in public is part of Apple loosening up - that they understand that it is a good idea if users understand more about the people and motivations behind Apple software.

The good news is that the agenda at the LACPUG website says that Randy will be talking about his enthusiasm for the idea of telling stories with video: 

Randy will speak about his own moviemaking experiences and the power of video to inspire and document our lives. He will also provide tips and tricks for making your own movies.

That kind of talk could be designed to establish his bona fides for a new passion project supporting video literacy. A good sign is that he will also be joining post production experts to answer filmmaking questions in a 'Stump the Gurus' session.

There's no sign that he'll be 'dishing the dirt' on Apple or revealing secrets about Final Cut Pro X, Photos and Aperture - Mike Horton of LACPUG specifically said as much in a tweet.

However, the fact that Randy is speaking in public so soon after leaving Apple is a good sign.

Will Virtual Reality change which stories we tell?

Wednesday, 13 May 2015

For a few years now I've enjoyed using panorama apps on my iPhone.

Occipital's 360Panorama iOS app can teleport you into a panorama by using the iPhone's accelerometers and other motion sensors. These detect the direction and angle at which the phone is being held. 360Panorama uses this orientation information to determine which part of a panorama to show on screen.
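The core mapping is simple enough to sketch. This is my own illustration - not Occipital's code - assuming an equirectangular panorama where the phone's heading and tilt pick out a viewport:

```python
# Illustrative sketch - not Occipital's code - of mapping device
# orientation to a viewport within an equirectangular panorama.
# Assumes the panorama covers 360 degrees horizontally and 180 vertically.

def viewport_origin(yaw_deg, pitch_deg, pano_w, pano_h, view_w, view_h):
    """Return the top-left pixel of the viewport for a given heading
    (yaw, 0-360 degrees) and tilt (pitch, -90 to 90 degrees)."""
    # Heading wraps around the full width of the panorama.
    cx = (yaw_deg % 360.0) / 360.0 * pano_w
    # Pitch 90 (straight up) maps to the top row, -90 to the bottom.
    cy = (90.0 - pitch_deg) / 180.0 * pano_h
    # Centre the viewport on that point: x wraps around, y is clamped.
    x = (cx - view_w / 2) % pano_w
    y = min(max(cy - view_h / 2, 0), pano_h - view_h)
    return int(x), int(y)

# Facing 'backwards' (180 degrees) and level: viewport centred halfway across.
print(viewport_origin(180, 0, 3600, 1800, 400, 300))  # (1600, 750)
```

A real viewer also corrects for lens projection and uses the gyroscope for smooth tracking, but the idea is the same: orientation in, crop origin out.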

This means as you turn left and right (360°)…

Three frames of a panorama as the phone turns left and right

up and down (180°)…

Two frames of a panorama as the phone tilts up and down

in real life while holding the phone, the panorama display updates to show you what you would see if you looked in that direction at the place and time the panorama was captured.

In a new VR music video iOS app for 'Stor Eiglass' by Squarepusher, the same technology gives you the opportunity to look in all directions during a 3D animation. If you have a Google Cardboard viewer you can also experience the VR in stereoscopic 3D, but the 2D version works just as well. The app is also available on Android.

While playing the video, I could look ahead as I flew forward...

Screenshot of the view ahead while flying forward

and look down to see what I was flying over:

Screenshot looking down at the landscape being flown over

I wasn't able to choose the direction of my flight (the application flies a set path through a virtual world), but I was able to look around as things happened. As streams flew down from a tower...

Screenshot of streams flying down from a tower

...I could look up...

Screenshot looking up at the tower

...or behind me:

Screenshot looking behind

VR: No director or cinematographer scene framing, few editor edits

From a storytelling point of view, this kind of virtual reality means that the viewer/player/user chooses where to look: how to frame the scene. They choose what is important to look at. Part of non-VR storytelling is the ability of the cinematographer, director and editor to direct the audience's view: "This is important," "her reaction is important" and "don't forget this."

The point of VR is that a solo audience member takes control of where to look. They can even change aspect ratio if they turn their phone:

Two screenshots showing landscape and portrait aspect ratios

In-scene editing isn't possible because the editor cannot juxtapose different camera angles with editing - the audience chooses the camera angle.

Another aspect of editing is possible. Structure-based edits can be done with staging. Structure provides the beginning, middle and end of stories. 

Staging means that virtual physical boundaries between scenes act as edits.

In a city I fly towards an advertising billboard:

Screenshots of the approach to the billboard

Flying through the billboard is a way of travelling between scenes to a new environment:

Screenshot of the new environment after flying through the billboard

I can look back to see there's no way back to the previous scene.

Screenshot looking back at where the billboard was

Why are these 'staging edits' important? They help change pace and mood, which makes storytelling possible - establishing that this scene takes place in the same story as the previous scenes:

Screenshot of a later scene in the same story

New storytelling technology, new language, new stories?

The history of movies and TV is the history of technological developments informing the way we tell stories. Movies started off as single shots being shown to large numbers of people in public. As artificial lighting, editing, sound, colour, multitrack audio, model visual effects and computer generated visual effects appeared, the way we told stories changed - which informed the kind of stories we told.

Now's the time to consider whether VR will affect the way we tell stories and what stories we tell.

Shared storage for Final Cut Pro X post teams from GB Labs and LumaForge

Wednesday, 06 May 2015

For many years post production teams have been able to access media on shared storage. GB Labs and LumaForge make products that can be tuned to work well with editors who use Final Cut Pro X.

GB Labs' Space

The GB Labs Space storage range is a NAS (Network Attached Storage) system. 10 Gigabit Ethernet connections mean that editors can work with footage and Final Cut Pro X libraries stored on shared storage.

With current connection speeds, the limiting factor for video data rates isn't the networking technology but the speed of the shared storage and storage controllers.

For two simultaneous users who need a fast direct connection to their storage, GB Labs sell a relatively portable product: the Midi Space SSD. It is designed to travel from place to place with a film crew. 

The Midi is a tower-PC-sized device with two 10GbE connections, up to 13TB of storage and 2,000MB/s throughput.

GB Labs Midi Space SSD NAS device

The GB Labs Space SSD is a less portable but higher-performance device, with a throughput from 3,000 to over 6,000MB/s. It is rack-mounted and serves multiple editors via an external switch:

GB Labs Space device connected to editing workstations via a switch
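A back-of-envelope calculation shows what those throughput numbers mean in streams. Assuming roughly 92MB/s per 4K stream - in the region of ProRes 422 HQ at UHD 30p; the real figure depends on codec and frame rate - the arithmetic looks like this:

```python
# Rough stream-count arithmetic (my own assumptions, not GB Labs' figures).
# Assumes ~92 MB/s per 4K stream, roughly ProRes 422 HQ at UHD 30p.

PER_STREAM_MBS = 92  # megabytes per second per stream (assumption)

def max_streams(throughput_mbs, per_stream=PER_STREAM_MBS):
    """Simultaneous streams a given storage throughput can sustain."""
    return throughput_mbs // per_stream

for name, rate in [("Midi Space SSD", 2000),
                   ("Space SSD (low end)", 3000),
                   ("Space SSD (high end)", 6000)]:
    print(f"{name}: about {max_streams(rate)} streams")
```

In practice headroom is needed for library access, rendering and background tasks, so real-world editor counts will be lower than this ceiling.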

GB Labs' workflow page for Final Cut Pro X.


LumaForge's LumaShare

LumaForge recently introduced their LumaShare Mobile family. It is a portable (as in a luggable single tower PC-sized device) system that supports up to 12 4K users using direct 10GbE connections. Adding an external switch supports more users.

Their 4 minute demo on Vimeo shows how many streams of 4K can be served from a single LumaShare box:

Because of the way Final Cut Pro X can work with files, the same 16 4K files can be simultaneously streamed to multiple editors on the same network via their own Final Cut libraries (which are also stored on the server).


For speed and storage specifications for the LumaShare family along with prices, visit LumaForge.

As well as GB Labs' or LumaForge's devices for each workgroup, each Mac needs a 10 Gigabit Ethernet connection. Modern Macs get this using Thunderbolt adapters - such as those from Promise, Atto and Sonnet.

Almost plug and play

The new economic model for post production support means that the market will need medium to high-end solutions that are almost plug and play. There isn't much margin in selling Macs and video editing software, and the new generation are becoming accustomed to doing without service contracts - supporting themselves instead. LumaForge say that they tune each LumaShare they sell to match the specific needs of the workgroup - including the way Final Cut Pro X libraries work on NFS shares. GB Labs have partners in Europe and the US.

Products like GB Labs Space and LumaForge LumaShare are designed to be set up by assistant editors and DITs. If both companies provide enough online training and support, collaborative workflows for many artists working with large amounts of high resolution footage will be accessible to many more people.

Up until now, obscure user interfaces have been a sign of 'high-end professional' products, but as products move 'down market,' UI quality will become more important than features. Once products provide good enough hardware and software to get the job done at similar prices, it will be the system that is easier to set up and maintain that will win.

Timecode window for Final Cut Pro X

Tuesday, 05 May 2015

Newly available for Final Cut Pro X users: a flexible timecode display window. It is a free download for users of FxFactory, the free plugin management system for Final Cut Pro X and other post production applications.

Because plugins cannot yet modify Final Cut's menu, you access the new Timecode window by right- or control-clicking the timecode display above the project timeline:

Screenshot of using shortcut to show timecode window in Final Cut Pro X

The window always shows exactly what Final Cut's timecode display shows:

Screenshot of Final Cut Pro X timecode window showing native timecode of clip that is being skimmed in the timeline

You can resize the window by dragging the corners or edges.

You can also choose what colours are used for the text and the window background:

Screenshot showing shortcut menu that accesses timecode colour settings

The colour controls include opacity:

Screenshot showing the opacity control in the timecode window colour settings

The examples shown in these screenshots include a background colour with an opacity of 33%.

Timecode over full-screen video

If you have two displays attached to your Mac, you can also overlay the timecode window on top of full-screen video:

Screenshot showing timecode with transparency appearing over full-screen video in Final Cut Pro X

To do this

  1. Drag the timecode window to your secondary display
  2. Go to full screen mode on your primary display using the 'View:Playback:Play Full Screen' command or the Shift-Command-F keyboard shortcut
  3. Drag the timecode window back over your primary display

At the moment the window shows the same information as Final Cut's normal timecode display panel. Final Cut Pro X displays project timecode when skimming in the timeline, and clip timecode when the cursor is over a specific clip.

If you set the timecode display to show subframes in order to do sub-frame audio editing, the window doesn't yet show the same precision:

Screenshot showing that when subframes are shown by Final Cut Pro X, the FxFactory window doesn't show them
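The subframe arithmetic itself is straightforward: Final Cut divides each frame into 80 subframes. Here is a sketch of my own (non-drop-frame, integer frame rates only) showing the conversion:

```python
# Sketch of subframe-precise timecode, assuming Final Cut's convention
# of 80 subframes per frame. Non-drop-frame, integer frame rates only.

SUBFRAMES_PER_FRAME = 80

def to_timecode(seconds, fps=25):
    """Convert a time in seconds to HH:MM:SS:FF.SS timecode with subframes."""
    total_subframes = round(seconds * fps * SUBFRAMES_PER_FRAME)
    frames, subframes = divmod(total_subframes, SUBFRAMES_PER_FRAME)
    total_seconds, ff = divmod(frames, fps)
    minutes, ss = divmod(total_seconds, 60)
    hh, mm = divmod(minutes, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}.{subframes:02d}"

# Half a frame past frame 12 of the second second, at 25fps:
print(to_timecode(1.5))  # 00:00:01:12.40
```

Subframe precision matters for audio because a sound event rarely falls exactly on a frame boundary; 1/80 of a 25fps frame is half a millisecond.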

In Final Cut Pro 7 and earlier, there was an option to overlay timecodes of all the clips in the timeline at the playhead. Since June 2011 Final Cut Pro X's information overlays have been simpler.

Maybe the ProApps team are hoping that the need for editors to know so much about timecode will go away. On the other hand, they might be working on a much more configurable overlay system for a future version of Final Cut. Time will tell!

Timecode and FxFactory are free downloads for Final Cut Pro X 10.2 and OS X Yosemite 10.10.2 and newer.

Primordial metadata

Timeline and clip timecode are an example of a form of metadata that is over 100 years old. When films were shot on celluloid, editors had to manage film edge code - sometimes adding their own codes to shot film to be able to manage every frame.

Hopefully Apple will add features that will allow Final Cut users to view and edit any metadata in a floating window - including timecode. The kinds of metadata that would be useful in this case would be

  • Timecode
  • Slate/Scene/Take
  • GPS-recorded location (co-ordinates / colloquial name of place)
  • Keywords
  • Colour grade name
  • Name of person who last made changes/changed metadata

Periscope broadcast 1: Final Cut Pro X - April 6 2015

Tuesday, 07 April 2015

As Periscope currently only allows replays for 24 hours, here is a copy of yesterday's 'scope' on YouTube.

I muse upon Final Cut Pro X and answer questions put to me by Periscope followers.

To make this video, I recorded my iPhone's screen while it displayed the replay in the Periscope application. Do this by connecting your phone to a Mac running Yosemite, launching QuickTime Player and choosing File:New Movie Recording.

Pop-up menu in QuickTime Player X showing iPhone selected

Then go to the pop-up menu next to the record button and choose the iPhone's camera and then the iPhone's microphone.

 

Chatty apps in OS X: Quiet them down

Wednesday, 25 March 2015

When I upgraded OS X to Mavericks I found that Final Cut Pro X alerts got more annoying. Every time I exported a movie from my edit, I would get an alert when the background export finished. When I need to export many movies in a short period of time, I end up with a whole series of alerts:

Screenshot of many alerts produced during a Final Cut Pro X export session

To prevent Final Cut - or any OS X application - being so 'chatty,' go to System Preferences and choose the Notifications pane.

Screenshot of Mac OS X System Preferences with Notifications icon highlighted

Scroll down and click 'Final Cut Pro.'

Screenshot showing Final Cut Pro selected in the Notifications pane of System Preferences

Change the alert style from Alerts to Banners. Instead of having to dismiss each alert, banners go away automatically.

'Bumpy' pixels: iMovie Apple Force Touch trackpad haptic feedback

Monday, 16 March 2015

Apple has updated iMovie 10.0.7 to provide context-specific haptic feedback for those using a Force Touch trackpad.

As part of their March 9, 2015 event Apple announced a new kind of trackpad for their MacBook computers. Instead of registering clicks using a switch, the new trackpad is able to recognise a range of pressures. The Force Touch trackpad can detect a light touch for when the user wants to move the cursor without clicking and dragging, a heavier touch for when the user wants to click or drag, and an even heavier press - a 'force click' - which is used for shortcuts.

As this new trackpad has no click switch, it is hard for users to know how hard they are pressing without physical feedback. They need to be able to feel the difference between moving the cursor, clicking a UI object and force clicking a part of an application. The Force Touch trackpad includes a 'Taptic Engine' - tiny magnets that move the trackpad in such a way that it feels as if the trackpad has flexed downwards.
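The pressure-to-gesture mapping can be sketched as a simple classifier. The threshold values here are invented for illustration - Apple's actual values aren't public:

```python
# Illustrative classifier for a pressure-sensitive trackpad.
# The threshold values are made up for illustration, not Apple's.

def classify_press(pressure):
    """Map a normalised pressure reading (0.0 to 1.0) to a gesture stage."""
    if pressure < 0.1:
        return "track"        # light touch: move the cursor only
    if pressure < 0.6:
        return "click"        # ordinary click or drag
    return "force_click"      # deeper press: trigger a shortcut

print(classify_press(0.05), classify_press(0.3), classify_press(0.9))
```

A real implementation would also apply hysteresis - requiring the pressure to drop well below a threshold before leaving a stage - so that a wavering finger doesn't flicker between click and force click.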

An Apple support document lists some examples of shortcuts accessible by force clicking:

  • Link previews: Force click a link in Safari or Mail to see an inline preview of the webpage.
  • File icons: Force click a file icon to see a Quick Look preview of it.
  • File names: Force click a file name in the Finder or on your desktop to let you edit the file name.
  • iMovie: When your iMovie project has an animated Map or Globe, you can Force click the map in the Timeline to access a Style menu. This lets you choose from four different styles.

As well as being able to simulate old physical trackpad features, the Taptic Engine can also provide physical feedback based on context:

  • iMovie: When dragging a video clip to its maximum length, you’ll get feedback letting you know you’ve hit the end of the clip. Add a title and you’ll get feedback as the title snaps into position at the beginning or end of a clip. Subtle feedback is also provided with the alignment guides that appear in the Viewer when cropping clips.

Final Cut Pro X is my video editing application of choice. iMovie is a full version of Final Cut Pro X running an additional consumer UI. As Final Cut Pro X hasn't been updated since December, iMovie's use of the Force Touch trackpad is a preview of features I hope to see in the next version of Final Cut.

I visited an Apple Store in London to see how iMovie 'felt' on the new version of the 13" MacBook Pro with Retina. 

'Feeling' the user interface

I tried two out of the three features mentioned in the support document. I couldn't feel any 'snapping' as I moved a title to the start or finish of a clip.

When I dragged the clip to its maximum length I did feel a little bump. Without looking at the timeline - looking at the viewer instead - I could 'feel' the end of the clip.

This feature presages the ability for UI pixels to be 'bumpy' - for users to feel the texture of application UIs without having to look at where the cursor is. This means that seemingly textured software keyboards and control layouts will be able to be implemented on future trackpads, iPhones and iPads.

Perhaps we'll look back and realise that the iOS 7 update removed borders from buttons because one day Apple user interfaces will be able to be felt as much as seen, and button text labels will feel more distinctive than button borders under our fingertips.

Non-visual manipulation

Film and video editing is an interesting UI problem: You need to look at the footage you are editing while you manipulate the clips that represent the footage in a timeline. That is why keyboard shortcuts are especially popular amongst video editors. No need to look at your mouse pointer or the timeline as you manipulate clips - just press the keys that change the edit.

Once a complex timeline can be represented by a touch-only UI, editing will go full-screen. The screen will show footage only, while the editor will be able to feel the edits as the story plays out.