It took me years to get my head around Apple Motion. I spent a long time trying to learn how to make the kind of complex multi-layer keyframed motion graphics I used to make in Adobe After Effects. Motion finally clicked for me once I had a more straightforward task: making a simple plugin for Final Cut Pro X.
After making many Final Cut plugins and motion graphics sequences with Motion, I’ve come to know it well. One of the two sessions I taught at the 2015 FCPX Creative Summit was about using Motion’s behaviors for animation.
I’ll be teaching a webinar version of that session on Tuesday called “Exploring Apple Motion Behaviors for Easy Animation”:
The real power behind Apple Motion is behaviors. Behaviors use the power of complex calculations and real-time rendering to produce results in minutes that would take hours to create and modify using keyframes or complex math. Behaviors can control almost everything in Motion — including graphics, text, particles and cameras. Alex will show behaviors controlling graphics and particles and show how much fun you can have by playing with behaviors in Motion.
Register for free to watch live and ask questions at the Moviola website.
In recent years, more TV and internet news features have used recordings made on mobile phones. iPhones and Android phones are also being used by professional journalists. The art and science of using consumer technology this way is known as Mobile Journalism. There are blogs, Twitter hashtags and conferences on the subject.
Final Cut Pro X is the editing software ‘for the rest of us’ – designed for professionals in many fields, not just editing. That means broadcast news organisations are training all sorts of staff to use X (some would say everyone but the editors).
After a few years of steadily improving camera phones and video editing applications that run on iOS and Android, the quality of mobile audio recording is now catching up. Recent devices can bypass the built-in microphones, which were designed for telephone conversations.
Recently Glen Mulcahy of Irish national TV and radio broadcaster RTÉ compared two systems that connect via the iPhone’s lightning port:
Today I got my hands on the Sennheiser ClipMic Digital mic for iOS so I decided to shoot a quick unboxing video and then do an audio test and a video test to pitch it against the iKmultimedia iRigPro and AKG 417pp Lav which we currently use for Mobile Journalism here in RTÉ.
He used the Apogee MetaRecorder iOS app which includes metadata tagging for those editing MoJo footage in Final Cut Pro X.
Ripple Training’s video shows how marker, keyword and role information captured on location can be imported into Final Cut:
ClipMic Digital is a new microphone from Sennheiser that turns your iPhone or iPad into a professional digital audio recorder. By downloading the companion app from Apogee, you can record and add metadata to your recordings that can be read by Final Cut Pro X via XML. This is one of the COOLEST app/mics we’ve ever used!
Adobe Premiere users who paid attention to ‘About this software’ dialog boxes in the 90s will recognise the name Randy Ubillos.
He was the lead developer for both Adobe Premiere in the early 90s and Apple’s Final Cut Pro in the late 90s.
By the time Final Cut Pro X was launched in 2011, he was chief architect for Apple’s photo & video applications. Apple included him in many important keynotes. His presentations included demos of new versions of iPhoto and iMovie for iOS as well as iMovie ’09:
Randy retired from Apple in April this year, but he is already making public appearances. Next he’ll be at the Bay Area SuperMeetUp in San Jose on June 26th. The SuperMeetUp is one of a series of events for those who use Macs and PCs for TV and film making.
I’m happy to say that part of his appearance will be an on-stage interview where I’ll ask him about storytelling and what has driven him over the years to make tools that have changed millions of people’s lives. As well as talking about developing applications that went on to be used by professionals to make TV shows and feature films all over the world, he’ll discuss the value of creating tools for everyone else to tell their stories.
That same day FCPX Creative Summit delegates will be attending a presentation at Apple’s offices about the latest version of Final Cut Pro X.
FCPX Creative Summit attendees have the unique opportunity to visit the Apple Campus in Cupertino and hear directly from FCPX product managers! You’ll get a unique perspective on how this video editing software has changed the industry and how it continues to innovate today.
Get an update from Apple Product Managers on the current release of Final Cut Pro X, exciting customer stories, and the thriving ecosystem of third-party software and hardware.
Representatives of Apple’s ProApps team have appeared at professional events over the years, but this event marks the first time a large group of post production professionals have been invited to visit Apple.
These days we expect all live presentations to be filmed and made available on the internet within hours, which makes attending in person much less essential. Despite Apple opening up more recently, they still ask that public presentations by the Final Cut Pro X team aren’t recorded and put online. Most assume that this is part of Apple’s culture of secrecy. In practice it might be because the ProApps team want to use footage they are not cleared to show online, such as rushes and alternate takes from Warner Bros.’ recent Will Smith and Margot Robbie feature film, which was edited in Final Cut Pro X.
That week is the 4th anniversary of the radical reinvention of Final Cut Pro X. Some Final Cut users hope that Apple’s invitation means they will introduce exciting new features as part of a birthday celebration. Although that is possible, even if Final Cut remains unchanged, it is worth visiting the mother ship to learn from those who make the software.
The Apple WWDC 15 session video on AV Foundation shows there are new features for developers who want to manipulate QuickTime movies on the Mac.
Some notes from the video:
The new version of AV Foundation provides new classes – AVMovie and AVMutableMovie – for applications to edit QuickTime movie files.
Open QuickTime movie files and perform range-based editing on movies in tracks.
You select a segment of a movie and copy it into some other movie.
Add and remove tracks (tracks in QuickTime can refer to any time-based information, such as subtitles, GPS info, camera metadata)
Associate one track with another – such as saying that this track is the chapter break information for that track.
Add or modify movie and track metadata.
Create movie files and URL sample reference movie files.
‘QuickTime movie’ means data in a file that conforms to the QuickTime movie file format or to ISO base media file formats that were based on QuickTime, such as MPEG-4.
Sample data (audio and video content) can be in files separate from the QuickTime movie.
Movies that reference external media are ‘fragile’ – if the media is deleted or moved, the movie cannot play.
AV Foundation can now update an existing movie file without touching the sample data. That means edits, tracks and metadata can all be changed as long as the samples stay the same – “in place editing”. (In the context of AV Foundation, URLs usually describe the location of files in connected storage.)
An example project shows how an application can combine many gigabytes of footage with metadata.
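As a rough illustration of the “in place editing” idea above, here is a Swift sketch using the AVMovie and AVMutableMovie classes from the session. The file paths are hypothetical, and this is only an outline of the workflow, not Apple’s sample project:

```swift
import AVFoundation

// Sketch: build a new movie that references existing sample data
// instead of rewriting it. Paths below are placeholders.
let sourceURL = URL(fileURLWithPath: "/Movies/interview.mov")
let destinationURL = URL(fileURLWithPath: "/Movies/edit.mov")

let source = AVMovie(url: sourceURL)
let movie = AVMutableMovie(settingsFrom: source, options: nil)

do {
    // Take the first five seconds of the source movie.
    // copySampleData: false means no audio/video samples are copied;
    // the new movie simply references the existing sample data.
    let fiveSeconds = CMTimeRange(start: .zero,
                                  duration: CMTime(value: 5, timescale: 1))
    try movie.insertTimeRange(fiveSeconds, of: source,
                              at: .zero, copySampleData: false)

    // Write only the movie header to disk; sample data stays untouched.
    try movie.writeHeader(to: destinationURL,
                          fileType: .mov,
                          options: .addMovieHeaderToDestination)
} catch {
    print("Edit failed: \(error)")
}
```

Because only the header is written, changing an edit is close to instant even when the referenced footage runs to many gigabytes.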
Good news for post production people who need developers to make applications that support complex workflows, and for those who hope existing tools will get useful new features.
Most understated 10.11 feature here is “edit in place”. No longer have to re-export entire file for trivial change. http://t.co/jqmjlnVTWU
— Digital Rebellion (@digitalreb) June 11, 2015
Digital Rebellion are the makers of Pro Media Tools for Final Cut Pro, Avid and Adobe software.
Before AV Foundation, the QuickTime libraries in older versions of OS X were able to manipulate QuickTime reference movies. These were small files that could represent complex edits of multiple external media files. Reference movies are much simpler to work with than gigabytes of video and audio footage.
Maybe it’s time to do a quick course in Swift so you can make your own post production OS X applications!
Note that the screenshot shows these new features are OS X El Capitan only (the OS X logo in the top right of the screen). Once they’re available on iOS, tools for iPhones and iPads will be able to do much more with movie files.
Autodesk Smoke 2016 includes improved support for Final Cut Pro X XML export.
Although Smoke can use other formats for export, the new help file points out that Final Cut’s format is the one to use for collaboration:
Use FCP X XML Export when you want to share a sequence with third party applications.
The XML Export generates a simplified sequence that can be used in third party applications for creative editorial, color correction, media management, etc.
There are new Sequence Publish presets available, which output FCP X XML sequences.
- XML for DaVinci Resolve for Source Grading (ProRes 422 and 24-bit WAVE)
- XML for DaVinci Resolve for Source Grading (Sequence-only)
- XML for DaVinci Resolve (ProRes 422 and 24-bit WAVE)
Smoke 2016 can also now conform Final Cut Pro X timelines that include MXF format clips.
Shout out to @finalcutproes for the link!
Brian Mulligan pointed out:
#AutodeskSmoke 2016: You can now import, export, and use for media cache (intermediates) Apple ProRes 4444 XQ for high dynamic range content
— @BKMeditor June 11, 2015
Looks like Autodesk also likes Apple’s ProRes 4444 XQ.
Don’t believe Final Cut Pro X or Adobe Premiere will run on iOS one day? Apple’s Metal for iOS might be the key.
As Metal originated in iOS does this mean that there is the potential to run ‘serious’ applications, such as MODO, NUKE or even MARI on an iPad one day?
Anything is possible. Having a common graphics API between the two is certainly a start. What is maybe more interesting is a WYSIWYG workflow between IOS and OSX. You could use your Mac to design assets in MARI / MODO / NUKE and then have them display / rendering live on a mobile device looking exactly the same.
Using the iPad’s accelerometer, Foundry tools might be able to render graphics as AR overlays.
Jack also appeared on stage at the Apple WWDC conference this week – 10:58 into the video at developer.apple.com. He showed how much The Foundry team were able to achieve in four weeks of adding Metal to MODO, their 3D modelling and animation application.