The Onion, 2006:
Lessner, who said he started the weekend session laughing at the abundance of “blooper gold,” soon lost all perspective when faced with the task of condensing the more than 86 hours of footage—most of which was “almost indistinguishably hilarious”—into a single 26-minute special by his strict Sunday deadline.
For people to share video using advanced codecs on the internet and elsewhere, the codecs have to be developed in the first place. The cost of development is then recouped using patent licenses. Sometimes a fee is charged for building a player – whether on a website or in a modern TV – and sometimes a fee is charged each time content is played. Paying per play is known as a content royalty.
There are two groups of patent holders associated with the codecs used for UHD and 4K playback. The HEVC group aren’t charging content royalties, but a second group, HEVC Advance (made up of GE, Technicolor, Mitsubishi, Philips and Dolby), have just announced their royalty rate.
Jan Ozer has just run the numbers:
For a $4.00 movie downloaded from Amazon Prime or M-Go, the royalty would be two cents, right in line with MPEG-2/H.264 content royalties. In a Netflix scenario, for a $10/month subscriber who watches 10% of video that uses HEVC, the royalty would only apply to 10% of the subscription price, so the royalty would be about half a penny ($0.005). Assuming the $10 subscriber watches 100% HEVC, the royalty would be a nickel.
HEVCAdvance expects the royalty to be calculated on gross numbers, not on a per-subscriber basis. For an advertising supported site, if HEVC was 30% of all video distributed, the calculation would be 30% x total video-related advertising revenue x .005. In this scenario, if video-related advertising revenue was $1 billion, the royalty would be $1,000,000,000 x .3 x .005, or $1.5 million, a far cry from the $120 million Apple is staring at.
…no matter how much you dislike the terms offered by HEVCAdvance, dealing with the individual patent holders would have likely been more expensive, and certainly more complicated. IP rights are a reality, so like the T-shirt says, the market will keep calm and carry on.
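Ozer’s arithmetic for the advertising-supported case can be sketched in a few lines of Swift. The revenue figure and the 30% HEVC share are his illustrative assumptions, not published rates:

```swift
// Sketch of the content-royalty arithmetic quoted above.
// All figures are Jan Ozer's illustrative assumptions, not confirmed rates.
let adRevenue = 1_000_000_000.0  // total video-related advertising revenue, in dollars
let hevcShare = 0.30             // fraction of video distributed as HEVC
let royaltyRate = 0.005          // 0.5% content royalty on the HEVC portion

let royalty = adRevenue * hevcShare * royaltyRate  // ≈ $1.5 million
print("Royalty owed: $\(royalty)")
```

The same three-factor multiplication covers the subscription cases in the quote: swap the ad revenue for subscription revenue and the distribution share for the share of viewing that uses HEVC.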
It took me years before I got my mind around Apple Motion. I spent a long time trying to learn how to do complex multi-layer keyframed motion graphics like I used to make in Adobe After Effects. I clicked with Motion once I had a more straightforward task: make a simple plugin for Final Cut Pro X.
After making many Final Cut plugins and motion graphics sequences with Motion, I’ve come to know it well. One of the two sessions I taught at the 2015 FCPX Creative Summit was about using Motion’s behaviors for animation.
I’ll be teaching a webinar version of that session on Tuesday called “Exploring Apple Motion Behaviors for Easy Animation”:
The real power behind Apple Motion is behaviors. Behaviors use the power of complex calculations and real-time rendering to produce results in minutes that would take hours to create and modify using keyframes or complex math. Behaviors can control almost everything in Motion — including graphics, text, particles and cameras. Alex will show behaviors controlling graphics and particles and show how much fun you can have by playing with behaviors in Motion.
Register for free to watch live and ask questions at the Moviola website.
In recent years, more TV and internet news has featured recordings made on mobile phones. iPhones and Android phones are also being used by professional journalists. The art and science of using consumer technology this way is known as Mobile Journalism. There are blogs, Twitter hashtags and conferences on the subject.
Final Cut Pro X is the editing software ‘for the rest of us’ – designed for professionals in many fields, not just editing. That means broadcast news organisations are training all sorts of staff to use X (some would say everyone but the editors).
After a few years of steadily improving camera phones and video editing applications that run on iOS and Android, the quality of mobile audio recording is now catching up. Recent add-on devices bypass the built-in microphones, which were designed for telephone conversations.
Recently Glen Mulcahy of Irish national TV and radio broadcaster RTÉ compared two systems that connect via the iPhone’s Lightning port:
Today I got my hands on the Sennheiser ClipMic Digital mic for iOS so I decided to shoot a quick unboxing video and then do an audio test and a video test to pitch it against the iKmultimedia iRigPro and AKG 417pp Lav which we currently use for Mobile Journalism here in RTÉ.
He used the Apogee MetaRecorder iOS app which includes metadata tagging for those editing MoJo footage in Final Cut Pro X.
Ripple Training’s video on how marker, keyword and role information captured on location can be imported into Final Cut:
ClipMic Digital is a new microphone from Sennheiser that turns your iPhone or iPad into a professional digital audio recorder. By downloading the companion app from Apogee, you can record and add metadata to your recordings that can be read by Final Cut Pro X via XML. This is one of the COOLEST app/mics we’ve ever used!
Those who used Adobe Premiere in the 90s and paid attention to its ‘About this software’ dialog box will recognise the name Randy Ubillos.
He was the lead developer for both Adobe Premiere in the early 90s and Apple’s Final Cut Pro in the late 90s.
By the time Final Cut Pro X was launched in 2011, he was chief architect for Apple’s photo & video applications. Apple included him in many important keynotes. His presentations included demos of new versions of iPhoto and iMovie for iOS as well as iMovie ’09:
Randy retired from Apple in April this year, but he is already making public appearances. Next he’ll be at the Bay Area SuperMeetUp in San Jose on June 26th. The SuperMeetUp is one of a series of events for those who use Macs and PCs for TV and film making.
I’m happy to say that part of his appearance will be an on-stage interview where I’ll ask him about storytelling and what has driven him over the years to make tools that have changed millions of people’s lives. As well as talking about developing applications that went on to be used by professionals to make TV shows and feature films all over the world, he’ll discuss the value of creating tools for everyone else to tell their stories.
That same day FCPX Creative Summit delegates will be attending a presentation at Apple’s offices about the latest version of Final Cut Pro X.
FCPX Creative Summit attendees have the unique opportunity to visit the Apple Campus in Cupertino and hear directly from FCPX product managers! You’ll get a unique perspective on how this video editing software has changed the industry and how it continues to innovate today.
Get an update from Apple Product Managers on the current release of Final Cut Pro X, exciting customer stories, and the thriving ecosystem of third-party software and hardware.
Representatives of Apple’s ProApps team have appeared at professional events over the years, but this event marks the first time a large group of post production professionals have been invited to visit Apple.
These days we expect all live presentations to be filmed and made available on the internet within hours, which makes attending in person much less essential. Despite Apple opening up more recently, they still ask that public presentations by the Final Cut Pro X team aren’t recorded and put online. Most assume that this is part of Apple’s culture of secrecy. In practice it might be because the ProApps team want to use footage they are not cleared to show online – footage such as rushes and alternate takes from Warner Bros.’ recent Will Smith and Margot Robbie feature film, which was edited in Final Cut Pro X.
That week is the 4th anniversary of the radical reinvention of Final Cut Pro X. Some Final Cut users hope that Apple’s invitation shows that they will introduce exciting new features as part of a birthday celebration. Although that is possible, even if Final Cut remains unchanged, it is worth visiting the mother ship to learn from those who make the software.
The Apple WWDC 15 session video on AV Foundation shows there are new features for developers who want to manipulate QuickTime movies on the Mac.
Some notes from the video:
New version of AV Foundation provides new classes for applications to edit QuickTime movie files.
Open QuickTime movie files and perform range-based editing on movies in tracks.
You select a segment of a movie and copy it into some other movie.
Add and remove tracks (tracks in QuickTime can refer to any time-based information, such as subtitles, GPS info, camera metadata)
Associate one track with another – such as saying that this track is the chapter break information for that track.
Add or modify movie and track metadata.
Create movie files and URL sample reference movie files.
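The range-based editing described above can be sketched in Swift. This is a minimal sketch based on the classes shown in the session (AVMovie and AVMutableMovie); the file paths are hypothetical and error handling is elided:

```swift
import AVFoundation

// Hypothetical file URLs – substitute your own movies.
let sourceURL = URL(fileURLWithPath: "/Movies/source.mov")
let destinationURL = URL(fileURLWithPath: "/Movies/edit.mov")

let source = AVMovie(url: sourceURL)
let edit = try AVMutableMovie(settingsFrom: source)

// Copy the first five seconds of the source into the new movie.
// copySampleData: false keeps the edit as references to the source's
// sample data rather than duplicating gigabytes of media.
let fiveSeconds = CMTimeRange(start: .zero,
                              duration: CMTime(seconds: 5, preferredTimescale: 600))
try edit.insertTimeRange(fiveSeconds, of: source, at: .zero, copySampleData: false)

// Write just the movie header (the edit description) to disk.
try edit.writeHeader(to: destinationURL,
                     fileType: .mov,
                     options: .addMovieHeaderToDestination)
```

Because only references are written, the resulting file is tiny however long the edit – with the ‘fragile’ caveat about external media noted below.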
‘QuickTime movie’ means data in a file that conforms to the QuickTime movie file format or ISO base media file formats that were based on QuickTime such as MPEG-4.
Sample data (audio and video content) can be in files separate from the QuickTime movie.
Movies that reference external media are ‘fragile’ – if the media is deleted or moved, the movie cannot play.
AV Foundation can now update an existing movie file without touching the sample data. That means edits, tracks and metadata can all be changed as long as the samples stay the same – “in place editing”. (URLs in the context of AV Foundation usually describe the location of files in connected storage.)
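A hedged sketch of what in-place editing might look like: open an existing file as a mutable movie, change its metadata, then rewrite only the header at the same URL. The path and metadata values here are illustrative:

```swift
import AVFoundation

let movieURL = URL(fileURLWithPath: "/Movies/interview.mov")  // hypothetical path

let movie = AVMutableMovie(url: movieURL)

// Add a metadata item without touching the (possibly huge) sample data.
let title = AVMutableMetadataItem()
title.identifier = .quickTimeMetadataTitle
title.value = "Interview, take 3" as NSString
movie.metadata = movie.metadata + [title]

// Rewrite only the movie header in the existing file – the samples stay put,
// so there is no need to re-export the whole movie for a trivial change.
try movie.writeHeader(to: movieURL,
                      fileType: .mov,
                      options: .addMovieHeaderToDestination)
```

This is the feature Digital Rebellion highlights in the tweet below: no more re-exporting an entire file for a trivial change.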
An example project shows how an application can combine many gigabytes of footage with metadata.
Good news for post production people who need developers to make applications that support complex workflows, and for those that hope existing tools will get useful new features.
Most understated 10.11 feature here is “edit in place”. No longer have to re-export entire file for trivial change. http://t.co/jqmjlnVTWU
— Digital Rebellion (@digitalreb) June 11, 2015
Digital Rebellion are the makers of Pro Media Tools for Final Cut Pro, Avid and Adobe software.
Before AV Foundation, the QuickTime libraries in older versions of OS X were able to manipulate QuickTime reference movies. These were small files that could represent complex edits of multiple external media files. Reference movies are much simpler to work with than gigabytes of video and audio footage.
Maybe it’s time to do a quick course in Swift so you can make your own post production OS X applications!
Note that the screenshot shows that these new features are OS X El Capitan only (the OS X logo in the top right of the screen). Once they’re available on iOS, tools for iPhones and iPads will be able to do much more with movie files.