During the Dolby Summit* yesterday, I used the Q&A system to ask about the Dolby Vision workflow from iPhone 12 to Mac/PC. Here’s a summary of their answer:
You can’t import the phone’s Dolby Vision metadata into desktop applications. Yet.
Apple have said in their press release for the iPhone 12:
Dolby Vision grading is processed live and sustained during editing, whether in the Photos app or iMovie, and coming to Final Cut Pro X later this year.
Hopefully November will see an update for Final Cut that will be able to read the Dolby Vision (‘DV’) metadata. That metadata will ensure that footage looks good on Dolby Vision-compatible HDR displays. 80% of TVs sold in Europe have Dolby Vision.
The Dolby Vision mastering process includes automatically generating metadata for your video so that it displays well on SDR displays – such as older phones, TVs and most computers. Graders say that this automated process works for 95% of shots. The process adds colour and brightness metadata to each shot in your programme. The art of the colour grader is going through those shots, finding the 5% that need to be changed for SDR viewing and making the right changes.
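As a rough illustration of the kind of per-shot metadata involved, here is a hedged Python sketch. Dolby Vision’s Level 1 metadata describes each shot’s minimum, average (‘mid’) and maximum luminance; the field names and structure below are purely illustrative and not Dolby’s actual format.

```python
# Illustrative sketch of per-shot "trim" metadata, loosely modelled on
# Dolby Vision Level 1 analysis. Field names are hypothetical, not Dolby's.

def analyse_shot(frame_nits):
    """Summarise one shot's luminance samples (in nits) as min/mid/max."""
    return {
        "min_nits": min(frame_nits),
        "mid_nits": sum(frame_nits) / len(frame_nits),
        "max_nits": max(frame_nits),
        "manual_trim": None,  # set by the colourist for the ~5% of shots
    }

# Example: a bright exterior shot followed by a dim interior shot.
shots = [
    analyse_shot([120.0, 850.0, 1000.0, 640.0]),
    analyse_shot([0.5, 2.0, 12.0, 8.0]),
]
for shot in shots:
    print(shot["min_nits"], shot["mid_nits"], shot["max_nits"])
```

The grader’s job then amounts to overriding `manual_trim` on the few shots where the automatic SDR mapping looks wrong.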
To be ‘Dolby Vision compatible’ I expect Final Cut will be able to display the light and colour metadata generated by the iPhone camera app when footage is captured. It should also generate the metadata for edited programmes so that it can be modified in colour grading applications. It would be a bonus if that metadata could be changed in Final Cut too.
Apple have a December 2019 white paper on the subject: ‘HDR and Wide Color Gamut in Final Cut Pro X’. In your timeline’s Project Properties, set the colour space to Rec.2020 PQ:
Choose this option if you want to create an HDR movie with the Rec. 2020 color space and PQ transfer function (Rec. 2100 standard). This format can be used for Dolby Vision and HDR10 mastering at a later stage.
So for now, although the footage from the iPhone 12 / 12 Pro is HLG, a PQ timeline is better if you want grading applications to be able to make the most of what was captured.
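To see why HLG footage can live on a PQ timeline, here is a Python sketch of an HLG-to-PQ conversion using the BT.2100 formulas, assuming a 1000-nit reference display. It is a single-channel (luminance) illustration of the maths, not how Final Cut actually converts footage.

```python
import math

# Single-channel HLG -> PQ conversion per ITU-R BT.2100, assuming a
# 1000-nit display. A sketch of the maths only, not Apple's implementation.

def hlg_inverse_oetf(e):
    """HLG signal value (0-1) -> normalised scene light (BT.2100)."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if e <= 0.5:
        return (e * e) / 3.0
    return (math.exp((e - c) / a) + b) / 12.0

def pq_inverse_eotf(nits):
    """Display luminance in nits -> PQ signal value (SMPTE ST 2084)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def hlg_to_pq(signal, peak_nits=1000.0, gamma=1.2):
    """Convert one HLG code value to a PQ code value via the HLG OOTF."""
    scene = hlg_inverse_oetf(signal)
    display_nits = peak_nits * scene ** gamma  # HLG OOTF for luminance
    return pq_inverse_eotf(display_nits)

# HLG peak maps to ~1000 nits on this display, i.e. a PQ value near 0.75.
print(round(hlg_to_pq(1.0), 3))
```

Because PQ code values correspond to absolute luminance, a PQ timeline gives a grading application an unambiguous record of the levels to work with.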
Here’s Dolby’s list of Macs that support Dolby Vision and Dolby Atmos playback.
I’ll update this post once Final Cut is updated.
*Dolby Summit was a streamed conference for those who make TV, films, music and video games that use Dolby technologies such as Dolby Vision for video and Dolby Atmos for audio.
Today Apple released an update to their professional video editing software.
The current Apple policy is to have no public roadmap for Final Cut Pro and no ongoing promotion of it. This means the features prioritised in each update give us hints about how Apple thinks about Final Cut and who they think it is for.
There are improvements for…
Apple is telling us with this release that they want to please almost every kind of Final Cut user you might imagine – from feature film editors and post supervisors to those starting out on multiple channels of social media.
This isn’t the ‘everything I want’ update to 10.5 (or even 11) that some have been hoping for since the last big update in Autumn 2018. The good news is that the majority of the new features will change many Final Cut users’ lives for the better.
I’ll update this page with new information as I find it.
Apple Final Cut release notes (not yet updated for 10.4.9).
Motion release notes.
Compressor release notes.
All three applications work with macOS Mojave (10.14.6) and newer.
There is a 20-minute video from Felipe Baez on the workflow improvements in Final Cut:
Another is a 40-minute video from Ripple Training, who make training courses for Final Cut Pro, Motion, Compressor and DaVinci Resolve:
Ripple also make plugins for Final Cut Pro.
Here is their video on new features in Motion 5.4.6:
As do I! (many of them free).
Frame.io have updated their workflow extension:
We’ve updated our #FinalCutPro X Workflow Extension. Drag clips from the https://t.co/maIBcRSC2a extension directly into Final Cut, and Final Cut will download them in the background. #fcpx pic.twitter.com/fnfs0sVF3S
— Frame.io (@Frame_io) August 25, 2020
The 10.4.9 update changes the format of Final Cut libraries, so it is best to back up all libraries made with 10.4.8, and also to compress (using the Archive command in the Finder) and back up the Final Cut Pro 10.4.8, Motion 5.4.5 and Compressor 4.4.6 applications too. Apple’s support document on this.
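For those who prefer to script the backup step, here is a minimal Python sketch of archiving an application bundle before updating. The function name and paths are my own; the Finder’s Archive command does the same job.

```python
import shutil
from pathlib import Path

def backup_app(app_path, backup_dir):
    """Zip one application bundle (or any folder) into backup_dir.

    Returns the path of the archive that was written.
    """
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    app_path = Path(app_path)
    # shutil.make_archive appends ".zip" to the base name itself
    return shutil.make_archive(str(backup_dir / app_path.stem), "zip", app_path)

# On a real machine you would back up the bundles the App Store will replace:
# for app in ["Final Cut Pro.app", "Motion.app", "Compressor.app"]:
#     backup_app(Path("/Applications") / app, Path.home() / "fcp-backups")
```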
When you are ready, use the ‘App Store…’ command in the Apple menu to see if the updates are available.
To buy Final Cut Pro for the first time, visit the Mac App Store.
Once you can write plug-ins for Final Cut Pro in Swift, making tools to extend Final Cut will be a worthwhile project for those learning programming.
I like to call the tools I make for Final Cut ‘plug-ins’ because from a user point of view they are small pieces of code that add features to Final Cut Pro. I don’t use traditional programming to make them. I use Motion, Apple’s $49 real-time motion graphics application. I combine the features and filters of Motion to make templates for Final Cut Pro. If I can’t do something in Motion, I can’t make it happen in Final Cut Pro.
What developers like to call ‘real’ plug-ins for applications are made using Xcode and have traditionally been written in Objective-C. They add new features to Motion, and those new features are made available in Final Cut using a Motion template.
The software development kit that developers use to build these sort of plug-ins is called FxPlug.
A new version of FxPlug – version 4 – was introduced in October 2019.
FxPlug 4 introduces fully ‘out-of-process‘ FxPlug plug-ins, which have no component that runs inside of the host application process. Out-of-process plug-ins provide improved security for end users and allow plug-in developers the freedom to choose from a variety of rendering technologies, such as OpenGL, Core Graphics, Core Image, or Metal to develop unique plug-ins that include on-screen controls and custom user interface elements—all running seamlessly in the host application. Plug-in developers can choose to implement in either Swift or Objective-C.
FxPlug 4 means plug-ins can use Metal for rendering for the first time, and can be written using Apple’s Swift programming language.
Version 4 is good news for developers who want to make new plug-ins that are faster (because they render using Metal) and more secure – and because out-of-process plug-ins are more reliable, they should need less support.
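As a toy analogy (in Python, rather than the FxPlug SDK), this is the reliability benefit of out-of-process plug-ins: the ‘plug-in’ runs in its own process, so when it crashes, the ‘host’ keeps running and just reports the failure.

```python
import subprocess
import sys

# A toy host/plug-in split, nothing like FxPlug itself: the plug-in code
# runs in a child process, so a crash there cannot bring down the host.

PLUGIN_SOURCE = """
import sys
if sys.argv[1] == "crash":
    raise RuntimeError("plug-in bug!")
print("rendered frame")
"""

def host_run_plugin(argument):
    """Run the plug-in in a separate process and survive its failure."""
    result = subprocess.run(
        [sys.executable, "-c", PLUGIN_SOURCE, argument],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        return "plug-in failed, host still alive"
    return result.stdout.strip()

print(host_run_plugin("ok"))     # rendered frame
print(host_run_plugin("crash"))  # plug-in failed, host still alive
```

In-process plug-ins are the opposite case: one bad memory access in the plug-in takes the whole host application down with it.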
In recent years, Apple has focussed on helping educators teach programming using their Swift language. They provide all sorts of resources and tools to advance the cause of development worldwide.
One of the tough parts of teaching programming is to find simple enough projects that are based on small parts of a programming language, yet advanced enough to be inspirational to students.
There’s a big leap from getting your computer to display ‘hello world’ to getting it to do something that someone would find useful on a day-to-day basis. The tough part is including all the elements that make a program work as a Mac, iPad or iPhone application – a tool to use every once in a while or to share with others.
The advantage of making a plug-in is that the ‘host’ application does a lot of the work to make the tool useful. It provides menus, windows, media, timelines and user interface elements.
That’s why I hope that to establish this new version of the FxPlug SDK, Apple will develop educational resources so that new programmers will be able to learn Swift programming by making plug-ins for Motion. Plug-ins that add features to Motion to make templates for Final Cut Pro.
New developers would be even more encouraged to learn if new versions of iMovie for Mac, iPad and iPhone are able to use Motion templates that include FxPlug plug-ins. Having a potential audience (or market) of tens of millions of editors would be a big incentive to learn!
Many in the Final Cut Pro and Motion community (bubble?) bemoan the fact that Apple aren’t out there promoting our favourite applications. They don’t have stands at trade fairs and they don’t seem to put on professional events any more – not even in Hollywood.
Alongside many, I missed that Apple is publicly doing something for their video applications. There is an event at the end of this month in Los Angeles: Vlog University with iJustine:
Vlog University is a 2-day training conference designed to provide professional training for those interested in building or enhancing their online channels.
What has this to do with Apple?
Although this isn’t the ‘Force Hollywood to use Final Cut for all productions’ event we might like, it shows that Apple is looking outwards instead of keeping the chocolate factory doors closed between software updates.
It seems like a version of the FCPX Creative Summit with industry-specific sessions – this time for those who want to use polished video on social media. Vlog University is being produced by Future Media Conferences – who have put on the FCPX Creative Summit since it started 5 years ago.
If this event turns out to be a success from Apple’s point of view, I wonder if it will pave the way for more industry-specific Final Cut Pro events – for the same audience in other countries, and for different audiences in the USA.
The advantage of starting with social media folk is that they know that they don’t know all there is to know. The trick with other markets is appealing to people who think they know enough to do their work. Maybe if you have the right idea for an event, Apple will support you.
Business Video University anyone? Documentary Production University? Let’s hope events like these will be coming to a country near you soon!
Here is how to make sure Motion displays rectangles without soft edges. By default they are displayed with soft edges. Here is a 2×2 rectangle with a position in whole pixels (i.e. not at -0.5 or 1.25). You can see the centre is a two-by-two square; an extra two pixels around the edge give it a soft border:
I think this is so that Motion doesn’t change the visible dimensions when you increase the ‘roundness’ setting of the shape.
Sometimes it is useful for rectangles not to have any softness at their edges. If you know your output animation or Final Cut Pro plug-in will be used at a specific resolution, you might want the edges of rectangular areas to be hard and not blurred. Some Adobe Photoshop users refer to this as forcing ‘pixel perfect’ rendering (thanks John B. Manos).
You can do this by selecting the rectangle in the Inspector and going to Shape:Style and setting the Feather to -0.1 and Falloff to -1000:
This will make sure Motion and Final Cut will not use soft edges when drawing the rectangle – just 100% of the fill colour or nothing. Here are two 2×2 pixel rectangles. The right-hand square has a Feather of -0.1 and Falloff of -1000:
Here is what the squares look like as you drag them in Motion 5:
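For the curious, here is a toy Python rasteriser (nothing to do with Motion’s actual renderer) showing where soft edges come from: an antialiased renderer gives each pixel the fraction of its area the shape covers, so an edge that lands mid-pixel produces a half-strength pixel.

```python
# Toy antialiasing by 4x4 supersampling: each pixel's value is the
# fraction of its sample points that fall inside the rectangle.

def coverage(px, py, left, top, right, bottom, samples=4):
    """Fraction of pixel (px, py) covered by the rectangle."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples
            y = py + (j + 0.5) / samples
            if left <= x < right and top <= y < bottom:
                hits += 1
    return hits / (samples * samples)

# A 2x2 rectangle aligned to the pixel grid: every pixel is 0% or 100%.
aligned = [coverage(x, 0, 1, 0, 3, 2) for x in range(4)]
print(aligned)  # [0.0, 1.0, 1.0, 0.0]

# The same rectangle shifted half a pixel: the edge pixels go soft.
shifted = [coverage(x, 0, 1.5, 0, 3.5, 2) for x in range(4)]
print(shifted)  # [0.0, 0.5, 1.0, 0.5]
```

Which is why the squares soften as you drag them: the moment an edge sits between pixel boundaries, coverage becomes fractional.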
Apple has hit a hurdle in clearing the way for high-end camera makers to adopt built-in encoding of their ProRes RAW codec.
Apple wanted to challenge the validity of RED.COM’s patent that allows sensor data to be encoded in specific ways in cameras. The existence of this patent makes it harder for Apple to convince camera companies to include Apple’s ProRes RAW codec recording in their offerings. They might have to apply to RED for a patent license, and RED might charge too much for ProRes RAW recording to make economic sense for one of their competitors.
An ‘Institution of Inter Partes Review’ of RED.COM’s patent on RAW video capture in high-end professional video cameras was denied by the Patent Office, according to a document I found on their website.*
ProRes RAW has a superior method of storing the actual sensor data as captured by the camera for use in high-end post production. RED’s method compresses the information coming from the camera sensor to make it easier to store and decode on computers:
[The patent] applies mathematically lossy compression to the data “in a way that provides a visually lossless output.”
REDCODE – the system that implements RED’s patent – was designed to work on 10+ year old computer technology. Although the result may be visually lossless, the result doesn’t include all information captured by the camera – some of which might be useful in the postproduction process. Modern hardware and software can now handle the actual sensor data (known as RAW data) in real time – in Apple’s Final Cut Pro running on recent Mac computers.
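As a purely numeric illustration of ‘mathematically lossy’ (nothing like REDCODE’s actual compression), here is a Python sketch: quantising 12-bit sensor values introduces small, possibly invisible, errors – but the original data can no longer be reconstructed exactly, unlike a truly lossless store.

```python
# Toy lossy quantisation: snap 12-bit sensor samples to multiples of 8.
# The errors are small (possibly invisible) but the originals are gone.

sensor = [137, 138, 139, 2047, 2048, 4095]  # 12-bit RAW samples

STEP = 8  # coarser code values -> smaller files, lost precision
lossy = [round(v / STEP) * STEP for v in sensor]

errors = [abs(a - b) for a, b in zip(sensor, lossy)]
print(lossy)        # values snapped to multiples of 8
print(max(errors))  # small per-sample error, but not zero
```

Note how the three distinct samples 137, 138 and 139 all collapse to the same stored value: that lost distinction is exactly what ‘some of which might be useful in postproduction’ refers to.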
Failing to invalidate this patent might also be a problem for Apple themselves if they want to introduce RAW video recording in the iPhone or iPad.
The application by Apple to the Patent Office was made public by Andrew Reid of the EOSHD site in August when RED responded to Apple:
In Apple’s attempt to overturn RED’s claims over visually lossless compressed raw video, the US Patent Office has published documents submitted by RED. These explain their side of the story with particular regard to REDCODE.
If RED can continue to prove that the approach behind their codec was novel, RED will win, and Apple will have to compensate RED or make a deal in order to offer ProRes RAW recording in devices and cameras such as the Nikon Z6.
The Patent Office decision says that a review would be unlikely to succeed in invalidating the patent.
To institute an inter partes review, we must determine “that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a). For the reasons discussed below, Petitioner has not shown a reasonable likelihood that it would prevail in showing that any challenged claim is unpatentable. Thus, we deny the Petition and do not institute an inter partes review.
Apple (the Petitioner) weren’t specific enough for this claim (my emphasis):
Petitioner’s construction of “substantially visually lossless” omits features from the specification’s definition of “visually lossless.” See Pet. 9. Petitioner’s construction refers to the similarity of “sets of data.” Id. But the specification’s definition requires a comparison with the “original (never compressed) image data”—not data sets or image sets, generally. Ex. 1001, 9:55–60. Also, the specification’s definition explains that the comparison is performed “on the same display device” (id.), but Petitioner’s construction does not address how the comparison is made.
In this case the comparison would be between ProRes RAW – “original (never compressed) image data” and REDCODE – “visually lossless.”
Apple’s lawyers also seemed to fail when asserting that RED.COM’s patent had not added anything to a combination of two previous inventions. If combining two previous patents produced the result in RED’s patent, then the idea would be deemed ‘obvious’ and not patentable. However:
In sum, Petitioner’s obviousness challenge is unclear and incomplete.
It seems that this decision by the US Patent Office is more about the application for the appeal not being valid than the actual appeal itself. They are telling Apple’s lawyers that they need to make a more precise claim of validity if the patent review is to take place.
I expect Apple will try again. I’m looking forward to seeing what happens next.
*Prompted when I saw this tweet from fcp.co:
— FCP.co (@FCPdotCO) November 9, 2019