Apple has hit a hurdle in clearing the way for high-end camera makers to adopt built-in encoding of their ProRes RAW codec.
Apple wanted to challenge the validity of RED.COM’s patent covering the encoding of sensor data in specific ways in cameras. The existence of this patent makes it harder for Apple to convince camera companies to include ProRes RAW recording in their offerings: they might have to apply to RED for a patent licence, and RED might charge one of its competitors too much for ProRes RAW recording to make economic sense.
Apple’s petition for an ‘Institution of Inter Partes Review’ of RED.COM’s patent on RAW video capture in high-end professional video cameras was denied by the US Patent Office, according to a document I found on their website.*
ProRes RAW has a superior method of storing the actual sensor data as captured by the camera for use in high-end post production. RED’s method compresses the information coming from the camera sensor to make it easier to store and decode on computers:
[The patent] applies mathematically lossy compression to the data “in a way that provides a visually lossless output.”
REDCODE – the system that implements RED’s patent – was designed to work on computer technology that is now more than ten years old. Although the result may be visually lossless, it doesn’t include all the information captured by the camera – some of which might be useful in the post-production process. Modern hardware and software can now handle the actual sensor data (known as RAW data) in real time – for example, Apple’s Final Cut Pro running on recent Mac computers.
The patent remaining valid might also be a problem for Apple itself if it wants to introduce RAW video recording on the iPhone or iPad.
The application by Apple to the Patent Office was made public by Andrew Reid of the EOSHD site in August when RED responded to Apple:
In Apple’s attempt to overturn RED’s claims over visually lossless compressed raw video, the US Patent Office has published documents submitted by RED. These explain their side of the story with particular regard to REDCODE.
If RED can continue to prove that the approach to their codec was novel, RED will win and Apple will have to compensate RED or make a deal in order to sell ProRes RAW in our devices and cameras, such as the Nikon Z6.
The Patent Office decision says that a review would be unlikely to succeed in invalidating the patent.
To institute an inter partes review, we must determine “that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a). For the reasons discussed below, Petitioner has not shown a reasonable likelihood that it would prevail in showing that any challenged claim is unpatentable. Thus, we deny the Petition and do not institute an inter partes review.
Apple (the Petitioner) weren’t specific enough for this claim (my emphasis):
Petitioner’s construction of “substantially visually lossless” omits features from the specification’s definition of “visually lossless.” See Pet. 9. Petitioner’s construction refers to the similarity of “sets of data.” Id. But the specification’s definition requires a comparison with the “original (never compressed) image data”—not data sets or image sets, generally. Ex. 1001, 9:55–60. Also, the specification’s definition explains that the comparison is performed “on the same display device” (id.), but Petitioner’s construction does not address how the comparison is made.
In this case the comparison would be between ProRes RAW – “original (never compressed) image data” – and REDCODE – “visually lossless.”
Apple’s lawyers also seemed to fail when asserting that RED.COM’s patent had not added anything to a combination of two previous inventions. If combining two previous patents produced the result in RED’s patent, then the idea would be deemed ‘obvious’ and not patentable. However:
In sum, Petitioner’s obviousness challenge is unclear and incomplete.
It seems that this decision by the US Patent Office is more about the petition being deficient than about the merits of the patent itself. They are telling Apple’s lawyers that they need to make a more precise claim of invalidity if the patent review is to take place.
I expect Apple will try again. I’m looking forward to seeing what happens next.
*Prompted when I saw this tweet from fcp.co:
— FCP.co (@FCPdotCO) November 9, 2019
Even if the next major version of Final Cut includes every feature high-end editors have been asking for since 2011, acceptance in features and TV might take a long time.
‘Dolemite Is My Name’ is a 4K HDR feature that was edited in Adobe Premiere by Billy Fox. There is a very interesting interview on the whole process by Steve Hullfish on the ProVideo Coalition site as part of his ‘Art of the Cut’ series.
On choosing Premiere:
I like the interface. I like the visual interface, the actual editing interface, the sound is very strong. For me, at least, it’s a very comfortable NLE and I enjoy it. I enjoy Avid. I enjoy Final Cut. They’re all different. They all have different flavors and advantages and disadvantages. Each project I look at and look at what’s needed for that project. This one seemed to fit very nicely for another Premiere project. We’re halfway through shooting Coming to America and it’s been great. We’re having a great time.
‘Dolemite Is My Name’ is an independent movie destined for Netflix – that is why it has a full 4K HDR workflow.
The ‘Coming to America’ sequel is a mainstream Paramount feature film. It is being shot on 35mm and digitised in 4K, with the VFX done in 2K HDR (for budgetary reasons). It is currently being edited in Premiere.
Recently it seems that films edited using Adobe software are being edited using commercially-available versions of the applications. Before, I got the impression that films were edited using what I call the very exclusive ‘Premiere Pro Pro’ service – which included Adobe staff to fix workflow problems or even developers to make special builds of Premiere and After Effects. This made sense, so Adobe could make sure that higher profile productions edited on Premiere would run smoothly. The catch was that Adobe only had the resources to provide ‘PremProPro’ to one or two productions a year.
In recent years Adobe have proved the case that Premiere as available in their Adobe Creative Cloud subscription can be used to edit high-end TV and feature films.
Final Cut is a long way from having the kind of post production ‘mindshare’ that Adobe have invested in building up in recent years. They do this with regular blog posts, a Twitter account, Facebook and Instagram activity. They have people giving many public presentations all over the world.
Final Cut Pro X fans may enjoy pointing out that ‘Deadpool 2’ wasn’t edited on Premiere (unlike ‘Deadpool’). ‘Terminator: Dark Fate’ was. The Adobe case study mentions that Adobe was chosen because of Premiere/After Effects integration. This is one of many case studies shared on the Adobe site.
If Apple deliver all that high-end post needs with a Final Cut Pro 10.5 update, it will take at least a couple of years until Final Cut gets the kind of mindshare that Premiere has. ‘Features for features’ can only be the start. If Apple want to accelerate the process, it is time for some aggressive schemes to get the word out beyond the Final Cut bubble.
People will tell me that the Apple Video Applications team are hamstrung by wider Apple policies on industry engagement. What kind of ammunition does the team need to present their case to Apple? What do we propose? At the least, that promoting the Mac Pro (used with Final Cut Pro X) requires different strategies than promoting the MacBook Air range.
First item on my suggested to-do list (from August 2017)? Pay editors and first assistant editors their normal weekly rate to learn Final Cut workflows – assuming 10.5 matches up.
When Apple announce that they are changing the way they promote and support their tools, we will know Apple is serious about Final Cut Pro in movies and TV.
Need to fix a Mac by reinstalling macOS?
If you downloaded its installer before October 2019, you might not be able to.
People who rely on their Macs for their livelihood are well advised to maintain a bag of tricks to help out when things go wrong. A spare drive with vital installers and utilities to fix problems when the Mac they need stops working. One important installer for such a drive is that for macOS.
For those who have been archiving macOS installer applications, there’s a problem. If you downloaded one before October 14th 2019, it probably won’t work any more. macOS installers Apple made available before that date were signed with an old certificate that expired on October 24th 2019, so Macs whose clocks are set after that date will refuse to run them. For a technical explanation, go to the Der Flounder blog from Rich Trouton, Mac system administrator.
The error message will state: “This copy of the Install macOS Mojave application is damaged, and can’t be used to install macOS”
[Image from a tweet by Eric Holtam]
Similar messages will come up if you try to run older High Sierra and Sierra installers (even if they install 10.13.6 and 10.12.6 respectively).
Apple have recently updated their OS installers, so the solution is to follow these links and download the macOS installers you need: Apple support documents on updating to macOS Sierra (10.12.6), macOS High Sierra (10.13.6), macOS Mojave (10.14.6).
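If you keep a dated archive of installers, a quick check against the re-signing date tells you which copies to replace. A minimal shell sketch, assuming you note download dates as YYYYMMDD – the `needs_redownload` helper name is my own, not an Apple tool:

```shell
# Flag archived macOS installers downloaded before Apple's re-signed
# builds of 14 October 2019, which carry the now-expired certificate.
needs_redownload() {
  downloaded="$1"
  cutoff=20191014
  if [ "$downloaded" -lt "$cutoff" ]; then
    echo "re-download"
  else
    echo "ok"
  fi
}

# Example: an installer archived in September 2019 needs replacing.
needs_redownload 20190901   # prints "re-download"

# On a Mac, 'codesign -dv --verbose=2 "/Applications/Install macOS Mojave.app"'
# will show the signing details of an archived installer application.
```

This only checks dates you recorded yourself; the `codesign` comment shows how to inspect the actual signature on a Mac.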
Once you have the installer(s) you need, don’t forget to update your ‘emergencies drive’!
The application will be able to record two streams of video at the same time on the iPhone 11 Pro. This means users will be able to record a closeup and a wide angle view of a scene at the same time, or even one of the back cameras and the front camera at the same time.
It turns out that this feature should also work on the iPhone 11, last year’s phones and the iPad Pro.
At Apple’s WWDC19 session on iOS 13 multi-camera recording, Apple said:
We do it on all recent hardware, iPhone XS, XS Max, XR, and the new iPad Pro.
This is a pro feature that can be implemented by pro applications running on iOS 13, should developers choose to invest their time.
I expect the slower CPU in last year’s devices means that applications will only be able to do this with two HD streams, not two 4K streams. The WWDC session included an example where the front camera recorded 640×480 at 30fps while one of the back cameras recorded 1920×1080 at 60fps.
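For developers, the gate to this feature is a run-time check in AVFoundation. A minimal Swift sketch, assuming iOS 13 and a real device (multi-camera capture is not available in the Simulator); the configuration shown is illustrative, not a complete capture pipeline:

```swift
import AVFoundation

// Multi-camera capture only works on supported hardware,
// so check before building the session.
if AVCaptureMultiCamSession.isMultiCamSupported {
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    // Add one input per camera – here, the wide back camera. A second
    // AVCaptureDeviceInput (e.g. the front camera) would be added the
    // same way for simultaneous recording.
    if let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video,
                                                position: .back),
       let backInput = try? AVCaptureDeviceInput(device: backCamera),
       session.canAddInput(backInput) {
        session.addInput(backInput)
    }
    session.commitConfiguration()
    // session.hardwareCost (0.0–1.0) reports how close the chosen
    // formats push the device to its capture limits.
} else {
    // Pre-A12 hardware: fall back to a single-camera AVCaptureSession.
}
```

The `hardwareCost` property is how an app can tell whether a given combination of resolutions and frame rates – like the HD-plus-VGA example above – fits within a device’s budget.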
I wonder if the CPU in the iPhone 11 range could record three HD angles at lower frame rates. I’m looking forward to developers discovering what the new hardware can do!
Steve Bayes worked for Avid for 10 years on Media Composer: the application used to edit most TV shows and feature films. He spent over 12 years working for Apple as the Final Cut Pro product manager. Now he is a freelance marketing consultant for businesses serving the post production industry.
He left Apple in July 2018, so is now free to join us on the internet in a private capacity. He has joined fcp.co as a columnist and commentator on the Film, TV and video production industry.
In his first column he outlines what to look for at the IBC 2019 video trade show that is coming to Amsterdam in a couple of weeks:
Look for more advanced workflows that handle HDR better. Use your dynamic range for good and don’t lose track of it on the way to the final display.
and he is especially wary of those offering pure cloud solutions for video editing:
Every time someone talks about “the cloud” you need to down a shot of aquavit. Then ask, “I just shot a terabyte of 8K footage yesterday with my 3-camera rig, how long to upload it from my hotel before I can start to generate proxies in your cloud?”. They will point you to a workflow that requires very little footage and lots of GPU computing.
In the end, is there a clear ROI benefit to the cloud over local storage when working with the speeds and quality we expect today? I’d like to see those numbers and the justification. I predict it will remain a cloud/local hybrid for quite some time.
Here is the first of the discussions featuring Steve Bayes on the fcp.co YouTube channel:
If you are comfortable with Python and building packages on GitHub, version 3.0 of Appleloops is available.
It is for those who administer large networks of Macs that have Apple’s audio editing applications installed. Don’t understand the following requirements? Appleloops isn’t for you: