As an Apple watcher, I am excited about the announcements at next week’s 2019 Apple Worldwide Developers Conference. Hardware news will affect editing and post production this year. Software news will be more about the 2020 versions of Final Cut, Motion, iMovie and Compressor. To predict what will be possible in Final Cut and iMovie this year, it is better to look at what was announced last year at WWDC18.
Final Cut Pro X usually works on the current major version of macOS and the previous major version. This allows people who have bet their business on Final Cut to have the option to remain on a tried and trusted version of macOS, and lets braver folk help the community by testing the newest versions in production. The Final Cut 10.4.x series runs on macOS Mojave (10.14) and High Sierra (10.13). If 10.5 arrives at the end of the year – to accompany the 2019 Mac Pro – it is likely it will run on macOS Mojave and the version of macOS announced next week* (10.15).
You can review the sessions from WWDC18 on the Apple site.
Here are some links to WWDC18 sessions covering features of macOS Mojave relevant to Final Cut. Each link includes a video of the session, a searchable transcript and a PDF of the presentation shown.
Core Image is the part of iOS and macOS that modifies images in photo and video applications, often using small pieces of code known as CIKernels. This session describes a way to use the Python language to prototype combinations of Core Image filters before making custom CIKernels. There are over 200 built-in CI filters – similar to the many filters, generators and transitions used as elements of Motion 5 projects and templates.
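The prototyping idea from the session can be illustrated without Apple-specific bindings. Below is a minimal pure-Python sketch of the workflow – composing simple per-pixel filter functions into a chain before committing to a custom CIKernel. The filter functions and their coefficients are illustrative assumptions, not the PyCoreImage API shown at WWDC18.

```python
# Minimal sketch of the prototyping idea: compose per-pixel filters in
# Python first, then port the winning combination to a custom CIKernel.
# (Illustrative only - not the PyCoreImage bindings shown at WWDC18.)

def sepia(pixel, intensity=1.0):
    """Approximate a sepia tone for an (r, g, b) pixel in the 0.0-1.0 range."""
    r, g, b = pixel
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    tinted = (min(luma * 1.07, 1.0), luma * 0.74, luma * 0.43)
    return tuple(o + (t - o) * intensity for o, t in zip(pixel, tinted))

def gain(pixel, amount=1.2):
    """Simple exposure-style gain, clamped to 1.0."""
    return tuple(min(c * amount, 1.0) for c in pixel)

def chain(*filters):
    """Compose filters left to right, like chaining CIFilter outputs."""
    def run(pixel):
        for f in filters:
            pixel = f(pixel)
        return pixel
    return run

# Try a 'look' on a mid-grey pixel before writing any kernel code.
look = chain(sepia, gain)
print(look((0.5, 0.5, 0.5)))
```

Once a combination like this looks right on test images, the same maths can be moved into a single custom CIKernel so it runs on the GPU.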
Long-standing developers would like clarity from Apple on the future of Quartz Composer, a venerable macOS developer tool that in recent years seems to have been abandoned. It combines OpenGL shaders in a node-based user interface that takes graphics, sound or other kinds of inputs to generate graphics. QC produces .qtz files which drive real-time animation in iOS and Mac user interfaces and visualisers. Some of the more complex plugins for Final Cut have Quartz compositions at their core. As Apple has moved on from OpenGL to Metal, could they be developing a ‘Quartz Composer X’ that generates Core Image-based .qtzx documents?
This session also shows how Machine Learning models can be applied to images for advanced effects. As well as Motion-hosted Core Image filters, Motion, Final Cut and iMovie could also come with ML models that process video – for example, models that render video in a given style, such as ‘Stained Glass,’ ‘Van Gogh’ or ‘Neon.’
Vision is the part of iOS and macOS that provides computer vision services to apps and applications. This session mentions that macOS Mojave got an update to its face-recognition technology – recognising faces in all orientations (such as upside down). The session also includes how to implement object tracking for video, mentioning that the system can track up to 16 rectangular objects and 16 other kinds of objects at the same time.
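To make the tracking bookkeeping concrete, here is a toy Python sketch of what an app juggling multiple rectangle tracks has to manage – a greedy overlap-based matcher capped at the 16-track budget the session mentions. This is an illustration of the concept only; the real Vision API is exposed to Swift and Objective-C (via VNTrackObjectRequest and related classes), and its tracking is far more sophisticated than this intersection-over-union heuristic.

```python
# Toy sketch of multi-rectangle tracking bookkeeping (not the Vision
# framework API). Rectangles are (x, y, width, height) tuples.

MAX_RECT_TRACKS = 16  # the session cites 16 simultaneous rectangle tracks

def iou(a, b):
    """Intersection-over-union of two rectangles."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

class RectangleTracker:
    def __init__(self):
        self.tracks = {}   # track id -> last known rectangle
        self._next_id = 0

    def start(self, rect):
        """Register a new track; refuse to exceed the 16-track budget."""
        if len(self.tracks) >= MAX_RECT_TRACKS:
            raise RuntimeError("rectangle track budget exhausted")
        self.tracks[self._next_id] = rect
        self._next_id += 1
        return self._next_id - 1

    def update(self, detections):
        """Greedily match each track to the best-overlapping detection."""
        for tid, rect in self.tracks.items():
            best = max(detections, key=lambda d: iou(rect, d), default=None)
            if best is not None and iou(rect, best) > 0.3:
                self.tracks[tid] = best
```

A title plugin built on a tracker like this could, for each frame, pin its text to `tracks[tid]` – which is exactly what the current Motion tracker cannot export to Final Cut templates.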
Although Motion has had tracking for over 10 years, these kinds of trackers are much more advanced. The current Motion tracker can’t be used to make title or effect plugins for Final Cut that track objects in video. Perhaps the next versions of Motion 5 and Final Cut Pro X will have this tracking built in.
This was an iOS session on how to access the depth information being generated by both the front and rear camera systems on iOS devices. Although not available on macOS yet, it is likely that these techniques will come to Mac applications once volumetric capture comes to more standalone cameras. The most straightforward use of depth maps will mean that no keying or greenscreens will be needed to separate objects from backgrounds. Graphics, titles and footage will be able to be rendered so they seem to be at any distance from the camera – including between people in the foreground and backgrounds shot on location.
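The ‘no keying needed’ point can be sketched in a few lines. Assuming a per-pixel depth map in metres (on iOS this would come via AVDepthData; the greyscale values and the 2.5 m title depth below are made up for illustration), compositing a layer at a chosen distance is just a per-pixel depth comparison:

```python
# Sketch of depth-based compositing: with a per-pixel depth map, a title
# layer can be slotted "between" subject and background with no greenscreen.
# Depths are metres from the camera; pixels are greyscale floats for brevity.

def composite_at_depth(camera_pixels, depth_map, layer_pixels, layer_depth):
    """Show the layer wherever the scene is farther away than layer_depth."""
    out = []
    for scene, depth, layer in zip(camera_pixels, depth_map, layer_pixels):
        if layer is not None and depth > layer_depth:
            out.append(layer)   # scene is behind the layer: layer wins
        else:
            out.append(scene)   # subject is in front: it occludes the layer
    return out

# A person at 1.5 m in front of a wall at 4 m, with a title at 2.5 m:
scene = [0.2, 0.8, 0.8, 0.2]      # wall, person, person, wall
depth = [4.0, 1.5, 1.5, 4.0]
title = [1.0, 1.0, None, None]    # the title covers the left half only
print(composite_at_depth(scene, depth, title, 2.5))
```

The title appears over the wall but is occluded by the person – the separation a keyer would normally provide, derived purely from depth.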
It seems that the majority of Final Cut users don’t want Apple to move to a subscription model for the main application. It is likely however that the Apple higher-ups have sent out a memo to everyone asking how each team will fit into the new Apple services narrative. One possible answer for the Video Applications group is to offer subscription features for third-party developers. That would mean plugin developers like me, tools vendors and post production service providers would be able to use App Store subscriptions to add features to their Final Cut/Motion/iMovie products and services.
Although it is unlikely that iMovie and Final Cut will be able to take advantage of features in macOS 10.15 this year, next week could see some announcements relevant to editing and post production – and to Final Cut Pro 10.6, or whatever versions are released in 2020.
Apple’s Marzipan project is about making it much easier for the hundreds of thousands of iOS developers to convert their iPhone and iPad apps into Mac applications, or use iOS skills to make new Mac applications. It is likely that Apple will demonstrate examples of their own iOS apps being converted to also work on the Mac. This includes new versions of Messages, Reminders and Mail for Mac based on the iOS versions. It is also possible that they will show one of the Apple video applications for iOS running on the Mac.
There is no need to make iMovie for iOS work on the Mac. It already exists as the current version of Final Cut Pro X with a consumer UI. A more interesting application for those in post production would be a Mac version of Clips – Apple’s social media video application. Final Cut users would find Clips useful for an important feature: its ability to use Siri transcription to convert speech into text for subtitles and titling. Many editors would like to use this feature with footage in iMovie and Final Cut Pro.
Although Final Cut may be able to run on macOS Mojave, there is a chance that some features will only work on macOS 10.15. As Clips for iOS uses Siri transcription with titles engineered in a version of Motion 5, once Siri for Mac is updated in macOS 10.15, could speech transcription come to Final Cut?
A feature that would benefit Final Cut, Motion and iMovie users is widely expected: the ability to use recent iPads as external displays for Macs. Some expect iPads to act as a generic extra monitor attached to a Mac – with touch input being sent as trackpad input to applications. There is a chance that Apple won’t want their high-end iPad Pros to be ‘mere displays,’ so they may opt for iPads being available for specific uses on an app-by-app basis. In the case of post production, iPad Pros have a 120Hz refresh rate and consistent wide-gamut displays – useful for broadcast monitor simulation, a potential new feature for Compressor and Motion 5.
Like many editors, I still use QuickTime Player 7.0 for simple editing tasks. It has many useful little features – features available in other applications, but like Preview for pictures and TextEdit for text, QT7 remains useful for me. Apple has announced that 32-bit applications like these will not run on versions of macOS beyond last year’s Mojave. Next week we might also see whether Apple will replace QuickTime Player 7.0 with a whole new application, update QuickTime Player X with professional features, or do nothing. QT7 is powered by the highly evolved QuickTime OS toolkit. Its replacement is AVFoundation – the video toolkit used by Final Cut. Any sessions on an AVFoundation update next week will be relevant to video playback utilities and applications on macOS, iOS and tvOS.
Many expect that Apple will at least preview the 2019 Mac Pro and new Apple display. Although Apple usually use Final Cut in their high-end Mac screenshots, remember that WWDC is a third-party developer conference. That means that Apple think third-party developers appearing on stage will act as inspiration to attendees. That is why Adobe are regular guests at Apple keynotes, showing how quickly they managed to adapt After Effects and Illustrator to use Metal, or what Photoshop for iPad will be like.
In practice less than 1% of even high-end post production jobs are too much for a fully-equipped iMac Pro. Facilities houses may appreciate that the iMac Pro comes in Space Grey, but they will find it much easier to justify clients not having their own in-house kit when they can show investment in multiple 6K Apple displays and 2019 Mac Pros.
The 2019 Mac Pro is a way for Apple to gain a little more trust back from professionals – that Apple gets what they need (as well as what they want). Luckily Final Cut is likely to be along for the ride: one of the few uses of high-end Macs that most journalists and investors understand without much explanation.
There’s a good chance that the Mac Pro will include a PCIe 4.0 expansion bus. The question remains as to how Apple’s move away from Intel to their ‘A-series’ CPUs will be accommodated by the new form factor. The ‘cheesegrater’ Power Mac G5 was superseded by the Mac Pro with no external change to the computer. The 2019 design will need to accommodate Apple’s plans for the 2020, 2021 and 2022 ‘Ax’ Mac Pros. The ‘modularity’ of Macs is partially based on the expansion bus – which is dependent on the processor architecture. Even on MacBook Pros, the total bandwidth available to connected devices is defined by the PCIe bus and the connection pins on Intel CPUs.
A more radical way of introducing modularity to the Mac Pro would be for Apple to adopt a new open source bus architecture: one that works for current Intel and AMD CPUs, and one that works for other CPU technology. The advantage of making it open source is that those who build their own PCs to run Linux and Windows (and those who make motherboards for them) would have an alternative option.
One of the main differences could be a new way to handle temperature control. At the moment powerful PCI cards handle their own cooling using a variety of technologies. These must work in PCs and Macs with a variety of internal arrangements. These cooling strategies sometimes interfere with each other. Maybe bus expansion specifications should include providing temperature information back to the host computer, so it can change the cooling settings for the whole computer. Apple is one of the few computer companies that could come up with a new way. The advantage of making the standard open source is that card makers would have more incentive to support it – a larger potential market of professionals.
For more on the 2019 Mac Pro, visit my article on the most vital feature it should have at launch.
To see what new abilities of macOS and iOS might support new features of next year’s Final Cut Pro, you can watch WWDC19 sessions from next week onwards. Some will be streamed live. Nearly all of them will be available for streaming and download soon after.
The fun starts on Monday with the Apple Keynote event.
At the moment the schedule isn’t public. Once it is, I will post a list of post production-related sessions and update it with tidbits relevant to Final Cut users later next week.
*Some expect that 10.15 will have a name associated with Mojave, just as Lion was followed by Mountain Lion and Sierra by High Sierra. Possible names in and around the Mojave Desert include: macOS Providence, Rainbow, Chase, Baker, Death Valley and macOS Zzyzx.
Once again Apple is updating their MacBook Pro range, following on from two updates last year. Despite the physical design hardly changing since late 2016, Apple act as if they are still invested in the current MacBook Pro design.
The good news: This shows that Apple will improve the MacBook Pro when they can – without saving up improvements until the next major redesign.
Apple measured the performance of various pro applications. The degree to which these apps used the CPU vs. the GPU (which remain the same for now) is reflected in the speed increases. Apple describes the new configurations as being ‘up to twice as fast.’ Here’s how much faster they say professional applications are:
- Music producers can play back massive multi-track projects with up to two times more Alchemy plug-ins in Logic Pro X.
- 3D designers can render scenes up to two times faster in Maya Arnold.
- Photographers can apply complex edits and filters up to 75 percent faster in Photoshop.
- Developers can compile code up to 65 percent faster in Xcode.
- Scientists and researchers can compute complex fluid dynamics simulations up to 50 percent faster in TetrUSS.
- Video editors can edit up to 11 simultaneous multi-cam streams of 4K video in Final Cut Pro X.
In the case of Final Cut (according to FCP.co), the 2018 MacBook Pro can edit 9 simultaneous multi-cam streams of 4K video.
Although appreciated, I hope Apple turn to improving graphics performance next time – for the more modern professional applications that do most of their work on the GPU.
Today’s news means there won’t be a MacBook Pro announcement at WWDC. This update might also signal that the next notebook architecture from Apple – which may eventually allow for non-Intel CPUs – will first be introduced at the low end. That would be useful for developers who are optimising their applications for Apple’s A-series processors used in the most recent iPad Pros.
In a rare moment, a feature film editor is given credit for their creative contributions. In a video, the Russo brothers – directors of various Marvel films, including the two most recent Avengers movies – talk about Jeffrey Ford, one of their editors.
After describing how Ford came up with the last moment they filmed for the series, towards the end of Endgame, they say…
All credit to Jeff Ford, our editor. He edited every MCU movie we’ve done. He’s edited several others as well. He’s one of the creative cornerstones of the MCU. I think it’s very fitting that he would have come up with that line that pays off the entire series so well.
There’s more in this video. Spoiler warning if you don’t want to know what happens at the end of Avengers: Endgame:
To find out more about the editing of the MCU films, read Steve Hullfish’s interviews with their editors: Jeffrey Ford, Michael Shawver (Black Panther), Craig Wood (Ant-Man and the Wasp), Fred Raskin (Guardians of the Galaxy, Vol. 2), Sabrina Plisco, and Wyatt Smith (Doctor Strange).
Soon after the previous version of the Mac Pro was launched in 2013, pro users were already hoping for Apple to release a perfect replacement. Over on Medium I’ve written a post on why their go-to-market strategy will be vital to its success:
How will 2019 Mac Pro hardware, software and services be sold and supported? If the answer remains ‘Apple will do it,’ the new computer may have failed already.
Apple have stated that the forthcoming Mac Pro will have a ‘modular’ design. Modularity is seen as a way of defining a ‘pro’ piece of hardware, and it applies to pro software and pro services too: the kind of flexibility that each professional industry category expects. For example, scientific researchers won’t invest in hardware tuned to the needs of the high-end TV and film post-production industry.
The botched launch of Final Cut Pro X in 2011 was decried by many who had ‘bet their businesses’ on Final Cut Pro 7. There were many people who had spent a lot of money and time on delivering video, TV and film using Final Cut. The workflows they had developed, the video cards they had invested in and the time they had spent becoming experts were now associated with a system that had no future.
Many missed what also made the 2013 Mac Pro not suited for professionals: the lack of a third-party ecosystem to support professionals using the hardware.
Here is why one production company – Mystery Box – changed to Final Cut after 8 years with Adobe. Chris Workman writes…
Mystery Box started on Apple’s Final Cut 7 and changed to Adobe Premiere in 2011 as we found it was the closest match in tool set and UI to Final Cut 7. After 8 years of streamlining workflows and building muscle memory, last year Mystery Box made the decision to jump the Adobe ship and make FCPX our NLE of choice.
Acknowledging recently announced updates from Adobe, Avid and Blackmagic:
Every year companies make huge improvements in software development. At NAB 2019 both Avid and Blackmagic announced major changes to their NLEs, with some very promising features patterned after Final Cut X’s organization.
We have experienced the power of metadata media organization, and there is no going back. We can’t say what NLE we will be on in the years to come, but it will definitely have to compete with FCPX’s keyword workflow. Nothing else compares, at the moment.
Go over to the Mystery Box site to learn in detail what makes them stick with Final Cut Pro X.
On Monday Avid announced their Q1 2019 results. Here are some tidbits from their investor call – as transcribed by The Motley Fool.
Comments from Jeff Rosica, Avid’s CEO:
On how customers are moving from perpetual to subscription licenses:
As products age, there’s always churn because people stop using products 7 or 10 years later and they stop doing support, and so there’s always a bit of churn as products age out. So, there’s always going to be some of that within the noise of the numbers. And then there’s also move from perpetual to subscription, a lot more people, a lot of their perpetual revenues, where everything from individual creative professionals, all the way up to enterprises, who bought their software as a perpetual license and now they’re moving to subscription. So, you just see a lot of movement over to the subscription line from the perpetual licensing line.
On converting ‘| First’ users to paid options:
Our paid conversion is, I would say, almost at best practice, industry standard, which is great because I don’t think we’re yet at best practice from a marketing standpoint. […] But it shows the passion of the customer base, that even with the efforts that we make today, that we’re able to get really top-industry averages for our conversions today. But there’s a big opportunity there that we already are going after and will be going after even more aggressively later this year and in next year.
On the speed of migration to cloud services:
It’s not going to be explosive. This industry will migrate gradually, and they will, I think, largely be a hybrid, multi-cloud approach for most of these customers. It will take time to migrate over, which, to be honest, is a good thing because if they went too fast, you could see too aggressive of a hit to the cash flows of the company. Because they are doing it gradually, you kind of avoid that kind of massive J curve kind of impact to the businesses’ cash flows. So, I think the industry will go hybrid and will go gradually. And for us, I’ll remind you, Orin, also, a lot of our early SaaS offerings, we’re looking to add on services not take away from stuff that we do today. Even though there are some ways you can deploy in the cloud different than we do on-prem. We’re really focused on, especially in the early days, adding complementary SaaS services to the business.
On how they will move on from relying on selling storage to cloud-based storage:
This was the strategy behind what we call Cloudspaces. The best way I could explain this, Michael, is Avid’s a collaborative storage tool, and it works in what they call workspaces. And so, you assign workspaces to groups and they do their work, and you give them bandwidth and you give them storage capacity, et cetera, you assign capability to them. We’re basically allowing that in a cloud so that what we’ve done with NEXIS Cloudspaces is that every customer who already bought a NEXIS, that’s why I mentioned the more than 2,500 customer installations we have to date, all those customers, when they download, because most storage customers have not, almost all run a maintenance plan, they get to download the software update as a part of their maintenance program. And that software update immediately lights up the Cloudspaces, which basically is additional workspaces in the Microsoft Azure environment and allows them to start to try the cloud for their storage expansion. And it’s meant for near line and archives. So, it is the first step for us as a company beyond the near line product we have available today to really show people how they can park and archive stuff in the cloud very easily and efficiently. And it literally lights up when they download the software, then they just decide what they want to do.
And with the help of our partnership with the Microsoft Azure team, they actually have given a free use of 2 terabytes of storage for people to get started in the first 90 days. So really, our strategy is to get people to try it and try it for free, love it, hopefully, and then start consuming the NEXIS Cloudspaces, which will give us an additional revenue stream for the archive piece.
You can take a look at the Q1 2019 Avid Technology results press release and presentation, and listen to the conference call, at ir.avid.com.