Apple hires Tim Dashwood, 3D movie and 360º video software pioneer

Sunday, 23 April 2017

Apple’s new hire is an expert in the kind of high-end post production that needs serious Apple hardware. Having recently reassured those despairing over the future of the Mac Pro, with this hire Apple show they are also serious about making sure their software keeps up.

The first sign that something was up came on Friday 21st April. The products sold by Dashwood Cinema Solutions that work with Final Cut Pro X, Motion 5 and the Mac versions of Adobe Premiere Pro CC and Adobe After Effects CC were suddenly available for free.

Now the news has gone out that Tim has been hired by Apple.

Dashwood Cinema Solutions is effectively one man: Tim Dashwood, creator of post production tools for the Mac for over 10 years. He first gained attention from Final Cut Pro users when he introduced plugins for making 3D feature films. In recent years he’s been making 360º video toolkits for Final Cut Pro and Premiere. His high-end toolkit cost $1,199 until Friday. It is used for high-end 3D VR video post production at places such as Jaunt, where it was used to make Pure McCartney VR. (For more on how these films were made, read my article.) Tim and I also feature in a video on how VR video works in Final Cut Pro X with his plugins.

This is good news for those who use Macs to make VR video. As the tools are now free for editors and motion graphics designers, the barrier to entry is that much lower. It is unknown whether Tim has joined the ProApps team to work on future versions of Final Cut Pro X and Motion or has joined the OS team. If he is now with the OS team, it is possible that his technology (which is partially implemented in Quartz Composer) will be made available to developers on Apple platforms via updates to the AV Foundation media toolkit.

Although Dashwood's 360VR Toolbox works well on current MacBook Pros, the more GPU power you have, the better. Hiring Dashwood suggests that Apple expect to add much more GPU horsepower to Macs and perhaps iOS devices.

Good news for video editors, motion graphics designers and those wanting to tell their stories with VR video.

Apple has updated their ProRes White Paper

Tuesday, 18 April 2017

In April 2017 Apple updated their white paper on their ProRes codec family. Here are the changes.

An addition about recent changes in Final Cut Pro X:

A variety of cameras can now capture and record a wider gamut of color values when working in log or raw formats. You can preserve this wider color gamut by recording with the ProRes LOG setting on certain cameras such as the ARRI ALEXA or transcoding from the RED® camera’s REDCODE® RAW format. Final Cut Pro 10.3 or later can process color in wide color gamut and output Apple ProRes files in the Rec. 2020, DCI-P3, or D65-P3 color space. This results in deeper colors and more detail, with richer red and green areas of the image.

With Final Cut Pro 10.3 or later, you can also export Apple ProRes files inside an MXF metadata wrapper instead of exporting .mov files. This makes the exported video file compatible with a wide range of playback systems that rely on the MXF standard for broadcast and archiving.

The graph for how many multicam streams are simultaneously supported at HD and 4K on a MacBook Pro has been updated.

Testing conducted by Apple in March 2014 using shipping 15-inch MacBook Pro with Retina display quad-core 2.6GHz units with 1TB flash storage, 16GB of RAM, NVIDIA GeForce GT 750M graphics

Testing conducted by Apple in January 2017 using shipping 2.9GHz quad-core Intel Core i7-based 15-inch MacBook Pro systems with 2TB SSD, 16GB of RAM, Radeon Pro 460 graphics

They also made some changes to the Target Data Rates table: 5120 x 2160 has been replaced by 5120 x 2700, and new rows have been added for 6K and 8K.

Avid vs. Adobe vs. Blackmagic vs. Apple - A mild skirmish before a real war

Tuesday, 18 April 2017

Those trying to make sure ‘their’ NLE wins the ‘war of post production’ would do well to take notice of the big changes coming to post in the coming years. Amazon, Google, Facebook, Microsoft and maybe a wider Apple may come along and make the current battle seem small.

Amazon have now rebranded their Elemental acquisition as ‘AWS Elemental, an Amazon Web Services company.’

Elemental, which makes high-speed video encoding and transcoding software to enable multiscreen content delivery across different devices, was founded nine years ago in Portland, Ore., and counts among its many customers ABC, BBC, Comcast, Ericsson, and ESPN.

AWS Elemental is initially being marketed as a multi-platform distribution technology company. Amazon have been making more purchases in the media area: they also now own Thinkbox, who make cloud-based and on-site graphics rendering tools for VFX.

Thinkbox Software has provided creative tools and pipeline technology for both small and large scale projects including the world's largest feature films. Transformers: Dark of the Moon, Thor, Green Lantern, Harry Potter, Avatar, Tron, G.I. Joe: The Rise of Cobra and hundreds of other films have utilized our software in front of and behind the screen. Our tools have been at the core of award winning projects, music videos, commercials and hundreds of hours of creative content spanning film, broadcast, commercial, marketing, games and web content.

If high-end VFX is something that Amazon think is worth getting into, maybe general-purpose video editing will come soon.

Which NLE UI and technology would be a good purchase for Amazon? 

UK TV assistant editor tries Final Cut Pro X

Friday, 14 April 2017

Chris Chapman is an edit assistant who works on UK prime time soap opera Emmerdale. He has written a detailed blog post on his impressions based on two weeks trying out Final Cut Pro X.

I chose to download the FCPX 30 day trial to edit a project with and see how I liked it. This blog post is essentially a breakdown of how I worked, what I liked and disliked about FCPX, and if I think I'll move over.  I think my situation is probably similar to many others, who edit occasionally at home and are looking for the most cost-effective solution to work with; especially as students have just had their Adobe CC subscriptions almost doubled recently.

He had positive things to say…

…the option for creating Optimised and Proxy Media is excellent.  This very simple, but very useful, feature is truly brilliant.  It essentially creates ProRes media in the background to put all media on the same level playing field: 'Optimised Media' ups lower quality footage (such as H.264) to ProRes 422 to give a better editing experience, 'Proxy Media' creates ProRes Proxy media to improve playback for higher quality footage you struggle to run at full resolution (such as R3D, RAW or 4K).  I found the background rendering to be very fast when I tested it, but I was running mostly 720p or 1080p media that was native H.264 from an iPhone, and typically an average of 1-2 minutes a clip.

The biggest advantage of this elegant feature is the removal of the conform process, gone are the days of keeping track of all media throughout an edit, and relinking your original media when you have locked your final offline sequence, and in worst case scenarios having to manually re-lay all media on a sequence with the high res native media.  With two mouse clicks you can play back a full-resolution edit, and be reconnected to your native media instantly.


There were many other functions in FCPX that were very useful when cutting a vlog, a big one was the speed changes I often do to footage Siobhan and Emma shot as timelapses.  This was especially easy in FCPX as there are easy to reach functions for this. One option, below the viewer, is the 'Retime Editor', whose icon looks like a 'speedometer'.  The quickest way to 'fit a clip' into a certain space or time gap, is to highlight a clip on the timeline and press 'cmd+R', this brings up the 'Retime Menu' which gives you a percentage above the clip.  You can now either use the dropdown to select a certain speed (50%, 25%, 2x, 4x) or a custom percentage.  Or even faster, drag the right edges of the Retime Menu, as if you were trimming the clip and this will speed the clip up, or slow it down to fit the space used by it.


Outside of what I was cutting, I'd imagine FCPX would handle mixed media well, mostly because of its function to create optimised and proxy media upon import, but also because of its impressive ability to cache and render on the fly as you work; this does come at a price.  When normally cutting these vlogs, in Premiere CS6, I would be running with about 50% of my 24GB of RAM in use, with FCPX it was closer to 90% and sometimes more.  This is likely down to the program's use of my full machine, and also the constant background rendering, waveform creation, poster frame creation and a more complex program in general.  The rest of the machine however didn't suffer if I jumped to another program briefly such as Finder to copy a file, Safari (often to google a simple function), or even Premiere (to check how I styled things in past videos).


…and things he did not like:

[3-point editing] appears to be very difficult also due to the requirement of 'gap', which as far as I'm aware is a main staple in most people's editing arsenal.  FCPX allows you to put down ins and outs on the timeline but they have to be placed over existing media, or gap.  This makes accurate 3-point edits feel difficult for anything other than replacing clips which can be done easier by dragging a new clip over an old one.  More and more I feel FCPX wants you to be throwing clips in roughly at first and finessing on the timeline, rather than being accurate from the start.  A way around this would be to lay 'gap' at the end of your timeline at all times to allow some breathing space, this is however more of a work-around than a true problem solver; but I don't see this style of timeline changing anytime soon.


Also another quirk is the lack of sync markers to warn you, if a shot slips out of sync with its original audio; this is a worry for me and could potentially be a huge issue with an accidental slip of an edit - I feel the lack of sync markers is Apple's arrogance that as long as you edit how they suggest you do, it'll never go wrong, so why worry you with it.


I do still feel that FCPX isn't designed or appropriate for the mass pro market though.  And I don't see it ever replacing the staples that are Avid and Premiere; it doesn't handle multiple users and mass data banks with hardware like Avid does, and its lack of integration with After Effects amongst other programs, will be a drawback for many; such as small creative studios.

Those with years of experience of using Final Cut may find much they disagree with here. They might think that Chris doesn't know how to do things ‘properly,’ and is therefore unqualified to make these judgements. In reality, he represents exactly the kind of person we want to be trying Final Cut, and Apple would do well to pay attention to what people like him think. Smoothing all sorts of ‘onboarding’ stories is what Apple and other interested parties need to do. It is important to remember that this report is based on just two weeks of trying Final Cut. I hope the Final Cut community is standing by to assist as and when needed. We will see if he changes his mind about the things he doesn’t like at the moment.

The bottom line is that although he isn't sold on some of its features, he will stick with it for projects away from his work in commercial TV.

When talking to those with a great deal of experience in another NLE, I suggest that they try Final Cut by doing a project that is very different from what they usually do. Feature editors should try to make a music video. Documentary editors should try to cut a short film. Fast turnaround editors should try to cut a short documentary. That means they will be able to use Final Cut’s ways of doing things without constantly comparing them to the way they use Media Composer or Premiere every day.

In the past I've been quite against FCPX and not interested in 're-training' so to speak, but as I'm being pushed to my limit with CS6 I have to make a decision at some point, as to which NLE to move onto for projects I work on alone.  In all honesty, I can see me using FCPX for these projects.

I'm looking forward to seeing what he says in six months. Perhaps in the meantime, another person who is experienced in Media Composer and Premiere could report how their point of view has changed over the months of using Final Cut Pro X!

UPDATE: Mathieu Ghekiere has added a very useful, detailed comment below the post, helping Chris Chapman with a personal take and pointers to find out more:

Some notes from someone who went from a place of hate to a place of love and experience (woaw, that sounds a bit new age, didn't mean it like that!), you should also keep in mind that it will take a while for new muscle memory to get created in how the timeline reacts, how you organise stuff, etc. ... In my experience this started after about 3 months of editing.


Hope some comments could help you along the way. Happy editing!


With version 3, Thunderbolt is finally a cross-platform hit

Friday, 24 March 2017

Thunderbolt 3's recent success is good news for high-end post production.

When Apple replaced their (then) high-speed FireWire 800 connections with Thunderbolt ports in 2011, pro users were happy with the many improvements. Sadly, USB 3 turned out to be 'good enough' for the vast majority of PC users. That prevented a competitive market for Thunderbolt devices developing, which meant external drive and capture card prices didn't come down significantly.

The situation didn't improve with Thunderbolt 2, despite it doubling bandwidth. It was for ‘Mac people who do postproduction' - not a large enough market for big economies of scale.

It looks like version 3 of Thunderbolt will finally go mainstream. It now has 2,750 MB/s of PCI Express bandwidth - significantly more than USB 3.

As well as Apple including it in recent MacBooks and MacBook Pros, Intel's Thunderbolt 3 buyers guide shows 52 PCs implementing Thunderbolt 3.

Bare Feats have been testing external GPU chassis that connect to Macs via Thunderbolt 3.

Bare Feats are working on benchmarking a Sonnet eGFX Breakaway Box. I'll link to that report here soon.

QNAP have launched a Thunderbolt 3 NAS:

By supporting the SMB protocol, Final Cut Pro X 10.3 allows users to create a library on a NAS volume and use it as if it were on a local storage device. This simple-yet-important change allows users of Final Cut Pro X 10.3 (Mac users) and users of Adobe Premiere Pro (Windows users) to centrally store their video materials in the same shared folder on the NAS, greatly improving productivity for highly-collaborative projects in multi-workstation environments.

Simultaneous online editing by multiple users is made possible as the newly-added SMB protocol on QNAP NAS runs faster and more stable than the NFS protocol used in previous versions. Multiple Final Cut Pro users can edit different events and projects for the same media files simultaneously and combine each piece later to speed up the editing process, making it especially ideal for fast-moving media environments. Collaboration is also highly simplified now that Mac and Windows users can share files through SMB networks for separate steps of production work.

Felipe Baez points out that QNAP connects Macs to their NAS using Thunderbolt networking. From the same QNAP page:

Directly connecting a QNAP Thunderbolt 3 NAS to a computer establishes a peer-to-peer (P2P) network and enables 20GbE connectivity.

Detailed QNAP data sheet PDF on their Thunderbolt 3 NAS.

It is good news that QNAP consider Thunderbolt networking reliable enough to be the basis of a professional product. Intel has a PDF on networking PCs using Thunderbolt, with a few notes on networking Macs using Thunderbolt:

  • To be able to communicate with another Windows computer through Thunderbolt networking, the Apple computer should share the same subnet as the other computer.
  • In a multiple communication environment, Apple computers act as bridges. Therefore any Apple computer in a networking chain should have the same subnet as its connected peers.

Apple has a support note on Thunderbolt networking over USB-C:

  • Be sure to connect your Mac directly to the Thunderbolt 3 computer and not through a USB-C hub. USB-C hubs don't support Thunderbolt 3 connections between their ports.
  • Make sure that the USB-C cable that you're using supports Thunderbolt 3. Not all USB-C cables support the requirements of Thunderbolt 3. For example, a USB-C charge cable doesn't support a Thunderbolt 3 connection.

Apple’s support note on shared storage with Final Cut Pro X 10.3.

It is good that there are high-end solutions for Thunderbolt 3, but it is important to remember that these are driven by a market of game enthusiasts who want to run advanced games on - or even watch TV on - 4K monitors. A post by Jeff Atwood covers using a $500 external Thunderbolt enclosure to add a $600 video card to a $1,000 game PC.

Playing games at 1080p in my living room was already possible. But now that I have an incredible 4k display in the living room, it's a whole other level of difficulty. Not just twice as hard – and remember current consoles barely manage to eke out 1080p at 30fps in most games – but four times as hard. That's where external GPU power comes in.


40Gbps is, for the record, an insane amount of bandwidth. Let's use our rule of thumb based on ultra common gigabit ethernet, that 1 gigabit = 120 megabytes/second, and we arrive at 4.8 gigabytes/second. Zow.

That's more than enough bandwidth to run even the highest of high end video cards, but it is not without overhead. There's a mild performance hit for running the card externally, on the order of 15%. There's also a further performance hit of 10% if you are in "loopback" mode on a laptop where you don't have an external display, so the video frames have to be shuttled back from the GPU to the internal laptop display.

This may look like a gamer-only thing, but surprisingly, it isn't. What you get is the general purpose ability to attach any PCI express card to any computer with a Thunderbolt 3 port and, for the most part, it just works!
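Jeff's rule of thumb is simple enough to sanity-check. Here is a quick sketch (his 120 MB/s-per-gigabit figure is an approximation that folds in protocol overhead, not an exact conversion):

```python
def gigabit_to_mb_per_s(gigabits, mb_per_gigabit=120):
    """Convert a link speed in gigabits/s to usable MB/s using the
    quoted rule of thumb of roughly 120 MB/s per gigabit."""
    return gigabits * mb_per_gigabit

# Thunderbolt 3's 40 Gbps link:
print(gigabit_to_mb_per_s(40) / 1000)  # 4.8 (GB/s), matching the quote
```

By the same rule of thumb, a 10 Gbps USB 3.1 Gen 2 link works out at roughly 1.2 GB/s - less than half the PCI Express bandwidth Thunderbolt 3 makes available.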

 Keep up with Thunderbolt 3 at Intel’s Thunderbolt site.

20,000 word Final Cut Pro X high-end postproduction guide

Monday, 06 March 2017

First came Mike Matzdorff’s Final Cut for feature films book: Final Cut Pro X: Pro Workflow (Apple iBook / Amazon Kindle).

Now there is a series of articles by Sam Mestman and Patrick Southern that covers high-end postproduction workflow. They include everything from how to handle 6K RED RAW footage on set to final DCP delivery for cinema showings and worldwide distribution.

Part 1 covers on-set post production including workflows for the camera department, the DIT/Assistant editor, production sound and script supervisor:

This 5 part series should be looked at as a cheat sheet on how to make a movie, pilot, or doc without limits in the modern age.

[…] Everything you are about to read has actually been done with a real world project called Off The Grid, which is We Make Movies’ first original TV Pilot that premiered at the Sundance theater in Hollywood. 

Part 2 describes how to save time when preparing footage for editing:

On Off The Grid, we took the search and organizational aspects of FCPX to another level. We automated most of the metadata management, applying that data to the original sound and picture.

We were able to automatically synchronize audio, batch rename clips, and add keywords and notes based on our Script Supervisor's log. This made it possible to quickly search by character, frame rate, frame size, shot composition, and circle takes.

[…] These automated organizational techniques can cut footage prep time from 3 days down to as little as 10 minutes.
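As an illustration of the kind of automation described - turning a script supervisor's log into batch renames and keywords - here is a minimal Python sketch. The CSV column names and naming scheme are hypothetical; the real Off The Grid pipeline used FCPX-specific tooling:

```python
import csv
import io

# Hypothetical script-supervisor log. Real logs vary by production;
# this format is an assumption for illustration only.
LOG = """clip,scene,take,characters,circle
A001_C003,12,2,"ANNA,BEN",yes
A001_C004,12,3,ANNA,no
"""

def parse_log(text):
    """Turn a log into per-clip metadata: a new name and a keyword list."""
    clips = []
    for row in csv.DictReader(io.StringIO(text)):
        keywords = [c.strip() for c in row["characters"].split(",")]
        if row["circle"] == "yes":
            keywords.append("Circle Take")
        clips.append({
            "original": row["clip"],
            "new_name": f"Sc{row['scene']}_Tk{row['take']}",
            "keywords": keywords,
        })
    return clips

for clip in parse_log(LOG):
    print(clip["original"], "->", clip["new_name"], clip["keywords"])
```

In practice the resulting names and keywords would be applied inside the NLE (for example via its XML import facilities) rather than printed, but the principle - metadata applied automatically rather than typed clip by clip - is the same.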

Part 3 shows how to maximise the editing experience in Final Cut Pro X:

In other NLEs, most editors either spend a lot of time making and renaming subclips, or pulling selects into a timeline for review. These both require lengthy prep and are more difficult to work with than necessary. Subclips don't give you easy access to the full-length clip, and long string-outs can be difficult to navigate for a specific clip.

None of this is necessary in FCPX. You can use a variety of tools within FCPX to find what you’re looking for. You can leverage the search bar, Smart Collections, Favorites, Rejects, Keywords, notes fields, and Markers to help you filter your choices to display exactly the thing you're looking for.

Part 4 details workgroup workflow and finishing:

It’s widely known that most shared storage systems have a hard time once you start dealing with 4K, 6K, and VR.  Less widely known is that most are also not optimized for the small database files that FCPX relies on or for its libraries and cache.

To work optimally on shared storage, FCPX Libraries need storage that is optimized for many micro interactions AS WELL as being optimized for high resolution codecs and framerates.

Part 5 finishes off with collaborating with colour, VFX and audio professionals:

When everyone has access to the same storage and knows how to speak each others’ language, completion of a project becomes exponentially faster. This results in happier teams, clients, and budgets. Ending the Tower of Babel of post production allows you not only to save time and money, but also to put more of that time and money where it matters most…the craft of storytelling.

In summary: a great resource for those planning to make (or working on) a TV series or feature film with a workflow that has Final Cut Pro X at the centre.


Apple royalty-free audio library elements can be used in commercial productions

Wednesday, 15 February 2017

An important part of preparing any video or animation production for distribution is checking that it is copyright-cleared. Funders and distributors need to make sure that productions that become successful don’t lead to lawsuits from aggrieved copyright holders. 

As well as applying to video clips, the audio must be cleared. Music and sound effects are subject to copyright. Google’s YouTube runs software that can recognise music in your video that belongs to someone else. Sometimes this automated process recognises music that is royalty-free - copyright cleared.

If you do use royalty-free music or sound effects, it is a good idea to keep a note of the permission you have to use them in commercial projects. Apple have support documents that are helpful here. They say that you can use their royalty-free content in any way - apart from distributing the individual source files, be they samples or music loops. You cannot take all or part of their library and sell it to anyone else or make it available to anyone for free:

You may broadcast and/or distribute your own soundtracks that were created using the Sample Content, however, individual files may not be commercially or otherwise distributed on a standalone basis, nor may they be repackaged in whole or in part as clipart, stock animation, audio samples, sound files or music beds.

Here are the relevant support notes on this subject for the royalty-free content that comes with Final Cut Pro X and Motion, GarageBand, and Logic Pro X/MainStage 3.

If YouTube erroneously flags your soundtrack because you've used content from a royalty-free library that comes with an Apple product you've bought, you can dispute the 'Content ID claim’ by appealing on the YouTube site itself. WikiHow has an article on appealing this sort of claim. In the ‘I believe the copyright claim is not valid because’ section, click the ‘I have a license or written permission from the proper rights holder to use this material’ option and provide a link to the relevant Apple support document for the application you used to make your video.

Proper Final Cut Pro X competition - Snap?

Tuesday, 14 February 2017

Adobe and Avid are very unlikely to come up with any product or service that competes with Final Cut Pro X. Apple certainly don't act as if they have any competition. Avid is stuck at the high end, and Adobe squandered their chance to be the main supplier of video literacy products and services to the wider public.

That isn't great news for Final Cut users. There is no-one to put the ProApps team under pressure. Apple act as if their plan is fine and they are doggedly sticking to it, because there is no serious competition. This isn't because of ‘iPhone money’: with almost 2 million copies of Final Cut sold so far, there is no reason why it hasn't returned a good profit in the first five and a half years of its life.

If Final Cut Pro X had had serious competition since 2011, it would have a compelling built-in collaboration solution today.

As millions have free access to iMovie on macOS and iOS, Apple are spreading the gift of video literacy by default. Imagine if Apple had had some serious competition on that front over the last few years. Adobe’s Premiere Clip for iOS wasn't it.

Now Snap Inc. - providers of the Snapchat service and Snap Spectacles - are going public. The good news for Apple fans is that Snap are telling potential investors that they plan to implement Apple’s strategy. Not explicitly - the similarity has been pointed out by Ben Thompson in his Stratechery weekly post:

To summarize, Snap’s strategy is to:

  • Deliver innovative and differentiated products that…
  • Cost a lot to deliver but…
  • Capture the best customers…and PROFIT!

That’s definitely not Twitter; indeed, the real analogy for Snap is from another part of technology entirely: it’s Apple.

Snap plan to use their insight into users to give them what they need - and to provide advertisers with access to the best customers.

Luckily for iMovie and Final Cut Pro X users, Snap have no baggage holding them back: They are in a position to invest in tools for creativity for all sorts of users - without the precondition of using a 20th century metaphor or of needing to sell storage and service contracts.

Ben quotes Snap:

We believe it’s always worth trying to build something that will empower people to express themselves, live in the moment, learn about the world, and have fun together — even when it’s not clear that what we build will be successful or make money.

That’s the kind of competitor Final Cut Pro X users need around! I hope they do step up and try to change the world. Luckily there are a few others out there who might also take on the same role. I hope they all wake up soon!

On-location editing where the editor is the technician

Wednesday, 08 February 2017

A new case study from Peter Wiggins shows that editors can manage their own workgroup shared storage - setup and maintenance:

The job needed two edit machines, an ingest station, a graphics station and connection to the truck EVS system.

Hiring an ISIS system for the weekend would involve an engineer attending and that would blow the budget. So I suggested that I could supply and look after a newer, faster shared storage system.

He arranged to use a Lumaforge Jellyfish video workflow server; read the full story in his post.

Peter’s conclusion:

I can see more jobs where the editor/s are in control of the technical aspects of OB post production. They know how things should be configured, how fast things should run (and why) as they are the end users. The Jellyfish is not only a fast shared storage box, it is very easy to hook up to each client. No IT or field engineer required.

And the bottom line really is the bottom line. If a client has a fully working, fast edit system on location on a tight budget, everybody wins.

Although in this case Peter was working on fast-turnaround on-location sports editing, this model would work well for other sorts of location editing.

Modern post-production: No need for support contracts

Lumaforge’s model is to make money on the hardware while making the system simple enough that there are no support costs. Good news for productions; not so good for those hoping to make money on service contracts, or for companies competing for workgroup editing hardware and service sales. I'm interested to see how others compete with Lumaforge’s Jellyfish solution.

Blackmagic Design: SD cards ready for prime time

Tuesday, 07 February 2017

Blackmagic Design think that SD cards are ready for professional broadcast prime time. They have announced the HyperDeck Studio Mini.

In the last few years high-end productions have been using SSDs to record and playback source media and final produced shows.

The launch of Blackmagic Design's new HyperDeck Studio Mini shows that they think that relatively inexpensive SD cards have the capacity, speed and reliability to take over from SSDs.

a professional deck that records and plays back broadcast quality 10-bit video as ProRes files on commonly available SD and UHS-II cards. It’s packed with features like 6G-SDI for working with all formats up to 2160p30, HDMI 2.0 for monitoring, dual SD card slots for non-stop recording and a reference output

It seems that their new SD card-based record/playback deck has the features that, four years ago, were associated with hardware costing 100 times as much.

I wonder if a connected computer can access ProRes growing media on the SD cards for live highlights editing.

Apple targets the young and educators with $199 post apps bundle

Friday, 03 February 2017

Jim Dalrymple of The Loop reports that Apple have announced a $199.99 bundle for students and teachers that includes full versions of applications usually priced at a total of $629.95:

The apps include Final Cut Pro X ($299.99), Logic Pro X ($199.99), Motion 5 ($49.99), Compressor 4 ($49.99), and MainStage 3 ($29.99).

A saving of $429.96.
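For those checking the arithmetic (this assumes the bundle's exact price is $199.99, which the quoted saving implies):

```python
# Individual Mac App Store prices as listed above.
individual = {
    "Final Cut Pro X": 299.99,
    "Logic Pro X": 199.99,
    "Motion 5": 49.99,
    "Compressor 4": 49.99,
    "MainStage 3": 29.99,
}
bundle = 199.99  # Pro Apps Bundle for Education

saving = round(sum(individual.values()) - bundle, 2)
print(saving)  # 429.96
```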

The Pro Apps Bundle for Education is available for teachers, faculty, staff, and college students, as well as K12 and HiEd institutions.

Note that these are not student versions of these applications. This is a bundle of codes that are used on the Mac App Store to download full retail versions. Since 2011, Apple haven't charged for updates. This means that if students take advantage of this offer and move into post production, they will probably not need to pay any more for the professional tools they use to earn money.

No UK educational bundle for professional applications has been announced yet. Final Cut Pro X retails for £299.99. The educational price is currently £270.


Accelerating the late 2016 MacBook Pro with an external graphics card

Thursday, 02 February 2017

Bare Feats have been testing the new AKiTiO Node eGFX Box with their late 2016 MacBook Pro. This full-bandwidth Thunderbolt 3-connected cage can connect standard NVIDIA graphics cards to the Mac. Along with many graphs showing how much better every benchmark runs with different cards compared with the best possible internal GPU from Apple (the 4GB AMD Radeon Pro 460) they conclude:

The AKiTiO Node eGFX Box is the first Thunderbolt 3 GPU expander we have tested that is macOS friendly, runs at full Thunderbolt 3 bandwidth, has a built-in power supply, and ships with a 2 meter Thunderbolt 3/USB-C cable. And with the array of compatible NVIDIA GPUs to choose from, you can be confident that it will not only support your CUDA capable apps, but will accelerate OpenGL and OpenCL capable apps beyond what the MacBook Pro's discrete AMD GPU is able to do.

Rob ‘Art’ Morgan of Bare Feats later tweeted some Final Cut Pro X-specific information: FCPX will take advantage of the external GPU in the Node eGFX box if a display is attached to it:

FCPX 10.3.2 works if external display connected to GPU in Node. During Directional Blur render, 70% load on 980 Ti, 0% on 480.

He also tweeted a graph of how much GPU processing Final Cut offloads to an external graphics card.

Looking forward to Bare Feats adding some Final Cut tests using this setup soon!

OWC DEC expansion slice for 2016 MacBook Pro update

Thursday, 02 February 2017

At the beginning of January OWC previewed their DEC expansion system for the 15" 2016 MacBook Pro. They supplied images of a device that attaches to the base of the machine.

2016 MacBook Pro from left and right showing ports on prototype OWC DEC - version 1

The image they first used didn't show how the device is designed to connect to the Mac. You may have missed an update from the Mac Performance Guide in which they had a look at a DEC prototype.

I’ve seen the prototype of the OWC DEC first hand. The DEC bolts onto the bottom of the MacBook Pro after removing the bottom shell of the case. The result is a seamless integration with all the key ports I need (gigabit ethernet, USB-A 3.1, SD card slot), delivering what feels in the hand very much like the 13" 2012 MacBook Pro in thickness (but in 15" size).

Read more at Mac Performance Guide.

A few days after this report, OWC updated their visualisation:


Modular Expansion?

OWC says the DEC will make the MacBook ‘Pro’ again:

This solution seamlessly integrates with your MacBook Pro for increased capacity and expanded connectivity far beyond factory capabilities.

A vague statement. Maybe OWC could do with some feedback on what would be useful. 

Video editors and VFX artists hoping to use their new MacBook Pro in workgroups with servers like Lumaforge’s Jellyfish would appreciate a DEC with 10Gb Ethernet. Others would want a DEC filled with extra battery only - useful for 24-hour journeys across the planet.

As there will be many different requirements by different professional users, I hope OWC offer a modular system.

This would be similar to the way PowerBooks worked in the 1990s: you could choose a CD drive, a floppy drive or a battery in one of two slots. In 1999 Low End Mac stated that you could get 16 hours of power by using two batteries in a 'Lombard' PowerBook.

The OWC DEC as shown could have two slots 150mm wide by 160mm deep - one on each side. Here are some ideas for modules to go in the slots:

  • Older MacBook ports: Ethernet, SD Card, USB 3 ports, Audio In, MagSafe, Thunderbolt/MiniDisplayPort, HDMI
  • SSD
  • Battery
  • GPU with external connector
  • High-end ports: 10Gb Ethernet, etc.
  • Optical drive
  • Retro: 3.5" Floppy drive
  • Retro: Firewire, ADB, SCSI ports

In some cases you would only need one of a given module; in others two would be handy. What else would be useful for you?

Sign up at OWC to get DEC updates.


BBC News animation using Apple Motion and Final Cut Pro X

Thursday, 19 January 2017

I visited BBC New Broadcasting House in central London last week with Iain Anderson. As we were being shown around the BBC News headquarters, we happened across an animator working on a piece about Trump's inauguration day. He was editing it in Final Cut Pro X with plugins created in Apple Motion 5, both commercial and custom. 

Iain and I gave him some tips on making plugins - including replicator sequence behaviours. He also showed us BBC-branded plugins he made for Final Cut Pro X titles and animated graphs.

We also were happy to talk to others at the BBC on the latest developments in Final Cut Pro X workgroup collaboration.

Google intelligent image scaling could mean 4K quality at HD data rates

Thursday, 19 January 2017

PC Magazine has written about a new Google technology that shows very good results in scaling lower-resolution stills up to higher resolutions:

The new technique is called RAISR, which stands for "Rapid and Accurate Image Super-Resolution." [It] works by taking a low-resolution image and upsampling it, which basically means enhancing the detail using filtering. Anyone who's ever tried to do this manually knows that the end result looks a little blurred. RAISR avoids that thanks to machine learning.

Check out the article for examples of what RAISR can do.

Imagine this idea applied to ‘UHD’ video distribution. Store frames as 1920x1080 but with more colour information. To display 3840x2160 video, the algorithm could interpolate the missing pixels and show an image with detail that even viewers very close to the display would find indistinguishable from real 4K.

This would be hard to do in real time (in less than 1/60th or 1/120th of a second, depending on frame rate), but the algorithm would need to interpolate less detail for moving video - there are limits to what the human visual system can discern. If video was slowed down or paused, the algorithm would have more time to produce higher-quality 4K.
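The ‘blurred’ look the quote mentions comes from plain interpolation: inserting new samples halfway between known ones. Here is a minimal 1-D sketch of that baseline - not RAISR itself, which learns filters that restore edge detail on top of this:

```python
def upsample_linear(row):
    """Roughly double a 1-D signal's resolution by inserting
    linearly interpolated samples between each pair of neighbours."""
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) / 2)  # new sample: the average of its neighbours
    out.append(row[-1])
    return out

# A hard edge (0 -> 8) becomes a soft ramp - the source of the blur:
print(upsample_linear([0, 0, 8, 8]))  # [0, 0.0, 0, 4.0, 8, 8.0, 8]
```

A learned approach like RAISR replaces that fixed averaging filter with filters chosen per image patch, so edges stay sharp after upscaling.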

In practice, 4K for broadcast is mainly for bragging rights: fewer than 1% of people sitting at a comfortable viewing distance are able to see the improvement over HD. It makes more sense to dedicate precious bandwidth to better-quality pixels. This algorithm would then be able to scale the image up well if need be.

Logic Pro X 10.3: Its new iCloud feature would be big for Final Cut Pro X

Wednesday, 18 January 2017

Logic Pro X 10.3 is out. Alongside Touch Bar and other UI improvements, there's a ProApps feature that should interest Final Cut Pro X users: syncing with GarageBand for iOS via iCloud. According to 9to5Mac:

Tap a button in Logic Pro X and your whole project will now be sent to iCloud as a single reference track. That allows you to open up Garageband on your iPhone or iPad and continue working on the project in a way that doesn’t require you to upload heavyweight session files or bounce and pass around your own audio files in email or elsewhere. Once you add some new tracks in Garageband, the project will be synced back to iCloud ready for you to continue working in Logic Pro when you get back to the studio.

iCloud doesn't quite have the storage for a similar service for Final Cut users. However a 'Share with iMovie for iOS via iCloud Proxies’ feature would be useful in many workflows.

Imagine the following message you could send: ‘Here's the latest edit. Open it in iMovie on your iPhone, iPad or Mac. The timeline will link to the media in the cloud. You can roll the edit points a second or so in each direction, and change clip metadata. If you need to, you can also change effect/transition/title/generator settings and modify markers and keywords. Send me your version when you are ready.’

Recent Alex4D posts on Medium

Wednesday, 30 November 2016

Apple’s ProApps team should get their heads into the Cloud

For post production, Apple might be able to infiltrate media organisations via everyone except the post-production team. At some point when writers, researchers, producers and directors ask why the work they’ve already done in Final Cut needs to be transferred to a ‘professional system like Avid (or Premiere)’ — post people won’t be able to come up with a good enough argument.

New video distribution models mean new video creation tools

Post tools — be they high-end editing applications or free online services — will need to be able to create stories made of clouds of video, audio, graphics, effects, pictures and transitions. Each of the main video and audio post-production tools are at different points on the path towards being able to do this. They are limited by history, user interface metaphor and ability to deliver.

Apple have great plans for Macs, which don’t include turning them into giant iPhones

I don’t think Apple will make touchscreen Macs - especially ones with large screens. Direct manipulation of a UI much larger than an iPad Pro for hours on end is a choice between aching arms (screen in front of you) or aching neck (screen at an angle comfortable enough for your arms).

Good News: No More Wireless Networking Products from Apple

Why is this good news? The engineers in these teams are now free to do more distinctive things. Obviously Apple thinks that there is not much they can add to these products in coming years. They’re leaving these mature markets to others.

Now video editors can use the macOS Finder like a media database

I was lucky enough to be presenting at the 2016 FCPX Creative Summit a month ago when Benjamin Brodbeck came up to me with a problem. He works at Caterpillar Inc. and has a team of over 30 editors and assistants working on videos being shot and edited all over the world.

They constantly need to make films covering specific regions, models of equipment and categories of engineering project. Categorising footage is very important. A film featuring forest equipment being used in China can’t feature a shot of the wrong piece of equipment, or one taken in the wrong country.

To keep up to date, follow me on Medium.

Avid no longer going after mid-market and individual creatives

Saturday, 15 October 2016

It looks like Avid Technology are changing their strategy. The old strategy was described in a May 2015 video (no need to enter your information - just click 'Submit' to see the video): get larger proportions of what they call Tier 2 and Tier 3 markets - the $3.1bn business and institutions market and the $1.8bn individual creatives market.

Now they are talking about capturing more of the workflow in Tier 1. CEO Louis Hernandez Jr. appeared on CNBC's Mad Money show with Jim Cramer:

Avid's behind some of the largest media companies … a hugely influential player in 140 countries around the world

Avid editors…

will earn 44% higher than any other editor over their lifetime

[2:03] He refers to Media Composer and ProTools as 'Heritage’ products. As there is much more to workflow than editing and mixing, Avid has had to build on those tools to "…participate in the rest of the workflow”


These transformations are never that easy […] We are surging in cloud subscriptions and large enterprise deployments

He said that they previously announced that their transformation would be over in Q2 2017, but they discovered they had to restate 8 years of financial results. The upside of being delisted was that although the transformation was delayed, they were able to more quickly make the investments they needed to transform the way they planned to, so they are still on track.


We've said the end of this transformation is 2017, mid-year. You're starting to see with every given quarter real progress. You've seen platform sales up last quarter from quarter to quarter of 47%. You've seen 4 times growth on our cloud-based subscriptions and more to come

He later said that NBC needed to make a transition to digital economically, and was able to do so using Avid Technology technology.

I’ll leave it to Avid users themselves to determine whether recent surges in ‘cloud-based subscriptions’ were from new users or from those changing the way they already paid for Media Composer and ProTools.

My guess: they are hoping for an acquisition. Who would be a good fit? If you believe in Avid’s strategy, a company whose products and services complement the aim of providing solutions for the whole workflow at the high end. It also means Avid won't like the idea of selling Media Composer off to someone else, even if there's almost no chance it can be anywhere near as profitable as the audio side.

I wonder whether investors have a way of measuring how successful Avid's new strategy is. I'd be surprised if the results of the previous one were going to look very good.

Recent updates to my Alex4D Facebook page

Friday, 09 September 2016

I've been adding links and articles to my public Final Cut Pro X page on Facebook. Here are links to some of them:

26 July: A 360° film I shot on a Kodak PixPro SP360 4K Rig:

It shows a pilot and principal scientist Professor Alex Rogers taking a submersible from the #BaselineExplorer, the Nekton Mission deep ocean research ship, down to the Atlantic ocean floor 30 miles off the coast of Bermuda.

10 August: Another 360° film:

The rule with VR video: keep the camera still. We made this with a 360 camera attached to a speeding boat criss-crossing the ocean while the crew recovers a research submersible after a dive into the deep ocean.

18 August: Apple's ‘multi-ranges in a clip’ patent:

It patents a way of selecting and showing multiple ranges within the same clip […] Part of the description of a way multiple ranges don't yet work in Final Cut

19 August: Apple patent hints they've been thinking of expanding the use of roles:

Many want a roles-based mixer and effects system in the next update. This patent awarded to Apple earlier this year should give them hope.

20 August: Tutorial on exporting Final Cut timelines for delivery to broadcasters - including a link to the UK standard:

How do you send specific channels in your audio mix to channels in your output file in Final Cut Pro X? Adam Schoales shows you how.

23 August: How Apple's culture determines how they compete

Apple, Adobe and Avid seem to compete in the world of post production, but their cultures are so incompatible as for there to be no real competition.

26 August: There'll be a Final Cut Pro X special event at IBC in Amsterdam

I'm giving a presentation on shooting, stitching, editing, adding graphics to and distributing VR video using Final Cut Pro X. I will also be in the demo area on Saturday and Sunday to answer your 360 video production questions.

29 August: An article comparing Apple and Google might also apply to Apple in the post production world:

If Apple update Final Cut Pro X to version 10.3 later this year, I hope Final Cut moves from a primarily interdependent architecture to a primarily modular architecture.

29 August: Want to only transcode the media used in a specific project?

Useful if you have a few hours of 360° 4K clips, but you only need to generate proxy versions of clips you've selected in a project for editing on a slower Mac.

30 August: Join me at the Amsterdam Supermeet:

IBC – the biggest European video trade fair – has many stands for those who shoot and post-produce video. Sadly it also has very many stands that are of no interest to the same people.

The great thing about the Amsterdam SuperMeet is that all you need is concentrated in one evening of presentations, prizes and people.

1 September: Editors make Hackintoshes while they wait for upgraded Macs:

The idea behind a Hackintosh is that you get a Mac configured exactly the way you want using standard PC parts for much less than Apple charge.

In practice, the time it takes to build and maintain your cobbled-together computer negates the price advantage - but you can get exactly the Mac you want.

6 September: Adobe has beaten Apple when it comes to collaborative editing:

Big Adobe Premiere Pro news: Team Projects - the ability to share timelines between editors.

7 September: The Apple ProApps team have a vacancy:

The Apple Professional Apps Design group is looking to employ a ‘Video Applications Product Designer’

9 September: Additional information about Adobe Premiere's Team Projects feature

Although some would call the implementation clunky, it is only competing with Avid's ancient (but battle-tested) bin-locking alternative.

Check the page every few days to see more.

Apple WWDC 2016 Announcements and Post Production

Tuesday, 14 June 2016

Every year at their Worldwide Developer Conference Apple presents some of their plans relevant to software and hardware developers at a keynote presentation. Here are my notes and links from the 2016 keynote.

The main screen and the webcast stream didn’t have the normal 16:9 ratio. It was wider, at the Cinemascope ratio of 2.40:1. Could this be a hint that a future Apple-branded display will have a 21:9 (2.33:1) aspect ratio?

New Name

As iOS will reach version 10 this Autumn and OS X has been around for over 16 years, Apple will now rename their Mac operating system macOS. The next version will be macOS Sierra, version 10.12. This renaming will make Final Cut Pro, Logic and iMovie stand out as being part of an older naming scheme.

There’s a chance that iMovie will become ‘Movies’ for iOS and macOS - following on from how iPhoto become Photos. An alternative is that productions started in iMovie will be edited in macMovie and then be openable by macFCP while the soundtrack is modified in macLogic. More likely is that Final Cut and Logic will simply drop their X suffixes.


Siri

Siri for macOS means that Macs will be able to be controlled by voice as iOS devices can be today. SiriKit for iOS 10 gives a limited set of third party applications the option to be controlled by Siri.

If SiriKit was introduced to macOS the ProApps team would have the option to add much more voice control to their apps. This would be especially useful for finding clips based on keywords and other metadata. As well as asking “Show me clips in the browser with the ‘Interviews’ keyword” or “Show me clips in the timeline with dual mono,” Siri also understands context: “Show me interview clips… show me those with dual mono” will only show interview clips with dual mono - not first one selection of clips followed by all clips with dual mono.

Although there are many different ways of asking for the same thing, those are interpreted by Siri and passed to the target app in a standard way. This kind of automation would work well with scripting. Apple has released a new guide on that subject: the Mac Automation Scripting Guide. There are no hints yet that scripting will be added to iMovie or Final Cut.

For now SiriKit for third party iOS apps will only be used for the following tasks:

  • Audio or video calling
  • Messaging
  • Payments
  • Searching photos
  • Workouts
  • Ride booking

WWDC 2016 session on SiriKit.

New Photos features useful for video

Photos for iOS 10 and macOS Sierra will have a couple of new features of interest: more advanced content recognition and the automatic generation of ‘Memories’ videos.

As well as recognising all photos with a specific person, Photos will also recognise other kinds of content. This means that photos can be grouped based on the content detected. Examples include photos with beaches, with horses, shot in fields. This kind of automatic categorisation will be very useful for iMovie/Final Cut users - especially when clips are very long. The content recognition should be able to mark only the time in a long shot when a certain person or object appears.

Using this image recognition technology, Photos will also be able to generate ‘Memories.’ A Memory can look like a web page or publication on a subject. Memories can include videos made up of automatically animated photos. If users want to change the mood of a video, they can choose a new soundtrack and the story will be re-generated to match the music.

Will these video Memories be modifiable in iMovie or Final Cut Pro X? That would be a very quick way to get new people into making movies. The same technology could be used to make automatic videos from selected clips in a video library.

Differential Privacy

Apple have found a way of using information from millions of Apple users to power services without compromising any specific individual’s privacy. ‘Differential Privacy’ is a mathematical method that ensures privacy when sharing data from millions of people.

Specific mathematical equations define a specific amount of ‘noise’ to add to a single piece of data. This noise makes the original data associated with a specific person impossible for anyone - including Apple - to decode. The trick is that when hundreds of thousands of pieces of unbreakable encoded data are combined together, there are statistical measures that will be able to detect trends amongst all the results. Apple will have no way of knowing what an individual value was, but will have an accurate representation of the distribution of all the original values over a large population.

This is how Apple is able to use the large amount of private information it has access to in order to provide intelligent services. The original mathematical paper: “The Algorithmic Foundations of Differential Privacy.”
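Apple haven't published their exact mechanism, but the classic illustration of the principle is ‘randomised response’: each device lies with some probability, so no single report can be trusted, yet the true proportion can be recovered from a large enough sample. A sketch of the idea (the parameter names are my own, not Apple's):

```python
import random

def randomised_response(true_bit, p_truth, rng):
    """Report the true bit with probability p_truth; otherwise report a
    coin flip. The added noise is what protects the individual."""
    if rng.random() < p_truth:
        return true_bit
    return rng.choice([0, 1])

def estimate_proportion(reports, p_truth):
    """Invert the noise: E[report] = p_truth * q + (1 - p_truth) * 0.5,
    so solve for q, the true proportion of 1s in the population."""
    mean = sum(reports) / len(reports)
    return (mean - (1 - p_truth) * 0.5) / p_truth

rng = random.Random(42)
truth = [1 if rng.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomised_response(bit, 0.75, rng) for bit in truth]

print(round(estimate_proportion(reports, 0.75), 3))  # close to the true 0.3
```

Any individual report is deniable, but across 100,000 devices the aggregate estimate lands very near the real value.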

Messages and iMessage Apps

Messages in iOS and macOS will get a big upgrade this year. Apple will provide a range of stickers and animations that people can use in conversations. For example, ‘Invisible Ink’ will keep an image blurred until each person in the conversation swipes over the picture. People will also be able to annotate other people’s messages and pictures, and add animation to speech bubbles, emoji and pictures.

As well as Apple-supplied animations and effects, third-parties will be able to make iMessage Apps to do more with messages. 

I hope Apple define a new graphic and animation file format for Messages that can be applied in other applications, such as Photos, Keynote, iMovie and Final Cut Pro. A metadata-driven format will display differently depending on the device showing the graphics. This will be useful when videos are made up of objects: video clips, images and metadata that tells the playback compositing software how to present the story.

If Apple start presenting Messages as a place for ad-hoc group-based collaboration (for play or for work), there should be a place for video.

WWDC session Part 1 and Part 2

Recording and playback of multiple simultaneous video streams

Created for those who want to record on-screen gameplay for later sharing online, ReplayKit for iOS now adds simple live streaming, plus the ability to record the players themselves commentating using a front-facing camera. This means a standard UI for viewers to switch between ‘angles’ in a playback stream whenever they want.

A new file system: APFS

The Apple File System is designed for modern storage devices. The current file system - HFS+ - was designed in the era of floppy discs; APFS is designed for Flash/solid state storage. HFS+ is known to degrade over time - normal day-to-day usage can result in files getting lost. APFS is designed for recoverability: it will be much easier to get at ‘deleted’ data, and it will handle backups much more smoothly.

As with Final Cut Pro X projects, the state of whole drives or parts of drives can be captured in a Snapshot.
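Copy-on-write is what makes such snapshots cheap: a snapshot copies only the map from file names to data blocks, never the blocks themselves. A toy illustration of the principle (my own sketch, not Apple's implementation):

```python
class CowStore:
    """Toy copy-on-write store. Writes allocate fresh blocks, so a
    snapshot - a copy of the name-to-block map - stays valid forever."""

    def __init__(self):
        self.blocks = {}     # block id -> data
        self.live = {}       # file name -> block id
        self.snapshots = []  # saved name-to-block maps
        self._next_id = 0

    def write(self, name, data):
        # Never overwrite in place: old blocks survive for snapshots.
        self.blocks[self._next_id] = data
        self.live[name] = self._next_id
        self._next_id += 1

    def snapshot(self):
        # Cost is proportional to the number of files, not the data size.
        self.snapshots.append(dict(self.live))
        return len(self.snapshots) - 1

    def read(self, name, snapshot_id=None):
        table = self.live if snapshot_id is None else self.snapshots[snapshot_id]
        return self.blocks[table[name]]

store = CowStore()
store.write("project.fcpbundle", "version 1")
snap = store.snapshot()
store.write("project.fcpbundle", "version 2")

print(store.read("project.fcpbundle"))        # version 2
print(store.read("project.fcpbundle", snap))  # version 1
```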

A new file system doesn't mean a new Finder. It means that applications that spend most of their time manipulating files - like the Finder - will need to be updated to understand the new ways of organising documents and applications on storage devices.

Apple’s programming guide to the Apple File System. Ars Technica on APFS.

Important: APFS is released as a Developer Preview in macOS Sierra 10.12, and is scheduled to ship in 2017.

Better colour

The new Wide Color system framework will add wide colour gamut image capture and manipulation to iOS and macOS. Following on from its introduction in recent iMacs and iPad Pros, Apple have settled on the DCI-P3 gamut - the standard colour space used to specify the colours used in US cinema projection. Some think Adobe RGB would have been a better choice.

Sharing private data via CloudKit

CloudKit Web Services Reference:

You use the CloudKit native framework to take your app’s existing data and store it in the cloud so that the user can access it on multiple devices.

Currently any data that is stored in the cloud using Apple’s CloudKit framework is either public or private. This year CloudKit in Apple OSs will add the ability for iCloud users to share data amongst themselves.

This would be very useful for post production applications. For example Final Cut could upload proxy versions of all media (or media used within a specific project) so that collaborators would be able to have a live timeline to work with.

WWDC 2016 session.

QuickTime in Apple OSes

QuickTime as a container for video and audio files has a great future. The AVFoundation framework is the basis of Apple software that records, manipulates and plays QuickTime documents (amongst other file formats).

QuickTime the software framework is deprecated in macOS. This means that applications that use the QuickTime API will still work in macOS Sierra (10.12), but may not work in a future version. There is no way yet to know if Final Cut Pro 7 will work in macOS Sierra, but my guess is that it probably will.

As part of building applications, Xcode - Apple’s development system - checks to see if the code uses old or deprecated OS features. It uses an API Diffs file to look at all code. The QuickTime part shows that the API headers have been removed. The AVFoundation part shows a lot has been added.

QuickTime the API has been deprecated for a while. Removing the headers means that applications can no longer compile if their code uses the old API. Applications already compiled on older OSes will still work in macOS Sierra.

Once again, the file format lives on. The part of Apple OSes that manipulates media - called QuickTime - will eventually be replaced by AVFoundation. This shouldn’t be a problem for Mac users of old applications for now, but remember that one day they will not work in a future version of macOS.

Apple and the future of media

Apple didn’t make any announcements directly relevant to post production. There was no mention of Retina displays, 4K, VR or 360° video.

On the other hand they laid some interesting foundations for collaboration. One day we might look back at this week and see elements vital to a new product or service introduced in coming months and years.

I'm looking forward to seeing what happens next.
