New video-related Apple patents - 23rd May 2017

Tuesday, 23 May 2017

Apple have just been awarded patents covering frame rate conversion detection and shot stabilization.

9,661,261: Video pictures pattern detection

For a video that has been converted from one frame rate and format to another frame rate and format, the application detects the conversion method that has been used in the conversion of the video.
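
Detection like this usually comes down to spotting the cadence a conversion leaves behind. As a minimal sketch of the general idea - not Apple's patented method - here is how you might spot the repeated frames left by a simple 24 fps to 30 fps pulldown, using OpenCV and NumPy:

    # Sketch: find the repeated-frame cadence left by a 24 -> 30 fps
    # conversion by measuring how much each frame differs from the last.
    import cv2
    import numpy as np

    def frame_differences(path, max_frames=300):
        """Mean absolute difference between consecutive frames."""
        cap = cv2.VideoCapture(path)
        diffs, prev = [], None
        while len(diffs) < max_frames:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
            if prev is not None:
                diffs.append(float(np.mean(np.abs(gray - prev))))
            prev = gray
        cap.release()
        return np.array(diffs)

    def looks_like_pulldown(diffs):
        """24 -> 30 fps repeats one frame in every five, so one difference
        in each group of five should be near zero, at a constant phase."""
        near_zero = diffs < 0.2 * np.median(diffs)
        for phase in range(5):
            hits = near_zero[phase::5]
            if hits.size and hits.mean() > 0.8:
                return True, phase
        return False, None

A real detector also has to cope with interlaced 2:3 pulldown, scene cuts and noisy sources - presumably where the patented cleverness lies.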

9,661,228: Robust image feature based video stabilization and smoothing

The method matches a group of feature points between each pair of consecutive video frames in the video sequence. The method calculates the motion of each matched feature point between the corresponding pair of consecutive video frames. The method calculates a set of historical metrics for each feature point. The method, for each pair of consecutive video frames, identifies a homography that defines a dominant motion between the pair of consecutive frames.
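
That pipeline - match feature points, measure their motion, fit a homography to the dominant motion - maps closely onto standard computer vision tools. A rough sketch of the per-frame-pair step using OpenCV (the patent's actual algorithm, with its historical metrics, will differ):

    # Sketch: estimate the dominant motion between two consecutive frames
    # from matched feature points, as in feature-based stabilization.
    import cv2
    import numpy as np

    def dominant_motion(prev_gray, curr_gray):
        # 1. Find feature points in the previous frame.
        pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                           qualityLevel=0.01, minDistance=8)
        # 2. Match them into the current frame with pyramidal Lucas-Kanade.
        pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                       pts_prev, None)
        good_prev = pts_prev[status.ravel() == 1]
        good_curr = pts_curr[status.ravel() == 1]
        # 3. RANSAC keeps the consensus ("dominant") motion and rejects
        #    feature points sitting on independently moving objects.
        H, inliers = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)
        return H, inliers

Smoothing the resulting sequence of per-frame homographies over time, rather than trusting each one individually, is what turns motion estimation into stabilization.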

THX innovator named in new Apple audio patent

Monday, 22 May 2017

It is curious that iMovie for the Mac offers an auto-audio ducking feature, but Final Cut Pro X doesn't. Curious because the free iMovie and the $299 Final Cut Pro X share a great deal of code and resources.

Audio ducking reduces the volume or dynamic range of other channels to make one channel’s sound easier to hear.

Tom Holman is a veteran movie sound expert. Lucasfilm's THX cinema sound certification system is named after him. Since 2011 he has worked for Apple. Last week Apple was awarded a patent with his name on it: ‘Metadata for ducking control.’ It describes a process where audio is analysed to generate metadata on how to adjust other audio channels at the point of playback:

Application of these ducking values may cause (1) the reduction in dynamic range of ducked channels/channel groups and/or (2) movement of channels/channel groups in the sound field. This ducking may improve intelligibility of audio in the non-ducked channel/channel group. For instance, a narration channel/channel group may be more clearly heard by listeners through the use of selective ducking of other channels/channel groups during playback.
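
The playback-side arithmetic is simple once the metadata exists. Here is a toy sketch of deriving and applying a ducking gain - the -12 dB figure and the threshold are invented for illustration, and the patented scheme would carry the gain decisions as precomputed metadata rather than derive them at playback:

    # Sketch: attenuate other channels wherever the narration envelope
    # is above a (made-up) speech-presence threshold.
    import numpy as np

    def ducking_gains(narration, rate, duck_db=-12.0, window_s=0.05):
        """One linear gain per 50 ms block of the ducked channels."""
        block = max(1, int(rate * window_s))
        n = len(narration) // block
        env = np.abs(narration[:n * block]).reshape(n, block).max(axis=1)
        speaking = env > 0.05  # assumed threshold on a -1..1 float signal
        return np.where(speaking, 10 ** (duck_db / 20.0), 1.0), block

    def apply_ducking(channel, gains, block):
        out = channel.copy()
        for i, g in enumerate(gains):
            out[i * block:(i + 1) * block] *= g
        return out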

I hope this kind of metadata will be generated, read and written by Final Cut Pro X and carried in QuickTime files soon!


VR News news - The state of the art in 2017

Thursday, 11 May 2017

Zillah Watson, a news producer with more VR experience than most, has written a report on VR and news broadcasting for The Reuters Institute for the Study of Journalism.

One point in the executive summary calls for news people to join together to lobby the tech world to reduce the walled gardens and create better hardware and standards for wider VR adoption.

Some excerpts:

The proliferation of content created through experimentation is solving some of the challenges involved in VR/360 storytelling. Journalists and news organisations are devoting more time to thinking about what works in VR, and as a result news VR is expanding beyond its early documentary focus. However, most news organisations admit that there is still not enough ‘good content’ to drive an audience.


360 may be a good short-term solution to increasing the availability of content. Alongside developments in storytelling, we see some impressive attempts to integrate VR across production, which across the board means that hundreds of journalists have now been trained to shoot 360.


The news industry needs to work harder at managing public expectations of VR. Playing with 360 may be fun for journalists, but the audience needs to be put at the heart of any serious future plans for VR. Audience adoption requires consumer literacy in how to engage with the new technology. Even if part of that education happens through audiences’ consumption of VR content in other areas – sport, gaming – news still has to show them why it is worth engaging with via this new medium.

Too many standards - don't count VR video out

There are too many platforms: the ‘walled gardens’ around different VR platforms makes it expensive to produce content for a range of devices. There are parallels with the early days of mobile apps, which required different builds for each. Bandwidth is also an issue for viewers consuming this content.

Platforms and device manufacturers need to up their game if they are going to get mainstream audience adoption. This includes improved hardware and common platforms to provide a frictionless user experience, and lower costs for headsets and bandwidth. 

The news industry needs to work together on this to present a united front when lobbying the tech platforms.

[Emphasis mine]

Although many see 360/VR video only as a gateway to 'full' VR, I wonder if the multiple VR platforms will coalesce faster than video - including 360 video - gets richer. Flat rich video will eventually be broadcast as objects: a cloud of video, audio, text and 3D objects that can be played back or even interacted with using standards-based players on the internet and on set-top boxes. Once that works with flat video, it could make 360 more interesting, which could lead to a full VR standard.

Broadcasters and technologists’ report on VR

Wednesday, 03 May 2017

DVB (Digital Video Broadcasting) is an industry-led consortium of the world’s leading digital TV and technology companies, such as manufacturers, software developers, network operators, broadcasters and regulators, committed to designing open technical standards for the delivery of digital TV and other broadcast services.

Late last year DVB commissioned a report (PDF) to see whether they should set up a group to define a standard for VR to be used with digital broadcasting. Here are some quotes:

We first look at the market segmentation between the tethered devices (Oculus Rift, HTC Vive), game platforms (Sony PS VR) and untethered devices (Gear VR, Consumer HMD, Cardboard). We predict that untethered devices will be 10x the volume of tethered ones, which will appeal more to the gamers' community.


We assess the size of the market on the device side considering different market researches available on a 2020 horizon. A medium scenario shows $20B revenue in 2020. This is followed by a market sizing of the VR Video services on a 2020 horizon.

We estimate that by 2020, VR will generate between $1.0B and $1.4B revenue, the largest application being Live sports. VR Theme Parks & VR arcade games will be a lucrative business for both games and video and will, just as GPS was democratized with car rental, help evangelize VR.


  • Principal bodies involved in VR standardisation include ISO/IEC JTC1 (MPEG, JPEG) and DASH-IF, and possibly ITU-T and ITU-R in future. It is not clear how their activities overlap, or which may become the dominant standards for VR.
  • MPEG are developing an Omnidirectional Media Application Format (OMAF) standard, as well as a Media Orchestration (MORE) interface for video stitching and encoding, and are considering tiling mechanisms for region of interest encoding (using a dual layer SHVC approach).
  • JPEG are developing various file formats including: JPEG XT (omnidirectional photographs), JPEG XS (low-latency compression format for VR), and JPEG PLENO (lightfield video format).
  • 3GPP are looking at VR standardisation for wireless mobile services, considering delivery of VR video content through current as well as 5G systems.
  • DASH-IF are planning tests and trials of VR delivery using DASH technology.
  • A VR Industry Forum is currently being established to promote VR; it may develop guidelines, encourage use of common formats, and share experiences with VR.


  • It is likely the main commercial driver for tethered VR will come from gaming, whereas the main driver for untethered VR will come from immersive video for sports and music events. The demand for content will depend on its availability and quality of experience.
  • DVB should cooperate with standards bodies working in VR, as members will need to adopt common specifications for stream delivery of VR content. Requirements are needed for the minimum technical quality of VR video and audio, particularly to reduce cybersickness. Requirements should be completed within two years (mid-2018)
  • In terms of quality of service, consideration must be given to the desired frame rate, field of view, visual acuity, degree of visual and audio immersion, head tracking latency, and visual overlays
  • VR audio will need additional support, both for broadcast and broadband transmission.
  • In the short term, support is needed to avoid a multiplicity of groups and proprietary panoramic 3 degrees of freedom VR video systems, and to consider requirements for key parameters such as frame rate, resolution, use with tablets etc. For example, Sky provisionally specifies the following VR formats: video: 2-4K resolution, H.264, 25-50 fps, 20-60 Mbps bitrate; audio: stereo or ambisonic.
  • For the longer term it is recommended to continue the study mission to follow developments such as panoramic 6 degrees of freedom VR, augmented reality, and mixed reality.
  • A commercial requirements group would begin its work with a questionnaire to DVB members. In addition, the group may consider developing a DVB VR garage, where VR technologies could be neutrally badged under DVB.


Sennheiser pushing 360º audio recording with forthcoming prosumer headphones - UPDATED

Monday, 01 May 2017

A step towards ambisonic audio going mainstream: the forthcoming Sennheiser Ambeo Smart Headset headphones have microphones in each ear and (I assume) a sensor to record head position. The device encodes ambisonic audio which is sent to your iOS device to be recorded as an audio file or as the soundtrack to video you are recording. [Not correct, see update below]

Ambisonic audio records a sphere of audio - so that when you play it back, if you turn your head, the sound seems to stay in the same place. This is more like the real world where if you hear a door open to your left and turn to see who is coming in, the audio source will come from in front of you, not from your left.
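
That head-turning trick works because an ambisonic (B-format) signal can be rotated mathematically before being rendered to the ears. A toy example of counter-rotating a first-order sound field for head yaw - sign conventions vary between ambisonic formats:

    # Sketch: rotate a first-order B-format sound field (W, X, Y, Z)
    # about the vertical axis. W (omni) and Z (height) are unaffected
    # by yaw; only the horizontal X and Y components mix.
    import numpy as np

    def rotate_yaw(w, x, y, z, yaw):
        c, s = np.cos(yaw), np.sin(yaw)
        return w, c * x - s * y, s * x + c * y, z

    # To keep sources fixed in the world as the listener turns, apply
    # the inverse of the yaw reported by the head tracker:
    # w, x, y, z = rotate_yaw(w, x, y, z, -head_yaw)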


Richard Devine reports in iMore:

VR and AR is the latest hotness, and a big part of the experience there is the audio. After all, having a fully immersive, 360-degree visual experience is going to lose a lot without the necessary audio to go with it. As Sennheiser said during its brief presentation at the event, your eyes see information, your ears hear emotion.

Ambeo is the branding applied to the company's 3D audio products, and it already has one of the world's first portable VR microphones. The idea is straightforward; just as you're capturing 360-degrees of video, Ambeo captures audio in the same sphere, rather than a flat plane as you'd get with regular video content.

Upon plugging the Smart Headset into an iPhone, you're prompted to install a companion app which doesn't yet exist. It's early days still, so that's something we can overlook, but you don't need it. The iPhone detects it just fine as an external microphone and you can use it with the stock camera app or a third-party one such as Filmic Pro.

Sennheiser has not yet announced a release date. The first version will have a Lightning connector for iPhones and iPads. A following version will have a USB-C connector.

2nd May UPDATE:


Peter Hajba has pointed out on Facebook that this product will record binaural audio, not ambisonic - see Sennheiser’s more specific press release from earlier this year.

Thinking it through, it would be very impressive for this to work as I hoped without significant separation between microphones - and without a third mic!

Blackmagic Design has sights set on Avid with DaVinci Resolve 14

Monday, 24 April 2017

This morning at NAB in Las Vegas, Blackmagic Design launched the public beta of DaVinci Resolve 14.

CEO Grant Petty said they focussed on three areas:

  • Performance
  • Audio
  • Collaboration

Performance sees a new playback engine that is up to 10 times faster than previous versions. Grant showed a 4-up multicam playing four streams of H.264 video at once with no problems. He said it works as smoothly on a notebook. Playback is now instant with J K L editing and live trimming to the current frame of the clip that is playing back.

Audio is a full copy of Fairlight’s movie and TV audio postproduction software built into Resolve. Grant took a swipe at Avid’s Pro Tools by saying that Fairlight’s tools are designed specifically for post. As in Final Cut Pro X, you can edit audio at sub-frame resolution - down to the sample level.

Collaboration means that if two people are working on the same timeline, either person can load up the other’s version of the timeline and Resolve 14 will show the differences. They can then choose to accept or reject changes edit by edit. You can also use secure chat to connect to people via the local network or the internet. These chats can also be stored and kept with the project being discussed.

Other improvements include tools to make it easier for non-colourists to grade footage. One tool can analyse footage to detect faces and then make changes to those faces using controls based on human features - such as changing the eyes, nose or mouth.

Editing

  • Bins can have their own windows, footage can be dragged from the Finder into them, and clips dragged between bins
  • Smart bins list clips based on metadata
  • Source monitor can also show audio waveforms while watching video
  • Dynamic trimming with JKL keys
  • Trim multiple clips at once
  • Save commonly used transition settings to the effects library
  • Marker names and comments can be overlaid in viewers
  • Named marker ranges from a source clip can be edited from bins to the timeline

Find out more about editing in Resolve 14 in Blackmagic’s YouTube video. There are also YouTube videos on New Features and Colour.

The price now matches Final Cut Pro X: $299 down from $999. Since Blackmagic released their first version of Resolve, they haven't charged for upgrades. 

The open Beta is available for Mac and PC today from Blackmagic. Although it is called a Beta, Fairlight was only bought by Blackmagic 6 months ago, so the audio parts are more at the Alpha development stage.

I would suggest that if you like editing with a track-based NLE, you should take a good look at Resolve 14. There's a good chance it does much of what you can't easily do at the moment, and it will improve faster than Avid or Adobe can catch up.

Interesting times!

Apple hires Tim Dashwood, 3D movie and 360º video software pioneer

Sunday, 23 April 2017

Apple’s new hire is an expert in the kind of high-end post production that needs serious Apple hardware. Having recently reassured those despairing over the future of the Mac Pro, with this hire Apple show that they are serious about making sure their software keeps up.

The first sign that something was up came on Friday 21st April. The products sold by Dashwood Cinema Solutions that work on Final Cut Pro X, Motion 5 and the Mac versions of Adobe Premiere Pro CC and Adobe After Effects CC were suddenly available for free.

Now the news has gone out that Tim has been hired by Apple.

Dashwood Cinema Solutions is effectively one man: Tim Dashwood, creator of post production tools for the Mac for over 10 years. He first gained attention from Final Cut Pro users when he introduced plugins for making 3D feature films. In recent years he’s been making 360º video toolkits for Final Cut Pro and Premiere. His high-end toolkit cost $1,199 until Friday. It is used for high-end 3D VR video post production at places such as Jaunt, where it was used to make Pure McCartney VR. (For more on how these films were made, read my article over at fcp.co.) Tim and I feature in this video on how VR video works in Final Cut Pro X with his plugins.

This is good news for those who use Macs to make VR video. As the tools are now free for editors and motion graphics designers, the barrier to entry is that much lower. It is unknown whether Tim has joined the ProApps team to work on future versions of Final Cut Pro X and Motion or has joined the OS team. If he is now with the OS team, it is possible that his technology (which is partially implemented in Quartz Composer) will be made available to developers on Apple platforms via updates to the AV Foundation media toolkit.

Although Dashwood's 360VR Toolbox works well on current MacBook Pros, everyone agrees that the more GPU power you have, the better. Hiring Dashwood suggests that Apple expect to add much more GPU horsepower to Macs and perhaps iOS devices.

Good news for video editors, motion graphics designers and those wanting to tell their stories with VR video.

Apple have updated their ProRes White Paper

Tuesday, 18 April 2017

In April 2017 Apple updated their white paper on their ProRes codec family. Here are the changes.

An addition about recent changes in Final Cut Pro X:

A variety of cameras can now capture and record a wider gamut of color values when working in log or raw formats. You can preserve this wider color gamut by recording with the ProRes LOG setting on certain cameras such as the ARRI ALEXA or transcoding from the RED® camera’s REDCODE® RAW format. Final Cut Pro 10.3 or later can process color in wide color gamut and output Apple ProRes files in the Rec. 2020, DCI-P3, or D65-P3 color space. This results in deeper colors and more detail, with richer red and green areas of the image.

With Final Cut Pro 10.3 or later, you can also export Apple ProRes files inside an MXF metadata wrapper instead of exporting .mov files. This makes the exported video file compatible with a wide range of playback systems that rely on the MXF standard for broadcast and archiving.

The graph for how many multicam streams are simultaneously supported at HD and 4K on a MacBook Pro has been updated.

Testing conducted by Apple in March 2014 using shipping 15-inch MacBook Pro with Retina display quad-core 2.6GHz units with 1TB flash storage, 16GB of RAM, NVIDIA GeForce GT 750M graphics

Testing conducted by Apple in January 2017 using shipping 2.9GHz quad-core Intel Core i7-based 15-inch MacBook Pro systems with 2TB SSD, 16GB of RAM, Radeon Pro 460 graphics

They also made some changes to the Target Data Rates table: 5120 x 2160 has been replaced by 5120 x 2700, and new rows have been added for 6K and 8K. The updated table shows both 5K resolutions alongside the new 6K and 8K rows.

Avid vs. Adobe vs. Blackmagic vs. Apple - A mild skirmish before a real war

Tuesday, 18 April 2017

Those involved in trying to make sure ‘their’ NLE wins the ‘war of post production’ would do well to take notice of the big changes coming to post over the next few years. Amazon, Google, Facebook, Microsoft - and maybe the wider Apple - may come along and make the current battle seem small.

Amazon have now rebranded their Elemental acquisition as ‘AWS Elemental, an Amazon Web Services company.’

Elemental, which makes high-speed video encoding and transcoding software to enable multiscreen content delivery across different devices, was founded nine years ago in Portland, Ore., and counts among its many customers ABC, BBC, Comcast, Ericsson, and ESPN.

AWS Elemental is first being marketed as a multi-platform distribution technology company. Amazon have been making more purchases in the media area. They also now own Thinkbox, who make cloud-based and on-site graphics rendering tools for VFX.

Thinkbox Software has provided creative tools and pipeline technology for both small and large scale projects including the world's largest feature films. Transformers: Dark of the Moon, Thor, Green Lantern, Harry Potter, Avatar, Tron, GI-Joe the Rise of Cobra and hundreds of other films have utilized our software in front of and behind the screen. Our tools have been at the core of award winning projects, music videos, commercials and hundreds of hours of creative content spanning film, broadcast, commercial, marketing, games and web content.

If high-end VFX is something that Amazon think worth getting into, maybe general purpose video editing will come soon.

Which NLE UI and technology would be a good purchase for Amazon? 

UK TV assistant editor tries Final Cut Pro X

Friday, 14 April 2017

Chris Chapman is an edit assistant who works on UK prime time soap opera Emmerdale. He has written a detailed blog post on his impressions based on two weeks trying out Final Cut Pro X.

I chose to download the FCPX 30 day trial to edit a project with, and see how I liked it, this blog post is essentially a breakdown of how I worked, what I liked and disliked about FCPX, and if I think I'll move over.  I think my situation is probably similar to many others, who edit occasionally at home and are looking for the most cost effective solution to work with; especially as students have just had their Adobe CC subscriptions almost doubled recently. 

He had positive things to say…

…the option for creating Optimised and Proxy Media is excellent.  This very simple, but very useful, feature is truly brilliant.  It essentially creates ProRes media in the background to put all media on the same level playing field: 'Optimised Media' up's lower quality footage (such as H.264) to ProRes 422 to give a better editing experience, 'Proxy Media' creates ProRes Proxy media to improve playback for higher quality footage you struggle to run at full resolution (such as R3D, RAW or 4K).  I found the background rendering to be very fast when I tested it, but I was running mostly 720p or 1080p media that was natively H264 files from an iPhone, and typically an average of 1-2 minutes a clip.  

The biggest advantage of this elegant feature is the removal of the conform process, gone are the days of keeping track of all media throughout an edit,  and relinking your original media when you have locked your final offline sequence, and in worst case scenarios having to manually re-lay all media on a sequence with the high res native media.  With two mouse clicks you can playback a full resolution edit, and be reconnected to your native media instantly 


There were many other functions in FCPX that were very useful when cutting a vlog, a big one was the speed changes I often do to footage Siobhan and Emma shot as timelapses.  This was especially easy in FCPX as there are easy to reach functions for this, one option being below the viewer, it is a 'Retime Editor' who's icon looks like a 'speedometer'.  The quickest way to 'fit a clip' into a certain space or time gap, is to highlight a clip on the timeline and press 'cmd+R', this brings up the 'Retime Menu' which gives you a percentage above the clip.  You can now either use the dropdown to select a certain speed (50%, 25%, 2x, 4x) or a custom percentage.  Or even faster, drag the right edges of the Retime Menu, as if you were trimming the clip and this will speed the clip up, or slow it down to fit the space used by it.


Outside of what I was cutting, I'd imagine FCPX would handle mixed media well, mostly because of it's function to create optimised and proxy media upon import, but also because of it's impressive ability to cache and render on the fly as you work; this does come at a price.  When normally cutting these vlogs, in Premiere CS6, I would be running with about 50% of my 24GB of RAM in use, with FCPX it was closer to 90% and sometimes more.  This is likely down to the program's use of my full machine, and also the constant background rendering, waveform creation, poster frame creation and a more complex program in general.  The rest of the machine however didn't suffer if I jumped to another program briefly such as: Finder to copy a file, Safari (often to google a simple function), or even Premiere (to check how I styled things in past videos).  


…and things he did not like:

[3-point editing] appears to be very difficult also due to the requirement of 'gap', which as far as I'm aware is a main staple in most people's editing arsenal.  FCPX allows you to put down in's and out's on the timeline but they have to be placed over existing media, or gap.  This makes accurate 3-point edits feel difficult for anything other than replacing clips which can be done easier by dragging a new clip over an old one.  More and more I feel FCPX wants you to be throwing clips in roughly at first and finessing on the timeline, rather than being accurate from the start.  A way around this would be to lay 'gap' at the end of your timeline at all times to allow some breathing space, this is however more of a work-around then a true problem solver; but I don't see this style of timeline changing anytime soon.


Also another quirk is the lack of sync markers to warn you, if a shot slips out of sync with it's original audio; this is a worry for me and could potentially be a huge issue with an accidental slip of an edit - I feel the lack of sync markers is Apple's arrogance that as long as you edit how they suggest you do, it'll never go wrong, so why worry you with it.  


I do still feel that FCPX isn't designed or appropriate, for the mass pro market though.  And I don't see it ever replacing the staples that are Avid and Premiere; it doesn't handle multiple users and mass data banks with hard ware like Avid does, and it's lack of integration with After Effects amongst other programs, will be a drawback for many; such as small creative studios.

Those with years of experience of using Final Cut may find much they disagree with here. They might think that Chris doesn't know how to do things ‘properly,’ and is therefore unqualified to make these judgements. In reality, he represents exactly the kind of person we want to be trying Final Cut, and Apple would do well to pay attention to what people like him think. Smoothing all sorts of ‘onboarding’ stories is what Apple and other concerned parties need to do. It is important to realise that this report is based on what he gathered while trying Final Cut for two weeks. I hope the Final Cut community is standing by to assist as and when needed. We will see if he changes his mind about the things he doesn’t like at the moment.

The bottom line is that although he isn't sold on some of its features, he will stick with it for projects away from his work in commercial TV.

When talking to those with a great deal of experience in another NLE, I suggest that they try Final Cut by doing a project that is very different from what they usually do. Feature editors should try to make a music video. Documentary editors should try to cut a short film. Fast turnaround editors should try to cut a short documentary. That means they will be able to use Final Cut’s ways of doing things without constantly comparing them to the way they use Media Composer or Premiere every day.

In the past I've been quite against FCPX and not interested in 're-training' so to speak, but as I'm being pushed to my limit with CS6 I have to make a decision at some point, as to which NLE to move onto for projects I work on alone.  In all honesty, I can see me using FCPX for these projects.

I'm looking forward to seeing what he says in six months. Perhaps in the meantime, another person who is experienced in Media Composer and Premiere could report how their point of view has changed over the months of using Final Cut Pro X!

UPDATE: A very useful, detailed comment has been added below the post by Mathieu Ghekiere, giving his personal take and pointers for finding out more:

Some notes from someone who went from a place of hate to a place of love and experience (woaw, that sounds a bit new age, didn't mean it like that!), you should also keep in mind that it will take a while for new muscle memory to get created in how the timeline reacts, how you organise stuff, etc. ... In my experience this started after about 3 months of editing.

[…]

Hope some comments could help you along the way. Happy editing!


With version 3, Thunderbolt is finally a cross-platform hit

Friday, 24 March 2017

Thunderbolt 3's recent success is good news for high-end post production.

When Apple replaced their (then) high-speed FireWire 800 connections with Thunderbolt ports in 2011, pro users were happy with the many improvements. Sadly, USB 3 turned out to be 'good enough' for the vast majority of PC users. That prevented a competitive market for Thunderbolt devices from developing, which meant external drive and capture card prices didn't come down significantly.

The situation didn't improve with Thunderbolt 2, despite it doubling bandwidth. It was for ‘Mac people who do postproduction' - not a large enough market for big economies of scale.

It looks like version 3 of Thunderbolt will finally go mainstream. It now has 2,750 MB/s of PCI Express bandwidth - more than double even 10 Gb/s USB 3.1.

As well as Apple including it in recent MacBook Pros, Intel's Thunderbolt 3 buyers guide shows 52 PCs implementing Thunderbolt 3.

Bare Feats have been testing external GPU chassis that connect to Macs via Thunderbolt 3.

Bare Feats are working on benchmarking a Sonnet eGFX Breakaway Box. I'll link to that report here soon.

QNAP have launched a Thunderbolt 3 NAS:

By supporting the SMB protocol, Final Cut Pro X 10.3 allows users to create a library on a NAS volume and use it as if it were on a local storage device. This simple-yet-important change allows users of Final Cut Pro X 10.3 (Mac users) and users of Adobe Premiere Pro (Windows users) to centrally store their video materials in the same shared folder on the NAS, greatly improving productivity for highly-collaborative projects in multi-workstation environments.

Simultaneous online editing by multiple users is made possible as the newly-added SMB protocol on QNAP NAS runs faster and more stable than the NFS protocol used in previous versions. Multiple Final Cut Pro users can edit different events and projects for the same media files simultaneously and combine each piece later to speed up the editing process, making it especially ideal for fast-moving media environments. Collaboration is also highly simplified now that Mac and Windows users can share files through SMB networks for separate steps of production work.

Felipe Baez points out that QNAP connects Macs to their NAS using Thunderbolt networking. From the same QNAP page:

Directly connecting a QNAP Thunderbolt 3 NAS to a computer establishes a peer-to-peer (P2P) network and enables 20GbE connectivity.

Detailed QNAP data sheet PDF on their Thunderbolt 3 NAS.

It is good news that QNAP consider Thunderbolt networking reliable enough to be the basis of a professional product. Intel has a PDF on networking PCs using Thunderbolt, with a few notes on networking Macs using Thunderbolt:

  • To be able to communicate with another Windows computer through Thunderbolt networking, the Apple computer should share the same subnet as the other computer.
  • In a multiple communication environment, Apple computers act as bridges. Therefore any Apple computer in a networking chain should share the same subnets as its connected peers.

Apple has a support note on Thunderbolt networking over USB-C:

  • Be sure to connect your Mac directly to the Thunderbolt 3 computer and not through a USB-C hub. USB-C hubs don't support Thunderbolt 3 connections between their ports
  • Make sure that the USB-C cable that you're using supports Thunderbolt 3. Not all USB-C cables support the requirements of Thunderbolt 3. For example, a USB-C charge cable doesn't support a Thunderbolt 3 connection

Apple’s support note on shared storage with Final Cut Pro X 10.3.

It is good that there are high-end solutions for Thunderbolt 3, but it is important to remember that these are driven by a market of game enthusiasts who want to run advanced games on - or even watch TV on - 4K monitors. A post by Jeff Atwood covers using a $500 external Thunderbolt enclosure to add a $600 video card to a $1,000 game PC.

Playing games at 1080p in my living room was already possible. But now that I have an incredible 4k display in the living room, it's a whole other level of difficulty. Not just twice as hard – and remember current consoles barely manage to eke out 1080p at 30fps in most games – but four times as hard. That's where external GPU power comes in.

...

40Gbps is, for the record, an insane amount of bandwidth. Let's use our rule of thumb based on ultra common gigabit ethernet, that 1 gigabit = 120 megabytes/second, and we arrive at 4.8 gigabytes/second. Zow.

That's more than enough bandwidth to run even the highest of high end video cards, but it is not without overhead. There's a mild performance hit for running the card externally, on the order of 15%. There's also a further performance hit of 10% if you are in "loopback" mode on a laptop where you don't have an external display, so the video frames have to be shuttled back from the GPU to the internal laptop display.

This may look like a gamer-only thing, but surprisingly, it isn't. What you get is the general purpose ability to attach any PCI express card to any computer with a Thunderbolt 3 port and, for the most part, it just works!
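
His arithmetic is easy to check, treating the quoted performance hits as simple multipliers:

    # 40 Gb/s at the rule-of-thumb 120 MB/s per gigabit, then the quoted
    # ~15% external-GPU hit, and a further ~10% in laptop loopback mode.
    raw = 40 * 120               # 4800 MB/s
    external = raw * 0.85        # ~4080 MB/s effective
    loopback = external * 0.90   # ~3672 MB/s driving the internal display
    print(raw, round(external), round(loopback))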

Keep up with Thunderbolt 3 at Intel’s Thunderbolt site.

20,000 word Final Cut Pro X high-end postproduction guide

Monday, 06 March 2017

First came Mike Matzdorff’s Final Cut for feature films book: Final Cut Pro X: Pro Workflow (Apple iBook / Amazon Kindle).

Now there is a series of articles on fcp.co by Sam Mestman and Patrick Southern that cover high-end postproduction workflow. They include everything from how to handle 6K RED RAW footage on set to final DCP delivery for cinema showings and worldwide distribution.

Part 1 covers on-set post production including workflows for the camera department, the DIT/Assistant editor, production sound and script supervisor:

This 5 part series should be looked at as a cheat sheet on how to make a movie, pilot, or doc without limits in the modern age.

[…] Everything you are about to read has actually been done with a real world project called Off The Grid, which is We Make Movies’ first original TV Pilot that premiered at the Sundance theater in Hollywood. 

Part 2 describes how to save time when preparing footage for editing:

On Off The Grid, we took the search and organizational aspects of FCPX to another level. We automated most of the metadata management, applying that data to the original sound and picture.

We were able to automatically synchronize audio, batch rename clips, and add keywords and notes based on our Script Supervisor's log. This made it possible to quickly search by character, frame rate, frame size, shot composition, and circle takes.

[…]These automated organizational techniques can cut footage prep time from 3 days down to as little as 10 minutes
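
The articles don't publish their scripts, but the flavour of this kind of automation is easy to imagine: parse the script supervisor's log, then generate clip keywords for delivery to Final Cut Pro X as FCPXML. The sketch below assumes a simple CSV log layout and hugely simplifies the FCPXML schema - it is not the Off The Grid pipeline:

    # Sketch: turn a script supervisor's CSV log into (clip, keyword)
    # pairs ready to be written into FCPXML <keyword> elements.
    # Column names are assumed; real logs and FCPXML are more involved.
    import csv

    def keywords_from_log(log_path):
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                yield row["clip"], "Character: " + row["character"]
                if row.get("circle_take", "").strip().lower() == "yes":
                    yield row["clip"], "Circle Take"

    for clip, keyword in keywords_from_log("script_log.csv"):
        print(f'{clip}: <keyword value="{keyword}"/>')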

Part 3 shows how to maximise the editing experience in Final Cut Pro X:

In other NLEs, most editors either spend a lot of time making and renaming subclips, or pulling selects into a timeline for review. These both require lengthy prep and are more difficult to work with than necessary. Subclips don't give you easy access to the full-length clip, and long string-outs can be difficult to navigate for a specific clip.

None of this is necessary in FCPX. You can use a variety of tools within FCPX to find what you’re looking for. You can leverage the search bar, Smart Collections, Favorites, Rejects, Keywords, notes fields, and Markers to help you filter your choices to display exactly the thing you're looking for.

Part 4 details workgroup workflow and finishing:

It’s widely known that most shared storage systems have a hard time once you start dealing with 4K, 6K, and VR.  Less widely known is that most are also not optimized for the small database files that FCPX relies on or for its libraries and cache.

To work optimally on shared storage, FCPX Libraries need storage that is optimized for many micro interactions AS WELL as being optimized for high resolution codecs and framerates.

Part 5 finishes off with collaborating with colour, VFX and audio professionals:

When everyone has access to the same storage and knows how to speak each others’ language, completion of a project becomes exponentially faster. This results in happier teams, clients, and budgets. Ending the Tower of Babel of post production allows you not only save time and money, but it also allows you to put more of that time and money where it matters most…the craft of storytelling.

In summary: a great resource for those planning to make (or working on) a TV series or feature film with a workflow that has Final Cut Pro X at the centre.

 

Apple royalty-free audio library elements can be used in commercial productions

Wednesday, 15 February 2017

An important part of preparing any video or animation production for distribution is checking that it is copyright-cleared. Funders and distributors need to make sure that productions that become successful don’t lead to lawsuits from aggrieved copyright holders. 

As well as video clips, audio must be cleared: music and sound effects are subject to copyright. Google’s YouTube runs software that can recognise music in your video that belongs to someone else. Sometimes this automated process flags music that is actually royalty-free - copyright cleared.

If you do use royalty-free music or sound effects, it is a good idea to keep a note of the permission you have to use them in commercial projects. Apple have support documents that are helpful here. They say that you can use their royalty-free content in any way - apart from as individual source files, be they samples or music loops: you cannot take all or part of their library and sell it to anyone else, or make it available to anyone for free:

You may broadcast and/or distribute your own soundtracks that were created using the Sample Content, however, individual files may not be commercially or otherwise distributed on a standalone basis, nor may they be repackaged in whole or in part as clipart, stock animation, audio samples, sound files or music beds.

Here are the relevant support notes on this subject for the royalty-free content that comes with Final Cut Pro X and Motion, GarageBand, and Logic Pro X/MainStage 3.

If YouTube erroneously flags your soundtrack because you've used some content from a royalty-free library that comes with an Apple product you've bought, you can dispute the ‘Content ID claim’ by appealing on the YouTube site itself. WikiHow has an article on appealing this sort of thing. In the ‘I believe the copyright claim is not valid because’ section, you would click the ‘I have a license or written permission from the proper rights holder to use this material’ option and provide a link to the relevant Apple support document for the application you used to make your video.

Proper Final Cut Pro X competition - Snap?

Tuesday, 14 February 2017

Adobe and Avid are very unlikely to come up with any product or service that competes with Final Cut Pro X. Apple certainly don't act as if they have any competition. Avid is stuck at the high end, and Adobe squandered their chance to be the main supplier of video literacy products and services to the wider public.

That isn't great news for Final Cut users. There is no-one to put the ProApps team under pressure. Apple act as if their plan is fine and they are doggedly sticking to it, because there is no serious competition. This isn't because of ‘iPhone money’: with almost 2 million copies of Final Cut sold so far, there is no reason to think it hasn't returned a good profit over the first five and a half years of its life.

If Final Cut Pro X had had serious competition since 2011, it would have a compelling built-in collaboration solution today.

As millions have free access to iMovie on macOS and iOS, Apple are spreading the gift of video literacy by default. Imagine if Apple had had some serious competition on that front over the last few years. Adobe’s Premiere Clip for iOS wasn't it.

Now Snap Inc. - providers of the Snapchat service and Snap Spectacles - are going public. The good news for Apple fans is that Snap are telling potential investors that they plan to implement Apple’s strategy. Not explicitly - the similarity has been pointed out by Ben Thompson in his Stratechery weekly post:

To summarize, Snap’s strategy is to:

  • Deliver innovative and differentiated products that…
  • Cost a lot to deliver but…
  • Capture the best customers…and PROFIT!

That’s definitely not Twitter; indeed, the real analogy for Snap is from another part of technology entirely: it’s Apple.

Snap plan to use their insight into users to give them what they need - and to provide advertisers with access to the best customers.

Luckily for iMovie and Final Cut Pro X users, Snap have no baggage holding them back: they are in a position to invest in tools for creativity for all sorts of users - without the precondition of using a 20th century metaphor, or of needing to sell storage and service contracts.

Ben quotes Snap:

We believe it’s always worth trying to build something that will empower people to express themselves, live in the moment, learn about the world, and have fun together — even when it’s not clear that what we build will be successful or make money.

That’s the kind of competitor Final Cut Pro X users need around! I hope they do step up and try to change the world. Luckily there are a few others out there who might also take on the same role. I hope they all wake up soon!

On-location editing where the editor is the technician

Wednesday, 08 February 2017

A new case study from Peter Wiggins of fcp.co shows that editors can manage their own workgroup shared storage - setup and maintenance:

The job needed two edit machines, an ingest station, a graphics station and connection to the truck EVS system.

Hiring an ISIS system for the weekend would involve an engineer attending and that would blow the budget. So I suggested that I could supply and look after a newer, faster shared storage system.

He arranged to use a Lumaforge Jellyfish video workflow server. Read the full story over at fcp.co.

Peter’s conclusion:

I can see more jobs where the editor/s are in control of the technical aspects of OB post production. They know how things should be configured, how fast things should run (and why) as they are the end users. The Jellyfish is not only a fast shared storage box, it is very easy to hook up to each client. No IT or field engineer required.

And the bottom line really is the bottom line. If a client has a fully working, fast edit system on location on a tight budget, everybody wins.

Although in this case Peter was working on fast-turnaround on-location sports editing, this model would work well for other sorts of location editing.

Modern post-production: No need for support contracts

Lumaforge’s model is to make money on the hardware and to make the system simple enough that there are no support costs. Good news for productions; not so good for those hoping to make money on service contracts, and for those companies competing for workgroup editing hardware and service sales. I'm interested to see how others compete with Lumaforge’s Jellyfish solution.

Blackmagic Design: SD cards ready for prime time

Tuesday, 07 February 2017

Blackmagic Design think that SD cards are ready for professional broadcast prime time. They have announced the HyperDeck Studio Mini.

In the last few years high-end productions have been using SSDs to record and play back source media and final produced shows.

The launch of Blackmagic Design's new HyperDeck Studio Mini shows that they think that relatively inexpensive SD cards have the capacity, speed and reliability to take over from SSDs:

a professional deck that records and plays back broadcast quality 10-bit video as ProRes files on commonly available SD and UHS-II cards. It’s packed with features like 6G-SDI for working with all formats up to 2160p30, HDMI 2.0 for monitoring, dual SD card slots for non-stop recording and a reference output

It seems that their new SD card-based record/playback deck has the features associated with hardware that cost 100 times as much four years ago.

I wonder if a connected computer can access growing ProRes files on the SD cards for live highlights editing.

Apple targets the young and educators with $199 post apps bundle

Friday, 03 February 2017

Jim Dalrymple of The Loop reports that Apple have announced a $199.99 bundle for those in education that includes full versions of applications usually priced at a total of $629.95:

The apps include Final Cut Pro X ($299.99), Logic Pro X ($199.99), Motion 5 ($49.99), Compressor 4 ($49.99), and MainStage 3 ($29.99).

A saving of $429.96.
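
The numbers check out against a $199.99 bundle price:

    # Individual prices vs. the $199.99 education bundle.
    prices = [299.99, 199.99, 49.99, 49.99, 29.99]
    print(round(sum(prices), 2))           # 629.95
    print(round(sum(prices) - 199.99, 2))  # 429.96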

The Pro Apps Bundle for Education is available for teachers, faculty, staff, and college students, as well as K12 and HiEd institutions.

Note that these are not student versions of these applications. This is a bundle of codes that are used on the Mac App Store to download full retail versions. Since 2011 Apple haven't charged for updates. This means that if students take advantage of this offer and move into post production, they will probably not need to pay any more money for the professional tools they use to earn money.

No UK educational bundle for professional applications has been announced yet. Final Cut Pro X retails for £299.99. The educational price is currently £270.


Accelerating the late 2016 MacBook Pro with an external graphics card

Thursday, 02 February 2017

Bare Feats have been testing the new AKiTiO Node eGFX Box with their late 2016 MacBook Pro. This full-bandwidth Thunderbolt 3-connected cage can connect standard NVIDIA graphics cards to the Mac. Along with many graphs showing how much better every benchmark runs with different cards compared with the best possible internal GPU from Apple (the 4GB AMD Radeon Pro 460), they conclude:

The AKiTiO Node eGFX Box is the first Thunderbolt 3 GPU expander we have tested that is macOS friendly, runs at full Thunderbolt 3 bandwidth, has a built-in power supply, and ships with a 2 meter Thunderbolt 3/USB-C cable. And with the array of compatible NVIDIA GPUs to choose from, you can be confident that it will not only support your CUDA capable apps, but will accelerate OpenGL and OpenCL capable apps beyond what the MacBook Pro's discrete AMD GPU is able to do.

Rob ‘Art’ Morgan of Bare Feats later tweeted Final Cut Pro X-specific information. Final Cut Pro X will take advantage of the external GPU in the Node eGFX box if the box has a display attached:

FCPX 10.3.2 works if external display connected to GPU in Node. During Directional Blur render, 70% load on 980 Ti, 0% on 480.

He tweeted a graph showing how much Final Cut lays off GPU processing to an external graphics card.

Looking forward to Bare Feats adding some Final Cut tests using this setup soon!

OWC DEC expansion slice for 2016 MacBook Pro update

Thursday, 02 February 2017

At the beginning of January OWC previewed their DEC expansion system for the 15" 2016 MacBook Pro. They supplied images of a device that attaches to the base of the machine.

2016 MacBook Pro from left and right showing ports on prototype OWC DEC - version 1

The image they first used didn't show how the device is designed to connect to the Mac. You may have missed an update from the Mac Performance Guide in which they had a look at a DEC prototype.

I’ve seen the prototype of the OWC DEC first hand. The DEC bolts onto the bottom of the MacBook Pro after removing the bottom shell of the case. The result is a seamless integration with all the key ports I need (gigabit ethernet, USB-A 3.1, SD card slot), delivering what feels in the hand very much like the 13" 2012 MacBook Pro in thickness (but in 15" size).

Read more at Mac Performance Guide.

A few days after this report, OWC updated their visualisation.


Modular Expansion?

OWC says the DEC will make the MacBook ‘Pro’ again:

This solution seamlessly integrates with your MacBook Pro for increased capacity and expanded connectivity far beyond factory capabilities.

A vague statement. Maybe OWC could do with some feedback on what would be useful. 

Video editors and VFX artists hoping to use their new MacBook Pro in workgroups with servers like Lumaforge’s Jellyfish would appreciate a DEC with 10Gb Ethernet. Others would want a DEC containing only extra battery capacity, useful for 24-hour journeys across the planet.

As there will be many different requirements by different professional users, I hope OWC offer a modular system.

This would be similar to the way PowerBooks used to work in the 1990s: you could choose a CD drive, a floppy drive or a battery in one of two slots. In 1999, Low End Mac stated that you could get 16 hours of power from using two batteries in a 'Lombard' PowerBook.

The DEC as shown could have two slots, 150mm wide by 160mm deep - one on each side. Here are some ideas for modules to go in the slots:

  • Older MacBook ports: Ethernet, SD Card, USB 3 ports, Audio In, MagSafe, Thunderbolt/MiniDisplayPort, HDMI
  • SSD
  • Battery
  • GPU with external connector
  • High-end ports: 10Gb Ethernet, etc.
  • Optical drive
  • Retro: 3" Floppy drive
  • Retro: Firewire, ADB, SCSI ports

In some cases you would only need one of a given module; in others, two would be handy. What else would be useful for you?

Sign up at OWC to get DEC updates.


BBC News animation using Apple Motion and Final Cut Pro X

Thursday, 19 January 2017

I visited BBC New Broadcasting House in central London last week with Iain Anderson. As we were being shown around the BBC News headquarters, we happened across an animator working on a piece about Trump's inauguration day. He was editing it in Final Cut Pro X with plugins created in Apple Motion 5, both commercial and custom. 

Iain and I gave him some tips on making plugins - including replicator sequence behaviours. He also showed us BBC-branded plugins he had made for Final Cut Pro X titles and animated graphs.

We were also happy to talk to others at the BBC about the latest developments in Final Cut Pro X workgroup collaboration.