How many flicks per frame?

Wednesday, 24 January 2018

Facebook’s Oculus division have defined a new unit of time, says BBC News:

The flick has been designed to help developers keep video effects in sync, according to a description on the code-sharing site GitHub.

A flick, derived from "frame-tick", is 1/705,600,000 of a second - the next unit of time after a nanosecond.

A researcher at Oxford University said the flick wouldn't have much general impact but may help create better virtual reality experiences.

Although most VR hardware makers are now aiming for displays that refresh 90 times a second, video is available at many different frame rates. It is hard to make sure all the frame updates happen at the same time and at the right time. The small monitors inside a head-mounted display must update more often than the frame rate of the source video in order for the video to follow the speed of normal head movement. The flat frames of video being sent to the viewer’s eyes are excerpts from a larger sphere of video.

If you have spherical footage designed to update 59.94 times a second playing on a VR headset that is being refreshed 90 times a second, the mathematics gets complicated, and errors can creep into the tens of thousands of calculations that must be done during a VR experience. This is partly because true frame rates cannot be captured exactly as decimal values. The frame rate for US TV is described as 29.97 frames per second, for example, but its true definition is a division: 30,000÷1,001 = 29.970029970029970029970029970029… - a decimal that repeats forever.

The flick trick is to come up with a unit of time small enough to divide into all common video frame rates and refresh rates without any decimal places left over. This makes the calculations much simpler. Adding and subtracting is faster than dividing. It is also more accurate - the duration of each video frame or VR refresh update can be defined as a whole number of flicks.
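Here is a minimal sketch of that arithmetic in Python. It assumes only the published definition of the flick - 705,600,000 per second - and uses exact fractions to stand in for the repeating decimals:

```python
from fractions import Fraction

FLICKS_PER_SECOND = 705_600_000  # the published definition of a flick

def flicks_per_frame(fps):
    """Duration of one frame (or audio sample) as an exact count of flicks."""
    return FLICKS_PER_SECOND / Fraction(fps)

# NTSC-style rates are exact ratios, not terminating decimals
rates = {
    "US film for TV": Fraction(24000, 1001),   # '23.976'
    "Film": 24,
    "Worldwide TV": 25,
    "US TV": Fraction(30000, 1001),            # '29.97'
    "VR headset refresh": 90,
    "192kHz audio sample": 192_000,
}

for name, fps in rates.items():
    print(f"{name}: {flicks_per_frame(fps)} flicks")
```

Every result is a whole number, so frame boundaries can be found with integer addition instead of error-prone floating-point division.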

Here is a table of how many flicks correspond to popular video frame rates. Final Cut Pro can edit audio clips and keyframes at ‘subframe’ resolution, which is 1/80th of the project frame rate.

                     fps          1 frame in seconds   flicks per frame   flicks per FCPX subframe
US Film for TV       23.976024    0.04170833           29,429,400         367,867.5
                     24           0.04166667           29,400,000         367,500
Worldwide TV         25           0.04                 28,224,000         352,800
US TV                29.9700299   0.03336667           23,543,520         294,294
                     30           0.03333333           23,520,000         294,000
                     59.9400599   0.01668333           11,771,760         147,147
VR headset refresh   90           0.01111111           7,840,000

PS: The highest commonly used ‘frame rate’ is in high-end audio: 192kHz defines 192,000 samples of audio per second - which is 3,675 flicks per sample.

PPS: The Facebook conversation that prompted the creation of flicks.

Apple Final Cut Pro 10.4: 360º spherical video, colour, video HDR and more

Thursday, 14 December 2017

Today’s Final Cut Pro X update adds features for high-end professionals, those new to editing and everyone in between.

  • Professionals can now stay in Final Cut for advanced colour correction and advanced media workflows.
  • Everyone can explore 360° spherical video production - from those who have recently purchased consumer 360° cameras up to teams working on the most advanced VR video productions.
  • 10.4 includes the ability to open iMovie for iOS projects for people who want to move from free editing tools to a full video, TV and film production application.

Apple has also updated Motion, their real-time motion graphics application, to version 5.4. Compressor, their video encoding and packaging application, has been updated to version 4.4.

All updates are free for existing users, and prices for new users remain the same on the Mac App Store: $299.99 for Final Cut Pro and $49.99 each for Motion and Compressor. Apple have yet to introduce subscription pricing on their professional video applications. Those who bought them from the Mac App Store in 2011 have not had to pay for any updates over the last six years.

The hardware requirements to run Apple’s professional video applications remain the same, but a few features depend on macOS 10.13 High Sierra: HEVC and HEIF support and attaching a VR headset. If you don’t yet need these features, Final Cut Pro, Motion and Compressor will run on macOS 10.12.4 or later.

After I cover the new 360° spherical video features, I’ll give a rundown of the rest of the 10.4 update.

360° spherical video

There is a large range of audiences for 360° spherical video:

  • The majority use phones as a ‘magic window’ on Facebook and YouTube videos - as they move the phone around in 3D space, the video displayed on the screen updates to match its position, giving the feeling of being ‘inside’ the video.
  • Those that have bought devices worn on the head that cradle phones in front of their eyes as they look around (from less than $30).
  • People with VR headsets ($200-$2000).
  • Groups of people in rooms with devices that project the video on the inside of a dome.

The rest of this section is a much shorter version of my Final Cut Pro & 360° spherical video: All you need to know page. 

Final Cut Pro 10.4 can handle spherical video with ease. It recognises footage captured by 360° cameras and spherical rigs. You can create spherical video timelines to edit 360° footage. There’s a 360° Viewer in the Final Cut interface that can be shown next to the normal viewer, letting you get a feel for what your audience will see when they explore the sphere of video.

To look around inside the video sphere, drag in the 360° Viewer.

VR headset support

On faster Macs running macOS High Sierra you can install the Steam VR software and attach an HTC Vive VR headset to use it to watch the video play straight from the Final Cut Pro 10.4 and Motion 5.4 timelines. Apple’s technical support document on the subject: Use a VR headset with Final Cut Pro X and Motion.

Spherical video now a peer to rectilinear video

It has been possible to work with 360° spherical video in video applications before. As they are designed to work with video in rectangles - rectilinear video - it was necessary to ‘fool’ them into working with the spheres of video that are at the core of 360° editing. This was done with specialised 360° plugins, which were applied as effects and transitions to footage in rectilinear video timelines. Although the user knew that the rectilinear footage represented spheres of video, the editing and motion graphics applications had no idea.

Apple have made spherical video a true peer of rectilinear video in Final Cut Pro 10.4 and Motion 5.4. If applications understand the nature of spherical video, existing features can be improved to do the right thing for 360° production, and new features can be added that benefit both rectilinear and spherical production.

‘Reorient’ orientation 

Media that represents spheres of video has ‘reorientation’ properties. These are useful when you want to choose which part of the sphere is visible when the viewer is looking straight forward. When people start watching, playback starts with them facing forward. After initially looking around when the story starts, even though viewers can look anywhere in the sphere, most will spend the majority of the time looking forward, turning maybe 60° to the left or the right depending on video and audio cues.

In 10.4 you can show a Horizon overlay which marks what is straight ahead, with tick marks to show what is 90° to the left and 90° to the right (the left and right edges of the rectangular viewer show what is seen if the viewer turns 180° from the front).

There is a new Reorient transform tool for changing spherical video orientation by dragging in the viewer.

The 360° Viewer shows what is straight ahead when viewed online or in a spherical video device. Here the Reorient tool is being used to make the London bus appear straight ahead (X:0°, Y:0°, Z:0°):

This means that if the viewer is looking ahead when this shot starts, they’ll see the London bus.

Final Cut Pro 10.4 doesn’t yet convert footage from 360° cameras and rigs into spherical videos. Apple expects that editors will use the specialised software that comes with cameras to do this work - which is known as ‘stitching.’ If footage needs more advanced work done on it (such as motion tracking to steady a shaky shot and removing objects from spherical scenes), that will need to be done in applications such as Mocha VR.

Flat media inside a 3D sphere of video

Final Cut recognises spherical media and knows how it should work in a spherical timeline. It also recognises flat media, and knows what to do with it in a spherical timeline. In traditional rectilinear projects, each piece of media has X and Y position properties. This allows editors to position footage and pictures in the frame.

When flat (non-360°) media is added to a 360° spherical video project, instead of having X position, Y position and Scale properties in the ‘Transform’ panel of the clip inspector, there is an additional panel in the clip inspector: 360° Transform. This panel has properties that allow editors to position the flat media anywhere inside the video sphere. This can be defined in Spherical coordinates - two angles plus distance, or Cartesian coordinates - X, Y and Z co-ordinates (where the centre of the sphere is [0,0,0]).

Auto Orient makes sure the flat media always faces the viewer. X, Y and Z Rotation is applied to the media after it is positioned using Spherical or Cartesian co-ordinates.
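To illustrate how the two coordinate systems relate, here is a sketch of a spherical-to-Cartesian conversion. The axis conventions (straight ahead along +Z, longitude turning right, latitude looking up) are assumptions for the example - Final Cut’s own conventions may differ:

```python
import math

def spherical_to_cartesian(longitude_deg, latitude_deg, distance):
    """Convert two angles plus distance into X, Y, Z, viewer at (0, 0, 0)."""
    lon = math.radians(longitude_deg)
    lat = math.radians(latitude_deg)
    x = distance * math.cos(lat) * math.sin(lon)  # right of the viewer
    y = distance * math.sin(lat)                  # above the viewer
    z = distance * math.cos(lat) * math.cos(lon)  # straight ahead
    return x, y, z

# A logo placed 2 metres away, 30° to the right and 10° up:
print(spherical_to_cartesian(30, 10, 2.0))
```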

360° effects - Modify whole sphere

Final Cut Pro 10.4 comes with 10 360°-specific plugins. Nine of them are used to apply a graphic effect to a whole sphere of video. Here they are in the effects browser:


360° effect: 360° Patch

There is another plugin that can be used to hide parts of the sphere of video, which is useful when you need to hide the equipment (or person) that is holding the 360° camera.

In the case of this shot, I am visible to those who look straight down, because I held the camera on the end of a pole above my head. The 360° Patch effect can cover me up.

The result is that the whole sphere looks like this:

360° titles and generators

10.4 includes a set of titles designed for 360° - they display and animate 3D text on and off:

10.4 comes with two 360° generators:

Apple’s 360° spherical video to-do list

The Final Cut Pro 10.4 update is probably only the first part of Apple’s 360° spherical video plan. The way they have started is designed to accommodate many future updates. I expect that the video applications team still have a long to-do list:

A tough list, but Apple are best positioned of anyone to be able to deliver these features to all Final Cut Pro users. The Apple video applications team can also bring 360° spherical video to millions of people through their other applications working on Apple hardware of all kinds: iMovie for macOS, iMovie for iOS, Clips for iOS and Memories for Photos.

Not only 360° spherical video

Here is a summary of the other features in the Final Cut Pro 10.4 update - with links to the new help system on these topics:

Advanced colour correction

Choose which part of the footage to base a white balance on. A new option in the Color Balance effect (Option-Command-B). Apple help topic on manual white balance

New grading tools such as colour wheels, colour curves plus hue and saturation curves. Color Correction Overview.

New built-in camera LUTs (including support for the December 2017 RED Workflow update and software from Canon) and support for loading more camera LUTs. You can also control where in the pipeline LUTs are applied using the new Custom LUT effect. See Color Lookup Tables.

TIP: An important note pointed out by Gabriel Spaulding: Libraries do not carry LUTs that are applied using the new LUT features, so if you are sharing with other editors and you use these new features, make sure you manage the LUTs to prevent seeing this message:

All colour corrections can now be animated using keyframes.

HDR video

High-dynamic-range video allows the range of brightness levels in footage, projects and exports to be much larger. This means much more detail in brighter parts of the image. Wide Color Gamut and HDR Overview and Configure library and project settings for wide gamut HDR.

There is a new button in the library inspector.

Once clicked you can set your library to be able to support media and projects with wide gamut HDR.

As well as being able to set HDR properties for footage, projects and libraries, there is a new HDR Tools effect to support standards conversion. 10.4 can also generate HDR master files.

For a detailed article by someone much more expert than me on the subject of the new colour tools and HDR, read Marc Bach’s blog post.

iMovie for iOS projects

Any projects started on iMovie for iOS on an iPhone or iPad can be sent directly to Final Cut Pro 10.4 for finishing. Very useful for the many professional journalists who prepare reports on their mobile devices. See Import from iMovie for iOS.

Additional video and still image formats

If 10.4 is running on macOS 10.13 High Sierra:

  • HEVC (High Efficiency Video Coding), also known as H.265, a video compression standard
  • HEIF (High Efficiency Image File Format), a file format for still images and image sequences
  • RF64, an extension to the WAV file format that allows for files larger than 4 GB

Collaboration

Final Cut Pro 10.4 libraries can be stored on connected NFS devices as if they were on local drives.

Retiming speed

Optical flow generation of new frames is now much faster as it has been rewritten to use Metal.

Improved Logic audio plugin UIs

The UIs have been redesigned and also been made resizable (using a 50%/75%/100% pop-up menu).

Before:

 

After:

The Final Cut Pro X Logic Effects Reference site has been updated to provide help on the redesigned audio plugins.

Notes for Final Cut users

Upgrading to Final Cut Pro 10.4

As this is a major update to Final Cut, the Library format has been updated to work with the new features. Apple advises that before you install the update from the Mac App Store, you should backup your current version of Final Cut and existing libraries.

Before you update, check to see if you need to update your version of macOS. Final Cut will no longer run on macOS 10.11, but will still run on macOS 10.12.4. Apple’s detailed Final Cut Pro technical requirements.

Bits and pieces

New in Preferences: In the Editing panel, you can choose which is the default colour correction that is applied when you click the Color Inspector icon in the inspector, or press Command-6.

In the Playback panel, you can Show HDR as raw values and If frames are dropped on the VR headset, warn after playback.

TIP: Control-click a clip in the browser to create a new project based on its dimensions and frame rate.

TIP: It is useful to be able to line up elements of waveforms when colour grading. To add a horizontal guide, click once anywhere in the waveform monitor.

Commands with unassigned keyboard shortcuts:

  • Add Color Board Effect
  • Add Color Curves Effect
  • Add Color Hue/Saturation Effect
  • Add Color Wheels Effect
  • Color Correction: Go to the Next Pane
  • Color Correction: Go to the Previous Pane
  • Toggle Color Correction Effects on/off

New commands:

  • Select Previous Clip - Command-Left Arrow
  • Select Next Clip - Command-Right Arrow
  • Extend Selection to Previous Clip - Control-Command-Left Arrow
  • Extend Selection to Next Clip - Control-Command-Right Arrow

For new keyboard commands associated with 360° spherical video, visit my Final Cut Pro & 360° spherical video: All you need to know page. 

360º features review: ‘Version 1.0’

Apple ‘went back to 1.0’ with Final Cut Pro X in 2011. They didn’t push Final Cut Pro 7’s 1990s software core to breaking point to accommodate new digital workflows. They imagined what kind of editing application they would make if they weren't limited by the ideas of the past. One result was that Final Cut Pro 10.0 was based around GPU rendering and multiple-core CPU processing - the kind of processing that 360° spherical video production needs.

Getting established postproduction tools to do 360° via plugins is the way people without access to the core of applications had to do it. It is a stopgap that application users will eventually want to leave behind. Apple didn’t add 360° to Final Cut via plugins, the legacy way. They jumped to ‘Version 1.0’ of 360° spherical video. They answered this question: “As you have control over Final Cut Pro, how should you design 360° into its core?”

Following the Final Cut Pro 10.4 update, the Apple Video Applications team are now well placed to develop more of their products and services to support many more people who want to tell stories through 360° spherical video. For years now Final Cut Pro has been powerful enough to work on the biggest shows, yet friendly enough for the millions of people who know iMovie to make a small step towards professional production. With 10.4, that applies to 360° spherical video too. I’m looking forward to experiencing the stories they tell.

New iMac Pro - How much better at 360º spherical video stitching?

Tuesday, 12 December 2017

Vincent Laforet is another influencer who has had access to an iMac Pro for the last week. His blog post includes speed tests for Final Cut Pro X, DaVinci Resolve, Adobe Lightroom, RED Cine-X and Adobe Premiere. He also tested how fast the new iMac Pro was at stitching the multiple-sensor media recorded by an Insta360 Pro into a 6K stereo sphere. He compared it with his 2016 5K iMac and his recent 2017 MacBook Pro:

I processed 6K Stereo (3D) VR Insta360 PRO footage through their Insta360 Stitcher software, a 56 second clip, here were the export / processing times:

iMacPRO – 5 minutes 55 seconds

iMac – 11 minutes 09 seconds

MacBookPro 15” – 32 minutes

Read more about the computer and other results on his blog.

Video preview of iMac Pro from MKBHD - Marques Brownlee

Tuesday, 12 December 2017

Just as with the iPhone X, it looks like Apple are giving online influencers early access to new products and giving them permission to share their impressions before release. Marques Brownlee - known as MKBHD on the internet - has posted a video on the forthcoming iMac Pro. His 5.4 million subscribers are now finding out about the new Mac from Apple.

He mentions that this video was edited on the new iMac Pro in the next version of Final Cut Pro X, 10.4.

The model he's been working with for a week is the Intel Xeon W 3GHz 10-core iMac Pro with 128GB of RAM, a Radeon Pro Vega 64 GPU with 16GB of RAM and 2TB storage - the ‘middle’ iMac Pro in the range.

  • The physical dimensions exactly match today’s 2017 5K iMac.
  • No access to upgrading the RAM
  • Two more Thunderbolt 3 ports (for a total of 4)
  • 10 Gigabit Ethernet
  • Geekbench iMac Pro single core: 5,468 (vs. 5,571 for 2017 iMac and 3,636 for 2013 Mac Pro)
  • Geekbench iMac Pro multi-core: 37,417 (vs. 19,667 for 2017 iMac and 26,092 for 2013 Mac Pro)
  • Storage speed: 3,000MB/s read and write
  • Fan rarely spins up and keeps cool to the touch, despite high-end workstation components
  • 8- and 10-core editions available first; you'll have to wait longer if you order an 18-core.
  • “The ideal high-end YouTuber machine”

Looks like applications that take advantage of multiple CPU cores are going to see a big difference on the iMac Pro.

Apple have announced that orders for the new iMac Pro will start on Thursday.

Soon: More audio timelines that can automatically be modified to match changes in video timelines

Wednesday, 06 December 2017

In many video editing workflows, assistants have the thankless task of making special versions of timelines that generate files for others in postproduction. A special timeline for VFX people. A special timeline for colour. A special timeline for exporting for broadcast. A special timeline for audio. Transferring timelines to other departments is called ‘doing turnovers.’

Final Cut Pro X is the professional video editing application that automates the most turnovers. It seems that Apple want to stop the need for special timelines to be created. Special timelines that can go out of sync if the main picture edit changes. Final Cut video and audio roles mean that turnovers for broadcast no longer require special timelines.

The Vordio application aims to make the manual audio reconform process go away. At the moment problems arise when video timelines change once the audio team start work on their version of the timeline. Sound editors, designers and mixers can do a great deal of work on a film and then be told that there have been changes to the picture edit.

What’s new? What’s moved? What has been deleted?

Vordio offers audio autoreconform. That is, if (when) the picture timeline changes, Vordio looks at the NLE-made changes and produces a change list that can be applied to the audio timeline in the DAW. It currently does this with Final Cut Pro X and Adobe Premiere timelines. If the sound team have already made changes in Reaper (a popular alternative to Pro Tools) and they need to know what changes have since been made to the video edit, Vordio can make changes to the audio timeline that reflect the new video edit. This includes labelling new clips and moved clips, and showing which clips have been deleted.

It looks like Vordio will soon work with other DAWs by using the Hammerspoon UI scripting toolkit.

StudioOne is a useful DAW that has a free version.

I expect timeline autoreconform to come to all timelines. To get a preview of what it could be like, check out Vordio.

Film from a single point, then wander around inside a cloud of pixels in 3D

Monday, 04 December 2017

People wearing 360° spherical video headsets get a feeling of presence when the small subconscious movements they make are reflected in what they see. This is the first aim of Six Degrees of Freedom video (6DoF). The scene changes as the viewer turns in three axes and moves in three axes. 6DoF video is stored as a sphere of pixels plus a channel of information that defines how far each of those pixels is from the camera.

Josh Gladstone has been experimenting with creating point clouds of pixels. The fourth video in his series about working with a sphere of pixels plus depth shows him wandering around a 3D environment that was captured by filming from a single point.

The scenes he uses in his series were filmed on a GoPro Odyssey camera. The footage recorded by its 16 sensors was then processed by the Google Jump online service to produce a sphere of pixels plus a depth map.

The pixels that are closest to the camera have brighter corresponding pixels in the depth map.

360° spherical video point clouds are made up of a sphere of pixels whose distance from the centre point has been modified based on a depth map.
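As a rough sketch of that idea, this is how an equirectangular frame plus its depth map could be projected into a point cloud. The linear, brighter-is-closer depth encoding and the near/far range are assumptions for illustration, not Google Jump’s documented format:

```python
import numpy as np

def equirect_to_point_cloud(rgb, depth, near=0.5, far=50.0):
    """Project an equirectangular frame and depth map into 3D points.

    rgb:   (H, W, 3) colour image
    depth: (H, W) array scaled 0..1, brighter (1.0) meaning closer
    """
    h, w = depth.shape
    # Longitude spans -180°..180° across the width,
    # latitude 90°..-90° down the height
    lon = (np.arange(w) / w - 0.5) * 2.0 * np.pi
    lat = (0.5 - np.arange(h) / h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    r = near + (1.0 - depth) * (far - near)  # assumed linear encoding
    x = r * np.cos(lat) * np.sin(lon)
    y = r * np.sin(lat)
    z = r * np.cos(lat) * np.cos(lon)
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colours = rgb.reshape(-1, 3)
    return points, colours
```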

Josh has written scripts in Unity - a 3D game development environment - that allow real-time rendering of these point clouds. Real time is important because users will expect VR headsets to be able to render in real time as they turn their heads and move around inside virtual spaces.

You can move around inside this cloud of pixels filmed from a single point:

In the latest video in his series Josh Gladstone simulates how a VR headset can be used to move around inside point clouds generated from information captured by 360° spherical video camera rigs. He also shows how combining multiple point clouds based on video taken from multiple positions could be the basis of recording full 3D environments:

What starts as an experiment in a 3D game engine is destined to be in post production applications like Apple’s Motion 5 and Adobe After Effects, and maybe eventually in NLEs like Final Cut Pro X.

I’m looking forward to playing around inside point clouds.

28 videos, 53 million views (so far) - advice for your video essay YouTube channel

Sunday, 03 December 2017

Every Frame a Painting is a YouTube channel made up of video essays about visual storytelling. It has 1.3 million subscribers and millions of views. The creators Taylor Ramos and Tony Zhou have decided to close it. Luckily for us they have written an essay on what they learned - including tips for others considering making videos in this form.

All the videos were made with Final Cut Pro X:

Every Frame a Painting was edited entirely in Final Cut Pro X for one reason: keywords.

The first time I watch something, I watch it with a notebook. The second time I watch it, I use FCPX and keyword anything that interests me.



Keywords group everything in a really simple, visual way. This is how I figured out to cut from West Side Story to Transformers. From Godzilla to I, Robot. From Jackie Chan to Marvel films. On my screen, all of these clips are side-by-side because they share the same keyword.

Organization is not just some anal-retentive habit; it is literally the best way to make connections that would not happen otherwise.

Even if you don't make scholarly videos on the nature of visual storytelling, there is a lot to be learnt from their article and the 28 video essays in their channel.

iPhone-mounted camera will capture 3D environments that can be fully explored in VR

Friday, 01 December 2017

Photogrammetry is a method of capturing a space in 3D using a series of still photos. It usually requires a great deal of complex computing power. A forthcoming software update for the $199 Giroptic iO (a 360° spherical video camera you mount onto your iPhone or Android phone) will give users the ability to capture full VR models of the spaces they move through.

Mic Ty of 360 Rumors writes:

the photographer simply took 30 photos, then uploaded them to cloud servers for processing. The software generates the 3D model, and can even automatically remove the photographer from the VR model, even though the 360 photos had the photographer in them.

Once the model is generated it can be included in full VR systems that can be explored in VR headsets. This will work especially well in devices such as the HTC Vive, which can detect where you are in 3D space and move the 3D model in VR to match. Remember though that many VR experiences are about interactivity, and in order to add that to a 3D environment, users will have to use a VR authoring system.

3D environments in post production applications

Those making 360° spherical videos are likely to want their post tools to be able to handle the kind of 3D models generated by systems like these. Storytellers range from animators (users of applications like Blackmagic Fusion) to editors and directors (users of Final Cut Pro X and Adobe Premiere). Developers should bear in mind that the way they integrate 3D environments into post applications should vary based on the nature of the storyteller.

However, it looks like there'll be a new skill to develop for 360° spherical photographers: where to take pictures in a space to capture the full environment in 3D.

Go over to 360 Rumors to see a video of the system in action.

 

Amazon launches Rekognition Video content tagging for third-party applications

Thursday, 30 November 2017

Amazon have announced a content recognition service that developers can use to add features to their video applications, Streaming Media reports:

Rekognition Video is able to track people across videos, detect activities, and identify faces and objects. Celebrity identification is built in. It identifies faces even if they're only partially in view, provides automatic tagging for locations and objects (such as beach, sun, or child), and tracks multiple people at once. The service goes beyond basic object identification, using context to provide richer information. The service is available today.

The videos need to be hosted in or streamed via Amazon S3 storage.
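To give a flavour of what a third-party integration works with, here is a minimal sketch using the AWS SDK for Python. The bucket and file names are hypothetical; label detection runs as an asynchronous job that must be polled:

```python
import time
import boto3

rek = boto3.client("rekognition")

# Start an asynchronous label-detection job on a video stored in S3
# (the bucket and file names here are made up)
job = rek.start_label_detection(
    Video={"S3Object": {"Bucket": "my-footage", "Name": "interview.mp4"}}
)

# Wait for the job to finish, then read the time-coded labels
while True:
    result = rek.get_label_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(5)

for item in result.get("Labels", []):
    print(item["Timestamp"], item["Label"]["Name"])
```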

Apple are unlikely to incorporate Amazon Rekognition Video in their video applications and services. Luckily the Final Cut Pro X and Adobe Premiere ecosystems allow third-party developers to create tools that use this service. Post tool makers can then concentrate on integrating their workflow with their NLE while Amazon invest in improving the machine learning they can apply to video.

4K: Only the beginning for UK’s Hangman Studios’ Final Cut Pro X productions

Thursday, 30 November 2017

Some think that Final Cut Pro X has problems working with 8K footage. Hangman Studios have been making concert films at 4K and above with Final Cut Pro X since 2015. There’s a new case study by Ronny Courtens of Lumaforge at fcp.co:

Two years ago I made a conscious decision to get rid of all of my HD cameras. We decided that everything from now on had to be 4K and up.

…our boutique post production services in London are newly designed and built for 8K workflows and high end finishing. Drawing upon 17 years of broadcast post experience we've designed a newer, more simplified and efficient workflow for the new age of broadcast, digital and cinema. We’re completely Mac based running a mix of older MacPro 12-cores (mid 2010) with the newer MacPro (2013) models.

I imagine there’ll be space in their West London studios for at least one new iMac Pro. When Apple gave a sneak preview of Final Cut Pro 10.4 and Motion 5.4 as part of the FCPX Creative Summit at the end of October, they showed it easily running an 8K timeline on a prerelease iMac Pro.

Apple have said that Final Cut Pro X 10.4 will be able to support 8K HEVC/H.265 footage on macOS High Sierra. This kind of media is produced by 360º spherical video systems such as the Insta360 Pro. When 10.4 comes out in December, editors will be able to do even more at high resolutions.

What is ‘Six Degrees of Freedom’ 360° video?

Sunday, 26 November 2017

Six Degrees of Freedom – or 6DoF – is a system of recording scenes that, when played back, allows the viewer to change their view using six kinds (‘degrees’) of movement. Today common spherical video recording uses multiple sensors attached to a spherical rig to record everything that can be seen from a single point. This means when the video is played, the viewer can…

  • turn to the left or right
  • look up or down
  • twist their head to rotate their view

…as they look around inside a sphere of video.

If information has been recorded from two points close together, we perceive depth - a feeling of 3D known to professionals as ‘stereoscopic video.’ This feeling of depth applies as long as we don't twist our heads too much or look up or down too far - because ‘stereo 360°’ only captures information on the horizontal plane. 

6DoF camera systems record enough information that three more degrees of movement are allowed. Viewers can now move their heads…

  • up and down
  • left and right
  • back and forward

…a short distance.

As information about the environment can be calculated from multiple positions near the camera rig, the stereoscopic effect of perceiving depth also will apply when viewers look up and down as well as when they rotate their view.

Here is an animated gif taken from a video of a session about six degrees of freedom systems given at the Facebook developer conference in April 2017:

Six degrees of freedom recording systems must capture enough information that the view from all possible eye positions within six degrees of movement can be simulated on playback. 

A great deal of computing power is used to analyse the information coming from adjacent sensors to estimate the distance of each pixel captured in the environment. This process is known as ‘Spherical Epipolar Depth Estimation.’ The sensors and their lenses are arranged so that each object in the environment around the camera is captured by multiple sensors. Knowing the position in 3D space of the sensors and the specifications of their lenses means that the distance of a specific object from the camera can be estimated.
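The geometry underneath can be illustrated with the classic two-sensor case. This is a toy, rectified-pinhole version - the spherical epipolar process the rigs actually use is far more involved:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic triangulation: depth = focal length x baseline / disparity.

    The further away an object is, the less it shifts ('disparity')
    between two sensors a known distance ('baseline') apart.
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable shift: beyond the rig's range
    return focal_px * baseline_m / disparity_px

# A feature that shifts 4 px between sensors 6 cm apart,
# imaged with a 1000 px focal length, is about 15 m away:
print(depth_from_disparity(1000, 0.06, 4))
```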

6DoF: simulations based on everything you can see from a single point… plus depth

Post-processing the 6DoF camera data results in a single spherical video that includes a depth map. A depth map is a greyscale image that stores an estimated distance for every pixel in a frame of video. Black represents ‘as close as can be determined’ and white represents ‘things too far away for us to determine where they are relative to each other’ - usually tens of metres away (this distance can be increased by positioning the sensors further apart or by increasing their resolution).

Once there is a sphere with a depth map, the playback system can simulate X, Y and Z axis movement by moving distant pixels more slowly than closer pixels as the viewer moves their head. Stereoscopic depth can be simulated by sending slightly different images to each eye based on how far away each pixel is.

Moving millimetres, not metres

The first three degrees of environment video freedom - rotate - allow us to look anywhere from a fixed point: 360° to the left or right and 180° up and down. The next three allow us to move our heads a little: a few millimetres along the X, Y and Z axes. They do not yet let us move our bodies around an environment. The small distances that the three ‘move’ degrees of freedom allow make a big difference to the feeling of immersion, because playback can now respond to the small subconscious movements we make in day-to-day real life when assessing where we are and what is around us.

Free plugin finds and tracks faces in footage

Saturday, 25 November 2017

For legal reasons it is sometimes necessary to hide the identity of people in footage. ‘Secret Identity’ is a free plugin for Final Cut Pro X, Adobe Premiere, Adobe After Effects and Motion that works out where all the faces are in a clip. It can also automatically track their positions as they move. You can then choose which people's identity you wish to hide. The plugin can then obscure their whole face, their eyes or their mouth. It can also obscure everything but people's faces.

Here's a demo video showing how it works:

Secret Identity from Dashwood Cinema Solutions is available for free if you install the free FxFactory post production app store. It is only available for macOS.

Move over 800MB/s USB 3.1 externals, here come Thunderbolt 3 drives

Tuesday, 21 November 2017

There seems to be some competition improving the state of external drives. Most workflows are more than served by the kind of bandwidth available through the USB 3.1 protocol, but there are always jobs that need more. Barefeats have done a new test comparing the fastest bus-powered SSD from last year with this year’s Thunderbolt 3 drives and enclosures from Sonnet, Netstor, AKiTiO and LaCie.

See how fast they can read and write data over on the Bare Feats site.

VR: Six 4K ProRes streams to the same drive?

Although read speeds are getting very high, write speeds are becoming more important for some productions. As well as needing to make quick backups of gigabytes of camera media, some VR cameras can have external devices attached. The Insta360 Pro currently has a USB connection for an external SSD. It records media from six sensors at the same time to HEVC/H.265. Soon producers will want to record high-quality ProRes from six (or more) sensors at a time, and Thunderbolt 3 might be the answer.
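A back-of-envelope check of the question in the heading, treating roughly 750 Mbit/s per stream as the cost of UHD ProRes 422 HQ at 30 fps - an approximate figure, so read the result as illustrative only:

```python
# Rough arithmetic only - the per-stream rate is an approximation
streams = 6
per_stream_mbit = 750                    # ~UHD ProRes 422 HQ at 30 fps
total_mbit = streams * per_stream_mbit   # 4,500 Mbit/s
total_mbyte = total_mbit / 8             # ~560 MB/s sustained write
print(f"{total_mbit} Mbit/s is about {total_mbyte:.0f} MB/s")
```

That is comfortably inside Thunderbolt 3’s 40 Gbit/s link, but a demanding sustained write for a single external SSD.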

120 Animation Transitions for Final Cut Pro X - Special Black Friday offer - $39 for one week

Tuesday, 21 November 2017

From today Tuesday 21st November, there is a special offer on my Alex4D Animation Transitions pack - which applies for one week only, until the end of 'Cyber Monday' - November 27th.

Only rarely does the FxFactory professional tools app store offer sales on all its plugins. From today, they are offering 20% off everything they distribute - including my first product.

There is no special ‘Black Friday’ or Thanksgiving offer code to apply at checkout. For one week, everything is automatically 20% cheaper.

Alex4D Animation Transitions is a pack of 120 different ways of animating content on and off the screen. Instead of having to apply a series of complex keyframes to multiple clip parameters, just drop one of these transitions on for instant animation. The advantage over using keyframes is that you can quickly adjust the start time, finish time and duration of the animation by dragging the transition or changing its duration.

Here’s a new video showing how it works:

 

  • Spin, scale and fade clips onto the screen
  • Move clips from any location: drag on-screen control to choose
  • Change animation speed and timing without using keyframes by dragging transitions in the timeline
  • Animate overlaid logos
  • Animate titles
  • Animate connected stills and videos
  • Animate between full-screen clips in the main storyline
  • Animate between clips in secondary storylines
  • Animate off the screen using the same settings, or opposite settings to keep clips moving, spinning and scaling in the same direction as they animated on
  • Scale and spin around any point on the screen: drag on-screen control to choose
  • Divide clips into two and control the timing and animation of each part separately
  • Crop animations  
  • Works in all resolutions from 480p up to 5K and higher
  • Works at any frame rate
  • Works in any aspect ratio: landscape 20:1, 16:9, 4:3, square and portrait 3:4, 9:16, 1:20
  • 32 page PDF manual (10.6MB)

Transitions range from subtle and straightforward presets for editors who want quick results to complex and fully-customisable presets for designers who want instant advanced motion graphics in the Final Cut Pro X timeline.

25-minute video tutorial in French by YakYakYak.fr

Translation of this page into Spanish by Final Cut Argentina.

Buy now for $39 


Buy by credit card via FxFactory

Download free trial

A fully-functional watermarked trial version of Alex4D Animation Transitions is available through the FxFactory post-production app store. The trial version includes all 120 transitions and a 32 page PDF manual.

 


Free trial via FxFactory

 

If you don’t have FxFactory, click the ‘Download FxFactory’ button.

A little more help on installing FxFactory.

Restart Final Cut Pro X to see a new ‘Alex4D Animation’ category in the Transitions Browser.

Removing the watermarks

Trial version transitions include a watermark. To remove the watermark, select one of the applied transitions in the inspector and click the Buy button in Final Cut Pro, or in the FxFactory application, click the price button next to the Animation Transitions icon in the Alex4D section of the catalog. If you have entered your credit card and billing information, a dialogue box will appear to confirm your purchase. For more information on activating Alex4D Animation Transitions, visit the FxFactory website.

Generate centre-cutout guides for ARRI shoots using free online tool

Monday, 20 November 2017

The highest resolution most feature films and high-end TV shows need to be delivered in is 4K - 4096 by 2304. That doesn't mean there aren't benefits to shooting at higher resolutions. 

The advantage of using cameras such as the ARRI 65 is that 6K allows for reframing in post. The camera operator can shoot with a very loose frame knowing that editors can choose which part of the 6K frame to include in the 4K master. VFX can also benefit from the pixels outside the visible frame.

In order to make sure a 6K camera is being operated so that the 4K area of interest is framed correctly, it is useful to have a frame guide in the camera. ARRI have a free tool that generates these frame guides so that they can be shown on set:

You can choose which ARRI camera is planned to be used on your shoot and which centre cutout guides you want to show. In this case the 6560x3100 ARRI 65 has guides for 5K and 4K framing (based on a 2.39:1 aspect ratio).
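The arithmetic behind a centred guide is straightforward. Here is a minimal sketch for a single guide; the real Frameline Composer offers many more options:

```python
def centre_cutout(sensor_w, sensor_h, guide_w, aspect):
    """Return (x, y, width, height) of a centred frame guide."""
    guide_h = round(guide_w / aspect)
    assert guide_w <= sensor_w and guide_h <= sensor_h, "guide exceeds sensor"
    x = (sensor_w - guide_w) // 2
    y = (sensor_h - guide_h) // 2
    return x, y, guide_w, guide_h

# A 4096-wide 2.39:1 guide centred on the 6560x3100 ARRI 65 sensor:
print(centre_cutout(6560, 3100, 4096, 2.39))  # (1232, 693, 4096, 1714)
```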

These guides are useful in post, so the tool can also generate transparent PNGs that can be used in the production and post production workflow.

Try out the ARRI Frameline Composer on the ARRI website.

When Final Cut Pro X importing is not enough: A guide to rsync - free media copying tool

Sunday, 19 November 2017

There is a point in a post production workflow when using only your NLE’s importing function is not enough - when insurance companies want to know how you are confirming data transfers and where your redundant backups will be stored. Instead of investing in a dedicated application for media management, Seth Goldin suggests a free, open source alternative:

As far as I can tell, rsync remains superior to pretty much every other professional application for media ingest, like Imagine Products ShotPut Pro, Red Giant Offload, DaVinci Resolve’s Clone Tool, Pomfort Silverstack, or CopyToN. Each of these applications are great in their own rights, and they deliver what they promise, but they can be slow, expensive, and CPU-intensive. In contrast, rsync is fast, completely free of charge, and computationally lightweight.

It looks like the tradeoff is much more power in return for learning a command-line interface. Seth has written a post on Medium that explains rsync's advantages, how to install it and how to use it, entitled ‘A gentle introduction to rsync, a free, powerful tool for media ingest.’ He includes how to use rsync to copy nine camera cards onto three hard drives so that the process uses the minimum amount of CPU power while making the most of the maximum speed of each of the hard drives.
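As a flavour of what that looks like, here is a minimal sketch that drives one rsync pass per destination drive from Python. The paths are hypothetical and only widely documented rsync flags are used; Seth’s article covers real-world verification options:

```python
import subprocess

CARD = "/Volumes/CAMERA_CARD_A/"  # trailing slash: copy the card's contents
DESTINATIONS = [
    "/Volumes/Backup1/CardA/",
    "/Volumes/Backup2/CardA/",
    "/Volumes/Backup3/CardA/",
]

for dest in DESTINATIONS:
    # -a preserves file metadata, -v is verbose, -h prints readable sizes
    subprocess.run(["rsync", "-avh", "--progress", CARD, dest], check=True)
```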

Although you may not need to learn it today, it could be the right solution for a friend now, or for you soon.

9:16, 1:1, 1:2, 4:5… (Social) Media aspect ratios primer

Sunday, 19 November 2017

Many experienced film makers decry vertical and square video. The fact is, millions of people watch stories that way on their personal devices. Facebook is now not just ‘social media’ - it is ‘media.’ 20 years ago editors started to deal with aspect ratios other than 4:3. Here are the specifications from Facebook on the various aspect ratios their platforms work with:

View in new window or see PDF on Facebook site.

If you aren’t working in non-16:9 now, you will be soon, or at the least will need to prepare your work for others who will.

1:1 and 9:16 video are likely to become more popular, so learn to be effective in these aspect ratios!

Updated to add: Chris Roberts wrote an article earlier in 2017 on how to make 1:1 videos using Final Cut Pro X.

Apple’s VR production patent by Tim Dashwood

Thursday, 12 October 2017

Within weeks of third-party Final Cut Pro X developer Tim Dashwood joining the ProApps team, Apple applied for a patent that changes the way computers connect to VR and AR head-mounted devices: ‘Method and System for 360 Degree Head-Mounted Display Monitoring Between Software Program Modules Using Video or Image Texture Sharing’ (PDF version).

It turns out that Tim is doing more for Apple than being part of adding VR video editing features to applications. His work is part of the way macOS works in all sorts of applications.

Direct to Display = Less OS overhead

Up until now, head-mounted devices like the Oculus Rift and HTC Vive connect as specialised displays. As far as macOS or Windows is concerned, an attached device is just another monitor - albeit with an odd aspect ratio and frame rate.

The new method is for VR/AR tools to connect to Apple devices in such a way that there is no longer a 'simulate a monitor' overhead. Apple is aiming for a 1/90th of a second refresh rate for VR and AR experiences. Even if you are viewing a VR video that is playing at 60 frames a second, for smooth movement it is best if what the viewer sees updates 90 times a second, so if they turn quickly, the content keeps up with them.
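A quick sketch of why 60 fps content cannot simply be repeated onto a 90Hz display: video frames land on refreshes in an uneven 2-1-2-1 cadence, which is why re-rendering the view with the latest head pose at every refresh matters:

```python
refresh_hz, video_fps = 90, 60

for refresh in range(6):
    t = refresh / refresh_hz    # time of this display refresh
    frame = int(t * video_fps)  # video frame that is current then
    print(f"refresh {refresh}: video frame {frame}")
# Prints frames 0, 0, 1, 2, 2, 3 - each frame lasts for two
# refreshes, then one, then two... an uneven cadence that
# per-refresh rendering hides by updating 90 times a second.
```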

If macOS, iOS and tvOS spend less time simulating a monitor display, more of the 1/90th of a second between refreshes can be spent on rendering content. Less powerful GPUs will also be able to render advanced VR content and AR overlays - because there's less OS delay in getting it in front of users' eyes.

The idea is for VR/AR applications to modify image data in a form that the OS automatically feeds to devices without simulating a monitor: 

…methods and systems for transmitting monoscopic or stereoscopic 180 degree or 360 degree still or video images from a host editing or visual effects software program as equirectangular projection, or other spherical projection, to the input of a simultaneously running software program on the same device that can continuously acquire the orientation and position data from a wired or wirelessly connected head-mounted display's orientation sensors, and simultaneously render a representative monoscopic or stereoscopic view of that orientation to the head mounted display, in real time.

For more on how HMD software must predict user actions in order to keep up with their movement, watch the 2017 Apple WWDC ‘VR with Metal 2’ session video. One guest speaker was Nat Brown of Valve Software, who talked about SteamVR on macOS High Sierra:

Our biggest request to Apple, a year ago, was for this Direct to Display feature. Because it's critical to ensure that the VR compositor has the fastest time predictable path to the headset display panels. We also, really needed super accurate low variance VBL, vertical blank, events. So, that we could set the cadence of the VR frame presentation timing, and we could predict those poses accurately.

VR production

Although the patent is about how all kinds of applications work with VR and 3D VR, it also mentions a mode where the production application UI appears in the device overlaid on the content being produced:

FIG. 5 illustrates the user interface of a video or image editing or graphics manipulation software program 501 with an equirectangularly projected spherical image displayed in the canvas 502 and a compositing or editing timeline 503. The image output of the video or image editing or graphics manipulation software program can be output via a video output processing software plugin module 504 and passed to a GPU image buffer shared memory and then passed efficiently to the image receiver 507 of the head-mounted display processing program 506. The 3D image processing routine 508 of the head-mounted display processing program will texture the inside of a virtual sphere or cube with a 3D viewpoint at the center of said sphere or cube. The virtual view for each of the left and right eyes will be accordingly cropped, duplicated (if necessary), distorted and oriented based on the lens/display specifications and received orientation data 509 of the wired or wirelessly connected head-mounted display's 510 orientation sensor data. Once the prepared image is rendered by the 3D image processing routine, the image can then be passed to the connected head-mounted display 511 for immediate presentation to the wearer within the head-mounted display.

Additionally, since wearing a head-mounted display will obscure the wearer's view of the UI of the video or image editing or graphics manipulation software program, it is also possible to capture the computer display's user interface as an image using a screen image capture software program module 512 and pass it to an image receiver/processor 513 for cropping and scaling before being composited on the left and right eye renders from the 3D image processing routine 508, 514, 515 and then the composited image can be passed to the connected head-mounted display for immediate presentation to the wearer within the head-mounted display.

Further, a redundant view can be displayed in a window 516 on the computer's display so others can see what the wearer of the head-mounted display is seeing, or if a head-mounted display is not available.

Tim has been demonstrating many interesting 3D and VR production tool ideas over the years. Good to see his inventions now have the support of Apple. I'm looking forward to the other ideas he brings to the world through Apple.

Adobe Premiere used on big new 10-part Netflix TV series

Wednesday, 13 September 2017

It was a tough ask for Adobe Premiere to tackle the needs of David Fincher's 'Gone Girl' feature film in 2014. In recent months, it has been used on a bigger project: ‘Mindhunter’ - a 10-hour, David Fincher exec-produced, high-end TV series soon to be available on Netflix.

Instead of a single team working on a two hour film, TV series have multiple director-cinematographer-editor teams working in parallel. In this case the pilot was directed by David Fincher. The way TV works in the US is that the pilot director gets an executive producer credit for the whole series because the decisions they make define the feel of the show from then on. Fincher brought along some of the team who worked on Gone Girl. While they worked on the pilot post production, other teams shot and edited later episodes in the series.

The fact that the production company and the studio were happy for the workflow to be based around Premiere Pro CC is a major step up for Adobe in Hollywood.

The high-end market Adobe is going for is too small to support profitable software development. Even if they sold a subscription to all professional editors in the USA, that would not be enough to pay for the costs of maintaining Adobe Premiere. Its use in high-end TV and features is a marketing message that Adobe must think contributes to people choosing to subscribe to the Adobe Creative Cloud - even if renters will never edit a Hollywood film or TV show.

What about Final Cut Pro X?

Directors Glenn Ficarra and John Requa are happy to use Final Cut Pro X in studio features. They haven't been able to use Final Cut in the TV shows they have directed. Glenn and John directed the pilot and three other episodes of ‘This is Us’ - a big success for NBC in the US last year. Although directors have much less power in TV than in features, pilot directors do have some power to set standards for the rest of the series. I don’t know why Final Cut wasn’t used on ‘This is Us.’ It could be a lack of enough collaboration features or a lack of enough Final Cut-experienced crew. It may take a while before both of these reasons no longer apply.

Although the 10.3 update for Final Cut Pro X was nearly all about features requested by people who work on high-end production, it seems the majority of the ProApps team's time is spent on features for the majority of Final Cut users.

Is the use of Final Cut Pro X in a smattering of Hollywood productions enough to support Apple’s marketing message? Will Apple invest more in Final Cut’s use in Hollywood? 

When it comes to the opinions of Hollywood insiders, it seems that Premiere is currently the only viable alternative to Avid Media Composer. Although the ProApps team is very likely to want Final Cut to be the choice people make at all levels of production, will they be able to get the investment they need from the rest of Apple to make that happen? We’ll see in the coming months and years.