4K: Only the beginning for UK’s Hangman Studios’ Final Cut Pro X productions

Some think that Final Cut Pro X has problems working with 4K and 8K footage. Hangman Studios has been making concert films at these resolutions with Final Cut since 2015. There’s a new case study by Ronny Courtens of Lumaforge at fcp.co:

Two years ago I made a conscious decision to get rid of all of my HD cameras. We decided that everything from now on had to be 4K and up.

…our boutique post production services in London are newly designed and built for 8K workflows and high end finishing. Drawing upon 17 years of broadcast post experience we’ve designed a newer, more simplified and efficient workflow for the new age of broadcast, digital and cinema. We’re completely Mac based running a mix of older MacPro 12-cores (mid 2010) with the newer MacPro (2013) models.

I imagine there’ll be space in their West London studios for at least one new iMac Pro. When Apple gave a sneak preview of Final Cut Pro 10.4 and Motion 5.4 as part of the FCPX Creative Summit at the end of October, they showed it easily running an 8K timeline on a prerelease iMac Pro.

Apple have said that Final Cut Pro X 10.4 will be able to support 8K HEVC/H.265 footage on macOS High Sierra. This kind of media is produced by 360° spherical video systems such as the Insta360 Pro. When 10.4 comes out in December, editors will be able to do even more at high resolutions.

What is ‘Six Degrees of Freedom’ 360° video?

Sunday, 26 November 2017

Six Degrees of Freedom – or 6DoF – is a system of recording scenes that, when played back, allows the viewer to change their view using six kinds (‘degrees’) of movement. Today, common spherical video recording uses multiple sensors attached to a spherical rig to record everything that can be seen from a single point. This means when the video is played, the viewer can…

  • turn to the left or right
  • look up or down
  • twist their head to rotate their view

…as they look around inside a sphere of video.

If information has been recorded from two points close together, we perceive depth – a feeling of 3D known to professionals as ‘stereoscopic video.’ This feeling of depth applies as long as we don’t twist our heads too much or look up or down too far – because ‘stereo 360°’ only captures information on the horizontal plane.

6DoF camera systems record enough information so that three more degrees of movement are allowed. Viewers can now move their heads

  • up and down
  • left and right
  • back and forward

…a short distance.

As information about the environment can be calculated from multiple positions near the camera rig, the stereoscopic effect of perceiving depth also will apply when viewers look up and down as well as when they rotate their view.

Here is an animated GIF taken from a video of a session about six degrees of freedom systems given at the Facebook developer conference in April 2017:

Six degrees of freedom recording systems must capture enough information that the view from all possible eye positions within six degrees of movement can be simulated on playback.

A great deal of computing power is used to analyse the information coming from adjacent sensors to estimate the distance of each pixel captured in the environment. This process is known as ‘Spherical Epipolar Depth Estimation.’ The sensors and their lenses are arranged so that each object in the environment around the camera is captured by multiple sensors. Knowing the position in 3D space of the sensors and the specification of their lenses means that the distance of a specific object from the camera can be estimated.
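To illustrate the principle, here is a minimal Python sketch of the textbook triangulation this builds on: estimating distance from the disparity between two sensors a known distance apart. This is not the production algorithm, and all the numbers are invented:

```python
# A minimal sketch of two-sensor triangulation, assuming an idealised pair
# of parallel sensors. Real spherical rigs use more involved epipolar
# geometry, and every number below is invented for illustration.
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate the distance of a point seen by two sensors.

    The same object lands on slightly different pixels in each sensor;
    the smaller that shift (the disparity), the further away it is.
    """
    return focal_length_px * baseline_m / disparity_px

# An object shifted by 12 pixels between two sensors 6cm apart, captured
# with a focal length equivalent to 1,000 pixels:
print(depth_from_disparity(1000, 0.06, 12))  # 5.0 metres
```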

6DoF: simulations based on everything you can see from a single point… plus depth

Post-processing the 6DoF camera data results in a single spherical video that includes a depth map. A depth map is a greyscale image that stores an estimated distance for every pixel in a frame of video. Black represents ‘as close as can be determined’ and white represents ‘too far away for us to determine where things are relative to each other’ – usually tens of metres away (this distance can be increased by positioning the sensors further apart or by increasing their resolution).

Once there is a sphere with a depth map, the playback system can simulate movement along the X, Y and Z axes by moving distant pixels more slowly than closer pixels as the viewer moves their head. Stereoscopic depth can be simulated by sending slightly different images to each eye based on how far away each pixel is.
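As a rough sketch of that idea, here is how a playback engine might turn a depth map and a small head movement into per-pixel shifts. It assumes a simple linear mapping from greyscale to metres, which real calibrated depth maps won’t use:

```python
# A minimal sketch of depth-driven parallax, assuming a linear
# black-near/white-far mapping from greyscale to metres. Real systems use
# calibrated depth maps; every constant here is illustrative.
import numpy as np

def parallax_shift_px(depth_map_8bit, head_move_m,
                      near_m=0.5, far_m=30.0, px_per_m_at_near=200.0):
    # Recover an approximate distance in metres for every pixel
    depth_m = near_m + (depth_map_8bit.astype(np.float32) / 255.0) * (far_m - near_m)
    # Closer pixels shift more as the head moves; distant ones barely move
    return head_move_m * px_per_m_at_near * (near_m / depth_m)

depth = np.array([[0, 128, 255]], dtype=np.uint8)   # near, mid, far pixels
print(parallax_shift_px(depth, head_move_m=0.01))   # roughly [[2.0, 0.07, 0.03]]
```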

Moving millimetres, not metres

The first three degrees of environment video freedom – rotate – allow us to look anywhere from a fixed point: 360° to the left or right and 180° up and down. The next three allow us to move our heads a little: a few millimetres along the X, Y and Z axes. They do not yet let us move our bodies around an environment. The small distances that the three ‘move’ degrees of freedom allow make a big difference to the feeling of immersion, because playback can now respond to the small subconscious movements we make in day-to-day life when assessing where we are and what is around us.

Free plugin finds and tracks faces in footage

Saturday, 25 November 2017

For legal reasons it is sometimes necessary to hide the identity of people in footage. ‘Secret Identity’ is a free plugin for Final Cut Pro X, Adobe Premiere, Adobe After Effects and Motion that works out where all the faces are in a clip. It can also automatically track their positions as they move. You can then choose which people’s identities you wish to hide. The plugin can then obscure their whole face, their eyes or their mouth. It can also obscure everything but people’s faces.

Here’s a demo video showing how it works:

Secret Identity from Dashwood Cinema Solutions is available for free if you install the free FxFactory post production app store. It is only available for macOS.
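There is no public detail on how Secret Identity works internally, but as a hypothetical illustration of the basic idea (finding faces in a frame, then obscuring them), here is a short sketch using OpenCV’s bundled face detector. The file names are placeholders:

```python
# A hypothetical sketch of automatic face detection and obscuring on one
# video frame, using OpenCV's bundled Haar cascade. This is an illustration
# of the general technique, not how Secret Identity is implemented.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("clip.mov")   # placeholder source clip
ok, frame = cap.read()
if ok:
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Obscure each detected face with a heavy pixellation (mosaic)
        roi = frame[y:y+h, x:x+w]
        small = cv2.resize(roi, (8, 8), interpolation=cv2.INTER_LINEAR)
        frame[y:y+h, x:x+w] = cv2.resize(
            small, (w, h), interpolation=cv2.INTER_NEAREST)
    cv2.imwrite("frame_obscured.png", frame)
cap.release()
```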

Move over 800MB/s USB 3.1 externals, here come Thunderbolt 3 drives

Tuesday, 21 November 2017

There seems to be some competition improving the state of external drives. Most workflows are more than served by the kind of bandwidth available through the USB 3.1 protocol, but there are always jobs that need more. Bare Feats have done a new test comparing the fastest bus-powered SSD from last year with this year’s Thunderbolt 3 drives and enclosures from Sonnet, Netstor, AKiTiO and LaCie.

See how fast they can read and write data over on the Bare Feats site.

VR: Six 4K ProRes streams to the same drive?

Although read speeds are getting very high, write speeds are becoming more important for some productions. As well as needing to make quick backups of gigabytes of camera media, some VR cameras can have external devices attached. The Insta360 Pro currently has a USB connection for an external SSD. It records media from six sensors at the same time to HEVC/H.265. Soon producers will want to record high-quality ProRes from six (or more) sensors at a time, and Thunderbolt 3 might be the answer.
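Some back-of-envelope arithmetic shows why. Assuming roughly 700 Mb/s per stream, in the region of Apple’s published target rate for ProRes 422 HQ at UHD 29.97fps, six simultaneous streams add up quickly:

```python
# Back-of-envelope arithmetic, assuming roughly 700 Mb/s per stream, an
# approximation of Apple's published ProRes 422 HQ rate at UHD 29.97fps.
streams = 6
mbps_per_stream = 700            # megabits per second, assumed
total_mbps = streams * mbps_per_stream
total_MBps = total_mbps / 8      # megabytes per second

print(f"{total_mbps} Mb/s ≈ {total_MBps:.0f} MB/s of sustained writes")
# 4200 Mb/s ≈ 525 MB/s – comfortably inside Thunderbolt 3's bandwidth,
# but a demanding sustained load for a bus-powered USB SSD.
```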

120 Animation Transitions for Final Cut Pro X – Special Black Friday offer – $39 for one week

Tuesday, 21 November 2017

From today, Tuesday 21st November, there is a special offer on my Alex4D Animation Transitions pack. It applies for one week only, until the end of ‘Cyber Monday,’ November 27th.

The FxFactory professional tools app store very rarely offers sales on all its plugins. From today, they are offering 20% off everything they distribute – including my first product.

There is no special ‘Black Friday’ or Thanksgiving offer code to apply at checkout. For one week, everything is automatically 20% cheaper.

Alex4D Animation Transitions is a pack of 120 different ways of animating content on and off the screen. Instead of having to apply a series of complex keyframes to multiple clip parameters, just drop one of these transitions on for instant animation. The advantage over using keyframes is that you can quickly adjust the start time, finish time and duration of the animation by dragging the transition or changing its duration.

Here’s a new video showing how it works:

  • Spin, scale and fade clips onto the screen
  • Move clips from any location: drag on-screen control to choose
  • Change animation speed and timing without using keyframes by dragging transitions in the timeline
  • Animate overlaid logos
  • Animate titles
  • Animate connected stills and videos
  • Animate between full-screen clips in the main storyline
  • Animate between clips in secondary storylines
  • Animate off the screen using the same settings, or opposite settings to keep clips moving, spinning and scaling in the same direction as they animated on
  • Scale and spin around any point on the screen: drag on-screen control to choose
  • Divide clips into two and control the timing and animation of each part separately
  • Crop animations
  • Works in all resolutions from 480p up to 5K and higher
  • Works at any frame rate
  • Works in any aspect ratio: landscape 20:1, 16:9, 4:3, square and portrait 3:4, 9:16, 1:20
  • 32 page PDF manual (10.6MB)

Transitions range from subtle and straightforward presets for editors who want quick results to complex and fully-customisable presets for designers who want instant advanced motion graphics in the Final Cut Pro X timeline.

25-minute video tutorial in French by YakYakYak.fr

Translation of this page into Spanish by Final Cut Argentina.

Buy now for $39 

Buy by credit card via FxFactory

Download free trial

A fully-functional watermarked trial version of Alex4D Animation Transitions is available through the FxFactory post-production app store. The trial version includes all 120 transitions and a 32 page PDF manual.

Free trial via FxFactory

If you don’t have FxFactory, click the ‘Download FxFactory’ button.

A little more help on installing FxFactory.

Restart Final Cut Pro X to see a new ‘Alex4D Animation’ category in the Transitions Browser.

Removing the watermarks

Trial version transitions include a watermark. To remove the watermark, select one of the applied transitions in the inspector and click the Buy button in Final Cut Pro, or in the FxFactory application, click the price button next to the Animation Transitions icon in the Alex4D section of the catalog. If you have entered your credit card and billing information, a dialogue box will appear to confirm your purchase. For more information on activating Alex4D Animation Transitions, visit the FxFactory website.

Generate centre-cutout guides for ARRI shoots using free online tool

Monday, 20 November 2017

The highest resolution most feature films and high-end TV shows need to be delivered in is 4K – 4096 by 2304. That doesn’t mean there aren’t benefits to shooting at higher resolutions.

The advantage of using cameras such as the ARRI 65 is that 6K allows for reframing in post. The camera operator can shoot with a very loose frame knowing that editors can choose which part of the 6K frame to include in the 4K master. Also, VFX work can benefit from the pixels outside the visible frame.

In order to make sure a 6K camera is being operated so that the 4K area of interest is framed correctly, it is useful to have a frame guide in the camera. ARRI have a free tool that generates these frame guides so that they can be shown on set:

You can choose which ARRI camera is planned to be used on your shoot and which centre-cutout guides you want to show. In this case the 6560×3100 ARRI 65 has guides for 5K and 4K framing (based on a 2.39:1 aspect ratio).
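The arithmetic behind these guides is straightforward centring. Here is a minimal sketch using the ARRI 65 numbers above (the tool itself handles many more cameras and options):

```python
# A rough sketch of the centre-cutout arithmetic: place a guide of a given
# width and aspect ratio at the centre of the sensor frame. The dimensions
# match the ARRI 65 example above.
def centre_cutout(sensor_w, sensor_h, guide_w, aspect=2.39):
    guide_h = round(guide_w / aspect)
    x = (sensor_w - guide_w) // 2        # left edge of the guide
    y = (sensor_h - guide_h) // 2        # top edge of the guide
    return x, y, guide_w, guide_h

for width in (5120, 4096):               # 5K and 4K guides
    print(width, centre_cutout(6560, 3100, width))
# A 4096-wide 2.39:1 guide is 4096×1714 pixels, offset by (1232, 693)
```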

These guides are useful in post, so the tool can also generate transparent PNGs that can be used in the production and post production workflow.

Try out the ARRI Frameline Composer on the ARRI website.

When Final Cut Pro X importing is not enough: A guide to rsync – free media copying tool

Sunday, 19 November 2017

There is a point in a post production workflow when only using your NLE’s importing function is not enough: when insurance companies want to know how you are verifying data transfers and where your redundant backups will be stored. Instead of investing in a dedicated application for media management, Seth Goldin suggests a free, open-source alternative:

As far as I can tell, rsync remains superior to pretty much every other professional application for media ingest, like Imagine Products ShotPut Pro, Red Giant Offload, DaVinci Resolve’s Clone Tool, Pomfort Silverstack, or CopyToN. Each of these applications are great in their own rights, and they deliver what they promise, but they can be slow, expensive, and CPU-intensive. In contrast, rsync is fast, completely free of charge, and computationally lightweight.

It looks like the tradeoff is much more power in return for learning a command-line interface. Seth has written a post on Medium, entitled ‘A gentle introduction to rsync, a free, powerful tool for media ingest,’ that explains rsync’s advantages, how to install it and how to use it. He includes how to use rsync to copy nine camera cards onto three hard drives so that the process uses the minimum amount of CPU power while making the most of the maximum speed of each of the hard drives.
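To give a flavour of the pattern, here is a hypothetical sketch of the parallel-copy idea, not Seth’s actual commands. One rsync process per destination drive lets each drive run at its own maximum speed, with a checksum pass afterwards to verify the copies:

```python
# A hypothetical sketch (not Seth's script – read his post for the real
# commands) of the basic pattern: one rsync per destination drive running
# in parallel, then a checksum pass. All volume paths are invented.
import subprocess

card = "/Volumes/CARD_A001/"
drives = ["/Volumes/Backup1/A001/", "/Volumes/Backup2/A001/",
          "/Volumes/Backup3/A001/"]

# -a copies recursively and preserves file attributes, -v lists each file
copies = [subprocess.Popen(["rsync", "-av", card, dest]) for dest in drives]
for copy in copies:
    copy.wait()

# A second pass with -c re-reads and checksums every file to verify it
for dest in drives:
    subprocess.run(["rsync", "-avc", card, dest], check=True)
```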

Although you may not need to learn it today, it could be the right solution for a friend now, or for you soon.

9:16, 1:1, 1:2, 4:5… (Social) Media aspect ratios primer

Sunday, 19 November 2017

Many experienced film makers decry vertical and square video. The fact is, millions of people watch stories that way on their personal devices. Facebook is now not just ‘social media’ – it is ‘media.’ 20 years ago editors started to deal with aspect ratios other than 4:3. Here are the specifications from Facebook on the various aspect ratios their platforms work with:

View in new window or see PDF on Facebook site.

If you aren’t working in non-16:9 formats now, you will be soon, or will at the least need to prepare your work for others who will.

1:1 and 9:16 video are likely to become more popular, so learn to be effective in these aspect ratios!
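If it helps to think in pixels, here is a quick sketch of what those ratios mean at a fixed width of 1080 pixels (an arbitrary choice for comparison):

```python
# A quick sketch of what these aspect ratios mean in pixels, using an
# arbitrary fixed width of 1080 pixels for comparison.
ratios = {"16:9": 16/9, "4:5": 4/5, "1:1": 1.0, "9:16": 9/16, "1:2": 1/2}

width = 1080
for name, ratio in ratios.items():
    print(f"{name}: {width}×{round(width / ratio)}")
# 16:9 gives 1080×608 while 9:16 gives 1080×1920 – the same width can
# need more than three times the frame height.
```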

Updated to add: Chris Roberts wrote an article earlier in 2017 on how to make 1:1 videos using Final Cut Pro X.

Apple’s VR production patent by Tim Dashwood

Thursday, 12 October 2017

Within weeks of third-party Final Cut Pro X developer Tim Dashwood joining the ProApps team, Apple applied for a patent that changes the way computers connect to VR and AR head-mounted devices: ‘Method and System for 360 Degree Head-Mounted Display Monitoring Between Software Program Modules Using Video or Image Texture Sharing’ (PDF version).

It turns out that Tim is doing more for Apple than being part of adding VR video editing features to applications. His work is part of the way macOS works in all sorts of applications.

Direct to Display = Less OS overhead

Up until now, head-mounted devices like the Oculus Rift and HTC Vive have connected as specialised displays. As far as macOS or Windows is concerned, an attached device is just another monitor – albeit one with an odd aspect ratio and frame rate.

The new method is for VR/AR tools to connect to Apple devices in such a way that there is no longer a ‘simulate a monitor’ overhead. Apple is aiming for a 1/90th of a second refresh rate for VR and AR experiences. Even if you are viewing a VR video that is playing at 60 frames a second, for smooth movement it is best if what the viewer sees updates 90 times a second, so that if they turn quickly, the content keeps up with them.

If macOS, iOS and tvOS spend less time simulating a monitor display, more of the 90th of a second between refreshes can be spent on rendering content. Also, less powerful GPUs will be able to render advanced VR content and AR overlays – because there’s less OS delay in getting them in front of users’ eyes.
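The arithmetic makes the stakes clear. In this sketch the 90Hz target and 60fps content rate come from above, and the overhead figures are purely illustrative:

```python
# Back-of-envelope frame budgets. The 90Hz target and 60fps content rate
# are from the article; the overhead figures are purely illustrative.
refresh_hz = 90
content_fps = 60

display_budget_ms = 1000 / refresh_hz   # ≈ 11.1ms to render each refresh
content_frame_ms = 1000 / content_fps   # ≈ 16.7ms per video frame

# Each video frame is shown for one or two refreshes, re-rendered every
# time for the viewer's latest head position. Milliseconds lost to
# 'simulate a monitor' overhead come straight out of the render budget.
for overhead_ms in (0.0, 2.0, 4.0):
    remaining = display_budget_ms - overhead_ms
    print(f"OS overhead {overhead_ms}ms → {remaining:.1f}ms left to render")
```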

The idea is for VR/AR applications to modify image data in a form that the OS automatically feeds to devices without simulating a monitor:

…methods and systems for transmitting monoscopic or stereoscopic 180 degree or 360 degree still or video images from a host editing or visual effects software program as equirectangular projection, or other spherical projection, to the input of a simultaneously running software program on the same device that can continuously acquire the orientation and position data from a wired or wirelessly connected head-mounted display’s orientation sensors, and simultaneously render a representative monoscopic or stereoscopic view of that orientation to the head mounted display, in real time.

For more on how HMD software must predict user actions in order to keep up with their movement, watch the 2017 Apple WWDC ‘VR with Metal 2’ session video. One guest speaker was Nat Brown of Valve Software, who talked about SteamVR on macOS High Sierra:

Our biggest request to Apple, a year ago, was for this Direct to Display feature, because it’s critical to ensure that the VR compositor has the fastest, time-predictable path to the headset display panels. We also really needed super-accurate, low-variance VBL (vertical blank) events, so that we could set the cadence of the VR frame presentation timing and predict those poses accurately.

VR production

Although the patent is about how all kinds of applications work with VR and 3D VR, it also mentions a mode where the production application UI appears in the device overlaid on the content being produced:

FIG. 5 illustrates the user interface of a video or image editing or graphics manipulation software program 501 with an equirectangularly projected spherical image displayed in the canvas 502 and a compositing or editing timeline 503. The image output of the video or image editing or graphics manipulation software program can be output via a video output processing software plugin module 504 and passed to a GPU image buffer shared memory and then passed efficiently to the image receiver 507 of the head-mounted display processing program 506. The 3D image processing routine 508 of the head-mounted display processing program will texture the inside of a virtual sphere or cube with a 3D viewpoint at the center of said sphere or cube. The virtual view for each of the left and right eyes will be accordingly cropped, duplicated (if necessary), distorted and oriented based on the lens/display specifications and received orientation data 509 of the wired or wirelessly connected head-mounted display’s 510 orientation sensor data. Once the prepared image is rendered by the 3D image processing routine, the image can then be passed to the connected head-mounted display 511 for immediate presentation to the wearer within the head-mounted display.

Additionally, since wearing a head-mounted display will obscure the wearer’s view of the UI of the video or image editing or graphics manipulation software program, it is also possible to capture the computer display’s user interface as an image using a screen image capture software program module 512 and pass it to an image receiver/processor 513 for cropping and scaling before being composited on the left and right eye renders from the 3D image processing routine 508, 514, 515 and then the composited image can be passed to the connected head-mounted display for immediate presentation to the wearer within the head-mounted display.

Further, a redundant view can be displayed in a window 516 on the computer’s display so others can see what the wearer of the head-mounted display is seeing, or if a head-mounted display is not available…
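The heart of that pipeline is the mapping from a head orientation to a position in an equirectangular frame. Here is a minimal sketch of that sampling step, with per-eye offsets and lens distortion omitted and an arbitrary frame size:

```python
# A minimal sketch of the sampling step at the heart of the pipeline the
# patent describes: mapping a head orientation to a pixel position in an
# equirectangular frame. Per-eye offsets and lens distortion are omitted.
import numpy as np

def sample_equirect(frame, yaw_deg, pitch_deg):
    """Return the pixel an eye pointed at (yaw, pitch) would see."""
    h, w = frame.shape[:2]
    u = (yaw_deg % 360.0) / 360.0      # longitude → 0..1 across the width
    v = (90.0 - pitch_deg) / 180.0     # latitude → 0 at zenith, 1 at nadir
    return frame[int(v * (h - 1)), int(u * (w - 1))]

frame = np.zeros((2160, 4320, 3), dtype=np.uint8)   # a 2:1 spherical frame
print(sample_equirect(frame, yaw_deg=45.0, pitch_deg=10.0))  # [0 0 0]
```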

Tim has been demonstrating many interesting 3D and VR production tool ideas over the years. It is good to see his inventions now have the support of Apple. I’m looking forward to the other ideas he brings to the world through Apple.

Adobe Premiere used on big new 10-part Netflix TV series

Wednesday, 13 September 2017

It was a tough ask for Adobe Premiere to tackle the needs of David Fincher’s ‘Gone Girl’ feature film in 2014. In recent months, it has been used on a bigger project: ‘Mindhunter’ – a 10-hour, David Fincher exec-produced high-end TV series soon to be available on Netflix.

Instead of a single team working on a two-hour film, TV series have multiple director-cinematographer-editor teams working in parallel. In this case the pilot was directed by David Fincher. The way TV works in the US is that the pilot director gets an executive producer credit for the whole series, because the decisions they make define the feel of the show from then on. Fincher brought along some of the team who worked on Gone Girl. While they worked on the pilot’s post production, other teams shot and edited later episodes in the series.

The fact that the production company and the studio were happy for the workflow to be based around Premiere Pro CC is a major step up for Adobe in Hollywood.

The high-end market Adobe is going for is too small to support profitable software development. Even if they sold a subscription to every professional editor in the USA, that would not be enough to pay for the costs of maintaining Adobe Premiere. Its use in high-end TV and features is a marketing message that Adobe must think contributes to people choosing to subscribe to the Adobe Creative Cloud – even if most subscribers will never edit a Hollywood film or TV show.

What about Final Cut Pro X?

Directors Glenn Ficarra and John Requa are happy to use Final Cut Pro X in studio features. They haven’t been able to use Final Cut in the TV shows they have directed. Glenn and John directed the pilot and three other episodes of ‘This is Us’ – a big success for NBC in the US last year. Although directors have much less power in TV than in features, pilot directors do have some power to set standards for the rest of the series. I don’t know why Final Cut wasn’t used on ‘This is Us.’ It could be a lack of enough collaboration features or a lack of enough Final Cut-experienced crew. It may take a while before both of these reasons no longer apply.

Although the 10.3 update for Final Cut Pro X was nearly all about features requested by people who work on high-end production, it seems the majority of the ProApps team’s time is spent on features for the majority of Final Cut users.

Is the use of Final Cut Pro X in a smattering of Hollywood productions enough to support Apple’s marketing message? Will Apple invest more in Final Cut’s use in Hollywood?

When it comes to the opinions of Hollywood insiders, it seems that Premiere is currently the only viable alternative to Avid Media Composer. Although the ProApps team is very likely to want Final Cut to be the choice people make at all levels of production, will they be able to get the investment they need from the rest of Apple to make that happen? We’ll see in the coming months and years.

IMF: Output any version you need from a single master

Tuesday, 12 September 2017

Interoperable Master Format is a system that allows you to specify all the versions of a feature film using a set of rules. Instead of rendering out every combination of language, aspect ratio, certification and distributor standard, you define how their rules apply to your movie. When a specific version is called for, it can then be rendered out automatically, based on the media and the specific timeline included in an IMP (Interoperable Master Package).
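A tiny sketch shows the combinatorial problem this avoids. The lists below are invented, but each combination would traditionally be a separate render:

```python
# A sketch of the combinatorial problem IMF addresses. The lists are
# invented, but with traditional deliverables every combination would be
# a separate render; with IMF, one package plus rules covers them all.
from itertools import product

languages = ["en", "fr", "de", "ja", "es"]
aspect_ratios = ["2.39:1", "16:9", "4:3"]
certifications = ["theatrical", "airline", "broadcast"]

versions = list(product(languages, aspect_ratios, certifications))
print(f"{len(versions)} possible deliverables from one master")  # 45
```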

Even if you don’t work in high-end features, it is worth learning about this because it is coming to TV and online delivery in 2018. For now IMF is for high-end tools, services and suppliers, but the nature of video production means that it will be the eventual standard most NLEs will support – maybe even directly, with few external tools.

This video – presented by Bruce Devlin (@mrMXF on Twitter) – is an introduction to IMF, and the first in a series, should you want to learn more:

(My YouTube playlist of videos on Interoperable Master Format, in order)

The nature of Final Cut Pro X makes it potentially the best NLE to work with IMF. Apple could add features to the timeline required to generate IMPs. Compressor could generate specific versions of a film or TV show based on an IMP.

If Apple considers this the kind of feature best left to third parties, I hope they add the required hooks to Final Cut so Frame.io (for example) could add IMF management to their Final Cut Pro X service.

Apple Goes to Hollywood: For more than just TV production

Friday, 01 September 2017

Apple have had offices in Los Angeles for many years. The number of Apple employees in the area rose significantly when the company bought Beats Music in 2014. Now it looks like there’ll be more to the LA operation than music.

The Financial Times reports [paywall link] that Apple are looking for more space in Culver City, Los Angeles County. The FT say that Apple is thinking of leasing space at The Culver Studios. Culver City isn’t exactly close to Hollywood, but from a production perspective, it counts as Hollywood: both Gone With the Wind and Citizen Kane were filmed at The Culver Studios.

The FT headline ‘Apple eyes iconic studio as base for Hollywood production push’ implies that they want space to make high-end TV and feature films – including bidding to produce a TV show for Netflix. Interesting that they suggest that Apple plan to make TV for others – instead of commissioning others to make TV for them. That would mean Apple investing in the hardware and infrastructure to make high-end TV directly.

Office space for…

However, the body of the article says that Apple is primarily looking for office space. It seems that the large amount of office space that Beats leases won’t be enough. It could be that Apple Music administration needs more people (The Culver Studios is only a 15-minute walk from Beats). On the other hand, what else could Apple be doing in LA?

They certainly need to hire enough new staff for their $1bn push into TV. They could be based in Los Angeles County.

Part of the Mac team seems to be based in Culver City. A recent vacancy listed on the Apple jobs site was for an expert to set up a post production workflow lab in Culver City. That is likely to be primarily about making sure the next iteration of the Mac Pro fits the future needs of Hollywood TV and film production:

Help shape the future of the Mac in the creative market. The Macintosh team is seeking talented technical leadership in a System Architecture team. This is an individual contributor role. The ideal candidate has core competencies in one or more professional artists content creation areas with specific expertise in video, and photo, audio, and 3D animation.

The pro workflow expert will be responsible for thoroughly comprehending all phases of professional content creation, working closely with 3rd party apps developers and some key customers, thoroughly documenting, and working with architects to instrument systems for performance analysis.

It seems that some of Apple’s ProApps team is based in Culver City too. Recent job openings for a Video Applications Graphics Engineering Intern and a Senior macOS/iOS Software Engineer for Video Applications are based there.

Also, if I was going to develop a VR and AR content business, it might be a good idea to create custom-designed studio resources for VR and AR content production. Los Angeles would be a good location to experiment with the future of VR and AR.
