Latest Posts

When Final Cut Pro X importing is not enough: A guide to rsync – free media copying tool
Sunday, November 19 2017

There is a point in the post-production workflow when your NLE’s importing function alone is not enough – when insurance companies want to know how you are verifying data transfers and where your redundant backups will be stored. Instead of investing in a dedicated media management application, Seth Goldin suggests a free alternative that ships with the OS:

As far as I can tell, rsync remains superior to pretty much every other professional application for media ingest, like Imagine Products ShotPut Pro, Red Giant Offload, DaVinci Resolve’s Clone Tool, Pomfort Silverstack, or CopyToN. Each of these applications is great in its own right, and they deliver what they promise, but they can be slow, expensive, and CPU-intensive. In contrast, rsync is fast, completely free of charge, and computationally lightweight.

The tradeoff is much more power in return for learning a command-line interface. Seth has written a post on Medium, ‘A gentle introduction to rsync, a free, powerful tool for media ingest,’ that explains rsync’s advantages, how to install it and how to use it. He includes an example of copying nine camera cards onto three hard drives in a way that uses the minimum amount of CPU power while making the most of each drive’s maximum speed.

Although you may not need to learn it today, it could be the right solution for a friend now – or for you soon.

Read more
9:16, 1:1, 1:2, 4:5… (Social) Media aspect ratios primer
Sunday, November 19 2017

Updated in March 2020

Many experienced filmmakers decry vertical and square video. The fact is, millions of people watch stories that way on their personal devices. Facebook is now not just ‘social media’ – it is ‘media.’ 20 years ago editors started to deal with aspect ratios other than 4:3. Here are the specifications from Facebook for the various aspect ratios their platforms work with:

Read more
Apple’s VR production patent by Tim Dashwood
Monday, October 30 2017

Within weeks of third-party Final Cut Pro X developer Tim Dashwood joining the ProApps team, Apple applied for a patent that changes the way computers connect to VR and AR head-mounted devices: ‘Method and System for 360 Degree Head-Mounted Display Monitoring Between Software Program Modules Using Video or Image Texture Sharing’ (PDF version).

It turns out that Tim is doing more for Apple than being part of adding VR video editing features to applications. His work is part of the way macOS works in all sorts of applications.

Direct to Display = Less OS overhead

Until now, head-mounted devices like the Oculus Rift and HTC Vive have connected as specialised displays. As far as macOS or Windows is concerned, an attached device is just another monitor – albeit one with an odd aspect ratio and frame rate.

The new method lets VR/AR tools connect to Apple devices in such a way that there is no longer a ‘simulate a monitor’ overhead. Apple is aiming for a 1/90th of a second refresh rate for VR and AR experiences. Even if you are viewing a VR video that plays at 60 frames a second, for smooth movement it is best if what the viewer sees updates 90 times a second, so that if they turn quickly, the content keeps up with them.

If macOS, iOS and tvOS spend less time simulating a monitor display, more of the 1/90th of a second between refreshes can be spent on rendering content. Less powerful GPUs will also be able to render advanced VR content and AR overlays – because there’s less OS delay in getting it in front of users’ eyes.
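The timing budget is tighter than it sounds. A quick bit of arithmetic (assuming the 90 Hz HMD target and 60 fps source video mentioned above) shows how little time there is per refresh – and why shaving OS overhead matters:

```shell
# Per-refresh time budget at 90 Hz vs the frame time of 60 fps video.
awk 'BEGIN {
  printf "HMD refresh budget: %.1f ms\n", 1000/90   # ~11.1 ms per refresh
  printf "60 fps frame time:  %.1f ms\n", 1000/60   # ~16.7 ms per frame
}'
```

Every millisecond the OS spends pretending to drive a monitor comes straight out of that ~11 ms rendering window.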

The idea is for VR/AR applications to modify image data in a form that the OS automatically feeds to devices without simulating a monitor:

…methods and systems for transmitting monoscopic or stereoscopic 180 degree or 360 degree still or video images from a host editing or visual effects software program as equirectangular projection, or other spherical projection, to the input of a simultaneously running software program on the same device that can continuously acquire the orientation and position data from a wired or wirelessly connected head-mounted display’s orientation sensors, and simultaneously render a representative monoscopic or stereoscopic view of that orientation to the head mounted display, in real time.

For more on how HMD software must predict user actions in order to keep up with their movement, watch the 2017 Apple WWDC ‘VR with Metal 2’ session video. One guest speaker was Nat Brown of Valve Software, who talked about SteamVR on macOS High Sierra:

Our biggest request to Apple, a year ago, was for this Direct to Display feature, because it’s critical to ensure that the VR compositor has the fastest time-predictable path to the headset display panels. We also really needed super-accurate, low-variance VBL – vertical blank – events, so that we could set the cadence of the VR frame presentation timing and predict those poses accurately.

VR production

Although the patent is about how all kinds of applications work with VR and 3D VR, it also mentions a mode where the production application UI appears in the device overlaid on the content being produced:

FIG. 5 illustrates the user interface of a video or image editing or graphics manipulation software program 501 with an equirectangularly projected spherical image displayed in the canvas 502 and a compositing or editing timeline 503. The image output of the video or image editing or graphics manipulation software program can be output via a video output processing software plugin module 504 and passed to a GPU image buffer shared memory and then passed efficiently to the image receiver 507 of the head-mounted display processing program 506. The 3D image processing routine 508 of the head-mounted display processing program will texture the inside of a virtual sphere or cube with a 3D viewpoint at the center of said sphere or cube. The virtual view for each of the left and right eyes will be accordingly cropped, duplicated (if necessary), distorted and oriented based on the lens/display specifications and received orientation data 509 of the wired or wirelessly connected head-mounted display’s 510 orientation sensor data. Once the prepared image is rendered by the 3D image processing routine, the image can then be passed to the connected head-mounted display 511 for immediate presentation to the wearer within the head-mounted display.

Additionally, since wearing a head-mounted display will obscure the wearer’s view of the UI of the video or image editing or graphics manipulation software program, it is also possible to capture the computer display’s user interface as an image using a screen image capture software program module 512 and pass it to an image receiver/processor 513 for cropping and scaling before being composited on the left and right eye renders from the 3D image processing routine 508, 514, 515, and then the composited image can be passed to the connected head-mounted display for immediate presentation to the wearer within the head-mounted display.

Further, a redundant view can be displayed in a window 516 on the computer’s display so others can see what the wearer of the head-mounted display is seeing, or to show the view if a head-mounted display is not available.

Tim has been demonstrating many interesting 3D and VR production tool ideas over the years. It is good to see his inventions now have the support of Apple. I’m looking forward to the other ideas he brings to the world through Apple.

Read more
Adobe Premiere used on big new 10-part Netflix TV series
Wednesday, September 13 2017

It was a tough ask for Adobe Premiere to tackle the needs of David Fincher’s ‘Gone Girl’ feature film in 2014. In recent months it has been used on a bigger project: ‘Mindhunter’ – a 10-hour, David Fincher executive-produced high-end TV series soon to be available on Netflix.

Instead of a single team working on a two-hour film, a TV series has multiple director-cinematographer-editor teams working in parallel. In this case the pilot was directed by David Fincher. The way TV works in the US, the pilot director gets an executive producer credit for the whole series because the decisions they make define the feel of the show from then on. Fincher brought along some of the team who worked on ‘Gone Girl.’ While they worked on the pilot’s post production, other teams shot and edited later episodes in the series.

The fact that the production company and the studio were happy for the workflow to be based around Premiere Pro CC is a major step up for Adobe in Hollywood.

The high-end market Adobe is going for is too small on its own to support profitable software development. Even if Adobe sold a subscription to every professional editor in the USA, that would not be enough to cover the costs of maintaining Adobe Premiere. Its use in high-end TV and features is a marketing message that Adobe must think contributes to people choosing to subscribe to Adobe Creative Cloud – even if most subscribers will never edit a Hollywood film or TV show.

What about Final Cut Pro X?

Directors Glenn Ficarra and John Requa are happy to use Final Cut Pro X in studio features. They haven’t been able to use Final Cut in the TV shows they have directed. Glenn and John directed the pilot and three other episodes of ‘This is Us’ – a big success for NBC in the US last year. Although directors have much less power in TV than in features, pilot directors do have some power to set standards for the rest of the series. I don’t know why Final Cut wasn’t used on ‘This is Us.’ It could be a lack of enough collaboration features or a lack of enough Final Cut-experienced crew. It may take a while before both of these reasons no longer apply.

Although the 10.3 update for Final Cut Pro X was nearly all about features requested by people who work on high-end production, it seems the majority of the ProApps team’s time is spent on features for the majority of Final Cut users.

Is the use of Final Cut Pro X in a smattering of Hollywood productions enough to support Apple’s marketing message? Will Apple invest more in Final Cut’s use in Hollywood?

When it comes to the opinions of Hollywood insiders, it seems that Premiere is currently the only viable alternative to Avid Media Composer. Although the ProApps team is very likely to want Final Cut to be the choice people make at all levels of production, will they be able to get the investment they need from the rest of Apple to make that happen? We’ll see in the coming months and years.

Read more
Apple Goes to Hollywood: For more than just TV production
Friday, September 1 2017

Apple have had offices in Los Angeles for many years. The number of Apple employees in the area rose significantly when the company bought Beats Music in 2014. Now it looks like there’ll be more to the LA operation than music.

The Financial Times reports [paywall link] that Apple are looking for more space in Culver City, Los Angeles County. The FT say that Apple is thinking of leasing space at The Culver Studios. Culver City isn’t exactly close to Hollywood, but from a production perspective, it counts as Hollywood: both Gone With the Wind and Citizen Kane were filmed at The Culver Studios.

The FT headline ‘Apple eyes iconic studio as base for Hollywood production push’ implies that they want space to make high-end TV and feature films – including bidding to produce a TV show for Netflix. Interesting that they suggest that Apple plan to make TV for others – instead of commissioning others to make TV for them. That would mean Apple investing in the hardware and infrastructure to make high-end TV directly.

Office space for…

However, the body of the article says that Apple is primarily looking for office space. It seems the large amount of office space that Beats lease won’t be enough. It could be that Apple Music administration needs more people (The Culver Studios is only a 15-minute walk from Beats). On the other hand, what else could Apple be doing in LA?

They certainly need to hire enough new staff for their $1bn push into TV. Those staff could be based in Los Angeles County.

Part of the Mac team seems to be based in Culver City. A recent vacancy listed on the Apple jobs site was for an expert to set up a post production workflow lab in Culver City. That is likely to be primarily about making sure the next iteration of the Mac Pro fits the future needs of Hollywood TV and film production:

Help shape the future of the Mac in the creative market. The Macintosh team is seeking talented technical leadership in a System Architecture team. This is an individual contributor role. The ideal candidate has core competencies in one or more professional artists content creation areas with specific expertise in video, and photo, audio, and 3D animation.

The pro workflow expert will be responsible for thoroughly comprehending all phases of professional content creation, working closely with 3rd party apps developers and some key customers, thoroughly documenting, and working with architects to instrument systems for performance analysis.

It seems that some of Apple’s ProApps team is based in Culver City too. Recent job openings for a Video Applications Graphics Engineering Intern and a Senior macOS/iOS Software Engineer for Video Applications are based there.

Also, if I were going to develop a VR and AR content business, it might be a good idea to create custom-designed studio resources for VR and AR content production. Los Angeles would be a good location to experiment with the future of VR and AR.

Read more
Adobe discontinues SpeedGrade, will live on as Adobe Premiere panel – More integration to come?
Wednesday, August 23 2017

Will all Adobe video applications end up as panels in Adobe Premiere? Adobe no longer sees the need for an application dedicated to the colour grading process. They have announced that they are discontinuing their SpeedGrade colour grading application:

Producing a separate application for color grading was born out of necessity some 35 years ago – it was never a desirable split from a creative perspective.

I don’t think audio post people would say the same about picture editing.

…the paradigm of consolidating toolsets for a specific task into a single panel has led to further innovation. The Essential Sound Panel and the new Essential Graphics panel are designed with the same goal in mind: streamlining professional and powerful workflows made for editors.

Maybe this is a sign that Blackmagic’s Resolve 12 and 14 updates are putting pressure on Adobe. Which other Adobe video applications do you think will end up as panels in Premiere?

Read more