Latest Posts

Apple spending $1bn on TV production – how much on Final Cut Pro X post?

Sunday, August 20 2017

Last week the Wall Street Journal reported that Apple will spend $1 billion on making its own TV content over the next year:

Combined with the company’s marketing clout and global reach, the step immediately makes Apple a considerable competitor in a crowded market, where both new and traditional media players are vying for original shows. Apple’s budget is about half of what Time Warner Inc.’s HBO spent on content last year, and on par with estimates of what Amazon.com Inc. spent in 2013, one year after it announced its move into original programming.

Apple could acquire and produce as many as 10 television shows, according to the people familiar with the plan, helping fulfill Apple Senior Vice President Eddy Cue’s vision of offering high-quality video, similar to shows such as HBO’s “Game of Thrones,” on its streaming-music service or possibly a new, video-focused service.

Given that post production costs on feature films and high-end TV usually amount to 1-3% of total budgets, that means around $10-30 million will be spent on picture editing, sound editing, compositing and mastering.

How much of that $10-30 million will be spent on Final Cut Pro X-based workflows?

How prescriptive will Apple be?

Judging from recent success in TV production, the trick that HBO, Netflix and Amazon have mastered is to be less hands-on with the creative people involved. Their policy is to invest in people with proven track records and not to manage them too closely.

That means Apple are unlikely to be insisting that each TV show is as precisely designed and produced as an iPhone. They will not insist on Apple products and services being at the core of story ideas, or even being placed clearly on screen. This kind of thing would be instantly counter-productive.

Although what goes into the writing and on screen is unlikely to be influenced by Apple, that restriction might not apply to the aspects of production that viewers won’t be able to judge. That means Apple could require that preproduction, production and post-production use a specific amount of Apple products, services and software.

Even Apple isn’t forced to use Apple

Today Apple doesn’t force all suppliers and staff to only use Apple products and services. Marketing vacancies at apple.jobs.com include requirements that people know how to use Adobe products for which there are Apple equivalents. Some Apple TV commercials are not edited using Final Cut Pro X, some motion graphics are not created in Motion 5, audio post production is not limited to Logic Pro X.

Apple sensibly want to be able to work with the best people and suppliers – and not be limited to those that only use Apple products. On the other hand, Apple’s hardware, software and services teams proceed on the basis that what they make should be the best tools for making high-end TV and feature films.

Train the talented in the tools you want them to use

There are two things Apple can do here: firstly, they need to improve their products to make them more suited for high-end production. Secondly, they could invest in the education aspects of the post ecosystem. Production companies who are required to use Final Cut Pro X to edit the next House of Cards or Stranger Things are likely to say: ‘There aren’t enough editors, assistant editors, apprentices, post production supervisors and VFX producers who know Final Cut Pro X and its ecosystem.’

Oversupply is a requirement

The catch is that although there are some people in Los Angeles and New York who could be employed in these roles, they don’t have enough experience in high-end TV. The sad thing about TV and film production is that you need a whole ecosystem of people and suppliers who know a specific post system: so you can have the security of knowing you can fire individuals and drop companies when you want. Production company management techniques expect that kind of control over costs. That’s why working in the VFX industry is so tough – those paying the bills know that there is enough oversupply to keep them in a very good negotiating position.

Specific demands of commissioners such as Amazon are already changing post workflows. They specify that shows they fund must be made using a 4K workflow. In practice almost no-one will benefit from more pixels being streamed to their TVs at home, but Amazon consider 4K an important marketing distinction – and production companies will change their workflow in order to be in line for some Amazon money.

Apple are unlikely to require a Final Cut Pro X workflow. They could encourage its use by allowing production companies to get double the usual money for post budgets if they use Final Cut. Dangling that carrot won’t make adoption possible any time soon. $1bn of TV production makes around 10 big (‘Fargo’/‘The Walking Dead’-sized) shows per year. Even if the post production of five or six of these shows is done in Los Angeles, the lack of a Final Cut Pro X people and supplier ecosystem means that I guess only two of those could be made simultaneously using a Final Cut-based workflow.

A few million on a big plan

As Apple is about to spend millions of dollars in Los Angeles with creative people, perhaps it is time for Apple to prime the Final Cut Pro X L.A. post production ecosystem: Train the trainers, plan the courses, do the marketing to post people, train experienced post people and generate case studies. Create the oversupply that makes producers feel like they have enough control.

There is time. TV show development takes months. While new Apple commissioning people make their plans and start working with talented people and production companies, there is time for other Apple people to set about preparing a much bigger ecosystem to support production and post production using Apple hardware, software and services.

‘Negative cost’ Final Cut Pro X training

What could Apple do with $1m in Los Angeles? Pay experienced post production people to attend Final Cut Pro X training: editors, assistant editors, VFX supervisors, VFX staffers, producers, writers, directors, reporters. Pay them their normal daily rates to be trained in what people in their roles need to know about Apple’s Pro Apps.

Who needs to be convinced?

The argument isn’t about ‘tracks vs. the magnetic timeline’ – it’s about money. All the talk of convincing post people to use Final Cut Pro X is a nice, kind way of doing things. The people who need convincing are those with the money. Post didn’t move to Avid 20 years ago because it was better than film. The money people were convinced by the economics of computer-based editing, and ordered the post people to make the change.

Sorting the supply of people and third-party services is the start of this. The next stage is gathering the evidence of how much money will be saved. Once that happens, improvements in the magnetic timeline or the Final Cut Pro X version of bin locking will be irrelevant. Once it can be shown that a switch to Final Cut Pro X makes post twice as cheap and twice as flexible as any other method, that’s when the switch will happen.

Time for Apple to start planning and make a big change in high-end post production.

The end of public advertising

Friday, August 11 2017

How much would you pay for all advertising to be removed from your view as you go about your daily life? All it needs is the ability to interpolate what advertising covers up, and replace all advertising with that.

This TechCrunch article buries the lede:

Facebook buys computer vision startup focused on adding objects to video

Adding objects to video isn’t as hard as removing them – and Facebook has bought a company that has that technology:

Facebook has acquired a German company called Fayteq that builds software add-ons for video editing that can remove and add whole objects from captured video using computer vision

My emphasis.

Facebook also have a company that makes glasses you can wear to run the software.

I would guess that Apple and others will also develop such software to run on their AR devices. Once a majority of people can no longer see shared public advertising, how long until it is no longer put up?

Apple Patent: Personalised Programming for Streaming Media Breaks

Tuesday, August 8 2017

Apple have been awarded a patent for ‘content pods’:

A content delivery system determines what personal content is available on the user device through connecting to available information sources. The delivery system then assembles the content pod from these elements in addition to invitational content from content providers. In some embodiments, a bumper message is included in the content pod to provide a context for the elements that are being assembled in combination with each other. Once the content pod is generated, it is sent to the user device to be played during content breaks within the online streaming playback.

The patent doesn’t specify whether this pod is made for breaks in video streaming – Apple TV – or audio – Apple Music. This means automatically generated audio and video content to pepper the ‘stream’ (or Facebook/Twitter/Instagram feed). Apple already creates animated video ‘Memories’ based on photos on iOS and macOS.

‘Pod’?

Interesting that Apple refers to these bundles of content as ‘pods.’ It seems that when they applied for this patent, they saw the value of the podcast brand. As people have had problems widening understanding of podcasts outside their niche, perhaps Apple were considering modifying the meaning of ‘pod’ to mean an integrated, customised programming bundle.

On the advent of Apple’s ‘iTunes Radio’ in 2013, I had some thoughts on what else automatically generated personalised media feeds might include:

Adding the visual to a media feed would make a playlist item an act of a TV show or feature film, a short film, a YouTube video or a family video. It would include content from broadcast TV (news and sport and drama premieres), purchased TV, feature films and content from streamed subscription services. If you wanted to jump into a TV series or Soap after the first episodes, recap content would be playlisted in advance of the show you want to start with.

Almost 10 years ago Apple got a patent for inserting advertising into a feed. Just because Apple has a patent, it doesn’t mean they will produce a product or service that relies on the patent.

Apple Motion Tip: Setting Precise Widget Snapshot Values

Tuesday, August 8 2017

In 2011, Apple updated Motion to version 5. Its biggest new feature is the ability to make plugins for Final Cut Pro X. Part of that feature is the ability to control multiple parameters in Motion with a single slider. You can set the start and end values for the slider. Each Motion parameter value you ‘rig’ to the slider ‘widget’ can be controlled by the slider.

In this case, a Slider widget has been set up to control the width of a line and the font size of some text in Motion. The range of the slider has been set to a minimum of 6 and a maximum of 100:

You can also make it so when the slider is set to a specific value, the rigged parameters can be set to values of your choice. These specific slider values are known as ‘snapshots.’ In the case above, I would like to make it so a value of 50 for the slider makes the width of the line 10 points and the size of the text 50 points. The problem is that you add snapshots to slider widgets by double-clicking below the slider. You can then drag the snapshot to the value you want it to be, but it is hard to set the value precisely. In this case, the closest I could get the snapshot to be was 50.03.

Before version 5.3, setting a precise value for a snapshot required doing some calculations, then opening the Motion file in a text editor and editing the source directly (setting the snapshot value to 0.531914894, i.e. 50/(100-6)). I wrote about this back in 2011.

Now all you need do is double-click the snapshot pin. A field appears that you can edit:

Sadly there is a small fault in that the value doesn’t take the start value of the slider into account. In this case it doesn’t show ‘50.03’ but ‘44.03.’ If you edit this value to be ‘44’…

…the snapshot value is set to ‘50’.

Much easier than before.
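
To summarise the arithmetic of both methods, here is a small Swift sketch using this example’s values. It assumes the behaviour described above: the old method stores a value computed from the slider range in the file source, and the new field ignores the slider’s start value.

    // Values from this example: slider range 6–100, desired snapshot at 50.
    let minimum = 6.0
    let maximum = 100.0
    let desired = 50.0

    // Before Motion 5.3: the value to write into the Motion file's source.
    let sourceValue = desired / (maximum - minimum)   // 0.531914894…

    // Motion 5.3 and later: the snapshot field ignores the slider's start
    // value, so type the desired value minus the minimum into the field.
    let fieldValue = desired - minimum                // 44

    print(sourceValue, fieldValue)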

New RED HYDROGEN ONE video shows non-working prototype

Thursday, August 3 2017

A video showing more of the HYDROGEN ONE phone due ‘next year’ from RED. No preview of the display or much explanation of RED’s new viewing format (aka video network). A look at the physical design and a hint of the modularity. Interesting.

Not sure why RED needs to make this a phone. How long would it take anyone to match Samsung or Apple phone quality – hardware and software integration-wise?

It would work just as well as a media slab that you take with you everywhere.

Apple and RED getting closer?

Unless, thanks to a deal with Apple (who is now exclusive distributor of the $15,000 RED RAVEN camera), the HYDROGEN is secretly an iPhone Pro extension device! That would be interesting.

Coming soon: Apple post production workflow lab in Culver City, Los Angeles

Monday, 31 July 2017

It’s good news and bad news for users of Apple’s high-end Macs. The good news: they are going to set up a ‘Pro Workflow’ lab. The bad news: they didn’t do this years ago!

Apple watchers suspect that up until earlier this year, the plan was that the introduction of the iMac Pro would mark the end of the 2013 Mac Pro – or any kind of standalone high-end Mac. The new plan is to make a Mac Pro replacement, which was described as coming eventually, but “will not ship this year.”

The news about a new Apple Pro Workflow Lab based in Culver City, Los Angeles County, comes from a new Apple job description for a ‘Pro Workflow Expert’ position.

They are looking for someone with experience combining multiple high-end post production applications, hardware and computers together to produce professional content:

  • A minimum of 7+ years of experience developing professional content using video and photo editing and/or 3D animation on Mac, PC and/or Linux systems.
  • Deep knowledge of one or more key professional content creation tools such as Final Cut Pro X, Adobe Creative Cloud tool suite, Flame, Maya, Mari, Pro Tools, Logic Pro, and other industry leading tools. An understanding of the entire workflow from generating and importing content, editing, effects, playback and distribution is required.
  • Basic knowledge of required 3rd party hardware such as cameras, control surfaces, I/O, display, storage, etc. for various workflows
  • Knowledge of relevant plug-ins typically used for various aspects of pro workflows

The Macintosh System Architecture team wants someone to

Set up a lab with necessary equipment for relevant workflows for instrumentation and analysis, including desktops, notebooks and iPads.

Work with key developers to thoroughly comprehend all aspects of various pro workflows, applications, plug-ins, add-on hardware, and infrastructures.

Ensure relevant workflows are understood and thoroughly documented and work with technical marketing to ensure definitions correspond to customer usage cases.

Identify performance and functional issues with each workflow and work with the architecture team for detailed micro-architecture analysis.

Good news for those who think that Apple is only ‘the iPhone company’ that will never take small niche markets like high-end production seriously.

6 years late?

It is a pity that this lab wasn’t set up during the development of the Mac Pro in the years before its 2013 launch. At least the new lab will include ‘desktops, notebooks and iPads’ – that implies not just Mac Pros, iMacs and MacBook Pros but PCs and mobile PCs.

Have the relevant experience and want to work with cool Apple folk in Culver City? Real high-end post production skills? Apply for the job today!


Animation tools not ready for satire prime time

Thursday, 27 July 2017

If you had an almost unlimited budget, could you produce a rich feed of animated satirical videos with a one-day turnaround?

For now the answer seems to be no.

When HBO announced a deal with American satirist Jon Stewart in 2015, one of the shows mentioned was an animated comedy show. The Hollywood Reporter has reported that in May this year, HBO said that they would not be going ahead with the idea:

Stewart was set to work with cloud-graphics company OTOY to develop new technology that would allow him to produce timely shortform digital content. “Appearing on television 22 minutes a night clearly broke me,” Stewart said at the time. “I’m pretty sure I can produce a few minutes of content every now and again.”

The idea had been for the material to be refreshed on HBO’s digital platforms, including HBO Now and HBO Go, multiple times throughout the day. But sources say it was the one-day turnaround that became a sticking point for the project. From a technological standpoint, it became clear to those involved that it would be next to impossible to create and distribute the sophisticated animation within the short window. The project had already been delayed due to its technological complexity. At one point, the digital shorts were expected to debut ahead of the presidential election, so as to provide commentary on the campaigns — but when challenges arose, HBO reportedly told Stewart he could have as much time as he needed to get it right.

HBO executive Casey Bloys explained that there was another reason they couldn’t get the turnaround time down to one day:

“Once Jon realized that he could get close on the animation, what he realized also was in terms of the quality control and in terms of the writing, when you’re putting something out a couple of times a day, the quality control still has to be here,” Bloys said. “It just got to a point where it was like, is this worth his time? Is this worth our time? We kind of thought, ‘You know what? It was a good try, but ultimately not worth it.’”

So making sure the writing remained high quality was a problem, but the technology also isn’t ready. I wonder what tools and UI will be able to hit this kind of target.


BBC evaluating iOS ARKit for mobile apps

Tuesday, 25 July 2017

The BBC is looking for an AR agency to collaborate in developing a mobile application to augment the launch of a big new TV series in Spring 2018:

We are looking to create an Augmented Reality (AR) experience in association with the Civilisations television broadcast (Spring 2018). We’d like a mobile application that allows the audience to bring objects from the programme into their homes – we’d like to do this using through-camera, markerless tracking to let the audience inspect and experience the objects in the comfort of their own living room. In addition to the objects themselves we’d like to provide the audience with exclusive audio and text content around these objects.

  • We’d like to use this as an opportunity to evaluate the state-of-play of solutions like Apple’s ARKit, Vuforia and OpenCV.
  • A system whereby approved content can be categorised or curated based on episodic content, overarching themes, geographic location or personal interest.
  • A system whereby institutions can integrate and publish new content to the app.

We want both the App and the content framework to be built with extensibility in mind – we want the ability to add different types of content in the future, or use the framework to power a geolocated app.

The TV show is on a similar scale to programmes such as “Life on Earth” and “Planet Earth.” It aims to cover the history of art, architecture and philosophy:

BBC Civilisations is a re-imagining of the landmark history series Civilisation. Where the original series focussed on the influence of Western art, this new series will expand to include civilisations from Asia to the Americas, Africa as well as Europe. The series will be complemented by other programming across the BBC’s platforms and will include innovative digital content which will be made working in collaboration with the UK’s museum sector.

Imagine what kind of objects and environments could be overlaid into people’s lives using AR. The link to museums would allow AR content to be unlocked on visits to specific locations in the UK.

ARKit is a new framework coming in iOS 11 later this year. It allows recent iPhones and iPads to overlay 3D content on the live camera view so that rendered objects and environments align with real spaces.
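
For a sense of how little code the basic overlay requires, here is a minimal ARKit sketch; the cube is a placeholder where a real Civilisations app would load a scanned museum object:

    import ARKit
    import SceneKit
    import UIKit

    // Minimal sketch: markerless world tracking placing a 3D object into
    // the live camera view.
    class ARObjectViewController: UIViewController {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)

            // World tracking keeps rendered content aligned with real space.
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)

            // Placeholder: a 10cm cube half a metre in front of the camera.
            let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                                length: 0.1, chamferRadius: 0))
            cube.position = SCNVector3(0, 0, -0.5)
            sceneView.scene.rootNode.addChildNode(cube)
        }
    }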

If you are part of an agency that might be able to deliver this kind of application and associated content management system, apply now – the deadline is tomorrow evening.

New Apple job = Machine learning coming to Final Cut Pro X?

Monday, 24 July 2017

Computer vision and machine learning coming to Apple ProApps? On Friday Apple added a new opening to their jobs website:

Video Applications Senior Software Engineer

Combining the latest Mac OS with Apple quality UI design, the editing team builds the innovative next generation timeline experience in Final Cut Pro X.

The job requirements have some interesting clauses:

Work on the architecture that provides the canvas for telling a story in video and incorporate cutting edge technology to build future versions of the product.

What cutting edge technology could they be thinking of here?

Experience developing computer vision and/or audio processing algorithms

Do you have experience applying machine learning solutions with video and audio data in a product?

Seems like object and pattern recognition will be useful, perhaps for automatic keywording and point, plane and object tracking. This is to be expected as smartphones can do real-time face and object tracking in social media apps today.

At WWDC 2017 in June, Apple announced that they will add object tracking to iOS and macOS later this year (link to video of tracking demo). Here’s an excerpt from the video of the session:

Another new technology, brand-new in the Vision framework this year is object tracking. You can use this to track a face if you’ve detected a face. You can use that face rectangle as an initial condition to the tracking and then the Vision framework will track that square throughout the rest of your video. Will also track rectangles and you can also define the initial condition yourself. So that’s what I mean by general templates, if you decide to for example, put a square around this wakeboarder as I have, you can then go ahead and track that.
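
As a minimal sketch (not Apple’s sample code), here is roughly how that looks with the Vision framework’s tracking API in iOS 11; frame delivery from AVFoundation is assumed to happen elsewhere:

    import Vision
    import CoreGraphics
    import CoreVideo

    // Minimal sketch: track a user-defined rectangle (the wakeboarder in
    // the session's example) from frame to frame.
    final class RectangleTracker {
        private let handler = VNSequenceRequestHandler()
        private var lastObservation: VNDetectedObjectObservation

        // initialBox uses Vision's normalised coordinates (origin bottom-left).
        init(initialBox: CGRect) {
            lastObservation = VNDetectedObjectObservation(boundingBox: initialBox)
        }

        // Call once per frame; returns the tracked box for that frame.
        func track(in pixelBuffer: CVPixelBuffer) throws -> CGRect {
            let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
            request.trackingLevel = .accurate
            try handler.perform([request], on: pixelBuffer)
            if let observation = request.results?.first as? VNDetectedObjectObservation {
                lastObservation = observation
            }
            return lastObservation.boundingBox
        }
    }

Seeding the first observation from a VNDetectFaceRectanglesRequest result gives the face-tracking variant the session describes.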

They also talked about applying machine learning models to content recognition:

Perhaps for example, you want to create a wedding application where you’re able to detect this part of the wedding is the reception, this part of the wedding is where the bride is walking down the aisle. If you want to train your own model and you have the data to train your own model you can do that.
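
A minimal sketch of how applying such a trained model might look, assuming a hypothetical bundled Core ML model called WeddingSceneClassifier:

    import Vision
    import CoreML
    import CoreVideo

    // Sketch: classify a video frame with a custom-trained Core ML model.
    // WeddingSceneClassifier is a hypothetical model you would train and
    // bundle yourself; it is not part of any Apple SDK.
    func classifyScene(in pixelBuffer: CVPixelBuffer) throws -> String? {
        let model = try VNCoreMLModel(for: WeddingSceneClassifier().model)
        let request = VNCoreMLRequest(model: model)
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try handler.perform([request])
        // Highest-confidence label, e.g. 'reception' or 'walking down the aisle'.
        return (request.results?.first as? VNClassificationObservation)?.identifier
    }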

Machine learning and audio?

Interesting that they are planning to recognise aspects of audio as well – keywording is straightforward. What could machine learning automatically determine about captured audio? This could be the beginning of automatic audio mixing to produce rudimentary placeholders before audio professionals take over.

Recently there have been academic experiments investigating automatic picture editing (for example the Stanford/Adobe research I wrote about in May). I wonder when similar experiments will investigate sound editing and mixing?

Not just Final Cut Pro X

Although people are expecting machine learning to be applied to video and audio in Final Cut Pro X, remember that iMovie is the same application with the consumer-friendly UI turned on. What works in Final Cut Pro X can also be introduced to a wider market in iMovie for macOS and iOS, and in Clips for iOS.

Logic Pro X 10.3.2 update sees Final Cut Pro X interchange improvements

Wednesday, 19 July 2017

It looks like the Logic Pro team are spending time making it work better with Final Cut Pro X. Logic Pro X was updated to version 10.3.2 yesterday. In the extensive list of new features and bug fixes, here are the points related to Final Cut:

  • When importing Final Cut Pro XML projects containing multichannel audio files, Logic Pro now reliably maintains channel assignments.
  • Large Final Cut Pro XML files now load more quickly.
  • Final Cut Pro X XML files now reliably import with the correct Sub-Role Names.
  • Logic Pro now creates a Summing Stack for each Parent Role when importing FCPX XML files.

I don’t use Final Cut to Logic workflows, so I can’t say how reliable Logic is when interpreting Final Cut XML. It seems that the Logic team are more like the Adobe Premiere team when it comes to implementing features: don’t wait until a feature is perfect, get it in, then make it better based on user feedback.

If you have bought Logic Pro X, the update is available from the Mac App Store.

VR jobs at Apple: July 2017

Monday, 17 July 2017

There are a number of positions available at Apple in July 2017 whose job descriptions mention VR.

VR hardware

IMG CoreMedia VR Pipeline Engineer

The Interactive Media Group (IMG) provides the media and graphics foundation across all of Apple’s innovative products, including iPhone, AppleTV, Apple Watch, iPad, iPod, Macs as well as professional and consumer applications from Final Cut to iTunes and iWork.

  • Strong coding skills in C with ARM on embedded platforms
  • 2+ years experience developing and debugging large software systems
  • Direct experience with implementing and/or designing VR or 360 video playback systems

The role requires the ability to help design, build and troubleshoot media services for playback and export.

Spatial Audio Software Engineer

  • Key Advantage: Experience with audio software subsystems including DAWs, Game Audio Engines including Unreal, Unity and/or audio middleware for game and AR/VR applications.
  • Experience with Spatial audio formats (Atmos, HOA etc) is desirable.
  • Experience with Metal and general working of GPU systems.
  • Experience with SIMD, writing highly optimized audio algorithms

What would be in a VR file format?

IMG – CoreMedia VR File Format Engineer

  • Proven experience with Audio/Video components of a media software system
  • Direct experience with implementing and/or designing media file formats
  • Experience with VR and 360 video

Interesting that Apple feel the need for a VR file format. I wonder what will make Apple’s VR file format stand out? It will probably be able to be recorded/encoded on iOS and macOS. I wonder if it will also work on tvOS and watchOS. If it doesn’t work on non-Apple hardware, it could be part of an Apple plan for technological lock-in.

VR marketing

Creative Technologist

As a member of Apple’s Interactive Team, the Creative Technologist is responsible for driving innovation that enhances and enlivens the marketing of Apple’s products and services. This role requires collaboration with the design, UX, motion graphics, film/video, 3D, and development groups across Apple’s Marcom group.

  • Developing interactive prototypes in order to conceptualize and develop innovative approaches for Apple marketing initiatives.
  • Experience with Adobe Creative Suite.
  • Experience with Unity/Unreal and AR/VR development is a plus.
  • Motion graphics and 3D software (AfterEffects, Maya) skills.

It’s a pity Apple Marketing doesn’t require knowledge of Apple’s motion graphics application.

Route-to-Market Mgr, WW In-Store Channel Digital Experience

The Route-to-Market Manager, WW In-Store Channel Digital Experience, is responsible for driving and executing all digital marketing communications as related to in-store Apple-led branded product presentation and campaigns.

  • Detailed knowledge of digital experience technologies – including but not limited to, on-device engagement tactics, digital content development, app development, beaconing, AR/VR, etc.

Using VR to make Apple products

Also a job requirement shows that Apple are using VR simulations to design power systems:

Senior Electromagnetics Analyst

  • Engineer will also need to do fundamental analyses and run SPICE simulations for VR conversion.

SPICE is a system that takes circuit designs and simulates specific results given specific inputs. 30 years ago it was a command-line based UNIX tool. Now Apple engineers are using VR to look around inside their hardware designs.

If you choose to apply for any of these jobs, good luck. Tell them Alex Gollner sent you!

Apple Pro Apps and macOS High Sierra compatibility

Friday, 14 July 2017

What versions of Final Cut Pro X are compatible with macOS High Sierra?

During Apple’s 2017 Worldwide Developer Conference, macOS High Sierra was announced. Apple has a public beta test programme, where you can sign up to try early versions of Apple operating systems before they are released.

macOS High Sierra is supposed to be a version of the Mac operating system that consolidates previous features and improves stability. This gives Apple and third-party developers the chance to catch their breath for a year. They can concentrate on reliability and stable improvement.

The question for Final Cut Pro X, Motion 5, Compressor and Logic Pro X users is whether to update their Macs to High Sierra.

Apple says that those using older versions of these applications will need to update to the following if they want to use macOS High Sierra:

  • Final Cut Pro X 10.3.4 or later
  • Motion 5.3.2 or later
  • Compressor 4.3.2 or later
  • Logic Pro X 10.3.1 or later
  • MainStage 3.3 or later

If you still use Final Cut Pro 7 – or any other applications in the Final Cut Studio suite (including DVD Studio Pro and Soundtrack Pro) – or need to use them once in a while to open older projects, don’t update all your Macs to macOS High Sierra:

Previous versions of these applications, including all apps in Final Cut Studio and Logic Studio, are not supported in macOS High Sierra.

It is notable that the ProApps team are pushing users forward this way. It will be interesting to see whether new application features and bug fixes require newer versions of macOS than in previous transitions.

Final Cut Pro 7 was last updated in September 2010. It is impressive that it still runs on Macs being released in 2017.

If you have more than one Mac, perhaps it is worth keeping one on macOS Sierra for the foreseeable future. When the next major version of Final Cut appears, it is likely it will work on Sierra. If you don’t have more than one Mac, prepare a clone of your most reliable macOS Sierra startup drive for future use when you need to revisit old projects.

Investigate HEVC/H.265 encoding using free chapter from Jan Ozer FFmpeg book

Wednesday, 28 June 2017

Apple have decided to standardise on HEVC/H.265 video encoding in macOS High Sierra and iOS 11. Jan Ozer has written a book about how to encode video using the free FFmpeg encoding system.

He has made the chapter on HEVC encoding from the book free to download:

Below you can download a sample chapter of my new book, Learn to Produce Video with FFmpeg in 30 Minutes or Less. It’s Chapter 12: Encoding HEVC.

If you have already installed FFmpeg (which includes the libx265 encoder), visit Jan’s site to download the chapter and do some experiments. Check your results using the free VLC player.
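
For a first experiment, something like the following should work – a sketch that wraps one plausible libx265 invocation in Swift’s Process API. The file names, CRF value and preset are my illustrative assumptions, not settings from Jan’s chapter:

    import Foundation

    // Sketch: one plausible libx265 invocation, run via Swift's Process API.
    let ffmpeg = Process()
    ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
    ffmpeg.arguments = [
        "-i", "input.mov",        // hypothetical source file
        "-c:v", "libx265",        // the HEVC/H.265 encoder bundled with FFmpeg
        "-crf", "28",             // constant rate factor: lower = higher quality
        "-preset", "medium",      // speed/efficiency trade-off
        "-tag:v", "hvc1",         // tag the stream so QuickTime recognises it
        "-c:a", "aac",            // re-encode audio to AAC
        "output.mp4"
    ]
    try ffmpeg.run()
    ffmpeg.waitUntilExit()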

PS: Although he doesn’t cover HDR in this free chapter, investigate the x265 documentation on the subject.

Apple’s HEVC choice: Codec battle 2018?

Wednesday, 21 June 2017

What does Apple’s choice of HEVC (H.265) mean for developers, users, viewers and streamers? Jan Ozer writes that it will take a year or so to find out. His predictions include:

No major publishers implement HEVC/HLS support before 3-6 months after iOS 11/MacOS Sierra ship. This leaves the door open for a full codec analysis between AV1 and HEVC, including encode and decode requirements, hardware support, cost, IP risk, HDR support, software support, the whole nine yards. At least in the US and Europe, one of these codecs will be codec next.

Marketing hype is global, codecs are local. Premium content distributors around the world will choose the best codec for their markets. In second and third world markets, iPhones play a very small role, and there will be plenty of low-cost Android phones, and perhaps even tablets and computers, without HEVC hardware support. In these environments, VP9/AV1 or another codec (PERSEUS?) might be best.

Frame.io Enterprise – online team edit reviews for enterprises

Tuesday, 20 June 2017

Today Frame.io announced that their online video production team collaboration system now has features that are useful for larger organisations:

Enterprise offers everything large companies need to manage their creative process at scale. Admins can organize teams by department, brand, production or whatever best suits your company structure.

With this organization teams can work in their own workspaces much like they do with Frame.io today. Admins can control team access and visibility and manage thresholds for team size and resource allocations all from a single platform.

Interesting news for Final Cut Pro X users who need to share edits and notes with other team members online.

Frame.io is an edit review system. Editors can share edits and rushes with others online.

Non-editors review edits in a web browser and can access media used in the edit and selected unused media. They can review edits and make notes at specific times in the edit. They can also make drawings that other team members can see. Useful when planning new shots or briefing changes that need to be made using VFX. Team members can even compare edits with side-by-side version control.

Editors can then import these notes as markers with comments so they can see the exact point in the edit the note is associated with.

Media companies are the beginning

Interesting that Frame.io chose the ‘Enterprise’ suffix for this new service. The announcement may say that Vice, Turner Broadcasting Systems and BuzzFeed are already using Frame.io Enterprise, but media companies should be the tip of the video collaboration iceberg. The very features described in the press release seem more suited to non-media companies and organisations.

Although desktop video has been around for over 20 years, it hasn’t yet properly broken into the world of work as a peer to the report (word processing), the financial document (spreadsheets) and the presentation (presentation software). Microsoft and Adobe never got video production – or at least editing – into most offices. Now that everyone has a video camera in their pocket, it is time for someone to make this happen. Online or network collaboration will help.

Trojan Horse for Final Cut Pro X

At this point the Final Cut Pro X angle becomes relevant. Although Frame.io integrates very well into the Adobe Premiere and Adobe After Effects user interfaces, those applications aren’t big-business friendly. Due to their history, their metaphors are for editors and motion graphics designers. The very multiplicity of windows, panels and preferences is the kind of feature that experienced editors and animators like. These look pretty threatening to people with other jobs. Final Cut Pro X is the application that can be used by people who need to get an edit done, or make last-minute changes based on some notes entered into Frame.io by the CEO on her iPhone.

The question for the Final Cut ecosystem is whether a future version of X will allow the kind of third-party integration that makes the notes review process for frame.io in Adobe Premiere so much better than it is in Final Cut Pro X.

HDR production: Five concepts, 10 principles

Tuesday, 20 June 2017

It is likely that the next major versions of common NLEs will support HDR. As editors we will be asked about the right HDR workflow. For now it is a matter of picking a standard, following some guidelines and maintaining metadata.

Jan Ozer writes:

HDR sounds complex, and at a technical level it is. Abstractly, however, it involves just five simple concepts.

First, to acquire the expanded brightness and color palette needed for HDR display, you have to capture and maintain your video in 10-bit or higher formats. Second, you’ll need to color grade your video to fully use the expanded palette. Third, you’ll have to choose and support one or more HDR technologies to reach the broadest number of viewers. Fourth, for several of these technologies, you’ll need to manage color and other metadata through the production workflow to optimize display on your endpoints. Finally, although you’ll be using the same codecs and adaptive bitrate (ABR) formats as before, you’ll have to change a few encoding settings to ensure compatibility with your selected HDR TVs and other devices.
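
To make the fourth and fifth concepts concrete, here is a hedged sketch of an HDR10 encode with FFmpeg/x265, again via Swift’s Process API. The mastering-display and MaxCLL values are illustrative placeholders in the format the x265 documentation uses, not grading numbers from Jan’s guide:

    import Foundation

    // Sketch: an HDR10 encode, signalling BT.2020 primaries, the PQ (SMPTE
    // 2084) transfer function and mastering metadata via x265's parameters.
    // All numeric values are illustrative placeholders.
    let ffmpeg = Process()
    ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
    ffmpeg.arguments = [
        "-i", "graded_10bit_master.mov",   // hypothetical 10-bit graded source
        "-c:v", "libx265",
        "-pix_fmt", "yuv420p10le",         // keep 10-bit depth through the encode
        "-x265-params",
        "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:" +
        "master-display=G(13250,34500)B(7500,3000)R(34000,16000)" +
        "WP(15635,16450)L(10000000,1):max-cll=1000,400",
        "output_hdr10.mp4"
    ]
    try ffmpeg.run()
    ffmpeg.waitUntilExit()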

Jan is a great commentator on streaming technologies; read his HDR production workflow guide at StreamingMedia.com.
