New RED HYDROGEN ONE video shows non-working prototype

A new video shows more of the HYDROGEN ONE phone due ‘next year’ from RED. There’s no preview of the display and not much explanation of RED’s new viewing format (aka video network), but we do get a look at the physical design and a hint of the modularity. Interesting.

Not sure why RED needs to make this a phone. How long would it take anyone to build a phone that matches Samsung or Apple quality in hardware and software integration?

It would work just as well as a media slab that you take with you everywhere.

Apple and RED getting closer?

Unless, thanks to a deal with Apple (now the exclusive distributor of the $15,000 RED RAVEN camera), the HYDROGEN is secretly an iPhone Pro extension device! That would be interesting.

Coming soon: Apple post production workflow lab in Culver City, Los Angeles

Monday, 31 July 2017

It’s good news and bad news for users of Apple’s high-end Macs. The good news: they are going to set up a ‘Pro Workflow’ lab. The bad news: they didn’t do this years ago!

Apple watchers suspect that up until earlier this year, the plan was that the introduction of the iMac Pro would mark the end of the 2013 Mac Pro – or any kind of standalone high-end Mac. The new plan is to make a Mac Pro replacement, which was described as coming eventually, but “will not ship this year.”

The news about a new Apple Pro Workflow Lab based in Culver City, Los Angeles County, comes from a new Apple job description for a ‘Pro Workflow Expert’ position.

They are looking for someone with experience combining multiple high-end post production applications, hardware and computers to produce professional content:

  • A minimum of 7+ years of experience developing professional content using video and photo editing and/or 3D animation on Mac, PC and/or Linux systems.
  • Deep knowledge of one or more key professional content creation tools such as Final Cut Pro X, Adobe Creative Cloud tool suite, Flame, Maya, Mari, Pro Tools, Logic Pro, and other industry leading tools. An understanding of the entire workflow from generating and importing content, editing, effects, playback and distribution is required.
  • Basic knowledge of required 3rd party hardware such as cameras, control surfaces, I/O, display, storage, etc. for various workflows
  • Knowledge of relevant plug-ins typically used for various aspects of pro workflows

The Macintosh System Architecture team wants someone to

Set up a lab with necessary equipment for relevant workflows for instrumentation and analysis, including desktops, notebooks and iPads.

Work with key developers to thoroughly comprehend all aspects of various pro workflows, applications, plug-ins, add-on hardware, and infrastructures.

Ensure relevant workflows are understood and thoroughly documented and work with technical marketing to ensure definitions correspond to customer usage cases.

Identify performance and functional issues with each workflow and work with the architecture team on detailed micro-architecture analysis.

Good news for those who fear that Apple is only ‘the iPhone company’ and will never take niche markets like high-end production seriously.

6 years late?

It is a pity that this lab wasn’t set up during the development of the Mac Pro in the years before its 2013 launch. At least the new lab will include ‘desktops, notebooks and iPads’ – and given that the job listing asks for Mac, PC and/or Linux experience, that implies not just Mac Pros, iMacs and MacBook Pros but PCs too.

Have the relevant experience and want to work with cool Apple folk in Culver City? Real high-end post production skills? Apply for the job today!

 

Animation tools not ready for satire prime time

Thursday, 27 July 2017

If you had an almost unlimited budget, could you produce a rich feed of animated satirical videos with a one-day turnaround?

For now the answer seems to be no.

When HBO announced a deal with American satirist Jon Stewart in 2015, one of the projects mentioned was an animated comedy show. The Hollywood Reporter has reported that in May this year, HBO said they would not be going ahead with the idea:

Stewart was set to work with cloud-graphics company OTOY to develop new technology that would allow him to produce timely shortform digital content. “Appearing on television 22 minutes a night clearly broke me,” Stewart said at the time. “I’m pretty sure I can produce a few minutes of content every now and again.”

The idea had been for the material to be refreshed on HBO’s digital platforms, including HBO Now and HBO Go, multiple times throughout the day. But sources say it was the one-day turnaround that became a sticking point for the project. From a technological standpoint, it became clear to those involved that it would be next to impossible to create and distribute the sophisticated animation within the short window. The project had already been delayed due to its technological complexity. At one point, the digital shorts were expected to debut ahead of the presidential election, so as to provide commentary on the campaigns — but when challenges arose, HBO reportedly told Stewart he could have as much time as he needed to get it right.

HBO executive Casey Bloys explained that there was another reason they couldn’t get the turnaround time down to one day:

“Once Jon realized that he could get close on the animation, what he realized also was in terms of the quality control and in terms of the writing, when you’re putting something out a couple of times a day, the quality control still has to be here,” Bloys said. “It just got to a point where it was like, is this worth his time? Is this worth our time? We kind of thought, ‘You know what? It was a good try, but ultimately not worth it.’”

So keeping the writing at high quality was a problem, but the technology also wasn’t ready. I wonder what tools and UI will eventually be able to hit this kind of target.

 

 

BBC evaluating iOS ARKit for mobile apps

Tuesday, 25 July 2017

The BBC is looking for an AR agency to collaborate in developing a mobile application to augment the launch of a big new TV series in Spring 2018:

We are looking to create an Augmented Reality (AR) experience in association with the Civilisations television broadcast (Spring 2018). We’d like a mobile application that allows the audience to bring objects from the programme into their homes – we’d like to do this using through-camera, markerless tracking to let the audience inspect and experience the objects in the comfort of their own living room. In addition to the objects themselves we’d like to provide the audience with exclusive audio and text content around these objects.

  • We’d like to use this as an opportunity to evaluate the state-of-play of solutions like Apple’s ARKit, Vuforia and OpenCV.
  • A system whereby approved content can be categorised or curated based on episodic content, overarching themes, geographic location or personal interest.
  • A system whereby institutions can integrate and publish new content to the app.

We want both the App and the content framework to be built with extensibility in mind – we want the ability to add different types of content in the future, or use the framework to power a geolocated app.

The TV show is on a similar scale to programmes such as “Life on Earth” and “Planet Earth.” It aims to cover the history of art, architecture and philosophy:

BBC Civilisations is a re-imagining of the landmark history series Civilisation. Where the original series focussed on the influence of Western art, this new series will expand to include civilisations from Asia to the Americas, Africa as well as Europe. The series will be complemented by other programming across the BBC’s platforms and will include innovative digital content which will be made working in collaboration with the UK’s museum sector.

Imagine what kind of objects and environments could be overlaid into people’s lives using AR. The link to museums would allow AR content to be unlocked on visits to specific locations in the UK.

ARKit is a new framework coming in iOS 11 later this year. It allows recent iPhones and iPads to overlay 3D content on the live camera view so that rendered objects and environments align with real spaces.
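To give a sense of how little code a basic ARKit scene needs, here is a minimal Swift sketch – assuming a view controller with an ARSCNView already laid out, and a placeholder cube standing in for a museum object:

```swift
import UIKit
import ARKit
import SceneKit

class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking keeps virtual content aligned with real space as the
        // device moves - the 'markerless' tracking the BBC brief asks for.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    func placeObject() {
        // A stand-in for a museum object: a 10cm cube placed half a metre
        // in front of the camera's starting position.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```

In a real app the cube would be replaced by a scanned 3D model of the object, with the audio and text content the brief mentions attached to it.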

If you are part of an agency that might be able to deliver this kind of application and associated content management system, apply now – the deadline is tomorrow evening.

New Apple job = Machine learning coming to Final Cut Pro X?

Monday, 24 July 2017

Computer vision and machine learning coming to Apple ProApps? On Friday Apple added a new opening to their jobs website:

Video Applications Senior Software Engineer

Combining the latest Mac OS with Apple quality UI design, the editing team builds the innovative next generation timeline experience in Final Cut Pro X.

The job requirements have some interesting clauses:

Work on the architecture that provides the canvas for telling a story in video and incorporate cutting edge technology to build future versions of the product.

What cutting edge technology could they be thinking of here?

Experience developing computer vision and/or audio processing algorithms

Do you have experience applying machine learning solutions with video and audio data in a product?

Seems like object and pattern recognition will be useful, perhaps for automatic keywording and point, plane and object tracking. This is to be expected as smartphones can do real-time face and object tracking in social media apps today.

At WWDC in June 2017, Apple announced that they will add object tracking to iOS and macOS later this year (link to video of tracking demo). Here’s an excerpt from the video of the session:

Another new technology, brand-new in the Vision framework this year, is object tracking. You can use this to track a face if you’ve detected a face. You can use that face rectangle as an initial condition for the tracking, and then the Vision framework will track that square throughout the rest of your video. It will also track rectangles, and you can define the initial condition yourself. That’s what I mean by general templates: if you decide, for example, to put a square around this wakeboarder as I have, you can then go ahead and track that.
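Here is roughly what that detect-then-track pattern looks like in code – a minimal Swift sketch of the Vision APIs, assuming frames arrive as CVPixelBuffers from a capture or asset-reader session:

```swift
import Vision
import CoreVideo

final class FaceTracker {
    private let sequenceHandler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation?

    // Call once with the first frame to find the initial face rectangle.
    func start(on firstFrame: CVPixelBuffer) throws {
        let detect = VNDetectFaceRectanglesRequest()
        try VNImageRequestHandler(cvPixelBuffer: firstFrame, options: [:])
            .perform([detect])
        if let face = detect.results?.first as? VNFaceObservation {
            // Use the detected face rectangle as the initial condition.
            lastObservation = VNDetectedObjectObservation(boundingBox: face.boundingBox)
        }
    }

    // Call for each subsequent frame; returns the tracked bounding box.
    func track(on frame: CVPixelBuffer) throws -> CGRect? {
        guard let observation = lastObservation else { return nil }
        let request = VNTrackObjectRequest(detectedObjectObservation: observation)
        try sequenceHandler.perform([request], on: frame)
        if let result = request.results?.first as? VNDetectedObjectObservation {
            lastObservation = result
            return result.boundingBox
        }
        return nil
    }
}
```

The same `VNTrackObjectRequest` accepts any initial rectangle, which is the ‘general template’ case from the session – the wakeboarder instead of a face.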

They also talked about applying machine learning models to content recognition:

Perhaps, for example, you want to create a wedding application where you’re able to detect that this part of the wedding is the reception, and this part of the wedding is where the bride is walking down the aisle. If you want to train your own model, and you have the data to train your own model, you can do that.
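Applying such a custom-trained model to a frame takes very little code with Core ML and Vision. A hedged Swift sketch – `WeddingSceneClassifier` is a hypothetical Xcode-generated model class, not a real Apple API:

```swift
import Vision
import CoreML

// Run a hypothetical custom Core ML model over one video frame.
func classifyScene(in pixelBuffer: CVPixelBuffer) throws {
    // WeddingSceneClassifier is a made-up model name for illustration;
    // Xcode generates a class like this from any compiled .mlmodel file.
    let model = try VNCoreMLModel(for: WeddingSceneClassifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = request.results?.first as? VNClassificationObservation else { return }
        // e.g. "Scene: walking-down-the-aisle, confidence: 0.93"
        print("Scene: \(top.identifier), confidence: \(top.confidence)")
    }
    try VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        .perform([request])
}
```

Run over every Nth frame of a clip, results like these could become the automatic keyword ranges Final Cut Pro X already uses for people detection.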

Machine learning and audio?

Interesting that they are planning to recognise aspects of audio as well – keywording is the straightforward part. What could machine learning automatically determine about captured audio? This could be the beginning of automatic audio mixing that produces rudimentary placeholders before audio professionals take over.

Recently there have been academic experiments investigating automatic picture editing (for example the Stanford/Adobe research I wrote about in May). I wonder when similar experiments will investigate sound editing and mixing?

Not just Final Cut Pro X

Although people are expecting machine learning to be applied to video and audio in Final Cut Pro X, remember that iMovie is essentially the same application with the consumer-friendly UI turned on. What works in Final Cut Pro X can also be introduced to a wider market in iMovie for macOS and iOS, and in Clips for iOS.

Logic Pro X 10.3.2 update sees Final Cut Pro X interchange improvements

Wednesday, 19 July 2017

It looks like the Logic Pro team are spending time making it work better with Final Cut Pro X. Logic Pro X was updated to version 10.3.2 yesterday. In the extensive list of new features and bug fixes, here are the points related to Final Cut:

  • When importing Final Cut Pro XML projects containing multichannel audio files, Logic Pro now reliably maintains channel assignments.
  • Large Final Cut Pro XML files now load more quickly.
  • Final Cut Pro X XML files now reliably import with the correct Sub-Role Names.
  • Logic Pro now creates a Summing Stack for each Parent Role when importing FCPX XML files.

I don’t use Final Cut to Logic workflows, so I can’t say how reliable Logic is when interpreting Final Cut XML. It seems that the Logic team are more like the Adobe Premiere team when it comes to implementing features: don’t wait until a feature is perfect, get it in, then make it better based on user feedback.

If you have bought Logic Pro X, the update is available from the Mac App Store.

VR jobs at Apple: July 2017

Monday, 17 July 2017

There are a number of positions available at Apple in July 2017 whose job descriptions mention VR.

VR hardware

IMG CoreMedia VR Pipeline Engineer

The Interactive Media Group (IMG) provides the media and graphics foundation across all of Apple’s innovative products, including iPhone, AppleTV, Apple Watch, iPad, iPod, Macs as well as professional and consumer applications from Final Cut to iTunes and iWork.

  • Strong coding skills in C with ARM on embedded platforms
  • 2+ years experience developing and debugging large software systems
  • Direct experience with implementing and/or designing VR or 360 video playback systems

The role requires the ability to help design, build and troubleshoot media services for playback and export.

Spatial Audio Software Engineer

  • Key Advantage: Experience with audio software subsystems including DAWs, Game Audio Engines including Unreal, Unity and/or audio middleware for game and AR/VR applications.
  • Experience with Spatial audio formats (Atmos, HOA etc) is desirable.
  • Experience with Metal and general working of GPU systems.
  • Experience with SIMD, writing highly optimized audio algorithms

What would be in a VR file format?

IMG – CoreMedia VR File Format Engineer

  • Proven experience with Audio/Video components of a media software system
  • Direct experience with implementing and/or designing media file formats
  • Experience with VR and 360 video

Interesting that Apple feel the need for a VR file format. I wonder what will make Apple’s VR file format stand out? It will probably be recordable and encodable on iOS and macOS. I wonder if it will also work on tvOS and watchOS. If it doesn’t work on non-Apple hardware, it could be part of an Apple plan for technological lock-in.

VR marketing

Creative Technologist

As a member of Apple’s Interactive Team, the Creative Technologist is responsible for driving innovation that enhances and enlivens the marketing of Apple’s products and services. This role requires collaboration with the design, UX, motion graphics, film/video, 3D, and development groups across Apple’s Marcom group.

  • Developing interactive prototypes in order to conceptualize and develop innovative approaches for Apple marketing initiatives.
  • Experience with Adobe Creative Suite.
  • Experience with Unity/Unreal and AR/VR development is a plus.
  • Motion graphics and 3D software (AfterEffects, Maya) skills.

It’s a pity Apple Marketing doesn’t require knowledge of Apple’s motion graphics application.

Route-to-Market Mgr, WW In-Store Channel Digital Experience

The Route-to-Market Manager, WW In-Store Channel Digital Experience, is responsible for driving and executing all digital marketing communications as related to in-store Apple-led branded product presentation and campaigns.

  • Detailed knowledge of digital experience technologies – including but not limited to, on-device engagement tactics, digital content development, app development, beaconing, AR/VR, etc.

Using VR to make Apple products?

Also, a job requirement shows Apple running ‘VR’ simulations when designing power systems – though the abbreviation may not mean what it means elsewhere on this page:

Senior Electromagnetics Analyst

  • Engineer will also need to do fundamental analyses and run SPICE simulations for VR conversion.

SPICE is a system that takes circuit designs and simulates specific results given specific inputs. 30 years ago it was a command-line based UNIX tool. One caveat: in power-electronics job listings, ‘VR’ usually stands for voltage regulator rather than virtual reality, so this requirement is more likely about simulating voltage regulator conversion than about engineers looking around inside hardware designs in a headset.

If you choose to apply for any of these jobs, good luck. Tell them Alex Gollner sent you!

Apple Pro Apps and macOS High Sierra compatibility

Friday, 14 July 2017

What versions of Final Cut Pro X are compatible with macOS High Sierra?

During Apple’s 2017 Worldwide Developer Conference, macOS High Sierra was announced. Apple has a public beta test programme, where you can sign up to try early versions of Apple operating systems before they are released.

macOS High Sierra is supposed to be a version of the Mac operating system that consolidates previous features and improves stability. This gives Apple and third-party developers the chance to catch their breath for a year. They can concentrate on reliability and steady improvement.

The question for Final Cut Pro X, Motion 5, Compressor and Logic Pro X users is whether to update their Macs to High Sierra.

Apple says that users of older versions of these applications who want to use macOS High Sierra will need to update to:

  • Final Cut Pro X 10.3.4 or later
  • Motion 5.3.2 or later
  • Compressor 4.3.2 or later
  • Logic Pro X 10.3.1 or later
  • MainStage 3.3 or later

If you still use Final Cut Pro 7 – or any other applications in the Final Cut Studio suite (including DVD Studio Pro and Soundtrack Pro), or need to use them once in a while to open older projects, don’t update all your Macs to macOS High Sierra:

Previous versions of these applications, including all apps in Final Cut Studio and Logic Studio, are not supported in macOS High Sierra.

Interesting that the ProApps team are pushing users forward this way. It will be worth watching whether new application features and bug fixes require newer versions of macOS than in previous transitions.

Final Cut Pro 7 was last updated in September 2010. It is impressive that it still runs on Macs being released in 2017.

If you have more than one Mac, perhaps it is worth keeping one on macOS Sierra for the foreseeable future. When the next major version of Final Cut appears, it is likely it will work on Sierra. If you don’t have more than one Mac, prepare a clone of your most reliable macOS Sierra startup drive for future use when you need to revisit old projects.

Investigate HEVC/H.265 encoding using a free chapter from Jan Ozer’s FFmpeg book

Wednesday, 28 June 2017

Apple have decided to standardise on HEVC/H.265 video encoding in macOS High Sierra and iOS 11. Jan Ozer has written a book about how to encode video using the free FFmpeg encoding system.

He has made the chapter on HEVC encoding from the book free to download:

Below you can download a sample chapter of my new book, Learn to Produce Video with FFmpeg in 30 Minutes or Less. It’s Chapter 12, Encoding HEVC.

If you have already installed FFmpeg (which includes the libx265 encoder), visit Jan’s site to download the chapter and do some experiments. Check your results using the free VLC player.

PS: Although he doesn’t cover HDR in this free chapter, investigate the x265 documentation on the subject.
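FFmpeg’s libx265 is one way in; Apple’s own frameworks gain HEVC export at the same time. A minimal Swift sketch using AVFoundation on macOS High Sierra or iOS 11 – the file paths are placeholders:

```swift
import AVFoundation

// Export any readable movie to HEVC using the new High Sierra preset.
let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/input.mov"))

guard let export = AVAssetExportSession(
    asset: asset,
    presetName: AVAssetExportPresetHEVCHighestQuality) else {
    fatalError("HEVC export is not available on this system")
}
export.outputURL = URL(fileURLWithPath: "/path/to/output.mp4")
export.outputFileType = .mp4
export.exportAsynchronously {
    // In a real app, wait on this completion before reading the output file.
    switch export.status {
    case .completed: print("HEVC encode finished")
    case .failed:    print("Export failed: \(String(describing: export.error))")
    default:         break
    }
}
```

Compare the result against your libx265 output in VLC – quality, file size and encode time will differ between the hardware and software encoders.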

Apple’s HEVC choice: Codec battle 2018?

Wednesday, 21 June 2017

What does Apple’s choice of HEVC (H.265) mean for developers, users, viewers and streamers? Jan Ozer writes that it will take a year or so to find out. His predictions include:

No major publishers implement HEVC/HLS support before 3-6 months after iOS 11/MacOS Sierra ship. This leaves the door open for a full codec analysis between AV1 and HEVC, including encode and decode requirements, hardware support, cost, IP risk, HDR support, software support, the whole nine yards. At least in the US and Europe, one of these codecs will be codec next.

Marketing hype is global, codecs are local. Premium content distributors around the world will choose the best codec for their markets. In second and third world markets, iPhones play a very small role, and there will be plenty of low-cost Android phones, and perhaps even tablets and computers, without HEVC hardware support. In these environments, VP9/AV1 or another codec (PERSEUS?) might be best.

Frame.io Enterprise – online team edit reviews for enterprises

Tuesday, 20 June 2017

Today Frame.io announced that their online video production team collaboration system now has features that are useful for larger organisations:

Enterprise offers everything large companies need to manage their creative process at scale. Admins can organize teams by department, brand, production or whatever best suits your company structure.

With this organization teams can work in their own workspaces much like they do with Frame.io today. Admins can control team access and visibility and manage thresholds for team size and resource allocations all from a single platform.

Interesting news for Final Cut Pro X users who need to share edits and notes with other team members online.

Frame.io is an edit review system. Editors can share edits and rushes with others online.

Non-editors review edits in a web browser and can access media used in the edit and selected unused media. They can review edits and make notes at specific times in the edit. They can also make drawings that other team members can see. Useful when planning new shots or briefing changes that need to be made using VFX. Team members can even compare edits with side-by-side version control.

Editors can then import these notes as markers with comments so they can see the exact point in the edit the note is associated with.
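As an illustration of what those imported notes become: markers in FCPXML are simple elements with a rational start time and a comment. A hedged Swift sketch that turns review notes into marker elements – the note structure and timings are made up, and this is not Frame.io’s actual exporter:

```swift
import Foundation

// A made-up review note: a time in whole seconds and a comment.
struct ReviewNote {
    let seconds: Int
    let text: String
}

// Generate FCPXML-style <marker> elements from review notes.
// FCPXML expresses times as rational values, e.g. "12/1s" means 12 seconds.
// Real code would also escape XML entities in the comment text.
func markerElements(for notes: [ReviewNote]) -> String {
    return notes.map { note in
        "<marker start=\"\(note.seconds)/1s\" duration=\"1/25s\" value=\"\(note.text)\"/>"
    }.joined(separator: "\n")
}

let notes = [ReviewNote(seconds: 12, text: "Trim the head of this shot"),
             ReviewNote(seconds: 47, text: "Replace temp VFX here")]
print(markerElements(for: notes))
```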

Media companies are the beginning

Interesting that Frame.io chose the ‘Enterprise’ suffix for this new service. The announcement may say that Vice, Turner Broadcasting Systems and BuzzFeed are already using Frame.io Enterprise, but media companies should be the tip of the video collaboration iceberg. The very features described in the press release seem more suited to non-media companies and organisations.

Although desktop video has been around for over 20 years, it hasn’t yet properly broken into the world of work as a peer of the report (word processing), the financial document (spreadsheet) and the presentation (slide deck). Microsoft and Adobe never got video production – or at least editing – into most offices. Now that everyone has a video camera in their pocket, it is time for someone to make this happen. Online or network collaboration will help.

Trojan Horse for Final Cut Pro X

At this point the Final Cut Pro X angle becomes relevant. Although frame.io integrates very well into the Adobe Premiere and Adobe After Effects user interfaces, those applications aren’t big-business friendly. Due to their history, their metaphors are aimed at editors and motion graphics designers. The very multiplicity of windows, panels and preferences is the kind of feature set that experienced editors and animators like, but it looks pretty threatening to people with other jobs. Final Cut Pro X is the application that can be used by people who need to get an edit done, or make last-minute changes based on notes entered into frame.io by the CEO on her iPhone.

The question for the Final Cut ecosystem is whether a future version of X will allow the kind of third-party integration that makes the notes review process for frame.io in Adobe Premiere so much better than it is in Final Cut Pro X.

HDR production: Five concepts, 10 principles

Tuesday, 20 June 2017

It is likely that the next major versions of common NLEs will support HDR. As editors we will be asked about the right HDR workflow. For now it is a matter of picking a standard, following some guidelines and maintaining metadata.

Jan Ozer writes:

HDR sounds complex, and at a technical level it is. Abstractly, however, it involves just five simple concepts.

First, to acquire the expanded brightness and color palette needed for HDR display, you have to capture and maintain your video in 10-bit or higher formats. Second, you’ll need to color grade your video to fully use the expanded palette. Third, you’ll have to choose and support one or more HDR technologies to reach the broadest number of viewers. Fourth, for several of these technologies, you’ll need to manage color and other metadata through the production workflow to optimize display on your endpoints. Finally, although you’ll be using the same codecs and adaptive bitrate (ABR) formats as before, you’ll have to change a few encoding settings to ensure compatibility with your selected HDR TVs and other devices.

Jan is a great commentator on streaming technologies. Read his HDR production workflow guide at StreamingMedia.com.
