Apple Video job postings 2018… Cloud, IP production, 3D/VR in 2019?
A good way of seeing what Apple plans to work on is to check out their jobs site. A July 2017 job posting for a pro workflow expert to set up a studio led to Apple giving a journalist a tour of that lab in April 2018.
Here is a round-up of recent Apple Pro Apps-related job posts. They hint at what might appear in Apple’s video applications in 2019.
Many start with this description of the Apple Video Applications group:
The Video Applications group develops leading media creation apps including Memories, Final Cut Pro X, iMovie, Motion, and Clips. The team is looking for a talented software engineer to help design and develop future features for these applications.
This is an exciting opportunity to apply your experience in video application development to innovative media creation products that reach millions of users.
Senior Engineer, Cloud Applications
Job number 113527707, posted March 2, 2018:
The ideal candidate will have in-depth experience leveraging both database and client/server technologies. As such, you should be fluent with cloud application development utilizing CloudKit or other PAAS (“Platform as a Service”) platforms.
The main NLE makers have been relatively late to cloud-enable their tools compared to other creative fields. Apple currently allow multiple people to edit the same iWork document at the same time. Sharing multiple gigabytes of video data is much harder than keeping a Pages or Numbers document in sync across the internet. Avid have recently announced Amazon-powered cloud video editing services coming this year. It looks like Apple isn’t shying away from at least exploring cloud-based editing in 2018.
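The kind of multi-person editing iWork already supports usually comes down to merging streams of timestamped edit operations from each device. Here is a minimal, purely illustrative Python sketch of last-writer-wins merging – not Apple’s or CloudKit’s actual conflict-resolution logic:

```python
# Toy last-writer-wins merge of edit operations from multiple devices.
# This is a generic illustration of collaborative-document sync,
# not Apple's (or CloudKit's) real conflict-resolution protocol.

def merge_edits(*edit_streams):
    """Merge per-device edit streams into one document state.

    Each edit is (timestamp, field, value); later timestamps win.
    """
    merged = sorted(
        (edit for stream in edit_streams for edit in stream),
        key=lambda e: e[0],
    )
    doc = {}
    for _ts, field, value in merged:
        doc[field] = value  # a later edit overwrites an earlier one
    return doc

# Two people editing the same document at (almost) the same time.
mac_edits = [(1.0, "title", "Draft"), (3.0, "body", "Hello from the Mac")]
ipad_edits = [(2.0, "title", "Final"), (2.5, "body", "Hello from the iPad")]

print(merge_edits(mac_edits, ipad_edits))
# The iPad's later "title" edit wins; the Mac's later "body" edit wins.
```

Real collaborative editing is far more sophisticated (operational transforms, CRDTs – and, for video, syncing gigabytes of media rather than small key/value edits), which is part of why cloud NLE features have taken longer to arrive.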
Cloud features aren’t just for macOS video applications: there was an October 2017 posting for a macOS/iOS Engineer – Video Applications (Cloud) – Job number 113167115.
Senior Software Engineer, Live Video
Job number 113524253, posted February 27, 2018:
The ideal candidate will have in-depth experience leveraging video editing, compositing, compression, and broadcasting technologies.
The key phrase here is ‘Live Video’ – this could be Apple making sure their tools will work in IP-enabled post workflows. Broadcasters are now connecting their hardware via Ethernet instead of the older SDI technology. Engineering this sort of thing is about keeping everything in sync – sharing streams of video across 10-Gigabit Ethernet.
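To make the sync point concrete, here is a toy Python sketch of a receiver pairing frames from two IP streams by timestamp. Real facilities use standards such as SMPTE ST 2110 with PTP-disciplined clocks; this is only an illustration of the idea, with made-up stream data:

```python
# Toy illustration of why IP production is "about keeping everything in
# sync": frames arrive over Ethernet carrying timestamps, and a receiver
# aligns streams by matching those timestamps. Not any real broadcast
# standard - just the underlying idea.

def align_streams(camera, graphics, tolerance=0.01):
    """Pair frames from two streams whose timestamps match within tolerance.

    Each stream is a time-ordered list of (timestamp_seconds, frame) tuples.
    """
    pairs = []
    j = 0
    for ts_cam, frame_cam in camera:
        # Skip graphics frames too old to match this or any later camera frame.
        while j < len(graphics) and graphics[j][0] < ts_cam - tolerance:
            j += 1
        if j < len(graphics) and abs(graphics[j][0] - ts_cam) <= tolerance:
            pairs.append((frame_cam, graphics[j][1]))
    return pairs

camera = [(0.000, "cam0"), (0.040, "cam1"), (0.080, "cam2")]    # 25 fps
graphics = [(0.001, "gfx0"), (0.041, "gfx1"), (0.120, "gfx3")]

print(align_streams(camera, graphics))
# → [('cam0', 'gfx0'), ('cam1', 'gfx1')]  - cam2 has no matching frame
```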
I wrote about BBC R&D exploring IP production in June 2017. Recently they’ve been seeing how IP production could use cloud services: “Beyond Streams and Files – Storing Frames in the Cloud”.
Sr. Machine Learning Engineer – Video Apps
Job number 113524253, posted April 12, 2018:
Apple is seeking a Machine Learning (ML) technologist to help set technology strategy for our Video Applications Engineering team. Our team develops Apple’s well-known video applications, including Final Cut Pro, iMovie, Memories part of the Photos app, and the exciting new Clips mobile app.
We utilize both ML and Computer Vision (CV) technologies in our applications, and are doing so at an increasing pace.
We are looking for an experienced ML engineer/scientist who has played a significant role in multiple ML implementations — ideally both in academia and in industry — to solve a variety of problems.
You will advise and consult on multiple projects within our organization, to identify where ML can best be employed, and in areas of media utilization not limited to images and video.
We expect that you will have significant software development and integration knowledge, in order to be both an advisor to, and significant developer on, multiple projects.
This follows on from a vacancy last July for a video applications software engineer ‘with machine learning experience.’
It looks like the Video Applications team are stepping up their investment in machine learning – expecting to use it in multiple projects, perhaps as different features across the different applications they work on.
One example would be improving tracking of objects in video. Instead of tracking individual pixels to hide or change a sign on the side of a moving vehicle, machine learning could recognise the changing positions of the vehicle and the sign, and even interpret the graphics and text on the sign itself.
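A heavily simplified Python sketch of that idea follows. The detector below is a hypothetical stand-in stub, not a real ML model, but it shows the shape of detection-based tracking: per-frame detections, which are individually noisy, get smoothed into a stable track that a replacement sign could be attached to:

```python
# Toy sketch of detection-based tracking. A real system would run an ML
# object detector on each frame; here detect_vehicle() is a stub that
# returns a noisy position for a vehicle moving steadily to the right.

def detect_vehicle(frame_index):
    """Stand-in for an ML detector: returns a noisy (x, y) box centre."""
    true_x = 10.0 + 5.0 * frame_index          # vehicle moving right
    noise = (-1.0, 1.5, -0.5, 0.0)[frame_index % 4]
    return (true_x + noise, 50.0)

def track(num_frames, alpha=0.5):
    """Exponentially smooth per-frame detections into a track."""
    positions = []
    x, y = detect_vehicle(0)                   # initialise from first frame
    for i in range(num_frames):
        dx, dy = detect_vehicle(i)
        x = alpha * dx + (1 - alpha) * x       # smooth horizontal jitter
        y = alpha * dy + (1 - alpha) * y
        positions.append((x, y))
    return positions

print(track(4))
```

Pixel-level tracking fails when the sign is briefly occluded or changes appearance; a detector that understands “vehicle” and “sign” as objects can re-acquire them each frame, which is why ML makes this kind of effect more robust.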
macOS High Sierra 10.13 introduced machine learning features in Autumn 2017. Usually Pro Apps users would need to wait at least a year for features available in the newest version of macOS – because editors didn’t want to update their systems until the OS felt reliable enough for post production. Interestingly, with the Final Cut Pro 10.4.1 update, the Video Applications team have forced the issue – the current version of Final Cut (plus Motion) won’t run on macOS Sierra 10.12. At least that means new Final Cut features can start relying on new macOS features introduced last year. I wrote about Apple WWDC sessions on media in June 2017.
Senior UI Engineer, Video Applications (3D/VR)
Job number 113524287, posted February 23, 2018:
Your responsibilities will include the development and improvement of innovative and intuitive 3D and VR user interface elements. You will collaborate closely with human interface designers, and other engineers on designing and implementing the best possible user experience. The preferred candidate should have an interest and relevant experience in developing VR user interfaces.
- Experience with OpenGL, OpenGL ES or Metal
- Experience developing AR/VR software (SteamVR / OpenVR)
- macOS and/or iOS development experience
Notice here that this is not a user interface engineer who will create UI for a 3D application. Apple plan to at least investigate developing 3D user interfaces that work in VR. Although this engineer is being sought by the video applications team, who knows where else in Apple 3D interface design for VR might end up being used.
See also VR Jobs at Apple – July 2017.