If you are an editor or filmmaker who wants to explore VR video and needs footage that is good enough to learn from and to use in low-end projects: the Insta360 ONE.
In the past, many used Kolor Autopano Video (in conjunction with Kolor Autopano Giga) – on a PC. Subscription-based Mistika VR is becoming more popular and is updated more often. Although these tools run on the Mac, most professional stitchers (stitching is now a whole new job) run them on PCs configured with as many high-end GPU cards as they can afford. I’ve stitched footage in AVP (Autopano Video) from different kinds of camera rigs on the Mac, but the rendering speed is very slow: 2–6 fps on a 2013 MacBook Pro.
While learning, if you use the Insta360 ONE (or the Ricoh Theta V), you’ll only be dealing with a very simple stitching boundary – between the front camera and the back camera. In this case you can do the stitching with the free iOS/Android/Mac/PC app (Insta360) or in the cloud (Ricoh).
I expect that, as a tradeoff between quick production and sharing on social media, most viewers will learn to tune out the thin stitching line from these low-end rigs and become immersed in the content of VR videos.
It has been discontinued. When Tim Dashwood was hired by Apple in early 2017, he made all his plugins for Final Cut Pro X, Motion 5, Adobe After Effects and Adobe Premiere (on the Mac) free. The 360VR Toolbox was used by high-end VR video producers to make 3D VR video – videos that send different images to the left and right eyes to create a feeling of depth inside the sphere of video.
Although many of the features of these plugins have been implemented in Final Cut Pro 10.4, some are not yet available. If you already have copies of the Dashwood plugins, they still work in Final Cut Pro 10.3.
The current version of Final Cut Pro X (version 10.4) includes 360° spherical video features.
Spatial audio is sound that responds to where you look inside a spherical video experience. For example, if you hear something in your left ear and then turn towards it, you will hear it in both ears equally. Many 2017 consumer spherical video cameras record spatial audio. Only some video editing applications recognise it, so you may need to use a separate audio editing application to work with spatial audio. Facebook acquired a company that made spatial audio plugins for digital audio workstation applications and now supplies the plugins as Facebook 360 Spatial Workstation for free.
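The reason spatial audio can respond to where you look is that the common delivery format, first-order ambisonics, encodes the whole sound field in four channels that can be rotated with simple maths. Here is a minimal sketch of my own (not from any of the tools mentioned here), assuming the AmbiX convention – ACN channel order W, Y, Z, X with SN3D normalisation – and a source on the horizontal plane. It rotates an encoded field so that a sound at the listener’s left ends up dead ahead, which is what happens when you turn towards it:

```python
import numpy as np

def encode_first_order(mono, azimuth_rad):
    """Encode a mono signal into first-order ambisonics.

    Assumes AmbiX: channel order W, Y, Z, X (ACN), SN3D
    normalisation, source on the horizontal plane, azimuth
    measured anticlockwise from straight ahead (left = +90 deg).
    """
    return np.array([
        mono,                         # W: omnidirectional
        mono * np.sin(azimuth_rad),   # Y: left/right
        np.zeros_like(mono),          # Z: up/down (zero: horizontal source)
        mono * np.cos(azimuth_rad),   # X: front/back
    ])

def rotate_yaw(bformat, angle_rad):
    """Rotate the whole sound field around the vertical axis.

    W (omni) and Z (vertical) are unaffected; only the horizontal
    components X and Y mix, exactly like a 2D rotation.
    """
    w, y, z, x = bformat
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([w, y * c + x * s, z, x * c - y * s])

# A sound hard left of the listener; the listener turns 90 deg to
# the left, so the field rotates -90 deg and the sound moves to the
# front (X) channel.
signal = np.ones(4)
field = encode_first_order(signal, np.pi / 2)   # hard left
turned = rotate_yaw(field, -np.pi / 2)
```

Players apply exactly this kind of rotation in real time, driven by the headset’s or phone’s orientation sensors, before decoding the field to the two ears.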
Current versions of Final Cut Pro X and Adobe Premiere add the required metadata when they upload to YouTube.
The GoPro VR Player can show a VR video stored on a (Mac/PC/Linux) computer it is running on. For phone playback on iOS or Android, FreeVRPlayer from GingerMonkey Games will play locally stored VR videos, recognising 3D and ambisonic soundtracks.
No free desktop players yet, but the iOS and Android YouTube app recently added ‘3D audio’ ambisonic audio playback. That means you can share unlisted YouTube videos with clients for playback on their phone.
Although the vast majority of VR video today is about capturing, modifying and experiencing a full sphere of video, it is likely that there will be many uses of fractions of video spheres.
For now, most people experiencing VR video initially look all around the sphere: up, down, behind. After that initial exploration, they spend the vast majority of their time looking ahead, hardly ever looking behind.
Single fisheye lens cameras can now capture as much as 235° of the video sphere. This means the whole stitching process can be avoided. You don’t have to leave the uncaptured remainder of the video sphere – 125° (or 180° for half-sphere lenses) – dark. You can use post-production tools to composite stills or motion graphics that add to the experience: “to get more information, look behind you/down/up.”
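To put those numbers in perspective, the fraction of the sphere a circular fisheye covers follows from the spherical-cap formula, Ω = 2π(1 − cos θ) for a cap of half-angle θ. A quick sketch (my own arithmetic, not a vendor figure):

```python
import math

def sphere_coverage(fov_deg):
    """Fraction of the full sphere seen by a circular fisheye with
    the given field of view: cap solid angle 2*pi*(1 - cos(half))
    divided by the full sphere's 4*pi, which simplifies to
    (1 - cos(half)) / 2."""
    half_angle = math.radians(fov_deg / 2)
    return (1 - math.cos(half_angle)) / 2

print(f"235 deg lens: {sphere_coverage(235):.0%} of the sphere")  # 73%
print(f"180 deg lens: {sphere_coverage(180):.0%} of the sphere")  # 50%
```

So a 235° lens records roughly 73% of the sphere, leaving only about a quarter to fill with graphics, while a 180° lens records exactly half.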
A way of avoiding the problems of stitching is to use single-lens rigs: either a single camera for monoscopic (2D) video or two cameras for stereoscopic (3D) video. These result in videos that allow the viewer to look around a view that is 180° wide: from looking straight left around to straight right. When you look at the 2:1 equirectangular projection of the VR video, only a central square has moving video.
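That central square falls straight out of the equirectangular mapping, which is just linear interpolation in yaw and pitch. A minimal sketch of my own (sign conventions for yaw and pitch vary between players, so treat these as assumptions):

```python
def direction_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a view direction to a pixel position in a 2:1
    equirectangular frame.

    yaw: 0 = frame centre, positive to the right, range -180..180.
    pitch: 0 = horizon, +90 = straight up, -90 = straight down.
    (Assumed conventions -- players differ.)
    """
    u = (yaw_deg + 180.0) / 360.0    # 0..1, left edge to right edge
    v = (90.0 - pitch_deg) / 180.0   # 0 at top of frame
    return u * width, v * height

# Straight ahead in a 3840x1920 frame lands at the exact centre:
print(direction_to_pixel(0, 0, 3840, 1920))    # (1920.0, 960.0)

# A 180-deg video only uses yaw -90..+90, i.e. the middle half of
# the width -- a central 1920x1920 square of the frame:
print(direction_to_pixel(-90, 0, 3840, 1920))  # (960.0, 960.0)
print(direction_to_pixel(90, 0, 3840, 1920))   # (2880.0, 960.0)
```

A real player runs this mapping in reverse for every pixel of the headset view, sampling the frame with interpolation – cheap work for a GPU shader.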
180° video is shot with a very wide-angle fisheye lens that covers everything from looking straight down to looking straight up. It results in half a sphere of video.
180×101 video is shot with cameras whose lenses can’t cover the full vertical field of view. Instead of 180° vertically, the resulting video covers 101° vertically – ‘straight down’ and ‘straight up’ aren’t captured.
The advantage of normal VR video is that it isn’t too difficult for the playback software to take equirectangular footage and show the viewer what they would see if they looked in any given direction. Many VR camera rigs can capture much more information than playback systems on phones or in web browsers can show. Some record depth information to aid stereoscopic (3D) video post-production. 6 Degrees of Freedom (6DoF) adds positional movement – small head movements left/right, up/down and backwards/forwards – to the three rotational degrees, influencing what is seen in headsets. More on 6DoF.
The more money you spend, the further you can move along the three axes within the six degrees of freedom. Intel have a technical demo that allows viewers to move at room scale. This requires large amounts of source camera resolution and computing power to create the 3D environment needed for 6DoF to work.
For now, positional VR will be based on post-production tools that recognise video footage with depth information, so that overlays, focus and lighting effects can be applied to areas of a 360° shot based on their distance from the camera.
Facebook plans to include 6DoF in their VR video player. In April 2017 they demonstrated test videos using depth maps produced in conjunction with alpha software from Mettle for Adobe After Effects and Adobe Premiere.