The process of 360-degree video creation differs slightly from that of traditional video material. As a first step, a video idea and, subsequently, a story is invented. Based on this, a plan is needed to define what and whom to shoot, and how and where. The shooting process itself is not complicated. However, there are some factors that need to be taken into account, which will be explained further below. Once filmmakers have captured the footage, which consists of images from multiple lenses, it needs to be stitched. Professional 360-degree filmmakers most commonly perform this stitching themselves, while amateurs tend to prefer automated solutions. The next step is post-processing, where audio and video edits, as well as cutting for the story creation, are performed. The last step is distribution and consumption. There are three ways to distribute 360-degree videos:
- Platforms that support 360-degree content, such as YouTube and Facebook
- Smartphone and tablet applications that include 360-degree content
- Using a VR headset in order to have a more immersive experience
In order to generate 360-degree videos, certain aspects should be taken into account. The producers need to know what to do when shooting starts, where and how to position the camera during the shoot, and how to prevent viewers from getting motion-sick once they put on the VR headset.
Unlike the traditional, rectangular video frame, a 360-degree production does not leave anything or anybody around the scenery uncaptured. Thus, the crew, the producer, the actors, and everyone who is nearby the 360-degree camera will be in the shot. As White (2016) indicates, in practice, producers crop the first few seconds after the shooting starts, during which everyone except the protagonists hides from the camera, and the last few seconds before the action stops. Therefore, the time for staff to leave the scene and later to return is not part of the final video. During post-production, these extra seconds are also useful for audio-video time-synchronization and planning transitions between scenes (White, 2016).
Positioning of the camera
In order to shoot 360-degree videos where viewers feel immersed, as if they were present at the recorded location, the perceived distance to the objects and subjects in the recorded environment plays an important role, as reasoned by Ergürel (2016). Empirical experience suggests that the separation between the camera and the scenery has to be at least 1.5 meters. Due to the wide-angle lenses, an object or person too close to the camera will look spatially distorted. On the other hand, if the objects are distributed far from the camera, it will be challenging for viewers to follow the storyline, and consequently the immersion and sense of presence will be weak (Ergürel, 2016).
Furthermore, filmmaker Lavigne (2016) remarks on the significance of the camera’s height above ground when shooting 360-degree videos. The wrong height can drastically affect the viewers’ experience: when the camera is located too low, viewers will perceive the world from a child’s perspective; if the camera is too high, viewers will feel artificially elevated. Thus, in order to avoid this distortion, the lenses of the 360-degree camera should be placed at the level of the actors’ chest or chin (Lavigne, 2016). Overall, when shooting a 360-degree video, producers should take into account the 360-degree camera’s placement in the environment.
Users often experience dizziness when immersed in 360-degree videos. It is one of the problems that producers need to manage during the shooting. Motion sickness in 360-degree videos is commonly caused by improper movement, rotation, or shaking of the camera, leading to unnatural scene motion, as Peters (2016) reasons. The mind receives contradictory signals: the visual and auditory senses report being embedded in a virtual world, while in parallel the viewer’s body remains in the stable, real world (Tambovtsev, Floksy, & Peshé, 2016). Possible solutions are to avoid excessive 360-degree camera movement and jerking during the shooting and instead maintain stable movements or keep the camera fixed (Peters, 2016).
Alternatively, in 2016, Facebook launched an algorithm for smoothing user-generated 360-degree videos during playback. This 360-degree video stabilization, offered to users on Facebook and the Oculus platform, reduces motion sickness and makes the immersive experience more enjoyable (Kopf, 2016). As stated by Kopf (2016), a research scientist at Facebook, and illustrated in Figure 2, this hybrid 3D-2D stabilization technique identifies the motion within the video by tracking distinctive points over time, then fits a motion model for the camera, and smooths the video by counteracting the computed motion in order to finally produce dizziness-free output frames.
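The core idea behind such stabilization (estimate the camera trajectory, smooth it, then apply the difference as a per-frame correction) can be illustrated with a deliberately simplified sketch. This is not Facebook's hybrid 3D-2D algorithm, which tracks feature points and fits a full camera motion model; here the camera motion is reduced to a hypothetical one-dimensional sequence of yaw angles per frame, purely to show the smooth-and-counteract principle.

```python
# Simplified illustration of trajectory smoothing for video stabilization.
# NOT the actual Facebook/Oculus algorithm: the camera motion is modeled
# as a 1-D list of yaw angles (degrees) per frame, standing in for the
# motion model recovered from tracked feature points.

def moving_average(values, window=5):
    """Smooth a trajectory with a centered moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

def stabilize(trajectory, window=5):
    """Per-frame corrections that counteract jitter:
    correction = smoothed trajectory - observed trajectory."""
    smooth = moving_average(trajectory, window)
    return [s - t for s, t in zip(smooth, trajectory)]

# Hypothetical shaky trajectory: a slow pan with alternating jitter.
shaky = [0.0, 1.5, 0.5, 2.5, 1.5, 3.5, 2.5, 4.5, 3.5, 5.5]
corrections = stabilize(shaky)
# Applying each correction re-renders the frame along the smoothed path.
stabilized = [t + c for t, c in zip(shaky, corrections)]
```

After correction, the frame-to-frame jumps in the stabilized trajectory are much smaller than in the raw footage, which is precisely what makes the result easier on the viewer's vestibular system.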
Check back soon for the next blog post, where we will discuss the state of the art in 360-degree cameras and post-production software, including some technical properties of both.