Imagine a special type of camera that is placed inside a soap bubble and is capable of capturing the whole inner surface of the bubble with just one shot! The camera itself is not visible in the image, but everything around it is. There is no concept of being ‘behind the camera’, as everything is in front of it. That is also what 360° photos and videos feel like: bubbles that magically float in the air and capture everything around them. More specifically, 360° video differs from ordinary video in that it provides a very large field-of-view – up to a complete sphere around the camera – and the product of a 360° camera is actually a spherical image.
How do you view such images? As an example of the challenge, consider drawing a map of the world: a cartographer must force the spherical surface of the Earth onto a piece of paper using some sort of projection. Unfortunately, when a 3D spherical surface is unwrapped and drawn on a 2D surface, severe distortion inevitably occurs. The same is true in the opposite case: it is impossible to wrap an orange inside a rectangular sheet of paper without making any wrinkles. But we don’t want to see distortion or wrinkles in the image. This is why 360° video is best viewed with a special kind of video player where only a small part of the whole field-of-view is visible at a time. Ordinary rectilinear projection can then be used, image distortion stays minimal, and everything looks like what we are used to seeing. Great. It is just that most of the content is now invisible, beyond the boundaries of the display device…
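To make the projection trade-off concrete, here is a small illustrative sketch. It assumes the spherical video is stored in the common equirectangular format, which maps longitude and latitude linearly to pixel x and y; the function below (a hypothetical helper, not part of any real player) shows which panorama pixel corresponds to a given viewing direction, i.e. where a player would sample when re-projecting a small rectilinear window out of the distorted full-sphere image.

```python
def equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction to a pixel in an equirectangular panorama.

    yaw_deg:   -180..180 degrees (0 = panorama centre, positive = right)
    pitch_deg: -90..90 degrees   (0 = horizon, positive = up)
    """
    # Longitude maps linearly to x, latitude linearly to y.
    x = (yaw_deg + 180.0) / 360.0 * (width - 1)
    y = (90.0 - pitch_deg) / 180.0 * (height - 1)
    return round(x), round(y)

# Looking straight ahead samples the centre of a 4K panorama:
print(equirect_pixel(0, 0, 3840, 1920))  # -> (1920, 960)
```

A full player would repeat this lookup for every pixel of its small rectilinear viewport, which is why only that narrow window looks undistorted while the stored panorama as a whole does not.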
In a 360° video player only a small part of the captured field-of-view is visible to the user at once. Each user can independently explore the scene by freely panning the visible part of the view. Each playback is a unique experience.
The benefit of capturing a scene in 360° comes from the fact that viewing a 360° video is actually an interactive experience: the user can navigate within the whole captured field-of-view by panning. It is as if the user were in charge of the camera, pointing it in any direction and looking at whatever he finds interesting. It also means that each user will get a unique experience, and each time a particular user plays the video he sees a slightly different version of it.
In practice, panning can happen, for example, by dragging the image on a touch-sensitive screen, or by moving a display device that is equipped with motion sensors. The most natural experience can be achieved with virtual reality goggles: the user can look around within the whole captured field-of-view by simply turning his head, which effectively creates an illusion of being in another place and time. Everyone who has tried it knows that it feels awesome. It feels… like the future. But it is here now. The technology is ready for the mass market.
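The drag-to-pan interaction can be sketched in a few lines. This is an illustrative sketch only, not Finwe's actual player code: the player keeps a yaw/pitch view direction and nudges it by the drag distance, wrapping yaw around the full circle and clamping pitch at the poles so the user cannot "roll over" the top of the sphere.

```python
def pan(yaw, pitch, dx_px, dy_px, degrees_per_px=0.1):
    """Update the view direction (degrees) from a touch-drag gesture.

    Dragging the image right (dx_px > 0) pans the view left, mimicking
    grabbing and pulling the scene itself.
    """
    # Wrap yaw into the -180..180 range so panning is endless.
    yaw = (yaw - dx_px * degrees_per_px + 180.0) % 360.0 - 180.0
    # Clamp pitch so the view stops at straight up / straight down.
    pitch = max(-90.0, min(90.0, pitch + dy_px * degrees_per_px))
    return yaw, pitch

print(pan(0.0, 0.0, 100, 0))  # drag right 100 px -> (-10.0, 0.0)
```

Sensor-based viewing works the same way at the output end; only the input differs, with device orientation replacing the drag deltas.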
An example of a 360° video. On a computer, use the mouse or arrow keys to navigate. In the KolorEyes app on iOS or Android, drag the image by touch or simply move the device. If you have a VR frame (such as Google Cardboard), enable VR mode, put the phone into the frame, and simply look around while wearing it. Notice how the image contains a full spherical view without any black spots or covered areas, and the camera is nowhere to be found. Video by FinCloud.tv.
The Good, the Bad and the Ugly
While free navigation within 360° video is very fascinating from the user’s point of view, it also brings both a great opportunity and a great challenge for content creators. Learning to shoot credible 360° video is a challenge of its own, yet it is merely a first step as we will soon see.
The potential in 360° video is obviously huge: now the user can be taken into the center of the action, and for the first time in history we have the technology to trick the brain into actually believing that it is in that other place, not just looking at recorded images of it. But the challenges are also new and fundamental in nature: nobody can escape a 360° camera, so where do you hide the camera operators and other necessary people on the set? How do you arrange proper lighting when all the usual gear must be hidden from view? You get the picture… and rule number one: everything and everyone must be part of the scene, unless you are ready for some serious post-processing.
In ordinary filmmaking the director chooses different views of the scene by changing the camera's location, direction, height from the ground, zoom level, and so on. 360° video requires a completely different approach: since a camera operator cannot hide anywhere, the camera must be placed on a cart that can be pulled with a string, attached to a self-propelled vehicle such as a car or a robot, carried around by one of the actors – or left still. All directions are captured in 360° filming, so camera orientation cannot be changed without fighting the users, who are also trying to control their own viewing direction. Camera height cannot be altered with large (visible) cranes, but a small drone might work instead. Zooming is basically impossible – a 360° camera has the widest possible view all the time, which also means that it needs to be taken very close to the action – but not too close, because nasty stitching errors creep in when there is too much parallax between the multiple lenses of a 360° camera rig. So we get rule number two: forget all the usual camera tricks, and imagine your camera is just an invisible person hanging around in the middle of the scene.
Yet the hardest puzzle remains: the way of storytelling needs to be completely revamped, as the director cannot control what his audience will actually see. They might all be looking in different directions. They might miss the carefully planted plot device. They might still be exploring the scene when you suddenly cut to the next shot. As a result, they might not understand half of your story. And that just cannot happen.
Virtual reality goggles trick your brain into believing that it is in another time, another place. A new era in entertainment starts right here.
All of this is being worked out by many pioneering filmmakers. For example, it is already known that for a 360° movie you need both 360° image and 360° sound. Why? For one thing, it makes the experience feel so much more real, but it is also very useful for giving some control back to the director: imagine you suddenly hear steps approaching behind you. What do you do? You turn to look. Did you just hear a woman scream on your left? Your head probably turned before you even realized it. The third rule seems to be hardwired into our brains: our ears turn our eyes. That is very useful to know.
360° filmmakers will learn to use these three rules and find many more. And maybe the users will turn their heads at the intended moment. But a professional would never rely on hope. A professional wants to know. He wants to test and see whether his movie actually works as intended. But how would he organize such an experiment? Maybe he would find a group of people, put them in a room with virtual reality goggles, and play the movie. Afterwards he would ask his audience questions to learn from their experience. Or maybe he would record a video of them wearing the goggles and turning their heads while watching his movie, to get more objective data. It would be a lot of work to analyze that video.
Fortunately, we have a better solution to offer. Much, much better.
Music videos are pioneering 360° video storytelling. Observe how the camera is placed in the center of the scene, and the action is scattered around it. Some clever tricks are included: a virtual screen replaces full-screen graphics, and a camera attached to a slowly moving truck brings new scenery and people into view. Video by Kolor.
The LiveHEAT Solution
LiveHEAT is Finwe Ltd.’s novel (patent pending) method and service for automatically tracking and visualizing the visible view area of 360°/VR content across a large user base. In other words, LiveHEAT knows what they see. And now you can know it too. Here’s how it works in three simple steps:

1. A small head tracking component, the LiveHEAT Tracker, resides in the 360° video player and continuously observes what part of the whole field-of-view is visible to the user, including full orientation data and zoom level. This can be used to produce a gazing track similar to the one illustrated above.
2. Gazing tracks from several users who watched the same video clip are transmitted to the LiveHEAT Cloud for post-processing. Optional Google Analytics integration allows including information about each user, such as age, gender, and location, for analyzing differences across user groups.
3. The LiveHEAT Cloud creates a model of the gazing data and allows exporting different kinds of visualizations, such as the heatmap illustrated above, superimposed over the original video.
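The core of steps 2 and 3 can be sketched in code. The actual LiveHEAT data format and processing are not public, so the snippet below is only an illustrative sketch under simple assumptions: each user's gazing track is a list of (timestamp, yaw, pitch) samples, and the heatmap is a coarse longitude/latitude grid in which each cell accumulates the combined dwell time of all users.

```python
from collections import defaultdict

def accumulate_heatmap(tracks, cols=36, rows=18):
    """Accumulate dwell time per grid cell from many users' gazing tracks.

    tracks: one list per user, each a time-sorted list of
            (t_seconds, yaw_deg, pitch_deg) samples.
    Returns {(col, row): seconds} - the longer the combined audience
    looked at a cell, the "hotter" it is.
    """
    heat = defaultdict(float)
    for track in tracks:
        # Pair each sample with the next one to get its dwell duration.
        for (t0, yaw, pitch), (t1, _, _) in zip(track, track[1:]):
            col = int((yaw + 180.0) / 360.0 * cols) % cols
            row = min(int((90.0 - pitch) / 180.0 * rows), rows - 1)
            heat[(col, row)] += t1 - t0
    return dict(heat)

# Two users staring roughly straight ahead for one second each:
tracks = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    [(0.0, 2.0, -1.0), (1.0, 2.0, -1.0)],
]
print(accumulate_heatmap(tracks))  # -> {(18, 9): 2.0}
```

Rendering the visualization is then a matter of mapping each cell's accumulated seconds to a temperature color and blending the grid over the equirectangular video frame.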
When using the LiveHEAT service, a director would organize a focus group simply by sending a link to his video clip to a group of people, who then watch the video using a LiveHEAT-enabled player. The player automatically tracks what the users actually see and sends this information to the LiveHEAT Cloud, which processes the data and creates a gazing model. The director logs in to his LiveHEAT account to watch a replay of the video with a heat map superimposed over it. With Google Analytics integration he could choose to include only female users older than 45, living in Texas, and interested in gardening, and replay their heat map.
The LiveHEAT service can export illustrative heat maps superimposed over the original spherical video. This means that you can watch view traces even on your own mobile device in VR mode.
You might wonder what you actually get out of the heat map visualization. It gives answers to questions such as: What exactly is the audience looking at right now? What parts of the scene does the audience mostly observe? What part of the scene does the audience not see at all? Does the audience turn their heads at the intended moment? Which actor does the audience follow when they separate? Has the audience freely explored the scene long enough already? These are just a few examples. LiveHEAT quickly becomes an irreplaceable tool in your 360° film workflow.
An example of LiveHEAT heat map visualization superimposed on an equirectangular-format video source. From the temperature-colored spots it is immediately obvious what users chose to look at, and what they did not. The longer users view a particular area, the hotter it gets – it is as simple as that.
Use Case Examples
Contact us for more information about the LiveHEAT solution:
Call Us: (+358) 40 769 4061 – Skype: CEO Juha Kela – Mail: email@example.com