|A woman steps into the woods to better understand where homeless Veterans sleep.|
At StoryUP, we have street teams that conduct unscientific "clinical trials" to gather information about where the viewer looks. Before a story is published, we ask dozens of people to don headsets and watch our videos. Then we watch them watching the video. Did they see the elephants underneath the helicopter? Did that slight camera movement make them feel sick? Did they want to take off the headset before the end of the story? VR analytics tools like Thrillbox and Retinad are already making our unscientific video trials easier by showing us exactly where our viewers look inside the sphere. Wistia also has a 360 player with heat maps that provide storytellers with valuable information. With new VR inputs coming out all the time, storytellers now have even more tools to test. Here's the latest rundown.
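For the curious, the idea behind these gaze heat maps is simple to sketch: log each viewer's head orientation (yaw and pitch) over time, then bin those samples into an equirectangular grid so each cell shows what fraction of viewing time landed on that part of the sphere. The snippet below is a minimal illustration of that concept, not how Thrillbox, Retinad or Wistia actually implement it.

```python
import numpy as np

def gaze_heatmap(yaw_deg, pitch_deg, bins=(36, 18)):
    """Bin head-orientation samples (yaw -180..180, pitch -90..90 degrees)
    into an equirectangular grid covering the full 360 sphere."""
    heat, _, _ = np.histogram2d(
        pitch_deg, yaw_deg,
        bins=bins[::-1],                   # rows = pitch, cols = yaw
        range=[[-90, 90], [-180, 180]],
    )
    return heat / heat.sum()               # fraction of viewing time per cell

# Simulated session: most samples cluster near yaw 0 (straight ahead)
rng = np.random.default_rng(0)
yaw = rng.normal(0, 30, 1000).clip(-180, 180)
pitch = rng.normal(0, 10, 1000).clip(-90, 90)
heat = gaze_heatmap(yaw, pitch)
```

Rendered as colors over the equirectangular video frame, a grid like this is exactly the kind of heat map that tells you whether anyone actually saw the elephants.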
|Heat Map of a 360 Video Shows Viewer Attention|
How do you ensure viewers see the action you want them to see? We use narration, text, lighting, hotspots and sound to softly direct the audience's attention. If someone clapped their hands behind you right now, what would you do? Most likely, you'd turn around. It's the same concept in immersive storytelling. The sound is spatial, or 3D, and can be mixed at different locations inside the sphere. When I worked in radio back in the 90s, I used to edit audio tape with a razor blade and masking tape, so talking with sound designers about placing audio in a game engine so that you can hear it coming from the ground sounds complicated. Just when storytellers have conquered 3D video, we're having to tackle how to capture and mix 3D audio. If you'd like a deep dive, StoryUP recently hosted a Hangout with a group of 3D sound designers to explain the differences between positional audio, binaural, ambisonics and head-related transfer functions. Splicing together audio reels with masking tape sounds really good about now!
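To make "placing audio inside the sphere" a little more concrete, here is a minimal sketch of first-order ambisonic encoding in the classic B-format (FuMa) convention: a mono sound is spread across four channels, where W is omnidirectional and X, Y, Z are directional components. This is a textbook illustration, not any particular engine's API, and the clap example is hypothetical.

```python
import numpy as np

def encode_b_format(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonic B-format (FuMa):
    W = omni (scaled -3 dB by convention), X = front/back,
    Y = left/right, Z = up/down."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono * (1.0 / np.sqrt(2.0))
    x = mono * np.cos(az) * np.cos(el)
    y = mono * np.sin(az) * np.cos(el)
    z = mono * np.sin(el)
    return np.stack([w, x, y, z])

# A decaying 440 Hz "clap" placed 90 degrees to the listener's left, at ear level
t = np.linspace(0, 1, 48000)
clap = np.sin(2 * np.pi * 440 * t) * np.exp(-5 * t)
b = encode_b_format(clap, azimuth_deg=90, elevation_deg=0)
```

At playback, a decoder (or a binaural renderer using head-related transfer functions) rotates this field against the listener's live head orientation, which is why the clap stays "behind you" even as you turn.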
Vestibular Headphones are another input Samsung is testing for VR storytelling. These Entrim 4D headphones are said to use nerve waves to manipulate your vestibular system and move your body in sync with the video. I kid you not. Essentially, the video moves you. Skeptical, I tried Entrim at SXSW and was shocked to find my body was actually being swayed to mimic the race track I was seeing on screen. In this video, you can see the moment Samsung engineers turn on the device. The headphones are not available to the public yet. I left the demo with lots of questions. Will Samsung have to do clinical trials first?
Another future input for VR storytellers is aroma. Before you roll your eyes, you should know several brands have already combined smell-o-vision with virtual reality. Budweiser allowed VR viewers to smell its beer. VR helmets that include different aromas, water mists and even touch feedback are coming onto the market.
I recently watched a pitch competition for a VR haptic glove that uses air pressure to simulate the virtual feel of objects. Could I envision ever using haptics in our stories? Yes! In our most recent story in Zambia, we were standing atop Victoria Falls in Africa. The cliff on which we stood vibrated from the force of the surging water. It would be incredible to feel that vibration under my feet in VR. Haptics will be another tool for storytellers to direct attention. If you put out your hand and feel a wall, you might be influenced to turn the other way.
What tools are you using to direct attention in Virtual Reality? We would love to hear about the outcomes of your own clinical video trials. http://www.story-up.com/.