Shaking Up the Ad Business


Wed, 06/29/2016 - 15:37 -- Al Caudullo

Stolichnaya’s 2014 ad featuring a shaking martini glass was one of the first to use immersion technology.

There is a huge push for haptic advertising. Simply put, it adds nuanced tactile effects to 360 virtual reality content to simulate realistic sensations: the shaking of a martini glass in a Stoli ad, the power of the engine in a Peugeot ad, the flickering of lights in the American Poltergeist trailer. These are a few of the ways that haptic advertising has been delivered using Immersion’s Touchsense Design Cloud. Starting in 2014, Stoli was the first to break the ice, so to speak, with the first tactile ad. Now 554 mobile apps and 481 tablet apps on the Opera Mediaworks network have the capability to run haptic ads. These ads have the potential to reach 14.9 million total impressions and 4.2 million unique viewers.

Immersive is the buzzword for 360 VR content. VR wants to put you into the experience. But with only visual and aural elements, one thing is missing: tactile response, the ability to feel as well as see and hear. How do you accomplish this? Haptics. You already have it on your smartphone.

The simplest form is the vibrating mode on your phone. The actuators in your smartphone that alert you to a phone call or message are now being used to augment 360 VR videos. You’ve probably heard of haptics in gaming, where hand controllers send subtle, and sometimes not so subtle, vibrations to your hands.
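For a sense of what is happening under the hood, here is a minimal sketch that drives those same actuators with Android's stock vibration API. To be clear, this is not Immersion's TouchSense SDK; the class name, the pattern and the amplitude values are placeholders I made up to show a simple rumble, and it needs the VIBRATE permission in your app's manifest.

```kotlin
// Illustrative only: stock Android vibration API, not Immersion's SDK.
// Requires <uses-permission android:name="android.permission.VIBRATE" /> in the manifest.
import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator

object HapticDemo {
    // Plays a short rumble: alternating off/on durations in milliseconds.
    fun playRumble(context: Context) {
        val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
        val pattern = longArrayOf(0, 40, 60, 40, 60, 120)          // off, on, off, on, off, on

        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            // Android 8.0+ adds per-segment amplitude (0 = off, 1-255 = strength).
            val amplitudes = intArrayOf(0, 120, 0, 180, 0, 255)
            vibrator.vibrate(VibrationEffect.createWaveform(pattern, amplitudes, -1))
        } else {
            // Older devices, like the Android 5.0 phones mentioned below, only get on/off timing.
            @Suppress("DEPRECATION")
            vibrator.vibrate(pattern, -1)                          // -1 = play once, no repeat
        }
    }
}
```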

Now one company has designed a beautifully streamlined system to enable you, the 360 VR media creator, to incorporate haptics into your videos. Rather than making you download and set up software on your studio computer, Immersion has created the Immersion Touchsense Design Cloud.

This company of 130 employees has accomplished something that many larger companies have not: they have created a sophisticated yet simple-to-use tool for a very complicated task. Current customers include LG, Huawei, Sony, Fujitsu, Kyocera, Volkswagen, Logitech and Meizu.

I spoke in depth with David Birnbaum, design director at Immersion. He explained how the system works. There are four elements in the process. Once you sign up with Immersion, they will send you an interface box. This small box is the connection between your Mac computer and your Android smartphone running Android 5.0 or later. The only other software that you need is an audio editor. The supported software right now is Pro Tools 10.3 on OS X 10.8.5, Pro Tools v11 or v12 on OS X 10.8 and up, or Adobe Audition v8.1.

Once you download the Haptic Monitor app from Google Play and connect it to the interface box, Haptic Monitor connects to the Design Cloud over the Internet. Using your audio editor, you create the haptic cues where you want them on an additional audio track; instead of normal audio content, the track contains the haptic cues that you built in the Design Cloud.
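Immersion's actual cue encoding is proprietary, so take the following only as a rough illustration of the idea of cues living on an extra audio-style track: a sketch that renders a few made-up cue timestamps as short low-frequency bursts in a standard 16-bit mono .wav file. The HapticCue class, the 200 Hz burst and every value in it are my own assumptions, not the Design Cloud format.

```kotlin
// Illustrative only: Immersion's cue format is proprietary. This just shows the
// general idea of cues on an audio-style track -- low-frequency bursts written
// at chosen timestamps into a plain 16-bit mono PCM .wav file.
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder
import kotlin.math.PI
import kotlin.math.sin

// Hypothetical cue description: start time, length and strength (0.0-1.0).
data class HapticCue(val startSec: Double, val durationSec: Double, val strength: Double)

fun writeCueTrack(cues: List<HapticCue>, lengthSec: Double, file: File, sampleRate: Int = 48000) {
    val totalSamples = (lengthSec * sampleRate).toInt()
    val samples = ShortArray(totalSamples)

    // Render each cue as a 200 Hz burst -- vibration motors respond well to low frequencies.
    for (cue in cues) {
        val start = (cue.startSec * sampleRate).toInt()
        val end = minOf(totalSamples, start + (cue.durationSec * sampleRate).toInt())
        for (i in start until end) {
            val t = (i - start).toDouble() / sampleRate
            samples[i] = (cue.strength * sin(2 * PI * 200.0 * t) * Short.MAX_VALUE).toInt().toShort()
        }
    }

    // Standard RIFF/WAVE header for 16-bit mono PCM, then the sample data.
    val dataBytes = totalSamples * 2
    val buf = ByteBuffer.allocate(44 + dataBytes).order(ByteOrder.LITTLE_ENDIAN)
    buf.put("RIFF".toByteArray()).putInt(36 + dataBytes).put("WAVE".toByteArray())
    buf.put("fmt ".toByteArray()).putInt(16)
    buf.putShort(1.toShort()).putShort(1.toShort())    // PCM format, one channel
    buf.putInt(sampleRate).putInt(sampleRate * 2)      // sample rate, byte rate (mono 16-bit)
    buf.putShort(2.toShort()).putShort(16.toShort())   // block align, bits per sample
    buf.put("data".toByteArray()).putInt(dataBytes)
    for (s in samples) buf.putShort(s)
    file.writeBytes(buf.array())
}

fun main() {
    // Two made-up cues -- say, glass shakes at 1.0 s and 2.5 s in a 10-second clip.
    val cues = listOf(HapticCue(1.0, 0.25, 0.8), HapticCue(2.5, 0.4, 1.0))
    writeCueTrack(cues, 10.0, File("haptic_track.wav"))
}
```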

Next, the audio track is output as a .wav file and combined with the video in the Design Cloud. The testing phase follows, using Haptic Monitor to play back the content on your smartphone. This enables you to make any adjustments necessary for the haptics to enhance your content. Once the haptic cues are tweaked to your satisfaction, the final product is ready to be published.
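If you just want a quick way to sanity-check cue timing on a phone before a round trip through the Design Cloud, something like the sketch below would approximate it with Android's Vibrator waveform API (Android 8.0 and up for amplitude control). Again, this is not the Haptic Monitor app, only an illustrative stand-in that uses the same hypothetical HapticCue type as the .wav sketch above, repeated here so it compiles on its own.

```kotlin
// Illustrative only: a rough stand-in for an on-device preview, NOT the Haptic Monitor app.
// Maps a cue list to Android's Vibrator waveform API (requires API 26+ and the VIBRATE permission).
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Same hypothetical cue type as in the .wav sketch above.
data class HapticCue(val startSec: Double, val durationSec: Double, val strength: Double)

fun previewCues(context: Context, cues: List<HapticCue>) {
    val timings = mutableListOf<Long>()
    val amplitudes = mutableListOf<Int>()
    var cursorMs = 0L

    // Build alternating silence/cue segments along the timeline.
    for (cue in cues.sortedBy { it.startSec }) {
        val startMs = (cue.startSec * 1000).toLong()
        val durMs = (cue.durationSec * 1000).toLong()
        timings += (startMs - cursorMs).coerceAtLeast(0)            // gap before the cue
        amplitudes += 0                                             // motor off during the gap
        timings += durMs                                            // the cue itself
        amplitudes += (cue.strength * 255).toInt().coerceIn(1, 255) // motor strength
        cursorMs = startMs + durMs
    }

    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    vibrator.vibrate(VibrationEffect.createWaveform(timings.toLongArray(), amplitudes.toIntArray(), -1))
}
```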

Go to www.immersion.com/touch-sense-design-cloud-interest if you are interested in learning more about using this technology for your content.