Reports and Whitepapers
- Type: Whitepaper
- Topics: Live Events
- Date: April 2015
By Dan Daley, Special to AVIXA
By now, the trope of a six-year-old child trying to manipulate images on the family TV, underscoring the expectation that every screen is (or should be) interactive, has become commonplace. And in the time it’s taken for that cute portrayal of technological progress to permeate our culture, some of those six-year-olds have grown up.
“You’d be surprised the extent to which people have come to expect interactivity on screens at events,” observes Jon Accarrino, Vice President of Marketing and Communications at RedTouch Media, a Salt Lake City-based developer that creates custom content for interactive media.
Accarrino cites the 2013 Daytime Emmy Awards as an example of how interactivity has enabled event production to expand in real time. During the cocktail reception preceding the awards show, guests could use touchscreens scattered around the room to summon and watch extended clips of nominated programs. He says the amount of content required to support an interactive environment presents both a logistical and a creative challenge. Accarrino estimates the company handles approximately a million digital files a month.
But the potential for interactive media at live events is about more than just entertainment. Accarrino says that in addition to user engagement, interactivity opens the door to analytics, which has become a prime motivator for clients, such as sponsors and exhibitors, to implement interactive solutions at live events.
At a 2014 event in Park City, Utah, Skullcandy, the Millennial-oriented lifestyle brand, wanted to encourage attendees to try out its new Crusher earbuds. Using a touchscreen, visitors could choose demo songs to listen to. Afterward, they could keep a file of the song. What Skullcandy got in return was data: age, musical tastes and other preferences. That information, says Accarrino, helped the company choose future spokespeople.
“Analytics is the bleeding edge,” he says. “It drives every major business decision a company makes. Once a user acquires a piece of content from our platform, we can feed our partners data that can help them make strategic business decisions.”
Analytics may be the bleeding edge, but how audiences interact with content at live events is also changing rapidly. At the InfoComm show, Planar Systems employed a 27-inch touchscreen to allow booth visitors to move digital objects around a much bigger 16- by 9-foot videowall. The developer of the software for that installation, Seattle-based IdentityMine, believes interactive interfaces for live events have already evolved beyond the stationary touchscreen.
“The interactive experience is moving off the screen,” says Matthias Shapiro, one of the company’s senior user interface designers. “Often now, the screen is just a display of information about what’s taking place when users are using other interfaces, such as gestures or motion sensing.”
Shapiro says his company develops interactive solutions for many different interfaces, including Microsoft’s Kinect and Surface Hub, Oculus VR’s Rift virtual reality (VR) headset, the Leap Motion Controller, and other AR and VR interfaces.
Even as interactive interfaces have diverged and multiplied, the experiences they generate have grown more varied. At one end of the spectrum are multi-user experiences, such as Planar’s multitouch walls. At the other are highly individualized encounters that use personal mobile devices or virtual and augmented reality (AR) tools. “Interactive experiences are becoming more situational, tailored to the needs of the event,” Shapiro says.
In some cases, interactive media at live events combines both collective and individual experiences. Metaio, an augmented-reality developer, has deployed its software at live events for clients such as Volvo, Coca-Cola and Macy’s. Visitors to a store or a tradeshow booth can look at a car or other product through the lens of an iPad or other tablet computer. Metaio’s software will apply an overlay of additional information and imagery.
For Volvo, Metaio made it so people could “see” under the hood of a car, using an iPad to examine the engine and suspension systems as if they were looking at an x-ray (pictured). The software applies the augmented reality; the content developer determines what the user should see; the user determines where to look and for how long — all of which makes for a highly personal, interactive experience, says Jack Dashwood, Senior Marketing Manager for Metaio in San Francisco.
“It’s something that sparks people’s curiosity and draws them in,” he says. “Perhaps more so than gathering around a display or videowall.”
Experts say VR and AR will help drive more interactivity at live events, but so will other interface technologies. Even as touchscreens grow more ubiquitous, “touch-less” solutions are on the horizon.
Vincent John Vincent, CEO of GestureTek, which makes capacitive (touch) and virtual (touch-less) interfaces, expects both to gain mindshare among event producers. But he expects that in the long run, once 3D depth cameras improve in resolution and tracking capabilities, the future will be touch-less and virtual.
“That ability to track gestures at a distance will allow people who are interacting at an event to do so more naturally,” he says. “When that happens, the complexity of the content will follow, and the technology will really take off.”
Start With Content
Not surprisingly, realizing the possibilities of interactivity at live events starts with content. And Dashwood says the cost of an interactive solution depends mainly on the quality and nature of that content. At the high end, Dashwood says he’s witnessed events pay close to six figures for “Hollywood-level content” and real-time add-ons, such as face-tracking technology.
That said, hardware costs for interactivity at events can run considerably less than for typical AV installations of multiple touchscreens. In fact, if a client wants to offer its content in the form of a downloadable app, then hardware costs are virtually zero. Shapiro notes that as more personal devices are used to support interactivity at events, production costs should also drop. However, he says, that doesn’t reduce the pressure on software developers, who are constantly trying to find ways to make each event unique.
“The range of technologies we can use for interactivity has exploded,” he says. “If I gave you a list of all the technologies we worked with in just a month, you’d be amazed.”
That hints at a common concern at the leading edge of digital technology: as a small cadre of technology developers labors to outdo one another, potential users of their solutions may be unaware of what those developers produced even a few months earlier. It’s not surprising, then, that many clients of the most ambitious interactive developers are themselves Silicon Valley denizens, often with deep pockets.
“People might see interactive media in a video or an article, but they’re always surprised when they find out how attainable it is,” says Dashwood. “More work needs to be done on awareness about what can actually be done on reasonable budgets.”