Can Data Make Your Museum Exhibit More Immersive?
If you really want to get immersive, you have to get meta. It’s one thing to interact within an experience, but it’s quite another to be able to look, in real time, into an analytics mirror and see exactly what that engagement looked like.
That’s the type of next-level immersive learning happening at the New York Hall of Science (NYSCI) Connected Worlds exhibit, which I learned all about thanks to the Panasonic- and AVIXA-sponsored event at the museum on November 29.
On the first layer, Connected Worlds visitors interact with one of the planet’s largest full-scale, seamless immersive CGI environments (think CAVE, but for very large groups) to learn about sustainability. But then on the meta layer, they can also view a data visualization of their every move, along with the full day’s worth of moves that came before, to see the big picture of how their actions affected the virtual biomes that live and die for their educational benefit.
The data doesn’t lie. Once you see the extremely adorable “creatures” that were developed and released into the Connected Worlds space by the interactive specialists at design I/O, you’ll do everything you can to keep them alive. Or at least, I hope you will. But then again, NYSCI is interested in analyzing precisely that — in addition to studying learning behavior, they want to look at both the pro-social and anti-social behaviors of the humans who collaborate in the space.
This might be the meta-meta layer of this project. Not only is Connected Worlds meant to educate visitors on the complex, multidisciplinary science of sustainability, but the exhibit is also a learning laboratory about how humans can gain understanding of systems (rather than just facts) through interactive, immersive experience.
Just beyond the edge of the brightly lit immersive space at the center of the 100-foot-tall Great Hall where Connected Worlds resides is a large, podium-mounted LCD display. There the museum presents its “Global View” visualization of data culled from the exhibit’s trackers and sensors. For both this display and the real-time visualizations on the rugged interactive tablets visitors use, data is logged and stored two ways: graphically, as a series of JPEGs that are consolidated into the “Global View” QuickTime movie of cumulative events, and as cloud data that can be analyzed later for educational research. (Watch for several new NYSCI-published papers next year.)
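That dual pipeline, snapshot frames destined for a cumulative movie alongside structured records for later research, could be sketched roughly like this. To be clear, every file name, path, and field in this sketch is hypothetical; it illustrates the general pattern, not NYSCI’s actual implementation:

```python
import json
import time
from pathlib import Path

# Hypothetical sketch of a dual logging pipeline: image frames for a
# cumulative "Global View"-style movie, plus structured event records
# for later analysis. All names and schemas here are invented.

LOG_DIR = Path("connected_worlds_logs")

def log_event(event_log: Path, biome: str, action: str, visitors: int) -> None:
    """Append one structured event record (JSON Lines) for research analysis."""
    record = {
        "timestamp": time.time(),
        "biome": biome,        # e.g. "wetlands" (illustrative value)
        "action": action,      # e.g. "water_diverted" (illustrative value)
        "visitors": visitors,  # how many people took part in the action
    }
    with event_log.open("a") as f:
        f.write(json.dumps(record) + "\n")

def save_frame(frame_dir: Path, manifest: Path, frame_bytes: bytes, index: int) -> Path:
    """Store one visualization snapshot and record its order in a manifest,
    so the day's frames can later be consolidated into a single movie."""
    frame_dir.mkdir(parents=True, exist_ok=True)
    path = frame_dir / f"frame_{index:06d}.jpg"
    path.write_bytes(frame_bytes)
    with manifest.open("a") as f:
        f.write(path.name + "\n")
    return path

# Example usage: log a couple of events and one frame from a simulated session.
LOG_DIR.mkdir(exist_ok=True)
event_log = LOG_DIR / "events.jsonl"
log_event(event_log, "wetlands", "water_diverted", visitors=3)
log_event(event_log, "desert", "creature_fed", visitors=1)
save_frame(LOG_DIR / "frames", LOG_DIR / "manifest.txt", b"\xff\xd8fake-jpeg", 0)
```

Keeping the two streams separate is the point: the frame manifest only needs to preserve ordering for playback, while the JSON Lines log carries the per-event detail researchers would query later.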
Connected Worlds has been open for two years, and looking ahead, NYSCI Chief Scientist Stephen Uzzo, Ph.D. said the museum is evaluating technologies that will allow for more individualized tracking of behavior in the space. “We want to look at things like how many times do individuals stop and talk to other people, where do they go in the exhibit, how fast do they get there — a lot of the details you just don’t get with the overarching more optical system,” he said. Once funding is secured for that technology, “we’ll layer that onto what we already have, and build a whole set of analytics on top of that.”
Even more meta. With the number of research teams looking at this information, including game theorists and other behavioral scientists, there’s a whole lot of learning happening around this exhibit. So, when you get immersive, make sure you start collecting data, because people who engage with technology are going to want to see how they relate to a space, and others will want a peek at their behavior, too.