April 30, 2018 by Kirsten Nelson

Even Jerry Seinfeld loves Steely Dan. That was one of my main takeaways from “Comedians in Cars Getting Coffee,” which is now being consumed rapturously by Netflix viewers everywhere. Asked what music he likes, Seinfeld said, “Steely Dan is the one that just stopped me in my tracks.” And he further revealed that he sees the band play every chance he gets.

Seinfeld then loses a bit of music cred when he confesses to not being familiar with The Clash. But it doesn’t matter. Because he’s just proved the most important axiom of the audiovisual experience design universe: Music is the thing that changes every conversation.

The enjoyment and performance of music might be the strongest universal across the multitude of technologies and practices represented at the InfoComm show. That’s why I’m so excited about one of the music-related sessions we have slated for Center Stage this year at InfoComm in Las Vegas. The topic, “Combining Live Music with Both Physical and Virtual New Media,” is scheduled for 3 p.m. PDT on Thursday, June 7.

Mark your calendar for this one, because that’s when the extremely engaging polymaths Yo-Yo Lin and Steve Dabal, Art Director and Creative Director, respectively, with THE FAMILY, will be on stage talking about how the experience of live music can become even more immersive with the thoughtful use of new technologies.

As a reference point for how musicians are seeking to create truly high-tech, high-touch experiences even on a relatively small scale (it’s not just the arena rockers who want the best lighting and sound), Lin and Dabal will be talking about Manifest 1.0.

Produced last year with the goal of creating a new platform for live music to be experienced as art, this live music gallery of technology-enabled art was created and performed collaboratively with the musician Sunni Colón and a dynamic team of multi-talented lighting, sound and music experts (including Joanna Fang, Foley Artist with Alchemy Post Sound, who will be presenting on “Creating Empathy with Immersive Audio in VR and Video Games” immediately following Lin and Dabal’s session on Center Stage).

After learning more about Manifest 1.0, I’d describe it as a seriously great immersive music experience that combines new media with analog atmosphere in a way that gives me much hope for the future of concerts.

It seems we are reaching a new apex of musician-audience alignment in technology and event design. Not only do crowds of fans crave a spectacle worthy of leaving their homes, removing their ever-present headphones, and (gasp!) looking up from their phones, but musicians also want to create engaging new expressions of their work.

The good news is, live music is already a visceral experience. But what Lin and Dabal will discuss is how to truly engage the senses and create a memorable event through new ways to personalize and engage with lighting, acoustics, interactivity, immersive audio, bespoke architectural elements and even customized scents. Then, to complete the package, new media will also allow fans to take the experience home with them and revisit the immersion through VR. Incidentally, to all you new media heads out there, The Family is still looking for funding for a Unity programmer to translate its light field renderings of Manifest 1.0 into a VR experience for future viewers.

So make your plans for Center Stage now. Because it’s the place where we have conversations rather like those between Seinfeld and his fellow comedians. It’s all about what makes us show up for work every day — that creative and intellectual urge to make things work together and look good and sound good, creating experiences that stop people in their tracks.

Session Details:

Combining Live Music with Both Physical and Virtual New Media

InfoComm Center Stage (Booth N1646), 3:00 p.m. PDT on Thursday, June 7

A look at how a conceptual live music performance uses immersive lighting, sound, and interactivity to create an emotionally intelligent space. How do new media art and technology allow for deeper artistic collaboration with live-performing musicians? How do current audiences want to experience and engage with music? How does virtual reality aid in the process of designing and distributing the event and act as an archival tool for artists?

About Kirsten Nelson

Kirsten Nelson has written about audio, video and experience design in all its permutations for almost 20 years. As a writer and content developer for AVIXA, Kirsten connects stories, people and technology through a variety of media. She also directs program content for InfoComm Center Stage. Kirsten was the editor of SCN magazine for 17 years, and has written for numerous industry publications and the InfoComm Daily. In addition to technology, she also writes about motorcycles, which provides a professional outlet for her obsession with MotoGP racing.