How Real-Time Media is Changing the Way We Create and Manage AV Content

What we used to call “interactive” is now something more like inter-reactive. Where previously gestures or touch would trigger specific, pre-programmed reactions from content, now we’re in the “real-time” era. What does that mean? It means the content is “generative”: it grows and morphs in response to various sensor inputs, which can include motion, heat-mapping and other analytics, sound, video and light.

So basically the new content in AV feeds itself.
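For the programmers in the room, a toy sketch can make that concrete. The loop below is not TouchDesigner code or anyone’s production system, just generic Python with hypothetical sensor names, but it shows the basic shape of a real-time piece: the visual parameters drift on their own, and live sensor input continuously nudges them, so no two moments look the same.

```python
import random
import time

def read_sensors():
    # Hypothetical sensor read; in a real installation these values would come
    # from a camera, depth sensor, microphone or analytics feed, not random numbers.
    return {
        "motion": random.random(),  # 0..1 amount of movement in the space
        "sound": random.random(),   # 0..1 ambient sound level
    }

def update_visual(state, sensors, dt):
    # The content "feeds itself": parameters evolve every frame on their own,
    # and sensor input bends that evolution rather than triggering canned clips.
    state["hue"] = (state["hue"] + dt * (0.02 + 0.3 * sensors["motion"])) % 1.0
    state["scale"] += dt * (sensors["sound"] - 0.5) * 0.5
    state["scale"] = max(0.2, min(2.0, state["scale"]))  # keep the scale in bounds
    return state

if __name__ == "__main__":
    state = {"hue": 0.0, "scale": 1.0}
    for _ in range(5):  # a real piece would run this loop forever at frame rate
        state = update_visual(state, read_sensors(), dt=1 / 60)
        print(f"hue={state['hue']:.3f} scale={state['scale']:.3f}")
        time.sleep(1 / 60)
```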

I had the tremendous pleasure of interacting with some of the finest practitioners of this new inter-reactive medium at the TouchDesigner Summit in Montreal last month. TouchDesigner is a “node-based visual programming language for real-time interactive multimedia content,” developed by the Toronto-based company Derivative. And for quite some time now, it has been fueling some of the coolest and wow-est new generative and interactive video pieces you’ve seen in lobbies, museums and performing arts venues.

"Hana Fubuki" by Akiko Yamashita, Artechouse

Now that these “experiential designs” are appearing far more often in the AV world, this software and many other real-time platforms from other providers are showing up in our projects. So what does that mean? How is real-time media changing the way we create and manage audiovisual content?

Here are a few ideas I picked up from talks at TDSummit:

• “You’re no longer making an end-result, you’re making a system” – With a talk that demonstrated the wide range of real-time outputs (visual content, audio-generated visualizations, data animations, social media posts and even print artifacts), Roy Gerritsen and Tim Gerritsen (“we’re brothers, but not twins!”) from yfx lab revealed how the roles of designer and programmer are now merging in creative new ways. And ultimately, the experience they’re creating for projects like the giant circular ceiling-hung video display at Chase Bank is one of unending variety. It’s never quite done. It’s a system of inputs and imagery that always changes.

Chase Bank, yfx lab/Xite Labs

• Spontaneous engagement on many levels – Watching crowds move through her room-filling piece, Hana Fubuki, at Artechouse last spring, the visual artist Akiko Yamashita witnessed the ease with which different audiences took to controlling the piece’s elements. Small groups formed to create clusters of movement among the animations, and individuals found their own quiet moments in triggering the motion. And it works well for visitors of all ability levels. One visitor who uses a wheelchair told Yamashita that the piece provided a lovely sense of extra agency not always offered in “interactives.”

• Experience design is transforming the way we collaborate – One of TouchDesigner’s earliest adopters, David Bianciardi of AV&C, delivered a hypothesis for defining this emerging field in his talk, “What Do We Want to Be (When We Grow Up)?” Speaking to the TouchDesigner Summit’s gathering of AV practitioners, media artists, designers, creative technologists, motion graphics makers, musicians and too many other software-enabled categories to list here, he described how companies like his are evolving to provide real-time works for commercial and creative purposes. “Experience design is a hybrid discipline that requires an interdisciplinary team of curious and rigorous people to experiment, design, code, engineer, deploy and support this kind of work at scale,” he notes. Well-versed in multi-faceted endeavors that blend real-time works of art into commercial spaces like 515 North State Street in Chicago, AV&C is a living example of the collaborative evolution required to ensure that tech-infused architecture resonates with audiences.

515 North State Street, AV&C and ESI Design
(Photo Credit: Caleb Tkach)

If the output is evolving in real time, AV designers should likewise expect growth and change in the expanding list of creative collaborators from the agency, brand and architecture worlds. Working together, they can capture the opportunities that are out there. As Bianciardi reports, “We’re finding it’s easy these days to meet clients excited about creating this kind of digital landmark. They’re seeing a lot more of them, and they want them, and I can report the C-suite FOMO is strong, so you can expect the phone to ring. When it does, you can position this kind of generative real-time approach as a strategic solution that solves real problems associated with this kind of canvas.”

Kirsten Nelson

Kirsten Nelson has written about audio, video and experience design in all its permutations for more than 20 years. As a writer and content developer for AVIXA, Kirsten connects stories, people and technology through a variety of media. She also directs program content for the TIDE Conference and Technology Innovation Stage at InfoComm. For three years, she created conversations around emerging media and experiential design at InfoComm's Center Stage. Prior to that, Kirsten was the editor of SCN magazine for 17 years.