Local Projects, Timescape, 9/11 Memorial and Museum
We thought it was going to be gestures. Then suddenly it was voice. Now we’re looking at a potential amalgamation of the two, where your room’s AI-enabled user interface might sense your arms flailing in exasperation and react by dimming the lights and cueing the Sade to calm you down.
That’s a slightly ridiculous approximation of where control system design is headed, but it takes into account what we’ve been saying all along — it’s time to put the human at the center of the UI experience. As new technologies make it easier to analyze human actions and emotions, we might soon be having the best human-machine conversation yet.
James Patten, Director, Patten Studio
Sundar Raman, Director of Creative Engineering, Local Projects
That’s what I’m gathering from recent conversations with James Patten, Director of Patten Studio, and Sundar Raman, Director of Creative Engineering with Local Projects.
In their respective roles, each of these humans is making it easier for us to talk to machines. Raman finds ways to seamlessly connect people with the context they need to grasp subject matter in museum environments, such as in the “Timescape” visualization of ongoing media coverage at the 9/11 Memorial and Museum. And Patten cuts the awkwardness out of UI with the thoughtfully natural means of interaction inherent to his responsive Lift lighting feature and collaborative tabletop robot Thumbles.
I got together with Raman and Patten last week to discuss “The Psychology of User Interface Design” as a topic for Center Stage at InfoComm 2018. And we talked about all sorts of successful and not-so-successful interface ideas, including elevators, various forms of light sources, VR, AR and RFID “smart dust.”
As we looked at how these new tools might improve the UI experience, one idea became apparent: interaction is going to be a lot more transparent soon, and that’s already changing AV systems design.
“We’ve all been talking about going away from screens,” Raman observed. “So now maybe we can actually start utilizing systems built into environments.”
With voice, motion, heat, RFID and a multitude of other sensors serving as inputs, soon the built environment itself is going to be our UI. That said, if we’re eliminating screens and developing interaction that relies on natural forms of human expression, we need to start codifying the way we’ll be interacting with building-embedded systems. Otherwise we’ll face a language barrier every time we walk into a new facility.
Lift, Patten Studio
Photo Credit: Ty Cole
“Clearly, there’s a communication problem,” Patten noted. “As our environment becomes more complex, how does that communication happen? Is it an explicit interaction, or is it implicit, where the environment’s figuring out what you want without you telling it what you want? There are a lot of interesting design choices to be made there.”
And then a preview of the session that Raman and Patten will be developing for Center Stage began:
Raman: “There’s a commonality to expression that has to exist. I’m afraid that when we get these sensorial spaces, if we don’t identify an open paradigm, it’s going to get co-opted very quickly and we’re going to end up in this world where we can’t communicate with each other.”
Patten: “Right, and the design decisions related to all of that are so much more important, because it’s in the world. It’s not like a phone, and you can just put it back in your pocket.”
Thumbles, Patten Studio
Raman: “It’s how you walk through a building.”
It’s a point that brings the conversation right back to the AV and experience design industry, and to how our increasingly integral and embedded systems designs will influence the architectural environment.
This was just the beginning of a conversation that will continue as we develop the “Psychology of User Interface Design” and other sessions for Center Stage this year. Watch this space for more updates, and check out the Center Stage website for more details as they’re announced.