In Focus: Interactive Theater - Part 3, Production

Article by Katharine Horowitz

Audience immersion and interactivity have always been the mainstays of haunted houses and historical reenactment sites, but the genre seems to be experiencing a recent surge of popularity in the United States with such productions as Punchdrunk’s Sleep No More and Third Rail Projects’ Then She Fell, both in New York City. The last few years have also seen an increase in impressive interactive theatre productions in the Twin Cities, but is there a lasting future for it here? And what challenges do companies encounter when building the kind of designs and stories needed to create a successful interactive experience?

This is the third in a series of articles examining the work involved to produce interactive theatre, and how we might foster its continued growth in the Twin Cities. We continue our discussion from last month with the designers of two recent interactive theatre shows. We will hear from audiences in the next article.

There’s an element of complexity in the technical process of interactive theatre that differs from traditional presentational theatre. The level of unpredictability is heightened. The excitement behind creating such constantly mutating intricacy is undeniable, culminating in an inescapable pride when it all knits together.

CTC Sound Director Sten Severson was the system designer for 20,000 Leagues Under the Sea (20K). Freelance composer and sound designer Michael Croswell designed the sound and music for Live Action Set’s Crime & Punishment (C&P). Working with very different budgets and in very different spaces, the two designers discussed their approaches and aesthetics.


Neither Severson nor Croswell had ever designed specifically for interactive theatre before. When approaching the system design for 20K, Severson drew from his experiences designing for modern museum installations and electronic opera. Croswell found inspiration from his 25-plus years of experience playing live music and becoming familiar with the changing nature of live accompaniment.

“When I composed the music for Crime and Punishment there were aspects of the work that required me to be meticulous like a composer, and there were aspects of the work that required me to have the ability to improvise like a live musician,” Croswell said. “It was a 50-minute show that asked me to essentially create a soundtrack for each room or speaker location... And each location was a different nugget of the story that I had to support or highlight.”


The bulk of the challenge for Croswell and Severson was keeping up with the audiences and the multiple performance areas, while still serving the immersive needs of the production.

“In a broad sense the goal of sound [for C&P] was to act like an emotional fog that blanketed the entire set,” Croswell said, “and to use sound as a timeline to allow for the cast to synch up throughout the playing space.” This necessitated Croswell integrating sound prior to tech.

As system designer, Severson had a similar task on a larger scale. The pressing question for 20K was: how do you control an event that uses multimedia, moves through the building, and overlaps with itself?

“We identified the places we needed to be able to control,” Severson said. “And then we said ‘okay, well what technology can we use to allow someone at that location to control lights, sound, and video?’, realizing that we needed to be able to control ahead, too. It was impossible with the route to have someone run ahead and set things up.”


Severson wound up creating several control boxes, one for each performance area, networking each of them to a master computer that ran QLab and in turn communicated with the various audio and video playback computers, and the lighting console.

However, the boxes were just that: devices that sent out a signal via Telnet or web interface but had no way of knowing what happened and where. So Severson had to come up with a way for the boxes, and the people operating them, to communicate.

“I knew I had to find a way to tie these things together because natively QLab can’t talk to those little boards in those boxes,” he said. “So I had to find some way to connect the two and then provide sort of an overarching look at what’s going on.”

The answer was drawn from his experience with electronic opera in the early 2000s, during which he was introduced to Max, a visual programming language for music and multimedia (originally developed at IRCAM, a French institute for music and acoustics research, and later distributed by the California software company Cycling ’74).

“The basics of how [Max] works is really simple,” Severson said. “Even though it can really do incredibly powerful things, it doesn’t take a computing degree to figure it out. I know a little bit about programming but this sort of worked with my brain chemistry better. It’s meant to be used for audio and video. You can get very deep into very interesting programming stuff without having to be a programmer. Literally drawing lines between different objects and different things.”

There were approximately seven control boxes on the wall, with the same number of modules in the Max software, each relating to one of the control boxes, and each with its own IP address.

Mac computers running QLab were distributed to each of the main performance areas. All communication originated from the master QLab computer as OSC commands, which triggered the appropriate performance-area QLab computer and sent instructions to each control box; the box, in turn, reported what had just happened back to the master computer. A complete loop of information.
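To make the command loop concrete, here is a minimal sketch of firing a cue over OSC the way a master machine might address a performance-area QLab computer. QLab 3 listens for OSC on UDP port 53000 and accepts addresses like /cue/1/start; the IP address and cue number below are placeholders, not details from the 20K system.

```python
import socket
import struct

def osc_pad(b):
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address, *args):
    """Build a minimal OSC packet (int, float, and string args only)."""
    packet = osc_pad(address.encode())
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # int32, big-endian
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # float32, big-endian
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return packet + osc_pad(tags.encode()) + payload

# Fire cue 1 on a (hypothetical) performance-area machine.
msg = osc_message("/cue/1/start")
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("192.168.1.21", 53000))  # placeholder IP; 53000 is QLab's OSC port
```

In practice a tool like Max, or any OSC library, would build these packets; the point is only that a cue is a tiny UDP datagram, cheap enough to fan out to every room at once.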

“The thing that made this take was being able to use Max and knit it all together,” Severson said. “It allowed me to use pieces of gear and software that weren’t intended to work together. QLab 3 OSC implementation made things a lot easier, so we didn’t have to try to control remote machines over MIDI, which is kind of the next best thing. Because Max also has an OSC component so it can receive and send OSC commands, I was able to translate everything back and forth from Telnet into OSC.”
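The translation layer Severson describes lived inside Max, but the idea can be sketched in a few lines: map each incoming OSC address to the plain-text command a networked relay board expects, then push that line over a raw TCP (Telnet-style) socket. The addresses, relay commands, and host details below are illustrative assumptions, not the actual vocabulary of the 20K control boxes.

```python
import socket

# Hypothetical mapping from OSC addresses to box commands --
# illustrative only, not Severson's actual cue list.
OSC_TO_TELNET = {
    "/box/3/go":    "RELAY 1 ON",
    "/box/3/clear": "RELAY 1 OFF",
}

def translate(osc_address):
    """Map an incoming OSC address to the box's Telnet command, if any."""
    return OSC_TO_TELNET.get(osc_address)

def send_to_box(host, port, command):
    """Push one command line to a control box over a raw TCP socket,
    the way a Telnet client would."""
    with socket.create_connection((host, port), timeout=2) as s:
        s.sendall(command.encode() + b"\r\n")

# A cue arriving as OSC becomes a line of Telnet-style text:
cmd = translate("/box/3/go")
# send_to_box("192.168.1.33", 23, cmd)  # host and port are placeholders
```

The reverse direction works the same way: a line read back from the box's socket gets wrapped into an OSC message and sent to the master computer, closing the loop Severson describes.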

Working in a more confined (and dusty) space, Croswell determined early on that the sound would be fixed multi-channel audio playback run solely from QLab, with each audio timeline divided into three-to-five-minute blocks. However, with little to no staff to assist with set-up, the process became a balancing act between time spent physically arranging gear and time spent working on the design.

“The very first steps of this project were to figure out how I could pull together enough gear to run a multi-channel audio system throughout the entire space,” Croswell said. “I had two 8-channel snakes that I ran to different halves of The Soap Factory's basement. (This means I had two 8-channel nodes that I could branch out from and run lines to each specific speaker location.) I had eight powered, full-range speakers that I used as main speakers to provide sound for the main soundtrack that synched up the cast. I then had six extra audio lines that I used in small radios and environmental effects.”

Croswell also made liberal use of QLab’s iPad app, noting how much more difficult setting levels in such a fluid environment would have been without it. When not setting up gear, he worked nights editing and tracking the music and sound to get it into QLab.


Because the system design, and the nature of an interactive show, affected all the multimedia design aspects, Severson was sometimes concerned that any hiccups the system experienced would hold the other designers back.

20K lighting designer Craig Gottschalk never felt creatively stifled by the process, but did agree that it was a unique situation.

“It was interesting to be reliant upon a singular system,” Gottschalk said. “In a normal show setting, in which you’re in a space that’s designed for theatre, you can exist alone. If the sound board crashes, lights can still fire and the stage can still change scenery. In this case, if the system failed the whole show ground to a halt.”

He and Severson agreed that their struggles were not unique: each department faced the same challenge of making the show work in the space, within the existing budget, and everybody wished they had more tech time. Looking back, Severson puts the integration of the control boxes and the Max software into perspective.

“The truth of the matter is you don’t really learn how [things] work until you’re in the heat of battle, until you use it in anger,” he said. “You can do as much prep work as you want - it’s not going to actually fail properly until you try to use it for real.”

Croswell agreed, noting that the process required him to learn and adapt in real time with limited resources. He also acknowledged how much the presence of an audience can change an interactive performance.

“Things really change once the audience arrives and starts to wander around inside the production,” he said. “When the fourth wall has been totally smashed and the very first audiences become part of the action the entire production team can expect to radically alter their work again (even after weeks of tech rehearsals).”


For all the complexity - and complications - both men said they felt their designs were a success.

“The show ran very smooth, technically,” Croswell said. “The system was reliable and it sounded good. I felt that I created a lot of interesting sonic environments that worked well for each of the character areas and locations.”

Severson takes pride in his system design, but both he and Gottschalk wonder whether interactive theatre is financially sustainable, given the equipment, the space, and the limited audience capacity that can, in turn, limit ticket sales. Croswell, meanwhile, ruminates on how live audience interaction leaves a deeper mark than traditional theatre contained by the fourth wall.

Are audiences responding positively? Are they attending in numbers that make the effort it takes to produce worth it? In our fourth and final installment next month, we’ll examine audience reaction to interactive theatre in the Twin Cities.