Hippotizer tools up for battleROYAL
The show, part entertainment and part art installation, was performed for 200 guests facing the open sides of a 7m x 11m stage. Upstage, projection surfaces covered the rear two sides of the performance area, forming the setting for a story told through the interplay of human movement, tracking projections and light.
“battleROYAL continually create new ways to integrate technology into their performances,” says UK-based video engineer Steve Jackson, a technical partner for battleROYAL’s productions for the past five years, and a Hippotizer specialist for the past seven.
For this show, Jackson used Hippotizer Karst+ media servers with HD SDI outputs and a capture card to play media through multiple projectors, including Barco HDX and Epson 15K units, covering the walls and floor of the stage, as well as a performer’s dress, which became its own projection surface. With limited time on-site, the precise technical layout was planned in advance using CAD and Cinema 4D. The system design was then finalised in Cinema 4D and SHAPE, Hippotizer’s 3D toolkit, where projection angles, throw distances and lens requirements were calculated and defined.
Jackson found SHAPE to be a particularly useful feature on this project. “Hippotizer enabled me to use both SHAPE for the tailor-made set, and a normal Viewport for the dress element, which was a slight unknown and didn’t require full 3D modelling. This combination is very powerful, as to be confined purely to 3D would have been very challenging,” he says.
Jackson used MultiController, the Hippotizer V4 tool which manages various control protocols including MIDI and OSC. “MultiController also enabled the integration of a Wacom tablet to control parameters of the Notch block,” he says.
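For readers unfamiliar with OSC, the protocol MultiController accepts is a simple binary format sent over UDP: a null-padded address string, a type-tag string, then the argument data. The sketch below builds a single-float OSC message from scratch using only the Python standard library; the address path, target IP and port are hypothetical, purely for illustration, and are not taken from this production.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def build_osc_message(address: str, value: float) -> bytes:
    # Message = padded address + padded type tags (",f") + big-endian float32
    return (osc_pad(address.encode())
            + osc_pad(b",f")
            + struct.pack(">f", value))

# Hypothetical parameter address and endpoint, for illustration only
msg = build_osc_message("/layer1/mixer/level", 0.75)
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("192.168.1.10", 9000))
```

A lighting desk or tablet sending packets shaped like this is all a server-side OSC listener needs; the heavy lifting is mapping each address to a parameter, which is what a tool like MultiController manages.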
Notch became the universal tool to realise this artistic vision and to create dynamic video content. The live setup enabled the designer to test new content during the intense production period within seconds of creating it, giving the team precious time to focus on the creation of the whole show.
“Cameras were also used to track performers’ movements and work as an effector for the Notch elements,” says Jackson. “The fact the show could be programmed in timeline and triggered via the lighting desk made show control very straightforward. The show was also locked to timecode to sync with audio files.”
(Jim Evans)