The immersive livestream presented Bryson Tiller in a series of six different worlds
USA - Xite Labs utilised the disguise Extended Reality (xR) workflow, powered by a gx 2c media server and rx real-time rendering platform, to bring a myriad of virtual worlds to life in Unreal Engine for rapper Bryson Tiller’s Trapsoul World Series livestreamed concert.
Shot at Xite Labs’ xR stage in LA, the concert accentuated the artist’s musical vibe for fans around the world watching at home. The immersive livestream presented Tiller in a series of six different worlds linked by a narrative flowing through the songs. Xite Labs was responsible for the stunning xR content for 14 different songs performed in four virtual ‘worlds’ with distinct appearances and themes.
For this immersive experience, Xite turned to its in-house workflow, featuring a disguise gx 2c media server as the primary controller of the xR environment and a dedicated disguise rx machine running the Unreal Engine scenes via disguise RenderStream. Front-plate elements were created in Notch to further link Tiller into each of the unique worlds.
The worlds included a virtual lounge that fell away to reveal a world of galaxies, nebulas and spaceships; a time theme with a mountain desert landscape and a flight through a moonlit sky; guerrilla warfare transforming into a neon jungle; and stark hallways with bold, flat lighting, colour-changing walls and silhouettes. Throughout, Tiller appeared to perform on a moving platform, which served as the anchor point transporting him from one otherworldly environment to another.
“I have not seen anything done in xR that was quite as diverse and complex as this. And the fact that it was shot on our smaller volume in such a short timeframe still blows my mind,” said Greg Russell, creative director at Xite Labs.
An unforeseen benefit that the disguise xR workflow brought to the production team was the ability to film Tiller wearing a shiny, black reflective jacket for the interlude performances.
“This would have been incredibly challenging on a green screen and would have required a great deal of time technically for lighting and ensuring the separation of light fields. But xR made this possible, and it looked amazing,” added Vello Virkhaus, creative director at Xite Labs.
The ambitious production clocked up over 400 hours of development from nine Unreal Engine artists across ten weeks, plus over 200 hours from two Notch artists, to bring the virtual environments to life.
“From the top down, the livestream concert went exceedingly well and got great feedback,” said Russell. “Bryson understood the technology and intuitively knew where to be on the stage and how to be in and out of the lighting.”
Joining Xite Labs on the project was creative studio 92 Group, which provided the overall show direction and art direction, while production was done by HPLA.
“On the production level, it was the first time director Mike Carson, DP Russ Fraser and producer Amish Dani had done xR,” Russell said. “With film and music video people coming into our world, it was a very challenging job from a production standpoint because we were basically teaching them the xR workflow on the job. But once they started putting the pieces together they realised its value.”
