
A Q&A with Kite & Lightning

Posted on November 9, 2018

Have you ever wanted to watch a bunch of two-foot-tall immortal ‘Bebys’ engage in a real-time Battle Royale? Look no further than ‘Bebylon’, an upcoming multiplayer VR game with comedic arena combat at its core, where you can humiliate and fight your foes and laugh your ass off.

We spoke with Cory Strassburger, co-founder of cinematic VR company Kite and Lightning Studios, to discuss their upcoming project – Bebylon: Battle Royale – as well as how their team is capitalizing on their recent SIGGRAPH success. We also spoke about how their DIY performance capture setup, involving facial capture from the latest iPhone X, opens up the world of motion capture to the general public.

Kite & Lightning recently attended this year’s SIGGRAPH convention, the world’s largest conference on computer graphics, attended by tens of thousands of CG professionals. During Real-Time Live, a competition where the best in real-time graphics and interactivity compete and share their innovations, Kite & Lightning won the prize for Best Showcase with their DIY body and facial performance capture solution.

How has the reaction been after your win at SIGGRAPH's Real-Time Live? What has the reaction from the industry been like?

The general reactions have been really positive and exciting. I think the setup inspired a lot of people who’ve been itching to bring characters to life but didn’t really have the means to. 
The industry reaction was a bit of a surprise! I didn’t expect to hear from so many VFX folks, from the big facilities, who were really impressed and who were contemplating, or in the process of implementing a similar setup. 

This setup was used for promotional purposes and events but is any of the footage you've captured using this setup used in-game? Whether it is in gameplay or cinematics?

Actually, the entire purpose of the endeavor was generating in-game content: cinematics, trailers, and eventually short story content. Early on, as the Bebylon world began to grow conceptually, it became clear how important the characters and their stories were for inspiring the players. It's really a game world designed to unleash the player’s inner wild child and pour all that expressivity into the arena battles! So it was essential for us to find a way to bring these characters and stories to life with enough expression to set the tone and create a wild setting for players to dive into. The main problem was that we had neither the time nor the budget to tackle this slice of the spectrum, at least not for a long while.

But now that we have this super fast and easy-to-use DIY setup, we can start to generate more and more content. I’ve been using it lately for in-game content, and we have just a few more capture sessions planned; those will wrap our gameplay and cinematic needs for the initial release. But we’re most excited for the subsequent content that will expand the initial game and customizations and grow the overall Bebylon world with all kinds of cinematic and immersive content! We’ve created quite a vast world for Bebylon over the last few years, and we’re just at the tip of the iceberg, so we have a long road ahead with tons of content to create; all thanks to our new motion capture setup!

Can you tell us more about how you plan to adapt to the latest updates in the technology you've used? For example, when the latest Apple iPhone/other consumer tech is released?

I was pretty keen to see what enhancements the next-gen iPhone X brought about, but from what I can tell there are no hardware updates that directly impact the face capture tech. Of course, faster processors can pave the way for further advancements in the software. I’m generally excited to see how things evolve, though I’m not holding my breath that Apple will push things in the directions our needs would call for. I do have faith that developers will start creating apps specifically for facial capture, something that could potentially go beyond what the Apple tech is designed to do.

So how did you put together this particular setup? How accessible is this tech setup to the general public?

It was a process that started in 2017, when the iPhone X was initially released. I knew Apple had previously purchased a company called Faceshift, and that they had been creating amazing facial capture software for the desktop, so I got curious. What aspects of their tech made it into the iPhone X? It was a surprise to find that they had managed to miniaturize all of their core tech into the iPhone X and, even more so, that Apple gave developers access to it via ARKit.

From there I started to see what was possible! Once I got the face capture to a satisfying place, it made sense to combine it with the body capture. Luckily we already had our Xsens suit, which is remarkably easy to set up and use, so it was the perfect complement to how easy the iPhone X setup is. The results were incredibly impressive and really fueled my desire to see how good I could make this setup. It also proved how important the body is when talking about expressivity. We know body language plays a major role in human communication, and adding the body capture automatically elevated the face capture by a large magnitude.

Unfortunately, I only had a set number of days a month to focus on this, as the game development itself was taking up the majority of my time. But nonetheless I kept chipping away at levelling up the quality of the facial capture, and each new iteration kept getting better.

More recently we had an exciting boost when Chris Adamson from Xsens hit me up about entering SIGGRAPH’s Real-Time Live. Given my lack of time, I didn’t think I would be able to get this system working in real time within the Unreal Engine by the deadline. In the end, we got accepted, which allowed me some extra game dev days for the presentation. I was pleasantly shocked at how good everything looked, and we rendered everything in real time! I can’t overstate just how much of a blessing it was to have such an easy-to-use system. I honestly have so many fun Bebylonian stories to tell that I’m chomping at the bit knowing this system is at my fingertips.

I can’t give away any details just yet, but there are some very exciting developments brewing! Particularly in regards to pushing this DIY mocap setup and generating some exciting new cinematic stuff!

This DIY setup also opens up many new doors for us in regards to live streaming and virtual YouTubers! Along with the game and cinematics, we’re also planning to use this setup in conjunction with AR and virtual cinematography to do some live streaming of the ‘Beby’ characters; so lots of interesting use cases on the horizon.

Can they do it themselves? How?

The real beauty of this whole setup is that it is very accessible and fairly easy to use for anyone semi-technical. The face capture tech is free and accessible to anyone using Unity or Unreal Engine, and both companies have sample projects available to get you started. The only required ingredients are an iPhone X and an Apple developer account (any of the new iPhone X flavours will work). For the body, there are various options to choose from, depending on your needs. The Xsens suit, in my opinion, sits at the top of the quality pyramid. I would also highly stress just how important the body is to this whole system, so my advice would be to get the best body capture solution you can. Also, if you plan to stream directly into Unreal Engine, then you’ll need to add IKinema LiveAction to your ingredients list.
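To make the pipeline concrete: a common DIY pattern (not Kite & Lightning's confirmed implementation) is to have a small iOS app read ARKit's per-frame facial blendshape coefficients and stream them over the network to the machine running the game engine, where they drive the character rig. The sketch below is a minimal, hypothetical illustration of that transport layer in Python: one frame of blendshape weights packed as JSON and sent over UDP loopback. The blendshape key names (e.g. `jawOpen`) follow ARKit's naming; everything else here is an assumption for illustration.

```python
import json
import socket

# Hypothetical sketch of the phone-to-engine transport layer.
# ARKit exposes ~52 facial blendshape coefficients per frame, each a
# float in [0.0, 1.0]; a DIY setup can serialize each frame and fire
# it at the engine host as a UDP datagram.

def encode_frame(blendshapes: dict, timestamp: float) -> bytes:
    """Pack one frame of blendshape weights as a JSON datagram."""
    return json.dumps({"t": timestamp, "shapes": blendshapes}).encode("utf-8")

def decode_frame(payload: bytes) -> dict:
    """Unpack a datagram back into a frame dict on the engine side."""
    return json.loads(payload.decode("utf-8"))

if __name__ == "__main__":
    # Loopback demo: send one frame and read it back.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))          # OS picks a free port
    port = receiver.getsockname()[1]

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.9, "mouthSmileRight": 0.1}
    sender.sendto(encode_frame(frame, 0.033), ("127.0.0.1", port))

    data, _ = receiver.recvfrom(4096)
    decoded = decode_frame(data)
    print(decoded["shapes"]["jawOpen"])      # prints 0.42
```

In a real setup the encoding side would run on the phone (in Swift, inside the ARKit session callback) and the datagrams would be consumed by a plugin or script inside Unity or Unreal, blended with the Xsens body stream on the engine side.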

What is next for Kite & Lightning? What is next for Bebylon Battle Royale?

As I was mentioning, the Bebylon world goes really deep, and we’re planning to spend the next couple of years rolling out as much of the big vision as we can. It's a challenging gamble because each step needs to be successful in order for the next step to take root. The Battle Royale game itself is actually more of a gateway drug into the world, and if it's successful enough, we can start to create all the meta layers of entertainment that further fuel the game and its expansion. It’s really this bigger vision that gets us up in the morning; it’s what keeps our excitement building. At heart, we love making cinematic story-driven content, and if all goes as planned, we’ll be having a crazy amount of fun over the next couple of years evolving this wild world of Bebylon!


About Kite and Lightning
Kite & Lightning is a cinematic VR company known for creating immersive computer-generated worlds that blend interactive gaming, social and story. As one of the leaders of the new virtual reality movement, Kite & Lightning became known for their original emotionally charged transformative experiences such as the award-winning VR Mini Opera Senza Peso. Other noteworthy commercial projects include a 3D VR experience for NBC’s “The Voice” featuring Pharrell Williams, Adam Levine, Blake Shelton, & Gwen Stefani; and Lionsgate’s first VR narrative, starring Kate Winslet, Mekhi Phifer, and Miles Teller.