On Friday, December 12, my fellow Timefireites and I set out on an epic road trip from Phoenix to LA, on a quest to attend the fourth VRLA Conference on Saturday, Dec. 13th, stand in line with our VIP tickets to get in early, and make general pests of ourselves. It was only the fourth VRLA conference ever held, but it would be our first, and our first opportunity to meet other people who were just as nerdy about virtual reality as we are, so we were determined to be there bright-eyed and bushy-tailed. After living and working in Phoenix for as long as we have, we were beginning to feel lonely. We certainly weren’t expecting to see so many of our kind when we walked through those doors.
Almost every demo utilized a mobile headset of one type or another, all of them driven by cell phones. I had my pants scared off by a horror game I watched while holding a plastic cellphone headset to my eyes like a pair of binoculars. I had the chance to see myself in VR as a robot sitting in a cafe; when I looked over to my right, I could see my reflection in the window. This was one of the few demos I was given on the DK2, and because of that, I could lean into objects to get a closer look at them, and I could even stand up from my chair to an extent, both in VR and in reality. This gave me something interesting to talk about with one of the developers from Otherworld Interactive, since I was working on a robot of my own.
Before this conference, I had never had the opportunity to talk with anyone from Epic Games face to face. Twitch livestreams are great and all, but it’s nothing like being able to thank one of them in person for all of the hard work that they do. And if Nick Whiting in any way represents the team as a whole, they are all very friendly and approachable people. Amir from Sixense, on the other hand, was a man on a mission, determined to get it done in the quickest, most direct, no-bullshit way possible. I got poor Steve in trouble when we first arrived by distracting him with my questions about the STEM system while he was trying to get it set up. Fortunately, this resulted in my being one of the first people at the conference to try it out! I did have a chance to talk to Amir afterward with my fellow Timefireites about all the possibilities of mobile VR. Hands in mobile VR are awesome! With this kind of technology, you could definitely program one of those hands to become a can of spray paint and paint on all kinds of surfaces.
I would say the difference between the STEM and the Control VR setup, other than the Control VR being tethered, is that the Control VR allows for finger-movement detection without the sensors losing track of your hands’ position, as they often do with the Leap Motion/DK2 pairing. During my time at their booth, though, the Control VR people were having issues of their own with other components of their setup, resulting in several Blue Screens of Death in very short spans of time. Still, most of us attendees standing by were willing to wait for them to reboot their system. I myself sat through two blue screens before the system settled down long enough for me to give it a thorough try.
I even lent them my micro-USB cable during one of their many reboots so that they could reconnect to Steam. What surprised me the most about all of this was how well the Control VR crew handled their situation. They took it like champs! Instead of doing what I imagine anyone else in their position would do, i.e. freak out, panic, get angry or annoyed, they laughed it off. “Whatever,” “no big deal,” “these things happen,” “nothing to worry about,” “we’ll just get it up and running again later, or try again next time.” And they did. The last time I saw one of them, he told me they had it back up and running just fine; the computer had simply been overheating due to a faulty cooling fan. When the pressure was on, they stayed in control by not losing their cool.
I was especially excited to meet Philip Rosedale, the creator of Second Life and now the founder of High Fidelity, whose new virtual interconnectivity system allows users’ avatars to interact with one another through voice and facial inputs, with user-created servers acting as virtual social environments. I never imagined being able to control my avatar’s facial expressions, but that was exactly what I had a chance to do with their software. One of their associates sat down with me and walked me through “training” the software to recognize my face and its movements. The entire experience of seeing my expressions on the face of an avatar reminded me of what I had imagined the world of “Ready Player One” would be like. It was easy to see the potential of a system built on that kind of software. I even mentioned this to Philip, and he informed me that he had shown off his system to the author, Ernest Cline.

The upper left-hand corner shows the avatar that represents this user, with facial detection active. The avatar he is talking to appears to have been distracted by something he saw off to his left (he’s wearing a DK2 with head tracking in effect).
My next adventure in VR was at the Xsens booth, where I was strapped into a full-body motion-capture system that wrapped neatly and easily over my clothes. Combined with a DK2, the result was an adventure in full-body tethered VR using real-time retargeting. No markers, no cameras: just squeeze into this tight (but stretchy) T-shirt, put all these straps on the right body parts, and you’re golden. The experience was pretty nifty too! I could put both my hands on my ankle, do some warm-up routines, a little dance, a little cheer…hey, half the fun is testing the limits of the software. One of the only limits I could see with this one was not being able to see where I was going in the real world to keep from bumping into things.

If you guessed that the idiot kicking her leg up in the background was me, you guessed correctly. What’s important, though, is what’s happening in the foreground. Using a one-size-fits-all (even me) setup to capture motion in real time, Xsens’ software could make some animators really happy. Myself included!
Last, but not least, I wanted to make note of Visionary VR. It had only been a few days prior that I had been exposed, for the second time, to 360-degree video, and that was on the Gear VR. I had imagined that it would be edited the same way video has always been edited: one shot at a time, taken from each camera in the setup, only magically sewn back together afterward. I had imagined a similar process for animated content, only with virtual cameras. The problem with editing this way is that you would be doing all of that work behind a flat, 2D screen. Visionary VR solves that by allowing users to edit video and animation in VR while wearing the DK2, giving them more control over what the audience sees to the left, right, and back, as well as above and below. This really opened my eyes to the possibility of 360 video changing the rules and best practices of good editing. The rule of thirds and so forth don’t really apply anymore now that the audience has the freedom to look all around them. They aren’t watching through a tiny rectangle anymore. More thought is going to have to go into planning shots, since there is no longer a frame to hide the surrounding space.
I overheard someone saying they were going to need “a goddamn stadium” to hold the next VR conference. I think they’re right. The explosion in attendance that happened after 1:00 was only the first indication of this.
Also a noteworthy observation: this technology is growing rather rapidly. While waiting in line to see a VR documentary using one of the older setups, I had a chance to talk to one of the men helping to demonstrate it. I can’t remember the name of the company exactly, but I know they work with the same woman who created the virtual simulations of a bomb going off in Syria and of homeless people waiting in line for food, panicking as one of them collapses into an epileptic seizure. They used a heavy headset with red lights arranged almost like antlers, tracked by cameras to keep record of where the user was. It came with a shoulder bag that had to be worn along with the headgear, which I ended up carrying alongside my own shoulder bag, which was quite awkward.
As I was talking with this guy about the headset, he mentioned that in the next few months they would no longer be able to demonstrate this hardware because of how quickly it is becoming obsolete, especially with new mobile headsets popping up everywhere. He said it would simply be unacceptable. That caught me off guard a little. My first impression on seeing it, before catching a glimpse of the accompanying shoulder bag, was that this was yet another solution for what everyone was trying to do, just a little more showy. It didn’t take me long to realize that it was in fact the other way around, and that the hardware they were using had been around for quite some time. Still, the experience was worth waiting in line for as long as I did. None of the other demos seemed to be as story-driven as this one. You had to physically climb into a “car” and drive it around the race track a couple of times to trigger an event, which in this case was the car in front of you crashing and burning. It felt a lot like what many of us dreamed virtual reality would be five or more years ago: a designated space you can walk around in, where each step you take brings you closer to something in the virtual world.
If I end up going to another conference, I’ll need to make a list of questions to ask people. This time around, I talked with a lot of artists like myself, and a few coders, asking mainly about their part in the projects they worked on and sharing my own stories about my work, particularly with the Otherworld Interactive people who worked on Café AME. Their robot reminded me so much of Benjamin that I felt compelled to share it with them. I asked a lot of questions about how everyone’s gadgets worked, especially when I got to try them out. But even then, there were still times when I found myself so awestruck that I was struck dumb. Or I would ask the wrong questions, the ones that people could not answer for proprietary reasons. It’s a very different experience attending a conference as a representative of a company rather than as an individual, as I have at previous conferences.
Written by Ariana Alexander of Timefire VR