Shenanigans at VRLA

Jan 04, 2015
 
Luis is in his element while filming this motion capture actor.

On Friday, December 12, my fellow Timefireites and I set out on an epic road trip from Phoenix to LA, on a quest to find the fourth VRLA Conference on Saturday, Dec. 13th, and stand in line with our VIP tickets to get in early and make general pests of ourselves. It was only the fourth VRLA conference to be held, but it would be our first, and our first opportunity to meet other people who were just as nerdy about virtual reality as we are, so we were determined to be there bright-eyed and bushy-tailed. After living and working in Phoenix for as long as we have, we were beginning to feel lonely. We certainly weren’t expecting to see so many of our kind when we walked through those doors.

Almost every demo utilized a mobile headset of one type or another, all of them driven by cell phones. I had my pants scared off by a horror game I watched while holding a plastic cellphone headset to my eyes like a pair of binoculars. I had the chance to see myself in VR as a robot, sitting in a cafe. When I looked over to my right, I could see my reflection in the window. This was one of the few demos I was given on the DK2, and because of that, I could lean into objects to get a closer look at them, and I could even stand up from my chair to an extent, both in VR and in reality. This gave me something interesting to talk about with one of the developers from Otherworld Interactive, since I was working on a robot of my own.

Before this conference, I had never had the opportunity to talk with anyone from Epic Games face to face. Twitch livestreams are great and all, but they’re nothing like being able to thank one of them in person for all of the hard work that they do. And if Nick Whiting in any way represents the team as a whole, they are all very friendly and approachable people. Amir from Sixense, on the other hand, was a man on a mission and was determined to get it done in the quickest, most direct, no-bullshit way possible. I got poor Steve in trouble when we first got there by distracting him with my questions about the STEM system while he was trying to get it set up. Fortunately, this resulted in my being one of the first people at the conference to try it out! I did have a chance to talk to Amir afterward, along with my fellow Timefireites, about all the possibilities of mobile VR. Hands-in mobile VR is awesome! With this kind of technology, you could definitely program one of those hands to become a can of spray paint and paint on all kinds of surfaces.

I would say the difference between the STEM and the Control VR setup, other than the Control VR being tethered, is that the Control VR allows for finger-movement detection without the sensors losing track of your hands’ position, as they often do with the Leap Motion/DK2 pairing. That said, during my time at their booth, the Control VR people were having issues of their own with other components of their setup, resulting in several Blue Screens of Death in a very short span of time. However, most of us attendees standing by were willing to wait for them to reboot their system. I myself sat through two blue screens before the system settled down long enough for me to give it a thorough try.


Feeling like a cyborg lunchlady with all this gear on. But seeing my hands in VR was even cooler!


I even lent them my micro-USB cable during one of their many reboots so that they could reconnect to Steam. What surprised me the most about all of this was how well the Control VR crew handled their situation. They took it like champs! Instead of doing what I would imagine anyone else in their position would do, i.e. freak out, panic, get angry or annoyed, what did they do? They laughed it off. “Whatever,” “no big deal,” “these things happen,” “nothing to worry about,” “we’ll just get it up and running again later, or try again next time.” And they did. The last time I saw one of them, he told me they had it back up and running just fine; the computer had just been overheating due to a faulty cooling fan. When the pressure was on, they stayed in control by not losing their cool.

I was especially excited to meet Philip Rosedale, the founder of Second Life and now the founder of High Fidelity, whose new virtual interconnectivity system lets users’ avatars interact with one another through voice and facial inputs and lets anyone create servers as virtual social environments. I never imagined being able to control my avatar’s facial expressions, but that was exactly what I had a chance to do with their software. One of their associates sat down with me and walked me through “training” the software to recognize my face and its movements. The entire experience of seeing my expressions on the face of an avatar reminded me of what I had imagined the world of “Ready Player One” would be like. It was easy to see the potential of a system built on that kind of software. I even mentioned this to Philip, and he informed me that he had shown off his system to the author, Ernest Cline.

The upper left hand corner shows the avatar that represents this user with facial detection active. The avatar he is talking to appears to have been distracted by something he saw off to his left (he’s wearing a DK2 with head tracking in effect).

My next adventure in VR was at the Xsens booth, where I was strapped into a full-body motion-capture system that wrapped neatly and easily over my clothes. The result of this plus a DK2 was an adventure in full-body tethered VR using realtime retargeting. No markers, no cameras; just squeeze into a tight (but stretchy) T-shirt, put all the straps on the right body parts, and you’re golden. The experience was pretty nifty too! I could put both my hands on my ankle, do some warm-up routines, a little dance, a little cheer…Hey, half the fun is testing the limits of the software. The only real limit I could see with this one was not being able to see where I was going in the real world, so I had to be careful not to bump into things.

If you guessed that idiot kicking her leg up in the background was me, you guessed correctly. What’s important though, is what’s happening in the foreground. Using a one-size-fits-all (even me) setup to capture motion in real time, Xsens’ software could make some animators really happy. Myself included!


Last, but not least, I wanted to make note of Visionary VR. It had only been a few days prior that I had been exposed for the second time to 360-degree video, and that was on the Gear VR. I had imagined that it would be edited the way video has always been edited: one shot at a time, taken from each camera in the setup, only magically sewn back together afterward. I had imagined a similar process with animated content, only the cameras would be virtual. The problem with editing this way is that you would be doing all of that work behind a flat, 2D screen. Visionary VR solves that by allowing users to edit video and animation in VR, while wearing the DK2, giving them more control over what the audience sees to the left, right, and back, as well as above and below. This really opened my eyes to the possibility of 360 video changing the rules and best practices of good editing. The rule of thirds and so forth don’t really apply anymore now that the audience has the freedom to look all around them. They aren’t watching through a tiny rectangle anymore. More thought is going to have to go into planning shots, since there is no longer a frame to hide the surrounding space.

I overheard someone saying they were going to need “a goddamn stadium” to hold the next VR conference. I think they’re right. The explosion in attendance that happened after 1:00 was only the first indication of this.

Another noteworthy observation: this technology is growing rather rapidly. While waiting in line to see a VR documentary using one of the older setups, I had a chance to talk to one of the men helping to demonstrate it. I can’t remember the name of the company exactly, but I know they work with the same woman who created the virtual simulations of a bomb going off in Syria and of homeless people waiting in line for food, panicking as one of them collapses into a seizure. They used a heavy headset with red lights arranged almost like antlers, tracked by cameras to keep a record of where the user was. It came with a shoulder bag that had to be worn along with the headgear, which I ended up wearing alongside my own shoulder bag; it was quite awkward to carry around.


As I was talking with this guy about the headset, he mentioned that in the next few months they would no longer be able to demonstrate this hardware setup because of how quickly it is becoming obsolete, especially with these new mobile headsets popping up everywhere. He said it would be unacceptable. It caught me off guard a little bit. My first impression on seeing this, before catching a glimpse of the accompanying shoulder bag, was that it was yet another solution to what everyone was trying to do, just a little more showy. It didn’t take me long to realize that it was in fact the other way around and that the hardware they were using had been around for quite some time. Even so, I thought the experience was worth waiting in line for as long as I did. None of the other demos seemed to be as story-driven as this one. You had to physically climb into a “car” and drive it around the race track a couple of times to trigger an event, which in this case was the car in front of you crashing and burning. It felt a lot like what many of us dreamed virtual reality would be five or more years ago: a designated space you can walk around in, where each step you take brings you closer to something in the virtual world.

If I end up going to another conference, I’ll need to make up a list of questions to ask people. This time around, I talked with a lot of artists like myself, and a few coders, asking mainly about their part in the projects they worked on and sharing stories about my own work, particularly with the Otherworld Interactive people who worked on Café AME. Their robot reminded me so much of Benjamin that I felt compelled to share it with them. I asked a lot of questions about how everyone’s gadgets worked, especially when I got to try them out. But even then, there were still times when I found myself so awestruck that I was struck dumb. Or I would ask the wrong questions, the ones people could not answer for proprietary reasons. It’s a very different experience attending a conference as a representative of a company rather than as an individual, as I had at previous conferences.

 

Written by Ariana Alexander of Timefire VR

Substance Designer

Oct 21, 2014
 

Substance Designer from Allegorithmic is being used by Timefire VR in Scottsdale, Arizona

I’ve been spending more than a few days in Allegorithmic’s Substance Designer working over a massive amount of textures I’ve downloaded from GameTextures. The process is tedious, especially the Metal PBR workflow, but after more than a few days grinding through a directory with more than 250 base materials, I’m kind of addicted. At night I go home and work on some simple stuff, assembling my horde of images from CGTextures. These are easy as it’s just a single bitmap I have to wrangle. The glue that is making all of this possible is Allegorithmic’s new tool found in Bitmap2Material 3.0. It’s a “Node” that works as a kind of plugin for Designer. Feed the node the images you want converted for use in a PBR workflow and the node does the heavy lifting. But of course nothing is ever totally easy and so I wrestle with Masks, Emissive textures, Blend nodes, Levels, and the adjustment of Normals in order to get the Substances just right for our shared library. Between Allegorithmic’s Database of procedural textures (about 850 of them), the 1000 CGTextures, and the 1000 GameTextures files I’m working with, I could be at this for quite a while. In the end, I think this will prove to be an invaluable asset to our team, though I might have a momentum that will demand I just keep going exploring the possibilities this amazing software offers us.

Sverchok to Unreal Engine 4 – Part 4

Oct 15, 2014
 

Continued from Part 3

9. Now we will bring in our “Can Play?” bool. Alt – Left Click and drag it out into the Event Graph. You should get a “Set” node for your “Can Play?” bool; check the tiny box next to “Can Play?”.

Now we are going to need a “Get” version of your bool, so go back to the left-hand panel, Ctrl – Left Click on your “Can Play?”, and drag it out into the Event Graph. Now we will connect the nodes by dragging from their pins. Follow the picture.

BluePrint figure 2 from Unreal Engine 4

10. If you have your node tree set up as I have it in the picture, you are ready for the animation process. Right Click, type in “Add Timeline”, and hit enter. The Timeline node will show up. We are going to connect the pins from our Set and our Branch into the “Play from Start” input of the Timeline.

BluePrint figure 3 from Unreal Engine 4

11. Double click on the Timeline, which will take you to your Timeline graph. You can change the length of time where it says “Length”, but I’m keeping it at 5. We are going to create a float track to set our values. Click on the button with an “F” near the top of the Timeline graph.

BluePrint figure 4 from Unreal Engine 4

12. You now have something like this.

BluePrint figure 5 from Unreal Engine 4

13. We need to add 3 keys to the timeline. The 1st sets the default value, which will be 0; the 2nd key holds the value that will morph our mesh, which is 1; and the 3rd key will have a value of 0 to bring the mesh back to its default state. You can add your keys by right clicking on your track and hitting “Add Key”. Do this 3 times to create 3 keys. Their placement does not matter right now, as we will set their value and time by typing them in.

BluePrint figure 6 from Unreal Engine 4

14. Go to your 1st key and in “Time” and “Value” type in 0. This will make your 1st key go to Time 0 with a Value of 0 on your timeline.

15. Click your 2nd key, type 2.5 into “Time” (halfway to 5), and give it a “Value” of 1. (Remember how in Blender a value of 1 morphed our geometry; we need that value here.)

16. Now go to your 3rd key, type 5 into “Time”, and give it a value of 0 to return the mesh to normal.

Your graph should look something like this if you used my values. You might be zoomed in, so just scroll back.

BluePrint figure 7 from Unreal Engine 4

17. Let’s smooth out our curve, shall we? Right Click on each of your first two keys and hit “Cubic-auto”; this will smooth out the curve. Now that the graph has all the curves we like, go back to your “Event Graph” tab. We are almost done.

18. We are going to “Get” (Ctrl – Left Click) our SkeletalMesh1 and bring it into the Event Graph. We need a “Set Morph Target” node, and the only way to get one is by dragging out from the SkeletalMesh1 (the blue circle on the node) and typing “Set Morph Target”. Inside your “Set Morph Target” node there is an input called “Morph Target Name”; here we will type in “Morphing”, which is the name we set back when we created our Shape Keys. After you have your nodes set up, connect them to your Timeline. It should look like this!

BluePrint figure 8 from Unreal Engine 4

19. That’s it. Hit “Compile” at the top left corner of you window. Find your BP_*meshname* and throw it into your level. (simple click and drag into the level)

You have now successfully made a piece of Sverchok-created geometry, brought it into UE4, and animated it. Yay, new skills. High five!
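
If it helps to see the node logic as plain code, here is a rough, engine-agnostic Python sketch of what this Blueprint does: evaluate a 0 → 1 → 0 float track over 5 seconds and feed the result to the “Morphing” morph target. The smoothstep easing only approximates Unreal’s “Cubic-auto” interpolation, and set_morph_target is a stand-in for the real Blueprint node, not actual UE4 API:

```python
# Engine-agnostic sketch of the Blueprint above: a 5-second float
# track keyed (0, 0), (2.5, 1), (5, 0) drives a morph target.
def smoothstep(t):
    # Rough stand-in for Unreal's "Cubic-auto" key interpolation.
    return t * t * (3 - 2 * t)

def track_value(time, length=5.0, peak=2.5):
    if time <= 0 or time >= length:
        return 0.0
    if time < peak:                                       # ramp 0 -> 1
        return smoothstep(time / peak)
    return smoothstep((length - time) / (length - peak))  # ramp 1 -> 0

def set_morph_target(name, value):
    # Stand-in for the "Set Morph Target" node on SkeletalMesh1.
    print(f"{name} = {value:.3f}")

for frame in range(11):          # sample the track twice per second
    set_morph_target("Morphing", track_value(frame * 0.5))
```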

This article just scratches the surface of what is possible with this Sverchok to UE4 workflow. As I start understanding how to make more complex geometry, I hope I can create some really interesting psychedelic shapes. I highly recommend that anyone who followed along with this tutorial check out the Sverchok thread at http://blenderartists.org/forum/showthread.php?272679-Addon-WIP-Sverchok-parametric-nodes-for-architects&highlight=sverchok ; they are doing some amazing things with this plug-in.

I hope everyone had some fun learning something new and that this article was informative. If anyone following along has any questions, you can reach me on Twitter @luisfilmchavez. Hope to see you soon with another update on what we’re doing at the Timefire office. Peace out.

Written by Luis Chavez for TimefireVR

Sverchok to Unreal Engine 4 – Part 3

Oct 15, 2014
 

EXPORT TO UE4

1. Select the mesh with the Shape Keys applied to it and go up to File → Export → FBX.

Export mesh as FBX from Blender for Unreal Engine 4

2. You should now be in the export window. On the bottom left-hand side you should see a panel that says “Export FBX”. These are the default settings; we will change them.

Default Export dialog from Blender

3. To this. Pay special attention to the FBX Version – it MUST be 6.1 ASCII for this to work.

Unreal Export Settings for Unreal Engine 4

4. Now that you have the exact same export settings as me, name your file whatever you want and save it somewhere you will remember. Click Export FBX at the top right. It might take a few seconds, but once it’s done, go ahead and save and close out of Blender. We will be going into Unreal Engine now.
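
For repeat exports, the same settings can also be driven from Blender’s Python console. A minimal sketch, assuming a Blender 2.7x-era FBX exporter (where version='ASCII6100' selects the required FBX 6.1 ASCII); the file path is a placeholder:

```python
# Export the selected mesh as FBX 6.1 ASCII for UE4 morph targets.
import bpy

bpy.ops.export_scene.fbx(
    filepath="C:/exports/PracticeConeBall.fbx",  # placeholder path
    version='ASCII6100',       # MUST be FBX 6.1 ASCII for this workflow
    use_selection=True,        # export only the selected mesh
    use_mesh_modifiers=False,  # applying modifiers would discard Shape Keys
)
```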

INSIDE OF UNREAL ENGINE 4

I assume you already have Unreal Engine and know how to navigate through the UI, so go ahead and open up a new project or an already existing one. Your choice.

1. Once you have your project open, go to your Content Browser to import your Sverchok mesh; I called mine PracticeConeBall. Go up to Import, select your FBX file, and open it.

Import FBX from Blender into Unreal Engine 4

2. The FBX Import Options window will pop up. By default you will be in the Static Mesh tab. We need to be in the Skeletal Mesh tab, so click over to Skeletal Mesh, make sure Import Morph Target is enabled, and click Import.

Skeletal Mesh Import from Blender into Unreal Engine 4

It should now show up in whichever folder you were in. Mine was in an FBX folder I created.

3. Now that you have imported your mesh, we will set up the Shape Keys/Morph Target. Right Click on your imported mesh. A drop-down will appear; click on Create Blueprint Using… Clicking it will give you this.

Create Blueprint in Unreal Engine 4

4. When you name your Blueprint, start it with “BP_”; this is a nice way of organizing your assets. Once you name it and press OK, your mesh’s Blueprint will appear.

Blueprint Tab in Unreal Engine 4

5. Before we start dropping nodes in, we need to set up a variable. Hit the variable button on the left and a variable will appear. Most likely it will be a bool, which is perfect. Name your bool “Can Play?” and hit Compile at the top left.

Make Bool type in Unreal Engine 4

6. Now that you have compiled, a default value will show up for your newly created bool. Go to your bool’s Details panel and, under Default Value, click the “Can Play?” checkbox.

Check "Can Play" option in Unreal Engine 4

7. Let’s start bringing in some nodes. Right Click in your Event Graph to bring up your Blueprint menu, type in “Event Begin Play”, and hit enter. This will start the animation at the beginning of the play session.

8. Next we will make a “Custom Event” called “Play Again”. Right click to bring up the Blueprint menu, type in Custom Event, and press enter. Rename it Play Again. The third node we will bring in is a “Branch” node, so right click, type Branch, and hit enter. This is a true-or-false statement based on a condition, which will be our “Can Play?” node. You should have this so far.

BluePrint Figure 1 from Unreal Engine 4

Written by Luis Chavez for TimefireVR –  Read the last entry in this tutorial series in the next post

Sverchok to Unreal Engine 4 – Part 2

Oct 14, 2014
 

CREATING YOUR FIRST SVERCHOK GEOMETRY

Now comes the fun part: we get to play with Sverchok.

1. We will only be using the “Shift – A” shortcut to bring up nodes inside Sverchok. So go ahead, press “Shift – A”, and click Search.

Searching for Nodes in Sverchok for Blender


2. Inside search, type in “Object In” and click it or press enter. This node allows us to bring in geometry from the 3D viewport. Make two.

Two Sverchok Node items loaded in Blender


3. To bring in objects, we need objects to bring in, so let’s make that happen. Inside the 3D viewport use “Shift – A”; this allows you to create simple primitives very fast. We will grab an Icosphere and a Cone. Move your geometry to the side.

Make IcoSphere in Blender for use in Sverchok

4. Now that you have your Icosphere and your Cone, it’s time to “GET” them into our Object In nodes. Left Click on your Icosphere. Now, inside the Sverchok Node Editor, click on the top “Object In” (order is very important) and hit “GET”. Your top “Object In” now says Icosphere at the bottom of the node. Do the same for your Cone and the bottom “Object In”. You should have this.

Loaded Objects in Sverchok for Blender

5. Now we are going to bring in a couple of nodes all at once. “Shift – A” and search individually for Adaptive Polygons, Mesh Join, Float (just Float, not Range Float or Float 2 Int), and Viewer Draw. If you typed all of these node names correctly, you should now have this.

Viewing nodes in Sverchok for Blender


6. We will now connect all of these pretty nodes together.

Connected Nodes in Sverchok for Blender


7. Now, if you have the same values as me (1 in the Donor Width of the Adaptive Polygons node and 1 in the Float node), your geometry should look like this:

Controlling geometry in Blender using Sverchok

8. I’m going to explain what this node system does but first I want you to play around with those values so you can see what is happening because we are going to be playing with them to set up our Shape Keys. So go ahead and mess with those values.

9. OK, so after playing with those values, you might have an idea of what’s happening. The Icosphere object is receiving the vertices and polygons that the Cone object is donating (the Cones are being replicated on each face of the Icosphere). So if you increase the Donor Width, you are increasing the width of the Cone’s polygon that touches the Icosphere, which is the bottom of the cone.

The Float node is plugged into the Z_Coefficient. Changing this value adjusts the height of the cone on the Z axis in relation to the Icosphere’s faces.

The Mesh Join node does exactly what you would think: it joins all the meshes together so that we bake one object instead of 80 (which is the number of faces on the Icosphere). To see for yourself (this might crash your computer, so SAVE first), plug the Vertices and Polygons from Adaptive Polygons straight into Viewer Draw. You will get the same shape, but as 80 individual objects (not good for what we want to accomplish). Plug Adaptive Polygons back into Mesh Join and connect it to Viewer Draw.
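
If you want to confirm that count of 80, or check any donor object’s polygon count before baking, Blender’s Python console makes it quick. A small sketch; the object name assumes the default from Shift – A:

```python
# Count the faces on the Icosphere -- each one receives a Cone copy,
# so this is also how many objects Mesh Join saves you from baking.
import bpy

ico = bpy.data.objects["Icosphere"]  # default name for a new Ico Sphere
print(len(ico.data.polygons))        # 80 for the default 2-subdivision sphere
```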

10. Now that you understand what is going on, click “BAKE” on the Viewer Draw.

11. You don’t see anything at first and that is because our mesh has been created directly under our Sverchok geometry, so left click on the blue mesh, you will select the baked mesh under which is what you want. Click on red arrow and drag to the side.

Baked mesh prepared in Sverchok for Blender

12. You just created your first Sverchok geometry.

SETTING UP SHAPE KEYS/ MORPH TARGETS

1. Setting up Shape Keys/Morph Targets is super easy now that we have our first Sverchok mesh already baked. We are going to bake another Sverchok mesh, but this time we will change the values of the Donor Width and Z Coef, so do that now. I’m going to change my Donor Width to 2.5 and my Z Coef to 0.5, then click “BAKE”. (You can use whatever values you like, or you can use mine.) Mine looks like this.

Two Coneballs created with Sverchok in Blender

Now we are done with Sverchok.

2. Now we are going to set up our Shape Keys. Select your base object (one of the two objects you created).

3. Once you have selected one of your models, go over to the Properties panel on the right and click on the Data tab, the small tab that looks like an upside-down triangle.

4. Here we will set up our Shape Keys. Go down to Shape Keys and expand the area. Click on the + button to add your Base Shape Key.

Object Data parameters in Blender

5. Your Base Shape Key is the default shape your object will take when the value is 0. Now we will create the Shape Key that our base object will morph into. Select the mesh that you didn’t apply the Shape Key to, then Shift Click on the mesh with the Base Shape Key.

6. Now that both are selected, go back to Shape Keys and click the black arrow that is pointing down. A drop-down menu will appear, and one of your options will be Join as Shapes. Click it to apply the shape of the second mesh to your base mesh.

Join as Shapes in Blender

7. You will now see a second entry in the Shape Keys panel under Basis. Double Click on the new entry and rename it “Morphing”. With your second Shape Key selected, you should see a “Value” slider appear under the Shape Key box. This will drive our Shape Key animation. Go ahead and play with the Value by dragging the slider or clicking on it and typing in a number from 0 to 1. Dragging it from 0 to 1, you will see our mesh morph into the second mesh’s shape.

Shape Key complete in Blender


You have now set up your geometry with Shape Keys. Next we will export it for Unreal Engine 4.
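
These steps can also be scripted, which is handy once you start baking lots of variations. A hedged bpy sketch, assuming the Blender 2.7x-era API; the object names are hypothetical, so substitute the names of your two baked meshes:

```python
# Script the Shape Key setup: the base mesh gets a Basis key, then the
# second mesh's shape is joined in and renamed "Morphing".
import bpy

base = bpy.data.objects["ConeBallA"]   # hypothetical names -- use your
donor = bpy.data.objects["ConeBallB"]  # own two baked Sverchok meshes

base.shape_key_add(name="Basis")       # step 4: the default shape

# Steps 5-6: select the donor, make the base active, Join as Shapes.
bpy.ops.object.select_all(action='DESELECT')
donor.select = True
base.select = True
bpy.context.scene.objects.active = base
bpy.ops.object.join_shapes()

# Step 7: rename the new key and drive its value.
key = base.data.shape_keys.key_blocks[-1]
key.name = "Morphing"
key.value = 1.0                        # 0 = base shape, 1 = fully morphed
```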

Before we leave, if you want to play around, try inserting different objects into your “Object In” nodes. (Beware of the number of polygons your objects have before doing this; it may crash your computer if the count is too high.) I made a couple of variations just for fun.

Creating Sverchok meshes and bringing them into Unreal Engine 4 at TimefireVR in Scottsdale, Arizona

Written by Luis Chavez for TimefireVR

 

 

Sverchok to Unreal Engine 4 – Part 1

Oct 13, 2014
 


Hey everyone, this week I’m finally going to show you the process of making a simple Sverchok-created mesh, applying animation through Shape Keys, and then importing your mesh and its animation into UE4. Depending on your skill level with Blender and Unreal Engine 4 this might seem like a lot, but it actually isn’t. I’ve already done all the head scratching, cursing, and crying that goes with figuring out a functioning workflow for these programs, so be calm, sit up straight, and prepare to learn.

SVERCHOK INSTALLATION

1. First we need to download the Sverchok plug-in for Blender here:

https://github.com/nortikin/sverchok

You’re going to want to download the ZIP, it’s on the right hand side. Don’t UNZIP it.

Download Sverchok from its Github repository

2. We will be going into Blender to activate it from there. Open Blender.

3. Once Blender is open, go to the upper left-hand corner to File -> User Preferences, or hit Ctrl-Alt-U.

4. Inside User Preferences, click on the Addons tab and click Install from File; it looks like this.

Addons tab in the User Preferences of Blender

5. Find the Sverchok ZIP file you downloaded. Blender should internally unzip the file and install it into your Blender addons folder.

Choosing the Sverchok Addon for Blender from your download location

Click Save User Settings at the bottom left corner of User Preferences. To make sure Sverchok is enabled, search for Sverchok in the search bar at the top left of User Preferences. If the installation worked, Sverchok should be the only addon visible; make sure its checkbox is checked.
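
The same install can be done from Blender’s Python console if you prefer. A sketch assuming the Blender 2.7x-era operators (these moved in later versions); the ZIP path and module name are placeholders that may differ per release:

```python
# Install and enable the Sverchok addon from the downloaded ZIP.
import bpy

bpy.ops.wm.addon_install(filepath="C:/downloads/sverchok-master.zip")
bpy.ops.wm.addon_enable(module="sverchok-master")  # module name may vary
bpy.ops.wm.save_userpref()  # same as clicking "Save User Settings"
```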

6. BOOM. You have Sverchok.

SETTING UP SVERCHOK NODE TREE

Before I go into the process, I really urge you to check out the Sverchok thread at blenderartists.org; I’ve learned a ton there and the developers are incredibly helpful. I have very little math and architectural knowledge and I was still able to create some pretty cool buildings with it. For all the architecture students who don’t have the money to shell out for Rhino and Grasshopper, Sverchok is the next best thing. Actually, screw that, Sverchok is better. GO TEAM SVERCHOK!!!

http://blenderartists.org/forum/showthread.php?272679-Addon-WIP-Sverchok-parametric-nodes-for-architects&highlight=sverchok

Ok so here we go.

1. So you probably already have Blender open. Cool. We’re going to set up your workspace for Sverchok. Click on the bottom left corner of your geometry window like this:

Default Blender view upon starting the program

2. If you clicked on the corner and dragged up you should see something like this.

Splitting Windows in Blender

3. Now we are going to switch the bottom editor window to the Node Editor.

Change Timeline Window to Node Editor view in Blender

4. Now that you have the Node Editor open, we want to switch it from the Shader tree to the Sverchok Node Tree.

Ready to work with the Sverchok Addon in Blender

5. If you have this, give your computer screen a high five; I’m doing the same at the time of this writing. Good job!

Written by Luis Chavez for TimefireVR

VR and The Alchemy of Communication

Oct 04, 2014
 
The Crescent Bay VR Headset as seen at Oculus Connect in Los Angeles, California

The Crescent Bay VR Headset

Recently I attended a gathering of future alchemists: the tribe who discovered the ability to harness this power to turn metals into gold, though they are not fully ready to give us their secrets. They are known as the Oculus. Like the apprentice witnessing the magic of the old wise ones, we sit on the sidelines and try to decipher what the signs mean. We are the attendant developers. Due to the gravity of what the Oculusians have discovered, and their appearance of maybe being a different species than our own, we sit transfixed, hanging on their unique communication skills. In effect we are the Neanderthals, they the Homo sapiens. This is the gulf of knowledge and skill occurring in real time among us competing tribes. We are living through an evolutionary moment in history that is about to define the trajectory of the dominant species. Our goal as the invited guests of this meeting called “Oculus Connect” is to assimilate the language and customs of our new teachers. They have parceled out small fragments of the words that give them their power. They have shared fetishes called HMDs, the “beads” of the 21st century.

Brendan Iribe CEO of Oculus at Oculus Connect in Los Angeles, California

Brendan Iribe CEO of Oculus

It would have been warmly welcomed had the missionaries from Oculus let their founder speak with us first. Instead he was hidden from us, only making a short appearance via video. Maybe they thought the power of Palmer Luckey might be too great for us to witness, or that they had to pave the way for our minds to be expanded over the course of the day in preparation for the Oracle of the Virtual. In his stead we were greeted by their chief, Brendan Iribe. It was not a poor performance; on the contrary, by the end of his grand talk we were enthralled to have been the first of our peoples to behold the golden Crescent Bay. This device is so powerful that one’s first glimpse could only be through simulation: a photo depicting its eminence was brought before our eyes. I have since heard there were those who were blinded by the sorcery offered that day.

Chief Scientist Michael Abrash of Oculus at Oculus Connect in Los Angeles, California

Chief Scientist Michael Abrash

Next up to speak to the chosen 1,000 was a scientist, a man of great mental scope and intuition called Michael Abrash. Had we been lesser people, his code might have been indecipherable, but either through his ability to speak of the complex in simpler terms, or through our gathering enlightenment, we went on a ride that helped explain why now is the time of VR! The room sat in silence, hanging on these words of wisdom.

John Carmack "Melting someone's brain" at Oculus Connect in Los Angeles, California

Chief Technology Officer John Carmack

What happened next defies one’s ability to comprehend what must surely be an illusion, a bending of reality, a deception of magic. On this day the laws that govern the secret behavior of electrons and photons were altered in ways that will never allow us to be the same as when we entered this great hall. While history will hail the efforts and skills of the founder of Oculus, it was the raw, overwhelming mental giant of all things on the molecular scale, John Carmack, who bathed our brains in the incandescent glow of profundity that only true genius is able to accomplish. Heads hung in shame at not being able to grasp the immensity of the words spoken, people wept in his presence, while yet others simply had to leave the space where such a power was on display; thus is the power of Carmack. New vocabularies were explored during the hour he was allotted on his podium of the profound. If anyone else in attendance cares to share a fraction of what was said that day, let him come forward and enlighten the masses.

Our emissaries attended workshops where specialization in particular skills was being offered. They will be responsible for bringing back their observations to educate those of us who remained on the periphery. We on the periphery are the philosophers and the plenipotentiaries, trying to decide the meaning of what VR will bring to our peoples. While mastery of the world around us has proven invaluable and given us special powers, it has also been fraught with conflict and the abuse of power. So we must think hard and do our best to understand how this will alter our relationship with our fellow citizens, travelers, merchants, and explorers. Our task of interpreting the relevant cultural iconography and histories that should play a role between our peoples cannot be taken lightly, for “Content” is the holy grail we are being entrusted to seek out and discover.

For well over a year now, a cadre of us global representatives have toiled with the tools we have been told will be essential to the creation of this new alchemy, such as the Unreal Engine, Unity, Blender and Maya, 3D-Coat and ZBrush, Substance Designer and now Painter too. We still cling to the old tools until they are proven outdated, such as C++, 3DS Max, Photoshop, and xNormal; these were the early tools our ancestors forged in order to carve a path to this new world.

Chief of Developer Relations Aaron Davies with the Director of Lawnmower Man, Brett Leonard

Chief of Developer Relations Aaron Davies with the Director of Lawnmower Man, Brett Leonard

While I would like to report the secret goings-on within the confines of the private indoctrination ceremonies called “Workshops”, I instead found myself in direct contact with three representatives from Oculus, known as Callum, Aaron, and Chris. They were on hand to steal a peek at the work this seven-person team known as Timefire has been crafting in the seclusion of the desert, far away from the goings-on of the technological elite who might steal from this nascent group of prospectors on their own quest for gold. Aaron was immersed around the corner, taken by a “Mobile VR App” (at the moment this secret weapon appears to hold much weight with the Oculusians), and could not grant me a proper audience, but this was nothing to despair of, as Sir Callum gifted me his Card of Passage, encouraging me to remain in close contact.

Typical reaction to experiencing mobile VR for the first time - at Oculus Connect in Los Angeles, California

Typical reaction to experiencing mobile VR for the first time

Invigorated, for how else should one feel following such intense scrutiny, I attempted to gather my thoughts, but I must admit weakness. At an age where youth has now escaped me, I found myself troubled by the prospect that my insight may finally be growing frail. Is my path to Seated-and-Tethered Virtual Reality the correct course, or do my observations of a youthful class of exuberant firebrands embracing this barely understood “Mobile VR” portend things I cannot yet dream of? The cave I must return to in order to contemplate these lofty questions will not exist in our physical reality much longer; the virtual cave awaits me. Let me wish that the virtual incarnation of myself be wise and courageous as he steps into this new Alchemy of Communication. Long live the Oculus, long live the Virtual and Virtuous.

Allegorithmic Tools in Game Dev

Oct 01, 2014
 

Importing Substance Designer files into Unreal Engine 4 for Timefire in Scottsdale, Arizona

Today’s post might seem a bit obtuse, but the workflow is actually quite easy, fun even, though you are going to have to learn a few things along the way. Allegorithmic is a software company that embeds genius in its work, and it is from these smart tools that Timefire has adopted the entirety of its materials workflow to make our pipeline shine. Sitting between these tools and our game engine (Unreal Engine 4) is an Allegorithmic plugin that allows us to bring the small and efficient Substance Designer material known as an .sbsar into the game. While all of the mechanics and understanding needed to put the big picture together are complex, they are fairly quickly learned. The short version is that we are able to bring materials out of our authoring environment (Substance Designer, Painter, and Bitmap2Material) into UE4 with the drop of a single file. That file is then auto-magically assembled into a basic material in Unreal that is ready to be placed in game or can be further manipulated with Epic’s mighty Material Editor.

Let me give you another example of how easily this process can work.

  1. Take a photo
  2. Drag it into the Graph view in Substance Designer
  3. Connect the photo to the new Bitmap2Material 3.0 Pro node
  4. Manipulate image in PBR workflow, add grunge, contrast, saturation and a host of other adjustments, masks, or channels
  5. Publish .sbsar
  6. Import .sbsar into Substance Painter
  7. Hand paint and customize your texture while it’s on a mesh or add animated particle effects to simulate real world attributes raining down on your material
  8. Re-import the textures from Painter in Designer to assemble a new material – (I don’t like this part, but Allegorithmic has promised that the programs will one day talk with one another)
  9. Publish .sbsar
  10. Drag and drop into Unreal Engine 4 and your new material is ready for game

Graph view from Substance Designer

Another example of the power of the Substance workflow comes from Designer, but first a trick question: how do you get a dozen textures to work as a single material that is under 1MB, even with Normals, Roughness, and Metallic? You use Designer, obviously, because if you use Photoshop you are going to have dozens of files and hundreds of MBs of image data. Having access to the Allegorithmic Database of procedural textures is lucky for anybody who has made the investment – they are not cheap. They are very efficient and amazing in what they are; these guys have figured out how to make an egg using a bunch of noise and math nodes that produce a 20KB egg. Using the Multi-Material Blend node in Designer, we can bring a dozen procedural textures into our “Graph” (this is where materials are assembled) and, using an SVG (Scalable Vector Graphics) node baked from the UV map you created in your favorite 3D modeling software, we can mask the specific areas we want to texture on our mesh.
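
To put rough numbers on that size claim, here’s a quick back-of-the-envelope sketch (my own illustrative figures, not Allegorithmic’s): a dozen 2K uncompressed RGBA bitmaps versus a dozen small procedural recipes.

```python
# Rough size comparison: baked bitmaps vs. procedural recipes.
# Illustrative numbers only -- real files compress, and .sbsar sizes vary.
SIZE = 2048              # one 2K texture edge
BYTES_PER_PIXEL = 4      # uncompressed RGBA
MAPS = 12                # a dozen textures in the blend

bitmap_bytes = SIZE * SIZE * BYTES_PER_PIXEL * MAPS
print(f"12 uncompressed 2K maps: {bitmap_bytes / 2**20:.0f} MB")  # 192 MB

procedural_bytes = MAPS * 20 * 1024  # twelve ~20KB recipes, like the egg
print(f"12 procedural recipes: {procedural_bytes / 2**10:.0f} KB")  # 240 KB
```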

Substance Designer material at work in Unreal Engine for Timefire LLC in Scottsdale, AZ

The chairs, sides of the tracks, and the track system are made from a mix of Allegorithmic’s procedural textures and PBR textures from our friends at Game Textures. While this is not purely procedural, the package that is pulled into UE4 is significantly smaller than it would have been otherwise, and it is quickly and automatically assembled into a material. As we only have one traditional graphic artist on staff, this way of working technically with textures saves us a lot of time and allows the level builder to rapidly prototype variations that would quickly overwhelm our graphic artist.

I’d like to point out that a well stocked library of textures is very useful in Substance Designer. Setting up the Library right now is tedious and time consuming, hopefully it will be improved in time, as managing a significant number of textures is difficult. How many is a significant amount? We currently use the Database from Allegorithmic as I pointed out above that includes over 750 procedural textures, we also have more than 1000 textures from Game Textures, more than 1,700 textures from CGTextures, and over 400 displacement maps from the guys at Surface Mimic – which I should also point out are an invaluable addition to our 3D-Coat and Blender workflows.

Substance Painter from Allegorithmic

This is Dani’s first real attempt at working with Substance Painter. She modeled the mermaid in Blender, created too many UV maps, and dug into figuring out how to paint her 3D mesh. Dani created the base material in Substance Designer, imported it into a Fill Layer, and then went to work trying out a bunch of the various tools that Painter offers. If this model were to be used in game, we would export the bitmaps from Painter and then reassemble them in Substance Designer, where we could publish the .sbsar file to be dragged into Unreal Engine. As an added benefit, we could also take elements of our multi-layered images and open them in Photoshop, thanks to a direct connection between Substance Designer and Photoshop (a recent addition).

Allegorithmic is constantly improving the software, workflows, and plugins it has created. As a matter of fact, this version of Painter is still in beta, with the official 1.0 not due out until mid-October. Once these guys mature these products to the point where library functions, interoperability between programs, and performance are improved, they will rule the materials-creation world.

Bitmap2Material 3.0 from Allegorithmic

Sep 30, 2014
 

Bitmap2Material 3.0 and the PBR workflow

Bitmap2Material 3.0 was released by Allegorithmic yesterday and now boasts a PBR workflow. Physically Based Rendering, or PBR, has been making great inroads this year, with all game engines now supporting it or about to. For those who need to know, PBR makes different surfaces appear more photo-realistic because its material channels model how light actually bounces off each kind of surface. If you are interested in knowing even more about how PBR works, the guys at Marmoset have a great article that goes into depth about the specification; click here to read it.

After you install the program, all you need to do is drag an image into the interface and Bitmap2Material will compute all of the required channels, such as Base Color, Roughness, Metallic, Diffuse, Specular, Glossiness, Normal, Height, Displacement, Bump, Ambient Occlusion, Curvature, and Detail Normal. But that is only a small part of the magic on offer; it is the parameters in the right column that really show off the power of B2M 3.0. Besides now being able to work with 8K, 16K, and 32K (gasp) textures, there are eight other main categories of options to affect your image. A caveat regarding those super-large images: I’m using a GTX 980 with 4GB of VRAM, and 8K images bring this new card to its knees. It would appear that a Titan with 6GB or a Quadro card with 12GB would be required for the heavy lifting that those sizes and larger would demand.
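
The VRAM arithmetic backs that caveat up. A quick sketch using uncompressed RGBA figures (B2M’s actual internal formats may differ, so treat these as rough lower bounds):

```python
# Why 8K+ source images hurt: uncompressed memory for a single map.
# Assumes 4 bytes per pixel (RGBA); real internal formats may differ.
for edge in (8192, 16384, 32768):
    mb = edge * edge * 4 / 2**20
    print(f"{edge // 1024}K map: {mb:,.0f} MB uncompressed")

# 8K  ->   256 MB  (a dozen channels already crowd a 4GB card)
# 16K -> 1,024 MB
# 32K -> 4,096 MB  (one map alone exceeds the GTX 980's VRAM)
```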

With your image loaded, it’s time to get busy setting up your new material for export. The list of operations and adjustments is lengthy, too much to dive into here today. Better that you download the FREE TRIAL and start exploring what Allegorithmic has unleashed. Click here to grab a copy.

Bitmap2Material 3.0 Pro offers a node that works within Substance Designer - also from Allegorithmic

Bitmap2Material 3.0 Working in Substance Designer

While this new incarnation is a fantastic development, it is what is included in the Pro version that is truly amazing for our work. Allegorithmic has created integrations that allow B2M 3.0 to work inside 3DS Max, Maya, Modo, Unity, and Unreal Engine (sadly not Blender), but even this is not what makes this version truly special. It is the inclusion of two nodes that offer the full functionality of B2M 3.0 inside Allegorithmic’s Substance Designer. One of the new nodes is purely for a PBR workflow; the second is a dream node here at Timefire, as it has been specifically created with the Unreal Engine 4 material workflow in mind.

Once the node is installed, drag it into the Graph view and bring any bitmap into the program. Feed the output of the bitmap into this specialized node, then the outputs of the Bitmap2Material node into the Output nodes, and the rest of the work is done for you. In mere seconds, the outputs are calculated and Normals, AO, Curvature, Height, Roughness, and more are ready for export or further modification. It is that easy to use.

Bitmap2Material 3.0 from Allegorithmic working within Substance Designer

Detail of the B2M 3.0 Node in Substance Designer

Clicking on the Bitmap2Material 3.0 node in Substance Designer opens the “Instance Parameters” column, which allows the same granularity of modification found in the full B2M 3.0 program. Something else needs pointing out: this version of B2M supports Mikktspace tangents, a way of calculating Normals popular with xNormal, Blender, 3D-Coat, and, as I understand it, Unreal Engine. We have yet to test exactly what this means for our workflow, but anything that brings better quality and compliance with industry-respected tools is a welcome addition. While B2M 3.0 supports Mikktspace tangents, users of Substance Designer will have to wait a short while until those guys at Allegorithmic push out version 4.5 – rumored to be coming SOON.

 

Pre-Conference at Oculus Connect 1

Sep 19, 2014
 
John Carmack of Oculus and John Wise of Timefire at Oculus Connect in Los Angeles, California

John Carmack and John Wise

Like a rock star on stage, John Carmack of Oculus (and of course Doom fame) was surrounded in the lobby of the Loews Hotel in Los Angeles as attendees arrived for the first Oculus Connect conference. I had the chance to speak with the man regarding GPU developments, Nvidia, PC rendering, and Epic’s role in preparing UE4 to work on Samsung’s GearVR. After over an hour of fielding questions, taking photos, and signing a guy’s copy of Wolfenstein on 3.5″ floppy for PC, he was called away. Super personable guy, no pretension, on his game in ways that make geeks drool.

Hilmar Veigar Pétursson of CCP Games, the makers of Eve Online and John Wise of Timefire at Oculus Connect in Los Angeles, California

Hilmar Pétursson of Eve Online and John Wise

I went upstairs to finish registration and ran into Hilmar Pétursson of CCP Games, the makers of Eve Online! This is turning out to be one spectacular day. Just before heading up to the mezzanine, I ran into Aaron Davies of Oculus (Director of Developer Relations), who promised to be at my meeting on Saturday, where we’ll demonstrate our work in progress on Timefire.

The Oculus Connect swag bag

The swag bag is kinda empty: a t-shirt is in there, but not an Oculus next-gen 4K Rift, or a free GTX 980, or a GearVR – though I’m holding out that some kind of magic is in the air and we will see something to satisfy everyone’s sense of greed. Hell, I’d be happy with half a dozen bobbleheads in the likeness of Palmer Luckey, Brendan Iribe, Nate Mitchell, Michael Antonov, and John Carmack.

Dov Katz of Oculus VR at the Oculus Connect conference in Los Angeles, California

Dov Katz

Prior to lunch (not that I’ve eaten breakfast or lunch yet), I ran into Dov Katz, Senior Computer Vision Engineer for Oculus. I first met Dov at GDC earlier this year when I hoped to get an early photo of the DK2, but he asked me to respect their embargo, so I obliged. To appreciate the brilliance of this guy, you should watch his talk at Carnegie Mellon University on VR.

Virtual Reality Meetup – L.A.

Sep 18, 2014
 
James Iliff of Survios.com talking with Josh Constine of TechCrunch at VR meetup in Los Angeles, California

James Iliff of Survios and Josh Constine of TechCrunch

I got to the Mondrian Skybar promptly at 6:30, joining a bunch of early birds. As a relative newcomer to virtual reality and game development, I don’t recognize anyone, so I start listening and looking. The first guy I meet is Graham Matuszewski, co-founder of Survios, formerly known as Project Holodeck. I picked him to introduce myself to due to his righteous big red beard. The photo above is of one of the other co-founders of Survios, James Iliff, talking with Josh Constine of TechCrunch. What I caught of the talk helped affirm Timefire’s direction: stick with one of the big game-dev engines, be ready to scale, and build with the future in mind so you can add brilliance and shine later, even if you are currently building for speed or mobile.

Trying the Sixense Stem System at TechCrunch VR Meetup in Los Angeles, California

Trying the Sixense Stem System

Finally I recognize someone: it’s Amir Rubin from Sixense; we’d met at GDC (Game Developers Conference) in San Francisco earlier this year. Wow, what a difference six months have made; the accuracy of the Stem System is ASTONISHING. It only took seconds to be convinced that this will be part of the VR ecosystem as various developers look at ways to let their players interact, play, grab, paint, and otherwise use their hands in virtual worlds.

Graham Matuszewski, co-founder of Survios trying the Samsung GearVR headset built with the help of Oculus. At the TechCrunch VR Meetup in Los Angeles, California

Graham Matuszewski, co-founder of Survios

I ran into Graham Matuszewski again later in the night trying out the Samsung GearVR headset, just before I got to try it myself. GearVR has been developed in cooperation with OculusVR and promises to be one of many collaborations Oculus will enter into. The headset was here courtesy of the guys behind NextVR, who are out capturing the world in 360 degrees so you can enjoy it on your flight, at home, or in a car (not in the driver’s seat, please), all while wearing this stylish little device. I believe the screen I was looking at is indicative of what we are going to see in the consumer version of the Oculus Rift headset – at a minimum, it is BEAUTIFUL! The resolution and clarity are spectacular. Add to this the announcement Nvidia made today regarding their new GTX 970 and GTX 980 GPUs, and VR is getting ever closer to really rocking the world.

Los Angeles at night as seen from Loews Hotel on the 18th floor

Before leaving I had the chance to briefly talk with Cymatic Bruce, who was here showing off AltspaceVR; neat stuff, and I can’t wait to see what they do with their API and SDK. I talked a good long while with Kyle Russell of TechCrunch – a seriously enthusiastic guy loaded with a million questions and an equal number of people running into him to say hi. The BIG surprise was a chance encounter with “The Machine”, Peter Sistrom of TouchDesigner. The Machine he truly is. He is one of the brains behind the art of ISAM, the projection-mapping brilliance that toured with Amon Tobin. We talked a short while about things VR and how the guys behind TouchDesigner have implemented support for the Oculus. I explained some things I’m looking for regarding compatibility between Unreal Engine 4 and TouchDesigner; let’s see what develops. There were other encounters and business cards exchanged, but it was getting late and I needed some dinner – at 11:00 p.m. Back out in Glamour City, Sunset Blvd and then Hollywood Blvd on a Thursday night, the streets were packed. I pulled into a favorite old haunt from my teen years growing up in L.A., and sure enough the place was so full I had to eat in my rental car… a lowly Hyundai in a city of Bentleys and Lamborghinis.

Grab Bag of VR Wishes

Sep 17, 2014
 

Jiri Wehle on the streets of Zeithausen - Timefire LLC Scottsdale, Arizona

The more I learn about VR and the intricacies, details, and nuances of things I’d like to see happen, the more complex the picture becomes with regard to what will be necessary to realize these ambitions. The problem is that there are already dozens of creative tools that allow us to peel back the fabric of reality and manipulate what our players will see. Those tools represent a near infinity of possibility, and yet there is room for improvement.

This week saw Ton Roosendaal of Blender fame agree with me that he too would like to see a direct connection between Blender and UE4. Hopefully a conversation will start in the Blender Game Engine forum regarding just such a possibility. Maybe Epic can get behind this, and we can get Bastien Montagne and his helper Jens to create this great feature.

Someone needs to encourage those geniuses building Sverchok (the Blender addon) to whip together a node that allows for better UV unwrapping of parametric objects. As it stands, we cannot currently bring in objects with individual UVs and textures (unless we slog through the tedium of unwrapping and texturing possibly hundreds of objects), but if we want to assign a single material, we’re golden. We already have the world; now we want the icing too.

Finding buoyancy but the water still cuts through the boat in TimefireVR - Scottsdale, Arizona

Help us find occlusion….

Our guy Luis has been beating himself up jumping into C++ in order to harness the power of the Gerstner wave so we can have boats that float gently on the water with a buoyancy that lends realism to our scene. Well he finally figured that out, but now we’ve got to master how to occlude that water from the inside of the boat.

Hey Epic, how about a Lit Mode for UE4 that would let us see some relation to how the Oculus and various graphics cards (via profiles) will render our geometry? A kind of FPS-density mode so we can visualize which areas are killing our 75fps perfection?

While many people may not know about sIBL GUI, I do, and I will get around to using it one of these days for the creation of our skyboxes in UE4. You see, with sIBL we can make 360-degree HDRIs (High Dynamic Range Images); then all we need to do is map those to a sphere and make a new sky in our game. Well, wouldn’t it be really cool if someone wrote a plugin that allowed sIBL GUI to output a Skybox or handshake directly with UE4? Okay, my apologies; I suppose this might be a bit too pedantic and lazy as I wish for everything in a perfect VR world.

Sep 12, 2014
 

Shiver me timbers, bitches!!! It’s me, Luis, back to feed you baby birds with an update on what I’ve been doing at the TimefireVR office. Since my Sverchok post I have been diving deeper into Unreal Engine 4, mostly Blueprints, because what’s not to love about programming without the confusion of all those pesky lines of code (mmmmmmmmhhhh, yummy nodes).

Boats in the virtual water using Unreal Engine 4 for TimefireVR in Scottsdale, AZ

Look at that wood grain…. Ladies….

Last Friday John temporarily dragged me away from my Blueprint immersion and gave me the task of creating a couple of boat meshes and texturing them to fill the lovely canals that will allow one to navigate our world. I, wanting to stay gainfully employed and housed, took to Blender with a supreme confidence that the Black Mamba himself (Kobe Bean Bryant) would admire – confidence that was later replaced with supreme doubt (please still like me, Kobe!) when I saw the multitude of components needed to make a motor for a boat. Still, I carried on, making and texturing that motorboat and a much simpler rowboat.

Gerstner Wave formula used by Timefire in UE4 for creating water

Once I handed the boats over to our resident UE4 master, Joe Cunningham, we quickly saw that we needed to add buoyancy and collision before they could set sail on the virtual sea. My eyes shimmered with excitement (in the most manly way imaginable); I knew that our problem could only be solved with Unreal Engine – and MORE BLUEPRINTS! What I didn't know was that I would have to incorporate my arch-nemesis… Mathematics (cue dramatic music). More specifically, I had to implement the Gerstner wave formula.
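For anyone curious what that formula actually looks like, here is the summed Gerstner form (after the well-known GPU Gems water chapter; the notation here is mine): the displaced surface position P at grid point (x, y) and time t is

    P(x, y, t) = ( x + Σ_i Q_i A_i D_i.x cos(ω_i D_i · (x, y) + φ_i t),
                   y + Σ_i Q_i A_i D_i.y cos(ω_i D_i · (x, y) + φ_i t),
                   Σ_i A_i sin(ω_i D_i · (x, y) + φ_i t) )

where, for each wave i, Q_i is the steepness, A_i the amplitude, D_i the horizontal direction, ω_i the frequency, and φ_i the phase constant. Each term pinches vertices horizontally toward the crests while raising and lowering them vertically, which is what gives Gerstner waves their choppy look compared to plain sine waves.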

If you're saying to yourself, “Luis, you're not qualified to turn this kind of mathematics into a node-based material to displace objects!” – you, my friend, are too quick to judge, but you are likely right. There's no way I could have figured out how to do this on my own. Luckily, UE4 has a great community, so I found a tutorial on how to do it. Here is the link: http://www.youtube.com/watch?v=PBWLfpm0K0o

I'm still working on the buoyancy part, which has some coding in it, so for now I've only been able to generate the animating ocean material shown in the boat picture. The gist of the buoyancy idea is sketched below.
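If you're wondering what that coding looks like, here is a rough, hypothetical sketch of the idea in UE4 C++ – not our working code, and WaveHeightAt and BuoyancyStrength are made-up names standing in for whatever reproduces the water material's displacement on the CPU:

    #include "Components/PrimitiveComponent.h"

    // Hypothetical helper that mirrors the Gerstner material's displacement.
    float WaveHeightAt(float X, float Y, float Time);

    // Spring-like buoyancy at a single hull test point: if the point sits
    // below the wave surface, push up in proportion to its depth.
    void ApplyBuoyancyAtPoint(UPrimitiveComponent* Hull, const FVector& Point,
                              float Time, float BuoyancyStrength)
    {
        const float WaveZ = WaveHeightAt(Point.X, Point.Y, Time);
        const float Depth = WaveZ - Point.Z; // > 0 means the point is underwater
        if (Depth > 0.f)
        {
            Hull->AddForceAtLocation(FVector(0.f, 0.f, BuoyancyStrength * Depth), Point);
        }
    }

Run that for a handful of points spread around the hull every tick, add some damping so the spring doesn't oscillate forever, and the boat bobs instead of launching itself into orbit.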

So that's what I've been doing on my end. As promised, I will be following up my Sverchok overview in a week or two with a step-by-step guide on how to go from making a Sverchok model to importing it into UE4 with some animation. Stay tuned, and be sure to follow our blog and social links somewhere on this page.

Written by Luis Chavez for Timefire

Co-Founder of Oculus Brendan Iribe Gives Large Gift to University

 News  Comments Off on Co-Founder of Oculus Brendan Iribe Gives Large Gift to University
Sep 12, 2014
 
John Wise and Brendan Iribe at Steam Dev Days in Seattle, WA

John Wise and Brendan Iribe at Steam Dev Days in Seattle, WA

The power of Virtual Reality continues to gain momentum: Brendan Iribe, co-founder of Oculus, has made a $31 million donation to the University of Maryland, where a computer science center is to be named after him. A million dollars of the gift will go to fund a scholarship in his name. Another co-founder, Michael Antonov, is throwing in $4 million, while Iribe's mother, Elizabeth Trexler, is putting up $3 million to endow two professorships in computer science. The above photo is of Timefire's founder John Wise meeting Brendan Iribe at Valve's Steam Dev Days in Seattle, Washington, held in January 2014.

A.I. in UE4 Overview

 Bits and Pieces  Comments Off on A.I. in UE4 Overview
Sep 11, 2014
 
Detect Enemy Task Blueprint created by Ariana Alexander for TimefireVR in Scottsdale, Arizona

Click image for larger view

As you may have noticed from the complicated Unreal Engine graph you see above, I have moved on from the craziness that is Character Blueprints and into the craziness that is character A.I. (Artificial Intelligence). As I'm sure you can imagine, there is a big difference between controlling a character yourself using a keyboard or a controller and telling the computer how to control a character. Computers, and therefore your A.I. characters, are stupid; if you don't tell them exactly what to do, they just sit there and do nothing. That's not super exciting when you're making a video game. Quite the opposite, in fact.

So to combat this, I decided to educate myself in A.I. Controller Blueprints, Data Assets, and Behavior Trees inside of Unreal (UE4), along with a few other things. Even though I only really needed to know how to make a character walk from one point to another, I thought it would be better to delve a little deeper into the Behavior Tree process in order to understand it, so that in the long run I could do a bit more than make a character walk from one point to another. The Blueprint you see above is actually a Blackboard Task Blueprint that detects whether an enemy is close by (a rough C++ version of the same idea is sketched below). If the player's character moves close enough to the NPC (Non-Player Character, or A.I. character), the NPC will chase the player's character (in this case, mine, since I'm the one testing it) and, once in range, shoot it (me). Yeah. I scared the crap out of myself the first time I tried it while testing someone else's code – Peter L. Newton's code, to be precise. He has a YouTube channel with a series of tutorials demonstrating this entire process. Shut up; you would be startled too if an NPC came at you out of nowhere at top speed and started shooting at you.
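My version of this lives entirely in Blueprints, but for anyone who thinks better in text, here is a rough C++ sketch of the same idea. To be clear, this is neither Peter's code nor ours; the class and property names (UBTTask_DetectEnemy, DetectRadius, TargetKey) are invented for illustration.

    #include "BehaviorTree/BTTaskNode.h"
    #include "BehaviorTree/BlackboardComponent.h"
    #include "AIController.h"
    #include "Kismet/GameplayStatics.h"
    #include "BTTask_DetectEnemy.generated.h"

    // Hypothetical sketch of a detect-enemy Behavior Tree task.
    UCLASS()
    class UBTTask_DetectEnemy : public UBTTaskNode
    {
        GENERATED_BODY()

    public:
        // Distance at which the NPC notices the player (illustrative default).
        UPROPERTY(EditAnywhere)
        float DetectRadius = 1500.f;

        // Blackboard key the chase/shoot branches read their target from.
        UPROPERTY(EditAnywhere)
        FBlackboardKeySelector TargetKey;

        virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp,
                                                uint8* NodeMemory) override
        {
            AAIController* AI = OwnerComp.GetAIOwner();
            APawn* NPC = AI ? AI->GetPawn() : nullptr;
            if (!NPC)
            {
                return EBTNodeResult::Failed;
            }

            APawn* Player = UGameplayStatics::GetPlayerPawn(NPC, 0);
            if (Player && FVector::Dist(NPC->GetActorLocation(),
                                        Player->GetActorLocation()) <= DetectRadius)
            {
                // Remember the target so the rest of the tree can chase and shoot.
                OwnerComp.GetBlackboardComponent()->SetValueAsObject(
                    TargetKey.SelectedKeyName, Player);
                return EBTNodeResult::Succeeded;
            }
            return EBTNodeResult::Failed;
        }
    };

Wire a task like this into the tree ahead of the chase and shoot branches, and the NPC only starts hunting once the player wanders inside DetectRadius.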

As with my own code, I still have a couple of bugs to work out, one of which is that instead of shooting at me when I'm detected, the A.I. just runs right up to me and clings to me to the point where I can barely move. I guess my character is just too devilishly handsome. It would be sweet if it weren't so weird.

Next on the agenda is to attempt to take what I have learned and apply it to my own characters in-game.

Written by Ariana Alexander for TimefireVR

Low-Poly High-Poly Game Mesh

 Bits and Pieces  Comments Off on Low-Poly High-Poly Game Mesh
Sep 10, 2014
 

Creating the illusion of a high-poly mesh using Substance Designer from Allegorithmic for TimefireVR in Scottsdale, Arizona

Learned another aspect of building low-poly game meshes today. Starting in Blender, I took a simple cube, scaled it on the Y-axis, and deleted the middle to create a basic frame shape. The next step was grabbing an intricate black-and-white design that would work for the frame detail, followed by applying subdivision and displacement modifiers to build my high-poly mesh. Export the two models (the low-poly basic frame and the high-poly displaced model) as FBX, import them into Allegorithmic's Substance Designer, and in a second I have the elements needed to build what looks like a detailed picture frame. Now the trick is setting it up in a virtual museum and keeping the player just far enough away that they don't easily see the cheat. It turns out that Virtual Reality is unforgiving when it comes to these old video game tricks, because players can lean in and see details on the surfaces of things; but like the Mona Lisa behind glass and a roped-off area in the Louvre, maybe we can get away with this polygon-saving method.

Written by Brinn Aaron for TimefireVR

Samsung Gear VR Headset

 News  Comments Off on Samsung Gear VR Headset
Sep 3, 2014
 

Gear VR from Samsung featuring a 2560x1440 screen - in cooperation with Oculus

Live on stage this morning, Samsung, with the help of Oculus CTO John Carmack, is showing the Samsung Gear VR headset. With a 2560×1440 resolution and what appears to be a wheel-based IPD (Interpupillary Distance) adjustment, the headset is rumored to be aimed at the entertainment market more than at immersive 3D gaming. Another great day in VR history; uncork the champagne.

Click HERE to read more information directly from Oculus.