This fall, Life of Lon will be released on Steam as an Early Access game. In this week's post, we look back on the past two years of concept art that brought us to this pivotal moment.
Flora and Fauna
Lon's Multi-tool (still unnamed)
We're excited to announce that we have coordinated a number of demo events in the coming months to show gamers what to expect when Life of Lon hits Steam this fall! If you live in the Midwest anywhere near Chicago, here's the info you'll need:
Event #1: The Chivr Debut Meetup
The Creative Group will be hosting The Chicago VR Meetup on August 24th. You'll get to try a number of cutting-edge VR demos on all of the newest hardware- the HTC Vive, the consumer Oculus Rift, and the Gear VR! Block Interval will be demoing Life of Lon on the HTC Vive and Oculus Rift CV1 with Oculus Touch. Free food and drinks will be provided!
Where: The Creative Group, 205 N Michigan Ave Ste 3301, Chicago, IL 60601
When: Wednesday, August 17th @ 6pm-9pm
Who: Open to the public
Meetup Facebook Event
Event #2: Tech in Motion Timmy Awards Ceremony
Block Interval will be demoing Life of Lon after the Timmy Awards.
Chicago’s 2nd Annual Timmy Awards is an award ceremony hosted by Tech in Motion, a national events organization for tech enthusiasts created by Jobspring Partners & Workbridge Associates. The Timmy Awards recognize companies and individuals that strive to create the best places for technology professionals to work in the Greater Chicago Area. The Timmys are presented in three categories: Best Technology Work Culture, Best Technology Manager and Best Tech Startup.
Where: 1871 - 222 W. Merchandise Mart Plaza, 12th floor
When: Thursday, Aug. 25th @ 6pm-8pm
Register via Eventbrite and Meetup
Event #3: Valorcon
Block Interval will be attending Valorcon- a Chicago gaming expo. We will have 2 HTC Vive demo stations showing Life of Lon in its near-release form! Try it out before it releases to Steam in the fall! Daniel Allen, Creative Director of Block Interval, will also be speaking on a panel about VR.
Where: Macy's - 111 N. State St. Chicago
When: Sept. 30th - Oct. 2nd
Who: Valorcon attendees
Cost: See ticket prices here
What's Life of Lon? Watch the trailer below:
Welcome to the first episode of our developer vlog! Today, Daniel Allen and Joshua Corvinus discuss a variety of issues related to virtual reality including God rays, getting high performance in your VR games, and how to get free VR hardware from the major HMD manufacturers.
See you after the jump!
Subscribe for more!
We're going to keep making videos like this, so if you liked it, subscribe to our YouTube channel!
Thanks for watching!
Attention Vive Owners
We're excited to announce that Life of Lon is now in development on the HTC Vive! What is Life of Lon, you ask? In short- it's an episodic seated/standing science-fantasy story experience that blends film, game and ride into one seamless package. It follows in the legacy of other narrative-driven story games like Another World, Half-Life and Journey.
You will travel through vast exotic environments as the titular character Lon- solving environment puzzles using your wits, your tools, and help from your new friend Yep.
About This Trailer
This is the first 5 minutes of Life of Lon- and accounts for approximately 2% of the full game we are building. We recommend watching the trailer with some kind of VR device (such as Gear VR, Oculus Rift, HTC Vive or Google Cardboard) to get the best effect.
Stay tuned for more details!
The Hard Work is Paying Off
If the Life of Lon project could talk, it would say that it has seen some things. It would tell you how it started in August of 2014 as an artistic sidescroller and entered development as such. It would sound heartbroken that so much development had to ultimately be scrapped. But its rebirth as a VR game would shout with excitement that so much has been learned and created.
The Block Interval team has embarked upon a new paradigm in VR storytelling. We've innovated on seated locomotion to drastically reduce or, as we've learned in our testing at SVVR, altogether eliminate motion sickness in a large percentage of people. We've also created a character that is incredibly compelling in VR. We're ready to show the world how a story can be told in this new medium.
Where did this chapter begin? Bristol, UK at the VR World Congress.
VR World Congress - Bristol UK
For all of us except Darryl, the trip started with an 8-hour flight to London. We picked up a rental car and had to learn how to drive on the 'wrong' side of the road. That brought some incredibly harrowing moments, but we made it to Bristol in one piece.
Immediately we started facing issues getting our booth set up, but we're a team accustomed to dealing with problems. We banded together and solved them one by one until we had our setup in perfect working order- clean, organized, ready to go. The next day, the crowds came.
Have a look at our video overview of the event:
BBC Breakfast paid a visit to some of the booths including ours.
Throughout the day, we were swarmed with people wanting to try Life of Lon. Many of them had heard about it or seen the website. Lines formed, and we set out to talk to everyone, run demos and keep everyone happy despite the long wait. Halfway through the day, we ran out of art books to give out.
Crowd reaction to the demo was very positive and we made a number of new connections in the VR industry. We also learned a lot about what goes into putting on a good demo. We put that new information to use a few weeks later when a larger chunk of the team went to Silicon Valley to attend SVVR.
Silicon Valley VR Expo - San Jose, California
We arrived at SVVR ready to rock.
Our lead developer Dave Nelson (holding the Rift) and interaction developer Josh Corvinus (right) setting up the table for demos. There was a strong sense of optimism about what was to come.
Suzie Sabin (of our biz dev team) joined us on day 3 to build connections with potential investors.
Our demo station was never empty- even during setup times.
Josh manned the demo stations for much of the expo and always greeted our players with a smile!
On day 2, a group of people gathered together for the annual quick pitches. Creative director and co-founder Daniel Allen gave a quick pitch for Life of Lon that you can watch below.
We spoke with a number of journalists throughout the expo.
The lines got longer and longer as the expo went on and more of the attendees began to hear about our demo. At the end of day 3, we had to nearly stop a demo in progress because the expo floor was to be cleared- just like what happened at VRWC. People couldn't get enough of Life of Lon.
So what's next?
We're all back to our respective homes now, digging through piles of business cards and working on fostering the relationships we've formed. We're very happy with how both VRWC and SVVR went. We'd love to keep you informed when we reach the next milestone- so perhaps you'd like to follow us on Twitter, or like us on Facebook? If you enjoyed this post and you want to see more content like it, sound off in the comments below!
Thanks for reading!
An Overview of VR Locomotion Types
One of the biggest challenges facing new VR developers is locomotion. How best to let your player move around without feeling sick? Some of the approaches we've seen emerging include:
ROOM-SCALE LOCOMOTION
Some experiences don't have any artificial locomotion built in- the room size is the room size, and you can move around freely within those boundaries. Games like Hover Junkers have taken this approach and run wild with it.
COCKPIT LOCOMOTION
Placing the player in a cockpit is a tried-and-true method for seated experiences. While we've mostly seen cockpits used in space and driving games, we're using a cockpit approach for our story adventure game Life of Lon.
Room-scale and cockpit setups aren't the only options, of course- plenty of other approaches are being worked on, and a few of them are emerging as some of the most popular:
PORTAL LOCOMOTION
Budget Cuts has a unique approach to locomotion that will surely find its way into other games. By throwing portals, a player can navigate rooms and hallways without moving very far in the play-space.
THIRD PERSON LOCOMOTION
Third person games are starting to find their way to VR devices- and this brings with it a number of approaches for third person. First off, we have the fixed camera approach used by Gunfire Games with Chronos:
Next up we have the following third person approach used by Oculus for Lucky's Tale:
Which approach is better for VR? Each seems to have its strengths and weaknesses. With a fixed camera, the player can be put into awkward situations during a camera shift; with a moving camera, there may be nausea issues depending on how well the camera is implemented. Both are worth weighing.
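One common way to temper nausea with a moving third-person camera is to smooth the camera's motion rather than lock it rigidly to the character. Here's a minimal sketch of that idea- the function name and smoothing constant are our own assumptions for illustration, not from Chronos or Lucky's Tale:

```python
def follow_camera(cam_pos, target_pos, smoothing=0.1):
    """Move the camera a fraction of the way toward the target each frame.

    Exponential smoothing like this keeps camera motion gentle- large,
    sudden camera movements are a common trigger for VR nausea.
    cam_pos and target_pos are (x, y, z) tuples.
    """
    return tuple(c + smoothing * (t - c) for c, t in zip(cam_pos, target_pos))

# Each frame, the camera closes 10% of the remaining gap to the player:
cam = (0.0, 0.0, 0.0)
player = (10.0, 0.0, 0.0)
cam = follow_camera(cam, player)
```

A fixed camera skips this entirely (it only cuts when the player leaves its zone), which trades smooth pursuit for the occasional jarring shift the paragraph above describes.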
BLINK LOCOMOTION
There are a number of approaches to blink locomotion- but the gist is that there is a blinking effect and the player is moved or turned without any animation. There might be a fade-out-and-in effect, or it might be instantaneous. The player might be able to create a portal, or choose where her avatar will move to and which direction to face.
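The core of blink locomotion can be sketched as discrete, animation-free steps. This is a toy illustration under our own assumptions (class name, snap angle, and step distance are all made up, not from any shipping game or engine API):

```python
import math

SNAP_ANGLE = 30.0   # degrees per blink turn (assumed comfort default)
BLINK_DIST = 2.0    # meters per blink move (assumed)

class BlinkRig:
    """Player rig that turns and moves in instant steps- no smooth motion,
    which is what avoids the vection that causes nausea."""

    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y = x, y
        self.heading = heading  # degrees; 0 = facing +y

    def blink_turn(self, direction):
        # direction: +1 = right, -1 = left; heading jumps with no animation
        self.heading = (self.heading + direction * SNAP_ANGLE) % 360.0

    def blink_move(self, direction):
        # direction: +1 = forward, -1 = back; position jumps with no animation
        rad = math.radians(self.heading)
        self.x += direction * BLINK_DIST * math.sin(rad)
        self.y += direction * BLINK_DIST * math.cos(rad)
```

A fade-out/fade-in would simply wrap each call in a screen fade; the important part is that the player never sees intermediate frames of motion.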
HYBRID LOCOMOTION
In a hybrid scenario, multiple types of locomotion are combined to form some new method. For example, an experience may have a seated cockpit but also a standing room-scale component. In Life of Lon, we have a largely cockpit-type experience, but we've incorporated blink locomotion into the control scheme. In the future, we'll surely discover new and interesting ways to get around in VR, either through new peripherals or through new ways of thinking.
Locomotion in Life of Lon
Life of Lon is a Journey-esque story game that will have you traveling across long distances taking in beautiful vistas and solving challenging environment puzzles. We originally planned it as a sidescroller that took place largely under water. Initially, the player (playing as a young cosmonaut named Lon) would swim around in his space suit. When we switched to VR, we had to rethink our most fundamental interactions. How were we going to let the player move around under water, and what if they started to feel sick?
Our first tests into this kind of locomotion made us feel quite ill, so we knew we needed to do our homework.
The first thing we learned was that the player needs some kind of stabilization mechanism to give them an equivalent to the horizon. Enter the Stabilization Semi-Sphere.
The Stabilization Semi-Sphere (SSS)
When we were designing Lon's ship, we tested a design where the player would be facing outward looking through a large portal. This resulted in a great deal of simulator sickness. What we discovered through much iteration is that if the ship was turned on its side, a natural horizon line was created around the player- giving her the sense of stability she needed. Once we discovered this, test after test returned an incredible reduction or altogether elimination of nausea in our play testers.
We took this learning and applied it to our underwater areas, and found similar results. We created a glowing semi-sphere that encloses the player while moving around. Nausea was all but eliminated in most players. For the ones who still felt nauseous, we spent a lot of time refining our comfort controls.
Comfort Controls, Not Just Comfort Options
A lot of devs that are building for VR are putting comfort controls into their options. We think this is a fantastic idea, but we wanted to take it a step further. What if you could enter comfort control mode without going into a menu? Why not integrate it into the primary controls somehow?
What we ended up with is a hybrid approach that utilizes the entire controller. Players who want trigger-button locomotion can press R2 or L2 to turn, or R2+L2 together to go straight ahead. For players who need to lean into their turns to give their inner ear a sensation of movement, we've integrated tilt-to-turn controls: if the player holds A or R2+L2 and leans, the craft turns in that direction- more sharply the further they lean. Finally, we use the D-pad for blink locomotion: D-left and D-right turn the craft in fixed increments without animation, and D-up and D-down move the craft forward and back without animation. We may yet tweak these controls for the final game, but this is where things stand at the time of writing.
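As a rough illustration, a scheme like the one described above might resolve one frame of input like this. All names and the turn-rate constant are our own assumptions for the sketch- the actual Life of Lon implementation differs:

```python
MAX_TILT_TURN = 45.0  # degrees/sec at full lean (assumed tuning value)

def resolve_input(r2, l2, lean, dpad):
    """Return (forward, turn_rate, blink) for one frame of input.

    r2, l2 -- trigger states (bool)
    lean   -- head roll in [-1, 1]; negative = leaning left
    dpad   -- None, or one of 'left', 'right', 'up', 'down'
    """
    # D-pad always maps to blink locomotion: a discrete turn or move, no animation.
    if dpad is not None:
        return (0.0, 0.0, dpad)

    if r2 and l2:
        # Both triggers: go straight ahead; leaning banks the craft into a turn,
        # more sharply the further the player leans.
        return (1.0, lean * MAX_TILT_TURN, None)
    if r2:
        return (0.0, MAX_TILT_TURN, None)   # turn right
    if l2:
        return (0.0, -MAX_TILT_TURN, None)  # turn left
    return (0.0, 0.0, None)                 # idle
```

The point of folding everything into one resolver is that comfort controls are always live- the player never has to open a menu to switch modes mid-sequence.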
Bringing it all together
Have a look at our first trailer to get a sense for how we've taken these principles and applied them to Life of Lon:
Have Your Cake and Eat it too
Game devs need to ensure that power is given to the players when it comes to comfort. Being able to change your control configuration on the fly gives players the absolute best chance to play your game comfortably because they can switch back to comfort controls during sequences they deem too intense. We recommend giving this hybrid approach a try.
Did you give this approach a try in your game? Leave us a comment and let us know how it turned out!
Have you ever read a movie or game review where the reviewer said something to the effect of, "The environment was one of the characters"? That's what we are attempting with our VR story game Life of Lon. The world you will travel through (or perhaps more appropriately, beneath) is called Paracosma- named after the first fictional VR world, written about in Stanley G. Weinbaum's 1935 short story Pygmalion's Spectacles.
I've asked our lead artist Courtland Winslow to give us a glimpse into each of these creatures, what the thinking was, and how they could add to the world of Paracosma.
Life Found on Paracosma
"This critter is a large pod that generated a bunch of smaller pods. The smaller pods would break off if brushed against, and eventually melt and pop, releasing a bioluminescent gel. The idea was to provide some interesting light sources in dark areas."
"This bulbous bit of flora is sponge-like and hollow, with fins that flap to circulate liquid and release glowing spores."
"Algae/grass/moss stuff that would be great for areas with flowing water because it would both wave around (think fields of wheat in the wind) and release glowing spores."
"A hollow, glowing sponge."
"Inspired by those deep sea fish that unhinge their jaw and swallow prey much bigger than themselves."
"Like a parrot mixed with a really hands-y octopus."
"This one plays off of the energy-producing crystals we were planning early on. Sort of a hermit clam that lives inside one of the naturally-square bits that would break off."
"A mushroom, a bioluminescent coral, and a giant coconut crab combined, that stomps around on its knuckles."
"An amoeba that would use broken pieces of the crystals for energy. Its eye would be a chunk of crystal that it wore into a sphere/lens. The more they absorbed, the stronger they'd be."
"Inspired by creatures from an episode of Ren & Stimpy where they're on an alien planet. Combined with a really creepy octopus."
"Starfish/snail that would release tons of smaller, glowing starfish/snails."
"My favorite. A three-legged parasitic crab that would develop these large glowing sacs that would act as bait. They'd lay on their backs and release the bait on a tether to attract large predators (something like a whale). Once something came to eat the glowing bit, they'd dig their legs into it and hang on."
"The Party Pineapple. Ok, so this one's my favorite too."
"Like a spider meets a flashlight meets an octopus."
"Just a small, glowing plant that would be a convenient light source."
"Tube-worm looking things that provide light until you disturb them and they retreat."
"Bioluminescent sand dollars."
"Pods that bloom and provide some light."
"Alien snail with angler-fish-like glowing bits."
Visit Paracosma Soon!
We're hard at work bringing Courtland's artistic visions to virtual reality with Life of Lon. Expect to see our first trailer soon, and if you're at VRWC or SVVR this month, come by and play our demo!
Since 2014, Courtland Winslow has been providing fantastic art leadership on Life of Lon.
This week, Dave Nelson and Daniel Allen sat down to discuss the process and team dynamics of creating Block Interval's flagship VR story game Life of Lon.
Dave is not only in charge of music and sound, he's also graduated into the lead developer role. His contribution to the project and our research into VR has been substantial.
Dan had the original idea for the game, and has been the creative director and lead writer from the outset.
Have a look at the video and see you after the jump!
Block Interval is headed to VRWC in April! We'll be showing off our very first demo of the game. We're very excited and proud of the work we've done, and we can't wait to share it with the world. Here's to reaching new heights in VR storytelling!
A couple of weeks ago, we shared our process in creating a cockpit in VR. Now we're ready to reveal our process behind creating our lead character Yep!
Some important things we knew early on that Yep needed:
- Yep needed to be the player's primary way of interacting with the surrounding world.
- Yep should have abilities that engage the player and directly influence progression.
- Yep needed to be very cute and have some cat-like as well as dog-like qualities.
- Yep should be both aquatic and able to move around on land.
These decisions came from one place: The writing. If you want to create an engaging character in VR, write a good character.
Growing up, there were three games I can think of that very strongly influenced the creation of Yep. The first is a game called "A Boy and His Blob." This puzzle/journey game was released for the Nintendo Entertainment System in 1989.
What was cool about A Boy and His Blob was that your companion was essential to the journey. Without him, you couldn't do anything. So I thought we should create a companion like this- one who will surprise players in some ways later in the game.
Another game that inspired me was Secret of Evermore, released in 1995 for the Super Nintendo. Evermore had a dog companion that fought alongside you and changed forms throughout the game.
Just like in A Boy and His Blob, your dog was critical to finishing the game.
The third influential game was Another World (or as I knew it from the SNES release, "Out of This World.") Another World had an alien companion that joined you on your journey and helped you out of trouble.
Dogmeat is your faithful canine companion in the Fallout series.
There are of course other games that have companions, but these were some of the most influential ones when it comes to the idea behind Yep.
Let's get into the concept art now and step through how Yep came to life.
These early drawings were based on some descriptions I gave Claudio Rodval, our character artist at the time. I had an idea of what Yep would do in the game, but not exactly what he would look like. Based on these sketches, I was able to decide which features I liked and which I disliked.
I thought the big eyes were a very cute looking feature so I asked for another round that focused more on that look. Some of these guys are pretty funny to look at.
This round of drawings came from Claudio while he was staying in a hotel. You can see that we were starting to nail down the overall shape Yep should take and how his face proportions would look.
I thought this one was absolutely adorable- but still just a tad too mammalian. I wanted Yep to straddle the line between mammal and dragon-like.
These were the first drawings where I felt like the character had emerged. Everything we did after these drawings is derivative of them. We had solved the core features of Yep's face.
Whiskers? Nope, that looks kind of weird.
No more whiskers and trying to get the 3/4 view of the head looking right.
The first largely-complete drawing of Yep's whole body. The tail wasn't figured out yet, but I was very happy with how this turned out.
The tail was such an ongoing struggle that we iterated on it all the way to the final model. I knew it should be something iconic and memorable.
Since Life of Lon was originally a 2D sidescroller, players would have gotten to play as Lon. The interplay between Yep and Lon was going to be an important part of the gameplay. Now that Life of Lon is a VR game, the interaction won't quite happen in the same way, but we think it will be even more engaging and immersive. Here are some drawings of Lon and Yep getting to know each other.
We had planned a section called The Bubble Stream, where Yep would float with the player through a bubble-filled underwater river.
A really beautiful render that Matt Vince did. At this point, we still hadn't solved the gem in Yep's tail- so in these early drawings there was no gem.
Claudio created this character sheet that turned out to be very useful while we were modeling Yep.
This is a throw-together piece I did to try to figure out a solution for the tail gem. Ultimately, I decided it was important enough that we needed to go through rounds of concept art to figure it out.
First round of formal concepts around the tail gem. Initially, I was really fond of these designs, but over time, I realized that I didn't like the way they looked. I was determined to go back to the drawing board as many times as it took to get it right.
While we continued to noodle on the tail, I hired a contract modeler to start work on the Yep model. He didn't follow the specs too well, so I had to provide him with detailed notes to get the proportions right.
A look into the kind of feedback I needed to give this modeler to get the proportions correct.
Second round of revisions.
We ended up with this as a low-poly version. I wasn't super excited about where we ended up so I found a talented modeler named Brian Olmstead to pick up the baton and take Yep to completion.
Brian ran through a few rounds of cleanup. Things like eyes that can blink and a mouth that can open hadn't been solved yet.
An early render of how Yep's body might be painted.
A grab from Unity to see how the light would fall on Yep.
The low-poly got dropped into a boilerplate scene where we tested how he might look underwater.
Finally, Yep gained the ability to blink and open his mouth!
Alisa Kober animated the low-poly model. We were really impressed by the effect!
A mockup I did showing my idea of where the tail could end up. This version was ultimately the one we've gone with.
An early animation of Yep popping his head up. We all agreed that it was adorable. He seemed a bit bouncy to me- so that was fixed in later tests.
The finished high-poly model. The eyes ended up a bit smaller than the original concept art due to blinking logistics. I'm really happy with where everything ended up.
Darryl Dempsey started doing tests on light passing through certain parts of Yep's body.
This was a fun test to look at.
Still some work needed to be done on the tail- but the shape was looking right.
A beautiful render of the high resolution unposed model under water.
Alisa tests out the facial rig.
An early idle animation by Alisa.
Our first animation test with the high poly model. It looked great!
After about a year and a half- Yep had finally come to life in a meaningful way. As you can see- you can get a really strong sense of his personality just from the subtle animation cues.
And here he is with the glow applied. All that remains is the glowing tail gem and he's done!
I hope you've enjoyed this behind the scenes look at how Block Interval created our beloved mascot for Life of Lon. Stay tuned for all sorts of new developments by following our twitter! We post new art and progress all the time!
Also if you'll be at the VR World Congress in Bristol UK this April, stop by our booth and try out our trailer for Life of Lon. In it- you will meet Yep for the very first time! Happy trails!
Daniel Allen is the co-founder of Block Interval and creative director of the in-development VR game Life of Lon. For more information about the project, check out lifeoflon.com or the VR announcement.
Ok, so if you're a regular reader of this blog, you know I don't tend to write anything personal, but I figured this was as good a place as any to share a story (we are a story studio, after all!).
So I go to the Chicago VR Meetup last night (it's 1am right now) to show off the progress we've made on Life of Lon. Standard fare, grab some pizza, demo your game, get some feedback, ok life's good. So I meet a few people, make some connections, and then I meet this guy named August who is part of the company called In Context Solutions that is hosting the whole shindig. He mentions the Vive and I'm like, 'Oh yeah I haven't actually tried a Vive yet.' Which is strange to say when I've been running a studio launching a game for the Rift. Shouldn't I know what else is out there? But alas, difficult to get a Vive.
Anyways- August, yes. August. He goes, 'Hey we got a Vive here- come on back and try it!' Ok so what could I say? I was done demoing and everyone had filtered out. So I go to this small office space with chairs up against the wall and it's dark. There's this guy in the corner at a standing desk presumably running the game software. There's a zombie game called The Brookhaven Experiment created by Phosphor games on at the moment, and a guy in the middle is flailing around. I guess I'm next.
I make some small talk and then it's my turn. The guy in the middle of the room looks at me expectantly. I move in slow motion to the middle of the room, strangely out of control of my own body. I hold the headset and look at it. This is what I've been seeing in videos for 8 months, and I'm holding it in my hands. I take a moment and look at the lenses and the front of it. The guy waits for me and I nervously explain that I haven't held a Vive before. He nonchalantly goes, "Not a lot different than the Rift really." I suddenly feel very behind the curve.
I pull the headset down and two floating controllers drift into view. I grab for them and they become a flashlight and a gun. I feel very strange- halfway between two worlds. I know immediately that I am having some sense of what they call 'Presence'- that feeling like you are somewhere else.
It's a zombie defense game. The zombies spawn and you have to a) be able to see them (hence the flashlight) and b) be able to accurately shoot and kill them. I did an alright job of it (I thought) until I ran out of ammo. That's when someone from the real world said something to the effect of, 'Wow, nobody runs out of ammo on level one.' So apparently I am trigger happy and I will die in the zombie apocalypse. To all of my friends and family: I'm sorry. So here I am with no ammo and piles of zombie corpses around me. What do I do? I ask, "What do I do?" And someone offers- "Hit them with your gun!"
So now I'm swinging around like an idiot trying to kill zombies with my gun. I also swing the flashlight around because hey, it's metal too, right?
I swing for a zombie with my flashlight and BAM, it connects with a satisfying THWACK. But uhh, that hit felt pretty real. What's going on?
The crowd of bystanders erupts to my right and I realize I've hit someone. I slowly pull the headset off, almost unable to bear the embarrassment. I had hit a bystander in the jaw- a clean hit at just the angle the zombie was standing. Somehow the Vive's glowing walls hadn't appeared at the distance I was standing (or someone had miscalibrated it), and I swung at a place that was actually off limits.
Thankfully, he was ok. But I felt horrible about it. I had basically cold-cocked someone.
So it goes when you try out development kits with in-flight games. All I can say is that I cannot wait to get my hands on The Brookhaven Experiment to try my luck at surviving the zombie apocalypse again.
I once heard another VR creator say, "Building content for VR is a bit like skydiving and constructing the parachute on the way down." There are so many fundamentals that have yet to be figured out- so you can rest assured that if you're building in VR, you're likely doing something new. There just hasn't been enough content created for 'copycats' to pop up. A Carl Sagan quote comes to mind: "If you wish to make an apple pie from scratch, you must first invent the universe."
Inventing the universe in this case can mean 'simple' things like locomotion and interface design. But these things have proven to be far from simple- requiring lots and lots of testing, research and iteration.
Our studio is called Block Interval- and we have been working on a story game called Life of Lon since August of 2014. Originally the game was planned as a minimalistic story sidescroller- inspired by the stylistic approaches of games like Another World, Journey and Limbo.
We spent 4 months building this version of Life of Lon. Unfortunately, we had to scrap all of our dev work to make the transition to virtual reality- a painful decision to make when you're bootstrapped. But we have since discovered that it was the right decision. Life of Lon is much more dynamic and immersive in this new medium, and because we had planned for a silent protagonist- the story still works (with the player as the stand in for Lon).
So about that Ship...
Before we get into the cockpit itself, let's talk about the ship. Lon's ship started out with just some rough sketches to get a feel for a general shape. I knew that we needed a portal and some legs but beyond that, I wasn't sure where it should go. Here are the first sketches of the ship.
The first ship in the above picture (ship A) really intrigued me because it reminded me somewhat of the tripod ships from the 1980s UK sci-fi show The Tripods. This seemed like a good direction to head in, so we iterated more on that idea.
More iterations trying different ideas related to how Lon would exit the ship and how it could take off and land.
At this point, the overall design of the ship began to take shape. I've left in some of my notes to Claudio and Matt so you can see the thought process behind getting the shape right.
A very nice render. We ultimately ended up pretty close to this, except with a bubble instead of the banded window.
More thought was given to how Lon might get in and out.
One idea we talked about was making Lon's ship feel 'lived in' with traces of his personality spread around. This concept of the cockpit was unique in that it just had a flat window at a desk.
Remember that this was still when we were planning Life of Lon as a sidescroller. We pursued this idea where the interior of the ship would rotate around Lon as he 'swam' through space horizontally. Would have really liked to see this come to fruition- but so it goes.
A very nice render of the above schematic.
At this point, we really began to understand what Lon's ship needed to look like. Everything from here on out is just variations on this design.
Another really nice render that is pretty close to what we have right now.
This is the first concept after the switch to VR. As you can see, we still maintained a lot of momentum from the previous work. Our goal was to figure out how to make this type of cockpit feel comfortable in VR. Unfortunately, we found ourselves getting sick with an outward facing cockpit no matter what we did.
Concept of what it would look like falling towards the water in VR.
Our early VR cockpit prototypes were enclosed spheres that would protrude from the ship. It was our comfort testing that forced us to change the plan a little later.
Another nice render showing what the ship might look like once it had crashed to the bottom of the ocean.
Our original crash site was right inside of an alien agricultural dome. You can see a basic model of the ship that we used for placeholder purposes. You can also see an early model of Yep on the left.
Our second model of the ship's hull. The final version we ended up with had much longer legs and is more of a charcoal color/texture.
We were still convinced we could do the cockpit in a glass sphere but we hadn't solved nausea yet.
At this point we started to work on locomotion. How would it work? We settled on the player sitting on top of a submersible that could detach from the cockpit (Meaning the player never leaves their seat throughout the game). We wanted to do something that was a mixture of biological and technological- so you will see that theme reappearing throughout further iterations.
We found ourselves having to create rough models to determine the visual footprint a design would have. This can be a very time consuming process.
An early UI approach- bio-mechanical tendrils that would project holograms important to the gameplay. We had to kill the idea because of how expensive it would be to build/animate the tendrils. Would have been really cool though!
We shifted gears to a more hard shelled anemone shape.
An early model based on this idea that we tested.
This was our breakthrough moment. We discovered that if we tilted the player on her side, motion sickness was all but eliminated. In our previous cockpit designs, the player faced forward toward the bubble; when we moved the bubble overhead, the nausea disappeared. We speculate that this is because the player has a consistent horizon line around her, making the cockpit feel more stable and anchored. This was a key learning that we hope to carry throughout the rest of the game.
An early model of the more minimal cockpit. We were enamored by how it looked kind of like an eye (lens, iris, etc).
The new test cockpit sitting in a further along version of the ship.
In VR it felt pretty good. Wide open view of space in full 360 (as well as up). Not being able to look down didn't seem to detract from the experience.
Early models of the gems that power the ship.
With our R&D completed, we began work on the final renders of the cockpit. This would be the design that we would shoot for in the production build of the game. Much more attention being paid to the details- but still attempting to keep it on the minimal side.
Throughout most of the development of the cockpit in VR, we knew we wanted to build holographic UI- we just weren't sure how to execute since there is very little precedent. My hope was that the holograms would look beautiful and simple, not slapped together or cluttered.
A concept that shows how the pod Lon sits in could detach and move into sockets inside the ship. We're not sure if we'll end up showing this part of the ship during the game- but if we have time, we'd like to!
A concept we are testing based on what we learned with eliminating nausea in the cockpit- putting a horizon around the player to make them feel anchored as they move.
An unpainted model of the final cockpit design.
A test involving spraying water against the glass. The effect is really impressive.
All of our hard work is paying off. The cockpit is becoming more of a dramatic stage where things can play out in a memorable and unique way.
To be Continued...
Want to try out our demo and sit in this cockpit? You can! We're going to be exhibiting a demo of Life of Lon at the VR World Congress in Bristol UK this coming April! If you plan on attending, make sure to stop by and say hi. We'd also love it if you could give our Twitter a follow or like us on Facebook! Stay on top of all the new behind the scenes info as we bring Life of Lon to a VR headset near you.
Full body VR is coming. Take one look at Stress Level Zero's forthcoming Vive game Hover Junkers, and you'll see lots of crouching, ducking, dodging, turning and quick reflexes. This is a very exciting time for most of the gaming world- but one group of people could get left in the dust if we're not careful: the disabled.
People with disabilities have gotten pretty clever at gaming in a world designed for the fully-abled. Youtuber MacsHG, aka "The Handless Gamer," uses his feet to play games competitively. Or have a look at Broly, an extremely talented competitive fighting gamer who uses his mouth to play. Recently, I came across a blind gamer named stirlock who had some things to say about how he plays a smartphone game called Final Fantasy Record Keeper without eyesight.
What will happen when we make the switch from gamepad and keyboard based controls to full body tracking and motion controllers? And what can be done for those who are bedridden, wheelchair bound, or disabled in some other way?
Accessibility and You
Accessibility is a huge deal on the web. There are a lot of things to consider when building internet content. For example, text size must be scalable for those with poor vision. If we build web content without taking disability into account, we close people off from our message and experience. We may lose out on their sale, or their ability to enjoy everything the web has to offer.
So too, accessibility in VR will matter. Here are some ideas for making your experience more accessible:
Examples from the history of modern signage are great models to follow when creating text in VR: readable from a distance, bold, high contrast, and free of unusual typefaces. Iconography is often used alongside text to communicate concepts. Many users will be colorblind, so adding a colorblind simulation mode to your camera (so you can test your palettes) is one example of how you can make your experience more accessible.
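One cheap way to test your palette is to run screenshots through a color-vision-deficiency simulation. The sketch below (Python/NumPy) applies a commonly circulated protanopia (red-blind) approximation matrix to RGB values; the matrix values and function name are illustrative, not authoritative color science:

```python
import numpy as np

# Approximate protanopia simulation matrix, applied directly in RGB space.
# This is a widely circulated approximation; accurate pipelines convert to
# linear RGB / LMS space first.
PROTANOPIA = np.array([
    [0.567, 0.433, 0.000],
    [0.558, 0.442, 0.000],
    [0.000, 0.242, 0.758],
])

def simulate_protanopia(rgb):
    """Map an (..., 3) array of RGB values in [0, 1] through the matrix."""
    rgb = np.asarray(rgb, dtype=float)
    return np.clip(rgb @ PROTANOPIA.T, 0.0, 1.0)
```

Running your UI colors through a filter like this quickly reveals which pairs (typically reds vs. greens) collapse into near-identical hues for colorblind players, so you can add a secondary cue like an icon or pattern.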
Can you degrade to a seated experience?
There are a lot of standing experiences coming- and this is exciting news for the future of gaming- but not everyone can stand or move around. Many kids and adults in hospital beds, tethered to machinery, won't be able to get up and swing their arms. Can you figure out a trimmed-down version of your experience that works sitting down?
Positional audio will be critical
VR can be just as fun for the visually impaired. With positional audio, the blind will be able to hear all kinds of new and immersive environments. It's going to be more important than ever to put lots of effort into sound design.
Focus a great deal on comfort
Making experiences that don't cause nausea is a given. Have a look at our overview of comfort for ways to give your users a more comfortable journey.
We need to come together on standards
Within 6 months of the writing of this article, consumer VR will be here. The world is going to change as content consumption changes. It's time for the UX field to gather together and start working on standards of usability and accessibility.
The emergence of the PC Master Race has shown that high-quality home gaming systems cost less, have cheaper game libraries that follow you from PC to PC, and have enough modularity to keep gamers from having to buy a whole new machine every year. To put it bluntly, PCs are winning.
Consoles Used to be Top Dog
I've been PC gaming since the 80's. I remember playing Battle Chess in 1989 on my dad's 486. And I will never forget the emergence of Commander Keen, Wolfenstein 3D and Doom- all PC games that redefined their respective genres.
But back then- consoles were still king. Every kid on the block wasn't PC gaming- they were console gaming. I remember going over to my next door neighbor's house to watch my friend's mom play Super Mario Bros with a lit cigarette clenched between her teeth. I remember being astonished by the gameplay and graphics which were far superior to the Apple II games I played in 1st grade or the Atari 2600 games I had at home like ET and Breakout.
Consoles had survived the market crash of 1983 and were here to stay. This ushered in a sort of golden age of gaming during the 90's- where triple A studios were forged and gaming hit the mainstream. Every home had a console it seemed, and every kid in class knew the kids who believed that Genesis does what Nintendon't. We saw the pinnacle of sprite gaming with the release of masterpieces like Street Fighter II, Donkey Kong Country, Chrono Trigger, Final Fantasy VI, and The Legend of Zelda: A Link to the Past.
But something changed around the time the Playstation and N64 were released- PC gaming started catching up. Suddenly we were seeing releases like Half-Life, Quake, and Warcraft, which offered experiences that simply couldn't be matched on consoles (despite often lackluster attempts at ports). In 2003, a very buggy, very ugly, but very promising digital game management platform called Steam launched, and consoles met their first major competition ever.
Enter: The Golden Age of PC Gaming
It could be argued that the golden age started in 1998 with the releases of Half-Life, Grim Fandango, Fallout 2, Unreal (which launched the Unreal Engine) and Starcraft. In 2000, we saw Deus Ex, The Sims, Diablo II, and Escape from Monkey Island. Then came 2002, which heralded Neverwinter Nights, Battlefield 1942, and Warcraft III. But holy cow, did things start to happen in a big way in 2004: Half-Life 2 and World of Warcraft- arguably two of the most important PC games released during the 2000's.
Since those days, Steam has become the must-have PC gaming distribution platform, and the release of Minecraft in 2009 has changed the entire landscape of both PC and console gaming forever. Now indie developers have the powerful tools they need to make high quality games.
Today in 2015/2016, the rise of indie gaming has spread the marketplace out and given players more options specific to their tastes. Titles like Journey and Limbo have taken us into deeper psychological territory than we've ever gone. And now we are on the cusp of a new era- an era of immersive gaming like we've never seen: The VR gaming era. With VR devices due out in 2016 for both PC and console, what will this mean for the future of console gaming?
Could Playstation VR be a Game Changer?
There's no doubt about it, Sony has gone all in with VR. They recently announced over 200 developers working on content for the PSVR. Have a look at a recent showreel that demonstrates the experiences they are going to bring to players:
I don't know about you but that video got me amped when I first saw it.
The Ubiquitous PS4
Despite the dip in sales, there are PS4 units in a lot of living rooms.
By the time the Playstation VR hits store shelves, the installed base could be as much as 10 million units higher. It's easy to imagine a craze for PSVR with so many gamers already equipped to use it. Bring it home, plug it in, and you're in VR. This ease of integration into the home will make it stand out in an interesting way from the Oculus Rift or HTC Vive.
Performance May Become Irrelevant
One of the biggest problems with modern-day console gaming is that it can't come close to PCs when it comes to frame rate and resolution- but that might not be the case forever. There is a limit to the density of pixels the human eye can resolve, and streaming game services that process on the server side may make hardware irrelevant. Playstation already offers a service called Playstation Now that streams video games straight to your PS4 as if you had downloaded and installed them. The console industry may actually be saved by software- exclusive franchises that players want to keep paying for. Another selling point may end up being the ease of getting set up with VR without souping up your computer. Streaming games, hitting that maximum resolution, exclusives- these things might not only stop the bleeding for consoles, but restore them to health.
Keeping it Optimistic
As an enthusiastic console and PC gamer, I hope we don't ever see the death of consoles. More innovation is better- and as gamers and enthusiasts, we're all in this together. It's true that PC's are far out ahead right now- but we can't forget the lasting legacy that consoles have brought us. The Playstation VR will be an HMD to watch. I'll be getting mine on day 1. Here's to seeing consoles restored to their former glory.
Hear that sound? It's the robots. They're coming for you. Well not you specifically, but your job at least. Automation is on the upswing and it's only a matter of time before we're all thinking about how to make ourselves more employable. "Automation?" you say. "Surely not me! I am a (designer/developer/architect/content creator)! I can never be replaced by robots!" Au contraire my friend- the robots will one day come for your job too. Don't believe me? Have a look at this 15 minute documentary called, 'Humans Need Not Apply.' See you after the jump.
The Robot Revolution and You
Ok so that video might have been a bit depressing. I can't make any promises that, 'everything will be ok' and 'it'll all work out in the end' because we've never been here before in human history.
So how does this affect you specifically? Well, you could have a look at this scholarly paper by the Oxford Martin School which lists (at the end) 702 careers and how likely they are to be automated. The same group estimated that 47% of all US jobs are at high risk of automation, with another 19% at medium risk.
Within the next few years, we're going to see the collapse of the driving industry as self driving cars take over the roads. That friendly Uber driver you just thanked? He's going to be looking for work real soon. Your truck driving uncle? Same. Our labor force is going to be facing a drastic turn very soon- and many highly skilled workers will join them. If only there was a growing tech industry with zero automation that lacks manpower right now. Oh wait.
VR/AR is the Answer...For Now
The VR/AR industry is projected to be big. Not just big- MONSTER big. Fortune estimates that VR and AR will be a 150 billion dollar industry by 2020- only 4 years away as of this writing! That's stupid fast growth.
The thing about automation in VR is that the industry has to understand it before it can automate anything. We can't automate virtual worlds because we haven't even begun figuring out how to build them. We can't automate virtual games because we've only scratched the surface of what's possible. Virtual productivity applications? Virtual search engines? Virtual web browsers? These things only exist in their primitive infancy and won't reach full maturity for at least 5 years. And the hardware- while we are already seeing great consumer grade hardware coming out (such as the recent Gear VR launch), we are probably at least 5 years away from ultra high resolution, so-good-it-might-as-well-be-real quality of experiences.
The point I'm trying to make is- if you invest your brain into learning VR right now, you're going to stay employed.
Think of the folks who learned how to build smartphone games and apps when the iPhone came out- how much cash they now roll in. That could be you. That could be me. We're right here on the cusp of the Next Big Thing.
Are you ready?
Motion sickness, eye strain, drowsiness: these are just a small handful of things that VR creators need to be aware of when designing immersive worlds. Oculus shares a best practices guide that provides tips for how to reduce these physical discomforts, such as minimizing latency, adjusting field of view, and trying not to send your users through an endless series of upside down roller coasters.
It’s hard to find any documentation highlighting the possible emotional effects of VR, though. That might seem less important for now, but as VR creators, it is our responsibility to evaluate and respond to the physical and mental impact that our immersive worlds could have. Because VR is still in its infancy, it can be hard to understand or measure that impact.
But that doesn’t mean that we can’t start researching now!
THE POWER OF PERCEPTION IN VR
Imagine you are a burn victim. You’re lying on your hospital bed, anxiously awaiting the nurse to re-bandage you.
Before the pain sets in, you open your eyes to see that you are suddenly floating on a boat down an icy river, completely surrounded by a land covered in snow. You pass by a cluster of penguins hanging out on icy cliffs, and just below them you see a giant woolly mammoth taking a stroll. Suddenly a gang of snowmen start chucking snowballs right at you.
Within this time, you didn’t seem to notice that the nurse was changing your bandages; a procedure that usually triggers excruciating pain. That’s because you were present in a world other than your own.
Developed by an R&D lab in the late 1990's, SnowWorld is the first VR environment designed for pain control. It was made not only to distract burn victims during treatment, but to also transform their perceptions through VR. SnowWorld is still being used today in hospitals across the world, and in many cases has proven to be just as effective at pain control as morphine.
This is just one example that proves how powerful VR really is. Television, tablets, computers, and video games simply cannot provoke the level of presence that VR does. As VR creators, we can’t rely on past research or assumptions that have been applied to other digital platforms. So how can we begin to form best practices for VR design?
Let’s start off by looking into how virtual interfaces interact with the brain.
WHAT DOES VR DO TO THE BRAIN?
We are all well aware that VR is great at making us feel immersed in another world. As VR creators, it is valuable to also understand how VR applications interact with our brains.
By referring to memories and cognitive patterns that we have developed throughout our lives, our brain is constantly interpreting the photons that enter our eyes. So when we throw on a head-mounted display, we process visual cues within the virtual world using the same neural circuits that we would in the real world.
High-level cognition areas in the front of the brain know that the virtual world isn’t real, but the back of our brain tricks us into thinking that it is. In fact, its trickery is so effective that VR can make people tumble and sweat even when walking across carpet because their mind is momentarily convinced that they are wobbling on a wooden plank hundreds of feet above a busy city street.
That means that both the real and virtual world can trigger nearly identical chemical reactions in the brain, even if the virtual one doesn’t look fully realistic. All of this contributes to the level of presence, which is ultimately responsible for evoking “the same reactions and emotions as a real experience.”
HOW CAN THIS BE APPLIED TO VR DEVELOPMENT?
After taking a look at how easily the human brain can adapt to the “virtual being”, there are some serious questions that we need to start asking sooner rather than later. Some of them include:
"Will interactions we learn in the virtual world carry over to the real world? How will this impact our natural behavior?"
"VR has been used to treat things like fear and PTSD, but is it capable of causing those?"
"Can virtual events/interactions provoke emotions we wouldn’t otherwise experience in the real world?"
"Can violence in VR provoke a user to grow more aggressive in the real world?" (As we all know, violence in video games is a heated topic, but violence in VR cannot be evaluated the same way due to a differentiating factor: presence. There are already plenty of shooter games in VR, so this will definitely be an important research area.)
These questions are going to start off broad. During testing and research, they should be broken down into items that are measurable both quantitatively and qualitatively.
Of course, alongside the psychological impact of VR, it is still extremely important to test the physical effects of VR, like motion sickness and eye strain. Just be sure to give a good amount of attention to both areas.
STEPS YOU CAN START TAKING NOW
Test Often and Reel in the 'VRgins'
If you’ve run testing sessions, you know how eye-opening even a single participant study can be. Make it a habit to test your virtual world on a regular basis, with all types of users--especially those that haven't tried it before (“VRgins”).
We don’t know how the public will react to VR, which is why it is crucial that you keep an open mind about who you recruit for testing. We may discover that Group A responds significantly differently to VR than Group B. It is our responsibility to factor that information into the next iteration of our designs.
Study up on Psychology and Neuroscience
Consider recruiting someone with a background in psychology or neuroscience. Research is such a critical step in the VR development process, but sometimes it can get crazy when a smaller team has to wear a lot of hats. It’s super helpful if someone with the expertise can devote their time to identifying how a VR experience impacts not just the body but also the brain.
Design for the Long Term
VR is awesome. My morning 'commute' the other day was flying through space in the Oculus DK2!
We get excited about this stuff, and that’s what drives this technology forward. But as we start to progress further in the VR world, it is important to ask ourselves “why” we are making certain design decisions and then questioning and researching what type of impact they may have:
Why are we implementing [feature name]?
What impact may this have on the user?
For example: Why are we choosing to splatter blood on the camera when the player shoots a zombie? Which types of emotions does this trigger?
This is the same thought process that goes on in UX design for any type of platform: websites, apps, games. But it will be especially important for VR, since the design decisions we make can affect users on a deeper, mental level.
WE REALLY DON’T KNOW ANYTHING ABOUT THE FUTURE OF VR
It’s hard to predict how the public will respond to VR once headsets appear on store shelves. I am surrounded by people whose eyes light up as they preach ideas for VR, but there’s also a lot of people that are more hesitant.
While writing this article, I had a discussion about the future of technology with a man in a coffee shop. He expressed the fear that if people are present in virtual worlds for too long, they could lose touch with their real-life senses. We can’t prove this yet, but it is a valid fear to have.
So should we be terrified about VR? Will it turn us into the soulless humans in WALL-E? I don’t think so.
Right now, the opportunities outweigh the risks. Plus, VR is key to welcoming the transition to even more futuristic technologies (cough cough AR) that we can interact with minus a chunky brick strapped to our face.
As VR creators we must always keep the mind in mind as we envision our next creations and pave the way for mixed reality.
This article contains a 20 minute video tutorial about creating immersive audio in VR. Skip to the end to watch it!
Anyone who has tried modern virtual reality technology will agree that it offers a level of visual immersion far beyond what past technologies were capable of. That first time experiencing a sense of presence, that “is this real?” moment, is unforgettable.
I see VR as the next great leap in sound design. Presence doesn't only come from what we see, but from all senses. Since we can't (yet!) control what you touch, taste, and smell, we have to rely on visual and audio cues to immerse the player.
The brain has this amazing ability to interpret its environment through what we hear, and we need to exploit that. We as sound designers have some amazing tools at our disposal, such as the sound engine Wwise, which Block Interval is using for Life of Lon. More important than the tool, however, is how it's used, and I believe we have broken some new ground in this area.
BUILDING A SEASCAPE
Most of the gameplay of Life of Lon will take place underwater, and one tricky part has been expressing this to the player. From making underwater recordings with a hydrophone, watching videos taken underwater, and just dunking my head in, one conclusion I came to was: water sounds boring! Everything sounds dull and lifeless, and the last thing we want in VR is to make our environment sound dead and flat. Add to that the fact that the player will be wearing a virtual helmet (which in real life would make things sound even more claustrophobic), and I realized that realism just won't cut it.
So I set out to create a unique underwater ambience that sounds exciting and vibrant, but still gives the player the feeling of being underwater. I don't necessarily want to slap a filter or effect on every sound effect in the game. After all, would you want to play a game where everything sounds dull or warbly?
Instead, I'm using an ambient backdrop to give the player a sense of space. Think wind noise, but with water. The backdrop I created is made of dozens of hydrophone recordings made from a stream by my house. Some spots were turbulent, some were relatively calm, and by filtering, pitch shifting, and blending them, I'm able to create a wide variety of ambient loops. I created 4 different “turbulence” levels, from calm to choppy water.
I blended the loops in Wwise using a blend container, and tied it to a game sync called Amb_Turbulence, with a range of 0 – 100. Using this game sync parameter, I can change how turbulent the water sounds around the player at different spots in the environment. This backdrop will be a stereo file, non-positional, and gives the player the sense of a dynamic, changing environment.
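Conceptually, a blend container driven this way is an equal-power crossfade between adjacent turbulence loops, steered by the 0-100 parameter. Here is a minimal, engine-agnostic sketch of that mapping (Python; the function name is hypothetical, and the real blending happens inside Wwise):

```python
import math

def loop_gains(turbulence, n_loops=4):
    """Map a 0-100 turbulence value to per-loop gains for n_loops ambient
    loops spread evenly across the range, using an equal-power crossfade
    between the two loops adjacent to the current parameter value."""
    t = max(0.0, min(100.0, turbulence))
    # Position of the parameter along the sequence of loop anchor points.
    pos = t / (100.0 / (n_loops - 1))
    lo = min(int(pos), n_loops - 2)       # lower of the two active loops
    frac = pos - lo                        # 0..1 between lo and lo+1
    gains = [0.0] * n_loops
    gains[lo] = math.cos(frac * math.pi / 2)       # fades out
    gains[lo + 1] = math.sin(frac * math.pi / 2)   # fades in
    return gains
```

The equal-power (cos/sin) curves keep perceived loudness roughly constant through the crossfade, which matters for a backdrop the player hears continuously.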
All of this is well and good, however integrating these parameter changes into Unity in a way that was easy to use, streamlined, and versatile took a little finesse.
PAINTING WITH SOUND
In order to create this truly dynamic environment, I had to make a custom tool in Unity. What I wanted was the ability to create zones where the backdrop would change smoothly and seamlessly. If you swim through a current, or through a tunnel with rushing water, you don't hear the current come from a point source; you hear the water around you become more agitated.
The central idea to express here is that the environment is a character. You are surrounded by water and it's alive. This tool allows me to do that. It has four main components: Emitters, Groups, Zones, and Emission Points.
- Emitters are GameObjects that the Wwise 'Play' events are attached to. They also contain scripts defining the Game Syncs, which control the RTPCs in Wwise for the blend containers.
- Groups are containers for Zones and Emission Points. They determine which Emitter the Zones affect, which Game Sync(s) they set, and the overlap behavior of the Zones. If two zones overlap, do we average the values, take the loudest, or take the quietest? This can be set in the Inspector for the Group.
- Zones are the areas where the sound is affected. They can either be box shaped or spherical, and when the player moves through them, they set the Game Sync to a certain value, as determined by a slider in the inspector. The box shaped zones can be tapered across the x/y/z axis to give a smoother transition, and the spherical zones can be tapered from the origin to the edge.
- Emission Points are optional objects that, when added to a Group, will cause the sound to taper out from either a point source or a line, adding more flexibility to the taper behavior of multiple zones. They are represented by a wire sphere or cylinder gizmo.
Multiple groups can affect the same Emitter and parameter, and groups can be children of any GameObject in the scene, so they can move with it.
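Stripped of the Unity specifics, the per-frame evaluation behind a tool like this is straightforward: each Zone tapers a target value by the player's distance from its origin, and the Group resolves any overlaps before writing the result to the Game Sync. A simplified sketch under those assumptions (Python; SphereZone and Group are hypothetical stand-ins for the real C# components, and only spherical zones with origin-to-edge taper are shown):

```python
import math

class SphereZone:
    """A spherical zone that tapers its target value linearly
    from full strength at the origin to zero at the edge."""
    def __init__(self, center, radius, value):
        self.center, self.radius, self.value = center, radius, value

    def sample(self, pos):
        """Return the tapered value at pos, or None if pos is outside."""
        d = math.dist(self.center, pos)
        if d >= self.radius:
            return None
        return self.value * (1.0 - d / self.radius)

class Group:
    """Combines overlapping zone samples (average / max / min) into a
    single Game Sync value; `default` applies when no zone covers pos."""
    def __init__(self, zones, mode="average", default=0.0):
        self.zones, self.mode, self.default = zones, mode, default

    def evaluate(self, pos):
        samples = [s for z in self.zones if (s := z.sample(pos)) is not None]
        if not samples:
            return self.default
        if self.mode == "max":
            return max(samples)
        if self.mode == "min":
            return min(samples)
        return sum(samples) / len(samples)
```

Each frame, something like `group.evaluate(player_pos)` would produce the value pushed to the Amb_Turbulence RTPC, so swimming through a turbulent zone smoothly agitates the water around the player.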
Here is a video demonstrating how I can dynamically change Wwise game parameters using this tool. Right now I just have a simple blend container with a basic turbulent water sound, but this can be applied to multiple (or nested) blend containers with several parameters, which will give me even more control over the soundscape.
Thanks for reading- hope it's been helpful as you work on making your experiences more immersive!
Palmer Luckey, founder of Oculus, had something very interesting to say about cables recently.
"Cables are going to be a major obstacle in the VR industry for a long time. Mobile VR will be successful long before PC VR goes wireless." Source
He followed that up with two more caveats:
"It is important to design both hardware and software with those limitations in mind. Real users won't have cable servants." Source
"And I say this as someone who has spent many hours as a cable servant, dancing cables around users to keep them immersed!" Source
That's right, cable servants. People who help you through your VR experience like a caddy on the golf course.
Cable servants work great if you're giving a demonstration or doing game development, but that's not very practical for most people. For this reason- Palmer is right: Mobile VR will probably outsell desktop VR. I would speculate that price will be another factor. Mobile VR (such as Google Cardboard) has an under-$20 point of entry vs. the cost of a good PC plus $300-500 for a consumer grade head mounted display like the HTC Vive or Oculus Rift.
What Are The Usability Concerns?
If you are building content for VR, you've got to consider the cable. Will you be building for standing or sitting? If you're building for standing- does your experience involve moving around on the ground like a fish out of water? Watch this video for an idea of what that might look like:
Maybe experiences that take you all over the floor and over your own cables aren't the best user experience. Let's take a look at 4 use cases for cables.
Case 1: Wireless
This is absolutely the best-case scenario. Mobile VR headsets like Google Cardboard and Samsung's Gear VR are completely wire-free, meaning you can build experiences that are standing, sitting, or whatever else you can dream up. The only caveat, of course, is head tracking. Without an external tracker (such as the Vive's lighthouses or the Rift's motion tracker), mobile VR headsets can't pick up positional movement data. That said- positional tracking in mobile VR can't be too far away. This brings us to...
Case 2: Seated and Wired
Despite the lack of freedom to completely move around, seated and wired is a great experience in VR. The head tracking of the Vive or Rift makes for impressive immersion, and because you aren't moving around, there's no risk of getting tripped up over wires. Expect to see a lot of really good seated experiences in VR- racing games, space sims, and adventure games in vehicles. Our very own project Life of Lon will be a seated adventure game that plays out from within a submersible!
Case 3: Small Room Standing and Wired
This could also be considered a hybrid experience as maybe you want to allow standing and sitting within the same experience. The advantage here is giving players more room to move their arms around to interact. This setup would be ideal for experiences that make use of the Vive controllers or the Oculus Touch (when those release). Seated players may not have as much room to move their body around as standing players- so consider a standing experience if you need the room and don't need to move the player much.
Case 4: Large Room Standing and Wired
Ahh the ultimate dream. The holodeck. A fully realized world around the player with full immersive tracking. But what to do about that pesky wire? Until someone invents some kind of crafty way to move it around you as you play your VR games, consider creating experiences that allow some mobility without moving in such a way that will overlap the wire. Stress Level Zero is working on an impressive game called Hover Junkers, and it looks like they've done a fantastic job of making the experience work without any trouble from the wires. Have a look at the gameplay:
Hopefully we've given you one more thing to think about as you refine your VR User Experience. Wires are a consequence of the limitations we have today in VR- but someday they will be a relic of the past! Until then, happy building!
We are on the cusp of a new era in technology. We've weathered the mobile and social revolutions, and are poised for the virtual and augmented revolution. But what does that mean for user experience? For the past decade, UX has been firmly rooted in the world of 2D. Sitemaps, wireframes, user flows- all two dimensional constructs to organize information for a two dimensional web. What's going to change?
The Interface is Dead, Long Live the Interface
With virtual reality and augmented reality, we are seeing a shift in what is possible on an interface. How about subtle finger tracking?
With the kind of technology in Project Soli, we may not need peripherals anymore. How about hand tracking in VR?
For the remaining few who are still not impressed, how about eye tracking?
The interface is about to change quite a bit as we shift to more hand and eye-centric methods of interacting. Expect to rethink your interaction design within the next few years.
Data Visualization Goes Virtual
Charts and graphs aren't going to be enough to represent real time or big data. We've gotten a taste of something better. Soon you will be able to represent data for 1,000 people by seeing 1,000 digital people. Here's a showreel that gives us an idea of the possibilities.
How about visualizing a database in VR?
UX professionals are going to need to form close partnerships with motion graphics and 3D modeling professionals to visualize data in new and more personal ways.
UX Architects Will Become Environment Architects
Everything up to this point has been blueprints. Schematics, wireframes, concepts. In the coming year, user experience designers and architects will be building worlds that users will be experiencing from the INSIDE. If you haven't seen Mike Alger's thoughts on this- have a look:
Are you ready?
UX isn't dying, but it's about to fundamentally change. Are you prepared? The ones who aren't will be caught off guard and left behind. Start buying VR and AR devices now and begin testing them out. We have a lot of work to do to determine the UX best practices in virtual reality.