(If you would like to receive an email update when I release a new devlog post (approx. 1 email each week), click here to sign up for my VR devlog newsletter!)
My Vive is set up!
After a handful of firmware/driver updates and a little bit of wrangling with my room setup, I finally got my Vive Pre up and running! BWEBWEBWEBWEEEEEEEEEEEE *airhorns*
I’m ecstatic and incredibly grateful to be able to develop for this headset. There’s just something so cool and instantly immersive about being able to walk around in a space and use the controllers to interact with objects in the scene. I found myself playing some of the demos for quite a while, and had fun messing around with the built-in camera:
I am very eager to see how the headsets and controllers (and other peripherals) evolve from here.
I did run into a few hiccups during setup, though. If you happen to be setting up a Vive of your own, a little heads up: at first, I was experiencing a fair number of tracking issues with the controllers and headset, which was INCREDIBLY disorienting. After several hours of research and experimentation, I discovered that the light coming in from outside was likely the culprit behind my tracking problems (the base stations track using infrared, so direct sunlight can apparently interfere). I closed my blinds and faced away from the window, and things went back to running smoothly.
A Musical Prototype
I am heading out of town for a few weeks at the end of this month, so I decided to keep this next prototype to a two-week experiment. My intention is to create a basic music visualizer/performance experience. You will be able to pick up a handful of cubes lying on the floor around you, and when you place a cube on a pedestal in front of you, it will start playing a music layer. Time permitting, I will also add a basic music visualizer to the scene that reacts to whatever is playing. You can layer multiple cubes/music stems to build your way up to a full song.
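For the stem-layering idea, one approach in Unity is to schedule every stem’s AudioSource to start at the same DSP time and simply mute/unmute layers as cubes are placed, so the stems never drift out of sync. Here’s a rough sketch of what I have in mind (the class and member names are just my own placeholders, not final code):

```csharp
using UnityEngine;

// Hypothetical sketch: keep all music stems playing in sync from scene start,
// and just unmute a stem when its cube lands on the pedestal. Starting every
// AudioSource at the same dspTime keeps the layers sample-accurate together.
public class StemMixer : MonoBehaviour
{
    public AudioSource[] stems; // one looping AudioSource per music layer

    void Start()
    {
        // Schedule all stems to begin together a short moment from now.
        double startTime = AudioSettings.dspTime + 0.5;
        foreach (AudioSource stem in stems)
        {
            stem.loop = true;
            stem.mute = true;           // silent until its cube is placed
            stem.PlayScheduled(startTime);
        }
    }

    // Called by the pedestal when a cube is placed on or removed from it.
    public void SetStemActive(int index, bool active)
    {
        stems[index].mute = !active;
    }
}
```

Muting rather than stopping means a stem can come back in mid-song and still land exactly on the beat.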
I figured this would be a fun way to create a sliver prototype of a larger VR music experience I eventually want to make, as well as to get my feet wet learning how to develop for the Vive and scripting the controller input/interactivity.
I recently learned about a new game engine developed by Amazon called Lumberyard. Apparently it’s basically CryEngine 3.8 with a few tweaks. Since I really like how CryEngine looks, I figured I’d give it a shot. I downloaded it and messed around, but long story short: I’m not going to jump ship to it full time just yet. While it’s essentially a CryEngine clone right now, Amazon apparently plans to branch off and evolve it into much more of its own unique engine. If major changes are coming, I’d rather wait until the next major release before diving in too far and potentially abandoning Unity.
I really do want to mess around with other engines, though, because some of the others out there seem to make it super easy to get a scene looking pretty. It’s definitely possible to make things in Unity look pretty; I just feel that, given my current skill set, it takes much longer to reach the level of visual polish I want. I am very keen on learning more about lighting in Unity, and might make that part of a future prototype.
Ultimately, I want my VR scenes to feel good, sound good, and look good. As such, I ended up experimenting a bit with CryEngine V. This new version (released only a few months ago) comes with a Pay What You Want model, as well as, apparently, a whole slew of built-in VR tools to make it easy to dive in and develop. Having played Everybody’s Gone to the Rapture a few weeks ago (which was built in CryEngine) and been massively impressed with how it looked, I wanted to give CE V a chance. In the coming weeks, I’ll slowly pick away at some of the tutorial videos I have queued up to see whether using CE for some of my prototypes is a viable option. I will also mess around with Unreal soon, just so all of my bases are covered!
Basic Controller Input
So for this week, my goal was just to get my Vive set up and get basic controller input/object interaction scripted. I stumbled across a website called vrdev.school, which happened to have a free course on VR dev for the Vive. I enrolled and watched some of the tutorial videos, which taught me the basics of using the SteamVR Unity assets to script controller input. It was a little more involved than I expected, but I ultimately got object interaction (pick up, drop, throw) working!
I have also begun implementing some basic feedback for when you pick up an object (such as audio cues and controller rumble). Next up is to tie the cubes to different music stems, so that placing a cube on the pedestal plays its music track!
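At its core, the grab logic boils down to parenting the object to the controller when the trigger is pressed, then restoring physics and handing the controller’s velocity back to the Rigidbody on release so throws feel natural. Here’s a sketch using the SteamVR plugin’s SteamVR_TrackedObject/SteamVR_Controller API — the class and field names are my own placeholders, not the course’s actual code:

```csharp
using UnityEngine;

// Hypothetical sketch of pick-up/drop/throw with the SteamVR Unity plugin.
// Assumes the controller object has a small trigger Collider for detecting
// nearby cubes (each cube having its own Collider + Rigidbody).
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class ControllerGrab : MonoBehaviour
{
    private SteamVR_TrackedObject trackedObj;
    private Rigidbody heldBody;   // cube currently held, if any
    private Collider touching;    // nearest grabbable collider

    private SteamVR_Controller.Device Device
    {
        get { return SteamVR_Controller.Input((int)trackedObj.index); }
    }

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.attachedRigidbody != null)
            touching = other;
    }

    void OnTriggerExit(Collider other)
    {
        if (other == touching)
            touching = null;
    }

    void Update()
    {
        if (Device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger) && touching != null)
        {
            // Pick up: parent the cube to the controller and turn off physics.
            heldBody = touching.attachedRigidbody;
            heldBody.isKinematic = true;
            heldBody.transform.SetParent(transform);
            Device.TriggerHapticPulse(1500); // short rumble as pickup feedback
        }
        else if (Device.GetPressUp(SteamVR_Controller.ButtonMask.Trigger) && heldBody != null)
        {
            // Drop/throw: restore physics and hand off the controller's
            // velocity so the cube flies with the motion of the throw.
            heldBody.transform.SetParent(null);
            heldBody.isKinematic = false;
            heldBody.velocity = Device.velocity;
            heldBody.angularVelocity = Device.angularVelocity;
            heldBody = null;
        }
    }
}
```

Copying the device’s velocity and angular velocity on release is the piece that makes throwing feel right; without it, dropped cubes just fall straight down no matter how hard you swing.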
Seattle VR Hackathon
This coming weekend, the 22nd through the 24th, is a VR hackathon! I am incredibly excited to go; it will be a really fun opportunity to hang out with friends and flex my VR audio/dev skills. My goal is to walk out with a new polished portfolio piece, and I already have some ideas about what I might want to make. I will be sure to blog about the experience and post a link so anyone can download the build of what my team and I create!
If you are in the area and are planning on attending, please let me know! If you have never been to a hackathon and want to go or learn more but are unsure about what it entails, I highly encourage you to read this article written by my friend Eva Hoerth: “You don’t have to be a ‘hacker’ to attend a hackathon. Here’s 10 stories why.”
Thanks so much for reading and joining me on my VR dev journey! I’m very excited to keep developing with the Vive, as I have SO many ideas I want to prototype. I am not releasing a build of this scene this week, but I will definitely release one next week when the final version of Vive test project #1 is complete!
Have a great week, and talk to you soon!