(If you would like to receive an email update when I release a new devlog post (approx. 1 email each week), click here to sign up for my VR devlog newsletter!)


(If you missed my intro devlog post, you can read it here! If you want to skip straight to downloading the build, scroll down to the bottom!)

SceneAnimated

Intro
Wouldn’t it be awesome to get lost in a rainforest, get kissed by a magical iguana that transports me back in time, and then stumble upon a tribal village? #showerthoughts

For my first VR prototype, I wanted to keep things simple and make a small, contained, exploration-based experience that allowed me to focus on ambience creation and 3D audio implementation (and, perhaps in the future, magical iguanas). Inspired by some of the VR tech demos I’ve played, I kept imagining a scene where you walk around a forest and stumble across a cabin. There would be various objects in this cabin that you could interact with, and by exploring the space you could learn about the person who lived there. I definitely want to explore this kind of environmental-narrative direction in the future, but the cabin idea eventually evolved into a jungle rainforest scene with a tribal twist, so I decided to roll with that and save the cabin for a future prototype.

One of my best friends, Alex Mathew, is a brilliant game designer and developer. I told him about my idea, and he helped me get up and running with a plan to keep me on track. I decided to use Asana as my task management tool, as I have enjoyed using it for previous projects and appreciate that I can see a graph of my progress. It’s always nice to look back on a day/week/etc. and see how much you’ve accomplished!

AsanaTasklist

Starting my Asana task list (with the reminder that things take time)!

Alex assumed the role of my producer last weekend, so we sat down together over Skype and brainstormed my initial tasks. Effectively, the plan was for week 1 to focus on the alpha build, week 2 on the beta, and week 3 on final polish and release. The intention with these prototypes is to experiment with whatever it is I want to explore, and to end each development cycle with a polished, playable experience I can add to my portfolio.

For this first week, I wanted to have all of the core elements in there: the level/art roughly mapped out, the core sounds/music (even if placeholder) spatialized in the scene, and some basic gameplay functionality. With my tasks for the coming week mapped out, it was time to get started!

Initial Design & Visual Aesthetic
Before I opened Unity and touched a single prefab or line of code, it was important that I nail down the design a little more and understand what I actually wanted to make. I spent some time on Monday writing out a one-paragraph description of the game and some of my player experience goals.

GameDescription

Initial design documentation

Also, to keep organized with art and sound assets, I made spreadsheets for each and brainstormed every single asset I could think of that I’d need to find or create.

ArtList

The start of my sound asset list

With more clarity about what I wanted to make and what I wanted the player to feel/experience in this prototype, I was ready to start blocking out the level. I started by mapping out the level on paper, and quickly transitioned to blocking things out in Unity itself using some free assets I found.

LevelBlocked

Beginning to block out the level

With the level very roughly laid out, I wanted to find some assets to help fill out the visual aesthetic further. I wanted to make this project look as good as possible given my budget/time frame, to help increase the sense of presence and immersion in this world. I know that focusing only on art and chasing the prettiest thing possible is a deep rabbit hole to dive into, so I made sure to give myself a time limit when hunting for my first pass of art assets.

JunglePack

Preeeeeeettyyyyyyyy….

There were a handful of free assets I found that didn’t work properly on import or didn’t look the way I wanted, but I eventually found a beautiful package on the Unity Asset Store that really appealed to me: the Stylized Jungle Pack. It contained everything I needed to start creating this scene (trees, terrain textures, particle effects, etc.), which was exactly what I wanted.

After that, I also found some huts, a campfire, and a handful of other assets.

Since I am creating these experiences by myself on a limited time frame, I felt it important to make quick decisions and keep moving forward steadily. I can always come back later to clean things up, but for the sake of getting the level laid out, a pre-made art asset bundle like this was perfect for my prototyping purposes.

It did push me a tiny bit over budget, however (I had initially set a cap of $10 for this project), but given how well it fit my purposes, how pretty it looks, and the fact that I’ll be able to reuse the pack for future projects, I found the purchase justified. Even so, going over budget is not something I intend to make a habit of on future prototypes.

VR Hiccups
After collecting the assets I wanted to use for my first art pass, it was time to get my VR headset hooked up and ready to roll. I have an Oculus DK2 and have used it to play games on other friends’ machines and for my own development as well. Previously, I was using my old MacBook Pro laptop for my VR dev, but since it is about 5 years old now and on the verge of self-destructing, I figured my old PC would do a much better job.

I grabbed my PC out of storage, hooked it up, ran the Oculus Compatibility Tool and…

OcuSad

LOLOLOLOLOLOLOLOL… T___T

GraphicsOutofDate

FAILED! On every level except for me having Win7 as my OS. #OcuSad

Not entirely sure what I was thinking, honestly. I guess I had a giant brain-derp moment and forgot that my PC is even older than my laptop. It was a pretty sweet machine at the time, so maybe some part of me was wishfully thinking that it’d be able to handle my DK2.

I can hear you all now: “Jacob, you silly mega-n00b, you’re making prototypes for virtual reality and you’re trying to do it on a machine that’s older than your great grandma?!” Well dear readers, my answer to you is: Yup! At the moment at least.

This is just a small hiccup in my development process. I am now saving up for a new VR powerhouse PC that meets the recommended specs and, in the meantime, I realized that I could still get a large amount of work done without the HMD. Of course, it’ll be hugely beneficial to have the HMD working from the start, as that will help inform my future design decisions. But for now, I felt it important to keep moving forward and get done what I could. All this means is that I have to be a little more resourceful right now – can’t test the VR headset on my own PC? I’ll just borrow a friend’s computer that can run it. There’s always a way to make things work!

So, putting my DK2 aside for now (I’m sorry, Archimedes!), I hopped back into Unity to get back to work.

Art Pass
With my assets downloaded, it was time to start filling out my level. I loaded up my jungle pack and started building out the scene more, placing trees, huts, particles, and the like. It ended up looking like this:

ArtComingTogether

ArtComingTogether2

And eventually like this (with a 3rd hut and a fire pit!):

ArtComingTogether3

Creating and Implementing Audio
Now that I had a pretty good start to the scene, it was time to bring it to life with audio! I hopped into my DAW (I use Logic Pro X) and pulled some ambience recordings from various libraries, along with some of my own recordings. I dedicated most of my Wednesday to audio creation and initial implementation. I decided it would be better not to cross wires here: whenever possible, keep development/coding on separate days from audio content creation, since it can be difficult for me to quickly and effectively switch between tasks that require such different ways of thinking.

I created a large handful of short mono loops of birds chirping, leaves rustling, and wind blowing. Most of these sounds were edited fairly quickly, so I expected a certain level of repetitiveness once they were placed in the scene. I wanted to focus on getting something in there sooner rather than later though, so I decided to do another sound pass in my second week to clean things up and diversify the sounds a little more. With that, next up came audio implementation.

The cool thing about implementing audio in VR is that there are tools that use algorithms to model how a sound would actually behave in the real world. This takes into consideration its location in space not just horizontally, but vertically as well: if a sound is coming from above you, it will accurately sound like it is above your head. This might not seem like much, but this kind of spatialization doesn’t really exist in traditional 3D games. It’s part of the reason creating and implementing audio for VR is so exciting: it allows the player to truly feel present in the game world, because the sound cues react in the space as they would in the real world.

There are a handful of these binaural/3D audio tools that can be used. Anastasia Devana did an awesome comparison of them in this article a while ago. I am friends with the gentlemen over at Two Big Ears and thus have used 3DCeption a few times for some earlier projects. I really like how easily it integrated into Unity and how intuitive it was to use with my workflow.

Since I already have a little bit of experience with 3DCeption, I decided to give the Oculus Audio SDK a shot this time. To start, and for the sake of hearing things in the scene sooner rather than later, I decided to implement directly in Unity. However, I plan to spend parts of weeks 2 and 3 of development transitioning over to Wwise (using the Oculus Audio SDK plugin for Wwise) and implementing my sound assets that way.

I began placing all the sounds I had so far around the scene, up in trees and down in bushes, with a heavier wind sound appearing overhead once you enter the clearing. It ended up looking something like this (each purple thing is an audio cue):

AudioPlacement1stPassDone

Next came the slightly more tedious part: getting things to actually sit right in the space. Every sound has a ‘near’ and a ‘far’ parameter that I can tweak. ‘Near’ is the distance within which the sound plays at full volume, before it starts attenuating (decreasing in volume). ‘Far’ is the maximum distance at which the sound can be heard before it falls off completely. Balancing these numbers so that each sound reacts correctly in the space is an incredibly fiddly thing.

For my first pass on this alpha build, I ultimately settled on 30/60 for near/far on most of my sound cues (though a few of them are set at 30/100). I’ve noticed, though, that the falloff happens too quickly IMO (when I walk to the edge of the ‘far’ limit, the sound cuts out much more abruptly than I would expect, which is jarring), so this is definitely something I’m going to need to go back and tweak. I also haven’t played around with room modeling/reverb zones yet, but I am eager to experiment with some convolution reverb in Wwise to see how that helps the sounds sit in the space. As far as a rough first pass goes though, I am pretty happy with how things sound!
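For context, here is roughly what dialing in those distances looks like on the Unity side. (This is only a sketch: it uses Unity’s built-in AudioSource min/max distance rolloff as a stand-in for the spatializer’s near/far settings, and the class name is just for illustration.)

using UnityEngine;

// Illustrative helper: configures a looping, fully 3D AudioSource with
// "near"/"far"-style attenuation distances (e.g. 30/60 for most cues).
// Unity's built-in rolloff stands in for the spatializer settings here.
[RequireComponent(typeof(AudioSource))]
public class AmbienceCue : MonoBehaviour
{
    public float near = 30f;   // full volume inside this radius
    public float far = 60f;    // inaudible beyond this radius

    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.loop = true;
        source.spatialBlend = 1f;                       // fully 3D
        source.rolloffMode = AudioRolloffMode.Linear;   // simple, predictable falloff
        source.minDistance = near;
        source.maxDistance = far;
        source.Play();
    }
}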

I spent Thursday working on a VERY basic music track, with a few drum layers and a kalimba. Music in VR is something I’m especially excited to explore, since it raises the most question marks for me. It’s still largely unexplored territory, so there’s not really any one “right” way to do things. I have heard other people’s takes on it, and based on my own research and experimentation, I am currently in the camp that all music cues should come from somewhere in the space, not just be slapped on as a stereo file.

Thus, I placed some cubes around the scene as my placeholder instruments (I will update these with actual instrument models in the second art pass), and attached the music stems to them. (Quick side note: getting into the habit of exporting ALL of my sound and music files as mono will take some getting used to.) These music layers also used the same 30/60 settings.
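For a rough idea of what one of these placeholder instruments boils down to, here’s a small sketch (the class name and fields are just for illustration; in practice the stem can simply be dragged onto an AudioSource in the Inspector):

using UnityEngine;

// Rough illustration: spawn a placeholder cube at an instrument's position
// and attach a mono music stem to it, using the same 30/60 near/far values
// as the ambience cues.
public class PlaceholderInstrument : MonoBehaviour
{
    public AudioClip stem;       // a mono music stem, e.g. the kalimba layer
    public Vector3 position;     // where the "instrument" sits in the scene

    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = position;

        AudioSource source = cube.AddComponent<AudioSource>();
        source.clip = stem;
        source.loop = true;
        source.spatialBlend = 1f;    // fully 3D so the layer sits in the space
        source.minDistance = 30f;
        source.maxDistance = 60f;
        source.Play();
    }
}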

Day to Night Interactivity
At some point during the week, Alex and I came up with a loose narrative idea for the project. We decided that after the player explores inside all three huts, the sky would fade to night and you could then go sit/stand(?) by the campfire. As the fire burned and embers rose, you could look up at the starry night sky and watch as each ember turned into a new star. We liked the hint of narrative this gives the whole experience: it becomes part of a ‘tribal star creation ritual’. The ember -> star mechanic is for week 2, but this past week I implemented the triggers and used the UDay Cycle asset to help with the day/night transition.

I have a little bit of background in programming (I used to make a ton of Flash games back in high school), but if you knew me a few years ago, you’d look at me super funny to hear me say that I now actively enjoy coding. I used to be super afraid of it when I was studying in undergrad because I would constantly feel confused and bash my head against a wall trying to learn it for my assignments. Looking back, I think that resistance partially stemmed from my stubbornness in wanting to *only* be a composer (and that’s it damnit! *folds arms, pouty face*), and that was what held me back from getting into it more. Now, I think scripting in Unity is a fairly easy way to get into programming, because you (generally) get immediate feedback right there on the screen when you write something, so it’s very validating when you do something right.

If you are interested in doing any form of game development, I highly encourage you to learn the basics of programming if you don’t know them already. There are tons of free resources out there. Even if you want to remain strictly an artist/animator/audio person/etc. and never intend to touch code, I still think it’s important to at least understand the fundamentals (the same goes for programmers learning about their teammates’ respective skills). Knowing how to speak the language of the other people on your team will enhance your game development experience. Having that inherent (even if very basic) understanding of each other’s vocabulary allows for smoother communication, which in turn helps the design and implementation of those ideas flow a lot more smoothly.

*steps off my soapbox* Ahem.

Anyway, for those curious, here’s my code for the triggers that were placed in each hut:

CheckTriggerCS

And here’s my code for a controller object I created to handle the day to night time transition:

DayNightController

(Note that I’m not an expert programmer by any means, so if you have any suggestions on how to improve my code/do things more efficiently, I’d love your feedback!)

And with that code (which references DNC_DayNight.cs, a script that’s part of the UDay Cycle asset), the scene transitions from day to night after you explore all 3 huts!
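In broad strokes, the two scripts work together something like this (a simplified sketch rather than my exact code; the names below are just for illustration, and the actual hand-off to DNC_DayNight.cs is omitted):

using UnityEngine;

// (In Unity, each of these classes would live in its own .cs file.)

// Simplified sketch of a hut trigger: tells the controller when the player
// has entered this hut for the first time. Requires a trigger collider on
// the hut and the "Player" tag on the player object.
public class HutTrigger : MonoBehaviour
{
    public DayNightController controller;   // assigned in the Inspector
    private bool visited = false;

    void OnTriggerEnter(Collider other)
    {
        if (!visited && other.CompareTag("Player"))
        {
            visited = true;
            controller.HutVisited();
        }
    }
}

// Simplified sketch of the controller: once all three huts have been
// visited, it kicks off the day-to-night transition.
public class DayNightController : MonoBehaviour
{
    public int hutsToVisit = 3;
    private int hutsVisited = 0;

    public void HutVisited()
    {
        hutsVisited++;
        if (hutsVisited >= hutsToVisit)
        {
            StartNightTransition();
        }
    }

    private void StartNightTransition()
    {
        // In the real project this hands off to DNC_DayNight.cs from the
        // UDay Cycle asset to fade the sky; that call is omitted here.
        Debug.Log("All huts visited - fading to night!");
    }
}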

Vive Demo
Quick aside here: I had the opportunity to go to Valve HQ on Thursday for my very first demo of the Vive (thanks, Ricky, for running my demo!). While there are certain parts of my demo experience that they have asked me not to share, I will say that it was one of the smoothest VR experiences I’ve had the pleasure of trying.

I had hesitations at first since, to be honest, the controllers looked sort of awkward to me and I wasn’t sure how responsive physically walking around the space would be. But as soon as I had the headset on and the controllers in my hands and started walking, I was completely won over. Everything about the experience was smooth and felt incredibly intuitive. In one of the demos, I was interacting with the space before my demo leader could even explain the controls to me.

The intuitiveness, the smooth gameplay, and the fact that I could physically walk around in this small area instantly transported me into the world. I haven’t felt a sense of presence that strong in a while, and if left to my own devices, I would have stayed in there for hours. It made me INCREDIBLY grateful to have the privilege of working in VR right now, and I am even more excited to see how VR as a whole evolves from here.

Using the Vive raised some design questions, and if you have any insight about this, I would love to hear your thoughts. Given how much you stand and walk around with the Vive, I would imagine that players will get tired sooner from the movement. Thus, play sessions would probably have to be shorter than if you were, say, sitting on your couch playing Rocket League for 12 hours straight (I totally haven’t done that. Nope.).

I wonder how this will affect the pacing of games. If you’re playing a longer-form game, the flow of the experience would have to cater to that potential physical fatigue (I imagine a VR game paced similarly to Brothers: A Tale of Two Sons could work really well, broken up elegantly into distinct segments that each take 40-90 minutes or so). Or maybe not! I could be severely underestimating the fitness level of the average gamer, or perhaps eye strain will kick in before physical fatigue does (which would still affect pacing). I suppose that’s something we’ll uncover over time, and I’m sure we’ll start gathering some really useful data once all of the headsets are available to the public in the very near future.

Wrapping Up
So, to recap: this past week I laid out a game plan for this prototype’s 3-week development period. I completed an initial level design and art pass, and I created and implemented a first pass at the audio and music. I also began implementing the day-to-night transition mechanic!

AlphaDone

Alpha build complete!

This coming week, I will fill in many of the gaps in art and sound when I do their respective second passes. I will also start implementing the sounds in Wwise and tweak parameters from there. I am also super excited to dive back into coding mode and get the embers-turning-into-stars feature hashed out.

While I’m pretty set on what I want to accomplish for this prototype, if you have any design ideas or challenges you want to toss my way, please leave me a comment!

Rainforest – Alpha Build
If you would like to play what I have so far, you can download the alpha build of this tribal rainforest scene via the links below!
(No VR headset required; these builds use a normal FPController prefab. Also note that this scene is not optimized yet.)

Download Rainforest – Alpha (PC)
Download Rainforest – Alpha (Mac)

***

Thank you very much for reading! If you would like to receive email updates when I release the next devlog, sign up for my VR newsletter here. If you have any questions/comments/suggestions/love haiku, please leave me a comment! Also, please let me know if there’s anything you’d like me to go into more/less detail about in the future – doing a devlog like this is a new experience for me, so feedback about what you all want to see/hear is appreciated.

Talk to you next week!

– Jacob