In the couple of years I’ve spent on ROBLOX, I’d like to think I’ve come up with some pretty crazy ideas for games. With the release of dynamic lighting, I was able to create games that looked precisely as I had imagined them. Then, with the recent release of uploadable sounds, I was finally able to bring scenes to life with audio.
As soon as sounds were released, I knew immediately I wanted to create a cinematic work. I saw the announcement when I was doing my math homework, and, well, math homework suddenly seemed less important. I flipped over a piece of grid paper and started to plot a storyline. Once I had some general ideas on paper, I wrote a title above my notes: Nuclear Simulation. With some scribbled notes and a general idea of what this was going to look like, I jumped into ROBLOX Studio and started building the first “scene” of the game.
To start, I knew I wanted to make a game that was an experience. In order to do this, I would have to take control of the player’s camera–not an easy task. The initial goal was to develop a sort of “checkpoint-based” camera system, where the camera fixes at preset angles and moves dynamically from point to point. I also wanted to give users a small degree of control, meaning they could look around from any fixed angle. This meant I had to design two systems on top of one another: the first controls the interpolation of the camera (i.e., the movement), while the second handles player interaction. The idea was inspired by one of the early Call of Duty games–I don’t remember the exact details, but there was a scene where you start as a hostage. The angle is fixed, but you can look around slightly using the analog stick.
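The layering I just described can be sketched in a few lines of plain Lua. To be clear, this is my own minimal illustration, not the game’s actual code: the checkpoint supplies the fixed base angle, and the player’s input is clamped so they can only look around slightly.

```lua
-- Two layered systems: the checkpoint supplies the fixed base angle,
-- and the player's input adds a small, clamped offset on top of it.
local MAX_LOOK = 10 -- degrees the player may look away from the fixed angle

local function clamp(x, lo, hi)
	return math.max(lo, math.min(hi, x))
end

local function cameraAngle(checkpointAngle, playerOffset)
	-- the player can never drag the camera more than MAX_LOOK degrees
	return checkpointAngle + clamp(playerOffset, -MAX_LOOK, MAX_LOOK)
end
```

In the real game this happens per-axis with Roblox CFrames rather than single angles, but the principle is the same: the fixed angle and the player’s offset are computed separately, then combined.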
It was time to start writing some code. I built a custom framework for camera animation using this website, which lists out many of the “easing” functions you can write in ActionScript. Though written for a different language, these functions are easy to translate into Lua. Getting the camera to move from one point to another wasn’t enough–I also wanted to simulate sporadic movement, like sudden shaking and intense vibrations. This required additional code, though the algorithms were ultimately pretty simple.
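For a concrete sense of what translating one of those easing equations looks like, here is the classic quadratic ease-in-out in Lua, plus a simple random-jitter shake of the kind I describe. These are illustrative sketches in the standard easing signature, not excerpts from my framework:

```lua
-- Quadratic ease-in-out in the classic ActionScript easing signature:
-- t = elapsed time, b = beginning value, c = total change, d = duration.
local function easeInOutQuad(t, b, c, d)
	t = t / (d / 2)
	if t < 1 then
		return c / 2 * t * t + b
	end
	t = t - 1
	return -c / 2 * (t * (t - 2) - 1) + b
end

-- Sporadic shake: a random jitter, scaled by intensity, that can be
-- added to the camera's position each frame while a shake is active.
local function shakeOffset(intensity)
	return (math.random() - 0.5) * 2 * intensity
end
```

Called every frame with the time elapsed since a move began, a function like easeInOutQuad glides the camera smoothly between two checkpoints; layering something like shakeOffset on top produces the vibration effects.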
Once I was content with the overall framework, I started to code the storyline–let me tell you, this code is not pretty. To simplify things, I wrote one master function called “Start” that controls every scene that occurs in the game. Different special events, like the spinning airliner at the end of the second scene, had to be given their own custom functions. The rest of the code is a series of function and method calls that communicate with the custom camera framework I had written. After I spent some time with it, the code became easier and easier to read, giving me very precise control over each scene.
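Structurally, this approach amounts to a master function stepping through an ordered list of scenes. A hypothetical miniature of that shape–the names and scenes here are made up for illustration, not taken from the game:

```lua
-- A master Start function that plays each scene in order. Real scenes
-- would call into the camera framework; these just log their names.
local log = {}

local scenes = {
	function() table.insert(log, "warning broadcast") end,
	function() table.insert(log, "airliner spin") end, -- a special event with its own function
}

local function Start()
	for _, scene in ipairs(scenes) do
		scene()
	end
end

Start()
```

The appeal of this layout is that adding or reordering scenes is just editing the list, while anything unusual gets its own function.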
While all of this is great, sound is what brings Nuclear Simulation to cinematic life. Without sound, the scenes would have no context and the game would be boring. As a game developer and lifelong gamer, I have learned that sound is a make-or-break aspect of digital entertainment as a whole. Sound provides context in a way that no visual can. If you’re standing in a forest in a game, you expect to hear the ambient noises a forest makes–in fact, the absence of those noises is more distracting than the noises themselves. I wanted to craft an experience loaded with custom sounds, to make Nuclear Simulation really feel alive. I had a blast recording my own sounds, and even got to write a little music, like the guitar loop you hear upon entering the game.
Creating my own sounds was both fun and a learning experience. I used an open-source program called Audacity, which I found fairly simple and intuitive. Many of the sounds I used were created synthetically with this software, like the explosions and the sound of the airliner engine. I am in no way an expert in creating sound–to learn how to make these noises, I typed things like “how to make an explosion on Audacity” into Google. Other sounds were real noises I captured myself, or open-source sounds I stumbled upon online. The sounds of the jet came from a video I captured at an airshow five years ago. The burning fire from the plane crash was audio I recorded from a fire pit in my backyard–I brought my laptop outside, hooked up a microphone, and just let it sit for a while. I then synthetically added a breezy “wind” noise to the fire to make it sound a bit more eerie.

The sounds of the passengers panicking and the tornado siren were open-source sounds I found–you’ve got to be careful with sounds on the internet, and pay close attention to their licensing. The “Nuclear Warning” voice at the beginning of the simulation was a friend of mine. I had her read the words, then did a lot of post-production editing to make the recording sound robotic and synthetic. We laughed at how different I was able to make her sound. I’m having so much fun creating audio that I’m planning on hooking my electric guitar up to a bunch of effect pedals to create more interesting sound effects. The biggest thing I learned from this experiment is that you can create awesome and useful sound effects by doing simple tasks around a microphone, like clapping your hands or stomping your feet.
As I write this, my simulation is not yet complete. I’m working on building the final scene, and I can already tell you this: the ending is a real cliffhanger. Personally, I hate cliffhangers, so I thought I’d use my second Developer’s Journal to ask all you readers out there: how should this end? I’ve talked about it with lots of people on ROBLOX and on Twitter. A large portion of viewers think I should use the four-scene experience as a sort of intro to a brand new game–good idea! But what kind of game would it be? I don’t have any ideas for gameplay currently, and would love to hear yours!