In this chapter and the next, we'll create a first-person Virtual Reality (VR) game. The game targets the Oculus Rift S specifically, and it's worth declaring this from the outset, as many other VR platforms exist. Although development methodologies differ from device to device, the main principles of VR development within Unity are similar enough that these chapters will still be helpful for development on any of the hardware available today.
The lighting in a game can have a significant impact on the mood of the game. In previous chapters, we've briefly touched on how to configure basic lighting in a game. In this chapter, we'll take a deeper look into some of the techniques and tools game developers can use to create a visually appealing game.
As well as the lighting, we'll also add post-processing effects to enhance the visuals further. Post-processing effects are filters and adjustments applied to the pixels of the scene camera to stylize or improve the aesthetics of the rendered frame. By applying this knowledge in your own projects, you can create stunning-looking games with minimal effort.
Once we've looked at the lighting and post-processing effects, we'll move onto preparing the project for VR. This will involve installing the necessary packages. Luckily, as we'll soon discover, Unity provides a useful plugin management tool for this purpose.
We will build on many of the development techniques seen so far, as well as implementing new features specific to VR.
This chapter explores the following topics:
This chapter assumes that you have not only completed the projects from the previous chapters but also have a good, basic knowledge of C# scripting generally, though not necessarily in Unity.
The assets used in this project can be found in the book companion files in the Chapter11/Assets_To_Import folder. The end project can be found in the Chapter11/End folder.
You will also require a VR device (we use the Oculus Rift S), and a computer that meets the minimum requirements for that device. The minimum requirements for Oculus devices can be found here: https://support.oculus.com/248749509016567/.
In this game, the player will be a stationary character that can look around and shoot in any direction but cannot move around. The player will be standing in a sci-fi interior, and enemy bots will spawn into the level at random intervals. The bots will initially wander around searching for the player and, upon finding them, will run toward them, eventually attacking them. The player will be armed with a plasma cannon on each hand and can attack oncoming enemies to avoid being injured; the primary objective is to see how long they can survive!
To create this project, we will begin by creating the core functionality for a standard first-person mode, and then migrate that to VR. To get started, do the following:
When working with different mesh assets, especially when reusing assets made by others, you'll often find they're made at different sizes and scales. To fix this, we can adjust the Scale Factor of each mesh.
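Scale Factor can be set per asset in the Import Settings, but if an asset pack contains many meshes, an editor script can apply it in bulk. The following is an editor-only sketch (it must live in an `Editor` folder), and the `1.0f` value is illustrative — tune it per asset pack:

```csharp
using UnityEditor;

// Applies a uniform Scale Factor to every model as it is imported,
// so meshes from different sources line up at the same world scale.
public class UniformScaleImporter : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        importer.globalScale = 1.0f; // equivalent to the Scale Factor field in the Inspector
    }
}
```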
With the meshes scaled correctly, we can construct the level for our game:
Now that the assets have been imported, we can improve the look and feel of the game drastically by spending some time configuring the lighting. For this reason, we'll create the lighting for the scene before we enable VR.
Lighting is fundamental in creating a specific mood in your scene, and here we'll create an atmospheric, dystopian sci-fi interior. This kind of environment typically relies on high contrast, setting dark areas against vibrant, unnatural light colors such as green, blue, and red.
In this section, we'll remove the lighting that Unity automatically adds to a new scene. We'll then add the base lighting, which, as the name suggests, forms the foundation of our lighting system, and build on it by adding emissive wall panels and Light Probes.
When you create a new scene in Unity, it will automatically add two common objects for you: a Camera and a Directional Light. By adding these two objects, you are able to get up and running as quickly as possible. However, there are times when this default setup may need some tweaking, and this is one of those times. As we'll be creating our own lighting system, we can do away with the defaults provided by Unity:
Now with the Unity default lighting removed, we can configure our own custom lighting. By configuring lighting from the ground up and not relying on the default lighting setup, you'll gain a better understanding of how to implement lighting solutions in your own games. This information will prove valuable to ensure that your games have a distinctive look. This is arguably even more critical in a VR game, where the player enters the world like never before. We'll start our custom lighting setup with the base lighting. This lighting is the foundation on top of which our other lighting systems will provide an additive effect.
The base lighting should provide the player with the visibility they require to play the game comfortably. It should also help set the mood for the game, but we'll be adding extras such as emissive wall panels and Light Probes that will help with setting the mood. Let's start by adding a new light:
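If you prefer to see what the editor steps amount to in code, the sketch below creates a dim, cool-colored Spot Light suitable for moody base illumination. The position, color, and intensity values here are illustrative, not the book's exact settings:

```csharp
using UnityEngine;

// Creates a single spot light at scene start as the base of the lighting setup.
public class BaseLightSetup : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("Base Spot Light");
        go.transform.position = new Vector3(0f, 4f, 0f);
        go.transform.rotation = Quaternion.Euler(90f, 0f, 0f); // point straight down

        var light = go.AddComponent<Light>();
        light.type = LightType.Spot;
        light.color = new Color(0.4f, 0.7f, 1f); // cold blue for a sci-fi mood
        light.intensity = 1.5f;
        light.range = 15f;
        light.spotAngle = 60f;
    }
}
```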
That's looking perfect so far – atmospheric but with enough visibility. For more of an atmospheric feel, we'll add wall panels that glow.
Emissive lighting lets you use a material and its maps as a light source, which can be emitted from a mesh. We can add emissive lighting to wall panels in our level and they will glow, reinforcing our sci-fi theme. Start by creating a material:
Great! You've just created an emissive material. Now let's assign it to a mesh to illuminate the scene:
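The same effect can be driven from a script, which is handy if you later want panels to pulse or change color. This is a hypothetical helper, and it assumes the Built-in Render Pipeline's Standard shader — the property and keyword names differ in URP/HDRP:

```csharp
using UnityEngine;

// Makes the attached panel's material glow by enabling its emission channel.
public class EmissivePanel : MonoBehaviour
{
    public Color glowColor = Color.green;
    public float intensity = 2f;

    void Start()
    {
        // Renderer.material returns a per-object copy we can safely modify.
        Material mat = GetComponent<Renderer>().material;
        mat.EnableKeyword("_EMISSION");                         // turn the emission pass on
        mat.SetColor("_EmissionColor", glowColor * intensity);  // HDR color drives the glow
    }
}
```

Note that runtime emission changes only affect real-time lighting; baked global illumination from emissive surfaces still requires the lighting to be regenerated.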
Tip
If the lighting doesn't automatically rebuild, you can start the process manually by opening the lighting window (Window | Rendering | Lighting) and clicking on Generate Lighting.
Very eerie! However, it may be a little bit dark for the player (depending on their preference). To fix this, and complete our lighting system, we'll add Light Probes to the scene.
Light Probes are useful for adding indirect illumination to real-time (dynamic) objects, such as the player character and enemies. A Light Probe network is a connected set of empty, marker-style objects. Each object (a node) records the average color and intensity of neighboring lights in the scene within a specified radius. These values are stored in each node and are blended onto moving objects as they pass through the network:
The idea is to add or duplicate more probes within the network, and to position them strategically around the level in areas where the light changes significantly, in either color or intensity, or both. The probe network should capture the distribution of light in the scene. To get started with this, do the following:
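Duplicating probes by hand works well, but for a regular layout you can also populate a Light Probe Group from a script. The sketch below places a grid of probes; spacing and extents are illustrative, and the probes still need to be baked via Generate Lighting before they contribute anything:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Fills a LightProbeGroup with a regular grid of probe positions.
public class ProbeGridSetup : MonoBehaviour
{
    void Start()
    {
        var group = gameObject.AddComponent<LightProbeGroup>();
        var positions = new List<Vector3>();
        for (int x = -2; x <= 2; x++)
            for (int z = -2; z <= 2; z++)
                positions.Add(new Vector3(x * 3f, 1.5f, z * 3f)); // 3-unit spacing at head height

        group.probePositions = positions.ToArray();
    }
}
```

In practice, you'll still want to move individual probes into areas where the light changes sharply, since a uniform grid undersamples high-contrast regions.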
Excellent, we've now completed the lighting for the environment: we've added spot and point lights, emissive lights, and Light Probes. These components are working to ensure objects are illuminated correctly. Things are looking good!
And that's it for lighting. We covered a lot in this section, including the following:
This is a solid foundation in Unity lighting that can be applied to any of your projects moving forward, not just VR.
With lighting complete, there's one last thing we should look at before adding VR support, and that is Unity's new post-processing stack. Becoming comfortable with implementing effects using the post-processing stack will help lend your game a unique visual appearance.
Introducing the post-processing stack
In this section, we'll add these post-processing camera effects to enhance the appeal of the scene. As mentioned in earlier chapters, post-processing effects are filters and adjustments applied to the pixels of the scene camera to stylize or improve the aesthetics of the rendered frame.
Important note
Unity 2018 and later versions ship with version 2 of the post-processing stack for adding volume-based post-processing effects to your project.
To use the post-processing stack, we need to perform the following steps:
Unsurprisingly, we'll start by installing the post-processing stack, which provides the functionality we need to enhance our game's visuals.
The post-processing stack is a Unity package that combines a set of effects into a single post-processing pipeline. There are numerous benefits to this, but one for you as a developer is that once you know how to apply one effect using this pipeline, you'll easily be able to add any of the other available effects.
To add the newest version of the post-processing stack, do the following:
After installation, the complete post-processing functionality has been added to your project. This workflow depends on two main steps. First, we mark out volumes in the level that, when entered by the camera, cause effects to be applied to that camera. Second, we specify which cameras are affected by the volumes. We'll start by creating the volumes.
Let's start by creating a single post-processing volume in the level:
Now we have a volume that encompasses our tunnel. This is called a local volume as the effects are only applied in a specific area of the game.
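For reference, the editor steps above are equivalent to the following code sketch. The layer name "PostProcessing" and the collider size are assumptions for illustration — use whichever layer and extents match your project:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Builds a local post-processing volume covering the tunnel area.
public class TunnelVolumeSetup : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("Tunnel Volume");
        go.layer = LayerMask.NameToLayer("PostProcessing");

        // A trigger collider defines the region the camera must enter.
        var box = go.AddComponent<BoxCollider>();
        box.isTrigger = true;
        box.size = new Vector3(10f, 4f, 30f); // roughly tunnel-sized

        var volume = go.AddComponent<PostProcessVolume>();
        volume.isGlobal = false;   // local: effects apply only inside the collider
        volume.priority = 1f;      // higher priority overrides global volumes
        volume.blendDistance = 2f; // fade effects in over 2 units
    }
}
```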
Important note
The post-processing stack introduces the concept of global and local volumes, and each volume can be given a priority and a specific set of effects. You can take advantage of this in your games by creating a global volume that will apply specific effects, and then creating local volumes, as we have done with our sci-fi tunnel, that provide additional effects or override the global effects for when the player is in that area.
Next, we'll configure our camera to respond to the post-processing volume we've just created.
To test the effects, we'll need a camera. As mentioned earlier, every new scene automatically includes a camera object. However, rather than using the default static camera, we'll use the first-person controller we created in Chapter 8, Creating Artificial Intelligence.
Important note
We'll shortly be replacing this character controller with a VR version.
To do this, we'll first export a package from the previous project and then import it into our new project. These steps will come in handy whenever you want to share objects and all associated data between your projects.
To export the character controller, do the following:
This creates a unitypackage file, which contains all the data we need to use the character controller.
Now we have the Unity package, we can import it into our current project:
Important note
You may notice when you import the package that the assets are placed at the same location as they were when they were exported. This is something to consider when creating your folder structure if you will be exporting packages.
With the package imported and added to our scene, we are ready to configure the camera:
We've now added a post-process volume to the scene and a post-process layer to the camera. Next, we need to associate specific post-processing effects with the volume, which will be rendered to the camera when it enters the volume.
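The camera side of this setup can also be sketched in code. Again, the "PostProcessing" layer name is an assumption carried over from the volume; note that when a PostProcessLayer is added at runtime in a built player, you may also need to initialize it with the stack's PostProcessResources asset:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Attach to the camera: makes it respond to post-processing volumes.
public class CameraPostFxSetup : MonoBehaviour
{
    void Start()
    {
        var layer = gameObject.AddComponent<PostProcessLayer>();
        layer.volumeLayer = LayerMask.GetMask("PostProcessing"); // only volumes on this layer affect us
        layer.volumeTrigger = transform;                          // this camera's position drives blending
    }
}
```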
We've laid the groundwork for using post-processing effects but have not yet selected and configured the effects we would like to apply. This will change now as we go through the last steps required to add post-processing effects to our project.
To select which effects we would like to use, we first need to associate a Post-processing Profile with the volume:
Finally, ensuring the first-person camera is inside the volume, you should immediately see the result in the Game tab. Excellent work. Our scene is looking great, with both lighting and post-processing combined:
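If you ever need to build a profile procedurally (for example, to toggle effects from gameplay code), the same association can be made in a script. This sketch assumes a `PostProcessVolume` reference is already assigned, and the bloom intensity is an illustrative value:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Creates a profile at runtime, adds a Bloom effect, and assigns it to a volume.
public class ProfileSetup : MonoBehaviour
{
    public PostProcessVolume volume; // assign in the Inspector

    void Start()
    {
        var profile = ScriptableObject.CreateInstance<PostProcessProfile>();

        var bloom = profile.AddSettings<Bloom>();
        bloom.intensity.Override(5f); // Override() both sets the value and enables the parameter

        volume.sharedProfile = profile;
    }
}
```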
We covered a lot in this section, including the following:
That's it for the lighting and post-processing effects. It's now time to prepare the project for VR!
In this chapter so far, we've been preparing the foundations for a scene, ready to add core functionality and gameplay. To recap, our game will be a first-person VR shooter, in which waves of enemy droids will spawn into the level, move toward the player, and then attack on approach. The player must dispatch all enemies and see how long they can survive the level. We still have the gameplay to implement, and whenever creating VR content, I like to make the project compatible with both VR and a standard first-person controller, both to help debugging and to aid testing without a headset.
But, before moving forward with development, let's prepare for VR development generally. This section uses the Oculus Rift S device, although the development workflow is similar for Oculus Go and Oculus Quest. To get started, you'll need to connect and install your Oculus Rift device. Instructions for doing this can be found online at https://support.oculus.com/1225089714318112/.
With the Oculus hardware connected, back in Unity, do the following:
Selecting the Oculus checkbox will automatically install the Oculus XR Plug-in Package. You can confirm this by checking in the Package Manager.
You can test whether this configuration has found your device simply by clicking play on the toolbar. The orientation of the Head Mounted Display (HMD) will automatically control the scene camera, so you'll be able to look around in VR. If this doesn't work, ensure the Oculus device is connected, its software is installed, and it can play VR content normally.
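A small diagnostic script can confirm the same thing from code, which is useful when a scene loads on hardware you can't see:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs whether an HMD was detected when the scene starts.
public class VRDeviceCheck : MonoBehaviour
{
    void Start()
    {
        if (XRSettings.isDeviceActive)
            Debug.Log("VR device active: " + XRSettings.loadedDeviceName);
        else
            Debug.LogWarning("No VR device detected - check the headset connection and plugin setup.");
    }
}
```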
Tip
If you don't have a VR device, you can still test the application by selecting the Unity Mock HMD, as shown in Figure 11.42. For more information on how to set up testing without a device, take a look at https://docs.unity3d.com/Packages/[email protected]/manual/index.html.
If all you wanted to do was look around in VR wearing the HMD, then you'd be done already! Unity makes it that easy to get up and running in VR. However, we're missing interactivity. To achieve more complex behaviors with Oculus Rift S, we'll import additional Asset Packages made freely available to us on the Asset Store from Oculus Technologies:
Once imported, you'll see a range of folders added to the Project panel in the Oculus folder. Many of these were previously separate asset packages but have since been integrated into one for ease of use:
Critically important is the Oculus/VR/Prefabs folder, which contains an Oculus player character called OVRPlayerController. This controller works much like FPSController but has VR-specific functionality. Let's add this to the scene now:
After adding OVRPlayerController, new cameras will be added to the scene, one for the left and one for the right eye. Your post-processing effects won't work with these cameras by default. To make them work, do the following:
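One way to handle this without configuring each eye camera by hand is a small setup script on the player controller. It assumes the same "PostProcessing" layer used earlier and simply repeats our earlier camera configuration for every camera it finds:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Attach to OVRPlayerController: gives every child camera a post-process layer.
public class VRPostFxSetup : MonoBehaviour
{
    void Start()
    {
        foreach (var cam in GetComponentsInChildren<Camera>())
        {
            var layer = cam.gameObject.AddComponent<PostProcessLayer>();
            layer.volumeLayer = LayerMask.GetMask("PostProcessing");
            layer.volumeTrigger = cam.transform; // blend based on each eye camera's position
        }
    }
}
```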
We'll next give the player character some hands using the Oculus Avatar prefab, which integrates directly into the player controller and works using input from the Oculus Touch controllers. This asset provides a mesh representation of the location and the status of the hands, as understood from player input within the tracking space (the play area).
Excellent. We're now up and running with Oculus VR, ready to implement core functionality for the first-person shooter game, and we'll do precisely that in the next chapter.
Summary
Great work. So far, we've imported all project assets, configured those assets, set up the environment, and calibrated the lighting.
Using several Spot Lights to create interior lighting, as well as emissive materials applied to wall panels to make them glow, we created an atmospheric interior scene. The lighting was further enhanced by the addition of several Light Probes scattered throughout the scene. These Light Probes will come into their own as we add more dynamic objects to the project.
As well as the lighting, we implemented the post-processing stack. As we learned, post-processing effects are filters and adjustments applied to the pixels of the scene camera, and setting them up involved several steps: importing the post-processing package, creating a volume that encompasses the level, configuring the camera by adding a post-processing layer, and creating a post-processing profile with the desired effects added to it.
This knowledge can be used in any of your future projects to add a certain level of visual flair or polish that can make a huge difference.
We then moved onto preparing the project for VR, which involved installing the necessary packages and setting up input using touch controllers. This preparation is essential for the next chapter, where we'll continue this project by adding enemies and a projectile system controlled using VR.
Q1. XR stands for...
A. Extended Reality
B. Extensible Realism
C. X-Rated
D. Expansive Reality
Q2. Light Probes are useful for…
A. Adding indirect illumination to moving objects
B. Light-mapping non-UV objects
C. Unwrapping UV sets
D. Adding soft shadows on static objects
Q3. Emissive lighting can…
A. Break easily
B. Cast light from mesh surfaces via materials
C. Hide static objects
D. Customize the Object Inspector
Q4. The post-processing stack version 2 includes…
A. Terrain tools
B. Particle systems
C. Multiple effects in a single post-processing pipeline
The following resources include more information on VR: