Virtual Reality Game Menu Create (Oculus Go)

This is a content archive of a course that I wrote a few years ago. It is not guaranteed to be up to date. But some of the information may still be useful.

Part 1 - Introduction and Prerequisite

Prerequisite

To get the most out of this article you should know how to create and set up an Oculus Go app using Unity. If not, you should first read Oculus Go Unity Setup (Quick Start).

To succeed in this course you should have access to an Oculus Go Headset and a Windows machine that can run Unity.


Introduction to Gaze Event Triggers

In Unity, an event trigger is a function that is called (triggered) when an event happens. A gaze event trigger is an Oculus extension that triggers events when a user gazes at a game object.

The user can trigger a gaze event by turning their head until the target object is in the center of view. In other words, directly in the center of their gaze. By detecting such events an action can be triggered without requiring the user to use the controller. They can select objects by simply gazing at them.

Gaze Selection

To understand why gaze selection is so important you have to look at some of the original VR frameworks, like Google Cardboard.

VR headsets built for the original Google Cardboard have no controller. The only capability is to click a button on the headset. This presents several issues:

  • Bumping the headset could cause the phone to become misaligned or worse, fall out
  • When grabbing the headset to click the button, users may accidentally press the volume keys
  • Some headsets are actually made out of cardboard, including the trigger, which is awkward, spongy, and far from precise
  • Loading the phone into the headset can cause the clicker to trigger unexpectedly
  • Users may end up fumbling trying to find the trigger
  • People with limited capabilities may have trouble finding and clicking the trigger

Issues also arise when people want to restrict their movements so they don’t bump into or hit other people, such as when sitting on an airplane or in bed.

For those reasons and more, the use of gaze selection as an alternative to using a trigger became popular. Even though the Oculus Go comes with a much-needed controller, it can be nice to provide an alternative - especially if the controller battery dies or it loses its connection. It also makes it easier for people who have trouble using their hands.

How gaze selection works

In VR the headset is tied to the camera. The camera in Unity is a special kind of game object. Like all game objects, you can tell at any time what the position and orientation of the camera is. Using this information you can calculate what other objects are in the camera’s center of vision. Then you can write scripts to change or select those objects when the user looks directly at them.
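The calculation described above can be sketched with Unity’s built-in raycasting. This is only an illustration of the idea, not the Oculus implementation - the class name and field here are made up for the example:

```csharp
using UnityEngine;

// Sketch: cast a ray from the camera's position along its forward
// direction and report what sits in the center of vision.
// (Hypothetical example class; the real Oculus scripts do more work.)
public class GazeProbe : MonoBehaviour {

    public Camera eyeCamera;   // assign the VR camera in the Inspector

    void Update() {
        Ray ray = new Ray(eyeCamera.transform.position,
                          eyeCamera.transform.forward);
        RaycastHit hit;

        // If the invisible "laser" from the camera hits a collider,
        // hit.collider tells us which object the user is gazing at.
        if (Physics.Raycast(ray, out hit, 100.0f)) {
            Debug.Log("Gazing at: " + hit.collider.gameObject.name);
        }
    }
}
```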

The equivalent to mouse hover

When a user looks at something you can write a script that provides the 3D equivalent to hovering. No doubt you’ve used desktop and browser apps where an object or link changes state when the user hovers their mouse over it. You can do similar things in 3D.

Like with a browser link you could pop up a text box or change the object’s color. But in 3D you may want to try something else, like increasing the scale of the object to 110% or spinning it to give the user a visual clue that the object has focus.
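As a rough sketch of that idea (with made-up names - the course’s real script appears later), scaling an object to 110% while it has focus could look like this:

```csharp
using UnityEngine;

// Sketch of 3D "hover" feedback: grow the object to 110% of its
// original size while it has gaze focus. (Illustrative only.)
public class HoverGrow : MonoBehaviour {

    Vector3 baseScale;

    void Start() {
        // Remember the original size so we can restore it
        baseScale = transform.localScale;
    }

    // Wire these to gaze enter/exit events
    public void OnGazeEnter() {
        transform.localScale = baseScale * 1.1f;
    }

    public void OnGazeExit() {
        transform.localScale = baseScale;
    }
}
```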


Part 2 - Create an App Project and Scene

Create a lab app

For this course I’m going to have you create a new lab app in Unity. It will be a playground where I can introduce you to techniques and scripts without making you build a new app for every exercise.

If you took the prerequisite course, you should know how to create a basic app in Unity that you can run on the Oculus Go.

Instead of walking you through the steps again, I’m going to provide you with a copy of the checklist from that course below.

  • Create a new 3D app project in Unity and give it a name
  • In File / Build Settings set the project type to Android
  • Click Player Settings …
  • Click on the Android tab in the Inspector (or click Switch Platform)
  • Set your Company Name (make one up, or just use your name)
  • Expand Other Settings and set the Package Name (com.example.name)
  • Set the Minimum API to Android ‘Nougat 7.1’ (API level 25)
  • Expand XR Settings and check Virtual Reality Supported
  • Click the plus (‘+’) button and add Oculus
  • Build the app (to save the settings)
  • Create a builds folder under your project and store the *.apk file there

Download and Import the Oculus Utilities

You don’t have to download the latest package from Oculus every time you build a new app. But when starting a new project you may want to go back and see if there has been an update:

https://developer.oculus.com/downloads/package/oculus-utilities-for-unity-5/

If the package has been updated, feel free to download and unzip it.

Extract the Utilities Download

  • In File Explorer right-click on the download and select Extract All…
  • Expand the OculusUtilities folder (in the extracted folder, not the zip file!)

Import the Utilities into the Project

  • Drag the OculusUtilities.unitypackage file (the extension may be hidden) onto the Unity Project window Assets folder
  • If it won’t go, make sure you aren’t dragging from the zip file folder view
  • The Import Unity Package dialog box should pop up (if you have multiple screens it may pop up on another screen)
  • Click the Import button
  • If you get a warning titled Update Oculus Utilities Plugin, click Yes
  • If you get a Restart Unity popup, click Restart

Oculus Sample Framework for Unity 5

For this example you also need to import some components from the Oculus Sample Framework for Unity 5. You can download it from here:

https://developer.oculus.com/downloads/package/oculus-sample-framework-for-unity-5-project/

In your project all you need to import is the OVR Inspector.

  • In the Project panel, right-click Assets / Import Package / Custom Package …
  • Select the OculusSampleFramework package and click Open
  • In the Import Unity Package dialog click None to deselect everything
  • Find OVRInspector in the list and check it
  • Click Import

Troubleshooting

If the Import Unity Package dialog doesn’t come up, make sure it didn’t end up on another screen or under another window.

If for some reason drag and drop doesn’t work you can import the package by doing the following:

  • Right-click in the Project window on the Assets folder
  • Select Import Package / Custom Package …
  • Browse to the location of the *.unitypackage file
  • Click Open
  • Follow the remaining steps above

Create a gaze scene

Save a new scene

Let’s start this exercise by creating a new scene. Later when you build the app you will need to remember to add this new scene to the build settings.

In Unity save the current scene as GazeScene:

  • Click File / Save Scene as …
  • Be sure to save it in the Scenes folder
  • Save as GazeScene

Replace the camera

Oculus provides a replacement for the Main Camera called OVRCameraRig (included when you imported the Oculus Unity package). Delete the main camera and replace it with the new camera using these steps:

  • Right-click on Main Camera in the Hierarchy panel and select Delete
  • Drag the Assets / Oculus / VR / Prefabs / OVRCameraRig into the Hierarchy
  • Set the OVRCameraRig position in the Inspector to X: 0, Y: 1, Z: -10

Add the Oculus Physics Raycaster component

In 3D Unity projects a physics raycaster can be used to detect whether an object exists in a certain direction. It does this by acting as if a ray is being sent out in that direction and calculating whether it would hit anything. Think of it like pointing a laser pointer at an object and detecting where they intersect, except that the laser beam is invisible.

Just as Oculus created their own camera for VR, they also created their own special modified version of the physics raycaster, which exists as a script component. To work with Oculus you need to add their special raycaster component to their camera rig using the following steps:

  • Click on the OVRCameraRig in the Hierarchy panel
  • Click the Add Component button in the Inspector window
  • Type OVR Physics Raycaster and select it from the list

Add an event system to the scene

Unity has an event system that handles input, raycasting, and sending events. To use it you need to add an EventSystem object to the scene’s Hierarchy. If you’ve done 2D development in Unity you may have seen this added automatically when you added a Canvas to a 2D scene. You can also use an EventSystem game object in 3D projects by doing the following:

  • Drop down the Create menu in the Hierarchy panel
  • Select UI / Event System

Replace the Input Modules

Just as Oculus has its own camera and physics raycaster, it also has its own input module. To use it you need to replace the Event System’s Standard Input Module component with the Oculus version by doing the following:

  • Click on EventSystem in the Hierarchy panel
  • In the Inspector panel remove Standard Input Module (click on the gear menu and select Remove Component)
  • Click the Add Component button in the Inspector window
  • Type OVR Input Module and select it from the list

Map the camera to the event system

  • Expand OVRCameraRig / TrackingSpace in the Hierarchy
  • Click on EventSystem in the Hierarchy to make it the focus of the Inspector panel
  • Drag OVRCameraRig / TrackingSpace / CenterEyeAnchor to the EventSystem / OVRInputModule / Ray Transform field in the Inspector panel

Part 3 - Add Game Objects to the Scene

Add a gaze pointer to the scene

What is a reticle?

The most well-known reticle is the target-like image that you see when looking through the scope of a rifle or a missile launcher. It helps make sure that you are aiming at the right thing.

Along with a reticle you can provide a way of visually indicating that an object is the target, such as rotating it or scaling it.

Another way to indicate that an object is the target is to draw a line to it, like a laser. But that works better when you are targeting with a controller and not your gaze.
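A laser-style line like that can be sketched with Unity’s LineRenderer component. This is illustrative only - the class name and field are made up for the example:

```csharp
using UnityEngine;

// Sketch: draw a "laser" line from a pointer (e.g. a controller
// anchor) to the point it is aimed at. Requires a LineRenderer
// component on the same game object. (Hypothetical example class.)
[RequireComponent(typeof(LineRenderer))]
public class LaserLine : MonoBehaviour {

    public Transform pointer;  // assign the controller anchor

    void Update() {
        LineRenderer line = GetComponent<LineRenderer>();
        Vector3 start = pointer.position;
        Vector3 end = start + pointer.forward * 10.0f;

        // Stop the beam at the first object it hits, if any
        RaycastHit hit;
        if (Physics.Raycast(start, pointer.forward, out hit, 10.0f)) {
            end = hit.point;
        }

        line.SetPosition(0, start);
        line.SetPosition(1, end);
    }
}
```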

Create a gaze pointer with a reticle

In this section I’m going to show you a simple way to create a gaze pointer.

Add the gaze pointer ring

  • In the Project panel Assets folder drag OVRInspector / Resources / Prefabs / GazePointerRing into the Hierarchy

Map to the camera

  • Drag OVRCameraRig / TrackingSpace / CenterEyeAnchor to the GazePointerRing / OVR Gaze Pointer / Ray Transform field in the Inspector panel

Download arrow-left.zip

Download this file to a folder.


Add a cube to the scene

Add a cube to the scene using the following steps:

  • Click GameObject / 3D Object / Cube
  • In the Inspector set the cube Scale to X: 3, Y: 3, Z: 3

Create a Graphics Assets folder

Make sure that you downloaded and extracted the arrow-left.zip file attached to this course.

  • In the Project / Assets folder create a new subfolder called Graphics
  • Drag the arrow-left.png file that you downloaded and extracted into the Graphics folder
  • Drag arrow-left from the Graphics folder onto Cube in the Hierarchy panel

Part 4 - Wire up the Scene

Create a visual feedback script

Besides a reticle, you can provide an additional way to indicate that a target has been selected by gazing at it.

You could:

  • Rotate the target object
  • Scale the target object
  • Change the material or color of the target object
  • Draw a border around the object
  • Put a cursor above, below or in front of the object (think of the spinning indicators over characters in the Sims)
  • Swap the current object with a similar object drawn or animated to show that it has been selected

Add a Script to the Cube

In this example I am going to show you how to add a script that will rotate an object when it is selected by gazing at it. It will also scale the object when it is gaze selected and the user clicks the controller or a button on a game controller. I will leave adapting the script to provide other forms of visual feedback as an exercise for the reader.

Add a new C# script component to the cube and call it GazeSelect. Then replace the code with the listing below.

Listing GazeSelect.cs

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class GazeSelect : MonoBehaviour {

    float speed = 30.0f;    // rotation speed in degrees per second
    bool spin = false;      // true while the user is gazing at the object
    bool bScale = false;    // toggled by a click while gaze selected

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {

        // Scale to 4 when clicked, back to 3 when clicked again
        float sc = bScale ? 4.0f : 3.0f;

        transform.localScale = new Vector3(sc, sc, sc);

        if (spin)
        {
            transform.Rotate(Vector3.up, speed * Time.deltaTime);
        }
    }

    // Wired to the OVR Gaze Event Trigger On Enter () event
    public void GazeFocusStart() {
        spin = true;
    }

    // Wired to the OVR Gaze Event Trigger On Exit () event
    public void GazeFocusEnd() {
        spin = false;
    }

    // Wired to the Event Trigger PointerClick event
    public void GazeFocusClick() {
        bScale = !bScale;
    }
}

Add event trigger components

Add an event trigger component by doing the following:

  • Click on Cube in the Hierarchy
  • Click the Add Component button in the Inspector window
  • Type Event Trigger and select it from the list

Map the click event

  • Click Add New Event Type
  • Select PointerClick
  • Set the object to Cube
  • Set the function to GazeSelect.GazeFocusClick

You could also add events for pointer enter and exit. But that will be handled in the next step using the Oculus provided component.

OVR Gaze Event Trigger script

  • Click on Cube in the Hierarchy
  • Click the Add Component button in the Inspector window
  • Type OVR Gaze Event Trigger and select it from the list
  • Set the On Enter () object to Cube
  • Set the function to GazeSelect / GazeFocusStart
  • Set the On Exit () object to Cube
  • Set the function to GazeSelect / GazeFocusEnd

Part 5 - Test the Gaze Scene

Play the app in the Unity editor

To test the app in the Unity editor, do the following:

  • Click the play button in the Unity editor
  • Expand OVRCameraRig in the Hierarchy
  • Click TrackingSpace
  • In the Inspector put your mouse over the Rotation Y label (label, not the value)
  • Drag your mouse around the screen
  • Repeat for the X and Z values
  • When over the cube, click the mouse in the Game panel and hit the space bar to simulate a click

Stop playing

Before moving forward be sure to turn off the play button. Otherwise changes you make will be lost.

Did you notice a problem?

By the time you read this the Oculus code may have changed. But if it hasn’t you may have noticed an issue with the reticle. When the cube rotates, the reticle overlaps the cube. This causes parts of the reticle to disappear. If it’s still happening, I’ll show you a bit later how to fix the Oculus code.


Create and use a prefab

Create a prefab

  • If it doesn’t exist already, create a folder called Prefabs directly under the Project panel’s Assets folder
  • Drag the Cube from the Hierarchy to the Prefabs folder
  • In the Inspector for the Cube you should see a new row of buttons near the top labeled Prefab

Create more cube instances

  • In the Assets / Prefabs folder drag the Cube prefab to the Hierarchy - twice
  • This should create two new cubes (Cube (1) and Cube (2)) but they occupy the same space as the existing cube
  • Set the Pos X value of Cube (1) to 5
  • Set the Pos X value of Cube (2) to -5

Play the scene again

Play the scene again, dragging the OVRCameraRig / TrackingSpace positions to move the reticle over each cube. Click in the Game panel and hit the space bar to simulate a click.

Stop playing

When done, click the play button again to stop playing. Otherwise changes you make may be lost.


Test the scene on the headset

Do the following to test the scene on the headset:

  • Plug the headset into your computer
  • Click File / Build Settings …
  • Click Add Open Scenes
  • Uncheck all other scenes
  • Click Build and Run
  • Wait for the build to finish and put on your headset
  • Move your head so that the reticle is over one of the cubes
  • The cube should start to rotate
  • When the reticle is over the cube, click the controller trigger to toggle the scale of the cube

Game Controller Support

The buttons also work with game controllers. For example if you have an Xbox controller paired with the headset, pressing the (A) button will work.


Fixing OVRGazePointer.cs

Note that this fix may not work or be necessary in future versions of the script if Oculus changes the file.

One thing you will notice is that when the cube rotates, the reticle overlaps it and part of it disappears. Rotating in another direction doesn’t solve the problem. Even removing the rotation doesn’t solve the problem when the user moves their gaze away.

The simplest way to solve the problem is to hack the script Oculus provides for the gaze pointer.

You can hack the OVRGazePointer.cs file by doing the following:

  1. Click on GazePointerRing in the Hierarchy so that it has focus
  2. In the Inspector click the gear shaped icon next to OVR Gaze Pointer (Script) and select Edit Script
  3. In the SetPosition method, change this line:

    depth = (rayTransform.position - pos).magnitude;

To this and save the file:

depth = (rayTransform.position - pos).magnitude / 2;

If you reimport the Oculus package again the changes may be overwritten.

Another option is to clone their script and hack the cloned version. Then replace the current script on GazePointerRing with the new script. You will still run into an issue if you import the Oculus package with GazePointerRing again. You will have to replace the script again.

Yet another option would be to build your own version of GazePointerRing from scratch and save it as a prefab or package.

How you want to handle it is up to you.


Part 6 - 2D Menus

2D menus in a VR world

You could easily adapt the code from the previous section on gaze selection to get 3D objects to act as menu items. The user could gaze at a game object and select it to trigger something, like a new scene loading.

A simpler option is to create a flat 2D menu on a panel and float it in front of the user in the VR world.


Create a Gaze Canvas Scene

Create a new scene

  • From the main menu select File / New Scene
  • Again from the main menu select File / Save Scene as …
  • Save the scene in the Scenes folder as GazeCanvasScene

Replace the camera

  • Right-click on Main Camera in the Hierarchy panel and select Delete
  • Drag the Assets / Oculus / VR / Prefabs / OVRCameraRig into the Hierarchy
  • Set the OVRCameraRig position in the Inspector to X: 0, Y: 1, Z: -150

Add the Oculus physics raycaster to the camera rig

  • Click on the OVRCameraRig in the Hierarchy panel
  • Click the Add Component button in the Inspector window
  • Type OVR Physics Raycaster and select it from the list

Add a canvas

Add a canvas to the scene

  • In the Hierarchy drop down the Create menu and select UI / Canvas

Set the component properties for the canvas

  • Set the Canvas component Render Mode to World Space
  • In the Inspector panel set the Rect Transform component values to Pos X: 0, Pos Y: 0, Pos Z: 0
  • Set the Width to 100 and the Height to 100

Map the camera

  • In the Hierarchy panel expand OVRCameraRig
  • In the Hierarchy click on Canvas
  • Drag OVRCameraRig / TrackingSpace / CenterEyeAnchor to the Canvas component Event Camera field in the Inspector panel

Replace the default canvas raycaster

  • In the Canvas Inspector panel remove the Graphic Raycaster (Script) component (click on the gear menu and select Remove Component)
  • Add an OVR Raycaster component

Configure the EventSystem

When the Canvas object was added to the scene, Unity also added an EventSystem object to the Hierarchy.

  • Click on EventSystem in the Hierarchy panel
  • In the Inspector panel remove Standard Input Module (click on the gear menu and select Remove Component)
  • Click the Add Component button in the Inspector window
  • Type OVR Input Module and select it from the list

Map the camera to the event system

  • Expand OVRCameraRig / TrackingSpace in the Hierarchy
  • Click on EventSystem in the Hierarchy to make it the focus of the Inspector panel
  • Drag OVRCameraRig / TrackingSpace / CenterEyeAnchor to the EventSystem / OVRInputModule / Ray Transform field in the Inspector panel

Add a gaze pointer ring to the scene

Do the following to add a gaze pointer ring prefab:

  • Drag the OVRInspector / Resources / Prefabs / GazePointerRing prefab to the Hierarchy
  • Expand GazePointerRing in the Hierarchy
  • Set the scale of GazeIcon to X: 1.5, Y: 1.5, Z: 1.5

Map to the camera

  • Click on GazePointerRing in the Hierarchy to put its Inspector panel in focus
  • Drag OVRCameraRig / TrackingSpace / CenterEyeAnchor to GazePointerRing / OVR Gaze Pointer (Script) / Ray Transform in the Inspector panel

Create a 2D menu

Add an empty object to the canvas

  • Right-click on Canvas in the Hierarchy
  • Select Create Empty to create a child game object
  • In the Inspector panel rename the new game object Menu
  • Set the scale to X: 0.5, Y: 0.5, Z: 1

Add a panel

  • Right-click on Menu in the Hierarchy
  • Select UI / Panel to create a child object of the menu

Add a button

  • Right-click on Panel in the Hierarchy
  • Select UI / Button to create a child object of the panel
  • If it isn’t already the focus, click on Menu / Panel / Button in the Hierarchy to set its Inspector panel in focus
  • Rename the button Start
  • Set the Y position to 25
  • Set the Width to 80
  • Click on the Button (Script) component’s Highlighted Color field to pick a color that stands out (like light green)
  • Do the same for the Normal Color, picking a different color
  • Set the Navigation to None, otherwise the button will retain the Highlighted Color even when the reticle is no longer over it
  • Expand the Start button in the Hierarchy to reveal the child Text object
  • Click on the child Text object in the Hierarchy
  • Set the Text object’s Text component’s Text field to Start

Duplicate the button

  • Right-click on the Start button in the Hierarchy
  • Select Duplicate
  • Rename the new button to Stop
  • Change the button text to Stop
  • Set the Y position to -25

Test the menu in the Unity editor

Do the following to test the menu in the Unity editor:

  • Click the play button at the top of the Unity editor.
  • You should see the gaze pointer reticle in the middle of the menu panel
  • In the Hierarchy, while still in play mode, click on OVRCameraRig / TrackingSpace to make it the focus of the Inspector panel
  • Drag the mouse over the label (not the value) for the Transform component Rotation X field
  • Drag the mouse up and down to put the reticle over the buttons
  • The buttons should change to the Highlight Color when the reticle is over them
  • The buttons should switch back to their Normal Color when the reticle is moved away

Part 7 - Control a cube with a menu

Add a cube to the scene

Add a cube to the scene using the following steps:

  • Click GameObject / 3D Object / Cube
  • In the Inspector set the name to Cube2
  • Set the cube position to X: -100, Y: 0, Z: 0
  • Set the cube Scale to X: 50, Y: 50, Z: 50
  • Drag from the Project panel Assets / Graphics / arrow-left onto Cube2 in the Hierarchy panel

Create a GazeClick script

Add a new C# component to Cube2 and name it GazeClick. Then replace the contents of the script with this code:

Listing GazeClick.cs

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class GazeClick : MonoBehaviour {

    float speed = 60.0f;    // rotation speed in degrees per second
    bool spin = false;      // toggled by the Start and Stop buttons

    // Use this for initialization
    void Start()
    {

    }

    // Update is called once per frame
    void Update()
    {

        if (spin)
        {
            transform.Rotate(Vector3.up, speed * Time.deltaTime);
        }
    }

    // Wired to the Start button's On Click () event
    public void OnClickStart()
    {
        spin = true;
    }

    // Wired to the Stop button's On Click () event
    public void OnClickStop()
    {
        spin = false;
    }
}

Wire up the buttons

Wire up the Start button

  • Click on the Start button in the Hierarchy to highlight it in the Inspector panel
  • In the Button (Script) component add a new On Click () function (click the + sign)
  • Drag Cube2 to the On Click () object field
  • Set the function to GazeClick / OnClickStart

Wire up the Stop button

Repeat the steps above for the Stop button, but wire the function to GazeClick.OnClickStop


Test the modified menu in the Unity editor

You should now be able to start and stop the cube rotation using the menu buttons

To test the menu in the Unity editor, do the following:

  • Click the play button at the top of the Unity editor.
  • You should see the gaze pointer reticle in the middle of the menu panel
  • In the Hierarchy, while still in play mode, click on OVRCameraRig / TrackingSpace to make it the focus of the Inspector panel
  • Drag the mouse over the label (not the value) for the Transform component Rotation X field
  • Drag the mouse up to put the reticle over the Start button
  • Click in the Game scene panel so that it has focus
  • Press the Space bar to simulate a click
  • Repeat for the Stop button
  • The buttons should change to the Highlight Color when the reticle is over them
  • The buttons should switch back to their Normal Color when the reticle is moved away

Test the scene on the headset

Do the following to test the scene on the headset:

  • Plug the headset into your computer
  • Click File / Build Settings …
  • Click Add Open Scenes
  • Uncheck all other scenes
  • Click Build and Run
  • Wait for the build to finish and put on your headset
  • Move your head so that the reticle is over the Start button and then press the trigger on the controller
  • The cube should start to rotate
  • Do the same for the Stop button and the cube should stop

Game Controller Support

The buttons also work with game controllers. For example if you have an Xbox controller paired with the headset, pressing the (A) button will work.


Wrapping up

Now that you know how to use 2D and 3D objects as menus in VR here are some things to try:

Create a menu that looks like an arcade

Create a scene containing structures that look like booths at an arcade. When the user gazes at a booth and clicks the controller, transport them to a new scene.

Create a menu that looks like a list

Create a scene containing a menu in the shape of a floating list. When the user clicks a menu item, transport them to a new scene.

Advanced users could try things like audio and particle effects when the user clicks the menu item.
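Both of those ideas come down to loading a new scene when a menu item is selected. A minimal sketch using Unity’s SceneManager (the scene name here is made up; any scene you load must be added in File / Build Settings):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: load a new scene when a menu item is selected.
// Wire LoadTargetScene to a button's On Click () event, or call it
// from a gaze click handler. "ArcadeGame" is a hypothetical scene
// name; the scene must be listed in the build settings.
public class MenuSceneLoader : MonoBehaviour {

    public string sceneName = "ArcadeGame";

    public void LoadTargetScene() {
        SceneManager.LoadScene(sceneName);
    }
}
```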



About the Author

Mitch Allen has worked for software companies in Silicon Valley, along Boston’s Route 128 and in New York’s Silicon Alley. He currently works for a robotics company in Massachusetts.