Add locomotion to the Mozilla Unity WebXR example - tutorial / copy-and-paste script

What this is

This is a pretty much plug-and-play script and tutorial for adding locomotion to the WebXR example from Mozilla.

By the end of the implementation, you will:

  • be able to navigate a virtual space with one or both hand controllers
  • move forwards and backwards at the speed you choose (depending on how far the thumbstick is tilted), in the direction your headset is facing
  • snap-turn by the number of degrees set in the code

Background

My next VR experiment will most likely have to run online, due to the implications of Covid-19 for in-person research (especially when that research involves strapping a face mask to the faces of tens of participants).

Thankfully (and coincidentally), Mozilla have recently updated their Unity-WebXR exporter tool.

Running a decent VR experience directly in the browser seems like a natural place for research experiments, as it stops users having to go to the trouble of downloading VR builds to install on their systems, and it should make it easier to collect additional data about participants.

While the implications and challenges of conducting remote VR research are currently unclear, it doesn't mean we can't get up and running.

Mozilla Unity->WebXR Example

Mozilla provide a download-and-run Unity project to get started with their WebXR exporter, giving you the following features:

  • Roomscale headset tracking
  • Hand tracking
  • Trigger and grip button controls
  • Interactable items (which you can grab)

This is a great start, but it misses out locomotion - you're totally unable to move in the space. For me, this was limiting, so I wanted to create a plug-and-play script to add this functionality for myself and others who might need it. The result is one script, plus a little fiddling around the edges. Check out the tutorial below.

How to add locomotion + snapturns

1. Get started with the Mozilla example

First, head to https://github.com/MozillaReality/unity-webxr-export and download the example. It's a complete Unity project, so open it like you would a normal one - they recommend Unity 2019.3.

They have a walkthrough on the GitHub page for how to set this part up, so I'm leaving it in Mozilla's capable hands.

2. Find the Hand gameobjects

Once you have the example open and running, it's time to add locomotion. First, find the handL gameobject - it's a child of WebXRCameraSet.

3. Open up its input map

After selecting the hand object, look at the Inspector on the right. In the script “Web XR Controller”, there is an Input Map called XRLeftController. Double click on that to open up the controller script.

4. Add two new inputs

After opening the XRLeftControllerMap, it'll now take over the Inspector window. Make sure “Inputs” is expanded, and change the size from 2 to 4. This adds two new inputs for us to access.

5. Name the inputs

The two new inputs will appear at the bottom. We need to name them with terms that Unity's Input Manager will recognise, so we must call them Horizontal and Vertical. Check the picture below and match the settings.

6. Create a Locomotion.cs script

We now need to create a script to turn those inputs into actions. Right click anywhere in the Project window (I like to put it inside the Scripts folder, because I'm RPing a lawful good developer) and go to Create → C# Script. Call it Locomotion.cs and open it up, ready to edit.

7. Copy my code

Copy the code below into the file. If you're curious how it works, it is commented in-line.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using WebXR;
 
[RequireComponent(typeof(WebXRController))]
public class Locomotion : MonoBehaviour
{
    private WebXRController controller;
    public float speed = 0.2f;
    private bool snapTurnDebounce = false; //Stops a held thumbstick from snap-turning every frame
    GameObject player, head; //The player rig and the headset camera, found in Start()
 
    // Start is called before the first frame update
    void Start()
    {
        //Get the GameObject that is the player's avatar
        player = GameObject.Find("WebXRCameraSet");
 
        //Get the object that represents the real-world headset (e.g the main camera)
        head = GameObject.Find("CameraMain");
 
    }
 
    // Update is called once per frame
    void Update()
    { 
 
        //Read float values (which range from -1 to 1) from the two thumbstick axis (forward/back, left/right)
        float forwardBack = Input.GetAxis("Vertical");
        float leftRight = Input.GetAxis("Horizontal");
 
        //SNAP TURNING
        //Snap turning on thumbstick being pushed left or right
        //With debounce to stop the snap turning from going forever
        float snapTurnThreshold = 0.5f; //How far thumbstick stick is pushed before snapping happens
        float snapTurnAmount = 30; //How far (in degrees) the player rotates each snapturn
 
        if (leftRight > snapTurnThreshold && snapTurnDebounce == false)
            {
                turnAndAdjust(snapTurnAmount);
            }
 
        else if (leftRight < -snapTurnThreshold && snapTurnDebounce == false)
            {
                turnAndAdjust(-snapTurnAmount);
            }
 
        else if (leftRight < 0.1 && leftRight > -0.1 && snapTurnDebounce == true)
            {
                snapTurnDebounce = false;
            }
 
        //FORWARD/BACKWARD LOCOMOTION
 
        float locomotionThreshold = 0.2f; //Only start moving after the thumbstick has been pushed this far
        float locomotionSpeed = 2f; //Adjusts the speed in response to how far thumbstick is pushed
 
        if ((forwardBack > locomotionThreshold || forwardBack < -locomotionThreshold))
            {
                float distance = forwardBack * locomotionSpeed; //Scale movement by how far the thumbstick is pushed
 
                float tempY = player.transform.position.y; //Get current Y position so we can put it back in to avoid flying (locks our player to the ground plane)
                Vector3 movement = player.transform.position + head.transform.forward * distance * Time.deltaTime; //Take the player movement, apply transformations
                movement.y = tempY; //Re-introduce previous Y value
                player.transform.position = movement; //Apply movement to player object
 
            }
 
    }
 
 
    private void turnAndAdjust(float i)
    {
 
        //This is actually trickier than it seems. This is because the "player" actually represents the centre of the player's initial starting area,
        //according to the centre of their VR setup in their home.
        //And the "camera" and hands that you use in VR are relative to this initial starting area and its center.
        //E.g. if you and your headset are 1 meter forward and 1 meter to the right of the center of your playspace in the real world, 
        //the headset "camera" will be 1m forward and 1m right of the position of the "player" in Unity.
        //You also can't rotate the "camera" object directly, as it is 1-1 fixed to the headset's position and rotation in the real world space
        //If you rotate the "player", the "camera" actually moves position.
        //E.g. if your player is at X: 0, Z: 0, but your camera is at X: 1, Z:1 (because you and your headset are 1m forward and to the right in the real world)
        //When you rotate the player 90 degrees, the camera (as a child) moves to X: -1, Z: 1. 
        //SO we need to understand how the camera moves position, and where, and then move the player object to compensate for this movement, so it feels like
        //the camera has rotated but not moved. Phew.
 
        Vector3 originalPosition = head.transform.position; //Store the head/camera position
        player.transform.Rotate(0, i, 0); //Rotate the player area
        Vector3 newPosition = head.transform.position; //Get the new head/camera position
        Vector3 difference = newPosition - originalPosition; //Calculate how much the head/camera moved after player was rotated
        player.transform.position = player.transform.position - difference; //Move the player by the difference above, to offset the head/camera position change caused by the rotation
        snapTurnDebounce = true;
    }
}
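
As an aside, Unity's built-in Transform.RotateAround can do the same rotate-and-compensate in a single call. The version below is just a sketch of an equivalent turnAndAdjust, not what the script above uses, but it should behave the same:

    private void turnAndAdjust(float i)
    {
        //Rotate the player rig around the headset's current world position, about the world up axis.
        //The headset stays where it is in the world; only the facing direction changes.
        player.transform.RotateAround(head.transform.position, Vector3.up, i);
        snapTurnDebounce = true;
    }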

8. Add the Locomotion.cs script to the handL object

As the subheader says, add the Locomotion.cs script to the handL object.
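
If you'd rather do this from code than drag-and-drop in the editor, a minimal bootstrap like the one below works too. This is just a sketch - the name handL comes from the Mozilla example hierarchy, so adjust it if you've renamed things.

using UnityEngine;
 
//Optional alternative to dragging Locomotion.cs onto the hand in the editor:
//attach this to any object in the scene and it adds Locomotion to handL at startup.
public class LocomotionBootstrap : MonoBehaviour
{
    void Start()
    {
        GameObject handL = GameObject.Find("handL");
        if (handL != null && handL.GetComponent<Locomotion>() == null)
        {
            handL.AddComponent<Locomotion>();
        }
    }
}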

9. Change the Unity Input Manager settings

This example plugs into the Unity Input Manager to interpret controller inputs, so to get the behaviour we want from our thumbsticks, we need to adjust the Input Manager settings.

Go to Edit → Project Settings:

Then choose “Input Manager”:

Now find the second entries for Horizontal and Vertical. The first entries handle buttons, but ours are joystick axes (axe-ees?).

You can fiddle with these, but I like Gravity (1000), Dead (0.01) and Sensitivity (1.2). Gravity is how quickly the axis value falls back to zero once you release the stick - the higher the gravity, the faster the input registers that you've stopped pushing. Dead is how far you need to push the stick before any movement is registered. And Sensitivity is the “units per second” at which the axis value moves toward the target value.
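
If you want to feel out these settings before putting the headset on, a throwaway debug script like this one (not part of the Mozilla example - attach it to any GameObject and watch the Console) prints the raw axis values as you push the sticks:

using UnityEngine;
 
//Throwaway helper: logs the two named axes so you can watch how
//Gravity, Dead and Sensitivity shape the values Input.GetAxis returns.
public class AxisDebug : MonoBehaviour
{
    void Update()
    {
        float horizontal = Input.GetAxis("Horizontal");
        float vertical = Input.GetAxis("Vertical");
 
        if (Mathf.Abs(horizontal) > 0.01f || Mathf.Abs(vertical) > 0.01f)
        {
            Debug.Log("Horizontal: " + horizontal.ToString("F2") + "  Vertical: " + vertical.ToString("F2"));
        }
    }
}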

10. Repeat the above steps for the right hand

If you want both hands' sticks to do the same thing (movement), then repeat the steps for the right hand game object (handR).

You can use the same locomotion script, but remember you'll need to add the Horizontal and Vertical inputs to the XRRightControllerMap (you previously did the XRLeftControllerMap).

And you're done. Please enjoy!