Next, let's look at testing not just in a 3D environment, but in an XR environment. The main difference between 2D and 3D is that instead of moving side to side and up and down, we're moving through 3D space with six degrees of freedom: position, rotation, and so on. In XR, instead of moving the camera left, right, up, or down, or even through 3D space, we become the camera. So the challenges of XR are similar to those of 3D, which is why we placed it later in this course. But it's important to understand that our focus in XR is less about "can we move the player in the scene?" It's more about "when I move around the scene as the player, wearing the headset and holding the controllers, can I perform the actions that I expect to?" Now, of course, in order to do that, we need to test with our XR rig. So I'm going to get that ready, and then we'll take a look at how XR tests are created and executed.
Thanks to the power of movie magic, I'm now ready to test this application in VR. The application we're going to be using today is the escape room from the VR Beginner project, which is available from the Unity Asset Store. There is an accompanying Unity Learn project that walks you through how to create and develop applications in VR, which I strongly recommend. But in this case, we're just going to go ahead and start testing with this application. Now, in order to do this, I need a few things. First, I need a project which is set up to run this VR application; here it is, ready to go, and I've already installed the GameDriver agent, the license, and all of the requisite tools. And second, as you can see, I'm wearing a headset. Traditionally, VR can only be tested while wearing a VR headset. GameDriver provides a method for capturing inputs from the headset and turning them into a repeatable set of tests that can be run without the headset. Or, if you like, you can continue to run your tests on the headset if that's your goal. What we're going to do today is capture some inputs as a user, then play those back from a recording and show the actual output of the recorded headset inputs. You'll see that it's quite a lot of input. I'm not going to go through the project in much detail; it's very similar to the 2D and 3D projects in terms of how we interact with objects and how we identify them. In this case, my focus is on: when I'm in VR, can I interact with the scene? Can I grab objects? Can I move around? Can I get a key? Can I put it in the drawer, and so on? And that's exactly what we're going to do today.
[00:01:58] Another of the prerequisites, at least for testing on the Meta Quest 2, is the Oculus desktop app, so you have to have that installed as well as have the headset on. What I'm looking at now is actually my desktop in Rift mode, which allows me to see my entire desktop as well as connect to the device through Unity. We're not going to get into the details of how to do that here; that's all covered in Unity's own Learn content for this VR application. And each headset is going to be a little bit different: if you're testing with SteamVR, there's a different setup for that, and if you're testing on a PICO device, there are different requirements for that. In this case, the Oculus Quest, or Meta Quest 2, is just another Android device, so it's much easier to demonstrate in this example. To start gameplay, I'm going to hit play on the scene. You won't be able to see exactly what's going on in my headset, but you will see on the screen what I'm doing as I navigate around the scene. Looking at this environment, I can reach forward and grab a key, hopefully without bumping into anything in my office, insert it into the drawer's lock, and then open the drawer. The goal of our test is to get this book out of the drawer and onto the pedestal. Very simple. We could do other things, like teleport around the scene and perform other actions, but in this case we're just going to focus on whether we can perform the simple actions I just demonstrated. I'll take off the headset for a second so I can see what's going on on my screen, and then we're going to record that behavior. I'm going to move the recorder up here so you can see more of the output. What happens when I hit record? XR provides a lot of inputs, which are all going to be captured by the GameDriver agent, so when I do that, the log is going to fill very, very quickly.
When I hit record, it's going to put the editor into play mode and launch me into the scene. So I'm in this space again, and I'm going to grab the key, put it into the keyhole, grab the drawer, open the drawer, grab the book, and put that on the pedestal. You should see a ton of input passing by here as I do that. Yes, I can see that we did. Now I'm going to stop recording, exit play mode, and then we'll take a look at what that output entails.