Setting up Unity for VR and AR
09.05.2022
The VR and AR templates in Unity use the built-in render pipeline, so if you want to use the URP renderer you will need to set things up manually. It's assumed you have downloaded Unity Hub and installed a version of Unity. This tutorial uses Unity 2021.2.19, so if you use a different version the steps may differ slightly. When installing Unity, for VR and AR on Android you will need to check Android Build Support, Android SDK & NDK Tools and OpenJDK, and for AR on iOS you will need to check iOS Build Support. For iOS you will also need Xcode installed.
VR with Quest 2
- To sideload apps onto the Quest 2 you will need to sign up as a developer using this link.
- Open the Oculus app on your phone. Tap on `Menu` in the tab bar, then `Devices`, scroll down to `Developer Mode` and turn it on.
- Connect the Quest 2 to your computer with a USB cable. The USB charging cable that came with the headset will work as long as you have a USB-C socket on your computer; otherwise you will need a USB-C to USB-A adaptor.
- In Unity Hub press `New project` in the top right.
- Select the `3D (URP)` template, enter a project name and location, and press `Create project`.
- After Unity opens go to `Edit > Project Settings` and select `XR Plug-in Management` on the left. Press the `Install XR Plugin Management` button.
- Click on the `Android` tab in `XR Plug-in Management` and check `Oculus`. (A small runtime check that the Oculus loader actually initialized is sketched after this list.)
- If you are on Windows, select the desktop tab and check `Oculus` there too. This allows you to preview your app on the Quest 2 when you press play in Unity. You will need to install the Oculus PC app for this to work.
- Select `Player` from the left and under `Graphics APIs` on the right remove `OpenGLES2 (Deprecated)`.
- Go to `File > Build Settings` and select `Android` on the left.
- Plug in your Quest 2 and in the `Run Device` drop-down select the Quest 2 device. If it doesn't show up, press `Refresh` on the right.
- Press `Switch Platform` in the bottom right.
- Go to `Window > Package Manager` and select `Packages: Unity Registry` from the drop-down in the top left.
- Press the gear icon, select `Advanced Project Settings` and check `Enable Pre-release Packages`.
- Close the project settings and go back to the Package Manager. In Unity 2020 type `xr interaction` into the search field and the `XR Interaction Toolkit` package should show up; select it and press `Install`. For Unity 2021 press the plus button in the top left, select `Add package from git URL`, type `com.unity.xr.interaction.toolkit` and press enter. The package will then install.
- A dialog will appear asking if you want to enable the new input system. Press `Yes`. It will then ask if you want to update your project from the old to the new input system. Since this is a new project there is no need to do this, so press `No Thanks`. Unity will then restart.
- Open the Package Manager again and select `XR Interaction Toolkit` on the left. Expand `Samples` and press `Import` to the right of `Starter Assets` (in older versions of the package it's called `Default Input Actions`). If you want to test the project on a PC with a keyboard, also import `XR Device Simulator`. This allows you to use the keyboard to control the controllers, but it's a bit awkward to use.
- In the Project window open `Assets > Samples > XR Interaction Toolkit > <version> > Starter Assets`. Select each of the items in that folder apart from `XRI Default Input Actions` and press the `Add to ...` button at the top of the inspector.
- Open `Edit > Project Settings` and select `Preset Manager` on the left. Under `ActionBasedController` enter `Right` in the text field to the left of `XRI Default Right Controller` and enter `Left` in the text field to the left of `XRI Default Left Controller`.
- Go to `GameObject > XR > XR Origin (Action-based)`.
- Select `XR Origin` in the Hierarchy window, press `Add Component` in the inspector and add `Input Action Manager`.
- In the `Input Action Manager` expand `Action Assets`, press the plus button, click on the circle button to the right of `Element 0` and select `XRI Default Input Actions`. (A sketch of what this component does for you is shown after this list.)
- Go to `GameObject > XR > Locomotion System (Action-based)`.
- If you are on a Windows machine and have the Oculus PC app installed, you can preview your scene by pressing play with the Quest 2 connected. Otherwise you will need to install the app on the Quest 2 by going to `File > Build and Run` (or use an editor build script like the one sketched after this list). Create a new folder called `Build` in your project and enter a filename for the app. On the Quest 2 open Apps, click on the drop-down in the top right and select `Unknown Sources`. Your app should be listed; click on it to run it.
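To confirm that the Oculus loader you enabled under `XR Plug-in Management` actually starts on the headset, you can log the active XR loader at runtime. This is a minimal sketch, assuming only the XR Plugin Management package installed above; the class name `XRStartupCheck` is my own choice.

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Logs whether an XR loader (such as the Oculus loader) initialized
// successfully when the app starts. Attach to any GameObject in the scene.
public class XRStartupCheck : MonoBehaviour
{
    void Start()
    {
        var settings = XRGeneralSettings.Instance;
        var manager = settings != null ? settings.Manager : null;

        if (manager == null || manager.activeLoader == null)
            Debug.LogWarning("No active XR loader. Check XR Plug-in Management in Project Settings.");
        else
            Debug.Log($"XR initialized with loader: {manager.activeLoader.name}");
    }
}
```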
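For context on the `Input Action Manager` step above: the component's job is essentially to enable the action assets you list so the controller bindings start producing input. The sketch below shows that idea in isolation, assuming the new Input System package; `EnableActionsOnStart` is a made-up name, and in practice the toolkit's own component is what you should use.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Enables the referenced input action assets (e.g. XRI Default Input Actions)
// so their bindings become active, and disables them again when the object
// is disabled. The Input Action Manager component does this for you.
public class EnableActionsOnStart : MonoBehaviour
{
    [SerializeField] InputActionAsset[] actionAssets;

    void OnEnable()
    {
        foreach (var asset in actionAssets)
            asset.Enable();
    }

    void OnDisable()
    {
        foreach (var asset in actionAssets)
            asset.Disable();
    }
}
```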
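If you find yourself rebuilding for the Quest 2 a lot, the `Build and Run` step can also be driven from an editor script. A rough sketch, assuming your scene is enabled in `Build Settings`; the menu path and the `Build/quest.apk` output path are just examples, and the file needs to live in an `Editor` folder.

```csharp
using System.Linq;
using UnityEditor;

// Adds a menu item that builds the enabled scenes as an Android APK and
// deploys it to the connected device, similar to File > Build And Run.
public static class QuestBuild
{
    [MenuItem("Tools/Build and Run on Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = EditorBuildSettings.scenes
                .Where(s => s.enabled)
                .Select(s => s.path)
                .ToArray(),
            locationPathName = "Build/quest.apk",  // example output path
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer   // install and launch on the device
        };

        BuildPipeline.BuildPlayer(options);
    }
}
```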
For an introduction to creating interactions in VR, Unity has a good tutorial on its site.
AR with iOS and Android
- In Unity Hub press `New project` in the top right.
- Select the `3D (URP)` template, enter a project name and location, and press `Create project`.
- After Unity opens go to `Edit > Project Settings` and select `XR Plug-in Management` on the left. Press the `Install XR Plugin Management` button.
- Select the `Android` tab and check `ARCore`. Select the `iOS` tab and check `ARKit`.
- Go to `Window > Package Manager`. Select `Unity Registry` from the drop-down in the top left and type `ar foundation` into the search field. Select `AR Foundation` on the left and press `Install` in the bottom right.
- Close the Package Manager and go to `GameObject > XR > AR Session Origin`, then delete the `Main Camera`. Then go to `GameObject > XR > AR Session`.
- Click on `AR Camera` under `AR Session Origin` and in the inspector set `Light Estimation` to `Everything`. (A sketch showing how to read the estimated lighting at runtime is included after this list.)
- In the Project window go to `Assets > Settings` and select `URP-Balanced-Renderer`. Press `Add Renderer Feature` in the inspector and select `AR Background Renderer Feature`.
- Go to `Edit > Project Settings` and select `Graphics` on the left. Under `Scriptable Render Pipeline Settings` click the circle button on the right and select `URP-Balanced`. The default is `URP-HighFidelity`, but I found it produced weird colours when you view the app on a device.
- Select `Player` on the left in Project Settings and enter a company and product name.
- Select the `Android` tab. Uncheck `Multithreaded Rendering`. Remove `OpenGLES2 (Deprecated)` from `Graphics APIs`. Enter a package name. Set `Minimum API Level` to `Android 7.0 'Nougat'`.
- Next select the `iOS` tab. Enter the `Bundle Identifier` and `Signing Team ID`. The Signing Team ID can be found by signing into your Apple developer account: click on `Membership` on the left and the Team ID should be listed on that page. Back in Unity, check `Automatically Sign`. Enter a sentence for `Camera Usage Description` like "Camera is required for AR". (An editor script that applies these Android and iOS settings is sketched after this list.)
- Just so you can see something when you run the app, add an `AR Point Cloud Manager` to `AR Session Origin`. Create a particle system by going to `GameObject > Effects > Particle System`. Select the particle system, uncheck `Looping`, set the start size to 0.1 and uncheck `Play On Awake`. Click `Add Component` and select `AR Point Cloud`. Click `Add Component` again and select `AR Point Cloud Particle Visualizer`. Create a folder called `Prefabs` under `Assets` in the Project window and drag the particle system into the `Prefabs` folder. Delete the particle system from the hierarchy window. Select `AR Session Origin` and drag the particle system prefab onto `Point Cloud Prefab` under `AR Point Cloud Manager`. (A small logger that confirms point clouds are being detected is sketched after this list.)
- To build for iOS go to `File > Build Settings`. Select `iOS` under `Platform` on the left and press `Switch Platform` in the bottom right. Make sure you have an iPhone connected to your computer via a USB cable. You can connect your iPhone via WiFi, but for the initial build you will have to connect it via USB. To enable the WiFi connection, go to `Window > Devices and Simulators` in Xcode, select your device on the left and check `Connect via network`. Back in Unity press `Build and Run`. Create a folder called `Build` in your project and inside that folder create a subfolder called `iOS`. Select it and then press `Choose`.
- To build for Android go to `File > Build Settings`. Select `Android` under `Platform` on the left and press `Switch Platform` in the bottom right. Make sure you have an Android phone connected to your computer via a USB cable. Next press `Build And Run`. Select the `Build` folder you created earlier and enter a filename. If the app doesn't start automatically, swipe up on the Android phone and look for your app in the list of apps.
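Turning on `Light Estimation` only makes the data available; to use it you read it from the `AR Camera Manager` on the AR Camera. The sketch below applies the estimated brightness and colour to a directional light, assuming AR Foundation 4.x; `LightEstimationApplier` is my own name and both references are assigned in the inspector.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Applies the per-frame light estimation data (enabled by setting
// Light Estimation to Everything) to a directional light in the scene.
public class LightEstimationApplier : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // the AR Camera's ARCameraManager
    [SerializeField] Light mainLight;               // e.g. the Directional Light

    void OnEnable()
    {
        mainLight.useColorTemperature = true;
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;

        if (estimate.averageBrightness.HasValue)
            mainLight.intensity = estimate.averageBrightness.Value;

        if (estimate.averageColorTemperature.HasValue)
            mainLight.colorTemperature = estimate.averageColorTemperature.Value;

        if (estimate.colorCorrection.HasValue)
            mainLight.color = estimate.colorCorrection.Value;
    }
}
```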
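The Android and iOS player settings above can also be applied from an editor script, which saves clicking through them for every new AR project. This is a sketch with placeholder values: the menu path, `com.example.arapp` and `YOUR_TEAM_ID` are examples you would replace, and the file must sit in an `Editor` folder.

```csharp
using UnityEditor;
using UnityEngine.Rendering;

// Applies the AR player settings from the steps above in one go:
// graphics API, multithreaded rendering, minimum API level, identifiers,
// signing and the iOS camera usage description.
public static class ARPlayerSettingsSetup
{
    [MenuItem("Tools/Apply AR Player Settings")]
    public static void Apply()
    {
        // Android: keep OpenGLES3 (the step above removes the deprecated
        // OpenGLES2), disable multithreaded rendering, require Android 7.0.
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.OpenGLES3 });
        PlayerSettings.SetMobileMTRendering(BuildTargetGroup.Android, false);
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel24;
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.Android, "com.example.arapp");

        // iOS: bundle identifier, automatic signing and camera usage description.
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.iOS, "com.example.arapp");
        PlayerSettings.iOS.appleEnableAutomaticSigning = true;
        PlayerSettings.iOS.appleDeveloperTeamID = "YOUR_TEAM_ID";
        PlayerSettings.iOS.cameraUsageDescription = "Camera is required for AR";
    }
}
```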
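The particle-based point cloud is only a visual check. If you also want confirmation in the device logs that feature points are being detected, you can listen to the point cloud manager's change event. A minimal sketch, assuming AR Foundation 4.x; `PointCloudLogger` is my own name and it sits on the same GameObject as the `AR Point Cloud Manager`.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs when point clouds are added, which is a quick way to confirm that
// AR tracking is actually running on the device.
[RequireComponent(typeof(ARPointCloudManager))]
public class PointCloudLogger : MonoBehaviour
{
    ARPointCloudManager manager;

    void Awake() => manager = GetComponent<ARPointCloudManager>();

    void OnEnable() => manager.pointCloudsChanged += OnChanged;
    void OnDisable() => manager.pointCloudsChanged -= OnChanged;

    void OnChanged(ARPointCloudChangedEventArgs args)
    {
        // 'updated' fires every frame while tracking, so only additions are
        // logged here to keep the console readable.
        if (args.added.Count > 0)
            Debug.Log($"Point clouds added: {args.added.Count}");
    }
}
```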