Use the ARCore XR Plugin package to enable ARCore support via Unity's multi-platform XR API. The plug-in supports AR Foundation features on the Android platform by implementing the native endpoints required to target Google's ARCore platform. Use the ARCore XR Plugin APIs when you need access to Android ARCore-specific features.

This version of ARCore XR Plugin uses ARCore 1.22 and supports the following functionality:
- Device localization
- Horizontal plane detection
- Vertical plane detection
- Point clouds
- Pass-through camera view
- Light estimation
- Anchors
- Oriented feature points
- Hit testing
- Session management
- ARCore APK on-demand installation
- 2D Image tracking
- Face tracking
- Environment probes
- Occlusion
To install this package, follow the instructions in the Package Manager documentation.
In addition, install the AR Foundation package, which uses ARCore XR Plugin and provides many useful scripts and prefabs. For more information about this package, see the AR Foundation documentation.
The ARCore XR Plugin implements the native endpoints required for building Handheld AR apps targeting Google’s ARCore platform using Unity's multi-platform XR API. However, this package doesn't expose any public scripting interface of its own. In most cases, you should use the scripts, prefabs, and assets provided by AR Foundation as the basis for your Handheld AR apps.
The ARCore XR Plugin package also includes source files, static libraries, shader files, and plug-in metadata.
Build Settings
You can flag AR as either required or optional; AR is required by default. If ARCore is optional, the Play Store lets users install your app on devices that don't support ARCore, or that support ARCore but don't have it installed. This is useful if you want to provide different alternatives depending on whether AR is available.
To create an ARCoreSettings Asset and assign it to your build settings, open the Project Settings window (from Unity's main menu, go to Edit > Project Settings), then navigate to XR Plug-in Management and enable the ARCore provider. This creates an ARCoreSettings Asset that you can access under XR Plug-in Management > ARCore.
Note: If ARCore is required, the availability check will always report that ARCore is supported, even on unsupported devices. This is because the Play Store prevents the installation of apps that require ARCore on unsupported devices, so these apps always assume they're running on a supported device. However, if you install an Android APK onto an unsupported device via USB (called 'side loading') or via Unity, the unsupported device will report that ARCore is supported.
Session
ARCore implements XRSessionSubsystem.GetAvailabilityAsync. The list of devices ARCore supports is frequently updated to include additional devices. If ARCore isn't already installed on a device, your app needs to check with the Play Store to see whether there's a version of ARCore that supports that device. GetAvailabilityAsync returns a Promise, which can be used in a coroutine. For ARCore, this check may take some time.

If the device is supported but ARCore isn't installed (or requires an update), call XRSessionSubsystem.InstallAsync, which also returns a Promise.
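For example, a minimal coroutine sketch of the availability check (the serialized ARSession reference is an assumption about your scene setup):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class AvailabilityCheck : MonoBehaviour
{
    [SerializeField] ARSession session; // assumed to exist in the scene

    IEnumerator Start()
    {
        // GetAvailabilityAsync returns a Promise, which a coroutine can yield on.
        // On ARCore this may take some time because it can query the Play Store.
        var availabilityPromise = session.subsystem.GetAvailabilityAsync();
        yield return availabilityPromise;

        var availability = availabilityPromise.result;
        if (availability.IsSupported() && !availability.IsInstalled())
        {
            // The device is supported, but ARCore needs to be installed or updated.
            var installPromise = session.subsystem.InstallAsync();
            yield return installPromise;
        }
    }
}
```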
For more information, see the ARSubsystems session documentation.
Depth subsystem
Raycasts always return a Pose for the item the raycast hit. When raycasting against feature points, the pose is oriented to provide an estimate for the surface the feature point might represent.

The depth subsystem doesn't require additional resources, so enabling it doesn't affect performance.

ARCore's depth subsystem will only ever produce a single XRPointCloud.
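For example, a minimal sketch that reads that single point cloud through AR Foundation (the ARPointCloudManager reference is an assumption about your scene setup):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PointCloudLogger : MonoBehaviour
{
    [SerializeField] ARPointCloudManager pointCloudManager; // assumed to be on the session origin

    void OnEnable() => pointCloudManager.pointCloudsChanged += OnPointCloudsChanged;
    void OnDisable() => pointCloudManager.pointCloudsChanged -= OnPointCloudsChanged;

    void OnPointCloudsChanged(ARPointCloudChangedEventArgs args)
    {
        // On ARCore, 'added' fires once and 'updated' thereafter, always for the same single cloud.
        foreach (var cloud in args.updated)
        {
            var positions = cloud.positions;
            Debug.Log($"Point cloud contains {(positions.HasValue ? positions.Value.Length : 0)} points");
        }
    }
}
```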
For more information, see the ARSubsystems depth subsystem documentation.
Plane tracking
ARCore supports plane subsumption (that is, one plane can be included in another). When this happens, the subsumed plane isn't removed, but won't be updated any further.
ARCore provides boundary points for all its planes.
The ARCore plane subsystem requires additional CPU resources and can be energy-intensive. Enabling both horizontal and vertical plane detection requires additional resources. Consider disabling plane detection when your app doesn't need it to save energy.
Setting the plane detection mode to PlaneDetectionMode.None is equivalent to calling Stop on the subsystem.
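For example, a minimal sketch that toggles detection through AR Foundation's ARPlaneManager (the manager reference and method names are assumptions):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class PlaneDetectionToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assumed to be on the session origin

    // Detecting both orientations uses the most CPU and energy.
    public void EnableFullDetection() =>
        planeManager.requestedDetectionMode =
            PlaneDetectionMode.Horizontal | PlaneDetectionMode.Vertical;

    // Equivalent to stopping the plane subsystem on ARCore.
    public void DisableDetection() =>
        planeManager.requestedDetectionMode = PlaneDetectionMode.None;
}
```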
For more information, see the ARSubsystems plane subsystem documentation.
Image tracking
To use image tracking on ARCore, you must first create a reference image library. To learn how to do this, see the AR Subsystems documentation on image tracking.
When building the Player for Android, each reference image library is used to generate a corresponding imgdb file, which is how ARCore represents reference image libraries. These files are placed in your project's StreamingAssets folder, in a subdirectory called HiddenARCore, so they can be accessed at runtime.

ARCore's AR reference images can be either JPEG or PNG files. If a different type of source texture is specified in the XRReferenceImageLibrary, the ARCore build processor will attempt to convert the texture to a PNG for ARCore to use. Exporting a Texture2D to PNG can fail for several reasons; for example, the texture must be marked both readable and uncompressed in the texture importer settings. If you plan to use the texture at runtime (and not just as a source Asset for the reference image), create a separate PNG or JPEG as the source Asset, because those texture import settings can negatively affect runtime performance or memory.
Reference image dimensions
Image dimensions are optional on ARCore; however, specifying them can improve image detection. If you specify the dimensions for a reference image, ARCore only receives the image's width, and then computes the height from the image's aspect ratio.
Face tracking
For information about face tracking, see the ARSubsystems Face Tracking documentation.
In addition to the core functionality, the ARCore face subsystem has methods that allow access to ARCore-specific features. ARCore provides access to regions, which are specific features on a face. Currently, these features are:
- Nose tip
- Forehead left
- Forehead right
Each region has a Pose associated with it. To access face regions, you need to obtain an instance of the ARCoreFaceSubsystem:
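A minimal sketch (the ARFaceManager reference and the logging are assumptions; the cast only succeeds on Android when ARCore is the active provider):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_ANDROID
using UnityEngine.XR.ARCore;
#endif

public class FaceRegionLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager; // assumed to exist in the scene

#if UNITY_ANDROID
    NativeArray<ARCoreFaceRegionData> regions;

    void Update()
    {
        if (faceManager.subsystem is ARCoreFaceSubsystem arcoreFaceSubsystem)
        {
            foreach (var face in faceManager.trackables)
            {
                // Fills 'regions' with a Pose for each region (nose tip, forehead left/right).
                arcoreFaceSubsystem.GetRegionPoses(face.trackableId, Allocator.Persistent, ref regions);
                foreach (var region in regions)
                    Debug.Log($"{region.region} is at {region.pose.position}");
            }
        }
    }

    void OnDestroy()
    {
        if (regions.IsCreated)
            regions.Dispose();
    }
#endif
}
```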
Light estimation
ARCore light estimation has two modes of operation: LightEstimationMode.AmbientIntensity and LightEstimationMode.EnvironmentalHDR. LightEstimationMode.AmbientIntensity provides color correction and average pixel intensity information, while LightEstimationMode.EnvironmentalHDR provides an estimated main light direction, HDR color, and the ambient SphericalHarmonicsL2 (see SphericalHarmonicsL2 for more information on spherical harmonics). The two modes can't be used simultaneously.

Additionally, the light estimation modes are either used or affected by other subsystems, namely the ARCoreFaceSubsystem and ARCore's EnvironmentProbeSubsystem. If one or both of these subsystems is present and enabled, the light estimation mode will have its behavior modified depending on the configuration:
| Functionality | Supported light estimation modes | Modifiable |
| --- | --- | --- |
| Face tracking | LightEstimationMode.AmbientIntensity, LightEstimationMode.Disabled | Yes |
| Environment probes | LightEstimationMode.EnvironmentalHDR | No |
- Face tracking: ARCore doesn't support LightEstimationMode.EnvironmentalHDR while face tracking, and rendering won't work if that mode is specified. To prevent this, the provider only allows the LightEstimationMode.AmbientIntensity or LightEstimationMode.Disabled modes to be set while tracking faces; otherwise, it enforces LightEstimationMode.Disabled.
- Environment probes: ARCore environment probes rely on the light estimation mode being LightEstimationMode.EnvironmentalHDR to surface and update the cubemap, and therefore take ownership of that setting.
Camera configuration
XRCameraConfiguration contains an IntPtr field nativeConfigurationHandle, which is a platform-specific handle. For ARCore, this handle is the pointer to the ArCameraConfiguration. The native object is managed by Unity and should not be manually destroyed.
Occlusion
The ARCore implementation of XROcclusionSubsystem supports Environment Depth Texture but does not support the other textures related to human segmentation.
Requirements
This version of ARCore XR Plugin is compatible with the following versions of the Unity Editor:
- 2019.4.15f1
- 2020.3
- 2021.1
- 2021.2
Known limitations
- The AR Core Supported setting in the XR Settings section of the Android Player settings must remain disabled in order for apps built with the ARCore XR Plugin to work properly.
- Color Temperature in degrees Kelvin is not presently supported.
- Due to changes made in Google's ARCore client libraries which are redistributed in ARCore XR Plugin, projects built with Unity 2019.4 must be updated to use Gradle 5.6.4 or later. Please refer to these instructions for updating your project's Gradle version.
- The XROcclusionSubsystemDescriptor properties supportsEnvironmentDepthImage and supportsEnvironmentDepthConfidenceImage require a session before support can be determined. If there is no session, these properties return false; they may later return true once a session has been established.
Package contents
This version of ARCore XR Plugin includes:
- A shared library which provides implementation of the XR Subsystems listed above
- A shader used for rendering the camera image
- A plugin metadata file
ARCore is a platform for building augmented reality apps on Android. ARCore uses three key technologies to integrate virtual content with the real world as seen through your phone's camera:
- Motion tracking allows the phone to understand and track its position relative to the world.
- Environmental understanding allows the phone to detect the size and location of flat horizontal surfaces like the ground or a coffee table.
- Light estimation allows the phone to estimate the environment's current lighting conditions.
This codelab guides you through building a simple demo game to introduce these capabilities so you can use them in your own applications. You'll learn about:
- Enabling ARCore through the Player settings
- Adding the ARCore SDK prefabs to the scene
- Scaling objects consistently to look reasonable with respect to the real world
- Using Anchors to place objects at a fixed location relative to the real world
- Using detected planes as the foundation of augmented reality objects
- Using touch and gaze input to interact with the ARCore scene
What you'll need

- Unity Game Engine
  - Recommended version: Unity 2017.4 LTS or later
  - Minimum version: 2017.3.0f2
- JDK 8 (JDK 9 is currently not supported by Unity; use JDK 8 instead)
- ARCore SDK for Unity
  - Recommended version: v1.5.0 or later (arcore-unity-sdk-v1.5.0.unitypackage)
  - Minimum version: v1.0.0
Create a new Unity 3D project and change the target platform to Android (under File > Build Settings).
Select Android and click Switch Platform.
Then click Player Settings... to configure the Android specific player settings.
- In Other Settings, disable Multithreaded rendering
- Set the Package name to a unique name e.g. com.<yourname>.arcodelab
- Set the Minimum API level to Android 7.0 'Nougat' (API level 24) or higher
- In XR Settings section at the bottom of the list, enable ARCore Supported
Add the codelab assets
Import arcore-intro.unitypackage into your project. (If you haven't already done so, check the Overview step for the list of prerequisites you need to download.) The package contains prefabs and scripts that expedite parts of the codelab so you can focus on how to use ARCore.
- Delete the default Main Camera game object. We'll use the First Person Camera from the ARCore Device prefab instead.
- Delete the default Directional light game object.
- Add the Assets/GoogleARCore/Prefabs/ARCore Device prefab to the root of your scene. Make sure its position is set to (0,0,0).
- Add the Assets/GoogleARCore/Prefabs/Environmental Light prefab to the root of your scene.
- Add the built-in EventSystem (from the menu: GameObject>UI>EventSystem).
Now you have a scene setup for using ARCore. Next, let's add some code!
The scene controller is used to coordinate between ARCore and Unity. Create an empty game object and change the name to SceneController. Add a C# script component to the object also named SceneController.
For more information about building an AR Optional application, see the Google Developers website.
Open the script. We need to check for a variety of error conditions. These conditions are also checked by the HelloARExample controller sample script in the SDK.
First add the using statement to resolve the class name from the ARCore SDK. This will make auto-complete recognize the ARCore classes and methods used.
SceneController.cs

Next, check these error conditions from the Update() method. At the same time, adjust the screen timeout so the screen stays on while we're tracking.
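A sketch of what SceneController.cs might look like at this point (the codelab's sample shows a toast message before exiting; Debug.LogError and Application.Quit stand in for that helper here):

```csharp
using GoogleARCore;
using UnityEngine;

public class SceneController : MonoBehaviour
{
    void Update()
    {
        QuitOnConnectionErrors();

        // The session status must be Tracking to access the frame.
        // If not tracking, let the screen sleep after 15 seconds.
        if (Session.Status != SessionStatus.Tracking)
        {
            Screen.sleepTimeout = 15;
            return;
        }
        Screen.sleepTimeout = SleepTimeout.NeverSleep;
    }

    void QuitOnConnectionErrors()
    {
        if (Session.Status == SessionStatus.ErrorPermissionNotGranted)
        {
            Debug.LogError("Camera permission is needed to run this codelab.");
            Application.Quit();
        }
        else if (Session.Status.IsError())
        {
            Debug.LogError("ARCore encountered a problem connecting. Please restart the app.");
            Application.Quit();
        }
    }
}
```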
Save the scene with the name 'ARScene' and add it to the list of scenes when building.
Build and run the sample app. If everything is working, you should be prompted for permission to take pictures and record video, after which you'll start seeing a preview of the camera image. Once you see the preview image, you're ready to use ARCore!
If there is an error, you'll want to resolve it before continuing with the codelab.
Unity's scaling system is designed so that when working with the Physics engine, 1 unit of distance can be thought of as 1 meter in the real world. ARCore is designed with this assumption in mind. We use this scaling system to scale virtual objects so they look reasonable in the real world.
For example, an object placed on a desktop should be small enough to fit on the desktop. A reasonable starting point is ½ foot (15.24 cm), so the scale should be (0.1524, 0.1524, 0.1524).
This might not look the best in your application, but it tends to be a good starting point and then you can fine tune the scale further for your specific scene.
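As a hypothetical sketch, a component like this could apply that starting scale:

```csharp
using UnityEngine;

public class DesktopScale : MonoBehaviour
{
    void Start()
    {
        // 1 Unity unit ≈ 1 meter, so 0.1524 ≈ half a foot per side.
        transform.localScale = new Vector3(0.1524f, 0.1524f, 0.1524f);
    }
}
```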
As a convenience, the prefabs used in this codelab contain a component named GlobalScalable which supports using stretch and pinch to size the objects. To enable this, the touch input needs to be captured.
Now when running the application, the user can pinch or stretch the objects to fit the scene more appropriately.
Next, let's detect and display the planes that are detected by ARCore.
ARCore uses a class named DetectedPlane to represent detected planes. This class is not a game object, so we need a prefab that renders the detected planes. Good news: since ARCore 1.2, the ARCore Unity SDK already includes such a prefab: Assets/GoogleARCore/Examples/Common/Prefabs/DetectedPlaneVisualizer.
ARCore detects horizontal and vertical planes. We'll use these planes in the game. For each newly detected plane, we'll create a game object that renders the plane using the DetectedPlaneVisualizer prefab. As you may have guessed, since ARCore 1.2 there's also a convenient script in the ARCore Unity SDK, Assets/GoogleARCore/Examples/Common/Scripts/DetectedPlaneGenerator.cs, that does exactly this.
Add DetectedPlaneGenerator to the SceneController object: select the SceneController object, click the Add Component button in the Inspector, and type in DetectedPlaneGenerator. Then set the Detected Plane Prefab property to the prefab Assets/GoogleARCore/Examples/Common/Prefabs/DetectedPlaneVisualizer.
In the SceneController script, create a new method named ProcessTouches(). This method performs a raycast hit test and selects the plane that was tapped.
SceneController.cs
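A sketch of what ProcessTouches() might look like (SetSelectedPlane here forwards the tapped plane to the controllers created later in this codelab):

```csharp
void ProcessTouches()
{
    // Only handle the first frame of a single-finger touch.
    Touch touch;
    if (Input.touchCount != 1 ||
        (touch = Input.GetTouch(0)).phase != TouchPhase.Began)
    {
        return;
    }

    // Raycast the touch point against detected planes.
    TrackableHit hit;
    TrackableHitFlags raycastFilter = TrackableHitFlags.PlaneWithinPolygon;
    if (Frame.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
    {
        SetSelectedPlane(hit.Trackable as DetectedPlane);
    }
}
```

Call ProcessTouches() from Update() so taps are handled every frame.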
Remember: set the firstPersonCamera property to the first person camera object in the scene editor!
In ARCore, objects that maintain a constant position as you move around are positioned by using an Anchor. Let's create an Anchor to hold a floating scoreboard.
Write the Scoreboard controller script
In order to position the scoreboard, we'll need to know where the user is looking. So we'll add a public variable for the first person camera.
The scoreboard will also be 'anchored' to the ARScene. An anchor is an object that holds its position and rotation as ARCore processes the sensor and camera data to build its model of the world.
To keep the anchor consistent with the plane, we'll keep track of the plane and make sure the distance in the Y axis is constant.
Also add a member to keep track of the score.
Remember to add using GoogleARCore; to the script to resolve the Anchor class!
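Putting those members together, ScoreboardController.cs might start like this (the field names are illustrative):

```csharp
using GoogleARCore;
using UnityEngine;

public class ScoreboardController : MonoBehaviour
{
    // Set in the scene editor to ARCore Device/First Person Camera.
    public Camera firstPersonCamera;

    // Holds its position and rotation as ARCore refines its world model.
    private Anchor anchor;

    // The plane the scoreboard is attached to, and our constant height above it.
    private DetectedPlane attachedPlane;
    private float yOffset;

    private int score = 0;
}
```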
Just as in the previous step, save the script, switch to the scene editor, and set 'First Person Camera' property to ARCore Device/First Person Camera from the scene hierarchy.
We'll place the scoreboard above the selected plane. This way it will be visible and indicate which plane we're focused on.
In ScoreboardController script, in the Start() method, add the code to disable the mesh renderers.
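A minimal sketch:

```csharp
void Start()
{
    // Hide the scoreboard until a plane is selected.
    foreach (Renderer r in GetComponentsInChildren<Renderer>())
    {
        r.enabled = false;
    }
}
```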
Create the function SetSelectedPlane()
This is called from the scene controller when the user taps a plane. When this happens, we'll create the anchor for the scoreboard.
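A sketch, using the attachedPlane field from above:

```csharp
public void SetSelectedPlane(DetectedPlane selectedPlane)
{
    attachedPlane = selectedPlane;
    CreateAnchor();
}
```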
Create the function CreateAnchor()
The CreateAnchor method does five things, as sketched after this list:
- Raycast a screen point through the first person camera to find a position to place the scoreboard.
- Create an ARCore Anchor at that position. This anchor will move as ARCore builds a model of the real world in order to keep it in the same location relative to the ARCore device.
- Attach the scoreboard prefab to the anchor as a child object so it is displayed correctly.
- Record the yOffset from the plane. This is used to keep the scoreboard at the same height relative to the plane as the plane's position is refined.
- Enable the renderers so the scoreboard is drawn.
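A sketch of CreateAnchor() following those steps (the exact screen point and ray distance are illustrative):

```csharp
void CreateAnchor()
{
    // 1. Raycast a point near the top of the screen to position the scoreboard.
    Vector2 pos = new Vector2(Screen.width * 0.5f, Screen.height * 0.9f);
    Ray ray = firstPersonCamera.ScreenPointToRay(pos);
    Vector3 anchorPosition = ray.GetPoint(5f);

    // 2. Create an ARCore Anchor at that position.
    if (anchor != null)
    {
        Destroy(anchor);
    }
    anchor = attachedPlane.CreateAnchor(new Pose(anchorPosition, Quaternion.identity));

    // 3. Attach the scoreboard to the anchor so it moves with ARCore's world model.
    transform.position = anchorPosition;
    transform.SetParent(anchor.transform);

    // 4. Record the y offset from the plane so the height stays constant.
    yOffset = transform.position.y - attachedPlane.CenterPose.position.y;

    // 5. Enable the renderers so the scoreboard is drawn.
    foreach (Renderer r in GetComponentsInChildren<Renderer>())
    {
        r.enabled = true;
    }
}
```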
Add code to ScoreboardController.Update()
First, check that tracking is active.
ScoreboardController.cs
The last thing to add is to rotate the scoreboard towards the user as they move around in the real world and adjust the offset relative to the plane.
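Putting it together, Update() might look like this:

```csharp
void Update()
{
    // Only update while ARCore is tracking and a plane is selected.
    if (Session.Status != SessionStatus.Tracking || attachedPlane == null)
    {
        return;
    }

    // Rotate the scoreboard to face the viewer.
    transform.LookAt(firstPersonCamera.transform);

    // Keep a constant height above the plane as its position is refined.
    transform.position = new Vector3(
        transform.position.x,
        attachedPlane.CenterPose.position.y + yOffset,
        transform.position.z);
}
```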
Call SetSelectedPlane() from the scene controller
Switch back to the SceneController script and add a member variable for the ScoreboardController.
SceneController.cs
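A sketch (assign the field in the Inspector to the scoreboard object in the scene):

```csharp
// Set in the Inspector to the scoreboard object in the scene.
public ScoreboardController scoreboardController;

void SetSelectedPlane(DetectedPlane selectedPlane)
{
    scoreboardController.SetSelectedPlane(selectedPlane);
}
```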
Save the scripts and the scene. Build and run the app! Now it should display planes as they are detected, and if you tap one, you'll see the scoreboard!
Now that we have a plane, let's put a snake on it and move it around on the plane.
Create a new Empty Game object named Snake.
Add the existing C# script (Assets/Codelab/Scripts/Slithering.cs). This controls the movement of the snake as it grows. In the interest of time, we'll just add it, but feel free to review the code later on.
Add a new C# script to the Snake named SnakeController.
In SnakeController.cs, we need to track the plane that the snake is traveling on. We'll also add member variables for the prefab for the head, and the instance:
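A sketch of those members (the names are illustrative):

```csharp
using GoogleARCore;
using UnityEngine;

public class SnakeController : MonoBehaviour
{
    // Prefab used for the snake's head; assigned in the Inspector.
    public GameObject snakeHeadPrefab;

    // The plane the snake travels on, and the spawned head instance.
    private DetectedPlane detectedPlane;
    private GameObject snakeInstance;
}
```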
Remember to add using GoogleARCore; to the script to resolve the DetectedPlane class!

Create the SetPlane() method
In SnakeController script, add a method to set the plane. When the plane is set, spawn a new snake.
SnakeController.cs
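A sketch (the spawn height offset is illustrative):

```csharp
public void SetPlane(DetectedPlane plane)
{
    detectedPlane = plane;
    SpawnSnake();
}

void SpawnSnake()
{
    if (snakeInstance != null)
    {
        Destroy(snakeInstance);
    }

    // Spawn the head slightly above the center of the plane.
    Vector3 pos = detectedPlane.CenterPose.position;
    pos.y += 0.1f;
    snakeInstance = Instantiate(snakeHeadPrefab, pos, Quaternion.identity, transform);
}
```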
Now add a member variable to the SceneController.cs to reference the Snake.
SceneController.cs
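A sketch, extending SetSelectedPlane() from earlier:

```csharp
// Set in the Inspector to the Snake object in the scene.
public SnakeController snakeController;

void SetSelectedPlane(DetectedPlane selectedPlane)
{
    scoreboardController.SetSelectedPlane(selectedPlane);
    snakeController.SetPlane(selectedPlane);
}
```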
To move the snake, we'll use where we are looking as a point that the snake should move towards. To do this, we'll raycast the center of the screen through the ARCore session to a point on a plane.
First, let's add a game object that we'll use to visualize where the user is looking.
Edit the SnakeController and add member variables for the pointer and the first person camera. Also add a speed member variable.
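A sketch (the speed value is illustrative):

```csharp
// The gaze pointer instance and camera, assigned in the scene editor.
public GameObject pointer;
public Camera firstPersonCamera;

// How fast the snake moves towards the pointer, in units per second.
private float speed = 0.5f;
```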
Set the game object properties
Save the script and switch to the scene editor.
Add an instance of the Assets/CodelabPrefabs/gazePointer to the scene.
Then select the Snake object; in the Inspector, set the pointer property to the gazePointer instance, and firstPersonCamera to the ARCore Device's first person camera.
In SnakeController.Update(), raycast from the center of the screen through the ARCore session to find the point on a detected plane the user is looking at, and move the pointer there.

SnakeController.cs
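A sketch of that gaze raycast (hiding the pointer while not tracking is an illustrative detail):

```csharp
void Update()
{
    // Nothing to do until a plane is selected and tracking.
    if (detectedPlane == null ||
        detectedPlane.TrackingState != TrackingState.Tracking)
    {
        pointer.SetActive(false);
        return;
    }
    pointer.SetActive(true);

    // Raycast the center of the screen against detected planes.
    TrackableHit hit;
    if (Frame.Raycast(Screen.width / 2f, Screen.height / 2f,
        TrackableHitFlags.PlaneWithinPolygon, out hit))
    {
        pointer.transform.position = hit.Pose.position;
    }
}
```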
Move towards the pointer
Once the snake is heading in the right direction, move it towards the pointer. We want to stop before the snake reaches the exact spot, to avoid a weird nose spin. Add the code below to the end of SnakeController.Update().
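A sketch of that movement (the stop distance is illustrative):

```csharp
// Appended to the end of SnakeController.Update():
// turn the head towards the pointer, then move, stopping just short of it
// to avoid spinning in place.
if (snakeInstance != null)
{
    float dist = Vector3.Distance(pointer.transform.position,
        snakeInstance.transform.position);
    if (dist > 0.05f)
    {
        snakeInstance.transform.LookAt(pointer.transform.position);
        snakeInstance.transform.position = Vector3.MoveTowards(
            snakeInstance.transform.position,
            pointer.transform.position,
            speed * Time.deltaTime);
    }
}
```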
Remember to add using GoogleARCore; to the script to resolve the DetectedPlane class!

Add the food tag
Add a tag in the editor by dropping down the tag selector in the object inspector, and select 'Add Tag'. Add a tag named 'food'. We'll use this tag to identify food objects during collision detection.
Important: Remember to add the tag using the tag manager! If you don't, the food won't be recognized and won't be eaten!
In the FoodController.Update() method:

- Check for a null plane; if we don't have a plane, we can't do anything.
- Check that the detectedPlane is still tracking; if it isn't, do nothing.
- Check whether there is an active food instance, and spawn a new one if needed.

FoodController.cs
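A sketch of FoodController.cs following those steps (the food prefab field and the spawn area around the plane center are illustrative):

```csharp
using GoogleARCore;
using UnityEngine;

public class FoodController : MonoBehaviour
{
    // Prefab for a piece of food; assigned in the Inspector.
    public GameObject foodPrefab;

    private DetectedPlane detectedPlane;
    private GameObject foodInstance;

    public void SetSelectedPlane(DetectedPlane plane)
    {
        detectedPlane = plane;
    }

    void Update()
    {
        // Without a plane we can't do anything.
        if (detectedPlane == null)
        {
            return;
        }

        // If the plane isn't tracking, do nothing.
        if (detectedPlane.TrackingState != TrackingState.Tracking)
        {
            return;
        }

        // Spawn a new piece of food if there is no active instance.
        if (foodInstance == null || !foodInstance.activeSelf)
        {
            SpawnFoodInstance();
        }
    }

    void SpawnFoodInstance()
    {
        // Place the food at a random spot near the plane's center.
        Vector3 pos = detectedPlane.CenterPose.position;
        pos.x += Random.Range(-0.5f, 0.5f);
        pos.z += Random.Range(-0.5f, 0.5f);

        foodInstance = Instantiate(foodPrefab, pos, Quaternion.identity, transform);
        foodInstance.tag = "food"; // must exist in the tag manager
    }
}
```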
Then call FoodController.SetSelectedPlane() in SceneController.SetSelectedPlane():
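A sketch, extending SetSelectedPlane() once more:

```csharp
// Set in the Inspector to the object holding the FoodController.
public FoodController foodController;

void SetSelectedPlane(DetectedPlane selectedPlane)
{
    scoreboardController.SetSelectedPlane(selectedPlane);
    snakeController.SetPlane(selectedPlane);
    foodController.SetSelectedPlane(selectedPlane);
}
```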
To let the snake eat the food, we'll use a FoodConsumer component, adding it to the instance when we spawn. In SnakeController.SpawnSnake(), add the component to the new instance with snakeInstance.AddComponent<FoodConsumer>().
FoodConsumer.cs
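A sketch of FoodConsumer.cs (how the snake grows depends on the Slithering script from the codelab assets; the AddBodyPart call here is hypothetical):

```csharp
using UnityEngine;

public class FoodConsumer : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // Only consume objects tagged 'food' in the tag manager.
        if (other.CompareTag("food"))
        {
            other.gameObject.SetActive(false);

            // Hypothetical call: grow the snake via the Slithering component.
            var slithering = GetComponent<Slithering>();
            if (slithering != null)
            {
                slithering.AddBodyPart();
            }
        }
    }
}
```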
Remember that Scoreboard from the beginning of the codelab? Well, now it is time to actually use it!
In SceneController.Update(), set the score to the length of the snake:
ScoreboardController.cs
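A sketch of both sides of that call (assuming the scoreboard displays its score with a TextMesh child):

```csharp
// In SceneController.Update():
scoreboardController.SetScore(snakeController.GetLength());

// In ScoreboardController: update the scoreboard text when the score changes.
public void SetScore(int newScore)
{
    if (score != newScore)
    {
        score = newScore;
        GetComponentInChildren<TextMesh>().text = "Score: " + score;
    }
}
```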
Add GetLength() in the SnakeController
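A sketch (the GetLength call on the Slithering component is hypothetical; it stands in for however that script reports the segment count):

```csharp
// In SnakeController:
public int GetLength()
{
    if (snakeInstance == null)
    {
        return 0;
    }
    // Hypothetical: the Slithering script tracks the number of body segments.
    return snakeInstance.GetComponent<Slithering>().GetLength();
}
```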
Other Resources
As you continue your ARCore exploration, check out these other resources:
- AR Concepts: https://developers.google.com/ar/discover/concepts
- Google Developers ARCore: https://developers.google.com/ar/
- Github projects for ARCore: https://github.com/google-ar
- AR experiments for inspiration and to see what could be possible: https://experiments.withgoogle.com/ar