This beginner-friendly tutorial explains how to get started with Unity AR Foundation, a powerful framework for creating cross-platform augmented reality apps. You’ll learn how to set up your development environment, configure AR settings, deploy your first AR scene, and test it on Android or iOS. Designed for creators with minimal AR experience, this step-by-step guide uses Unity’s official AR Foundation package and emphasizes best practices, troubleshooting, and extensibility.
What is Unity AR Foundation?
Unity AR Foundation is a framework developed by Unity Technologies to simplify cross-platform AR development using a single API. It supports both ARKit (iOS) and ARCore (Android), allowing developers to write once and deploy to multiple platforms.
Why Use Unity AR Foundation in 2025?
AR Foundation has matured significantly with recent updates supporting improved plane detection, depth estimation, occlusion, light estimation, and integration with new XR devices. For developers building scalable, real-world AR apps or prototypes, AR Foundation provides flexibility, performance, and platform consistency.
Prerequisites
Before you begin this tutorial, ensure you have:
- Basic understanding of Unity interface and GameObjects
- Unity Hub and Unity 2022.3 LTS or later
- Android Studio or Xcode installed (for Android/iOS deployment)
- AR-supported mobile device (Android 7.0+ with ARCore support, or iOS 13+)
Step 1: Install Unity and Create a Project
- Open Unity Hub.
- Create a new 3D project and name it “AR Foundation Demo”.
- Choose Unity 2022.3 LTS or higher for maximum AR compatibility.
Step 2: Install AR Foundation Packages
- Go to Window > Package Manager.
- Click + and choose Add package by name (AR Foundation is also listed in the Unity Registry).
- Install the following:
  - com.unity.xr.arfoundation
  - com.unity.xr.arcore
  - com.unity.xr.arkit
These packages ensure compatibility with both Android (ARCore) and iOS (ARKit).
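If you prefer editing the project manifest directly, the same packages can be declared in Packages/manifest.json. A minimal sketch (the version numbers here are illustrative; use whatever versions the Package Manager resolves for your editor):

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "5.1.0",
    "com.unity.xr.arcore": "5.1.0",
    "com.unity.xr.arkit": "5.1.0"
  }
}
```

Unity re-resolves packages automatically the next time the project opens.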
Step 3: Set Up XR Plugin Management
- Go to Edit > Project Settings > XR Plugin Management.
- Click Install XR Plugin Management.
- Select Android and enable ARCore.
- Select iOS and enable ARKit.
This step ensures platform-specific plugins are activated.
Step 4: Configure AR Session and AR Camera
- Create a new empty GameObject in the scene and rename it AR Session.
- Add the following components to it:
  - AR Session
  - AR Input Manager
- Create another empty GameObject and name it AR Session Origin.
- Add these components to it:
  - AR Session Origin
  - AR Plane Manager (for plane detection)
  - AR Raycast Manager (for interaction)
- Make your Main Camera a child of AR Session Origin, keep its MainCamera tag, and add the AR Camera Manager and AR Camera Background components so it renders the device camera feed.
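Once configured, the scene hierarchy should look roughly like this (object names are the ones used above; the components each object carries are shown in parentheses):

```
AR Session            (AR Session, AR Input Manager)
AR Session Origin     (AR Session Origin, AR Plane Manager, AR Raycast Manager)
└── AR Camera         (Camera tagged MainCamera, AR Camera Manager, AR Camera Background)
```

Keeping the camera as a child of the origin lets AR Foundation move it to match the device’s real-world pose.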
Step 5: Add Plane Detection and Visualizer
To visualize detected planes:
- Import a simple material and prefab (create or download an AR plane prefab).
- Assign this prefab to the AR Plane Manager.
- Make sure the prefab has a Mesh Renderer and a transparent material for display.
This step allows users to see detected surfaces in real time.
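As a sketch of how you might drive plane visualization from code — for example, hiding the visualizers once content has been placed — the AR Plane Manager exposes its tracked planes. The class and method names below are illustrative, not part of AR Foundation itself:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative helper: toggles plane detection and the visibility of
// any planes that have already been detected.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneToggler : MonoBehaviour
{
    private ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
    }

    public void SetPlanesActive(bool visible)
    {
        planeManager.enabled = visible;           // start/stop detecting new planes
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(visible);  // show/hide existing visualizers
    }
}
```

Attach this to the same GameObject as the AR Plane Manager and call SetPlanesActive(false) from your own placement logic.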
Step 6: Deploy to Mobile Device
Android Deployment:
- Go to File > Build Settings.
- Select Android, then click Switch Platform.
- Under Player Settings:
  - Set Minimum API Level to Android 7.0 (API level 24) or higher
  - Set Scripting Backend to IL2CPP and enable the ARM64 target architecture
  - Confirm ARCore is enabled under XR Plug-in Management
- Connect your device, enable Developer Mode and USB Debugging.
- Click Build and Run.
iOS Deployment:
- Switch to iOS in Build Settings.
- Set your Bundle Identifier, Signing Team ID, and Camera Usage Description.
- Export project to Xcode and deploy to a connected iOS device.
Step 7: Add Touch Interaction (Place Object on Plane)
To place a 3D object on a plane:
- Create a simple 3D prefab (e.g., a cube or imported model).
- Attach a script to your AR Session Origin GameObject like this:
```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    public GameObject objectToPlace;

    private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Start()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        // React only to the first frame of a touch; otherwise an object
        // spawns on every frame while the finger is held down.
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Touch touch = Input.GetTouch(0);
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance, so hits[0] is the closest plane.
            Pose hitPose = hits[0].pose;
            Instantiate(objectToPlace, hitPose.position, hitPose.rotation);
        }
    }
}
```
- Drag your prefab onto the public objectToPlace field in the Inspector.
This will instantiate your object wherever the user taps on a detected surface.
Step 8: Best Practices for Unity AR Foundation
- Use lightweight shaders for mobile AR performance.
- Regularly test on real devices; Unity Editor support for AR simulation is limited.
- Use object pooling if you expect to spawn many AR objects.
- Handle permissions gracefully (camera access, device compatibility).
- Optimize textures and meshes to avoid battery drain and app crashes.
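On the permissions point above, Android camera access can be requested explicitly at startup (iOS prompts automatically once the Camera Usage Description is set). A minimal sketch using Unity’s built-in Android permission API:

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class CameraPermissionCheck : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Prompt the user if camera access has not been granted yet.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);
#endif
    }
}
```

In a production app you would also handle the denial case, for example by showing a message explaining why AR needs the camera.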
Step 9: Common Errors and Fixes
| Issue | Solution |
| --- | --- |
| AR camera not displaying video feed | Ensure the AR Camera Manager is added and camera permission is granted |
| Planes not detected | Check lighting conditions and verify the AR Plane Manager setup |
| App crashes on start | Match ARCore/ARKit package versions to your Unity version, and clear Gradle/Xcode errors |
| Touch not responding | Verify raycast hits and that the prefab is assigned properly |
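Several of the crashes above come down to running on unsupported hardware. AR Foundation can report availability at runtime before the session starts; a sketch (wire the result into your own UI rather than just logging it):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportCheck : MonoBehaviour
{
    IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Asynchronously queries the device for AR support.
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
            Debug.LogWarning("AR is not supported on this device.");
    }
}
```

On Android, ARSession.state may also pass through NeedsInstall if ARCore (Google Play Services for AR) must be installed or updated first.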
Conclusion
Unity AR Foundation is one of the most powerful ways to build AR applications that work across iOS and Android. With minimal setup, you can deploy interactive, real-world AR experiences. This tutorial has walked you through the essentials—now you’re ready to explore anchors, occlusion, face tracking, and beyond.
FAQs: Unity AR Foundation Tutorial
- Can I use Unity AR Foundation without writing code?
  Yes. Using Unity’s visual scripting or templates, basic AR projects can be built with minimal coding.
- What is the difference between AR Foundation and Vuforia in Unity?
  AR Foundation is optimized for real-world tracking using ARKit/ARCore, while Vuforia focuses on image/marker tracking.
- How do I enable occlusion in Unity AR Foundation?
  Install the ARKit or ARCore depth support, then use the AR Occlusion Manager component to apply depth masking.
- Can I simulate AR Foundation scenes in the Unity Editor?
  Basic simulation is available, but most features require deployment to a real device.
- Does Unity AR Foundation support WebAR?
  No, AR Foundation is native-only. For WebAR, tools like 8thWall or ZapWorks are recommended.
- What rendering pipeline works best with AR Foundation?
  URP (Universal Render Pipeline) is recommended for performance and mobile optimization.
- Can I publish AR Foundation apps on the Play Store and App Store?
  Yes, AR Foundation apps can be published like any other Unity mobile app with platform-specific packaging.
- Is face tracking possible in AR Foundation?
  Yes. ARKit supports face tracking on iOS; Android support is more limited and device-dependent.
- What Unity version is best for AR Foundation in 2025?
  Unity 2022.3 LTS or higher offers the most stable AR Foundation integration.
- How can I make my AR app more interactive?
  Use raycasting, object manipulation scripts, gesture detection, and physics interactions for richer user engagement.