Unity AR Foundation Tutorial for Beginners

This beginner-friendly tutorial explains how to get started with Unity AR Foundation, a powerful framework for creating cross-platform augmented reality apps. You’ll learn how to set up your development environment, configure AR settings, deploy your first AR scene, and test it on Android or iOS. Designed for creators with minimal AR experience, this step-by-step guide uses Unity’s built-in AR Foundation package and emphasizes best practices, troubleshooting, and extensibility.

What is Unity AR Foundation?

Unity AR Foundation is a framework developed by Unity Technologies to simplify cross-platform AR development using a single API. It supports both ARKit (iOS) and ARCore (Android), allowing developers to write once and deploy to multiple platforms.

Why Use Unity AR Foundation in 2025?

AR Foundation has matured significantly with recent updates supporting improved plane detection, depth estimation, occlusion, light estimation, and integration with new XR devices. For developers building scalable, real-world AR apps or prototypes, AR Foundation provides flexibility, performance, and platform consistency.


Prerequisites

Before you begin this tutorial, ensure you have:

  • Basic understanding of the Unity interface and GameObjects
  • Unity Hub and Unity 2022.3 LTS or later
  • Android Studio or Xcode installed (for Android/iOS deployment)
  • AR-supported mobile device (Android 8.1+ or iOS 13+)

Step 1: Install Unity and Create a Project

  1. Open Unity Hub.
  2. Create a new 3D project and name it “AR Foundation Demo”.
  3. Choose Unity 2022.3 LTS or higher for maximum AR compatibility.

Step 2: Install AR Foundation Packages

  1. Go to Window > Package Manager.
  2. Click + and choose Add package by name (or install AR Foundation from the Unity Registry list).
  3. Install the following:
    • com.unity.xr.arfoundation
    • com.unity.xr.arcore
    • com.unity.xr.arkit

These packages ensure compatibility with both Android (ARCore) and iOS (ARKit).
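If you prefer editing files directly, the same packages can be declared in Packages/manifest.json. The version numbers below are illustrative; the Package Manager will resolve versions that match your Unity release:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "5.1.0",
    "com.unity.xr.arcore": "5.1.0",
    "com.unity.xr.arkit": "5.1.0"
  }
}
```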


Step 3: Set Up XR Plugin Management

  1. Go to Edit > Project Settings > XR Plugin Management.
  2. Click Install XR Plugin Management.
  3. Select Android and enable ARCore.
  4. Select iOS and enable ARKit.

This step ensures platform-specific plugins are activated.


Step 4: Configure AR Session and AR Camera

  1. Create a new Empty GameObject in the scene and rename it AR Session.
  2. Add the following components to it:
    • AR Session
    • AR Input Manager
  3. Create another Empty GameObject and name it AR Session Origin. (In AR Foundation 5.x this object is called XR Origin, but the setup is the same.)
  4. Add the following components to it:
    • AR Session Origin
    • AR Plane Manager (for plane detection)
    • AR Raycast Manager (for interaction)

Then make your Main Camera a child of AR Session Origin, assign it to the AR Session Origin component's Camera field, and make sure it is tagged MainCamera. The camera-related components go on the camera itself, not on the session origin:

  • AR Camera Manager
  • AR Camera Background
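Once the session objects are in place, a small script can confirm at runtime that AR is actually supported on the device. This is a minimal sketch using AR Foundation's ARSession.CheckAvailability API; attach it to any GameObject in the scene:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARAvailabilityCheck : MonoBehaviour
{
    IEnumerator Start()
    {
        // Kick off the availability check if it hasn't completed yet.
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            Debug.Log("AR is available; session state: " + ARSession.state);
        }
    }
}
```

This is also a natural place to show a fallback UI on unsupported devices instead of just logging.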

Step 5: Add Plane Detection and Visualizer

To visualize detected planes:

  1. Create an AR plane prefab. The quickest route is GameObject > XR > AR Default Plane, then drag the resulting object into your Project window to turn it into a prefab (or build your own with a Mesh Renderer and a transparent material).
  2. Assign this prefab to the AR Plane Manager's Plane Prefab field.

This step allows users to see detected surfaces in real time.
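To verify that detection is working, you can subscribe to the AR Plane Manager's planesChanged event and log each new plane. A minimal sketch, meant to be attached to the same GameObject as the AR Plane Manager:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake() => planeManager = GetComponent<ARPlaneManager>();

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes .updated and .removed lists.
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, size {plane.size}");
    }
}
```

Watching these logs on-device is a quick way to diagnose the "planes not detected" issue covered later in this tutorial.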


Step 6: Deploy to Mobile Device

Android Deployment:

  1. Go to File > Build Settings.
  2. Select Android, then click Switch Platform.
  3. Under Player Settings:
    • Set Minimum API Level to 24 (Android 7.0) or higher
    • Set Scripting Backend to IL2CPP and enable the ARM64 target architecture (Google Play requires 64-bit builds)
    • If the camera feed renders black, try removing Vulkan from the Graphics APIs list so OpenGL ES 3 is used
  4. Connect your device, enable Developer Mode and USB Debugging.
  5. Click Build and Run.

iOS Deployment:

  1. Switch to iOS in Build Settings.
  2. Set your Bundle Identifier, team ID, and camera permissions.
  3. Export project to Xcode and deploy to a connected iOS device.
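iOS requires a camera usage string in Info.plist. Unity writes this entry from the Camera Usage Description field in Player Settings, producing something like the following (the message text is your own; it is shown to the user in the permission prompt):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to display augmented reality content.</string>
```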

Step 7: Add Touch Interaction (Place Object on Plane)

To place a 3D object on a plane:

  1. Create a simple 3D prefab (e.g., a cube or imported model).
  2. Attach a script to your AR Session Origin GameObject like this:
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;

public class TapToPlace : MonoBehaviour
{
    public GameObject objectToPlace;
    private ARRaycastManager raycastManager;
    private List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Start()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);

        // Only place on the first frame of the touch; otherwise an object
        // is instantiated on every frame while the finger is held down.
        if (touch.phase != TouchPhase.Began)
            return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // hits[0] is the closest detected plane.
            Pose hitPose = hits[0].pose;
            Instantiate(objectToPlace, hitPose.position, hitPose.rotation);
        }
    }
}
  3. Drag your prefab onto the Object To Place field in the Unity Inspector.

This will instantiate your object wherever the user taps on a detected surface.


Step 8: Best Practices for Unity AR Foundation

  • Use lightweight shaders for mobile AR performance.
  • Regularly test on real devices; Unity Editor support for AR simulation is limited.
  • Use object pooling if you expect to spawn many AR objects.
  • Handle permissions gracefully (camera access, device compatibility).
  • Optimize textures and meshes to avoid battery drain and app crashes.
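The object pooling tip above can be sketched as a small reusable component. This is a minimal illustrative version, not a production-grade pool; the SimplePool name and its API are placeholders:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Reuses deactivated instances instead of calling Instantiate/Destroy
// repeatedly, which reduces garbage collection spikes on mobile.
public class SimplePool : MonoBehaviour
{
    public GameObject prefab;
    readonly Queue<GameObject> pool = new Queue<GameObject>();

    public GameObject Get(Vector3 position, Quaternion rotation)
    {
        GameObject go = pool.Count > 0 ? pool.Dequeue() : Instantiate(prefab);
        go.transform.SetPositionAndRotation(position, rotation);
        go.SetActive(true);
        return go;
    }

    public void Release(GameObject go)
    {
        go.SetActive(false);
        pool.Enqueue(go);
    }
}
```

In the TapToPlace script from Step 7, you would call pool.Get(hitPose.position, hitPose.rotation) instead of Instantiate.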


Step 9: Common Errors and Fixes

  • AR camera not displaying video feed: ensure the AR Camera Manager component is present and camera permission has been granted.
  • Planes not detected: check lighting conditions and verify the AR Plane Manager setup.
  • App crashes on start: match ARCore/ARKit package versions to your Unity version, and clear any Gradle/Xcode build errors.
  • Touch not responding: verify the raycast is hitting a detected plane and that the prefab is assigned in the Inspector.

Conclusion

Unity AR Foundation is one of the most powerful ways to build AR applications that work across iOS and Android. With minimal setup, you can deploy interactive, real-world AR experiences. This tutorial has walked you through the essentials—now you’re ready to explore anchors, occlusion, face tracking, and beyond.


FAQs: Unity AR Foundation Tutorial

  1. Can I use Unity AR Foundation without writing code?
    Yes, using Unity’s visual scripting or templates, basic AR projects can be done with minimal coding.
  2. What is the difference between AR Foundation and Vuforia in Unity?
    AR Foundation is optimized for real-world tracking using ARKit/ARCore, while Vuforia focuses on image/marker tracking.
  3. How do I enable occlusion in Unity AR Foundation?
    Install ARKit or ARCore Depth packages, then use the AR Occlusion Manager to apply depth masking.
  4. Can I simulate AR Foundation scenes in the Unity Editor?
    Basic simulation is available, but most features require deployment to a real device.
  5. Does Unity AR Foundation support WebAR?
    No, AR Foundation is native-only. For WebAR, tools like 8thWall or ZapWorks are recommended.
  6. What rendering pipeline works best with AR Foundation?
    URP (Universal Render Pipeline) is recommended for performance and mobile optimization.
  7. Can I publish AR Foundation apps on the Play Store and App Store?
    Yes, AR Foundation apps can be published like any other Unity mobile app with platform-specific packaging.
  8. Is face tracking possible in AR Foundation?
    Yes, ARKit supports face tracking on iOS, but Android support is limited and device-dependent.
  9. What Unity version is best for AR Foundation in 2025?
    Unity 2022.3 LTS or higher offers the most stable AR Foundation integration.
  10. How can I make my AR app more interactive?
    Use raycasting, object manipulation scripts, gesture detection, and physics interactions for richer user engagement.
