
ToF AR

1. About ToF AR

ToF AR, Time of Flight Augmented Reality, is a toolkit library intended to aid in Unity application development. It consists of a group of functionalities with the ToF camera at its core.

1.1. System requirements for ToF AR

1.1.1. Supported devices for applications developed with ToF AR

Please refer to the section List of SDK compatible devices for the Sony, Samsung, Huawei, Honor, and Apple devices supported by the ToF AR SDK. The list also includes the camera IDs and default settings for supported devices.

1.1.2. Recommended software versions for ToF AR application development

  • Unity 2021.3.31f1, Unity 2022.3.11f1 (Unity 2020 or later is required)
    with Android and iOS builds set up.

    For Android setup, please see Android environment setup in the Unity manual.
  • AR Foundation 4.2.8

  • ARCore Extensions for AR Foundation 1.40.0

1.2. 3D coordinate system of ToF AR

The 3D coordinate system of ToF AR is called the camera coordinate system. The camera coordinate system is a left-handed coordinate system with the ToF camera as the origin. The unit of measurement is meters.

Illustration of the 3D coordinate system of ToF AR

1.3. Unity world coordinate system and ToF camera image coordinate system

The Unity world coordinate system is a left-handed coordinate system.

The ToF camera image coordinate system is a two-dimensional coordinate system that depends on the orientation of the device.

Illustration of the ToF Image coordinate system for a device in portrait orientation
Illustration of the ToF Image coordinate system for a device in landscape left orientation

2. ToF AR Overview

2.1. System diagram

ToF AR consists of two parts:

  • The ToF AR Component Library accesses the hardware, such as the RGB camera and the ToF camera, and implements the processing algorithms.

  • The ToF AR Unity Toolkit is a Unity wrapper for the various functions in the ToF AR Component Library.
    The ToF AR Unity Toolkit is the part that application developers use directly.

System diagram of ToF AR

2.2. Components and Toolkit

ToF AR provides a Manager Prefab and a Visualization Prefab as a Toolkit for each Component that carries out a subfunction. A Manager Prefab manages Components. A Visualization Prefab displays the processing results of Components. The results are transferred from the Manager Prefab to the Visualization Prefab and User Applications via Stream. The figure below shows these relationships.

System diagram showing ToF AR component streams and prefabs
  • Stream
    Provides data transfer between Components. The data transferred is different for each Component, for example, Depth data, Color data, Mesh data, and coordinate data of feature points. A Stream can contain multiple types of data.

  • Manager Prefab
    Provides access to each Component and its output Stream. To use a Component, obtain a singleton instance of the corresponding Manager Prefab.

  • Visualization Prefab
    Provides a screen display function for data obtained from Component Streams.

When using the ToF AR Toolkit, an instance of either the Color Manager Prefab or the ToF Manager Prefab is required, because Components other than the Color/ToF Components also process the Stream from the Color/ToF Component. Which one is needed depends on the Component; refer to the Prefab description in the ToF AR reference articles.
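
For example, to use the Hand Component, both the ToF Manager and the Hand Manager are present in the scene, and their singleton instances can be obtained (a minimal sketch; TofArHandManager is the Hand Manager used in the tutorials later in this document):

// Hand recognition processes the ToF Stream, so the ToF Manager is required as well.
var tofManager = TofArTofManager.Instance;   // provides the Depth Stream
var handManager = TofArHandManager.Instance; // estimates hand positions from that Stream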

2.3. Component list

ToF AR consists of Components that have the following functions. Users can use the Components alone or in combination.

  • ToF: Get Depth data, confidence data, and PointCloud data (Package: Base)

  • Color: Get RGB color data (Package: Base)

  • Mesh: Get 3D mesh data (Package: Base)

  • Plane: Get plane estimation data (Package: Base)

  • Coordinate: Mutual coordinate conversion between Depth data, Color data, and 3D coordinate space (Package: Base)

  • Hand: Get hand (finger) position estimation data (Package: Base, Hand)

  • MarkRecog: Mark recognition (Package: Base, Hand)

  • Modeling: 3D Modeling (Package: Base)

  • Body: Body recognition (Package: Base)

  • Segmentation: Segmentation recognition (Package: Base)

  • Face: Face recognition (Package: Base)

See the ToF AR reference articles for details on each Component.

Package is the required Unity package to import during setup:

  • Base requires TofAR_Base_xxxx_yyyy.unitypackage

  • Hand requires TofAR_Hand_xxxx_yyyy.unitypackage

Where xxxx is the version number, and yyyy is the build target platform.

2.4. Package layout

The contents of the ToF AR package are listed below. The package can be placed and used anywhere, at the user’s discretion.

Screenshot of the ToF AR package file layout
Folder           Description
01_UnityProject  ToF AR Unity package file

2.5. Unity package internal folder layout

The folder layout when ToF AR has been set up inside Unity is as follows:

Screenshot of the directory structure in Unity when ToF AR is set up
Folder                            Description
TofAr/TofArXXXX                   One folder per component. The basic layout is shared among the components.
TofAr/TofArXXXX/Plugins           Binary files needed for ToF AR to function properly. Do not delete or edit these files.
TofAr/TofArXXXX/V0/Resources      Various resource files. Prefabs are stored here.
TofAr/TofArXXXX/V0/Resources/xml  Setting files needed for ToF AR to function properly. Do not delete or edit these files.
TofAr/TofArXXXX/V0/Scripts        Scripts are stored here.

3. Setup

This chapter describes how to set up the ToF AR application development environment.

3.1. Setting up or upgrading ToF AR

3.1.1. Set up ToF AR

Set up the ToF AR Unity Toolkit as follows:

  1. Create a new project with Unity
    The Unity project can be specified at an arbitrary location.

  2. From the menu, open up File / Build Settings…​ / Player Settings / Android Settings and make the following settings under Other Settings:

    • Api Compatibility Level: .NET Framework (.NET 4.x for versions before Unity 2020) or .NET Standard 2.1

    • Scripting Backend: IL2CPP

    • Target Architectures: Select only ARM64

      Screenshot in Unity of Player Settings
  3. From the menu, select File / Build Settings…​ / Player Settings / iOS Settings and set the following in Other Settings:

    • Api Compatibility Level: .NET Framework (.NET 4.x for versions before Unity 2020) or .NET Standard 2.1

    • Camera Usage Description: Enter a string that describes how the camera is going to be used

      Screenshot in Unity of Player Settings
  4. Select Assets / ImportPackage / Custom Package… and import TofAR_Base_xxxx_yyyy.unitypackage, where xxxx is the version number and yyyy is the build target platform.

    Screenshot in Unity of importing a custom package
  5. If using the Hand Component also, then import TofAR_Hand_xxxx_yyyy.unitypackage, where xxxx is the version number and yyyy is the build target platform, in the same way.

  • In ToF AR v1.4.0, the Unity package files are provided separately as Base components and Hand components, and for build target platforms Android and iOS.

  • If both Android and iOS are target platforms, import both of the respective Unity packages.

  • To apply Hand components, import both Unity packages, for Hand and Base components.

3.1.2. Upgrade ToF AR

To upgrade your installation for existing projects using the ToF AR Unity Toolkit, do as follows:

  1. Open the existing project with a Unity version that supports the new ToF AR. A dialog will appear if the project was previously opened with a different version of Unity. Press Continue.

    Screenshot of Unity dialog box when an existing project is opened with a different version of Unity
  2. Delete the TofAr folder from the Project View.

  3. Select Assets / ImportPackage / Custom Package… and import TofAR_Base_xxxx_yyyy.unitypackage, where xxxx is the version number and yyyy is the build target platform.

  4. If using the Hand Component also, then import TofAR_Hand_xxxx_yyyy.unitypackage, where xxxx is the version number and yyyy is the build target platform, in the same way.

  • In ToF AR v1.4.0, the Unity package files are provided separately as Base components and Hand components, and for build target platforms Android and iOS.

  • If both Android and iOS are target platforms, import both of the respective Unity packages.

  • To apply Hand components, import both Unity packages, for Hand and Base components.

  • If there are compiler errors, adjust the code accordingly.

3.2. Setting up AR Foundation

3.2.1. Install Packages

Install the following packages from Package Manager of Unity Editor:

  • AR Foundation

  • ARCore XR Plugin (Android)

  • ARKit XR Plugin (iOS)

  • ARCore Extensions for AR Foundation

For more information on AR Foundation, see Getting started with AR Foundation.

Install AR Foundation and the ARCore XR Plugin

The following describes how to install ARCore as a Plugin for AR Foundation.

  1. Select Window / Package Manager.

    Screenshot in Unity of Package Manager menu item
  2. Select Packages: Unity Registry.

    Screenshot in Unity of Unity Registry menu item
  3. Select AR Foundation from the package list displayed, and then select Install.

    Screenshot in Unity of installing AR Foundation from the package list
  4. Select ARCore XR Plugin from the package list displayed, and then select Install.

    Screenshot in Unity of installing ARCore XR Plugin from the package list
Install ARCore Extensions for AR Foundation

Install ARCore Extensions for AR Foundation as follows:

  1. Download arcore-unity-extensions-x.x.x.tgz from Google AR GitHub, or other providers, where x.x.x indicates the software version.

  2. Select +, the plus sign, in the Package Manager window, then select Add Package from tarball…​, and install arcore-unity-extensions-x.x.x.tgz.

    Screenshot in Unity of menu item to Add package from tarball…

3.2.2. Import the ARFoundationConnector

  1. Select Assets / Import Package / Custom Package… and open TofAR.ThirdParty.ARFoundationConnector-vx.x.x.unitypackage, where x.x.x indicates the package version.

  2. Click Import in the Import Unity Package window to import the package.

    Screenshot in Unity of the Import Unity Package window

    See also Set up ToF AR.

3.2.3. Set up ARCore

To apply ARCore as a Plug-in for AR Foundation:

  1. Edit the project file Assets/Plugins/Android/AndroidManifest.xml

    If the file AndroidManifest.xml does not already exist, select the Custom Main Manifest checkbox under Project Settings / Publishing Settings / Build to create it.

    Screenshot in Unity of build menu item to choose a CustomMainManifest
  2. In the <application> tag, set android:extractNativeLibs="true".

    Screenshot of code from the file AndroidManifest.xml
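
    For example, the resulting <application> tag contains the attribute like this (a minimal sketch; the other attributes and contents of your project's manifest stay as they are):

    <application
        android:extractNativeLibs="true">
        <!-- existing contents of the application element remain unchanged -->
    </application>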

3.3. Build settings for supporting API Level 31 and above of Android SDK Platform

3.3.1. For Unity2021

Gradle setup

Gradle 6.7.1 or above must be used in order to support API Level 31 and above of Android SDK Platform.

  1. Download and install gradle from https://gradle.org/releases/

  2. In Unity, specify the path of the installed gradle in Preferences / External Tools

    Screenshot in Unity showing Preferences-External Tools
Change project settings
  1. Select the Player Settings / Publishing Settings / Custom Main Manifest and Player Settings / Publishing Settings / Custom Base Gradle Template checkboxes

    Screenshot in Unity showing Project Settings-Player
  2. Edit the project file Assets/Plugins/Android/AndroidManifest.xml

  3. In the <application> tag, set the following three <uses-native-library> tags

    <uses-native-library
        android:name="libOpenCL.so"
        android:required="false" />
    <uses-native-library
        android:name="libOpenCL-car.so"
        android:required="false" />
    <uses-native-library
        android:name="libOpenCL-pixel.so"
        android:required="false" />
    Screenshot of AndroidManifest.xml code
  4. Edit the project file Assets/Plugins/Android/baseProjectTemplate.gradle

  5. Set the version of com.android.tools.build:gradle: to 4.2.0 or above

    Screenshot of baseProjectTemplate.gradle code
  6. Set Player Settings / Other Settings / Target API Level to 31 or above

    Screenshot in Unity of Target API Level setting in Project Settings-Player

3.3.2. For Unity2022

Gradle setup

Gradle 6.7.1 or above must be used in order to support API Level 31 and above of Android SDK Platform.

  1. Select the Preferences / External Tools / Gradle Installed with Unity checkbox

    Screenshot in Unity showing Preferences-External Tools
Change project settings
  1. Select the Player Settings / Publishing Settings / Custom Main Manifest checkbox

    Screenshot in Unity showing Project Settings-Player
  2. Edit the project file Assets/Plugins/Android/AndroidManifest.xml

  3. In the <application> tag, set the following three <uses-native-library> tags

    <uses-native-library
        android:name="libOpenCL.so"
        android:required="false" />
    <uses-native-library
        android:name="libOpenCL-car.so"
        android:required="false" />
    <uses-native-library
        android:name="libOpenCL-pixel.so"
        android:required="false" />
    Screenshot of AndroidManifest.xml code
  4. Set Player Settings / Other Settings / Target API Level to 31 or above

    Screenshot in Unity of Target API Level setting in Project Settings-Player

3.4. Eliminate TrueDepth API when submitting apps to App Store

Apps using ToF AR that are submitted to the App Store are sometimes rejected on the grounds that "TrueDepth API should be eliminated if it is not used."
Even if an app uses only the LiDAR sensor and not the TrueDepth camera, it can be rejected because the TrueDepth API is used inside ToF AR.

This can be resolved by deleting the TrueDepth-related files as follows.

  1. Delete the following two files in Assets/TofAr/TofAr/Plugins/iOS:

    • component_tofar_true_depth.framework

    • component_tofar_true_depth.framework.meta

    Screenshot of two files related to truedepth in Folder

4. Developing applications with ToF AR

  • Access stream data
    ToF AR can acquire Raw level stream data, such as image data from RGB or ToF cameras, and Mesh vertex information. This describes the procedure to access the stream data using the Manager classes.

  • Save and play back stream data
    This describes the procedure to save Stream data to a file and play back the Stream data from a file.

  • Use AR Foundation
    This describes how to set up and debug simultaneous use of AR Foundation and ToF AR.

  • Debug with TofARServer
    This describes how to debug using TofARServer in connection with an Android device.

  • Configure a ToF AR device profile
    The standard ToF camera settings in the ToF AR Device Profile are applied automatically. However, manual settings are also possible. This describes how to configure profile settings manually.

  • Configure the ToF mode
    Some ToF cameras have multiple modes with different measurement ranges. This describes how to obtain the available modes from a device and how to set a mode.

4.1. Access Stream data

4.1.1. Streams and Channels

Each Manager class provides an interface for getting Raw level data, such as ToF camera image data and Mesh vertex information, via scripts. This flow of data is called a Stream. One or more types of data can be stored in a Stream, and each type is stored in a different channel. The relationships between Manager, Stream, and Channel are shown below. For example, the Stream of TofArTofManager has three channels: Depth data, Confidence data, and PointCloud data.

This structure is internal; the application developer can get the latest data at any time by referring to the public data fields of the Manager class, which are automatically updated from each channel.
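
For example, the latest PointCloud data can be read at any time from the public field of TofArTofManager (a minimal sketch; the same PointCloudData field is used in the PointCloud tutorial later in this document):

var pointCloudData = TofArTofManager.Instance.PointCloudData;
if (pointCloudData != null && pointCloudData.Points != null)
{
    // Points holds the latest PointCloud frame as an array of Vector3 values.
    var numberOfPoints = pointCloudData.Points.Length;
}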

ToF AR system diagram showing the relationships between Manager, Stream, and Channel

4.1.2. Event-driven data access

The following events are provided by the Manager classes. Data can be handled in an event-driven way by registering event handlers.

Event name       Description
OnStreamStarted  Stream has started
OnStreamStopped  Stream has stopped
OnFrameArrived   New data generated
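
For example, a minimal sketch (inside a MonoBehaviour) that registers and unregisters a handler for OnFrameArrived, following the pattern used in the PointCloud tutorial later in this document. OnStreamStarted and OnStreamStopped can be registered in the same way, although their handler signatures may differ:

void OnEnable()
{
    // Register the handler; it is invoked each time new ToF data is generated.
    TofArTofManager.OnFrameArrived += OnTofFrameArrived;
}

void OnDisable()
{
    // Unregister the handler to avoid callbacks on an inactive object.
    TofArTofManager.OnFrameArrived -= OnTofFrameArrived;
}

private void OnTofFrameArrived(object stream)
{
    // Handle the new frame here, for example by reading the public data fields of the Manager.
}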

4.1.3. Accessing Manager from User Script

The manager can be referred to from anywhere in the script through the Instance field of the Manager class, if the Manager Prefab is instantiated in the scene.

In the case of TofArTofManager, for example:

var instance = TofArTofManager.Instance;
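
If the Manager Prefab might not be present in the scene, the static Instantiated flag can be checked first, as the sample code later in this document does:

if (TofArTofManager.Instantiated)
{
    var instance = TofArTofManager.Instance;
}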

4.1.4. Starting and ending the Stream

The StartStream() and StopStream() methods are prepared for the Manager class, and streams can be started and ended at any time.
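
For example (a minimal sketch; the parameterless overloads are assumed here):

// Start acquiring data from the ToF camera.
TofArTofManager.Instance.StartStream();

// Stop the Stream when it is no longer needed.
TofArTofManager.Instance.StopStream();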

4.2. Save and play back stream data

4.2.1. Saving stream data

You can save the stream to a file by calling SetProperty with a given RecordProperty input parameter.

RecordProperty input parameter

Member     Type                       Description
Enabled    bool                       true: Save; false: Do not save
Path       string                     File save directory path
Formats    Dictionary<long, string>   Data format list
BufferNum  int                        Number of buffers

Sample code
var recordProperty = new RecordProperty()
{
    Enabled = true,
    Path = recordingPath,
    BufferNum = 5
};
var channelInfo = TofArTofManager.Instance.GetProperty<ChannelInfoProperty>();

foreach (var channel in channelInfo.Channels)
{
    recordProperty.Formats[channel.Key] = "raw";
}

TofArTofManager.Instance.SetProperty(recordProperty);
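
To stop saving, the property can be sent again with Enabled set to false (a sketch based on the Enabled flag described above):

var stopRecording = new RecordProperty()
{
    Enabled = false
};
TofArTofManager.Instance.SetProperty(stopRecording);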

4.2.2. Play back stream data

You can play back saved stream data by calling StartPlayback(string path) from the TofArTofManager and other Manager classes. Set path to the directory the saved data is stored in.

For components such as Hand and Mesh that generate data from ToF data, you can generate and use their data in real time by first starting playback of the saved file in TofArTofManager and then calling StartPlayback() on the corresponding Manager.

Sample code
// Play back from saved files.
TofArTofManager.Instance.StartPlayback(path);

// Generate Hand data from the ToF data being played back.
TofArHandManager.Instance.StartPlayback();

4.3. Use AR Foundation

4.3.1. ARFoundationConnector

Using ARFoundationConnector, you can simultaneously use the functions of AR Foundation, provided by Unity, and the functions of ToF AR.

ARFoundationConnector is an extension that relays Depth, Color, and Body data output from AR Foundation to ToF AR.

Set up AR Foundation
Enable ARFoundationConnector
  1. Place the TofAr.ThirdParty / ARFoundationConnector / Prefabs / ARFoundationConnector prefab into the scene.

  2. In the placed ARFoundationConnector object, set ARFoundationConnectorManager.autoStart to off (deselect Auto Start).

    Screenshot in Unity showing where to deselect Auto Start
  3. If a camera other than ARFoundationConnector / AR Session Origin / AR Camera is in the scene, set the tag of that camera to something other than MainCamera.

  4. Enable the ARKit plugin in Project Settings / XR Plug-in Management

    Screenshot in Unity showing where to enable the ARKit plugin
Prefabs

Prefabs with the following functions are available.

  • ARFoundationConnectorManager
    Relays Depth data, Color data, and so on, output from AR Foundation, and inputs it to ToF AR.
    The application can get the data from TofArTofManager, TofArColorManager, and so on.

  • ToggleARFoundationBodyUsage
    UI for toggling the Body recognition function of AR Foundation.
    When the Body recognition function of AR Foundation is enabled on iOS, the operation profile of AR Foundation is switched to Body Tracking.

  • ToggleARFoundationFaceUsage
    UI for toggling the Face recognition function of AR Foundation.
    When the Face recognition function of AR Foundation is enabled, the operation profile of AR Foundation is switched to Face Tracking.

4.3.2. Debugging applications with AR Foundation

See Debugging applications with AR Foundation under Debug with TofARServer.

4.4. Debug with TofARServer

By executing TofARServer on the device, the data retrieved from the ToF camera can be used to debug applications running on Unity Editor.

Make sure to use the same ToF AR version for both the Unity Editor toolkit and ToF AR Server. The current version is ToF AR v1.4.0.

4.4.1. Setting up TofARServer

Environment settings

When using TofARServer on an Android device, use the adb (Android Debug Bridge) command on a PC where the Android SDK is installed.

If connection to TofARServer is not possible, select Preferences / External Tools in the Unity Editor and make the following settings:

  1. Deselect Android SDK Tools Installed with Unity (recommended).

  2. Specify the path to the installed Android SDK in the SDK path entry field.

TofARServer build and connection settings
  1. Open the TofAR_Server Unity project included in the ToF AR package, and set up ToF AR.
    See Set up ToF AR and note that both Base and Hand components are required by TofARServer.

    If the project is opened before ToF AR is set up, a confirmation message to enter safe mode may appear, depending on your settings. If you start Unity in safe mode, exit safe mode and then import the ToF AR package.
  2. Build the application and install it to the device.

  3. Configure the connection settings between Unity Editor and the device, for example:

    Screenshot in Unity showing where to configure device connection settings
    • Set Debug Server Device to specify the OS of the destination device.
      If the OS is Android, then the adb forward command will be automatically executed, and the connection will be set up during play.

    • Select or deselect Enable network debugging to switch between network connection (selected) and USB connection (deselected).

    • Set the IP address and port if using a network connection.

    • The communication timeout period can be specified in milliseconds in Server Connection Timeout.

      You must restart Unity Editor after changing settings.
  4. Set the server-side receive port in ToF AR Server by tapping the port number on the ToF AR Server screen to edit it.

    You must restart TofARServer after changing settings.
    Screenshot in Unity showing where to set the server-side receive port
Performing debug
  • Click Play in Unity Editor.

4.4.2. Debugging applications with AR Foundation

By importing ARFoundationConnector into TofARServer and building the project, applications that use the data output by AR Foundation can be executed and debugged in Unity Editor.

Setting up ARFoundationConnector for TofARServer
  1. Open the TofARServer Unity project and set up ToF AR and AR Foundation.

  2. Set up ARFoundationConnector.

  3. Open the /Assets/Samples/Scenes/Main.unity scene.

  4. Place /Assets/TofAr.ThirdParty/ARFoundationConnector/Prefabs/ARFoundationConnector into the scene.

  5. In the placed ARFoundationConnector object, set ARFoundationConnectorManager.autoStart to on.

  6. In the placed ARFoundationConnector object, set Dont Destroy On Load (Script) to off (inactive).

    Screenshot in Unity showing where to set Auto Start on
  7. If any camera other than ARFoundationConnector/AR Session Origin/AR Camera is in the scene, set the tag of that camera to something other than "MainCamera".

  8. iOS Only: Place /Assets/TofAr.ThirdParty/ARFoundationConnector/Prefabs/ToggleARFoundationBodyUsage onto DebugServerCanvas within the scene.

    ToggleARFoundationBodyUsage
    UI for toggling the Body recognition function of AR Foundation.
  9. iOS Only: Assign the ARFoundationConnector object within the scene to ToggleARFoundationBodyUsage.ConnectorManager of the placed ToggleARFoundationBodyUsage object.

    image
  10. Place /Assets/TofAr.ThirdParty/ARFoundationConnector/Prefabs/ToggleARFoundationFaceUsage onto DebugServerCanvas within the scene.

    ToggleARFoundationFaceUsage
    UI for toggling the Face recognition function of AR Foundation.
  11. Assign the ARFoundationConnector object within the scene to ToggleARFoundationFaceUsage.ConnectorManager of the placed ToggleARFoundationFaceUsage object.

    image
  12. iOS Only: Install ARKit Face Tracking package from Unity Package Manager. The version should be the same as the ARKit XR Plugin.

  13. Enable the plugin in Project Settings / XR Plug-in Management.

  14. iOS Only: Enable FaceTracking in Project Settings / XR Plug-in Management / ARKit.

  15. Build the application and install it to the device.

When debugging, toggle the ToggleARFoundation buttons added to the TofARServer screen by this procedure before clicking Play in the Unity Editor.

4.5. Configure a ToF AR device profile

4.5.1. Configuring a ToF AR Device

ToF AR defines ToF camera-specific settings for each device. Standard configurations are applied automatically, but for special configurations, the configuration file can be written to the device as follows:

  1. Connect the device and PC via USB. USB debugging must be available on the device.

  2. Open the project that uses ToF AR in Unity.

  3. Open TofArManager in the Inspector.

  4. Assign the appropriate configuration file to Device Profile.

    Screenshot in Unity showing the list of device profiles and the menu item Push to Device
    Configuration files are stored inside the TofAr\TofAr\V0\Resources\DeviceProfiles directory within the Unity project.
  5. Select Push to Device.

4.6. Configure the ToF mode

4.6.1. Configuring the ToF Mode

ToF cameras can have multiple modes, which you can get and set from the properties of TofArTofManager.

4.6.2. Getting a List of ToF Modes

Obtain the list of ToF modes by calling GetProperty() with a CameraConfigurationsProperty. The necessary bufferSize depends on the ToF camera, but we recommend setting it in increments of 8192.

For example:
var configurations = TofArTofManager.Instance.GetProperty(new CameraConfigurationsProperty(), 8192);
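
The returned property can then be inspected to choose a mode. In the sketch below, the member names configurations, width, and height are assumptions for illustration; uid is the identifier used in the next subsection:

// Sketch: list the available ToF modes (member names other than uid are illustrative assumptions).
foreach (var config in configurations.configurations)
{
    Debug.Log($"uid={config.uid}, resolution={config.width}x{config.height}");
}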

4.6.3. Changing the ToF Mode

Change the ToF mode by setting SetConfigurationIdProperty.uid to the uid previously obtained from the ToF mode info, and calling SetProperty().

For example:
TofArTofManager.Instance.SetProperty(new SetConfigurationIdProperty() { uid = TargetUID });

5. Tutorials

This chapter includes the following tutorials:

  • Application displaying Depth image by ToF AR Unity Toolkit

  • Application using Plane detection of AR Foundation (ARCore) and Hand detection of ToF AR

  • Application to display PointCloud data by Particle System

In these tutorials, Unity 2021.3.31f1 is used to create the application. Different versions of Unity may have some differences, for example, in the Unity Editor interface.

5.1. Application displaying Depth image by ToF AR Unity Toolkit

The ToF AR Unity Toolkit comes with a set of Prefabs that can be used to develop applications efficiently. Creating applications to use the basic functions of ToF cameras becomes easier by combining these Prefabs.

This tutorial shows an example of how to create an application that displays the Depth image data from a ToF Camera on the screen, using the ToF AR Unity Toolkit.

Through this procedure, the flow from application creation to debugging is described in the following subsections:

  • Creating the application
    Basic usage of ToF AR Manager Prefab.

  • Executing the application on a mobile device
    Confirming operations of the ToF AR application on a mobile device.

  • Debugging the application
    Debugging the ToF AR application with Unity Editor on a mobile device.

5.1.1. Creating the application

Create applications with Unity Editor.

Preparation

Make sure that ToF AR is already set up in the Unity project, see Set up ToF AR.

The Hand component is not applied in this tutorial, so installation of the Hand Component package is not required.
Procedure
Instantiate TofArManager and TofArTofManager

TofArManager is a class that provides commonly used functions in ToF AR. It is defined as a Prefab. It is always required for applications that use ToF AR.

TofArTofManager is a class that manages the connection with the device and is defined as a Prefab. It acquires data, such as Depth data and Confidence data, from the ToF camera.

  1. Drop TofAr/TofAr/TofArManager from the Project window to the Hierarchy window.

    image
  2. Drop TofAr/TofArTof/TofArTofManager from the Project window to the Hierarchy window.

    image
Set properties of TofArTofManager

Set TofArTofManager to start acquiring data from the ToF camera automatically when the application is started.

  1. Select TofArTofManager in the Hierarchy window.

  2. Select Auto Start in the Inspector window.

    image
Set up the visualization of the acquired data

Instantiate DepthViewRawImage to display the acquired data. DepthViewRawImage is a class that automatically displays the Depth data and is defined as a Prefab. When placed in the scene, it accesses the Stream of the TofArTofManager instance, which is a singleton, to acquire Depth data, and automatically links to the start/end of the TofArTofManager Stream to display it.

  1. Select UI / Panel from the Create menu in the Hierarchy window. A Panel will be instantiated and placed under the Canvas.

    image
  2. Select Panel in the Hierarchy window.

  3. In the Inspector window, set Anchor Presets of Rect Transform so that Panel covers the whole screen. By default, the display is set to full screen.

  4. Drop TofAr/TofArTof/V0/Resources/Prefabs/DepthViewRawImage from the Project window onto Panel in the Hierarchy window. This will add DepthViewRawImage as a child object of the Panel.

    image
  5. Select DepthViewRawImage in the Hierarchy window.

    image
  6. If necessary, use Rect Transform and other tools in the Inspector window to modify the display layout of the image.

5.1.2. Executing the application

Build the application into a format that can be executed on the target device, then install the executable file on the device, and execute it.

Preparation

Connect the development PC to the target device via USB.

Procedure
Select a device to execute the application
  1. Select File / Build Settings…​ to open the Build Settings window.

  2. Select the platform (OS) of the target device to run the application, either Android or iOS, and then select Switch Platform.

    image
Configure the application package settings
  1. Select Player Settings…​.

  2. Make any settings required for the application package you created, such as Package Name, under PlayerSettings in the Inspector.

    image
Execute the application
  1. Click Build And Run to build the application, and install it on the device.

    At this time, for Android, the application will be installed directly on the device. For iOS, however, an Xcode project will be output. Make settings, such as Application signature, in Xcode, and then click Build And Run.

    image
  2. When the installation is completed, the application will be executed on the device.

    image

5.1.3. Debugging the application

ToF AR applications can be run and debugged in Unity Editor by connecting a PC (Windows 10 or macOS 11) and the target device via USB or network. In this case, building the application is not necessary.

In this section, debugging the application for Android devices is described.
image
Preparation

Connect the development PC to the target device via USB, and make sure that the device is set to enable USB debugging.

Environment settings

When using TofARServer on an Android device, use the adb (Android Debug Bridge) command of the Android SDK installed on the PC.

If connection to TofARServer is not possible, check under External Tools in the Preferences menu of Unity Editor and make the following settings:

  1. Deselect Android SDK Tools Installed with Unity (recommended).

  2. Specify the path to the installed Android SDK in the SDK path entry field.

Connection settings
  1. In the Inspector of TofArManager, configure the connection settings between Unity Editor and the device, for example:

    image
    • Set Debug Server Device to specify the OS of the destination device.
      If the OS is Android, then the adb forward command will be automatically executed, and the connection will be set up during play.

    • Select or deselect Enable network debugging to switch between network connection (selected) and USB connection (deselected).

    • Set the IP address and port if using a network connection.

    • The communication timeout period can be specified in milliseconds in Server Connection Timeout.

      You must restart Unity Editor after changing settings.
  2. Set the server-side receive port in ToF AR Server by tapping the port number on the ToF AR Server screen to edit it.

    You must restart TofARServer after changing settings.
    image
Procedure
Perform debugging in Unity Editor
  • Click Play in Unity Editor.

By connecting the TofARServer Stream, running on the device, and the Application Stream running on Unity Editor, the application can obtain the image data. This connection makes debugging while acquiring data from the device possible.

5.2. Application using Plane detection of AR Foundation (ARCore) and Hand detection of ToF AR

Using ARFoundationConnector, we can create applications that use both the AR Foundation functions provided by Unity and the ToF AR functions.

This tutorial describes a procedure to create applications on Android with the following functions:

  • Detect a plane using AR Foundation functions, and display the detected plane on the screen.

  • Display a cube at the position on the detected plane displayed on the screen.

  • Recognize hands displayed on the screen using ToF AR functions, and display Skeleton hands for them.

  • Touch, tilt, and move the cube placed on the plane using the Skeleton hands.

5.2.1. Build settings

Configure the environment for the application build.

Preparation

Make sure that the ToF AR package is set up in this Unity project.

To set it up, see Set up ToF AR.

Make sure that the AR Foundation package is set up in this Unity project. To set it up, see Setting up AR Foundation.

Procedure
Select a device to execute the application
  1. Select File / Build Settings…​ to open the Build Settings window.

  2. Select the platform (OS) of the target device to run the application. For this tutorial, select Android, and then select Switch Platform.

    image
Configure the build settings
  1. Select Player Settings…​ in the Build Settings window.

  2. Select Player settings in the Project Settings window to configure the following settings.

  3. If Vulkan is displayed under Other Settings / Graphics APIs, then select Vulkan and delete it by clicking -.

    ARCore does not support Vulkan.
    image
  4. Set Other Settings / Identification / Minimum API Level to "Android 7.0" or a higher version.

    image
  5. Set Other Settings / Configuration / Scripting Backend to "IL2CPP".

  6. Set Other Settings / Configuration / API Compatibility Level to ".NET 4.x".

  7. Under Other Settings / Configuration / Target Architectures, select "ARM64".

    image
  8. Select Publishing Settings / Build / Custom Main Manifest. As a result, the Assets\Plugins\Android\AndroidManifest.xml file is automatically generated.

    • Edit the generated AndroidManifest.xml file and set attribute android:extractNativeLibs="true" in the <application> tag.

      image
  9. Under XR Plug-in Management in the Project Settings window, select ARCore.

    image

5.2.2. Creating the application

Create an application with AR Foundation and ToF AR on Unity Editor.

The part of this tutorial under Deploy AR Foundation Prefabs and scripts with ARFoundationConnector describes the simultaneous use of AR Foundation and ToF AR. By deploying AR Foundation Prefabs and scripts under AR Session Origin of ARFoundationConnector, AR Foundation and ToF AR functions can be used simultaneously.

Preparation

Make sure that ToF AR and AR Foundation packages are set up in this Unity project.

Procedure
Deploy ToF AR Manager Prefabs
Instantiate TofArManager

TofArManager is a class that provides commonly used functions in ToF AR and is defined as a Prefab. TofArManager is always required for applications that use ToF AR.

  1. Deploy TofAr/TofAr/TofArManager from the Project window onto the scene in the Hierarchy window.

    image
Instantiate ToF AR Manager Prefabs that will be used in this application
  1. Deploy TofAr/TofArTof/TofArTofManager from the Project window onto the scene in the Hierarchy window. TofArTofManager is a class that manages the connection with the device.

  2. Deploy TofAr/TofArColor/TofArColorManager from the Project window onto the scene in the Hierarchy window. TofArColorManager is a class that manages the connection with the RGB camera.

  3. Deploy TofAr/TofArHand/TofArHandManager from the Project window onto the scene in the Hierarchy window. TofArHandManager is a class that manages the connection with the ToF AR Hand Components.

    image
Deploy AR Foundation Prefabs and scripts with ARFoundationConnector
Instantiate ARFoundationConnector

Use ARFoundationConnector to use ToF AR functions and AR Foundation functions provided by Unity at the same time.

  1. Deploy TofAr.ThirdParty/ARFoundationConnector/Prefabs/ARFoundationConnector from the Project window onto the scene in the Hierarchy window.

    image
  2. Deselect Auto Start of ARFoundationConnectorManager, to turn it off, in the Inspector window of the deployed ARFoundationConnector.

    image
  3. Delete Main Camera in the Hierarchy window, because ARFoundationConnector provides its own AR Camera.

Add AR Plane Manager Component to AR Session Origin

By adding the AR Plane Manager Component to AR Session Origin under ARFoundationConnector, detecting the AR plane becomes possible.

  1. Select AR Session Origin in the Hierarchy window.

  2. Select Add Component at the bottom of the Inspector window. The Search window is displayed.

  3. Search for AR Plane Manager, select it, and add it.

    image
Set AR Plane Manager

By setting AR Default Plane to AR Plane Manager, the AR plane will be displayed when an AR plane is detected.

  1. Create a Prefabs folder inside the Assets folder displayed in the Project window.

  2. Select GameObject / XR / AR Default Plane from the top menu of Unity Editor, and add it to the scene in the Hierarchy window.

    image
  3. Deploy AR Default Plane from the Hierarchy window to the Prefabs folder, to convert it to a Prefab.

    image
  4. Delete the original AR Default Plane from the Hierarchy window.

  5. Select AR Session Origin in the Hierarchy window. Drag the AR Default Plane Prefab created in the previous steps, and set the Prefab to Plane Prefab for AR Plane Manager displayed in the Inspector window.

    image
  6. Set Detection Mode of AR Plane Manager to Horizontal in the Inspector window.

    image
Add AR Raycast Manager component to AR Session Origin

Use AR Raycast Manager to identify the position on AR plane touched on the screen.

  1. Select AR Session Origin in the Hierarchy window.

  2. Select Add Component at the bottom of the Inspector window. The Search window is displayed.

  3. Search for AR Raycast Manager, select it, and add it.

    image
Add a function to display a Cube on the AR plane

Create a C# script to place a Cube at the position on the AR plane touched on the screen, and add the script to AR Session Origin as a Component.

  1. Create a Scripts folder inside the Assets folder displayed in the Project window.

  2. Select AR Session Origin in the Hierarchy window.

  3. Click Add Component at the bottom of the Inspector window, and create a script named "ObjectPlacement" from the New script menu.

    image
  4. Move the created script to the Scripts folder in the Project window.

  5. Select Edit Script in the Inspector window, and open ObjectPlacement.cs added to AR Session Origin.

    image
  6. Implement the script as below, and save it.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    
    public class ObjectPlacement : MonoBehaviour
    {
        public Camera MainCamera;
        public GameObject SpawnObjectPrefab;
        public ARRaycastManager arRaycastManager;
    
        public void Update()
        {
            Touch touch;
            if (Input.touchCount > 0 && (touch = Input.GetTouch(0)).phase == TouchPhase.Began)
            {
                List<ARRaycastHit> hitResults = new List<ARRaycastHit>();
    
                if (arRaycastManager.Raycast(touch.position, hitResults, UnityEngine.XR.ARSubsystems.TrackableType.PlaneWithinPolygon))
                {
                    foreach (ARRaycastHit hit in hitResults)
                    {
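                        // Only place the object when the camera is on the upper (front) side of the detected plane.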
                        if (Vector3.Dot(MainCamera.transform.position - hit.pose.position, hit.pose.up) > 0)
                        {
                            // Instantiate a new game object on the hit plane
                            Vector3 position = hit.pose.position;
                            position.y += 0.15f;
                            var planeObject = Instantiate(SpawnObjectPrefab, position, hit.pose.rotation);
                        }
                    }
                }
            }
        }
    }
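
    Note that this script uses the legacy Input class (Input.touchCount and Input.GetTouch); if Active Input Handling in the Player settings is set to the new Input System only, touches will not be detected.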
Set the Cube to display on AR plane

Make settings for the ObjectPlacement Component created above. Set the size of Cube to display, and add Rigidbody functions for collision detection for the Cube. Then add AR Camera and AR Raycast Manager to identify the position on the AR plane touched on the screen.

  1. Select GameObject / 3D Object / Cube from the top menu of Unity Editor, and add it onto the scene in the Hierarchy window.

    image
  2. Select Cube in the Hierarchy window, and set its Transform attributes in the Inspector window according to the figure below:

    image
  3. Select Add Component at the bottom of the Inspector window. The Search window is displayed.

  4. Search for Rigidbody, select it, and add it.

    image
  5. Deploy Cube from the Hierarchy window to the Prefabs folder to convert it to a Prefab.

  6. Delete the original Cube from the Hierarchy window.

  7. Set the Properties of ObjectPlacement added to AR Session Origin in the Inspector window as follows:

    • Main Camera: Set the AR Camera from the Hierarchy window.

    • Spawn Object Prefab: Set the Cube Prefab moved to the Prefabs folder.

    • Ar Raycast Manager: Set the AR Session Origin from the Hierarchy window.

      image
Set the skeleton hand to be displayed
Perform display settings of the Skeleton hand

Make settings for recognizing hands in the displayed image, and display Skeleton hands for them.

  1. Select AR Camera in the Hierarchy window, and then select GameObject / Create Empty Child from top menu of Unity Editor. Add the GameObject to AR Camera in the Hierarchy window.

    image
  2. Select the new GameObject added to AR Camera, and change its name to "RelativePositionContainer".

    image
  3. Deploy TofAr/TofArHand/V0/Resources/Prefabs/HandModel from the Project window onto RelativePositionContainer in the Hierarchy window.

    image
  4. Select the HandModel deployed onto RelativePositionContainer, and change its name to "RightHandModel" in the Inspector window. Change the Lr Hand setting of HandModel to "Right Hand".

    image
  5. Deploy a second TofAr/TofArHand/V0/Resources/Prefabs/HandModel from the Project window onto RelativePositionContainer in the Hierarchy window.

  6. Select the HandModel deployed onto RelativePositionContainer, and change its name to "LeftHandModel" in the Inspector window. Change the Lr Hand setting of HandModel to "Left Hand".

    image
  7. Deploy TofAr/TofArHand/V0/Resources/Prefabs/HandCollider from the Project window onto RelativePositionContainer in the Hierarchy window.

    image
  8. Select the HandCollider deployed onto RelativePositionContainer, and change its name to "RightHandCollider" in the Inspector window.
    Change the Lr Hand setting of HandCollider to "Right Hand".
    Check that all parameters of Position under Transform are set to 0.

    image
  9. Deploy a second TofAr/TofArHand/V0/Resources/Prefabs/HandCollider from the Project window onto RelativePositionContainer in the Hierarchy window.

  10. Select the new HandCollider deployed onto RelativePositionContainer, and change its name to "LeftHandCollider" in the Inspector window.
    Change the Lr Hand setting of HandCollider to "Left Hand".
    Check that all parameters of Position under Transform are set to 0.

    image
  11. Add the "Reorient Relative Color Camera" Component to RelativePositionContainer, to align orientations of the screen and the Skeleton hand displayed.

    image
Perform AR Camera settings
  1. Select AR Camera in the Hierarchy window, and Set Target Texture to "None" in the Inspector window.

    image

5.2.3. Executing the application

Build the application into a format that can be executed on the target device.
Install the executable file on the device and execute it.

Preparation

Connect the development PC to the target device via USB.

Procedure
Build and execute the application on the device
  • Click Build And Run.

The application will be built and installed on the device. For Android applications, it will be installed directly on the device. After the installation is completed, the application will be executed on the device.

image

5.3. Application to display PointCloud data by Particle System

This tutorial describes creating an application that displays PointCloud data acquired by ToF AR, using the Particle System in Unity. While creating this application, the method of using the ToF AR data and Toolbar provided in TofAR_Samples_Basic is described.

This tutorial assumes that the application will run on a supported iOS device. The application has the following functions:

  • Display PointCloud data by Particle System.

  • Display particles as a colored image, with colors based on the position of each PointCloud point.

  • Display settings can be configured by GUI.

    image

5.3.1. Build settings

Configure the environment for the application build.

Preparation

Make sure that TofAR_Base_xxxx_iOS.unitypackage is set up in this Unity project. To set it up, see Set up ToF AR. This tutorial uses the Base component package only.

Make sure that the TofAR_Samples_Basic project, which includes the ToF AR sample programs, is set up in this Unity project. Set this up as follows:

  1. Copy the TofArSamplesBasic directory and the TofArSettings directory from the Assets directory of the TofAR_Samples_Basic Project to the Assets directory of this project.

  2. Delete the Hand directory and the MarkRecog directory from the TofArSamplesBasic directory.

  3. Delete the Scripts/Hand directory and the Scripts/MarkRecog directory in the TofArSettings directory.

Procedure
Select a device to execute the application
  1. Select File / Build Settings…​ to open the Build Settings window.

  2. Select the platform (OS) of the target device to run the application. For this tutorial, select iOS, and then select Switch Platform.

    image
Configure the build settings
  1. From the Unity Menu, select File / Build Settings…​ / Player Settings / iOS Settings.

  2. Set Other Settings / Configuration / Api Compatibility Level to ".NET 4.x".

  3. Set Other Settings / Configuration / Camera Usage Description to "ToF Camera" or any character string describing the ToF camera.

    image

5.3.2. Creating the application

Create an application that displays PointCloud data, using Particle System in Unity and ToF AR.

Preparation

Make sure that ToF AR packages are set up in this Unity project.

Procedure
Create a function to display PointCloud data

Create the function to display the PointCloud data acquired by a ToF camera, using the Particle System.

Instantiate TofArManager

TofArManager is a class that provides commonly used functions in ToF AR and is defined as a Prefab. TofArManager includes functions to acquire PointCloud data from the ToF camera. TofArManager is always required for applications that use ToF AR.

  1. Deploy TofAr/TofAr/TofArManager from the Project window onto the scene in the Hierarchy window.

    image
Instantiate TofArTofManager

TofArTofManager is a class that manages the connection with device and is defined as a Prefab. It acquires data, such as Depth data and Confidence data, from the ToF camera. In this application, TofArTofManager is used to acquire PointCloud data.

  1. Deploy TofAr/TofArTof/TofArTofManager from the Project window onto the scene in the Hierarchy window.

    image
  2. Select TofArTofManager in the Hierarchy window.

  3. Deselect Auto Start in the Inspector window so that processing will not start automatically when this application is started.

    image
Make settings for ToF data

Set the FPS value and Delay value of ToF data to be acquired from the ToF camera, and make settings for the recording and playback functions. This uses TofController as a Prefab in TofArSettings to control ToF data via TofArTofManager.

  1. Select GameObject / Create Empty from the top menu of Unity Editor, and deploy a new GameObject onto the scene in the Hierarchy window.

  2. Select the deployed GameObject, and rename it "Controllers" in the Inspector window.

  3. Deploy TofArSettings/Prefabs/Controllers/TofController from the Project window onto Controllers in the Hierarchy window.

    image
Display settings by Particle System

Deploy the Particle System in Unity and set the display settings for PointCloud data.

  1. Select GameObject / Effect / Particle System from the top menu of Unity Editor, and deploy it onto the scene in the Hierarchy window.

    image
  2. Select the deployed Particle System, and rename it to "PointCloud2ParticleSystem" in the Inspector window.

  3. Set the Transform component properties like the image below:

    image
  4. Set the Particle System main module of PointCloud2ParticleSystem in the Inspector window like the image below:

    image
  5. Turn off everything except the Particle System Renderer module of PointCloud2ParticleSystem in the Inspector window.

    image
  6. Set the Particle System Renderer module of PointCloud2ParticleSystem in the Inspector window as in the image below:

    image
Add a script component that assigns each PointCloud point to a Particle and sets it in the Particle System

At this stage, we can display PointCloud data three-dimensionally.

  1. Select Add Component at the bottom of the Inspector window, and then create and add a new script named "PointCloud2ParticleSystem" from the New script menu.

    image
  2. In the Inspector window, select Edit Script, and open the new PointCloud2ParticleSystem.cs file added to the PointCloud2ParticleSystem object.

    image
  3. Implement a script to create Particles based on PointCloud data and set them in Particle System.

    The details of the script are described below. Refer to PointCloud2ParticleSystem.cs for the full script.

    1. Define the variables necessary for processing.

      private ParticleSystem particleSystem;
      private ParticleSystem.Particle[] particles;
      private object sync = new object();
      private bool updated = false;
      private Vector3[] vs;

      These variables are used as:

      • particleSystem - components of ParticleSystem

      • particles - array of Particle data to set PointCloud data

      • sync - object for exclusive processing

      • updated - flag to indicate update

      • vs - array of acquired PointCloud data

    2. OnEnable() is called when the object becomes active, and sets the callback for receiving frame data.

      void OnEnable()
      {
          TofArTofManager.OnFrameArrived += OnTofFrameArrived;
      }
    3. OnDisable() is called when the object becomes inactive, and unregisters the callback for receiving frame data.

      private void OnDisable()
      {
          TofArTofManager.OnFrameArrived -= OnTofFrameArrived;
      }
    4. Start() is called at the start of the script, and gets the ParticleSystem component.

      void Start()
      {
          particleSystem = GetComponent<ParticleSystem>();
      }
    5. OnTofFrameArrived is called every time ToF data is received. It gets PointCloud data from TofArTofManager and sets it in the vs array. It also sets the updated flag to true after setting ToF data.

      private void OnTofFrameArrived(object stream)
      {
          if (!TofArTofManager.Instantiated)
          {
              return;
          }
      
          var pointCloudData = TofArTofManager.Instance.PointCloudData;
          if (pointCloudData == null || pointCloudData.Points == null)
          {
              return;
          }
      
          lock (sync)
          {
              vs = pointCloudData.Points;
          }
          updated = true;
      }
    6. Update() is called for every Unity frame. If the updated flag is true, it generates the array of Particles, computes an RGB color for each Particle from the PointCloud data stored in vs, and sets the generated Particles in the Particle System.

      void Update()
      {
          if (updated)
          {
              updated = false;
      
              lock (sync)
              {
                  particles = new ParticleSystem.Particle[vs.Length];
                  particleSystem.GetParticles(particles);
      
                  for (int i = 0; i < vs.Length; i++)
                  {
                      particles[i].position = vs[i];
      
                      float r = Mathf.Min(1, Mathf.Max(0, (vs[i].z) / 2));
                      float g = Mathf.Min(1, Mathf.Max(0, (vs[i].y + 1f) / 2));
                      float b = Mathf.Min(1, Mathf.Max(0, (vs[i].x + 1f) / 2));
      
                      particles[i].startColor = new Color(r, g, b);
                      particles[i].startSize = 0.02f;
                  }
      
                  particleSystem.SetParticles(particles);
              }
          }
      }
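
      In this color mapping, the clamped expressions map z values of roughly 0 m to 2 m and x and y values of roughly -1 m to +1 m into the 0 to 1 color range, so each particle's color reflects its position.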
  4. Save the script.

  5. Select Add Component at the bottom of the Inspector window. The Search window is displayed.

  6. Search for Reorient With Device, select it, and add it.

    image
Generating function setting GUI

Display a toolbar at the edge of the screen, and deploy buttons so that the user can make settings for the ToF camera, recording, playback functions, and so on.

image

We place the GUI parts that provide functions such as the Toolbar, and make the necessary settings. The Settings Prefab provided in TofArSettings includes GUI parts such as the Toolbar Prefab. Instantiate this Settings Prefab and make settings so that various functions can be used, including ToF camera settings, Record Settings, Rec Player Settings, and General Settings. The Toolbar Prefab has buttons that activate various functions by default. Deactivate the functions not needed for this tutorial.

  1. Deploy TofArSettings/Prefabs/Settings from the Project window onto the scene in the Hierarchy window.

    image
  2. Select Settings / Canvas / SafeArea / Panels / CameraSettings in the Hierarchy window and then, in the Inspector window, under Camera Settings and then Use Component, select Tof.

    image
  3. Select Settings / Canvas / SafeArea / Panels / RecordSettings in the Hierarchy window and then, in the Inspector window, under Record Settings and then Use Component, select Tof.

    image
  4. Select Settings / Canvas / SafeArea / Panels / RecPlayerSettings in the Hierarchy window and then, in the Inspector window, under Rec Player Settings and then Use Component, select Tof.

    image
  5. Select Settings / Canvas / SafeArea / Panels / GeneralSettings in the Hierarchy window and then, in the Inspector window, under General Settings and then Use Component, select Tof.

    image
  6. Select Settings / Canvas / SafeArea / DebugInfo / FpsPanel in the Hierarchy window, and then select it to activate the FpsPanel object in the Inspector window, as shown below:

    image
  7. Select Settings / Canvas / SafeArea / DebugInfo / FpsPanel / TxFpsTof in the Hierarchy window, and then select it to activate the TxFpsTof object in the Inspector window, as shown below:

    image
  8. If any other objects in FpsPanel are active, then deselect them to deactivate those objects.

ToF camera Settings

Make settings to change the orientation of the displayed image according to the orientation of the screen or the user’s operation. This will mean that the viewpoint of the displayed image can be changed by changing the orientation of the ToF camera screen, or by touching the screen and sliding your finger.

Set Main Camera setting
  1. Select Main Camera in the Hierarchy window and set the Transform component in the Inspector window as follows:

    image
  2. Also in the same Inspector window, set the Camera component as follows:

    image
  3. Select Add Component at the bottom of the Inspector window. The Search window is displayed.

  4. Search for Camera Controller, select it, and add it.

    image
  5. Select the Camera Controller Component in the Inspector window to activate it.

    image
Light Settings
  1. Select Directional Light in the Hierarchy window, and then, in the Inspector window, set Mode to "Realtime" for the Light component.

    image

5.3.3. Executing the application

Build the application into a format that can be executed on the target device.
Install the executable file on the device and execute it.

Preparation

Connect the development PC to the target device via USB.

Procedure
  1. Click Build And Run to build the application. For iOS, an Xcode project will be output.

  2. Configure settings such as the application signature in Xcode, and then click Build And Run to build the application and install it on the device. After installation is completed, the application will run on the device.

    An outline of the functions is described below:

    • Select the Camera button at the bottom left of the screen to select the ToF camera to use and to set its Depth data size.

      image
    • Select the Record button to record the displayed image.

      image
    • Select the Play button to play back the recorded image.

      image
    • Select the Settings button to open the panel for general setting functions.

      image
    • Touch and swipe on the screen to rotate the displayed particle data three-dimensionally.

      image

6. List of devices supported by ToF AR

This is a list of devices supported by the ToF AR SDK. This list also includes the camera IDs and default settings for supported devices.

6.1. List of SDK compatible devices

ModelGroup | Model | Json file
Sony Xperia 1 II | SOG01 | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 II | SO-51A | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 II | XQ-AT42 | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 II | XQ-AT51 | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 II | XQ-AT52 | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 II | XQ-AT72 | sony_xperia1-2_v1.4.0.json
Sony Xperia 1 III | A101SO | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | SO-51B | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | SOG03 | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | XQ-BC42 | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | XQ-BC52 | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | XQ-BC62 | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 III | XQ-BC72 | sony_xperia1-3_v1.4.0.json
Sony Xperia 1 IV | A201SO | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | SO-51C | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | SOG06 | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | XQ-CT44 | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | XQ-CT54 | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | XQ-CT62 | sony_xperia1-4_v1.4.0.json
Sony Xperia 1 IV | XQ-CT72 | sony_xperia1-4_v1.4.0.json
Sony Xperia PRO | XQ-AQ52 | sony_xperiaPRO_v1.4.0.json
Sony Xperia PRO | XQ-AQ62 | sony_xperiaPRO_v1.4.0.json
Sony Xperia PRO I | XQ-BE42 | sony_xperiaPRO-I_v1.4.0.json
Samsung Galaxy Note10+ | SC-01M | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SCV45 | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N9750 | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N9750/DS | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975C | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975N | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975U | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975U1 | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975W | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975F | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N975F/DS | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N976B | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N976N | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N9760 | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N976Q | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N976V | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy Note10+ | SM-N976U | samsung_galaxy_note_10+_v1.4.0.json
Samsung Galaxy S10 5G | SC-03L | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SCV41 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973F | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973N | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9730 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9738 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973C | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973U | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973U1 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G973W | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G977B | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G977N | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G977P | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G977T | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G977U | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G770F | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G770U1 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SC-04L | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SCV42 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G975F | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G975N | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9750 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9758 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G975U | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G975U1 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G975W | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SC-05L | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G970F | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G970N | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9700 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9708 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G970U | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G970U1 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G970W | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S10 5G | SM-G9770 | samsung_galaxy_s10_v1.4.0.json
Samsung Galaxy S20 Ultra | SCG03 | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G9880 | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988N | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988Q | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988U | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988U1 | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988W | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Ultra | SM-G988B | samsung_galaxy_s20_ultra_v1.4.0.json
Samsung Galaxy S20 Plus 4G | SM-G985 | samsung_galaxy_s20_plus_4G_v1.4.0.json
Samsung Galaxy S20 Plus 4G | SM-G985F | samsung_galaxy_s20_plus_4G_v1.4.0.json
Samsung Galaxy S20 Plus 4G | SM-G985F/DS | samsung_galaxy_s20_plus_4G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SC-52A | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SCG02 | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986 | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G9860 | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986N | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986U | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986U1 | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986W | samsung_galaxy_s20_plus_5G_v1.4.0.json
Samsung Galaxy S20 Plus 5G | SM-G986B/DS | samsung_galaxy_s20_plus_5G_v1.4.0.json
Huawei P30 Pro | HW-02L | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-AL00 | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-AL10 | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-L04 | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-L09 | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-L29 | huawei_p30_pro_v1.4.0.json
Huawei P30 Pro | VOG-TL00 | huawei_p30_pro_v1.4.0.json
Huawei P40 Pro | ELS-NX9 | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro | ELS-AN00 | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro | ELS-TN00 | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro | ELS-N04 | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro | ELS-N29 | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro | ELS-N29D | huawei_p40_pro_v1.4.0.json
Huawei P40 Pro+ | ELS-AN10 | huawei_p40_pro_plus_v1.4.0.json
Huawei P40 Pro+ | ELS-TN10 | huawei_p40_pro_plus_v1.4.0.json
Huawei P40 Pro+ | ELS-N39 | huawei_p40_pro_plus_v1.4.0.json
Huawei Mate30 Pro | TAS-AN00 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | TAS-TN00 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-N29 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-L09 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-AL00 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-TL00 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-AL10 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-TL10 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-AN00P | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate30 Pro | LIO-AN00 | huawei_mate30_pro_5g_v1.4.0.json
Huawei Mate40 Pro | NOH-NX9 | huawei_mate40_pro_v1.4.0.json
Huawei Mate40 Pro | NOH-AN00 | huawei_mate40_pro_v1.4.0.json
Huawei Mate40 Pro+ | NOP-NX9 | huawei_mate40_pro_plus_v1.4.0.json
Huawei Mate40 Pro+ | NOP-AN00 | huawei_mate40_pro_plus_v1.4.0.json
Huawei Honor V20 | PCT-TL10 | huawei_honor_v20_v1.4.0.json
Huawei Honor V20 | PCT-AL10 | huawei_honor_v20_v1.4.0.json
Huawei Honor V20 | PCT-L29 | huawei_honor_v20_v1.4.0.json
Honor Magic3 Pro | ELZ-AN10 | honor_magic3_pro_v1.4.0.json
Honor Magic3 Pro+ | ELZ-AN20 | honor_magic3_pro_plus_v1.4.0.json
Honor Magic5 Pro | PGT-N19 | honor_magic5_pro_v1.4.0.json
Apple iPhone X | iPhone10,3 | iphone_X_v1.4.0.json
Apple iPhone X | iPhone10,6 | iphone_X_v1.4.0.json
Apple iPhone XS | iPhone11,2 | iphone_XS_v1.4.0.json
Apple iPhone XS Max | iPhone11,4 | iphone_XS_max_v1.4.0.json
Apple iPhone XS Max | iPhone11,6 | iphone_XS_max_v1.4.0.json
Apple iPhone XR | iPhone11,8 | iphone_XR_v1.4.0.json
Apple iPhone 11 | iPhone12,1 | iphone_11_v1.4.0.json
Apple iPhone 11 Pro | iPhone12,3 | iphone_11_pro_v1.4.0.json
Apple iPhone 11 Pro Max | iPhone12,5 | iphone_11_pro_max_v1.4.0.json
Apple iPhone 12 mini | iPhone13,1 | iphone_12_mini_v1.4.0.json
Apple iPhone 12 | iPhone13,2 | iphone_12_v1.4.0.json
Apple iPhone 12 Pro | iPhone13,3 | iphone_12_pro_v1.4.0.json
Apple iPhone 12 Pro Max | iPhone13,4 | iphone_12_pro_max_v1.4.0.json
Apple iPhone 13 mini | iPhone14,4 | iphone_13_mini_v1.4.0.json
Apple iPhone 13 | iPhone14,5 | iphone_13_v1.4.0.json
Apple iPhone 13 Pro | iPhone14,2 | iphone_13_pro_v1.4.0.json
Apple iPhone 13 Pro Max | iPhone14,3 | iphone_13_pro_max_v1.4.0.json
Apple iPhone 14 | iPhone14,7 | iphone_14_v1.4.0.json
Apple iPhone 14 Plus | iPhone14,8 | iphone_14_plus_v1.4.0.json
Apple iPhone 14 Pro | iPhone15,2 | iphone_14_pro_v1.4.0.json
Apple iPhone 14 Pro Max | iPhone15,3 | iphone_14_pro_max_v1.4.0.json
Apple iPhone 15 | iPhone15,4 | iphone_15_v1.4.0.json
Apple iPhone 15 Plus | iPhone15,5 | iphone_15_plus_v1.4.0.json
Apple iPhone 15 Pro | iPhone16,1 | iphone_15_pro_v1.4.0.json
Apple iPhone 15 Pro Max | iPhone16,2 | iphone_15_pro_max_v1.4.0.json
Apple iPad Pro 11-inch(1st generation) | iPad8,1 | ipad_pro_11_gen1_v1.4.0.json
Apple iPad Pro 11-inch(1st generation) | iPad8,2 | ipad_pro_11_gen1_v1.4.0.json
Apple iPad Pro 11-inch(1st generation) | iPad8,3 | ipad_pro_11_gen1_v1.4.0.json
Apple iPad Pro 11-inch(1st generation) | iPad8,4 | ipad_pro_11_gen1_v1.4.0.json
Apple iPad Pro 11-inch(2nd generation) | iPad8,9 | ipad_pro_11_gen2_v1.4.0.json
Apple iPad Pro 11-inch(2nd generation) | iPad8,10 | ipad_pro_11_gen2_v1.4.0.json
Apple iPad Pro 11-inch(3rd generation) | iPad13,4 | ipad_pro_11_gen3_v1.4.0.json
Apple iPad Pro 11-inch(3rd generation) | iPad13,5 | ipad_pro_11_gen3_v1.4.0.json
Apple iPad Pro 11-inch(3rd generation) | iPad13,6 | ipad_pro_11_gen3_v1.4.0.json
Apple iPad Pro 11-inch(3rd generation) | iPad13,7 | ipad_pro_11_gen3_v1.4.0.json
Apple iPad Pro 11-inch(4th generation) | iPad14,3 | ipad_pro_11_gen4_v1.4.0.json
Apple iPad Pro 11-inch(4th generation) | iPad14,4 | ipad_pro_11_gen4_v1.4.0.json
Apple iPad Pro 12.9-inch(3rd generation) | iPad8,5 | ipad_pro_12_9_gen3_v1.4.0.json
Apple iPad Pro 12.9-inch(3rd generation) | iPad8,6 | ipad_pro_12_9_gen3_v1.4.0.json
Apple iPad Pro 12.9-inch(3rd generation) | iPad8,7 | ipad_pro_12_9_gen3_v1.4.0.json
Apple iPad Pro 12.9-inch(3rd generation) | iPad8,8 | ipad_pro_12_9_gen3_v1.4.0.json
Apple iPad Pro 12.9-inch(4th generation) | iPad8,11 | ipad_pro_12_9_gen4_v1.4.0.json
Apple iPad Pro 12.9-inch(4th generation) | iPad8,12 | ipad_pro_12_9_gen4_v1.4.0.json
Apple iPad Pro 12.9-inch(5th generation) | iPad13,8 | ipad_pro_12_9_gen5_v1.4.0.json
Apple iPad Pro 12.9-inch(5th generation) | iPad13,9 | ipad_pro_12_9_gen5_v1.4.0.json
Apple iPad Pro 12.9-inch(5th generation) | iPad13,10 | ipad_pro_12_9_gen5_v1.4.0.json
Apple iPad Pro 12.9-inch(5th generation) | iPad13,11 | ipad_pro_12_9_gen5_v1.4.0.json
Apple iPad Pro 12.9-inch(6th generation) | iPad14,5 | ipad_pro_12_9_gen6_v1.4.0.json
Apple iPad Pro 12.9-inch(6th generation) | iPad14,6 | ipad_pro_12_9_gen6_v1.4.0.json
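
If an application needs to check at run time whether it is running on one of the devices listed above, Unity's SystemInfo.deviceModel can be compared against the Model column: on iOS it returns the machine identifier (for example iPhone13,3), and on Android it typically contains the model code (for example XQ-AT52). The sketch below is illustrative only; the model strings are a small sample copied from the table, not a complete list.

using System.Linq;
using UnityEngine;

// Illustration only: check whether the current device model appears in a small,
// hand-maintained subset of the compatible device list above.
public class CompatibleDeviceCheck : MonoBehaviour
{
    // Example model codes copied from the table; extend as needed.
    private static readonly string[] exampleModels =
    {
        "XQ-AT52",      // Sony Xperia 1 II
        "SM-G988B",     // Samsung Galaxy S20 Ultra
        "iPhone13,3",   // Apple iPhone 12 Pro
    };

    void Start()
    {
        string model = SystemInfo.deviceModel;
        bool listed = exampleModels.Any(m => model.Contains(m));
        Debug.Log($"Device model '{model}' {(listed ? "is" : "is not")} in the example list.");
    }
}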

6.2. Color/Depth camera IDs and directions that can be started simultaneously

Device | ColorID | R/F | DepthID | R/F
Sony Xperia 1 II | 0 | Rear | 5 | Rear
Sony Xperia 1 III | 0 | Rear | 5 | Rear
Sony Xperia 1 IV | 0 | Rear | 5 | Rear
Sony Xperia PRO | 0 | Rear | 5 | Rear
Sony Xperia PRO I | 0 | Rear | 5 | Rear
Samsung Galaxy Note10+ | 0 | Rear | 4 | Rear
Samsung Galaxy S10 5G(Snapdragon) | 0 | Rear | 4 | Rear
Samsung Galaxy S10 5G(Snapdragon) | 0 | Rear | 5 | Front
Samsung Galaxy S10 5G(Snapdragon) | 1 | Front | 4 | Rear
Samsung Galaxy S10 5G(Snapdragon) | 1 | Front | 5 | Front
Samsung Galaxy S10 5G(Snapdragon) | 2 | Rear | 4 | Rear
Samsung Galaxy S10 5G(Snapdragon) | 2 | Rear | 5 | Front
Samsung Galaxy S10 5G(Snapdragon) | 3 | Front | 4 | Rear
Samsung Galaxy S10 5G(Snapdragon) | 3 | Front | 5 | Front
Samsung Galaxy S10 5G(Exynos) | 0 | Rear | 4 | Rear
Samsung Galaxy S10 5G(Exynos) | 1 | Front | 5 | Front
Samsung Galaxy S20 Ultra | 0 | Rear | 4 | Rear
Samsung Galaxy S20+ 4G | 0 | Rear | 4 | Rear
Samsung Galaxy S20+ 5G | 0 | Rear | 4 | Rear
Huawei Honor V20 | 0 | Rear | 0 | Rear
Huawei P30 Pro | 0 | Rear | 0 | Rear
Huawei P40 Pro | 0 | Rear | 0 | Rear
Huawei P40 Pro | 2 | Rear | 7 | Rear
Huawei P40 Pro+ | 0 | Rear | 0 | Rear
Huawei Mate30 Pro | 0 | Rear | 0 | Rear
Huawei Mate30 Pro | 1 | Front | 1 | Front
Huawei Mate30 Pro | 2 | Rear | 0 | Rear
Huawei Mate30 Pro | 2 | Rear | 7 | Rear
Huawei Mate30 Pro | 3 | Front | 1 | Front
Huawei Mate30 Pro | 3 | Front | 5 | Front
Huawei Mate30 Pro | 4 | Rear | 0 | Rear
Huawei Mate40 Pro | 1 | Front | 1 | Front
Huawei Mate40 Pro+ | 0 | Rear | 0 | Rear
Huawei Mate40 Pro+ | 0 | Rear | 8 | Rear
Huawei Mate40 Pro+ | 1 | Front | 1 | Front
Huawei Mate40 Pro+ | 1 | Front | 5 | Front
Huawei Mate40 Pro+ | 2 | Rear | 0 | Rear
Huawei Mate40 Pro+ | 3 | Front | 1 | Front
Huawei Mate40 Pro+ | 4 | Rear | 0 | Rear
Huawei Mate40 Pro+ | 6 | Rear | 0 | Rear
Huawei Mate40 Pro+ | 7 | Rear | 0 | Rear
Honor Magic3 Pro | 0 | Rear | 3 | Front
Honor Magic3 Pro | 1 | Front | 3 | Front
Honor Magic3 Pro+ | 0 | Rear | 3 | Front
Honor Magic3 Pro+ | 1 | Front | 3 | Front
Honor Magic5 Pro | 0 | Rear | 3 | Front
Honor Magic5 Pro | 1 | Front | 3 | Front
iPhone X | 1 | Front | 1 | Front
iPhone XS | 1 | Front | 1 | Front
iPhone XR | 1 | Front | 1 | Front
iPhone 11 | 1 | Front | 1 | Front
iPhone 11 Pro | 1 | Front | 1 | Front
iPhone 11 Pro Max | 1 | Front | 1 | Front
iPhone 12 mini | 1 | Front | 1 | Front
iPhone 12 | 1 | Front | 1 | Front
iPhone 12 Pro | 0 | Rear | 0 | Rear
iPhone 12 Pro | 1 | Front | 1 | Front
iPhone 12 Pro Max | 0 | Rear | 0 | Rear
iPhone 12 Pro Max | 1 | Front | 1 | Front
iPhone 13 mini | 1 | Front | 1 | Front
iPhone 13 | 1 | Front | 1 | Front
iPhone 13 Pro | 0 | Rear | 0 | Rear
iPhone 13 Pro | 1 | Front | 1 | Front
iPhone 13 Pro Max | 0 | Rear | 0 | Rear
iPhone 13 Pro Max | 1 | Front | 1 | Front
iPhone 14 | 1 | Front | 1 | Front
iPhone 14 Plus | 1 | Front | 1 | Front
iPhone 14 Pro | 0 | Rear | 0 | Rear
iPhone 14 Pro | 1 | Front | 1 | Front
iPhone 14 Pro Max | 0 | Rear | 0 | Rear
iPhone 14 Pro Max | 1 | Front | 1 | Front
iPhone 15 | 1 | Front | 1 | Front
iPhone 15 Plus | 1 | Front | 1 | Front
iPhone 15 Pro | 0 | Rear | 0 | Rear
iPhone 15 Pro | 1 | Front | 1 | Front
iPhone 15 Pro Max | 0 | Rear | 0 | Rear
iPhone 15 Pro Max | 1 | Front | 1 | Front
Apple iPad Pro 11-inch(1st generation) | 1 | Front | 1 | Front
Apple iPad Pro 11-inch(2nd generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 11-inch(2nd generation) | 1 | Front | 1 | Front
Apple iPad Pro 11-inch(3rd generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 11-inch(3rd generation) | 1 | Front | 1 | Front
Apple iPad Pro 11-inch(4th generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 11-inch(4th generation) | 1 | Front | 1 | Front
Apple iPad Pro 12.9-inch(3rd generation) | 1 | Front | 1 | Front
Apple iPad Pro 12.9-inch(4th generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 12.9-inch(4th generation) | 1 | Front | 1 | Front
Apple iPad Pro 12.9-inch(5th generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 12.9-inch(5th generation) | 1 | Front | 1 | Front
Apple iPad Pro 12.9-inch(6th generation) | 0 | Rear | 0 | Rear
Apple iPad Pro 12.9-inch(6th generation) | 1 | Front | 1 | Front

6.3. Default Settings for Hand Library and RuntimeMode

Device | NeuralNetworkLibrary | RuntimeMode | RuntimeModeAfter | Fallback
Sony Xperia 1 II | TFLite | GPU | GPU | TFLite
Sony Xperia 1 III | TFLite | GPU | GPU | TFLite
Sony Xperia 1 IV | TFLite | GPU | GPU | TFLite
Sony Xperia PRO | TFLite | GPU | GPU | TFLite
Sony Xperia PRO I | TFLite | GPU | GPU | TFLite
Samsung Galaxy Note10+ | TFLite | GPU | GPU | TFLite
Samsung Galaxy S10 5G | TFLite | CPU | CPU | TFLite
Samsung Galaxy S20 Ultra | TFLite | GPU | GPU | TFLite
Samsung Galaxy S20+ 4G | TFLite | GPU | GPU | TFLite
Samsung Galaxy S20+ 5G | TFLite | GPU | GPU | TFLite
Huawei Honor V20 | TFLite | CPU | CPU | TFLite
Huawei P30 Pro | TFLite | CPU | CPU | TFLite
Huawei P40 Pro | TFLite | CPU | CPU | TFLite
Huawei P40 Pro+ | TFLite | CPU | CPU | TFLite
Huawei Mate30 Pro | TFLite | CPU | CPU | TFLite
Huawei Mate40 Pro | TFLite | CPU | CPU | TFLite
Huawei Mate40 Pro+ | TFLite | CPU | CPU | TFLite
Honor Magic3 Pro | TFLite | GPU | GPU | TFLite
Honor Magic3 Pro+ | TFLite | GPU | GPU | TFLite
Honor Magic5 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone X | TFLite | GPU | GPU | TFLite
Apple iPhone XS | TFLite | GPU | GPU | TFLite
Apple iPhone XS Max | TFLite | GPU | GPU | TFLite
Apple iPhone XR | TFLite | GPU | GPU | TFLite
Apple iPhone 11 | TFLite | GPU | GPU | TFLite
Apple iPhone 11 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone 11 Pro Max | TFLite | GPU | GPU | TFLite
Apple iPhone 12 mini | TFLite | GPU | GPU | TFLite
Apple iPhone 12 | TFLite | GPU | GPU | TFLite
Apple iPhone 12 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone 12 Pro Max | TFLite | GPU | GPU | TFLite
Apple iPhone 13 mini | TFLite | GPU | GPU | TFLite
Apple iPhone 13 | TFLite | GPU | GPU | TFLite
Apple iPhone 13 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone 13 Pro Max | TFLite | GPU | GPU | TFLite
Apple iPhone 14 | TFLite | GPU | GPU | TFLite
Apple iPhone 14 Plus | TFLite | GPU | GPU | TFLite
Apple iPhone 14 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone 14 Pro Max | TFLite | GPU | GPU | TFLite
Apple iPhone 15 | TFLite | GPU | GPU | TFLite
Apple iPhone 15 Plus | TFLite | GPU | GPU | TFLite
Apple iPhone 15 Pro | TFLite | GPU | GPU | TFLite
Apple iPhone 15 Pro Max | TFLite | GPU | GPU | TFLite
Apple iPad Pro 11-inch(1st generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 11-inch(2nd generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 11-inch(3rd generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 11-inch(4th generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 12.9-inch(3rd generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 12.9-inch(4th generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 12.9-inch(5th generation) | TFLite | GPU | GPU | TFLite
Apple iPad Pro 12.9-inch(6th generation) | TFLite | GPU | GPU | TFLite

7. Restrictions and limitations

7.1. ToF AR v1.4.0

  • When you set RecogMode→OneHandHldSmapho and ProcessLevel→HandCenterOnly in the Hand recognition function, the recognized hand is always reported as the right hand.

7.1.1. iOS

  • In AR Foundation Application, SceneMesh functions are not supported.

  • Modeling components are not supported.

  • ToF AR Server does not support USB connections with iOS devices.

  • Only Face2Face is valid as a recognition mode (RecogMode) for the Hand component. Use of an invalid recognition mode will result in an exception.

7.1.2. Android

  • When the FPS or Exposure is changed from the default values in the ToF and Color components, the Camera2 API may stop and video may not be updated, depending on the device model.

  • Hand Library performance slows down if the Runtime Mode is set to CPU on the Huawei P30 Pro.

  • Depending on the device, the ToF and Color images may be misaligned when they are superimposed, for example when using the Coordinate component.

Appendix A: Glossary

AR

An abbreviation for Augmented Reality. This is the technology that superimposes digital information such as CG on real images.

ToF

An abbreviation for Time of Flight. This is the method of measuring the distance from a camera to an object based on the reflection time of infrared light.

ToF AR

The product name of this software. This name comes from the feature that generates recognition data for AR using ToF camera information. The product name is ToF AR, but TofAR or TofAr are used for folder names and package names.

Confidence

Confidence is the intensity of the reflected infrared light received by the ToF camera. Normally, the higher the Confidence value, the more reliable the distance measurement. When this information is displayed as an image it is referred to as the Confidence image, and when it is processed as data it is referred to as Confidence data.

Depth image and Depth data

The 3D information from the ToF camera is referred to as the Depth image when it is displayed and as Depth data when it is processed.

Color image and Color data

The RGB information from the RGB camera is referred to as the Color image when it is displayed and as Color data when it is processed.

Stream and Stream data

For the data handled by ToF AR, such as Depth data, the interface (I/F) that delivers the data is referred to as a Stream, and the data retrieved from that interface is referred to as Stream data.
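
In code, Stream data is obtained through the corresponding Manager Prefab: the application subscribes to the Manager's frame notification and reads the latest data from the Manager instance, as Appendix B does for the ToF Component. A minimal sketch of this pattern, limited to the calls already used in Appendix B:

using TofAr.V0.Tof;
using UnityEngine;

// Minimal example of the Stream pattern: subscribe to the ToF Manager's frame
// notification and read the latest Stream data (here, the point cloud) from it.
public class PointCountLogger : MonoBehaviour
{
    void OnEnable()
    {
        TofArTofManager.OnFrameArrived += OnTofFrameArrived;
    }

    void OnDisable()
    {
        TofArTofManager.OnFrameArrived -= OnTofFrameArrived;
    }

    private void OnTofFrameArrived(object sender)
    {
        if (!TofArTofManager.Instantiated)
        {
            return;
        }

        var pointCloudData = TofArTofManager.Instance.PointCloudData;
        if (pointCloudData?.Points != null)
        {
            Debug.Log($"Stream data received: {pointCloudData.Points.Length} points");
        }
    }
}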

Appendix B: PointCloud2ParticleSystem.cs

using System.Collections;
using System.Collections.Generic;
using TofAr.V0.Tof;
using UnityEngine;

public class PointCloud2ParticleSystem : MonoBehaviour
{
    private ParticleSystem particleSystem;
    private ParticleSystem.Particle[] particles;

    private object sync = new object();

    private bool updated = false;

    private Vector3[] vs;

    // Start is called before the first frame update
    void Start()
    {
        particleSystem = GetComponent<ParticleSystem>();
    }

    void OnEnable()
    {
        TofArTofManager.OnFrameArrived += OnTofFrameArrived;
    }
    private void OnDisable()
    {
        TofArTofManager.OnFrameArrived -= OnTofFrameArrived;
    }

    // Update is called once per frame
    void Update()
    {
        if (updated)
        {
            updated = false;

            lock (sync)
            {
                particles = new ParticleSystem.Particle[vs.Length];
                particleSystem.GetParticles(particles);

                for (int i = 0; i < vs.Length; i++)
                {
                    particles[i].position = vs[i];

                    // Map the camera-space position to a colour: depth (z, roughly 0-2 m)
                    // drives red, while y and x (roughly -1 to 1 m) drive green and blue.
                    float r = Mathf.Min(1, Mathf.Max(0, (vs[i].z) / 2));
                    float g = Mathf.Min(1, Mathf.Max(0, (vs[i].y + 1f) / 2));
                    float b = Mathf.Min(1, Mathf.Max(0, (vs[i].x + 1f) / 2));

                    particles[i].startColor = new Color(r, g, b);
                    particles[i].startSize = 0.02f;
                }

                particleSystem.SetParticles(particles);
            }
        }
    }

    // Frame notification from the ToF Component's Stream. This callback may be raised
    // outside Unity's main thread, so it only caches the latest point cloud here and
    // leaves all ParticleSystem updates to Update().
    private void OnTofFrameArrived(object stream)
    {
        if (!TofArTofManager.Instantiated)
        {
            return;
        }

        var pointCloudData = TofArTofManager.Instance.PointCloudData;
        if (pointCloudData == null || pointCloudData.Points == null)
        {
            return;
        }

        lock (sync)
        {
            vs = pointCloudData.Points;
        }
        updated = true;
    }
}

Appendix C: Changelog

Version Changes

v1.4.0

Body component
  • Removed support for CoreML.

  • Added OFF function to Noise Reduction Level.

  • Fixed a bug where the Body recognition result became incorrect when using ToF AR Server.

Color component
  • Supported the API for capturing still images.

  • Changed the default resolution setting when using AV Foundation.

  • Fixed a bug that an incorrect default resolution was selected when TofArColorManager.UseFrontCameraAsDefault was set to OFF.

Hand component
  • Added recording and playback function for Hand data.

  • Removed support for CoreML.

  • Added OFF function to Noise Reduction Level.

  • Added detection threshold for hand recognition.

Plane component
  • Supported Plane Component to work on iOS.

Segmentation component
  • Fixed a bug that an API Exception occurred when toggling the ON/OFF of the Segmentation feature repeatedly during ToF AR Server connection.

ToF component
  • Changed default resolution setting when using AV Foundation.

  • Added SkeletonDepthView Prefab that can display tracking results of Hand, Body, and Face overlaid on DepthViewRawImage.

Others
  • Updated recommended software versions for development.

    Unity

    Unity 2021.3.18f1 → Unity 2021.3.31f1
    Unity 2022.2.6f1 → Unity 2022.3.11f1

    AR Foundation

    V4.2.7 → V4.2.8

    ARCore Extensions

    V1.35.0 → V1.40.0

  • Added compatible devices (Honor Magic5 Pro, iPhone15)

  • Changed the Api Compatibility Level setting of the Managed Library from .NET Framework to .NET Standard 2.1.

  • Added a reconnection function with ToF AR Server.

  • Added a signature to the macOS library to prevent the security warning from appearing when initially connecting to ToF AR Server in Unity Editor.

  • Divided the library file that accesses the camera API on iOS into two parts: one for accessing the True Depth API and the other for other functionalities.

  • Changed the behavior to obtain the device orientation from the device and the screen orientation from the Unity Editor settings when running Unity Editor + Server.

  • Added null checks to TofArXXXManager.Instance and to the places where callbacks are used.

  • Fixed a bug that Color and Depth were misaligned on Android devices when the Mirroring feature was enabled.

  • Fixed a bug that the AV Foundation stream could not be started when the stream automatic start setting was enabled.

  • Fixed a bug that caused crashes on iPhone SE.

  • Fixed a bug that connecting to ToF AR Server was not possible when using multi-byte characters in the Unity project name or the path name where the project is located.

  • Fixed a bug that errors occurred when uploading to TestFlight by adding version information to the Framework file.

  • Fixed a bug that Unity Editor would crash when compiling after making changes to the source code in Unity Editor.

  • Fixed a bug that the "Reload Domain" setting in Enter Play Mode Settings was changed to OFF when playing in Unity Editor.

  • Fixed a bug with the field initialization process in the TofArTransform constructor.

  • Fixed other bugs.

v1.3.0

General
  • AVFoundation is now available to use as a Camera API for iOS devices.

Color Component
  • Fixed a bug where images were not visible if RGBA was selected as the Color format.

Hand component
  • Improved the accuracy of hand feature point position recognition when both hands are close.

  • Improved the accuracy of gesture recognition.

Slam component
  • Added gravitational acceleration of the device to SlamData.

Others
  • Updated the recommended versions of software required for development.

    Unity

    Unity 2021.3.11f1 → Unity 2021.3.18f1
    Unity 2022.2.6f1

    AR Foundation

    V4.2.6 → V4.2.7

    ARCore Extensions

    V1.34.0 → V1.35.0

  • Added function to flip frames horizontally.

  • Changed the default resolution for some iOS models.

  • Fixed a bug where the communication timeout period with ToF AR Server did not get set correctly.

  • Fixed a bug that caused NullReferenceException to be displayed in the console when using ToF AR Server.

  • Fixed a bug where the CalibrationSettingsProperty value was not updated even if the resolution setting was changed during Stream playback.

  • Fixed a bug that sometimes caused the aspect ratio to be incorrect when ColorViewQuad was initialized.

  • Fixed other bugs.

v1.2.0

Body component
  • Added CoreML to Runtime mode.

  • Changed the default value of Runtime mode for DetectorType SV2 to GPU.

  • Fixed a bug where the playback speed of recorded data for DetectorType SV2 was different from that of Color and other data.

  • Deprecated DetectorType SV1.

Face component
  • Improved the accuracy of "aiueo" recognition.

  • Fixed a bug where the rotation on the screen might not match the Face model rotation.

Hand component
  • Improved accuracy of hand feature points recognition.

  • Improved accuracy of gesture recognition.

  • Added CoreML to Runtime mode.

  • Fixed a bug where the default Runtime mode was not set to GPU on some models.

Slam component
  • Changed the default value of Camera Pose Source setting to 6DoF when using iOS Platform.

ToF component
  • Improved the accuracy of timestamps in recorded data by changing the unit from ms to ns when using the iOS platform.

Others
  • Updated the recommended versions of software required for development.

    Unity

    Unity 2020.3.36f1 → Unity 2020.3.40f1
    Unity 2021.3.5f1 → Unity 2021.3.11f1

    AR Foundation

    V4.2.3 → V4.2.6

    ARCore Extensions

    V1.32.0 → V1.34.0

  • Added compatible devices including Xperia 1 IV, Xperia PRO, iPhone 14, iPad Pro 11-inch (4th generation) and iPad Pro 12.9-inch (6th generation).

  • Changed the storage place of Manager Prefab to an upper layer.

  • Fixed a bug causing errors when starting the ToF component stream while the Hand component stream is running.

  • Fixed a bug where Apple Silicon version of Unity Editor could not be connected to ToF AR Server.

  • Fixed a bug causing the occasional loss of performance when using ToF AR Server.

  • Fixed a bug causing errors when using ToF AR Server without connecting to the Server.

  • Fixed a bug causing errors on iOS devices that are not compatible with ToF AR.

  • Fixed other bugs.

v1.1.0

Color component
  • Support RGBA format output.

  • Support setting of Color Delay parameter on Inspector.

Face component
  • Support gaze direction recognition.

  • Add noise reduction processing (Android).

  • Fix a bug where FaceMesh didn’t match Color.

Hand component
  • Change Hand Runtime default to GPU (iOS).

  • Fix a bug where Hand is not recognized when the hand is too close.

  • Fix a bug where Landscape Right device orientation was not recognized.

Slam component
  • Support 6DoF (iOS).

Segmentation component
  • Fix a bug causing occasional crashes when closing the scene in use.

ToF component
  • Fix a bug causing occasional crashes when turning AutoExposure from OFF to ON and closing the scene in use.

Other
  • Update the recommended versions of software required for development.

    Unity

    Unity2020.3.28f1 → Unity2020.3.36f1
    Unity2021.2.11f1 → Unity2021.3.5f1

    AR Foundation

    V4.2.2 → V4.2.3

    ARCore Extensions

    V1.29.0 → V1.32.0

  • Add SDK compatible devices including Xperia PRO-I and some models of Xperia 1 III.

  • Support dynamic change of FPS (iOS).

  • Improve the quality of occlusion that uses ARFoundation’s OcclusionManager when using ARFoundationConnector.

  • Fix an issue so that the streaming start notification is not called if streaming has not started with Plane.StartPlayback().

  • Fix a bug where the model was displayed smaller than the Depth and Color images when face and hand were displayed with the iPad’s front camera.

  • Fix a bug where a value different from the value of TofArHandManager.RuntimeMode was sometimes fetched.

  • Fix a bug where the settings of TofArTofManager.processDepth / processConfidence / processPointCloud might not be applied.

  • Fix a bug causing crashes with a specific combination of versions of Unity and Visual Studio.

  • Fix a bug where changes in hand tracking mode were not set after starting hand recognition.

  • Fix a bug where the modeling function of SimpleARFoundation could not be used when using ToF AR Server.

  • Fix other bugs.

v1.0.0

Initial ToF AR release.