Plane Detection, 3D model placement & FocusEntity with ARKit & RealityKit: AR with iOS (Part-V)

Shiru99
6 min read · Apr 29, 2023


Let’s get started with placing a 3D object on a horizontal or vertical plane using ARKit. With ARKit, we can detect the plane and add different user interactions to the 3D object. We can also explore the 3D model properties like animations or audio/video associated with them.

Plane Detection

When we use an AR app, we want to make sure that the virtual objects we place in the real world stay in place and move realistically as we move around them. The way we do this is by using something called “world tracking”.

World tracking configuration is like a tool that helps our app to understand and interact with the world around it. One of the things it can do is to detect flat surfaces in the real world, like a table or a floor. It does this by using the camera on our device to create a map of the area and then looking for places where the map is flat.

Once it detects a flat surface, it can add a virtual object on top of it, making it look like the object is sitting right there in the real world.

ARWorldTrackingConfiguration is a class provided by ARKit framework that specifies the configuration and options for an AR session that uses the device’s motion and camera to track the device’s position and orientation relative to the world around it. It allows for high-quality AR experiences, as it uses 6DoF (six degrees of freedom) tracking to track the device’s position and orientation in 3D space.

import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    var body: some View {
        CustomARViewContainer()
    }
}

struct CustomARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> CustomARView {
        return CustomARView()
    }

    func updateUIView(_ uiView: CustomARView, context: Context) { }
}

class CustomARView: ARView {

    init() {
        super.init(frame: .zero)

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        config.environmentTexturing = .automatic

        // Scene reconstruction is only available on LiDAR-equipped devices
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }

        self.session.run(config)
    }

    // ARView declares these as required, so a subclass with a custom
    // initializer must provide them for the code to compile
    @MainActor required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @MainActor required dynamic init(frame frameRect: CGRect) {
        fatalError("init(frame:) has not been implemented")
    }
}

In this part of the code, we create a configuration object for ARKit’s world tracking system. The ARWorldTrackingConfiguration is a class that encapsulates various settings for ARKit's tracking capabilities, such as the types of features to detect and track, the environment texture mapping, and scene reconstruction.

Here, we set the planeDetection property to [.horizontal], which means that we want ARKit to detect horizontal planes such as tables, floors, and other flat surfaces in the environment. The environmentTexturing property is set to .automatic, which means that ARKit will automatically create an environment map to blend virtual objects with the real-world environment.
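The article's title also mentions vertical planes; `planeDetection` accepts both alignments at once, so detecting walls as well as floors is a one-line change. A minimal sketch of that variant (assuming `arView` is an `ARView` instance, which is not part of the example above):

```swift
// Detect both horizontal surfaces (floors, tables) and vertical ones (walls)
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]
config.environmentTexturing = .automatic
arView.session.run(config)
```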

Next, we check if the device supports scene reconstruction with the depth information from a LiDAR sensor. The LiDAR sensor can measure the distance between the device and objects in the environment with high accuracy and speed, allowing ARKit to reconstruct a detailed 3D mesh of the environment. If the device supports this feature, we set the sceneReconstruction property of the configuration to .meshWithClassification, which means that ARKit will generate a mesh with classification information, allowing us to place virtual objects on surfaces with more accuracy.

It’s worth noting that not all iPhones have LiDAR sensors, which means that some devices may not support the scene reconstruction feature with depth information. In such cases, ARKit will still work but with lower accuracy and precision.
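If you want to react the moment ARKit actually finds a plane (for example, to log it or enable a UI control), you can observe the session's anchors through `ARSessionDelegate`. The class below is a hypothetical addition, not part of the article's example; `PlaneWatcher` is an illustrative name, and you would attach it with `arView.session.delegate = planeWatcher` before running the configuration:

```swift
// Logs every plane anchor ARKit adds to the session
class PlaneWatcher: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // alignment is .horizontal or .vertical; center is in the anchor's local space
            print("Detected plane — alignment: \(plane.alignment), center: \(plane.center)")
        }
    }
}
```

Keep a strong reference to the delegate (e.g. as a property on your view) — `ARSession.delegate` is weak.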

Plane detection & Scene reconstruction

FocusEntity

Once we detect a plane in the AR scene, the FocusEntity helps to guide the user’s attention to the detected plane by displaying a visual cue, such as a crosshair or a circle, on the detected plane. This visual cue helps to inform the user that they can interact with the AR content on the detected plane.

Additionally, the FocusEntity provides functionality for adding AR content onto the detected plane. For example, if we want to place a 3D model onto the detected plane, we can use the FocusEntity to ensure that the 3D model is placed correctly on the plane and also provide visual feedback to the user as they move the 3D model around on the plane.

FocusEntity (https://github.com/maxxfrazer/FocusEntity)

Here are the steps you can follow to ensure that the FocusEntity package is properly integrated in your Xcode project:

  1. Add the FocusEntity package to your project. To do this, go to File → Add Packages… (File → Swift Packages → Add Package Dependency in older Xcode versions) and enter the package repository URL: https://github.com/maxxfrazer/FocusEntity.
  2. Check if the package is listed under your project’s dependencies in the Project navigator. If it’s not there, try adding it again.
import SwiftUI
import RealityKit
import ARKit
import FocusEntity

struct ContentView: View {
    var body: some View {
        CustomARViewContainer()
    }
}

struct CustomARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> CustomARView {
        return CustomARView()
    }

    func updateUIView(_ uiView: CustomARView, context: Context) { }
}

class CustomARView: ARView {

    var focusEntity: FocusEntity?

    init() {
        super.init(frame: .zero)

        self.setUpFocusEntity()
        self.setUpARView()
    }

    func setUpFocusEntity() {
        // Yellow focus square that tracks detected planes
        self.focusEntity = FocusEntity(on: self, style: .classic(color: .yellow))
    }

    func setUpARView() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        config.environmentTexturing = .automatic

        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }

        self.session.run(config)
    }

    // Required initializers for an ARView subclass with a custom init()
    @MainActor required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @MainActor required dynamic init(frame frameRect: CGRect) {
        fatalError("init(frame:) has not been implemented")
    }
}

The FocusEntity package provides a lot of customization options that can be used to change the appearance and behavior of the focus square in your AR app. For example, you can change the color of the focus square, adjust its size and animation, and even add 3D objects to it. More Details - GitHub
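As a small illustration (a hedged sketch — check the FocusEntity README for the full style API), you could swap the classic yellow square for another tint, or hide the indicator once the user has placed a model, since `isEnabled` is a standard RealityKit `Entity` property:

```swift
// A different tint for the classic focus-square style
self.focusEntity = FocusEntity(on: self, style: .classic(color: .systemTeal))

// Hide the focus square after placement; set back to true to show it again
self.focusEntity?.isEnabled = false
```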

Placing 3D model on plane

import SwiftUI
import RealityKit
import ARKit
import FocusEntity
import Combine

struct ContentView: View {

    var body: some View {
        ZStack(alignment: .bottom) {
            CustomARViewContainer()

            Button(action: {
                ActionManager.shared.actionStream.send(.place3DModel)
            }, label: {
                Text("Place 3D Model")
                    .font(.headline)
                    .foregroundColor(.white)
                    .padding()
                    .background(Color.blue)
                    .cornerRadius(10)
            })
            .padding(.bottom, 50)
        }
    }
}

struct CustomARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> CustomARView {
        return CustomARView()
    }

    func updateUIView(_ uiView: CustomARView, context: Context) { }
}

class CustomARView: ARView {

    var focusEntity: FocusEntity?
    var cancellables: Set<AnyCancellable> = []

    init() {
        super.init(frame: .zero)

        // ActionStream
        subscribeToActionStream()

        // FocusEntity
        self.focusEntity = FocusEntity(on: self, style: .classic(color: .yellow))

        // Configuration
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        config.environmentTexturing = .automatic

        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }

        self.session.run(config)
    }

    func place3DModel() {
        guard let focusEntity = self.focusEntity else { return }

        // Load the model bundled with the app; try! will crash if the asset is missing
        let modelEntity = try! ModelEntity.load(named: "toy_car.usdz")
        let anchorEntity = AnchorEntity(world: focusEntity.position)
        anchorEntity.addChild(modelEntity)
        self.scene.addAnchor(anchorEntity)
    }

    func subscribeToActionStream() {
        ActionManager.shared
            .actionStream
            .sink { [weak self] action in
                switch action {
                case .place3DModel:
                    self?.place3DModel()
                case .remove3DModel:
                    print("Removing 3D model: has not been implemented")
                }
            }
            .store(in: &cancellables)
    }

    @MainActor required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @MainActor required dynamic init(frame frameRect: CGRect) {
        fatalError("init(frame:) has not been implemented")
    }
}

enum Actions {
    case place3DModel
    case remove3DModel
}

class ActionManager {
    static let shared = ActionManager()

    private init() { }

    var actionStream = PassthroughSubject<Actions, Never>()
}

Functionality: When the user taps on the Button, the place3DModel function is called and a ModelEntity of a toy car is loaded and placed at the position of the FocusEntity.

You can ignore the details of ActionManager; it simply lets the SwiftUI button trigger place3DModel on the CustomARView without the two holding references to each other.

Placing 3D on plane (IKEA Place)

Button Action Code Explanation:

This code defines a SwiftUI ContentView that contains an AR view and a button labeled “Place 3D Model”. Tapping the button sends the place3DModel action through the ActionManager’s shared PassthroughSubject, which the CustomARView subscribes to. On receiving the action, the CustomARView loads a ModelEntity from a 3D model file, creates an AnchorEntity at the position of the FocusEntity (the component that helps position objects in AR), adds the ModelEntity as a child of the AnchorEntity, and adds the AnchorEntity to the scene. The CustomARView also sets up the AR session with plane detection and environment texturing, as in the earlier examples.
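The .remove3DModel case above is left unimplemented. One possible way to complete it (an illustrative sketch, not from the original article) is to remember each placed anchor in an array and remove the most recent one on demand:

```swift
// Hypothetical additions to CustomARView: track placed anchors so they can be removed
var placedAnchors: [AnchorEntity] = []

func place3DModel() {
    guard let focusEntity = self.focusEntity else { return }
    let modelEntity = try! ModelEntity.load(named: "toy_car.usdz")
    let anchorEntity = AnchorEntity(world: focusEntity.position)
    anchorEntity.addChild(modelEntity)
    self.scene.addAnchor(anchorEntity)
    placedAnchors.append(anchorEntity)   // remember it for later removal
}

func remove3DModel() {
    // Remove the most recently placed model, if any
    guard let lastAnchor = placedAnchors.popLast() else { return }
    self.scene.removeAnchor(lastAnchor)
}
```

With this in place, the .remove3DModel branch of the sink closure would call self?.remove3DModel() instead of printing a placeholder message.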
