Detect and Handle Collisions in a RealityKit Scene Across Different Entities in iOS | by Anupam Chugh

RealityKit on iOS, part 2 — working with collision events

Drawn by DALL-E

This is the second part of my series of articles covering the ins and outs of RealityKit, which started here.

In case you didn’t get a chance to give part 1 a look, we explored the fundamentals of the RealityKit framework — the anatomy of RealityKit’s ARView, entities, components, materials, setting up a coaching overlay, ray casting, and adding 3D gestures to our augmented reality-based iOS application.

The application we ended up with had the ability to add multiple entities to the AR scene, but it was devoid of any event handling. Things like overlapping entities and entities moving across each other didn’t have any event handlers to show that a collision took place.

The goal of this article is to dig deep into collision events. We’ll set up the respective components on our entities and look at a number of different use cases of collision detection through an iOS application that we’ll build over the course of this article.

Here are the three major components used to give our entities in RealityKit a real object-like behavior and feel:

  • ModelComponent — defines the entity’s mesh and materials (how it looks).
  • CollisionComponent — defines the shape used for collision detection.
  • PhysicsBodyComponent — defines how the entity behaves under physics simulation.

We’ll dive deeper into the physics aspects in another part. For now, let’s collide with collisions in RealityKit!

To allow entities to detect collision events, we first need to add a CollisionComponent to the entity. Subsequently, we’ll listen to the CollisionEvents in our AR scene and handle the different states of a collision — began, updated, and ended.

To start our journey, fire up Xcode, create a new augmented reality-based iOS application, and choose RealityKit as the rendering engine with SwiftUI as the user interface type.

Next, let’s set up our custom box entity with a ModelComponent (for aesthetics) and a CollisionComponent, which gives our entity the ability to collide with other entities that have a CollisionComponent.

import SwiftUI
import RealityKit
import Combine

class CustomEntity: Entity, HasModel, HasAnchoring, HasCollision {

    var collisionSubs: [Cancellable] = []

    required init(color: UIColor) {
        super.init()
        // The collision shape can differ from the visible mesh.
        self.components[CollisionComponent.self] = CollisionComponent(
            shapes: [.generateBox(size: [0.5, 0.5, 0.5])],
            mode: .trigger,
            filter: .sensor
        )
        self.components[ModelComponent.self] = ModelComponent(
            mesh: .generateBox(size: [0.5, 0.5, 0.5]),
            materials: [SimpleMaterial(color: color,
                                       isMetallic: false)]
        )
    }

    convenience init(color: UIColor, position: SIMD3<Float>) {
        self.init(color: color)
        self.position = position
    }

    required init() {
        fatalError("init() has not been implemented")
    }
}
In the above code, we’re doing quite a few things. Let’s take a closer look:

  • Conforming to the HasCollision protocol is vital for enabling collision detection on the entity.
  • collisionSubs is an array that holds the collision subscriptions for the entities, which we’ll see shortly.
  • Just like the ModelComponent, the CollisionComponent requires a shape too, which can be different from the shape of the visible entity. Typically, a larger size for the CollisionComponent is set when you want collision detection for entities that merely come into the vicinity of the current entity.
  • The CollisionMode indicates how the collision data is collected for the entity — trigger and default are the two built-in modes currently available.
  • The CollisionFilter acts as a screener for determining the entities with which a collision should be detected. It comes in three kinds — default, sensor (which collides with all kinds of entities), and a custom one. We can create custom CollisionFilters by setting up a CollisionGroup — a bitmask included in the entity.
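To make the bitmask idea concrete, here’s a minimal sketch of two custom filters — the group values and names are illustrative, not from the article:

```swift
// Two groups expressed as bitmasks. Two entities collide only when each
// entity's mask contains the other entity's group.
let boxGroup = CollisionGroup(rawValue: 1 << 0)
let sphereGroup = CollisionGroup(rawValue: 1 << 1)

// Boxes only ever collide with other boxes:
let boxFilter = CollisionFilter(group: boxGroup, mask: boxGroup)

// Spheres collide with both spheres and boxes (CollisionGroup is an OptionSet):
let sphereFilter = CollisionFilter(group: sphereGroup, mask: [sphereGroup, boxGroup])
```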

Now that we’ve set up our CustomEntity class with a CollisionComponent in place, let’s listen to the CollisionEvents and handle the states of the entity accordingly.

Simple Collision Events

In the following code, we explicitly look for entities of the type CustomEntity in the Began and Ended events we’ve subscribed to. Once the collision begins, we change the color of one of the entities using a SimpleMaterial — resetting it after the collision has ended.
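The original gist isn’t reproduced here, but a minimal sketch of that subscription logic — assuming an addCollision helper on CustomEntity (the helper name and the colors are assumptions) — might look like:

```swift
extension CustomEntity {
    func addCollision() {
        // The entity must already be part of a scene for self.scene to be set.
        guard let scene = self.scene else { return }

        // Fired when this entity starts colliding with another CustomEntity.
        collisionSubs.append(scene.subscribe(to: CollisionEvents.Began.self, on: self) { event in
            guard let other = event.entityB as? CustomEntity else { return }
            other.model?.materials = [SimpleMaterial(color: .red, isMetallic: false)]
        })

        // Fired when the collision ends — reset the color.
        collisionSubs.append(scene.subscribe(to: CollisionEvents.Ended.self, on: self) { event in
            guard let other = event.entityB as? CustomEntity else { return }
            other.model?.materials = [SimpleMaterial(color: .yellow, isMetallic: false)]
        })
    }
}
```

Each subscribe call returns a Cancellable, which we store in collisionSubs so the subscriptions stay alive for the lifetime of the entity.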

Now that we’ve set up the collision events on the entities, let’s add a couple of entity containers to our RealityKit scene and witness the collision:
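A couple of box entities could be added along the following lines — the positions and colors are illustrative, and the addCollision helper is the one the article refers to:

```swift
let boxA = CustomEntity(color: .yellow, position: [-0.35, 0, -1.5])
let boxB = CustomEntity(color: .orange, position: [0.35, 0, -1.5])

// Anchor the boxes in the scene first, then subscribe them to collision events.
arView.scene.anchors.append(boxA)
arView.scene.anchors.append(boxB)
boxA.addCollision()
boxB.addCollision()
```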

Note: To prevent two entities from overlapping during a collision, we’ll need to use the PhysicsBodyComponent.

Collision With TriggerVolumes (Hidden Areas)

TriggerVolumes are invisible 3D shapes that fire events when an entity enters or exits their volume. The fact that TriggerVolumes are invisible entities could be leveraged in a “treasure hunt” type of AR game (i.e. unlocking mysteries).

TriggerVolumes extend Entity and conform to the HasCollision protocol by default. In order to add a TriggerVolume to your RealityKit scene, you need to conform it to the HasAnchoring protocol. You also need to place it into the scene in the following way, to ensure that the addCollision function we saw earlier accepts the type TriggerVolume while detecting CollisionEvents:

extension TriggerVolume: HasAnchoring {}

let hiddenArea = TriggerVolume(
    shape: .generateBox(size: [0.3, 0.3, 0.3]),
    filter: .sensor
)
hiddenArea.position = [-0.5, -1.5, -3]
arView.scene.anchors.append(hiddenArea)

Collision Filters and Groups

Oftentimes, there’s a need to set up collision events only among a certain group of entities. Collision events won’t be triggered for entities with non-matching CollisionFilters (a CollisionFilter is made up of a CollisionGroup and a mask). A scenario where a CollisionFilter is useful would be an AR billiards game (i.e. knowing whether stripes or solids had a collision).

In the following code, we’ve created a new sphere-shaped entity with a custom CollisionFilter!
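That sphere entity isn’t shown in the extracted snippet; a sketch following the article’s conventions — the class name, radius, and group value are assumptions — could be:

```swift
import RealityKit
import UIKit

class SphereEntity: Entity, HasModel, HasAnchoring, HasCollision {
    required init(color: UIColor) {
        super.init()
        self.components[ModelComponent.self] = ModelComponent(
            mesh: .generateSphere(radius: 0.25),
            materials: [SimpleMaterial(color: color, isMetallic: false)]
        )
        // A custom filter: spheres only collide with other entities in group 2.
        self.components[CollisionComponent.self] = CollisionComponent(
            shapes: [.generateSphere(radius: 0.25)],
            mode: .trigger,
            filter: CollisionFilter(group: CollisionGroup(rawValue: 2),
                                    mask: CollisionGroup(rawValue: 2))
        )
    }

    required init() {
        fatalError("init() has not been implemented")
    }
}
```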

Additionally, we’ve modified the CollisionComponent’s filter property of the previously-created box entity to:

filter: CollisionFilter(group: CollisionGroup(rawValue: 1), mask: CollisionGroup(rawValue: 1))

Now let’s add ray casting to our RealityKit scene, along with an ARCoachingOverlayView to detect a horizontal plane. We’ll add the box and sphere entities alternately to the scene in 3D space using the 2D touch points. We’ll base this on the user’s gestures and a global property defined in the GlobalVariables structure:
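A hypothetical sketch of that tap handler — the recognizer wiring, entity alternation, and helper names are assumptions, not the article’s exact gist:

```swift
@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    // Raycast from the 2D touch point onto a detected horizontal plane.
    let point = recognizer.location(in: arView)
    guard let result = arView.raycast(from: point,
                                      allowing: .estimatedPlane,
                                      alignment: .horizontal).first else { return }

    // Anchor a new entity at the hit location and subscribe it to collisions.
    let anchor = AnchorEntity(raycastResult: result)
    let entity = CustomEntity(color: .yellow)
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)
    entity.addCollision(scene: arView.scene)
}
```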

Notice the change: the scene property is now passed into the addCollision function.

The reason for this change is that the entities aren’t directly added to the scene’s root anchor anymore. The entities are added to the raycast anchor, which eventually is set in the scene. So, instead of having the entities access the scene through self.scene, we pass the property into the addCollision extension function of the CustomEntity we defined earlier.
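A sketch of the revised helper under that assumption — the scene is injected because self.scene may still be nil when the entity hasn’t been attached to the raycast anchor yet:

```swift
extension CustomEntity {
    func addCollision(scene: RealityKit.Scene) {
        collisionSubs.append(scene.subscribe(to: CollisionEvents.Began.self, on: self) { event in
            guard let other = event.entityB as? CustomEntity else { return }
            other.model?.materials = [SimpleMaterial(color: .red, isMetallic: false)]
        })
        collisionSubs.append(scene.subscribe(to: CollisionEvents.Ended.self, on: self) { event in
            guard let other = event.entityB as? CustomEntity else { return }
            other.model?.materials = [SimpleMaterial(color: .yellow, isMetallic: false)]
        })
    }
}
```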

The code for the SwiftUI ContentView that holds RealityKit’s ARView is given below:

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        arView.session.run(config, options: [])

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

The addCoaching function is used to set up the coaching overlay during onboarding to detect the plane (this was discussed in the last part, with the implementation available in the source code).

Let’s look at our RealityKit iOS application in action with the above-integrated CollisionFilters and groups. You’ll notice in the video below that collision events between the box and sphere shapes aren’t triggered, since their filters don’t match:
