How to use Collision Triggers with Hand Anchors
We can combine hand anchors with a Spatial Tracking Session when we need to track collisions between our hands and other entities in the scene.
Let’s build on what we’ve learned about AnchorEntity with hands and add in some ARKit features. In RealityKit, the easiest way to do this is to create a Spatial Tracking Session.
Running a Spatial Tracking Session alongside an Anchor Entity unlocks more advanced features, including physics and collisions. We can even access the anchor transforms. For now, let’s focus on collisions.
let configuration = SpatialTrackingSession.Configuration(
    tracking: [.hand])
let session = SpatialTrackingSession()
await session.run(configuration)

Let’s create an anchor for the left hand index finger, then add a sphere to it. The key to getting our anchors to collide with the subject is to set the physics simulation to .none.
let leftIndexAnchor = AnchorEntity(.hand(.left, location: .indexFingerTip), trackingMode: .continuous)
leftIndexAnchor.anchoring.physicsSimulation = .none
leftIndexAnchor.addChild(stepSphereBlue)
content.add(leftIndexAnchor)

With the anchors set up, we can move on to the collisions. Each entity has a collider with its mode set to trigger.
Example 1: Any entity can collide with any entity. This allows collisions between the two hand anchors, as well as between a hand anchor and the subject.
_ = content.subscribe(to: CollisionEvents.Began.self) { collisionEvent in
    collisionEvent.entityA.components[ParticleEmitterComponent.self]?.burst()
}

Example 2: Only track collisions on the subject. Swap the color of the material based on left or right hand.
_ = content.subscribe(to: CollisionEvents.Began.self, on: subject) { collisionEvent in
    if collisionEvent.entityB.name == "StepSphereBlue" {
        swapColorEntity(subject, color: .stepBlue)
    } else if collisionEvent.entityB.name == "StepSphereGreen" {
        swapColorEntity(subject, color: .stepGreen)
    }
}

Video Demo
Full Example Code
import SwiftUI
import RealityKit
import RealityKitContent

struct Example021: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "HandTrackingLabs", in: realityKitContentBundle) {
                content.add(scene)

                // 1. Set up a Spatial Tracking Session with hand tracking.
                // This will add ARKit features to our Anchor Entities, enabling collisions.
                let configuration = SpatialTrackingSession.Configuration(
                    tracking: [.hand])
                let session = SpatialTrackingSession()
                await session.run(configuration)

                if let subject = scene.findEntity(named: "StepSphereRed"),
                   let stepSphereBlue = scene.findEntity(named: "StepSphereBlue"),
                   let stepSphereGreen = scene.findEntity(named: "StepSphereGreen") {
                    content.add(subject)

                    // 2. Create an anchor for the left index finger
                    let leftIndexAnchor = AnchorEntity(.hand(.left, location: .indexFingerTip), trackingMode: .continuous)
                    // 3. Disable the default physics simulation on the anchor
                    leftIndexAnchor.anchoring.physicsSimulation = .none
                    // 4. Add the sphere to the anchor and add the anchor to the scene graph
                    leftIndexAnchor.addChild(stepSphereBlue)
                    content.add(leftIndexAnchor)

                    // Repeat the same steps for the right index finger
                    let rightIndexAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip), trackingMode: .continuous)
                    rightIndexAnchor.anchoring.physicsSimulation = .none
                    rightIndexAnchor.addChild(stepSphereGreen)
                    content.add(rightIndexAnchor)

                    // Example 1: Any entity can collide with any entity. Fire a particle burst.
                    // Allow collision between the hand anchors.
                    // Allow collision between a hand anchor and the subject.
                    _ = content.subscribe(to: CollisionEvents.Began.self) { collisionEvent in
                        print("Collision unfiltered \(collisionEvent.entityA.name) and \(collisionEvent.entityB.name)")
                        collisionEvent.entityA.components[ParticleEmitterComponent.self]?.burst()
                    }

                    // Example 2: Only track collisions on the subject. Swap the color of the material based on left or right hand.
                    _ = content
                        .subscribe(to: CollisionEvents.Began.self, on: subject) { collisionEvent in
                            print("Collision Subject Color Change \(collisionEvent.entityA.name) and \(collisionEvent.entityB.name)")
                            if collisionEvent.entityB.name == "StepSphereBlue" {
                                swapColorEntity(subject, color: .stepBlue)
                            } else if collisionEvent.entityB.name == "StepSphereGreen" {
                                swapColorEntity(subject, color: .stepGreen)
                            }
                        }
                }
            }
        }
    }

    func swapColorEntity(_ entity: Entity, color: UIColor) {
        if var mat = entity.components[ModelComponent.self]?.materials.first as? PhysicallyBasedMaterial {
            mat.baseColor = .init(tint: color)
            entity.components[ModelComponent.self]?.materials[0] = mat
        }
    }
}

Download the Xcode project with this and many more examples from Step Into Vision.
Some examples are provided as standalone Xcode projects. You can find those here.

Would this work with a head anchor? To use the collision of the head with something to change our scene, or turn on visibility, or animate the opacity of something?
I don’t believe so. I just tried a quick test and unfortunately the head-anchored sphere did not fire a collision. The example above works because ARKit is providing data to the anchor for the purpose of hand tracking. I don’t see a head listed in the tracking capabilities for Spatial Tracking Session.
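For reference, that quick test looked roughly like this (a sketch; probeSphere stands in for any entity that already has a trigger collider):

let headAnchor = AnchorEntity(.head)
headAnchor.anchoring.physicsSimulation = .none
headAnchor.addChild(probeSphere) // placeholder entity with a trigger collider
content.add(headAnchor)
// Unlike the fingertip anchors above, this never produced a CollisionEvents.Began event,
// because the Spatial Tracking Session configuration has no head option to back the anchor with ARKit data.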