Lab 007 – Anchor an attachment to a hand

Create a tracked entity that updates based on an anchor, then parent an attachment entity to it. No hand tracking or ARKit required.

Last week @JohnAdams_IV asked me how to create a bit of UI that could be anchored to a hand. My first thought was to dive into ARKit and use hand tracking, but then I remembered seeing something about Anchor Entities during WWDC. AnchorEntity ended up being a great option for this since we can use it in full spaces even when we are not otherwise using ARKit.

This lab does just a few things.

  1. Set up the tracked entity
  2. Create some UI to pass in as an attachment
  3. Load the attachment and add it as a child of the tracked entity
import SwiftUI
import RealityKit

struct Lab007: View {

    // 1. Set up a tracked entity with an anchor
    // RealityKit will update this in real time
    // No need for ARKit or hand tracking
    @State var handTrackedEntity: Entity = {
        let handAnchor = AnchorEntity(.hand(.left, location: .aboveHand))
        return handAnchor
    }()

    @State var scaler: Float = 1.0
    @State var target: Entity?


    var body: some View {
        RealityView { content, attachments in

            let model = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .black, isMetallic: false)])
            model.position = SIMD3(x: 0.8, y: 1, z: -2)
            target = model
            content.add(model)

            // Make sure to add the hand-tracked entity to the scene graph
            content.add(handTrackedEntity)

            // 3. Load the attachment
            if let attachmentEntity = attachments.entity(for: "AttachmentContent") {

                // Add a billboard component so the attachment keeps facing the user
                attachmentEntity.components[BillboardComponent.self] = .init()

                // Add the attachment as a child of the tracked entity
                handTrackedEntity.addChild(attachmentEntity)

            }

        } update: { content, attachments in

            print("Scaling target: \(scaler)")
            target?.scale = .init(repeating: scaler)

        } attachments: {

            // 2. Create the attachment view
            Attachment(id: "AttachmentContent") {
                HStack(spacing: 12) {
                    Button(action: {
                        print("Button one pressed")
                        // Clamp so the sphere never scales to zero or inverts
                        scaler = max(0.5, scaler - 0.5)
                    }, label: {
                        Text("One")
                    })

                    Button(action: {
                        print("Button two pressed")
                        scaler += 0.5
                    }, label: {
                        Text("Two")
                    })
                }
            }

        }
    }
}

This works pretty well but has some limitations. Since the UI is anchored to the left hand, the user can't press the buttons with that hand; the UI moves away every time they reach for it. We can still look and pinch to activate the buttons. I would use UI like this sparingly. It can work for simple controls, but in my experience UI like this gets tiresome to use.
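If `.aboveHand` on the left hand proves awkward, `AnchorEntity` supports other hand placements. This is a sketch of alternatives, not part of the lab above; it assumes the standard `AnchoringComponent.Target.HandLocation` cases (`.aboveHand`, `.palm`, `.wrist`):

```swift
import RealityKit

// Sketch: alternative hand-anchor placements (assumes the standard
// AnchoringComponent.Target.HandLocation cases; not used in the lab above).

// Anchor above the right hand instead, leaving the left hand free to press:
let rightHandAnchor = AnchorEntity(.hand(.right, location: .aboveHand))

// Pin the UI to the palm of the left hand:
let palmAnchor = AnchorEntity(.hand(.left, location: .palm))

// Or follow the wrist, which moves less than the fingers:
let wristAnchor = AnchorEntity(.hand(.left, location: .wrist))
```

Any of these can stand in for the `handTrackedEntity` initializer above; the rest of the view stays the same.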

Video Demo

Download the Xcode project with this and many more labs from Step Into Vision.

Questions or feedback?