Using onGeometryChange3D to scale RealityView content when a Volume is resized

We can use this modifier to access the geometry of a view without using GeometryReader.

Overview

For a deep dive into this topic, read this article by Drew Olbrich: If you’ve created a visionOS app with a volume, you probably did it wrong

Our previous example used GeometryReader3D. Listening for changes with it works, but it can be very chatty: we may only care about the size of a volume, yet we also receive updates whenever the volume moves.

This example shows a new option we learned while attending a visionOS Workshop. We can use onGeometryChange3D to listen for changes to the volume. This modifier is a bit more involved than most SwiftUI modifiers: first we tell it what type of data we want from the geometry, then we extract that value from the GeometryProxy3D in the transform closure so it becomes available to the action closure. With those two steps done, we can use the action closure to work with the data.

SomeView()
    // Set the type we want to work with. In this case, we'll use Rect3D
    .onGeometryChange3D(for: Rect3D.self) { proxy in
        return proxy.frame(in: .global) // extract the Rect3D from the GeometryProxy3D
    } action: { newValue in // newValue is a Rect3D
        volumeSize = newValue.size // We can read the size of the Rect3D as a Size3D
    }

So how do we use this? Garden 034 is a duplicate of Garden 032. We’ll remove the GeometryReader3D and delete scaleWithVolume. We’ll need a converter from physicalMetrics.

@Environment(\.physicalMetrics) var physicalMetrics

We’ll also need to keep track of some state.

/// This will store the size of the volume
@State private var volumeSize: Size3D = .zero

/// A top level entity that will be scaled with the volume. Only the scene's root entity will be added to this
@State private var volumeRootEntity = Entity()

/// A place to store the bounds of our 3D content
@State private var baseExtents: SIMD3<Float> = .zero

We can load some content from Reality Composer Pro and add it as a child to volumeRootEntity. We’ll measure the base extents.

RealityView { content in
    content.add(volumeRootEntity)

    // Load a scene from Reality Composer Pro and add it to volumeRootEntity
    guard let baseRoot = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
    volumeRootEntity.addChild(baseRoot, preservingWorldTransform: true)

    // Capture the base extents from visual bounds
    baseExtents = baseRoot.visualBounds(relativeTo: nil).extents / baseRoot.scale

    // Call our new function once for the initial size
    scaleContent(by: volumeSize)
}
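A side note on the extents line: visualBounds(relativeTo:) reports bounds at the entity's current scale, which appears to be why the snippet divides by baseRoot.scale. Pulled out as a plain helper (the function name is mine, for illustration only), the arithmetic looks like this:

```swift
/// Recover an entity's unscaled ("base") extents from its measured visual bounds.
/// The measured extents include the entity's current scale, so dividing by that
/// scale (elementwise) gives scale-independent extents we can reuse later.
func unscaledExtents(visualExtents: SIMD3<Float>, entityScale: SIMD3<Float>) -> SIMD3<Float> {
    visualExtents / entityScale
}
```

If the entity has never been scaled, entityScale is (1, 1, 1) and the division is a no-op, but keeping it makes the measurement safe regardless of prior transforms.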

Now let’s add onGeometryChange3D. This will capture the volume size and call a new function.

// Anytime the volume changes in size we'll scale the RealityView content
.onGeometryChange3D(for: Rect3D.self) { proxy in
    return proxy.frame(in: .global)
} action: { new in
    volumeSize = new.size
    scaleContent(by: volumeSize)
}

Let’s add a helper function called scaleContent(by:). We pass it the volume size. This is where the physicalMetrics converter comes in handy: we convert the volume's width to meters and divide it by the base extents.

Note: this example assumes a volume with equal lengths on each side (a cube), so we’ll use the width of the volume and the x axis of the extents.

/// Scale the 3D content based on the size of the Volume
func scaleContent(by volumeSize: Size3D) {
  let scale = Float(physicalMetrics.convert(volumeSize.width, to: .meters)) / baseExtents.x
  volumeRootEntity.setScale(.init(repeating: scale), relativeTo: nil)
}
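If your volume isn't cubic, one possible variation (the helper name and shape are mine, not from the original project) is to convert all three dimensions to meters and take the smallest per-axis ratio, so a uniform scale still fits the content on every axis:

```swift
/// Hypothetical helper: given a volume size already converted to meters and the
/// unscaled extents of the content, pick the largest uniform scale that still
/// fits the content inside the volume on every axis.
func uniformScale(volumeMeters: SIMD3<Float>, baseExtents: SIMD3<Float>) -> Float {
    // Elementwise ratios: how far each axis could scale independently.
    let ratios = volumeMeters / baseExtents
    // A uniform scale must satisfy the tightest axis, so take the minimum.
    return ratios.min()
}
```

Inside scaleContent(by:) you would build volumeMeters by running width, height, and depth through physicalMetrics before calling this, then pass the result to setScale as before.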

With everything in place, the RealityView content now scales when we resize the volume. We can drop GeometryReader3D, which can be cumbersome to work with.

Video demo showing content scaled based on volume size

What do you think of this approach? Would you use this in your apps or do you prefer GeometryReader3D?

Example Code

struct ContentView: View {

    /// We'll need to use a converter from physicalMetrics
    @Environment(\.physicalMetrics) var physicalMetrics

    /// This will store the size of the volume
    @State private var volumeSize: Size3D = .zero

    /// A top level entity that will be scaled with the volume. Only the scene's root entity will be added to this
    @State private var volumeRootEntity = Entity()

    /// A place to store the bounds of our 3D content
    @State private var baseExtents: SIMD3<Float> = .zero

    var body: some View {
        RealityView { content in
            content.add(volumeRootEntity)

            // Load a scene from Reality Composer Pro and add it to volumeRootEntity
            guard let baseRoot = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            baseRoot.name = "TableScene"
            baseRoot.position = [0, -0.45, 0] // Move this down to the bottom of the volume
            volumeRootEntity.addChild(baseRoot, preservingWorldTransform: true)

            // Capture the base extents from visual bounds
            baseExtents = baseRoot.visualBounds(relativeTo: nil).extents / baseRoot.scale

            // Call our new function once for the initial size
            scaleContent(by: volumeSize)
        }
        // Anytime the volume changes in size we'll scale the RealityView content
        .onGeometryChange3D(for: Rect3D.self) { proxy in
            return proxy.frame(in: .global)
        } action: { new in
            volumeSize = new.size
            scaleContent(by: volumeSize)
        }
    }

    /// Scale the 3D content based on the size of the Volume
    func scaleContent(by volumeSize: Size3D) {
        let scale = Float(physicalMetrics.convert(volumeSize.width, to: .meters)) / baseExtents.x
        volumeRootEntity.setScale(.init(repeating: scale), relativeTo: nil)
    }
}

Support our work so we can continue to bring you new examples and articles.

Download the Xcode project with this and many more examples from Step Into Vision.
Some examples are provided as standalone Xcode projects. You can find those here.

Questions or feedback?