AirPods Motion Tracking (Headphones Recommended)

Spatial audio becomes far more convincing when paired with head tracking: sound sources stay put in space as you turn your head, just as they do in the real world.

Luckily, the AirPods Pro make head tracking straightforward: they have a built-in accelerometer and gyroscope, which Core Motion exposes through CMHeadphoneMotionManager.

Swift code (AirPods Motion):

import CoreMotion
import simd

// Core Motion has no built-in conversion from CMRotationMatrix to simd,
// so bridge the (double-precision, row-major) matrix into a float4x4
extension float4x4 {
    init(rotationMatrix m: CMRotationMatrix) {
        self.init([
            simd_float4(Float(m.m11), Float(m.m21), Float(m.m31), 0.0),
            simd_float4(Float(m.m12), Float(m.m22), Float(m.m32), 0.0),
            simd_float4(Float(m.m13), Float(m.m23), Float(m.m33), 0.0),
            simd_float4(0.0, 0.0, 0.0, 1.0)
        ])
    }
}

// Reads head motion from the AirPods Pro
private let motionManager = CMHeadphoneMotionManager()

// Reference frame (this is what you would use for calibration, but here we
// simply leave it as the identity, i.e. whatever direction the wearer is
// facing when motion updates start)
private var referenceFrame = matrix_identity_float4x4

// Get data from the AirPods Pro for panning the listener (if available)
if motionManager.isDeviceMotionAvailable && !motionManager.isDeviceMotionActive {
    motionManager.startDeviceMotionUpdates(to: .main) { [self] deviceMotion, error in
        if let deviceMotion = deviceMotion {
            let rotation = float4x4(rotationMatrix: deviceMotion.attitude.rotationMatrix)
            // Mirror across the X axis so that head turns pan the audio
            // in the expected direction
            let mirrorTransform = simd_float4x4([
                simd_float4(-1.0, 0.0, 0.0, 0.0),
                simd_float4( 0.0, 1.0, 0.0, 0.0),
                simd_float4( 0.0, 0.0, 1.0, 0.0),
                simd_float4( 0.0, 0.0, 0.0, 1.0)
            ])
            listener.transform = mirrorTransform * rotation * self.referenceFrame
        }
    }
}
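The calibration mentioned in the reference-frame comment can be made concrete: to re-center "forward" on wherever the wearer is currently looking, store the inverse of the current attitude as the reference frame so that it cancels the head rotation in the listener transform. Below is a minimal sketch; `recalibrate()` is a hypothetical helper name, and it assumes the `motionManager`, `referenceFrame`, and `float4x4(rotationMatrix:)` conversion from the snippet above.

```swift
import CoreMotion
import simd

func recalibrate() {
    // Latest sample from the headphones, if any has arrived yet
    guard let attitude = motionManager.deviceMotion?.attitude else { return }
    let rotation = float4x4(rotationMatrix: attitude.rotationMatrix)
    // A rotation matrix's inverse is its transpose; multiplying by it in the
    // listener transform cancels the current attitude, so the wearer's
    // present heading becomes the new "straight ahead" for the spatial mix
    referenceFrame = rotation.transpose
}
```

Wiring this to a "re-center" button (or calling it once when motion updates begin) keeps the mix aligned with the listener rather than with whichever direction the AirPods happened to face at launch.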

The Audio3D example in the video comes from the AudioKit Cookbook. The full tutorial can be found here (https://www.youtube.com/watch?v=dG6ZY8lh2jk), as well as some demo source code here (https://github.com/emurray2/SpatialAudio/blob/main/Apple/SpatialAudio/ContentView.swift).