iOS 13 + ARKit 3 + Body Tracking = an invisible AR guitar that plays different sounds when strummed, selected by the position of the left hand along the invisible guitar's neck.
This project was a quick dive into RealityKit and ARKit 3's body tracking (`ARBodyTrackingConfiguration`) to explore body-based controls.
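The core of body tracking is small: run an `ARBodyTrackingConfiguration` on the session and read joint transforms from each `ARBodyAnchor` update. The sketch below shows that skeleton (class and property names are illustrative, not the demo's actual code), assuming a device with an A12 chip or later:

```swift
import ARKit
import RealityKit
import UIKit

final class BodyTrackingViewController: UIViewController, ARSessionDelegate {
    // Hypothetical ARView set up elsewhere (storyboard or code).
    var arView: ARView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Body tracking is only supported on A12-and-later devices.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        arView.session.delegate = self
        arView.session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            let skeleton = bodyAnchor.skeleton
            // Joint transforms are relative to the body anchor's origin (the hip).
            if let leftHand = skeleton.modelTransform(for: .leftHand),
               let rightHand = skeleton.modelTransform(for: .rightHand) {
                let leftPosition = simd_make_float3(leftHand.columns.3)
                let rightPosition = simd_make_float3(rightHand.columns.3)
                // Feed the hand positions into the guitar logic:
                // left hand selects the sound, right hand triggers the strum.
                _ = (leftPosition, rightPosition)
            }
        }
    }
}
```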
To watch the video with sound, check it out on Twitter.
I wrote more about building this demo and my initial thoughts on RealityKit.
- Run the app and point it at a person
- Hold the left hand up along the invisible guitar's neck
- Strum with the right hand
- Move the left hand to select different sounds
- Strum with the right hand again
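The steps above reduce to one piece of pure logic: quantizing the left hand's distance along the invisible neck into a sound index. A minimal sketch, assuming a neck length and sound count that are illustrative rather than the demo's actual values:

```swift
import simd

/// Number of distinct sounds selectable along the invisible neck (assumed).
let soundCount = 4
/// Assumed length of the invisible neck, in meters.
let neckLength: Float = 0.6

/// Maps the left hand's distance from the neck's origin (e.g. near the
/// body) to a sound index in 0..<soundCount.
func soundIndex(forLeftHand leftHand: SIMD3<Float>,
                neckOrigin: SIMD3<Float>) -> Int {
    let distance = simd_distance(leftHand, neckOrigin)
    // Clamp so a fully outstretched arm still maps to the last sound.
    let normalized = min(max(distance / neckLength, 0), 0.999)
    return Int(normalized * Float(soundCount))
}
```

On each right-hand strum, the app would play whichever sound this index currently selects.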
Credit to https://freesound.org for the great free sounds.