SpatialOS
Concept OS for AR showcasing various gestures, inputs, and interactions
DOUBLE POINT
2024
SpatialOS was created to evaluate and showcase Doublepoint’s wrist-based gesture detection in real-world scenarios, offering a direct comparison with other industry-standard input methods. The goal was to explore how intuitive, efficient, and versatile different input systems could be in mixed reality environments, helping potential partners understand the unique value of wrist-based control.
To achieve this, we designed SpatialOS as a conceptual AR/MR operating system that let users seamlessly switch between eye-tracking and hand raycasting for pointing, and between hand-tracking and an IMU-equipped smartwatch for selecting. This flexible input framework provided a controlled setting to test and contrast various modalities in practical use cases.
Within SpatialOS, users could navigate menus, scroll through a web browser, explore photo and movie galleries, adjust settings using custom sliders, and even play a bubbles minigame. These interactions grounded the input methods in familiar, everyday tasks—turning the OS into both a demo platform and an experiential benchmark for gesture-based input.
OS | Mixed Reality | Meta Quest Pro | Eye tracking | Hand Tracking | IMU Raycasting | Machine Learning gesture recognition | UI | XR UX |
My role
Lead Developer
Unity / C#
Led development of an AR/MR experience with a focus on rapid prototyping and a modular, swappable architecture that scales across mini-apps and use cases.
Engineered a state management system with dynamic mini-app launching and closing (see the sketch after this list).
Spearheaded concept development and iteration.
Built a custom interaction and UI toolkit supporting modular, real-time input swapping.
Created bespoke UI components: sliders, toggles, buttons, draggable elements, and scrollable areas - with smooth custom animations, weight, and feel.
Integrated external SDKs, including Meta and DoublePoint.
Engineered a custom raycasting system for point-and-select interaction using only smartwatch IMU data - no optical tracking or external hardware required.
Developed all miniapps featured in the OS:
Scrollable web browser
Photo gallery and viewer
Video gallery and player
Settings app
Bubbles minigame
Collaborated closely with designers to bring visual concepts to life.
Conducted user testing and presented the system on the show floor at major industry events (AWE 2023, CES 2023).
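To give a sense of the mini-app architecture mentioned above, here is a minimal Unity/C# sketch of how a dynamic launch/close lifecycle could be structured. The names (IMiniApp, MiniAppManager) are illustrative, not the actual SpatialOS code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Each mini-app implements a common lifecycle so the OS can launch
// and close it dynamically without app-specific wiring in the core.
public interface IMiniApp
{
    string Id { get; }
    void Open();   // build UI, register input listeners
    void Close();  // tear down UI, release resources
}

public class MiniAppManager : MonoBehaviour
{
    readonly Dictionary<string, IMiniApp> registered = new Dictionary<string, IMiniApp>();
    IMiniApp active;

    public void Register(IMiniApp app) => registered[app.Id] = app;

    public void Launch(string id)
    {
        if (!registered.TryGetValue(id, out var app)) return;
        active?.Close();          // one mini-app in focus at a time
        active = app;
        active.Open();
    }

    public void CloseActive()
    {
        active?.Close();
        active = null;
    }
}
```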
Challenges & Solutions
Challenge: Seamlessly support multiple input types
The Problem: Enabling dynamic switching between input methods (e.g., gesture, selection, device-based) without conflicts or hard-coded limitations.
My Solution: I built a custom input framework with a modular, extensible architecture. It supports a growing library of pointers and selectors—each plug-and-play. New input types can be added effortlessly, allowing them to work together without rewriting core logic.
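As a rough illustration of that plug-and-play idea, the sketch below pairs any pointer with any selector behind small interfaces, so a new input type only has to implement one of them. The names (IPointer, ISelector, InputRouter) are hypothetical simplifications, not the real framework's API.

```csharp
using UnityEngine;

// A pointer answers "where is the user aiming?" - eye gaze, hand ray, or IMU ray.
public interface IPointer
{
    Ray GetPointerRay();
}

// A selector answers "did the user confirm?" - a pinch, a wrist gesture, etc.
public interface ISelector
{
    bool SelectDown { get; }
}

public class InputRouter : MonoBehaviour
{
    public IPointer ActivePointer { get; private set; }
    public ISelector ActiveSelector { get; private set; }

    // Any pointer can be paired with any selector at runtime,
    // so new input types slot in without touching core logic.
    public void SetPointer(IPointer pointer) => ActivePointer = pointer;
    public void SetSelector(ISelector selector) => ActiveSelector = selector;

    void Update()
    {
        if (ActivePointer == null || ActiveSelector == null) return;

        Ray ray = ActivePointer.GetPointerRay();
        if (ActiveSelector.SelectDown && Physics.Raycast(ray, out var hit))
        {
            // Forward the selection to whatever UI element was hit.
            hit.collider.SendMessage("OnSelect", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```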
Challenge: Ray pointer alignment using only IMU data
The Problem: How to keep a ray pointer aligned with the user’s arm using only IMU data from a smartwatch—no optical tracking, no absolute position. The watch only provides acceleration and orientation deltas, making stability and consistency tricky, especially during erratic movement.
My Solution: I used the direction of gravity to define "up" and aligned the ray with the camera when the arm is down. When the user lifts their arm to point, the system uses that calibrated rotation to align the ray forward. The ray’s position is offset from the camera and adjusted by its height to remain anchored to the user’s arm in motion.
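A simplified Unity/C# sketch of how that calibration and per-frame alignment could look; the shoulder offset, height scaling, and helper names are assumptions for illustration, not the production implementation.

```csharp
using UnityEngine;

// Gravity fixes the world "up" axis; a calibration step captures the yaw
// offset between the watch IMU and the headset while the arm hangs down.
public class ImuRayAligner : MonoBehaviour
{
    public Transform head;                                           // headset camera
    public Vector3 shoulderOffset = new Vector3(0.2f, -0.25f, 0f);   // assumed arm anchor

    Quaternion yawOffset = Quaternion.identity;

    // Call while the arm is down: match the watch's forward to the camera's
    // forward, keeping only the rotation about the gravity (up) axis.
    public void Calibrate(Quaternion watchOrientation)
    {
        Vector3 watchForward = FlattenOntoGround(watchOrientation * Vector3.forward);
        Vector3 headForward  = FlattenOntoGround(head.forward);
        yawOffset = Quaternion.FromToRotation(watchForward, headForward);
    }

    // Each frame: the calibrated yaw offset brings the IMU orientation into the
    // headset's frame; the ray origin is anchored near the shoulder and raised
    // or lowered with the estimated arm pitch.
    public Ray GetRay(Quaternion watchOrientation)
    {
        Quaternion aligned = yawOffset * watchOrientation;
        Vector3 direction  = aligned * Vector3.forward;

        Vector3 origin = head.position + head.rotation * shoulderOffset;
        origin.y += direction.y * 0.3f;   // rough height adjustment as the arm lifts

        return new Ray(origin, direction);
    }

    static Vector3 FlattenOntoGround(Vector3 v)
    {
        v.y = 0f;                          // gravity defines "up", so drop the vertical part
        return v.normalized;
    }
}
```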
Challenge: Making UI interactions feel human and playful
The Problem: Most UI interactions feel sterile. A perfectly straight ray pointer feels rigid, robotic—lacking expression and nuance.
My Solution: I reimagined the ray as something alive. I gave it dynamic bend based on interaction weight - each UI element (like a slider handle) has a custom “mass.” Light interactions bend the ray slightly, feeling snappy and responsive - perfect for casual flicking through social content. Heavier elements bend it more, creating tension and precision - great for tasks like scrubbing to a specific timestamp or setting the volume to exactly 45%. It’s tactile without touch.
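One way such a weighted bend could be drawn is sketched below: the visual ray is rendered as a quadratic Bezier whose control point is pulled toward the hovered element in proportion to its "mass". The mass scaling and LineRenderer setup are illustrative assumptions rather than the exact SpatialOS values.

```csharp
using UnityEngine;

// Heavier elements pull the curve more, creating tension and precision;
// lighter ones leave it nearly straight and snappy.
public class WeightedRayVisual : MonoBehaviour
{
    public LineRenderer line;
    public int segments = 20;
    public float bendScale = 0.15f;   // assumed tuning value

    public void Draw(Vector3 origin, Vector3 straightEnd, Vector3 elementPosition, float elementMass)
    {
        // Control point pulled toward the element, scaled by its mass.
        Vector3 midpoint = Vector3.Lerp(origin, straightEnd, 0.5f);
        Vector3 control  = Vector3.Lerp(midpoint, elementPosition, Mathf.Clamp01(elementMass * bendScale));

        line.positionCount = segments + 1;
        for (int i = 0; i <= segments; i++)
        {
            float t = i / (float)segments;
            // Quadratic Bezier through origin, control point, and the ray end.
            Vector3 p = (1 - t) * (1 - t) * origin
                      + 2 * (1 - t) * t * control
                      + t * t * straightEnd;
            line.SetPosition(i, p);
        }
    }
}
```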