Two Realities.
One Core.
We architected a unified construction simulation for PACE (Canada) from the ground up: a single polymorphic codebase that deploys natively to both Holographic AR and Immersive VR hardware.
100% Code Reuse
2 Native Platforms
6DoF Unified Input
Zero Redundancy
AR Input ≠ VR Input.
PACE (Canada) required a training ecosystem that wasn't locked to a single device. They needed a construction simulation that could leverage both the holographic tabletop presence of Tilt Five AR and the immersive isolation of Meta Quest VR.
Building two separate apps would have doubled both development cost and maintenance burden. The challenge was to architect a single interaction system that fluidly adapts to the "Wand & Board" mechanics of AR and the "Hands & Void" mechanics of VR without forking the codebase.
The "Universal Controller"
We built the system from scratch to be hardware-agnostic.
1. Abstraction Layer
We engineered a custom Input Manager that sits between the hardware and the game logic. Whether it's an AR Wand or a VR Touch Controller, the simulation simply receives a "Grab" signal.
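A minimal sketch of the pattern (TypeScript for readability; InputSignal, InputDriver, and the driver class names here are illustrative, not the project's actual API):

```typescript
// The game logic consumes semantic signals, never raw hardware events.
type InputSignal =
  | { kind: "grab"; position: Vec3 }
  | { kind: "release"; position: Vec3 };

interface Vec3 { x: number; y: number; z: number; }

// Each platform ships one driver that translates its raw events
// into the shared signal vocabulary.
interface InputDriver {
  poll(): InputSignal[];
}

class TiltFiveWandDriver implements InputDriver {
  poll(): InputSignal[] {
    // Map wand trigger-down + 6DoF pose to a semantic "grab"
    // at the wand tip (stubbed for illustration).
    return [];
  }
}

class QuestTouchDriver implements InputDriver {
  poll(): InputSignal[] {
    // Map controller grip + pose to the same "grab"/"release"
    // vocabulary (stubbed for illustration).
    return [];
  }
}

// The simulation depends only on the interface, never the hardware.
class InputManager {
  constructor(private driver: InputDriver) {}
  update(onSignal: (signal: InputSignal) => void): void {
    for (const signal of this.driver.poll()) onSignal(signal);
  }
}
```

Because the simulation depends only on the driver interface, supporting a third device means writing one new driver rather than touching the game logic.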
2. Adaptive Environment
The system detects the hardware on boot. If in VR, it generates a full "Zoomable Site Map" to replace the real world. If in AR, it renders only the holographic components on the physical board.
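Conceptually, that boot-time branch might look like the following (hypothetical TypeScript; detectHardware and the scene names are invented for illustration):

```typescript
type Platform = "ar" | "vr";

interface EnvironmentConfig {
  scene: string;
  renderRealWorld: boolean; // AR keeps the physical board visible
}

function detectHardware(): Platform {
  // In practice this would query the connected XR runtime;
  // stubbed here for illustration.
  return "vr";
}

function configureEnvironment(platform: Platform): EnvironmentConfig {
  switch (platform) {
    case "vr":
      // VR replaces the real world with a zoomable virtual site map.
      return { scene: "ZoomableSiteMap", renderRealWorld: false };
    case "ar":
      // AR renders only the holographic components over the board.
      return { scene: "HolographicBoard", renderRealWorld: true };
  }
}

const config = configureEnvironment(detectHardware());
console.log(`Loading scene: ${config.scene}`);
```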
3. Deterministic Assembly
We built a robust, linear assembly logic engine that functions identically across both realities, ensuring that training outcomes are consistent regardless of the device used.
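To illustrate what device-independent assembly logic can look like (a hypothetical TypeScript sketch; AssemblyEngine and the step IDs are invented):

```typescript
interface AssemblyStep {
  id: string;
  requires: string[]; // steps that must be completed first
}

class AssemblyEngine {
  private completed = new Set<string>();

  constructor(private steps: AssemblyStep[]) {}

  // Pure rule check: the result depends only on assembly state,
  // never on which headset produced the triggering "grab".
  canPlace(stepId: string): boolean {
    const step = this.steps.find((s) => s.id === stepId);
    return !!step && step.requires.every((r) => this.completed.has(r));
  }

  place(stepId: string): boolean {
    if (!this.canPlace(stepId)) return false;
    this.completed.add(stepId);
    return true;
  }
}

// The same sequence yields the same outcome on AR and VR:
const engine = new AssemblyEngine([
  { id: "foundation", requires: [] },
  { id: "frame", requires: ["foundation"] },
  { id: "roof", requires: ["frame"] },
]);
engine.place("roof");       // false: the frame is not placed yet
engine.place("foundation"); // true
```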
"The application looks great. We really appreciate the architecture and the speed of delivery on this dual-platform project."
PACE (Canada)