(If you watched the Vision Pro announcement... sound familiar? It's no coincidence. Apple seriously considered buying Meta back in 2017. Our 'spatial design principles' and vision for a SpatialOS were a big reason why. Oh, what could have been…)
We would place virtual monitors all around us at limitless scale. We would free 3D models from the confines of 2D screens, visualizing and interacting with them as they were always intended: spatially.
Gone were the days of the mouse & keyboard. Thanks to computer vision, we would use our hands to more naturally & directly interact with the digital world, just as we do the physical.
This same computer vision tech would turn the world into our desktop background, understanding the environment and anchoring actualized figments of imagination all around us.
Meta, the OG Meta... was going to build the true iteration of Steve Jobs's 'bicycle of the mind': a computer that grandma or a child could pick up and intuitively know how to use, with zero learning curve.
Oh, how beautiful the vision was...
But oh... how naive we were.
The Revenge of the Meta 2
The office that day of the ‘Bloomberg rehearsal’ was buzzing with anticipation.
For the first time, we each received our own headset. The packaging was a work of art: a beautiful piece of engineering and design, accompanied by a thoughtful developer guide and a document outlining our 'spatial design principles'.
The first step: plugging the tethered device into a computer and then into the wall for power (yes, the sequencing mattered...).
It was a collective stumble right out of the gate. Our computers failed to recognize the device. For the next hour, we twiddled our thumbs while the engineers scrambled to fix a bug and redistribute the SDK (software development kit).
Hot start.
Once the 'Spatial OS' finally launched, the user was tasked with calibrating the sensors and mapping the world.
A graphical UI instructed you to look up, look down, look left, look right.
The next 5-10 minutes were a comical display of 100+ people in a state of manic indecision; stuck between vigorous yeses and nos; shaking our heads this way and that; waiting, hoping, yearning for the cameras to lock on to our physical surroundings.
Some devices registered the real world within a few minutes. Other poor souls were left doing neck exercises well beyond that.
If you were lucky enough to create your environment map, the OS would finally launch. Its interface looked like a holographic bookshelf, each shelf lined with floating orbs representing a variety of spatial apps.
But upon launch, exactly where this holographic shelf appeared in space was anyone's guess.
For some, it was down on the floor. For others, it was off on the distant horizon or behind them. For the next 10 minutes, we collectively embarked on a holographic treasure hunt at our desks, searching up, down, and all around for our 'app launcher'.
My holographic shelf was above me, anchored to the ceiling.
Now, the primary way to interact with these holograms was with your hands: you had to reach out and grab them. But doing so was quite the art... it required your hand to be in the perfect spot, at just the right proximity to the hologram. When you found that magic zone, a circle would appear.
Then, and only then, could you close your hand and 'grab' the hologram. The camera on the headset needed to see a very distinct gesture: a wide-open hand and then a distinctively closed fist. When the cameras saw this movement, the circle UI would become a dot, confirming the hologram was secured.
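That hover-circle-fist-dot flow is, in effect, a tiny state machine. Here's a minimal sketch of the idea; every name, threshold, and structure below is hypothetical and invented for illustration, not anything from Meta's actual SDK:

```python
# Hypothetical sketch of the grab-gesture flow described above.
# All names and thresholds are invented; this is not Meta's SDK.
from dataclasses import dataclass

@dataclass
class HandFrame:
    distance_to_hologram: float  # meters from palm to the target hologram
    openness: float              # 0.0 = closed fist, 1.0 = wide-open hand

IDLE, HOVER, GRABBED = "idle", "hover", "grabbed"  # circle shows in HOVER, dot in GRABBED

def step(state: str, frame: HandFrame,
         magic_zone: float = 0.15,     # hand must be this close (hypothetical value)
         open_thresh: float = 0.8,
         closed_thresh: float = 0.2) -> str:
    in_zone = frame.distance_to_hologram < magic_zone
    if state == IDLE:
        # The circle UI only appears with a wide-open hand inside the magic zone.
        return HOVER if in_zone and frame.openness > open_thresh else IDLE
    if state == HOVER:
        if not in_zone:
            return IDLE
        # A distinctly closed fist turns the circle into a dot: grab confirmed.
        return GRABBED if frame.openness < closed_thresh else HOVER
    # GRABBED: re-opening the hand releases the hologram back to hover.
    return HOVER if frame.openness > open_thresh else GRABBED

# The sequence the cameras needed to see:
frames = [HandFrame(0.5, 1.0),   # reaching, but too far away -> still idle
          HandFrame(0.1, 1.0),   # open hand in the magic zone -> circle
          HandFrame(0.1, 0.1)]   # closed fist -> dot
state = IDLE
for f in frames:
    state = step(state, f)
print(state)  # grabbed
```

The comedy in the office came from how narrow those thresholds effectively were: miss the zone or close your hand too ambiguously, and the machine simply stays put.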
This led to yet another comical sight: an entire office of people waving their hands in the air, trying to materialize that circle. Everyone was flailing about, groping the air, repeatedly trying to turn that circle into a dot. We became a horde of perverts molesting invisible objects of desire.
I stood up and reached longingly into the air for my holographic shelf, only to be immediately yanked back into my chair by the tether.
Screw it. I resorted to using the mouse we so vehemently vowed to replace. It was a fallback form of input, controlling a 'spatial cursor' that allowed me to click on the 3D shelf and pull it closer.
Finally, I could start pulling out little apps & experiences, placing them all around me at my desk. For a split second I was living in the future.