Spatial Computing’s First Customers Are Workers, Not Consumers
For three years, the consumer XR press cycle has been a slow drumbeat of headset rumours: Vision Pro 2, Meta Quest 4, Snap Specs, Samsung Galaxy XR. The framing has been that spatial computing is “coming”. The story we have not quite paid attention to is that, for hundreds of thousands of people, it is already here. It is just on a different face from the one the press releases suggested.
What Actually Happened in April 2026
On 9 April, Vuzix confirmed that Amazon has expanded its smart-glasses deployment from initial European trials into a full commercial rollout across the United States and Canada. The order covers more M400 units alongside a fresh run of Ultralite Pro glasses, deployed across warehouse logistics, server infrastructure and remote maintenance teams. Vuzix CEO Paul Travers framed the move as a step into “broader commercial rollout across multiple regions and use cases.” The press release talks about efficiency and “see what I see” remote-expert support. In effect, this is the largest mass deployment of always-on AR cameras the world has seen.
A few weeks earlier, Google rolled out its April 2026 Android XR update to Samsung Galaxy XR headsets. The headline features were 2D-to-3D conversion, hand tracking improvements and the ability to pin apps to physical walls. The quieter announcement, for our purposes, was that Android Enterprise now supports XR. Organisations can deploy headsets at scale through MDM partners including Microsoft Intune, Samsung Knox Manage and Omnissa Workspace ONE. Spatial computing is now being managed the way a phone fleet is managed.
On 21 April, YouTube extended its biometric likeness-detection technology from creators, journalists and government officials to the entertainment industry. CAA, UTA, WME and Untitled Management are the launch partners. To enrol, talent submit government ID and a self-recorded video, after which YouTube scans every upload for their face and surfaces matches for review. The framing is anti-deepfake protection; the underlying mechanism is normalised biometric face-print matching.
And on 2 August, the EU AI Act enters full enforcement. The provision worth flagging here is the prohibition on emotion-recognition systems in workplaces and education settings, alongside high-risk classifications for most remote biometric identification. Parts of the Act have been live since 2025, but August is when the rules with real bite begin.
“Humans Empowered by Spatial AI” – and Some Quiet Questions
AWE 2026 in Long Beach this June is themed “I, Spatial: Humans Empowered by Spatial AI”. It is a great line, and it tells you what the industry would like the conversation to be about. We have spent fifteen years at Visualise making immersive content for some of the platform companies named above, including Google, Meta and Snap. We are not anti-XR. We think the technology is genuinely useful in industrial settings, and we have argued for years that XR has been undersold as a workforce tool, particularly in healthcare training. The recent NHS Supply Chain £210m medical-simulation framework is one of the most positive UK signals in a decade.
The discomfort is not with the use case. It is with the asymmetry. The wearer of a smart-glasses pair on an Amazon fulfilment floor did not pick the hardware, did not negotiate the data terms, and will not see the dashboards built from their hand movements, dwell times and conversations with co-workers. The wearer’s face, voice and motion are now feedstock for a productivity model owned by their employer’s vendor. “Empowerment” is a word that loads its own answer. The harder, more interesting question is the one the EU AI Act has had to write into statute: who is being read, by whom, and on what terms.
Worth saying clearly: not every hardware company behaves the same way. HTC’s VIVE Arts programme has invested in galleries and cultural institutions for years. Some smaller manufacturers do similar work. We mention this because the easy version of this argument is “big tech bad”, and the easy version is wrong. The interesting version is structural. When the market that scales fastest is the enterprise contract, the technology adapts to that buyer’s incentives, and consumer XR ends up living in the shadow of decisions made for warehouse rollouts.
What This Means for Brands, Producers and the People in the Glasses
If you are commissioning an immersive project in 2026, three things are now true that were not true a year ago.
First, the platforms you build for are increasingly enterprise-managed. Android XR’s Enterprise tier means brand experiences and training tools will sit alongside fleet-managed devices with MDM controls. The same content stack now serves a luxury retail activation and a forklift driver’s heads-up display. Designing for that breadth, without flattening it, is the new craft.
Second, biometric data is the default, not the exception. Eye-tracking, hand-tracking, facial geometry, voice patterning – it is now ambient. Brands and producers should assume their experiences will be subject to the same questions YouTube is having to answer: what is captured, what is retained, who has access. The EU AI Act will force this conversation in the EU; the US will follow state by state, with Connecticut, Colorado and Illinois already legislating around AR/VR transparency, neural data and AI-driven employment decisions.
Third, the “consumer XR launch” everyone is waiting for has already happened, just not in the form we were promised. The first generation of mass-market spatial computing is on the people with the least power to opt out of it. That is not a reason to abandon the technology. It is a reason to commission, design and produce in a way that makes the asymmetry harder, not easier.
See It in Action
We have explored these tensions across our work. Our retail and luxury activations for clients including Louis Vuitton and Selfridges have always been opt-in, on personal devices, with explicit consent flows. Our work with Bentley on virtual tours built from 360° content is designed around the customer choosing to engage, not being scanned in passing. Our healthcare training work, including projects with St John Ambulance, treats biometric data as something to be minimised by default. Explore the full portfolio at visualise.com/work.
If you are exploring how immersive technology can serve your brand, your training programmes, or your audience without flattening the people on the other side of the camera, we would love to talk. Get in touch at visualise.com/contact.