VR and AR will expand the limits of human perception

Not everyone experiences the world in the same way. Whether it’s how you react to the results of an election or what tones you hear in a sound clip, observable reality is often not as objective as you think it is.

Emerging technologies such as augmented reality will further blur this line. With AR on mobile devices and head-mounted displays, we’re already living the beginning of an augmented life. Humans are doing a lot of fun things right now, like integrating playful games into our world and painting ourselves with digitally applied effects and makeup. We’re also starting to find utility for AR in the workplace, aided by hardware designed specifically for the enterprise market.

But AR is not just about changing how we see. Rather than AR being merely the technology through which we view the world, humans will become the common device for the combined knowledge of the species. We will dive past display technology to deeper integration with AIs and instantly searchable databases. We will be able to intuitively read reactions from the dilation of each other’s pupils and the pulse under our skin. The best judgment calls will be made through instantly accessible shared data. Want to buy a car, for example? Analyze the salesperson’s biometric response to your questions, and scan satellite imagery to see how much bargaining power you have based on how long the vehicle has sat in the lot.

We won’t just access these interpretations of reality through headsets like Magic Leap’s or Microsoft’s HoloLens. We have AR-enabled contact lenses to look forward to, or maybe we’ll just leapfrog ahead to brain implants that bypass our optic nerves entirely. If that sounds ridiculous and farfetched to you, just consider the 9th century’s glass reading stones or the first spectacles 400 years later. The inventors of these “innovations” could not have anticipated our use of light to correct vision, or predicted the experimental brain-connected interfaces being used in therapy today.

While bioethicists, geneticists, and politicians debate the ethics of human modification, technologists have free rein. We can argue over the advantages of today’s level of augmentation, but there can be no doubt that the arms race is on to make this interface more seamless. The question is not if these technologies will change our experience of reality, but how quickly.

These advances feel almost evolutionary. Humans are arriving at this heightened state through technology, but many animals already sense things we can’t. Pit vipers “see” thermal radiation of their prey, salmon are sensitive to magnetic fields, and arctic reindeer see into the UV spectrum to distinguish between food and snow.

Our differing experience of reality was made abundantly clear to me by my panther chameleon, Mr. Green. One day while passing through an underground garage on the way to his favorite hydrangea hunting ground, Mr. Green suddenly became skittish and completely on edge, ducking and weaving, eyes glued to the ceiling. We never saw the threat that he did in this space: a canopy of hot pipes and fluorescent lights.

Chameleons perceive light beyond our relatively limited human visible spectrum. To Mr. Green, the mess of infrastructure above our heads was a flashing danger sign; to us, it was simply an eyesore. He saw what we did not, and his reality was therefore quite different from the humans around him.

I thought of Mr. Green the next time I put on an AR headset and found myself gesticulating to the invisible. My own visual experience was completely unseen by the people around me, yet to me, it was very real.

What you see and your understanding of it will soon be different from the person next to you, and we will no longer have a common experience of our shared environment. When AR arrives in its fuller and more integrated state, the challenge for our technologically tiered society will be how we stay in sync with one another. Who will be the chameleons of the future, and what will they see that the rest of us cannot?

Boo Wong
Group director of emerging technology at The Mill

Find out more by attending The Mill's Conference Session:
From VFX to real-time product visualisation
Jarrad Vladich, Executive Producer, Emerging Technology, The Mill

Wednesday 16th October, 13:15 in the Eureka Conference theatre