Current augmented reality systems have two serious flaws that – if left unfixed – will limit the technology’s potential
The term ‘disruptive technologies’ was coined by the American scholar Clayton M. Christensen in 1995 to refer to innovations that create a new market by providing a different set of values, and that ultimately overtake an existing market.
A report published by PwC has shown that augmented reality has the disruptive potential to add over a trillion dollars to the global economy by 2030, but – despite the hype – work is needed before we get there.
According to a recent report from specialist AR/VR industry analysts Greenlight Insights, solving two fundamental optical issues would unlock an additional $10bn in spending on enterprise AR applications by 2026. Vergence-accommodation conflict (VAC) and focal rivalry have been holding back AR experiences since the industry’s inception.
These optical terms may be relatively unknown in the hardware sector, but they are responsible for blocking the true potential of enterprise AR applications.
Anyone who has worn – or worked in – an AR headset for any length of time will recognise their effects: eye fatigue, nausea, an inability to read text up-close and the struggle to complete precision tasks that rely on real and virtual content being well integrated.
Greenlight Insights’ analysts estimate that 95 per cent of current AR applications would see an immediate benefit if these issues were solved, and have identified a number of candidate solutions on the market.
Seeing the big picture
VAC is an issue that impacts both user comfort and the ability for developers to create truly interactive experiences within augmented reality settings.
Why? Because it decouples the eyes’ natural vergence and accommodation responses. As an object gets closer, our eyes turn inwards to triangulate on it (vergence), which stimulates a matching change in focus (accommodation). Because the optics in an AR headset project virtual content at a fixed focal distance, this second response never happens. Our eyes cannot be fooled by software – the focal distance is fixed, and our eyes know it.
As virtual objects are brought closer to you, the mismatched cues reaching your brain cause discomfort and fatigue – and that discomfort prevents these devices from becoming all-day wearables.
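The geometry behind this mismatch can be sketched in a few lines. This is a minimal illustration, not a model of any specific headset: the 63 mm interpupillary distance and the 2.0 m fixed focal plane are assumed, illustrative values.

```python
import math

IPD_M = 0.063        # interpupillary distance in metres (assumed, typical)
FOCAL_PLANE_M = 2.0  # fixed focal distance of the headset optics (assumed)

def vergence_angle_deg(distance_m: float) -> float:
    """Angle the eyes converge through to fixate an object at distance_m."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_conflict_d(distance_m: float) -> float:
    """Mismatch, in dioptres, between the distance the eyes converge on
    and the fixed distance the headset forces them to focus at."""
    return abs(1 / distance_m - 1 / FOCAL_PLANE_M)

for d in (2.0, 1.0, 0.5):
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.1f} deg, "
          f"conflict {accommodation_conflict_d(d):.1f} D")
# object at 2.0 m: conflict 0.0 D (content sits on the focal plane)
# object at 1.0 m: conflict 0.5 D
# object at 0.5 m: conflict 1.5 D
```

Note how the conflict is zero when virtual content sits on the focal plane and grows rapidly inside one metre – which is exactly where the comfort problems described above appear.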
At arm’s length
The inability to comfortably view content up close forces developers to restrict engagement to well outside of arm’s reach – creating a defining limitation known as ‘the one metre barrier’. Microsoft, in its HoloLens developer guide, advises that, ‘content developers should attempt to structure content scenes to encourage users to interact one metre or farther away from the content.’
AR will also struggle to realise its true potential if we fail to address a further issue – focal rivalry – which impacts both the believability and accuracy of AR experiences.
Focal rivalry arises when you try to view real and virtual content together. Because the two sit at different focal distances, you can focus on one, or the other, but not both. This is very limiting if real and virtual content need to be viewed together – and even more so if you want to interact believably and accurately with virtually rendered objects.
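The rivalry can be made concrete with the same dioptre arithmetic. In this sketch, real and virtual content only look sharp together if the gap between their focal distances falls within the eye’s depth of field – the ±0.3 dioptre tolerance and the 2.0 m virtual focal plane are both assumed, illustrative values.

```python
FOCAL_PLANE_M = 2.0     # fixed virtual focal distance (assumed)
DEPTH_OF_FIELD_D = 0.3  # eye's tolerance in dioptres (assumed)

def in_focus_together(real_distance_m: float) -> bool:
    """Can a real object and the virtual overlay both appear sharp?"""
    gap_d = abs(1 / real_distance_m - 1 / FOCAL_PLANE_M)
    return gap_d <= DEPTH_OF_FIELD_D

print(in_focus_together(1.5))  # True: small dioptre gap, both sharp
print(in_focus_together(0.4))  # False: typical manual-task distance, rivalry
```

At a typical working distance for a precision manual task, the gap is far outside the eye’s tolerance, so either the hands or the overlay must go blurry – the effect the Pisa study below identifies.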
A recent study conducted by the University of Pisa suggests that AR-assisted high-precision manual tasks may not be feasible with current AR headsets because of issues around focal rivalry.
The inability to meaningfully, seamlessly and accurately mix real and virtual content up close is therefore severely holding AR back. It makes AR feel unnatural and inaccurate, and when used for precision tasks – such as surgery or engineering – it will limit the usefulness of the technology.
Unlocking AR’s true potential
Human vision is the product of a wonderfully complex system. There are 23 visual cues that make up our natural perception of 3D space in the real world. In an AR headset, 18 of these can be dealt with by upgrading software and hardware. The remaining five can only be engaged through the optical interface – the space between the eyes and the virtual projections.
Only when all the visual cues are engaged can we deliver truly immersive, believable and useful experiences – that includes bringing content comfortably into close proximity and placing digitally rendered objects accurately and convincingly in the real world.
The recent focus on upgrading software and computer hardware, though important, has left the optical interface an under-represented area of innovation. No amount of improvement in software, display technology or processing power will resolve these critical issues, or reproduce virtually the response of our visual system to the real world.
The only way to avoid visual fatigue and accurately and believably mix real and virtual content is to use a dynamic focus solution. A comprehensive comparison of the key dynamic focus technologies currently in development for the AR sector – covering their relative performance and likely time to market – concludes that dynamic lens systems capable of changing the focal plane offer the best solution. They will help ensure that AR can reach its market potential.
The dynamic optical interface – a disruptive solution
AR has so much potential that it has no choice but to continue evolving – the key is to focus this development in the right places. By solving issues like VAC and focal rivalry we will elevate the user experience to create genuinely useful mixed realities – increasing their usefulness in enterprise scenarios in particular.
A dynamic focus solution will advance both the AR industry and the many crucial sectors that use the technology, such as aviation, engineering and healthcare. In the future, we anticipate that all but the cheapest devices will have some form of dynamic focal system built into them. Until then, these limitations will keep AR from delivering on its promise and becoming a truly disruptive force.