VividQ and Dispelix create 3D holographic tech for wearable AR

VividQ, a maker of holographic display technology for augmented reality gaming, has teamed up with waveguide designer Dispelix to develop new 3D holographic imaging technology.

The companies said the technology was considered nearly impossible just two years ago. They have designed and manufactured a “waveguide combiner” that can accurately display simultaneous variable-depth 3D content within a user’s environment. For the first time, users will be able to enjoy immersive AR gaming experiences in which digital content is placed in their physical world and they can interact with it naturally and comfortably. The technology could be used in wearable devices such as AR headsets or smartglasses.

The two companies have also announced a commercial partnership to develop the new 3D waveguide technology towards mass-production readiness. This will give headset manufacturers the ability to kick-start their AR product roadmaps now.

Early augmented reality headsets such as the Magic Leap, Microsoft HoloLens, and Vuzix devices produce 2D stereoscopic images at fixed focal distances, or at one focal distance at a time. This often leads to eye fatigue and nausea for users and doesn’t offer the necessary immersive three-dimensional experience: objects cannot be interacted with naturally at arm’s length, and they are not placed exactly within the real world.

In order to deliver the types of immersive experiences necessary for AR to reach mass-market adoption, consumers need a sufficient field of view and the ability to focus on 3D images at the full range of natural distances – anywhere from 10cm to optical infinity, simultaneously – in the same way they do naturally with physical objects.

A waveguide combiner is the industry’s favored method of displaying AR images in a compact form factor. This next-generation waveguide and accompanying software are optimised for 3D applications like gaming, which means that consumer brands around the world can unlock the market’s full potential.

Waveguides (also known as ‘combiners’ or ‘waveguide combiners’) give AR headsets a lightweight, conventional-looking front end (i.e., they look like normal glass lenses) and are necessary for widespread adoption. Apart from the form-factor advantages, the waveguides on the market today perform a process called pupil replication. This means they can take the image from a small display panel and effectively make it larger by creating a grid of copies of the small image (the ‘eyebox’) in front of the viewer’s eye – a bit like a periscope, but instead of a single view it creates multiple views. This is essential to make an AR wearable ergonomic and easy to use.
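
To make the idea concrete, here is a minimal sketch (purely illustrative, not vendor code) of pupil replication modeled as tiling a small eyebox image into a grid of identical copies; the function name and the 3×3 grid are assumptions for the example:

```python
# Purely illustrative: pupil replication modeled as tiling copies of a small
# exit-pupil ("eyebox") image across a larger viewing area.
import numpy as np

def replicate_pupil(eyebox_image: np.ndarray, grid=(3, 3)) -> np.ndarray:
    """Tile the small image into a grid of identical copies, mimicking how a
    waveguide's out-coupling gratings repeat the exit pupil for the eye."""
    return np.tile(eyebox_image, grid)

small = np.random.rand(8, 8)       # stand-in for the tiny image from the panel
expanded = replicate_pupil(small)  # shape (24, 24): a larger effective eyebox
print(expanded.shape)
```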

Small eyeboxes are notoriously difficult to line up with the user’s pupil, and the eye can easily “fall off” the image if the two are misaligned. This requires headsets to be precisely fitted to the user, since even normal variation in interpupillary distance (IPD) can leave a user’s eye out of line with the eyebox and unable to see the virtual image.

Since there is a fundamental tradeoff in display design between the image size (the “eyebox” or “exit pupil”) and the field of view (FoV), this replication allows the optical designer to make the eyebox very small, relying on the replication process to give the viewer a large effective image while also maximising the FoV.
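
The scale of that tradeoff can be seen with a toy calculation. The sketch below assumes a simple étendue-conservation model in which eyebox width times the sine of the half-FoV stays fixed; the 4 mm-at-50° budget is an invented figure for illustration, not a vendor specification:

```python
import math

# Toy model: conservation of etendue keeps eyebox_width * sin(half_FoV)
# roughly constant for a given display panel and collimating optics.
BUDGET = 4.0 * math.sin(math.radians(25))  # assume a 4 mm eyebox at 50 deg FoV

for fov_deg in (30, 40, 50, 60):
    eyebox_mm = BUDGET / math.sin(math.radians(fov_deg / 2))
    print(f"FoV {fov_deg:2d} deg -> unreplicated eyebox ~ {eyebox_mm:.1f} mm")
```

Widening the field of view shrinks the unreplicated eyebox, which is exactly why pupil replication is needed to get both at once.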

“There has been significant investment and research into the technology that can create the types of AR experiences we’ve dreamt of, but they fall short because they can’t live up to even basic user expectations,” said VividQ CEO, Darran Milne. “In an industry that has already seen its fair share of hype, it can be easy to dismiss any new invention as yet more of the same, but a fundamental issue has always been the complexity of displaying 3D images placed in the real world with a decent field of view and with an eyebox that is large enough to accommodate a wide range of IPDs (interpupillary distance, or the space between the user’s pupils), all encased within a lightweight lens.”

Milne added, “We’ve solved that problem, designed something that can be manufactured, tested and proven it, and established the manufacturing partnership necessary to mass produce them. It is a breakthrough because without 3D holography, you can’t deliver AR. To put it simply, while others have been developing a 2D screen to wear on your face, we’ve developed the window through which you’ll experience real and digital worlds in one place.”

A concept image for a simulation game where the user can interact with a digital world at arm’s length. (Image: VividQ / Cat Eden)

VividQ’s patent-pending 3D waveguide combiner is designed to work with the company’s software, both of which can be licensed by wearable manufacturers in order to build out a wearable product roadmap. VividQ’s holographic display software works with standard games engines like Unity and Unreal Engine, making it very easy for games developers to create new experiences. The 3D waveguide can be manufactured and supplied at scale through VividQ’s manufacturing partner Dispelix, an Espoo, Finland-based maker of see-through waveguides for wearables.

“Wearable AR devices have huge potential all around the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is vital that content is true 3D and placed within the user’s environment,” said Antti Sunnari, CEO of Dispelix, in a statement. “This also overcomes the issues of nausea and fatigue. We are very pleased to be working with VividQ as a waveguide design and manufacturing partner on this breakthrough 3D waveguide.”

At its headquarters in Cambridge, United Kingdom, VividQ has demonstrated its software and the 3D waveguide technology for device manufacturers and consumer tech brands, with whom it is working closely to deliver next-generation AR wearables. The companies describe the result as a breakthrough in AR optics that makes 3D holographic gaming a reality.

The task achieved by the companies was described as “quasi-impossible” in a research paper published in the journal Nanophotonics in 2021.

Existing waveguide combiners assume that the incoming light rays are parallel (hence a 2D image), since they require that the light bouncing around within the structure all follow paths of the same length. If you were to put in diverging rays (a 3D image), the light paths would all differ, depending on where on the input 3D image each ray originated.
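
A back-of-the-envelope sketch shows why the path lengths diverge. Assuming a simple slab waveguide of uniform thickness (the 1 mm figure and the angles below are illustrative, not device parameters), the distance a ray travels between bounces depends only on its internal angle:

```python
import math

THICKNESS_MM = 1.0  # illustrative slab thickness, not a real device figure

def path_per_bounce(internal_angle_deg: float) -> float:
    """Distance a ray travels inside the slab between successive bounces."""
    return THICKNESS_MM / math.cos(math.radians(internal_angle_deg))

# Collimated (2D) input: every ray shares one internal angle, so every
# replicated copy accumulates the same path length.
print(path_per_bounce(55.0) == path_per_bounce(55.0))  # True

# Diverging (3D) input: rays from different scene points travel at different
# angles, accumulate different path lengths, and exit at mismatched depths.
for angle in (50.0, 55.0, 60.0):
    print(f"{angle:.0f} deg -> {path_per_bounce(angle):.3f} mm per bounce")
```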

This is a big problem, since it means the extracted light has all traveled different distances; the effect is that the viewer sees multiple partially overlapping copies of the input image, all at random depths, which makes it essentially useless for any application. The new 3D waveguide combiner, by contrast, adapts to the diverging rays and displays the image correctly.

The 3D waveguide from VividQ is composed of two elements: first, a modification of the standard pupil-replicating waveguide design described above; and second, an algorithm that computes a hologram that corrects for the distortion introduced by the waveguide. The hardware and software components work in harmony with each other, so the VividQ waveguide could not be used with anyone else’s software or system.
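
The article does not disclose how the correction works internally, but the overall shape of such a pipeline might look like the hedged sketch below; both function bodies are hypothetical stand-ins (a toy phase encoding and a toy phase-conjugate correction), not VividQ’s actual algorithm or API:

```python
# Hypothetical sketch of a hologram-plus-predistortion pipeline; the maths
# here is a toy stand-in, not VividQ's actual correction algorithm.
import numpy as np

def compute_hologram(depth_map: np.ndarray) -> np.ndarray:
    """Toy stand-in: encode a per-pixel depth map as a phase-only hologram."""
    return np.exp(1j * 2 * np.pi * depth_map)

def predistort_for_waveguide(holo: np.ndarray, distortion: np.ndarray) -> np.ndarray:
    """Toy stand-in: cancel a calibrated, per-device phase distortion so the
    replicated copies reconstruct at their intended depths after the waveguide."""
    return holo * np.conj(distortion)

depth = np.random.rand(256, 256)  # e.g. a depth map exported from a game engine
calibration = np.exp(1j * 2 * np.pi * np.random.rand(256, 256))  # per device
frame = predistort_for_waveguide(compute_hologram(depth), calibration)
```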

VividQ has more than 50 people in Cambridge, London, Tokyo and Taipei. The companies started working together in late 2021. VividQ was founded in 2017 and traces its origins to the photonics department at the University of Cambridge and the Cambridge Judge Business School.

The company has so far raised $23 million in investment from deep tech funds in the UK, Austria, Germany, Japan and Silicon Valley. Asked what the inspiration was, VividQ CTO Tom Durant said in an email to GamesBeat, “Understanding what the limitations were and then working out how to work around them. Once we’d identified that path, our multi-disciplinary team of researchers and engineers across optics and software set about solving each one in turn. Instead of seeing this as just an optics issue, our solution is based on hardware and software tuned to work in tandem.”

As for how this is different from competing technologies, the company said existing waveguide combiners on the market can only present images in two dimensions at set focal distances. These are typically about two meters in front of you.

“You can’t bring them up closer to focus on, or focus past them to other digital objects in the distance,” the company said. “And as you look at these digital objects floating in front of you, you can suffer very quickly from eye-strain and VAC (vergence-accommodation conflict), which causes nausea. For gaming, this makes it very limited. You want to create experiences where a user can pick up an item in their hand and do something with it, without needing a controller. You also want to place multiple digital items in the real world, locked into place, with the freedom to focus on them and nearby real objects as you want, which leads to strong feelings of immersion.”
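
The size of that conflict is easy to quantify in dioptres (the reciprocal of distance in metres). The arithmetic below assumes the fixed 2 m focal plane mentioned above and content rendered at an arm’s-length 0.4 m, an illustrative figure:

```python
# Vergence-accommodation conflict (VAC) arithmetic: fixed 2 m focal plane
# versus content rendered at arm's length (~0.4 m, illustrative).
focal_plane_m = 2.0   # where the eye must focus (fixed by 2D waveguide optics)
content_m = 0.4       # where the eyes converge on the rendered object

accommodation_d = 1 / focal_plane_m  # 0.5 dioptres
vergence_d = 1 / content_m           # 2.5 dioptres
print(f"VAC mismatch ~ {vergence_d - accommodation_d:.1f} D")  # ~2.0 D
```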
