Emerging Hardware and Software Technologies Support Professional VR/AR Applications
24 Aug, 2018 | By: Cyrena Respini-Irwin
From eye-tracking headsets to vision correction and beyond, new developments on display at SIGGRAPH 2018 promise a more comfortable, practical future for interactive viewing.
“Today, there is no way you can stay comfortably in VR for an extended period of time,” said Pierre-Yves Laffont, CEO and cofounder of Lemnis Technologies. He believes that virtual reality (VR) can be “a revolution in computing,” but the industry must first fix the problems users frequently encounter with the technology — including headaches, eye strain, dizziness, and nausea, as well as blurriness when the user attempts to focus on details up close.
The problems arise, in part, from the sensory conflicts inherent in using a head-mounted display (also called an HMD or headset) — your eyes perceive motion, for example, that your body does not feel. Another source of discomfort is the vergence–accommodation conflict: the decoupling of focus (or accommodation) from vergence, the way the two eyes rotate to align on a single point. “As soon as you put on the VR headset, they become decoupled,” Laffont explained.
Normally, the left and right eye converge or diverge to adjust to viewing various distances, and focus simultaneously adjusts so the viewer can clearly see a near or far object. But with a headset, the two eyes are presented with two offset images that have a fixed focal plane: they’re focused on a screen whose distance from the user doesn’t change, while attempting to converge on an image in which distances do change. This conflict can be reconciled by the brain in the short term, but becomes less tolerable — and more uncomfortable — the longer the user is in VR.
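The mismatch described above can be put in numbers. As a rough sketch — assuming an illustrative 63-mm interpupillary distance and a headset focal plane fixed at 2 m, neither of which is a Lemnis specification — the eyes' convergence angle tracks the virtual object's distance while the focus demand stays pinned to the screen:

```python
import math

# Illustrative numbers only: a 63-mm interpupillary distance and a
# headset focal plane fixed at 2 m (assumptions, not vendor specs).
IPD_M = 0.063
FOCAL_PLANE_M = 2.0

def vergence_deg(distance_m: float) -> float:
    """Angle the two eyes rotate inward to converge on a point."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_d(distance_m: float) -> float:
    """Focus demand in diopters (1 / distance in meters)."""
    return 1.0 / distance_m

# Looking at a virtual object 0.5 m away:
virtual_m = 0.5
vergence = vergence_deg(virtual_m)       # eyes converge as if at 0.5 m
focus = accommodation_d(FOCAL_PLANE_M)   # but focus stays on the screen
conflict_d = accommodation_d(virtual_m) - focus
print(f"vergence {vergence:.1f} deg, focus mismatch {conflict_d:.1f} D")
# prints "vergence 7.2 deg, focus mismatch 1.5 D"
```

A 1.5-diopter mismatch of this kind is tolerable briefly, which is why the conflict only becomes a problem in longer sessions.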
Lemnis is seeking to be part of the solution by making VR and augmented reality (AR) experiences more like normal, real-world perception. “Our goal is to make your eyes focus just like they do in reality,” Laffont said.
With its Verifocal technology, Lemnis is seeking to reduce user discomfort in VR and increase the detail visible in near objects. Image courtesy of Lemnis Technologies.
Lemnis launched its first product, Verifocal, at SIGGRAPH 2018 this month. The platform combines software and hardware to analyze the user’s eye movements, determine which part of a scene the user is looking at, and calculate how far away that object or element is. Verifocal then automatically adjusts the optics in the VR headset accordingly. The result for the user, according to Lemnis, is greater comfort and improved clarity when viewing near objects, such as reading text in a virtual manual.
There’s another benefit too: Users who wear eyeglasses can enter their prescription information into the software, and Verifocal will adjust its optics so they can forgo the glasses while using a headset. That’s important for physical comfort, but also because glasses don’t mix well with eye-tracking technologies — reflections on the lenses can interfere with the user’s vision and the tracking camera’s ability to monitor eye movement.
In Lemnis’s case, a small infrared camera is positioned behind the objective lens of the headset. (From there, the camera can monitor the user’s eye movements, but “optical tricks” hide it from the user’s view, said Laffont.) The Verifocal engine takes input from the eye tracker and feeds it into an algorithm that identifies ideal focus; the adaptive optics adjust accordingly, and correct for any distortion from the lens.
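The control loop the article describes — eye tracker, fixation-distance estimate, focus setpoint for the adaptive optics — can be sketched in a few lines. The function names, the averaging stand-in for real gaze triangulation, and the way the prescription is folded in are all assumptions for illustration, not Lemnis's actual Verifocal API:

```python
def fixation_distance_m(left_gaze_m: float, right_gaze_m: float) -> float:
    """Estimate fixation depth; here we simply average the two eyes'
    reported gaze-point depths (a stand-in for real triangulation)."""
    return (left_gaze_m + right_gaze_m) / 2.0

def focus_setpoint_diopters(distance_m: float,
                            prescription_d: float = 0.0) -> float:
    """Optical power needed to focus at `distance_m`, with the user's
    spectacle prescription folded in so the glasses can stay off."""
    return 1.0 / distance_m + prescription_d

# Example: eyes fixating about 0.4 m away, user with a -2.0 D
# (myopic) prescription entered in software.
d = fixation_distance_m(0.41, 0.39)                    # 0.40 m
setpoint = focus_setpoint_diopters(d, prescription_d=-2.0)
print(f"drive optics to {setpoint:+.2f} D")            # 2.5 - 2.0 = +0.50 D
```

In a real system this loop would also have to correct for lens distortion at each focus setting, as Laffont notes.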
Laffont believes that we’re just now entering the first stage of enterprise VR, driven by technologies like Verifocal. Examining a detailed CAD model, exploring the small components within an engine digitally — “all those applications are really uncomfortable today,” he observed, with far objects being the only kind that users can interact with in an effortless manner. “The industry has made a lot of progress,” he noted, but still has a long way to go in terms of making headsets lighter and sleeker; speeding the refresh rates of the displays; and increasing display resolution.
Laffont also expects that eye-tracking technology will play a growing role in preparing VR enterprise applications, because “it enables foveated rendering, which is required for high-resolution displays,” he explained. (With foveated rendering, the part of the scene the user is focused on is rendered in higher resolution than the rest, resulting in a more natural view and reduced computational demands.) “In the next two years, eye tracking will become a standard feature in mid- and high-range headsets,” Laffont predicted.
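The computational savings Laffont alludes to are easy to see with a toy model: shade the foveal region at full rate and the periphery progressively coarser. The eccentricity bands and sampling rates below are assumptions for illustration, not any vendor's numbers:

```python
def shading_divisor(eccentricity_deg: float) -> int:
    """How many screen pixels share one shaded sample at a given
    angular distance from the gaze point."""
    if eccentricity_deg < 5.0:    # fovea: full resolution
        return 1
    if eccentricity_deg < 20.0:   # mid-periphery: 2x2 pixels per sample
        return 4
    return 16                     # far periphery: 4x4 pixels per sample

def relative_cost(fov_deg: float = 100.0, steps: int = 1000) -> float:
    """Approximate shading cost vs. full-rate rendering, integrating
    over a circular field of view centered on the gaze point."""
    full, foveated = 0.0, 0.0
    for i in range(steps):
        ecc = (i + 0.5) / steps * (fov_deg / 2)
        ring_area = ecc           # ring area grows linearly with radius
        full += ring_area
        foveated += ring_area / shading_divisor(ecc)
    return foveated / full

print(f"foveated shading cost: {relative_cost():.0%} of full rate")
```

With these made-up bands, shading work drops to roughly a tenth of full-rate rendering over a 100-degree field of view, which is why foveation is seen as a prerequisite for driving very high-resolution displays.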
HP Builds Out VR Portfolio
It isn’t just startups like Lemnis that are enthusiastic about VR’s maturation for the enterprise; HP is continuing to invest heavily in creating a VR solution portfolio for professional users. “We have the broadest commercial VR portfolio in the industry,” said Xavier Garcia, general manager, global head of workstations and virtual reality for HP. “We are really committed to this market.”
Dan Schneider, virtual reality evangelist and business development manager for HP, believes that the time is now for companies on the fence about VR to make their move. “If you’re not using VR in the next year, you’re going to be behind,” he stated. Although consumer applications have had the spotlight thus far, research indicates that “commercial VR is going to take over consumer VR,” Schneider said. “We want to make sure that we’re part of that, for sure,” he continued.
So what’s standing in the way of companies pulling the trigger on AR/VR technologies? “One of the things we’ve struggled with in VR is the adoption model,” Schneider acknowledged. In many cases, it’s simply that professional users don’t know how the technology can benefit their workflows; it’s far less familiar than workstation technology, for example. “Once you do it, you get it a lot better than before.”
The HP VR Launch Kit for Unreal Engine, now in beta, includes sample models, such as a motorcycle and a Frank Lloyd Wright cathedral, to help users become familiar with exploring models in VR, and better understand how they might get started with employing the technology. “They showcase how to apply templates and features to a model,” Schneider said.
The ability to look inside designs and see how parts fit together is a significant benefit for CAD users, according to Schneider, and can help VR users to spot potential manufacturing problems in the design stage. “Being able to immerse yourself in it is a very different experience than seeing it on a screen,” he said.
Schneider noted that HP is relying on partnerships in order to offer a full VR solution for professional use. Datasmith, a workflow toolkit for Unreal Engine, enables users to import drawings and models into VR. “[You can] take a look to see if it’s designed how you want it to be designed,” he explained.
“Quality hardware” — including headsets that provide high pixel density, such as StarVR’s new models — is also an important part of the equation, he noted. At SIGGRAPH, HP made several announcements on that front: the HTC Vive Pro will join its lineup of head-mounted displays, and StarVR headsets are powered by HP workstations, including the VR Backpack. The HP VR Launch Kit for Unreal Engine beta, which can help users understand how models will render and run in VR, is now available for download. In addition, HP will be partnering with Pixo VR, which provides VR training solutions for construction, manufacturing, energy, and utility companies.
Editor's note: Click here to read Part 2 of this article.