U Can't Touch This: VR Haptics

Written by

Emily Friedman

January 27, 2020

At EWTS 2019, Rick Smith, Sr. Director of Global Product Training at JLG Industries, asked, "Can we use VR to train an operator completely?" Virtual reality today is enhancing and speeding up industrial training programs, but it's not completely replacing traditional training methods. Immersive tech is proving great at reducing the amount of hands-on training required to, say, operate lift equipment or fly a plane, but can someone become certified in a trade by training in VR alone?

The VR simulation used at JLG is extremely realistic, even incorporating actual controls overlaid with virtual ones, but the industry standard requires hands-on training. If VR were ever to supplant traditional training (and thereby remove all risk and dramatically cut time and costs), it would have to be as close to the real thing as possible. There are, however, no standards for high-fidelity VR and no grading system for VR training sims. So, how do we get there?

Let's Get Touchy

What's possible in VR today is certainly impressive: You can tour a building before it's built, test drive the cars of the future and practice building a jet engine from scratch. You can't, however, run your hand along a velvet sofa, judge the comfort of a vehicle's interior or feel when the parts of a machine correctly lock into place. As McLaren's Mark Roberts said at EWTS, "it's human nature to want to touch and connect with a physical property to really understand it." Touch is incredibly important to how we interact with and understand the world. Of all our senses, it's probably the most overlooked, and yet touch - a complex system of nerves, sensors and receptors that scientists still haven't mastered - is critical, giving us information about our environment, stimulating positive and negative emotions, telling us when someone is showing affection, and influencing what we buy every second of every day.

Feelings are Hard

There are specialized receptors all over our skin (3,000 in each fingertip alone) that sense temperature, pain, pressure, texture, vibration, etc. and encode that information into electrical signals that travel to the brain, where they're interpreted as sensations. Neuroscientists aren't sure exactly how the brain decodes sensory information - touch is the least understood of the senses, and I'm simplifying a lot - but the effects can be physical, emotional and/or cognitive. The sense of touch, or somatosensation, is really a collection of sensations enabling us to perform behaviors that rely on fine motor skills (ex. typing), form social bonds, and more. If you somehow lost your sense of touch, you wouldn't be able to sit up or walk. Touch is a huge part of how we navigate through life, and it's extremely difficult to imitate in VR despite advances in hardware, software and haptics. Sensations are a lot harder to recreate in virtual worlds than images or sounds, and yet they're vital to completing the illusion. Studies show that VR can change how someone thinks and behaves; the addition of touch (and other non-visual sensory inputs like taste and smell) would make it all the more powerful and effective.

Status of Haptics

Traditional haptic technologies rely on vibrating motors. Our smartphones use vibration as a form of output, and video game controllers (rumble packs, force feedback steering wheels, etc.) use it to enhance the gaming experience. Your phone vibrates when you have a notification; the controller vibrates when your avatar gets 'hit' in a game. But it's a far cry from how we touch and feel in the real world.
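In software terms, this kind of traditional feedback is little more than mapping an event to a canned vibration pattern. Here's a minimal Python sketch of that idea; the pattern values and the motor-setter callback are illustrative assumptions, not any particular phone or controller SDK:

```python
from dataclasses import dataclass
import time

@dataclass
class VibrationPattern:
    """A simple rumble: motor intensity (0.0-1.0) held for duration_s seconds."""
    intensity: float
    duration_s: float

# Map app/game events to canned patterns -- the essence of traditional haptics.
PATTERNS = {
    "notification": VibrationPattern(intensity=0.4, duration_s=0.15),
    "player_hit":   VibrationPattern(intensity=1.0, duration_s=0.30),
    "menu_tick":    VibrationPattern(intensity=0.2, duration_s=0.05),
}

def play(pattern: VibrationPattern, set_motor) -> None:
    """Drive a single vibration motor through a caller-supplied setter function."""
    set_motor(pattern.intensity)    # spin the motor up
    time.sleep(pattern.duration_s)  # hold the buzz
    set_motor(0.0)                  # stop

if __name__ == "__main__":
    # Stand-in for a real phone/controller API: just print what would be sent.
    play(PATTERNS["player_hit"], set_motor=lambda level: print(f"motor -> {level:.1f}"))
```

One pattern per event is exactly why this style of haptics feels so crude next to real touch: there's no notion of texture, shape or force, just a buzz of a given strength and length.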

Engineers, researchers and scientists around the globe are essentially trying to replicate human touch to realize more advanced and natural computing interfaces for a range of applications, including training and education, robotic control and even online shopping. They're working on three main types of haptic devices - graspable (think joystick), wearable, and touchable (ex. patches applied to the skin) - that use electric actuators, pneumatics, and/or hydraulics to create tactile and kinesthetic sensations (pressure, resistance, force, movement, etc.). One example of a wearable haptic device would be a glove that uses air to variably restrict and release your grip on a virtual object. There's even haptic footwear, but when it comes to immersive experiences, tech companies are mainly developing haptic gloves and suits.
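For the air-driven glove just described, the control logic can be sketched as a small loop that turns each finger's simulated contact force into an inflation level for the corresponding bladder. Everything below (finger names, force and pressure ranges) is an assumed, illustrative model rather than a real glove's API:

```python
# Hypothetical sketch: map per-finger virtual contact forces (newtons) to
# bladder pressures (kPa) for an air-driven haptic glove.
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
MAX_FORCE_N = 20.0       # assumed force at which a bladder is fully inflated
MAX_PRESSURE_KPA = 50.0  # assumed safe inflation ceiling

def forces_to_pressures(contact_forces: dict) -> dict:
    """Linearly map each finger's simulated contact force to a target pressure."""
    pressures = {}
    for finger in FINGERS:
        force = max(0.0, contact_forces.get(finger, 0.0))
        scale = min(force / MAX_FORCE_N, 1.0)  # clamp to the bladder's range
        pressures[finger] = scale * MAX_PRESSURE_KPA
    return pressures

# Example: the physics engine reports the thumb and index finger pressing a virtual valve.
print(forces_to_pressures({"thumb": 6.0, "index": 14.0}))
```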

While haptics is getting more complex (incorporating neuromuscular stimulation, ultrasound tech, etc.), the sense of touch poses significant challenges in VR. Sight, hearing, taste and smell all take place near or on our faces, whereas touch involves the entire body. And while we can fool our brains to some extent, it will take nothing short of simulating the laws of physics in VR to create ultra-realistic training simulations capable of eclipsing all other methods of learning. Just think about the feeling of gravity - simulating the weight and heft of a real object - or recreating millions of different textures in the virtual world!

Why touch in VR

Haptics enables us to interact more naturally with virtual objects. It also increases the amount of information sent to the brain during a virtual reality experience. This adds to the sense of immersion and, in the case of training, reduces errors and the time it takes to become proficient at a task. VR already has a much higher sensory impact than traditional training; think of haptics as the cherry on top, enabling kinesthetic learning, or learning by touching and doing.

You can already see inside a machine (digital twin), but what if you could feel a loose bolt or the wear and tear around an engine as it runs in real time? The next generation of the workforce would benefit from hazard-free spaces for training. Designers and engineers could get earlier, more accurate feedback on ergonomics and user experience. Products would get better, assembly lines safer and more efficient. Surgical students could acquire legitimate surgical skills without going near a cadaver, and studies show that adding haptic feedback to surgical robots increases accuracy, lessens tissue damage and reduces operation time. Articulated, near-human haptics could revolutionize the way we shop online, conduct international business (imagine shaking hands with colleagues across the world), even emergency response (humans controlling haptic-feedback robots to rescue people from collapsed buildings, defuse bombs, etc.). Pfizer, for one, according to Nathan Yorgey speaking at EWTS 2019, is looking to retrofit haptic gloves into a VR training experience mimicking what it's really like inside an isolator; and Raytheon is anticipating the day haptics will make it easier and more natural to virtually assemble products.

The players

None of this is to say that today's haptic devices aren't capable or aren't advancing rapidly. The technology may not be at peak performance, but it's advanced enough for businesses to begin incorporating haptic gloves (and other VR peripherals like controllers and treadmills) into training, product design, sales and other applications. Here are some of the companies working on wearable haptic devices today:

HaptX

HaptX has been focused on haptic gloves for VR training, simulation and design since 2012 and just last month secured $12 million to produce the next generation of its gloves. HaptX's glove tech uses microfluidics, force feedback and precise motion tracking to provide realistic touch and natural interaction in VR. At the heart is the company's microfluidic skin, a silicone-based material containing pneumatic actuators and microfluidic air channels. Panels of this microfluidic skin expand or contract (there are 130 actuators in each glove), pushing against a user's skin the same way a real object would when touched. A lightweight exoskeleton in the gloves provides up to four pounds of resistance/force to each finger, enhancing the perception of size, shape and weight of virtual objects. Finally, magnetic motion tracking and a hand simulation system deliver sub-millimeter-accuracy hand tracking with six degrees of freedom per finger. HaptX says dozens of companies including Nissan have piloted its Gloves Development Kit to feel virtual product prototypes, build muscle memory in VR training sims, and even control robots.
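To put the "up to four pounds of resistance per finger" figure in concrete terms, a force-feedback layer has to clamp whatever force the physics simulation requests to what the exoskeleton can physically render (roughly 17.8 newtons). The snippet below is a hypothetical illustration of that clamping step, not HaptX's software:

```python
POUND_FORCE_TO_NEWTONS = 4.448
MAX_FINGER_FORCE_N = 4.0 * POUND_FORCE_TO_NEWTONS  # ~17.8 N, per the figure above

def exoskeleton_command(simulated_force_n: float) -> float:
    """Clamp the force the physics sim wants to apply to what the hardware can render."""
    return max(0.0, min(simulated_force_n, MAX_FINGER_FORCE_N))

# A heavy virtual object might "push back" with 30 N, but the glove can only render ~17.8 N.
print(f"{exoskeleton_command(30.0):.1f} N")
```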


BeBop Sensors

In October, BeBop Sensors announced the “first haptic glove designed for the Oculus Quest.” This is the company’s wireless, one-size-fits-all Forte Data Glove, which enables natural hand interactions and haptic feedback to create a more immersive training experience for workers. BeBop’s website doesn’t get very technical but mentions that the Forte Data Glove Enterprise Edition has a breathable open palm design, long battery life and fast sensor speeds, and that the company’s sensors comprehend force, location, size, weight, bend, twist and presence. Unnamed Fortune 500 companies are apparently using the Forte Data Glove for training as well as VR medical trials, robotics and drone control, VR CAD design, and more.

bHaptics

TactSuit is a line of wireless haptic accessories, including a haptic face cushion for use with HMDs (Tactal), a $499 haptic vest for the torso (Tactot), and haptic sleeves/devices (Tactosy) for the arms, hands and feet—all made from an anti-bacterial material and mesh lining that's machine washable. With over 70 haptic feedback points, the suit allows you to feel sensations from the virtual world all over your body. Sensations mentioned on bHaptics' website include feeling a snake wind around your body (vest), feeling a light breeze (head), and the recoil of a gun (arm)—so, clearly not targeting enterprise. bHaptics' devices for the hands and feet are "suitable for martial arts, training content, and sports games."

HoloSuit

The HoloSuit is a motion capture suit that also offers haptic feedback. Consisting of a jacket, pair of gloves and pants, HoloSuit can be used to train for procedures in which a worker's motor skills are important; it's also great for repetitive training, helping workers to build muscle memory. The Pro version packs 36 embedded motion sensors, nine haptic motors, and six programmable buttons. Trainees can "touch, interact and feel components in the virtual world," movement can be recorded and replayed, and data captured to show a worker's performance progress for a given task.

Teslasuit

This full-body suit and software suite is designed for performance training, to improve the user’s movement, reflexes and instincts. With haptic feedback, motion capture, temperature control and biometrics, Teslasuit is ideal for training in complex and stressful tasks and environments.* The solution captures actions, establishing baselines so companies can track improvement over time, and helps determine ability to perform under pressure thanks to embedded ECG and Galvanic Skin Response sensors that track vitals and emotional stress. This requires direct skin contact, which means users have to strip down to put on the suit.

*At this month’s CES, Teslasuit demoed a model without thermal feedback, just electrostimulation.

There’s now a glove that can be used with the suit or separately, for business customers only. The Teslasuit Glove detects hand movements and provides tactile feedback; it combines haptics, motion capture, biometry and force feedback, gathering real-time data as users go through tasks in dangerous (virtual) environments. Think astronaut training, fuel loading and emergency evacuation.


Manus VR

Manus VR's wireless Prime Haptic gloves have linear resonant actuators, providing haptic signals that depend on the type of material and how much virtual force is applied. Internal circuitry detects the position of the finger joints, a Vive tracker on top of the glove reads hand positioning, and an inertial measurement unit works out the thumb's rotation. These are gloves for training, compatible with the HTC Vive and Vive Pro, serving as a fully articulated hand in virtual assembly and more. The price? Around $5,000!
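One simple way to picture "haptic signals depending on the type of material and how much virtual force is applied" is a per-material vibration profile scaled by the applied force. The mapping below is purely illustrative; the materials, frequencies and force range are assumptions, not Manus VR's implementation:

```python
# Illustrative only: per-material vibration profiles for a linear resonant actuator.
# Each entry is (frequency in Hz, base amplitude 0..1).
MATERIAL_PROFILES = {
    "metal":  (250.0, 0.9),   # hard, "clicky" contact
    "wood":   (180.0, 0.6),
    "rubber": (120.0, 0.3),   # soft, damped contact
}

def lra_signal(material: str, applied_force_n: float, max_force_n: float = 10.0):
    """Return (frequency_hz, amplitude) for a contact with the given material and force."""
    freq, base_amp = MATERIAL_PROFILES.get(material, (150.0, 0.5))
    amplitude = base_amp * min(applied_force_n / max_force_n, 1.0)
    return freq, amplitude

print(lra_signal("metal", applied_force_n=4.0))    # lighter touch on metal
print(lra_signal("rubber", applied_force_n=9.0))   # firm squeeze on rubber
```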

TEGway

This Korea-based company plans to launch a VR thermal haptic development kit this year. At this month's CES, TEGway showed off a prototype of the upcoming ThermoReal dev kit consisting of two gloves, two sleeves and a forehead-mounted unit powered by small batteries. These bring temperature fluctuations to the skin and, by quickly alternating between hot and cold, can create a force-like sensation (ex. the feeling of cold spreading over your arm).

Future-looking

VR can fool us into believing a virtual object is right in front of us; with motion tracking our virtual hands can mimic our real ones; and current haptic gloves can imitate force, shape and texture to an extent. AI, advanced materials, and the application of other areas of technology will be critical to improving haptics to the point where immersive experiences feel like they’re actually happening and VR training could potentially replace all other forms of training.

Current research and experiments are promising: Project H-Reality has discovered that human skin, mathematically speaking, transmits vibrations much like the Earth's surface transmits seismic waves. Project Levitate is bringing touch, sound and levitation together, using ultrasound waves to create the illusion of a solid object in mid-air. At the University of Southern California, Heather Culbertson is recording what happens when something is dragged over different materials at different speeds and pressures and then playing back the vibrations using a pen. (The user feels different textures depending on the vibration pattern transmitted through the pen.) A Stanford lab developed Grabity, "a wearable haptic device…for grasping virtual objects in VR," and is working on a "soft pneumatic actuator skin," a thin sheet of flexible silicone with tiny air pockets that inflate and deflate independently, acting as pixels for tactile elements. NeoSensory is working on a vest with 32 vibratory motors that translates sound into tactile sensations for those with hearing loss. Novasentis created a thin haptic film made from a new form of polyvinylidene fluoride plastic that contracts and flexes when an electrical charge is applied (it's now being put into gloves for VR). And a team at Northwestern University recently developed a flexible material that bends and twists thanks to tiny, fast actuators powered by NFC—what could ultimately become a full-body suit with 1,000 actuators. Watch this space!
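As a rough illustration of the sound-to-touch idea behind NeoSensory's vest, one common approach is to split the audio spectrum into as many bands as there are motors and drive each motor with the energy in its band. Here's a minimal numpy sketch of that approach; the sample rate, frame size and band layout are assumptions, not NeoSensory's design:

```python
import numpy as np

N_MOTORS = 32          # per the description of the vest above
SAMPLE_RATE = 16_000   # assumed audio sample rate (Hz)
FRAME = 512            # samples per update (~32 ms)

def frame_to_motor_levels(audio_frame: np.ndarray) -> np.ndarray:
    """Split one audio frame into N_MOTORS frequency bands; return per-motor intensities 0..1."""
    spectrum = np.abs(np.fft.rfft(audio_frame))         # magnitude spectrum
    bands = np.array_split(spectrum, N_MOTORS)          # one chunk of bins per motor
    energy = np.array([band.mean() for band in bands])  # average energy per band
    peak = energy.max()
    return energy / peak if peak > 0 else energy        # normalize to 0..1

# Example: a 440 Hz tone should light up mostly the low-frequency motors.
t = np.arange(FRAME) / SAMPLE_RATE
levels = frame_to_motor_levels(np.sin(2 * np.pi * 440 * t))
print(np.round(levels, 2))
```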

Further Reading