Imagine a blind woman being able to touch her smartphone screen and bring up a relief map of the room to help navigate the unfamiliar space. Or a soldier feeling the little finger of his newborn daughter 6,000 miles away via FaceTime.
Founded by Alireza Imani and post-doctoral researcher Mehdi Korjani (Ph.D. ’14) of the Ming Hsieh Department of Electrical Engineering, Lamsaptics uses ultrasound arrays and predictive learning-based algorithms to create “over-the-air” sensations of touch for augmented and virtual reality applications.
“The idea is that information can be transferred via different senses,” explained Imani, a Ph.D. candidate at USC Viterbi whose background is in high-speed integrated circuits.
The market for augmented and virtual reality tools, such as Microsoft HoloLens and Oculus VR, is growing rapidly, and touch is the next frontier in virtual reality development. Companies like CyberGlove, Virtual Realities and Ultrahaptics already have products on the market, but Imani and Korjani believe those products have limitations.
“In most of the existing systems the sensation of touch is awkward,” Imani said. “Most require wearing a device, and there is an unwanted point of contact — for example, with a glove. It doesn’t feel natural.”
Rather, he and Korjani believe what is needed are complex “haptics,” or technology for the sense of touch. They are working on a prototype mobile tool that uses an array of ultrasound transducers to focus high-frequency sound waves into mid-air pressure fields that stimulate the nerve endings in your hand, similar to the way ultrasound is used in medicine to create high-resolution images.
Or, to explain it another way: “It is much the same as how bats echolocate using sound waves and how radar locates planes using electromagnetic waves,” Imani said.
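The focused-ultrasound approach Imani describes depends on driving each transducer in the array with a carefully chosen phase offset, so that all the sound waves arrive in step at a chosen point in mid-air and add up into a pressure peak you can feel. Here is a minimal sketch of that phase calculation — not Lamsaptics’ actual code; the array geometry, 40 kHz operating frequency and speed of sound are illustrative assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0     # Hz; a common choice for airborne ultrasound (assumption)
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducer_positions, focal_point):
    """Phase delay (radians) for each transducer so all emitted waves
    arrive in phase at focal_point, creating a localized pressure peak."""
    positions = np.asarray(transducer_positions, dtype=float)
    # Distance from each transducer to the desired focal point.
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    # A wave traveling distance d accumulates phase 2*pi*d/wavelength;
    # driving each element with that offset aligns all waves at the focus.
    return (2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

# Example: a 4x4 grid of transducers at 1 cm pitch in the z = 0 plane,
# focusing 10 cm above the center of the grid.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(grid, (0.015, 0.015, 0.10))
```

Sweeping the focal point over time, or modulating the focus at frequencies the skin is sensitive to, is how systems of this kind turn a single pressure spot into shapes and textures.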
The pair believes they will be able to generate complex pressure patterns that create the feeling of, for example, different fabrics, shapes and textures on your hand.
“Our algorithms are designed to evolve to match human experience — how humans perceive shapes, textures, etc., as opposed to pre-engineered patterns,” Imani said.
Their first public demonstration of the Lamsaptics tool will be a virtual “high-five” via FaceTime, which will show how much force the technology can generate to produce different sensations. The tool would connect via USB to a personal computer or tablet.
The Lamsaptics creators imagine documenting all human sensations to develop a more sophisticated experience for users. The vision is that a consumer could program different sensations.
“It is about improving the human experience — someone with hearing disabilities, or someone who is blind who can see objects and what’s sharp in a room,” Imani said.
As for commercial uses, he can imagine video game developers using the technology to heighten the gaming experience, or theme park operators like Disney using it to engage visitors in another dimension of a story.
“Sense of touch is extremely important in human life,” Imani said, “from infancy, where the loving touch of the mother is so important in the mental and social development of the child, to later stages of life. We control and interact with objects through visual and haptic feedback. It is an essential part of what makes us human.”
“The introduction of touch in the digital world is a big idea,” said Ashish Soni, founding director of the Startup Garage. “Lamsaptics is a great example of the companies the Garage loves to support: technical founders with a big idea hoping to change the way we experience the world.”
With the aid of Innovation Node-Los Angeles — one of seven NSF-funded entrepreneurial hubs in the nation, led by the USC Viterbi School’s Dean Yannis Yortsos and Andrea Belz — Lamsaptics recently received a $50,000 NSF Innovation Corps (“I-Corps”) grant to meet with potential customers around the country and develop their business model.