Start-Up Creating Touch Sensing for Visual Images

HaptImage co-founders Ting Zhang, left, and Shruthi Suresh (Oren Darling, Purdue University)

5 Mar. 2019. A start-up company spun off from Purdue University is developing a system that translates visual images into touch-sensing signals for people with limited visual abilities. The company, HaptImage LLC in West Lafayette, Indiana, won 2 business pitch competitions last year, and plans to market the technology as a service to help teach technical and scientific subjects to blind or visually impaired students.

HaptImage LLC was founded in 2018 by doctoral engineering students Shruthi Suresh and Ting Zhang, and licenses their research from Purdue. Suresh and Zhang, working with biomedical engineering professor Bradley Duerstock and industrial engineering professor Juan Wachs, study the conversion of visual images used routinely in business and education into haptic, or touch-sensing, signals for people with visual disabilities. The problem is particularly acute when working with or learning science, technology, engineering, and mathematics, the so-called STEM fields, where people who are blind or visually impaired often need a partner or technology to recreate the images in another sensory form, such as 3-D printouts.

“There is no instantaneous method of helping a student with blindness understand an image,” says Suresh in a Purdue statement. “In STEM, so much information can come from an image. When you don’t have access to that digital image, or a viable alternative, you feel discouraged to pursue a career in that STEM-related field.” Zhang adds, “We, as sighted people, take images for granted and don’t realize how valuable accessible visual alternatives are for someone with impaired sight. I think having that access is valuable.”

The technology uses an algorithm to translate an image into haptic and audio signals for the individual in real time. A hand-held joystick scans and interprets the image, producing signals that describe the image's size, color, shape, intensity, location, texture, and opacity. The joystick employs friction-like resistance and vibrations to indicate changes in surface material and texture, and users holding the joystick can examine the image on their own, without help from a partner or assistant.

“The algorithm translates an image into different intensities of vibration, different pitches or amplitudes of sound and haptic feedback, which allows you to feel that shape,” notes Zhang. “They can feel the height and shape of the image, and they feel where the shape changes,” adds Suresh. “That comes across a lot easier for someone who is blind or visually impaired.”
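
For readers curious how such a mapping might work, the short Python sketch below illustrates the general idea of converting the brightness of the pixel under a cursor into a vibration intensity and an audio pitch. The function name, parameter ranges, and brightness-only mapping are hypothetical simplifications for illustration; they are not a description of HaptImage's actual algorithm, which also encodes properties such as color, texture, and opacity.

import numpy as np

# Illustrative sketch only: map the brightness of the pixel under a joystick
# cursor to a normalized vibration intensity and an audio pitch in hertz.
# All ranges and names here are assumptions made for this example.
def haptic_audio_feedback(image, x, y,
                          max_vibration=1.0,            # normalized motor intensity
                          pitch_range=(220.0, 880.0)):  # Hz, roughly A3 to A5
    """Return (vibration, pitch_hz) for the pixel at cursor position (x, y)."""
    h, w = image.shape
    x = min(max(x, 0), w - 1)                  # keep the cursor inside the image
    y = min(max(y, 0), h - 1)
    brightness = image[y, x] / 255.0           # 0.0 (dark) to 1.0 (bright)
    vibration = brightness * max_vibration     # brighter pixel -> stronger vibration
    low, high = pitch_range
    pitch_hz = low + brightness * (high - low) # brighter pixel -> higher pitch
    return vibration, pitch_hz

# Example: an 8-bit grayscale gradient, dark on the left, bright on the right
image = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
print(haptic_audio_feedback(image, x=200, y=100))  # stronger, higher-pitched feedback
print(haptic_audio_feedback(image, x=20, y=100))   # weaker, lower-pitched feedback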

HaptImage licenses the research from Purdue’s Office of Technology Commercialization, holder of the patent that lists Zhang as one of the inventors, along with Wachs and Duerstock. The company says it’s refining the technology and preparing a prototype that works with smart mobile devices, and plans to market the system as a subscription service beginning in the fall.

Suresh and Zhang attended training in technology commercialization offered by the National Science Foundation's Innovation Corps, or I-Corps, in 2017. After their training, Suresh and Zhang founded HaptImage, and along the way won first- or second-place prizes in student business pitch competitions. The company raised $47,500 in seed funds from the contest prize money and Purdue Foundry, the on-campus business incubator.

*     *     *
