Empowering the voiceless

Novel technology helps severely disabled individuals communicate

By Will Ferguson, Office of Communications and External Relations
Wake Forest sophomore Ally Kaminsky and computer science professor Paúl Pauca work with Grace DeVito in the STEM Lab.

Asking for a snack or drink of water is a simple act of communication most of us take for granted.

This is not the case for 10-year-old Grace DeVito. Grace has cerebral palsy, a brain condition that can make even the simplest gesture or verbal request a challenge.

“Grace is a really bright girl, but her movements are jittery and she can’t control her body the way she likes,” says sophomore Ally Kaminsky. “We hope to change that.”

Kaminsky is working with computer science professor Paúl Pauca to develop a hands-free communication device that could help Grace and millions of people with speech impediments and poor motor control interact with the world around them. Still in the early stages of development, the prototype technology uses a motion sensor to detect simple head motions. It then transmits these movements to an Android device, where they serve as gestures for communication.

Kaminsky says it operates in a similar fashion to swiping your finger across a touch screen. In essence, whenever a user moves his or her head, the cursor on the screen mimics the direction of movement. “Our device actually has a gyroscope in it that measures angles of motion,” she says. “Wherever you move your head, the gyroscope detects that movement and transmits it via Bluetooth to an Android-based device.”
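The mapping Kaminsky describes — head rotation measured by a gyroscope, mirrored by an on-screen cursor — can be sketched roughly as follows. This is an illustrative outline only, not the team's actual code; the screen dimensions, gain value, and function names are assumptions.

```python
# Illustrative sketch: turn changes in head angle into cursor movement.
# GAIN and screen size are hypothetical values, not the project's.

SCREEN_W, SCREEN_H = 1280, 800
GAIN = 20.0  # pixels of cursor travel per degree of head rotation


def clamp(value, low, high):
    """Keep the cursor inside the screen bounds."""
    return max(low, min(high, value))


def update_cursor(x, y, d_yaw_deg, d_pitch_deg):
    """Map a change in head yaw/pitch (degrees) to a new cursor position.

    Turning the head right (positive yaw) moves the cursor right;
    tilting the head up (positive pitch) moves the cursor up the screen.
    """
    new_x = clamp(x + GAIN * d_yaw_deg, 0, SCREEN_W - 1)
    new_y = clamp(y - GAIN * d_pitch_deg, 0, SCREEN_H - 1)
    return new_x, new_y
```

In a real system the angle deltas would arrive over Bluetooth as a stream of sensor readings; the clamping step keeps jittery input from pushing the cursor off-screen.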

The technology for the motion-tracking system is an offshoot of the Verbal Victor App Pauca originally developed for his son, Victor, who was diagnosed with Pitt Hopkins Syndrome, a rare developmental and cognitive condition that causes delays in speech and motor skills. The Verbal Victor App, available in Apple’s iTunes Store, shows pictures in the form of buttons on mobile devices. When a child touches the picture of a sandwich or an apple, a recorded voice, usually that of a parent or sibling, says something like, “I am hungry.”

“This new technology is an extension of Verbal Victor in the sense that we are trying to adapt it for people with even more severe physical disabilities,” Pauca says. “For people who may find it impractical to interact with a device via a mouse, keyboard or even a touchscreen.”


Computer science professor Paúl Pauca

Grace is playing a key role in the system’s development. Once a week, she and her caretaker visit a small laboratory in Manchester Hall to help Pauca’s research team test their alternate communication system. A gyroscope and a transmitter fit into a small box-like contraption attached to a baseball cap, which is fitted firmly onto her head. After a few minutes of calibration, Grace and the researchers get to work.

When Grace looks up, the cursor on the screen in front of her moves to a picture of an ice-cream cone. She moves her head to the right, and it hovers over an image of a female singer. While she is moving her head from one icon to another, a statistical algorithm is busy collecting and processing data on her movements.

“Grace generates a certain amount of data every time she comes in,” Pauca says. “We are running the data through an algorithm that adapts the system to specifically track Grace’s movements.”
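The article doesn't detail the adaptive algorithm, but one simple way such a step might work — purely a hypothetical sketch — is to estimate each user's involuntary jitter from recorded session data and filter out movements below that personal threshold:

```python
# Hypothetical sketch of per-user adaptation: learn a jitter threshold
# from recorded head-movement deltas, then ignore sub-threshold motion.

import statistics


def jitter_threshold(samples, k=2.0):
    """Estimate a per-user movement threshold from resting angle deltas.

    Movements smaller than mean + k * stdev of the user's involuntary
    jitter are treated as unintentional.
    """
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    return mean + k * stdev


def is_intentional(delta, threshold):
    """Classify a single head-movement delta as intentional or jitter."""
    return abs(delta) > threshold
```

Each session would add new samples, so the threshold — and with it the system's sense of what Grace "means" to do — keeps tightening around her individual movement pattern.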

Pauca says while the interface between the motion sensor and computer is far from perfect, his team has made considerable strides over the last couple of months.

“Grace has really helped us gather enough preliminary data to get the system started and make it fluid,” Pauca says. “Our goal is to get to a point where the device would facilitate her responses. Determine what her real intent is.”

Pauca says his research group will begin working with patients from Wake Forest Baptist Medical Center’s amyotrophic lateral sclerosis (ALS) clinic in January 2014, in collaboration with professors Deborah Best (psychology) and Jim Caress (neurology), as part of a project funded by Wake Forest Innovations. ALS, often referred to as “Lou Gehrig’s Disease,” is a progressive neurodegenerative disease that affects nerve cells in the brain and the spinal cord. It can make physical movement and speech extremely difficult.

Pauca says his hope is to develop the current prototype into an affordable alternative to expensive eye-tracking technology, currently the only kind of device available for someone like Grace to communicate with the people around her. These technologies range anywhere from $10,000 to $24,000. “Only a select few people in the world can afford this kind of technology,” Pauca says.

Pauca says the long-term goal is to not only make the system inexpensive but also to make it much smaller. “We are planning to work with Wake Forest’s Center for Nanotechnology to replace our current bulky sensors with miniaturized versions that would resemble something like a nicotine patch,” he says. “I envision these being sold like disposable contact lenses. The user could use one, throw it away, and then get another.”

For now, the device Pauca and Kaminsky are developing is helping Grace express herself more clearly when she visits Manchester Hall. “Grace’s favorite singer is Lady Gaga and she loves ice cream,” Kaminsky says. “She can use the device to tell us if she wants to listen to music or go get an ice cream when we are done working.”