A new software program developed by university researchers could help educators create a more inclusive classroom by enabling students to experience digital content with their sense of touch as well as through sight and sound.

The program augments the audio and visual information that students can hear and see on a touchscreen device with vibrations they can feel on the screen — giving children with diverse learning styles another way to interact with the content.

For instance, suppose students are trying to interpret a bar graph comparing the populations of various countries. By looking at the graph, students without visual impairments can see that the bar representing China’s population is more than four times as long as the one depicting the U.S. population. Although students who are blind or visually impaired could have this information read out loud to them, they might not grasp conceptually how much bigger China’s population is than that of the U.S. by hearing these figures alone.

“With vibrations, we can add another modality,” says Corrine Mueller, head of business development for Vibratory Touchscreen Applications for Learning (ViTAL), which develops the haptic software. “Students can feel how far each bar goes by tracing the image on the screen to interpret the information it contains.”
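The tracing interaction Mueller describes can be sketched in code. The snippet below is an illustrative hit-test, not ViTAL's actual implementation: the screen vibrates whenever the user's finger is over a bar, so a student can feel how far each bar extends. The bar geometry and population figures are hypothetical, chosen only to mirror the China/U.S. example above.

```python
# Minimal sketch (assumed design, not ViTAL's code): vibrate while the
# finger is over a bar so that bar length can be felt by tracing.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Bar:
    label: str
    x: float       # left edge in screen units
    y: float       # top edge in screen units
    width: float   # proportional to the plotted value
    height: float

def bar_under_touch(touch_x: float, touch_y: float, bars: list) -> Optional[str]:
    """Return the label of the bar under the touch point, or None.

    In a real app, a non-None result would trigger the device's
    vibration motor; None would silence it.
    """
    for bar in bars:
        if (bar.x <= touch_x <= bar.x + bar.width
                and bar.y <= touch_y <= bar.y + bar.height):
            return bar.label
    return None

# Bar widths proportional to population (illustrative figures, in millions).
bars = [
    Bar("China", x=0, y=0,  width=1412, height=40),
    Bar("U.S.",  x=0, y=60, width=335,  height=40),
]

print(bar_under_touch(100, 20, bars))  # finger on China's bar -> China
print(bar_under_touch(500, 80, bars))  # past the end of the U.S. bar -> None
```

Tracing horizontally along each row, the finger feels vibration for more than four times as long on China's bar as on the U.S. bar, which is exactly the proportional relationship the audio figures alone do not convey.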

With this technology, she says, students “can access information in a way that’s very personalized to their needs, abilities and preferences.”

Broad Implications

Using vibrations to supplement audiovisual content has implications not only for visually impaired students, but also for those who might benefit from additional sensory input, Mueller says — such as tactile learners or children with autism. The technology could help educators create a more inclusive classroom by appealing to students with different learning styles and providing a common platform where traditional students and students with special needs can collaborate while each still receives a more personalized learning experience.

The program is the brainchild of Jenna Gorlewicz, an assistant professor of mechanical engineering at Saint Louis University. It stems from the work she did for her Ph.D. at Vanderbilt University through funding from the National Science Foundation. While teaching at Southern Illinois University Edwardsville, Gorlewicz met Mueller, who was a graduate student at the time. They received a grant from the National Science Foundation to explore commercial applications for the technology.

“We interviewed 120 people, including students, teachers and members of the visually impaired community,” Mueller says. “And we realized there was a real need for this.”

Together they founded ViTAL, with Gorlewicz as president. With support from another NSF grant, they began piloting the technology in several school systems, including the Gilbert Public Schools in Arizona, the Illinois School for the Visually Impaired and the Francis Howell School District near St. Louis.

Support From Samsung

Samsung’s education division has supported ViTAL’s research by providing touchscreen Galaxy tablets on which to test the software. “We wanted to make sure our software would run on commercially available tablets like Samsung’s,” Mueller says. “Samsung has been instrumental in helping us.”

The results from the pilot testing have been encouraging, demonstrating that students can indeed follow lines and interpret basic graphical components using vibrations and sounds. “The accuracy with which students can follow along the vibrating lines is amazing,” notes Mueller. She says the software could provide students with new levels of independence in their learning, while helping children with diverse learning styles succeed.

A public beta version of the software will be released for the Android operating system soon.

Schools all over the country are using technology in innovative ways to better fit the way children learn.