
Google Glass adaptation opens the universe to deaf students

“Signglasses” system may help deaf literacy

Ordinarily, deaf students are left in the dark when they visit a planetarium.

With the lights off, they can’t see the ASL interpreter who narrates their tour of outer space. With the lights on, they can’t see the constellations of stars projected overhead. 

That’s why a group at Brigham Young University launched the “Signglasses” project. Professor Mike Jones and his students have developed a system to project the sign language narration onto several types of glasses – including Google Glass.

The project is personal for Tyler Foulger and a few other student researchers because they were born deaf.

“My favorite part of the project is conducting experiments with deaf children in the planetarium,” Tyler wrote. “They get to try on the glasses and watch a movie with an interpreter on the screen of the glasses. They're always thrilled and intrigued with what they've experienced. It makes me feel like what we are doing is worthwhile.”

Serendipitously, the only two deaf students ever to take Professor Jones’ computer science class – Kei Ikeda and David Hampton – signed up just as the National Science Foundation funded Jones’ signglasses research. Soon after, the Sorenson Impact Foundation provided funding to expand the scope of the project.

“Having a group of students who are fluent in sign language here at the university has been huge,” Jones said. “We got connected into that community of fluent sign language students and that opened a lot of doors for us.”

The BYU team tests the system during field trips by high school students from the Jean Massieu School for the Deaf. One finding from the tests is that the signer should be displayed in the center of one lens. That surprised the researchers, who assumed participants would prefer the video displayed at the top, the way Google Glass normally presents it. Instead, deaf participants preferred to look straight through the signer when they returned their focus to the planetarium show.

The potential for this technology goes beyond planetarium shows. The team is also working with researchers at Georgia Tech to explore signglasses as a literacy tool.

“One idea is when you’re reading a book and come across a word that you don’t understand, you point at it, push a button to take a picture, some software figures out what word you’re pointing at and then sends the word to a dictionary and the dictionary sends a video definition back,” Jones said.
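To make the idea Jones describes concrete, here is a minimal sketch of such a word-lookup pipeline: photograph the page, run OCR to find the word under the reader’s finger, then ask a sign-language video dictionary for a definition clip. This is purely illustrative and not the team’s actual system; the camera image, the pointing coordinates, and the lookup_sign_video dictionary endpoint are all hypothetical placeholders.

```python
# Hypothetical sketch of the word-lookup pipeline Jones describes.
# Assumes a captured page image and an (x, y) fingertip position;
# the video-dictionary URL below is a placeholder, not a real service.
from typing import Optional
from PIL import Image
import pytesseract


def word_at_point(page_image: Image.Image, x: int, y: int) -> Optional[str]:
    """Run OCR on the page and return the word whose bounding box contains (x, y)."""
    data = pytesseract.image_to_data(page_image, output_type=pytesseract.Output.DICT)
    for i, text in enumerate(data["text"]):
        if not text.strip():
            continue
        left, top = data["left"][i], data["top"][i]
        width, height = data["width"][i], data["height"][i]
        if left <= x <= left + width and top <= y <= top + height:
            return text.strip(".,;:!?")
    return None


def lookup_sign_video(word: str) -> str:
    """Placeholder: ask a (hypothetical) ASL video dictionary for a definition clip URL."""
    return f"https://example.org/asl-dictionary/{word.lower()}.mp4"


if __name__ == "__main__":
    page = Image.open("page.jpg")             # photo taken when the student presses the button
    word = word_at_point(page, x=640, y=360)  # fingertip position supplied by the glasses
    if word:
        print("Play definition video:", lookup_sign_video(word))
```

In a real system the video would be streamed to the glasses’ display rather than printed as a URL, but the shape of the pipeline – capture, locate the word, look it up, play the signed definition – is the same.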

Jones will publish the full results of the research in June at the Interaction Design and Children conference. But his favorite part of the project happens after the test shows end, when the high school students just get to talk with his BYU students.

“They see deaf university students succeeding and doing cool stuff,” Jones said. “It’s really rewarding.”

The “cool stuff” the BYU students do comes from a variety of fields. Tyler is certified to use the university’s MRI lab and plans to attend medical school. Kei is from Japan, knows four languages and belongs to BYU’s nationally acclaimed animation program.

And though Amber Hatch can hear, this project has furthered her ambitions to become a psychiatrist serving deaf clients.

“This project has also allowed me to utilize my ASL knowledge and to communicate with the deaf community in a way I never really thought possible for me,” Amber said. “It’s an amazing project and I am excited to see where it will go in the next year.”

For a closer look at the signglasses project, check out this video from student Austin Balaich: 

 

Co-authors from BYU on the study include Jeannette Lawler, the planetarium director; Eric Hintz, a physics and astronomy professor; and Nathan Bench, a post-doctoral fellow in computer science. Other co-authors include Fred Mangrubang of Gallaudet University and Mallory Trullender of Mantua Elementary School in Fairfax, Virginia.

Photo by Jaren S. Wilkey/BYU Photo
