Searching for Signs

National Science Foundation grant helps develop a visual dictionary of sign language gestures



Vassilis Athitsos, an assistant professor of computer science and engineering, is helping create a Google for sign language.

Dr. Athitsos received a five-year Faculty Early Career Development (CAREER) grant from the National Science Foundation to further develop a computer recognition system that will serve as a visual dictionary for American Sign Language (ASL).

The project aims to automatically annotate, recognize, and index large vocabularies of gestures. To find the meaning of a particular sign, the user would form the sign in front of a camera atop a computer. The recognition system would compare the gesture with thousands of images stored in the computer’s database, display a selection of similar images, and have the user select the most appropriate meaning.
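
In essence, this lookup is a similarity search: the query gesture is summarized as a feature vector and compared against the vectors stored for known signs, and the closest matches are returned for the user to choose among. The short Python sketch below illustrates that idea only; the gloss names, 128-dimensional random vectors, and Euclidean distance measure are placeholder assumptions, not the project's actual features or matching method.

    import numpy as np

    # Hypothetical database: each entry maps a gloss (sign meaning) to a
    # fixed-length feature vector summarizing the gesture. A real system
    # would extract these from camera video; here they are placeholders.
    SIGN_DATABASE = {
        "HELLO": np.random.rand(128),
        "THANK-YOU": np.random.rand(128),
        "BOOK": np.random.rand(128),
    }

    def top_matches(query_vector, database, k=3):
        """Rank database signs by Euclidean distance to the query gesture."""
        scored = [
            (gloss, float(np.linalg.norm(query_vector - vec)))
            for gloss, vec in database.items()
        ]
        scored.sort(key=lambda pair: pair[1])  # smallest distance first
        return scored[:k]

    if __name__ == "__main__":
        # Simulate a query gesture captured from the webcam and featurized.
        query = np.random.rand(128)
        for gloss, distance in top_matches(query, SIGN_DATABASE):
            print(f"{gloss}: distance {distance:.3f}")

The returned short list plays the role of the "selection of similar images" described above: rather than forcing a single answer, the system presents its best candidates and lets the user pick the intended sign.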

Athitsos hopes to one day team with producers of ASL dictionaries to make a recognition system downloadable from the Internet. Future iterations of the online sign language dictionary could reflect regional “dialects,” he says, because signs can vary throughout the country.

“Our technology could also be applied to other sign languages around the world, as different countries use different signs.”

Athitsos became interested in decoding American Sign Language while taking a college course in the subject.

“I was a horrible student; I had to page through a book and look at the signs until I recognized something,” he says. “My professor at the time told me no one had tried to make a computer-based sign lookup system.”

College of Engineering Dean Bill Carroll believes Athitsos’ work will have an immediate and positive impact.

“It’s the kind of practical, real-world research that we land here at UT Arlington,” Dr. Carroll says.

