
Computer Science Project to Enhance Use of American Sign Language

February 1, 2008

A research project led by University of Texas at Arlington Computer Science & Engineering Assistant Professor Vassilis Athitsos will develop methods to aid the learning of American Sign Language (ASL) by both deaf and hearing individuals. The three-year, $900,000 project is funded by the National Science Foundation and will involve collaborators at Boston University.

Unlike a spoken or written language, ASL does not render each word literally, so a complete sentence can be signed very rapidly. This often makes it difficult for students of ASL to follow the quickly changing hand positions and gestures. Dr. Athitsos, along with Drs. Stan Sclaroff and Carol Neidle of Boston University’s Computer Science Department, will create tools and methods to aid in the learning process for ASL.

The project will involve two significant developments. The first is the creation of search technologies for looking up the meaning of an unknown sign. Currently available sign/meaning dictionaries organize entries in alphabetical order. The team’s proposed methods can be incorporated into any existing dictionary to allow sign-based lookup, a capability that no dictionary currently has. Built with computer vision, data mining, and machine learning techniques, the dictionary will contain approximately 4,000 commonly used signs and should be deployable within three years. This will allow non-ASL users to more easily match gestures with meanings and study visual patterns.
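Sign-based lookup of this kind is often framed as nearest-neighbor matching: the user's sign is reduced to a hand trajectory, which is compared against stored exemplars. As a purely illustrative sketch (the feature extraction, the dynamic-time-warping distance, and the toy dictionary below are assumptions, not the project's actual pipeline), it might look like:

```python
# Hypothetical sketch: sign-based dictionary lookup by comparing a query
# hand trajectory against stored exemplars with dynamic time warping (DTW).
# The features and dictionary entries are illustrative assumptions only.

def dtw_distance(a, b):
    """DTW alignment cost between two trajectories of (x, y) hand positions."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = best alignment cost of a[:i] against b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = ((a[i-1][0] - b[j-1][0]) ** 2 + (a[i-1][1] - b[j-1][1]) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i-1][j], cost[i][j-1], cost[i-1][j-1])
    return cost[n][m]

def lookup(query, dictionary, k=3):
    """Return the k dictionary glosses whose exemplar best matches the query."""
    ranked = sorted(dictionary, key=lambda entry: dtw_distance(query, entry[1]))
    return [gloss for gloss, _ in ranked[:k]]

# Toy dictionary of two signs with made-up trajectories.
signs = [
    ("HELLO", [(0, 0), (1, 1), (2, 2)]),
    ("THANKS", [(2, 0), (1, 0), (0, 0)]),
]
print(lookup([(0, 0), (1, 1), (2, 2)], signs, k=1))  # → ['HELLO']
```

DTW is a common choice here because two people rarely perform the same sign at the same speed; the alignment absorbs timing differences that a frame-by-frame comparison would penalize.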

The second goal is the development of a visual database and an automatic search tool for identifying occurrences of signs in large video databases of ASL content – a sort of visual equivalent of Google – allowing users to search databases for occurrences of thousands of signs. This would involve large-scale machine learning, integrating methods for accurate sign recognition with methods for efficient indexing.
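The pairing of accurate recognition with efficient indexing is commonly realized as filter-and-refine: a cheap signature narrows the database to a few candidates, and an expensive matcher ranks only the survivors. The sketch below illustrates that idea under stated assumptions; the grid-cell signature, the distance function, and the clip names are hypothetical, not the project's design.

```python
# Hypothetical filter-and-refine sketch of video-database sign search:
# a coarse signature filters candidates, then a costlier match ranks them.
# The signature scheme and distance function are illustrative assumptions.
from collections import defaultdict

def signature(traj, cell=2.0):
    """Coarse key: grid cells of the trajectory's start and end points."""
    (x0, y0), (x1, y1) = traj[0], traj[-1]
    return (int(x0 // cell), int(y0 // cell), int(x1 // cell), int(y1 // cell))

def build_index(segments):
    """Map each coarse signature to the video segments that share it."""
    index = defaultdict(list)
    for seg_id, traj in segments:
        index[signature(traj)].append((seg_id, traj))
    return index

def exact_distance(a, b):
    """Refine step: mean pointwise distance (equal-length segments assumed)."""
    return sum(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
               for p, q in zip(a, b)) / len(a)

def search(query, index, k=5):
    candidates = index.get(signature(query), [])   # filter: cheap hash lookup
    ranked = sorted(candidates, key=lambda s: exact_distance(query, s[1]))
    return [seg_id for seg_id, _ in ranked[:k]]    # refine: exact ranking

# Toy "video database" of two annotated segments.
videos = [
    ("clip1@00:12", [(0.1, 0.1), (1.0, 1.0), (1.9, 1.9)]),
    ("clip2@03:40", [(9.0, 9.0), (8.0, 8.0), (7.0, 7.0)]),
]
idx = build_index(videos)
print(search([(0.2, 0.2), (1.1, 1.1), (1.8, 1.8)], idx))  # → ['clip1@00:12']
```

The design point is that the expensive comparison never touches segments whose signature differs, which is what keeps search tractable as the database grows to thousands of signs.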

“These applications would be of potential use to everyone communicating with ASL,” said Dr. Athitsos. “Children, parents, teachers, doctors; they could all potentially benefit from the sign lookup and search methods we are developing. For example, it should be very useful for a parent to be able to look up the meaning of a sign that their deaf child has just performed, or perhaps for a doctor to understand signs used by a deaf patient.”