Programmed to treat autism
Zeno has empathetic hazel eyes and can walk and gesture with both hands. His lifelike skin, a material called Frubber, allows his face to smile, frown, and look inquisitive. But he doesn’t make judgments.
Zeno is a 2-foot-tall robot, and researchers believe he may be able to recognize autism in infants and toddlers earlier than traditional diagnoses, which rely on speech and social interaction.
UT Arlington scientists have teamed with colleagues at the University of North Texas Health Science Center, the Dallas Autism Treatment Center, Texas Instruments, and Hanson Robotics in Plano to rework Zeno and other lifelike robots to diagnose and treat children suffering from autism spectrum disorders. The robot would not only interact with the children but would measure their movement and indicate what therapies work best.
“It’s more than just seeing how autistic children react when interacting with the robot,” says electrical engineering Associate Professor Dan Popa, principal investigator of the project, which is funded in part by a grant from the Texas Medical Research Collaborative. “Eventually, we want to customize the robot to better fit individual needs of children with autism.”
Carolyn Garver, director of the Dallas Autism Treatment Center, says the earlier the disorder is identified, the sooner it can be treated.
“Children with autism are intrigued by the robot. Robots are nonjudgmental. Sometimes autistic children just shut down with human interaction,” says Garver, who notes that one in every 88 children will have an autism spectrum disorder.
She believes the best possible outcome of the research would be to identify biomarkers through a child’s movement to aid diagnosis.
“There really are no biological methods of determining autism. Right now we just observe. If we can document that a certain eye gaze or motor movement means some level of autism, this could help in developing ways to treat it early on.”
Nicoleta Bugnariu, an associate professor at the UNT Health Science Center and a physical therapist/neuroscientist, is most interested in motor control issues.
“How these children keep their balance, reach for an object, and move about a room may be extremely important in diagnosing autism,” she says. “If we can detect these motor biomarkers and determine the timing of these differences during the developmental process, that would be of great benefit for diagnosis and treatment.”
Autism is typically diagnosed based on deficits in social interaction and speech. But with infants or toddlers, an emphasis on motion could aid early detection.
“In the first two years of life, language is a small part of a person,” Dr. Bugnariu says. “Children move first, then speech comes. We can’t wait until they use speech. We need to determine sooner who has autism.”
Dr. Popa, who directs the Next Generation Systems group at the UT Arlington Research Institute, focuses on developing robots that are smaller, less expensive, and more intelligent, agile, and networked than those on the market today. Hanson Robotics sought his help to make the robots more human and take them from the lab to the home. Hanson provided the initial robot, and Popa’s team embedded a higher-performance control system into it.
“That way you can adapt the robot behavior to do anything you want,” he says.
Popa, who has worked with Hanson since 2005, says responsive cameras similar to the technology in game systems like Microsoft’s Xbox Kinect could be placed in Zeno’s eyes. Such vision tools would record a child’s movements and mimic behavior. Hardware based on Texas Instruments chips and cameras could be used to fashion a control and perception system to record movement.
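To make the idea concrete, the sketch below shows one way motion data of this kind might be turned into a number: a smoothness score (mean squared jerk) computed from a sequence of 3-D joint positions, the sort of stream a Kinect-style depth camera produces. This is a hypothetical illustration of a motor-feature calculation, not the team’s actual algorithm; the function name and the choice of jerk as the feature are assumptions for the example.

```python
def mean_squared_jerk(positions, dt):
    """Mean squared jerk of one joint's trajectory.

    positions: list of (x, y, z) samples for a single tracked joint
    dt:        sampling interval in seconds

    Jerk is the third derivative of position. Smoother reaching
    movements yield lower values, so a feature like this could help
    quantify the motor differences the researchers describe.
    """
    jerks = []
    # Third-order finite differences approximate jerk at each sample.
    for i in range(len(positions) - 3):
        j = tuple(
            (positions[i + 3][k] - 3 * positions[i + 2][k]
             + 3 * positions[i + 1][k] - positions[i][k]) / dt ** 3
            for k in range(3)
        )
        jerks.append(sum(c * c for c in j))
    return sum(jerks) / len(jerks)
```

A perfectly constant-velocity reach scores near zero, while a stop-and-start trajectory scores higher; comparing such scores across children over time is one way motion-based biomarkers might be documented.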
“We believe the research will lead to a better life for the child with autism,” he says.