TTU researcher testing effectiveness of early autism intervention through eye-tracking technology

LUBBOCK — Research has long shown that early intervention therapy is key to treating a child with autism.

Ann Mastergeorge, chair of Texas Tech’s department of human development and family studies, said she’s been studying a way to gauge the effectiveness of early intervention methods for kids as young as 1 year old through eye-tracking technology.

By monitoring where a child looks, she hopes to measure how well those intervention methods are working.

“This technology allows us to specifically identify exactly where a baby or toddler is looking and for how long,” she said.

Dana Daniel, owner of Caprock Behavioral Solutions and a board-certified behavior analyst, said every child is different but visual focus can point to an early sign of autism.

“They (children with autism) tend to focus on objects instead of people,” Daniel said. “So when a baby is born, they typically — a baby that’s developing normally — they look at people. They look at smiles and faces and those things are very reinforcing to them. A child with autism might not do that. They prefer to look at objects, shiny lights, things that are moving.”

That’s what Mastergeorge tracks in her lab. Her team monitors changes in where a child focuses over the course of several weeks of therapy.

That’s just one component of Mastergeorge’s research. She’s also looking at whether autistic children respond more consistently to a robot or to a person.

Rebecca Beights, a doctoral candidate and graduate research assistant in Mastergeorge’s lab, said the goal is to determine whether a robot can be effective in delivering treatment to a child.

“So with the robot, we want to know can the robot be a good kind of technology-mediated approach to intervention,” she said.

Study participants are shown images of a woman and a robot repeating motions and giving instructions, she said. The research team then tracks the child’s gaze with the eye tracker to figure out which image the child responds to.

The research team has found that the children involved in the study seem to be more responsive to the robot, she said.

“It’s really advanced and kind of innovative,” Beights said.