Ellie is a lot like most therapists. She listens, she asks tough questions, and she’s always ready with a perfectly timed “uh-huh” intended to keep you talking.
But Ellie isn’t human — and according to a recent study, that’s one of her greatest strengths.
Ellie is a computer created by psychologist Albert “Skip” Rizzo and computer scientist Louis-Philippe Morency at the University of Southern California’s Institute for Creative Technologies (ICT).
The scientists spent years developing Ellie, a project originally commissioned by the U.S. Department of Defense.
As soldiers return home from Afghanistan and Iraq, they often struggle with post-traumatic stress disorder and suicidal thoughts, and military therapists are looking for ways to help soldiers who might not admit they need help.
That’s where Ellie comes in.
Underneath her avatar are three devices: a video camera to observe facial expressions, a movement sensor to track gestures and fidgeting, and a microphone to record every inflection in the speaker’s voice.
With these devices, Ellie is able to analyze 60 different features — from body language to tone of voice — and record about 1,800 measurements per second.
Given Ellie’s ability to “see” more than the speaker intends, ICT social psychologist Gale Lucas used the computer to test if people are more willing to disclose personal information to virtual humans than actual ones.
“In any given topic, there’s a difference between what a person is willing to admit in person versus anonymously,” Lucas said in a news release.
The research, funded by the Defense Advanced Research Projects Agency and the U.S. Army, found that people were more honest about their symptoms when they believed a human observer wasn’t part of the conversation.
To test this, researchers had 239 people sit down for a chat session with Ellie. Half were told they’d be interacting with an artificially intelligent virtual human, while the others were falsely told that Ellie was being controlled remotely by a person.
Ellie began each session with questions intended to build rapport, such as “Where are you from?” She followed these with more direct questions like “How easy is it for you to get a good night’s sleep?”
To keep people talking, she provided appropriate nods and facial expressions, asked follow-up questions and inserted some of those well-timed “uh-huhs.”
“We have recorded more than 200 of these ‘uh-huhs’ because a simple ‘uh-huh’ and a silence — if they are done the right way — can be extremely powerful,” Morency told NPR.
As participants conversed with Ellie, their faces were scanned, and three real psychologists analyzed transcripts of the sessions to rate how willingly they thought people disclosed information.
Afterward, all the participants were asked to fill out a questionnaire intended to discern how they felt about their chat with Ellie. Their answers showed that participants’ experiences differed based on whether they thought they were speaking with a human or a computer.
Those who believed Ellie was controlled by a human operator reported more fear of disclosing information than did those who thought they were interacting with a computer.
The observing psychologists noted that the participants who believed a human operator was pulling Ellie’s strings were indeed less forthcoming.
It’s this kind of honesty that Ellie’s creators hope will make her beneficial in diagnosing soldiers who may need help.
But Ellie isn’t the first computer therapist. In the 1960s, MIT computer scientist Joseph Weizenbaum created a program to simulate a Rogerian psychotherapist.
He dubbed the program, which asked open-ended questions to encourage the user to discuss emotions, Eliza.
However, Eliza has no real intelligence. She uses string substitution and keyword-triggered responses to encourage users to keep talking.
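That keyword-and-substitution technique can be sketched in a few lines. The rules and responses below are illustrative inventions, not Weizenbaum’s original script; the point is how far simple pattern matching and pronoun reflection can go toward keeping a conversation moving.

```python
import re

# Pronoun swaps applied to the user's captured words ("reflection"),
# so "my job" comes back as "your job".
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Keyword rules: a regex paired with a response template for the
# captured fragment. These example rules are hypothetical.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    """Return a response from the first matching keyword rule."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    # No keyword matched: fall back to a content-free prompt,
    # much like Ellie's well-timed "uh-huh".
    return "Uh-huh. Please go on."

print(respond("I feel anxious about my job"))
# Why do you feel anxious about your job?
print(respond("Nice weather today"))
# Uh-huh. Please go on.
```

Nothing here understands the user; the program only reshuffles the user’s own words, which is exactly the distinction the article draws between Eliza and a genuinely perceptive system like Ellie.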