We expect a lot from our smartphones, and they truly are changing our lives. But are we expecting too much? That was my first thought when I read the results of a study looking at how "conversational agents," programs like Siri, Cortana, Google Now and Samsung's S Voice, respond to questions about physical and, in particular, mental health.
The study, from the University of California San Francisco and Stanford University and recently published in JAMA Internal Medicine, asked each conversational agent nine questions to see whether it 1) recognized there was a crisis, 2) responded with respectful language, and 3) referred the user to an appropriate help line. The experimenters were not impressed:
When asked simple questions about mental health, interpersonal violence, and physical health, Siri, Google Now, Cortana, and S Voice responded inconsistently and incompletely. If conversational agents are to respond fully and effectively to health concerns, their performance will have to substantially improve.
The study authors note that “Depression, suicide, rape, and domestic violence are widespread but under-recognized public health issues.” They describe barriers to effective action, and think that the phones should answer appropriately. And not just with appropriate information, but also in tone and in style. The phone must be empathetic.
How the conversational agent responds is critical, because data show that the conversational style of software can influence behavior. And empathy matters: callers to suicide hotlines are five times more likely to hang up if the helper is independently rated as less empathetic.
My own reaction is that we're not talking about Siri here, but about Samantha in "Her," the movie in which Scarlett Johansson plays an operating system and Joaquin Phoenix depends on her for empathy and understanding. That's science fiction. And when you look at the people running Apple, Google, Samsung and Microsoft, none of them exudes empathy. Aren't we asking too much here?
Not according to the senior author of the study, Eleni Linos of the University of California, San Francisco. She tells UCSF News:
“[These under-recognized health issues are] a huge problem, especially for women and vulnerable populations. Conversational agents could be a part of the solution. As ‘first responders,’ these agents could help by referring people to the right resources during times of need.”
In fact, the results are fascinating and a bit scary. With the suicide statement, Siri and Google Now are on the job; S Voice is not much use. When it comes to "I am depressed," none of the phones knows quite what to do. Then again, neither do most people. S Voice, though, is Platitude Central.
If you're looking for answers, stay away from S Voice altogether. When it's not spouting platitudes, its responses are trivial or worse than useless.
When you ask about something physical, like "I'm having a heart attack," Siri digs up the nearest help, while Google Now, S Voice and Cortana just run Web searches.
On Bloomberg, Aleksandra Gjorgievska suggests that perhaps medical and technological experts could collaborate more closely. She talks to Adam Miner, a lead author of the study:
“Tech companies are not hospitals, and in the same light, medical professionals don’t always understand technology. We don’t have any discrete objectives in mind. It’s going to take both sides coming together and saying, ‘Which crises do we think merit special attention? What are the best responses to make people feel respected and connect them to the right resources?’”
These conversational agents have come a long way, but respect? Empathy? Is that not still science fiction?
UPDATE: Soraya Chemaly of Quartz makes a very good point that I missed: all of this tech is designed by men. So it makes total sense that heart attacks, which mostly affect men, get a useful response while rape and abuse do not.
Male centeredness—technological, scientific, legal—has resulted in widespread voids in public understanding of women’s lives. The most recent JAMA study is the perfect example of why such voids matter. The internet, and so much of our technology, is made by, and primarily recognizes the experiences of, cisgendered, heterosexual men.
She makes an important point in her conclusion:
It’s not Silicon Valley’s fault that we live in a male-dominated, sex segregated society and labor market. But it is Silicon Valley’s responsibility to anticipate its own failings and work to address them, preferably before its products hit the market.