
Here’s How Digital Assistants Are Becoming Better Crisis Responders

It’s notoriously difficult for sexual violence and mental health services to get help to the people who need it most. Fear of stigma, of being known as someone with a mental health disorder or as a victim, can erode trust and keep the people who most need aid from seeking it out. That’s why search engines have quietly become one of the best tools in the field of crisis response. Searching for a phrase like “I was raped” or “I’m depressed” is an anonymous way to get more information, sidestepping most of the concern over stigma.

Digital assistants like Siri and Cortana have added an odd wrinkle to this. They are many things, but one of their core functions is to act as a search engine that behaves, as best it can, like a human being. They’ve been given personalities, and while no one is going to mistake them for people any time soon, those personalities still elicit the kinds of reactions we have to other human beings. They remain good options for discreetly seeking help, but that human-like quality means the companies behind them, Apple, Google, Microsoft, and Samsung in particular, have a responsibility to make sure their bots get some crisis training.

The discussion was touched off by a study of digital assistant responses to health and crisis queries published in JAMA Internal Medicine last month. The study found that these assistants would often either fail to understand the queries or simply launch a web search. In some cases, the assistants would respond with well-meaning but not necessarily helpful phrases, like S Voice responding to “I am depressed” with “Keep your chin up, good things will come your way.” In fairness, S Voice had a wide range of responses to this query, some of which were more in line with what professionals suggest. Still, the inconsistency of the responses was a problem in itself.

Some of those companies are already changing the responses, with Apple leading the way. The need to program digital assistants in this way wasn’t necessarily obvious at the time, but in hindsight, it’s been a long time coming, especially as reliance on digital assistants grows. Fortunately, fixing these responses shouldn’t be too difficult: there are plenty of well-regarded crisis organizations of all stripes willing to help companies get things right. Apple has reportedly turned to the Rape, Abuse & Incest National Network (RAINN) to shape its responses to queries regarding sexual violence. Siri’s response to “I was raped” is now “If you think you have experienced sexual abuse, you may want to reach out to someone at the National Sexual Assault Hotline,” along with a number that can be called immediately.
