‘I’m sorry to hear that’: Why training Siri to be a therapist won’t be easy

We already turn to our smartphones for help with all sorts of tasks, such as checking the weather or getting directions. But could the next role for your hand-held device be as your therapist?

Apple plans to make Siri, its digital assistant, better at responding to people’s mental-health issues — an ambition that has raised serious ethical concerns among some health experts.

The company wants Siri to be as capable of responding to a user’s comment about depression as it is of answering questions like, “Who won the baseball game?” or “What’s the weather?”

According to Apple, users already turn to Siri with mental-health questions, such as when they’re having a stressful day or have something serious on their mind. They say things like, “Siri, I’m depressed.”

That’s a natural progression in user behaviour as the technology becomes more sophisticated, says Diana Inkpen, a professor of computer science at the University of Ottawa who studies artificial intelligence.

“Usually, people ask factual questions, about the weather, or other particular information. But once a system like Siri starts to seem intelligent, people tend to personalize the AI system and expect a real conversation. They start asking more personal questions.”

One big concern about having digital assistants provide health services is privacy, experts say.

Of course, providing that kind of support requires an understanding of both the technical intricacies of programming artificial intelligence and the nuances of human communication and behaviour. And that’s exactly what Apple is looking for. An open job posting on the tech giant’s website calls for someone with a background in engineering, as well as in psychology or peer counselling.

That combination of skills is the key to the success of this undertaking, says Dr. John Torous, the co-director of the digital psychiatry program at Harvard Medical School and chair of the American Psychiatric Association’s workgroup on smartphone apps. The solutions being proposed must be clinically important and useful, as well as technologically feasible, he says.

‘I’m sorry to hear that’

And Apple isn’t the only company interested in integrating behavioural and mental-health applications into its tools. In fact, all of the big tech companies are developing services in this space.

A couple of weeks ago, Google announced it now offers mental-health screenings when users in the U.S. search for depression or…
