"Virtual help agents" have been developed to perform many support tasks such as counseling refugees and aiding people to access disability benefits. Now a software app named Woebot is claimed to perform actual talk therapy:
Chatbot Therapist

Created by a team at Stanford, "Woebot uses brief daily chat conversations, mood tracking, curated videos, and word games to help people manage mental health." For $39 per month, you can have Woebot check in with you once a day. It doesn't literally talk but communicates through Facebook Messenger. The chatbot mainly asks questions and works through a "decision tree" not unlike, in principle, a choose-your-own-adventure story. It follows the precepts of cognitive therapy, guiding patients to alter their own mental attitudes. Woebot is advertised as "a treatment in its own right," an accessible alternative for people who can't get conventional therapy for whatever reason. If the AI encounters someone in a mental-health crisis, "it suggests they seek help in the real world" and lists available resources. Text-based communication with one's "therapist" may sound less effective than oral conversation, yet testing reportedly found that "the texting option actually reduced interpersonal anxiety."
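For readers curious what a "decision tree" conversation actually looks like under the hood, here's a minimal, purely illustrative sketch in Python. None of this is Woebot's actual code; the prompts, node names, and branches are invented to show the choose-your-own-adventure structure the article describes:

```python
# Hypothetical sketch of a decision-tree chat flow. All prompts and
# branches are invented; this is not Woebot's real script or data.

# Each node holds a prompt and a mapping from user replies to the next node.
TREE = {
    "start": {
        "prompt": "How are you feeling today? (good/bad)",
        "branches": {"good": "celebrate", "bad": "probe"},
    },
    "celebrate": {
        "prompt": "Glad to hear it! Want to log what went well? (yes/no)",
        "branches": {"yes": "end", "no": "end"},
    },
    "probe": {
        "prompt": "Sorry to hear that. Is it more worry or sadness? (worry/sadness)",
        "branches": {"worry": "end", "sadness": "end"},
    },
    "end": {"prompt": "Thanks for checking in. Talk tomorrow!", "branches": {}},
}

def run_chat(tree, node="start"):
    """Walk the tree, asking each node's question until a leaf is reached."""
    while True:
        prompt = tree[node]["prompt"]
        branches = tree[node]["branches"]
        if not branches:  # leaf node: deliver the closing line and stop
            print(prompt)
            return
        answer = input(prompt + " ").strip().lower()
        # An unrecognized answer simply repeats the question, as a basic bot might.
        node = branches.get(answer, node)

if __name__ == "__main__":
    run_chat(TREE)
```

The point of the sketch is that every possible exchange is authored in advance; the "intelligence" is in the branching script, not in any open-ended language generation.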
Within the limits of its abilities, this program may even be better than a human therapist in that one respect. Many people open up more to a robot than to another person, since human communication can be hampered by the "fear of being judged." Alison Darcy, one of the creators of Woebot, remarks, "There’s nothing like venting to an anonymous algorithm to lift that fear of judgement." One of Woebot's forerunners in this field was a computer avatar "psychologist" called Ellie, developed at the University of Southern California. In a 2014 study of Ellie, "patients" turned out to be more inclined to speak freely if they thought they were talking to a bot rather than a live psychologist. Ellie has an advantage over Woebot in that she's programmed to read body language and tone of voice to "pick up signs of depression and post-traumatic stress disorder." Data gathered in these dialogues are sent to human clinicians. More on this virtual psychologist:
Ellie

Human beings often anthropomorphize inanimate objects. One comic strip in our daily paper regularly shows its characters interacting and arguing with an Alexa-type device as if it were another person in the room, and treating the robot vacuum as if it were at least as intelligent as a dog. So why not turn in times of emotional distress to a therapeutic AI? We can imagine a patient experiencing "transference" with Woebot, becoming emotionally involved with the AI in a one-way dependency of friendship or romantic attraction. Such a quasi-relationship could make an interesting SF story.
Margaret L. Carter
Carter's Crypt