Artificial intelligence is a topic on everybody’s mind right now. Many are worried about how the advancements in A.I. will affect how we learn and grow as a society.
ChatGPT, the accessible artificial intelligence chatbot, has been used in many ways by everyday people, whether it be to invent a new recipe or even write a paper for class.
One man has become suspicious that his online therapist is using ChatGPT to formulate responses to his messages.
He’s 29 years old and has recently started paying for an online therapist through an app. He struggles with bipolar depression and ADHD, along with other things going on in his personal life, so he knew he could use some help.
Things started off well: he began having video sessions with his therapist, a woman in her 40s.
One of the bonuses of this therapy service is a messaging feature where he can message his therapist for advice outside of a session.
However, he noticed something odd about the most recent messages he exchanged with her.
“A few days ago, I sent a pretty personal message to my therapist about some difficult stuff I am dealing with,” he said. “Her response to me was very verbose, generic, etc.”
He often uses ChatGPT for work, and judging by his experience with A.I., he is 99% certain that his therapist used it to respond to his message.