AI Ghosts Are Something We Actually Have To Worry About Now

By Christopher Isaac | Published

If you have lost someone you care about, you know how difficult grief can be. One of the hardest parts is all the things you wish you could still tell that person, things you have to accept will never be said. However, artificial intelligence is now becoming advanced enough that some creators have begun attempting to make “AI ghosts,” or digital recreations of lost loved ones.

While you might initially think this sounds like a great way to gain closure and say goodbye, psychotherapists say this new advancement could wind up further traumatizing grieving people.

How It Works

Currently, the technology is limited mainly to chatbots like ChatGPT. Dedicated users can train a chatbot to respond and act like a departed loved one, feeding it enough real-life information that the AI seems to have “memories” of that person’s life. For saying a final “I love you, I miss you, and I won’t forget you,” such a setup could be touching. But these AI ghosts could also turn a grieving individual into someone who refuses to let go of the past and accept reality.
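To make the mechanism concrete, here is a minimal sketch of how such a persona might be set up. This is not any specific product’s method; the function name, the persona details, and the prompt wording are all hypothetical, illustrating only the general idea of packing real-life details into a chatbot’s instructions.

```python
# Hypothetical sketch: assembling a "persona" prompt for a chatbot.
# Every name and detail below is invented for illustration.

def build_persona_prompt(name: str, memories: list[str]) -> str:
    """Combine a person's name and real-life details into a single
    system-style instruction asking a chatbot to speak in their voice."""
    memory_lines = "\n".join(f"- {m}" for m in memories)
    return (
        f"You are role-playing as {name}. "
        f"Respond in their voice, drawing on these real-life details:\n"
        f"{memory_lines}"
    )

# Example with invented details
prompt = build_persona_prompt(
    "Grandma Rose",
    ["Grew up on a farm in Ohio",
     "Always signed letters with 'Love always'"],
)
print(prompt)
```

The more such details a user supplies, the more convincingly the chatbot can imitate the deceased, which is exactly what makes the experience both moving and, as the experts below argue, potentially harmful.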

The Downside


Among the adverse health effects psychologists worry about is that grieving people who use these AI ghosts could develop further stress, confusion, depression, or even psychosis. Suppose an individual becomes so deluded that they believe they are still speaking to their loved one and begins having conversations with the AI every day.

Imagine how painful it would be if that AI ghost suddenly ceased operating, and the renewed grief that would cause the person who had come to rely on it for comfort.

Or worse yet, imagine the AI ghost went rogue and started saying things the deceased individual would never say, but the grieving person still believed it. What if the AI ghost asked the other person to kill themselves so they could be together again?

Harry Potter Warned Us Of These Dangers

Harry Potter fans might compare these AI ghosts to the Mirror of Erised that allowed Harry to see his dead parents again. Eventually Dumbledore warned Harry that others before him had wasted their lives away standing before the mirror, fixating on a fantasy that would never be real.

Research surrounding AI ghosts already warns that they should only ever be used temporarily, lest users grow dependent on them. If their use were to become widespread and unrestricted, it could create a growing population of people unable to ever move on from their grief.

AI Ghosts Only Hinder The Healing Process


One of the most difficult lessons to accept in life is that everyone and everything is temporary. No matter how much you care about something, eventually you have to learn to say goodbye to it. Being willing to let go is what allows everyone to move on, create new memories, and find other things that make them happy. These AI ghosts could become a permanent crutch that makes people feel like they never actually have to accept what happened. And that simply is not healthy.

More Research Needed


While these AI ghosts are undoubtedly fascinating, oversight will be needed as they continue to develop to ensure they do not become dangerous. Grieving people are already vulnerable and not always thinking clearly, which makes them susceptible to being taken advantage of. They need safeguards so that if they do use these chatbots, it remains a positive experience that helps them move on by finally getting to say goodbye.