AI ethicists have called for urgent safeguards against an emerging “digital afterlife” industry.

The concerns centre on chatbots that mimic the appearances, speech, and personalities of dead people.

Known as “deadbots” or “griefbots,” these AI clones are trained on data about the deceased. They then provide simulated interactions with virtual recreations of the departed.

This “postmortem presence” can cause social and psychological harm, according to researchers from Cambridge University.

Their new study highlights several risks. One involves the use of deadbots for advertising. By mimicking lost loved ones, deadbots could manipulate vulnerable survivors into buying products.


Another concern involves therapeutic applications. The researchers fear that these will create an “overwhelming emotional weight,” prolonging the grieving process through endless virtual interactions.

Deadbots coming to life

The study also envisions deadbots spamming users with unwanted notifications. The researchers compare this to being “digitally stalked by the dead.”

It’s a prospect that’s quickly becoming a reality. Services such as “Project December” and “HereAfter” already offer customers a chance to digitally resurrect the dead.

To mitigate their risks, the researchers have called for deadbot designers to seek consent from “data donors” before they die. They also want the products to regularly alert users about the risks, provide easy opt-out protocols, and bar disrespectful uses of deadbots.

Another suggested safeguard is user-friendly termination methods. These could even involve a “digital funeral” for the deadbot.

All these measures need to consider both the dead and those they leave behind.

“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” said Dr Tomasz Hollanek, one of the study co-authors.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.”
