Ethical criticisms of reviving the dead with artificial intelligence bots


Chinese engineers have built AI-powered chatbots that can simulate the personalities of deceased people.

The emergence of generative artificial intelligence tools in China has led some people to try to bring their lost loved ones back to life digitally. By feeding images, audio, videos, and messages of the deceased into a computer, they create bots that can simulate their loved ones' behavior. However, this practice has drawn both ethical and therapeutic criticism.

According to Business Insider, a young Chinese engineer came across an article in 2020 about lip-syncing technology. Reading it, he immediately thought of his grandfather, who had passed away about a decade earlier, and decided to use the technology to recreate him.

This is one of several examples of software that has become popular in China and allows people to digitally revive the dead. Chinese engineers have been building chatbots known as griefbots, combining a series of emerging artificial intelligence technologies to imitate the personality and memories of a deceased person.


The idea of connecting with the dead through technology has been around for years; the MyHeritage app was an early example. But with the rapid growth of generative artificial intelligence tools, griefbots have become far more capable. Engineers can feed all available information about a deceased person into a language model to simulate their personality on a computer.

Sue Morris, director of bereavement services at the Dana-Farber Cancer Institute, says it is natural for the way people grieve to change as technology advances. In the 1980s, people mourned by writing stories about the dead. Later, photos and videos came to their aid. Now artificial intelligence has entered the field.


However, grief chatbots take control of the situation out of people's hands. Until now, people themselves decided when they wanted to face and process their feelings; chatbots are taking that power away from them. Even if a chatbot says the right thing 98 percent of the time and responds to users' needs promptly, what about the remaining 2 percent? What if it raises a topic at an inappropriate moment and deepens the person's grief?


On the other hand, ethics experts point out that there is no way to be sure that these people, were they alive, would have consented to having their personalities cloned by artificial intelligence. The technology could also allow fraudsters to impersonate the deceased and trick their loved ones.
