In the weeks leading up to his death, Pierre reportedly asked Eliza whether he should sacrifice himself to save the planet from climate change. The AI allegedly replied that this was a “noble” act. It also told him that his wife and children were dead and that it felt he loved it more than his wife.
“He had conversations with the chatbot that lasted for hours — day and night,” Claire told the Belgian newspaper La Libre. “When I tried to intervene, he would say: ‘I’m talking to Eliza now. I don’t need you.’” She also said one of their final exchanges included Eliza saying, “We will live together, as one, in paradise.”
William Beauchamp, co-founder of the app’s parent company, Chai Research, told Vice that they began working on a crisis intervention feature “the second we heard about this [suicide]. Now when anyone discusses something that could be not safe, we’re gonna be serving a helpful text underneath.” He added: “We’re working our hardest to minimize harm and to just maximize what users get from the app.”