cross-posted from: https://aussie.zone/post/2798829
According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “smiles yes, you can do it.”
So mental illness… :/ that's sad
Exacerbated by unsafe AIs. At least back in the day we had to use our imaginations to get encouragement from our dogs, Jodie Foster, or Air Looms.